user0034@10-111-34-14:~/caffe$ cd python; python classify.py --raw_scale 255 ../101_ObjectCategories/cougar_face/image_0033.jpg ../result.npy; cd ..
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1121 07:57:59.845842 8486 gpu_memory.cpp:82] GPUMemory::Manager initialized
CPU mode
W1121 07:57:59.958019 8486 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W1121 07:57:59.958217 8486 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W1121 07:57:59.958223 8486 _caffe.cpp:175] Net('../models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I1121 07:57:59.960197 8486 net.cpp:86] Initializing net from parameters:
name: "CaffeNet"
state { phase: TEST level: 0 }
layer { name: "data" type: "Input" top: "data" input_param { shape { dim: 10 dim: 3 dim: 227 dim: 227 } } }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" convolution_param { num_output: 96 kernel_size: 11 stride: 4 } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 2 } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" convolution_param { num_output: 384 pad: 1 kernel_size: 3 } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" convolution_param { num_output: 256 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" inner_product_param { num_output: 4096 } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "drop6" type: "Dropout" bottom: "fc6" top: "fc6" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" inner_product_param { num_output: 4096 } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "drop7" type: "Dropout" bottom: "fc7" top: "fc7" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc8" type: "InnerProduct" bottom: "fc7" top: "fc8" inner_product_param { num_output: 1000 } }
layer { name: "prob" type: "Softmax" bottom: "fc8" top: "prob" }
I1121 07:57:59.960325 8486 net.cpp:116] Using FLOAT as default forward math type
I1121 07:57:59.960335 8486 net.cpp:122] Using FLOAT as default backward math type
I1121 07:57:59.960338 8486 layer_factory.hpp:172] Creating layer 'data' of type 'Input'
I1121 07:57:59.960343 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960355 8486 net.cpp:205] Created Layer data (0)
I1121 07:57:59.960361 8486 net.cpp:547] data -> data
I1121 07:57:59.960386 8486 net.cpp:265] Setting up data
I1121 07:57:59.960403 8486 net.cpp:272] TEST Top shape for layer 0 'data' 10 3 227 227 (1545870)
I1121 07:57:59.960410 8486 layer_factory.hpp:172] Creating layer 'conv1' of type 'Convolution'
I1121 07:57:59.960414 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960441 8486 net.cpp:205] Created Layer conv1 (1)
I1121 07:57:59.960445 8486 net.cpp:577] conv1 <- data
I1121 07:57:59.960453 8486 net.cpp:547] conv1 -> conv1
I1121 07:57:59.960752 8486 net.cpp:265] Setting up conv1
I1121 07:57:59.960763 8486 net.cpp:272] TEST Top shape for layer 1 'conv1' 10 96 55 55 (2904000)
I1121 07:57:59.960772 8486 layer_factory.hpp:172] Creating layer 'relu1' of type 'ReLU'
I1121 07:57:59.960777 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960781 8486 net.cpp:205] Created Layer relu1 (2)
I1121 07:57:59.960783 8486 net.cpp:577] relu1 <- conv1
I1121 07:57:59.960793 8486 net.cpp:532] relu1 -> conv1 (in-place)
I1121 07:57:59.960798 8486 net.cpp:265] Setting up relu1
I1121 07:57:59.960805 8486 net.cpp:272] TEST Top shape for layer 2 'relu1' 10 96 55 55 (2904000)
I1121 07:57:59.960811 8486 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I1121 07:57:59.960817 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960824 8486 net.cpp:205] Created Layer pool1 (3)
I1121 07:57:59.960827 8486 net.cpp:577] pool1 <- conv1
I1121 07:57:59.960830 8486 net.cpp:547] pool1 -> pool1
I1121 07:57:59.960845 8486 net.cpp:265] Setting up pool1
I1121 07:57:59.960853 8486 net.cpp:272] TEST Top shape for layer 3 'pool1' 10 96 27 27 (699840)
I1121 07:57:59.960856 8486 layer_factory.hpp:172] Creating layer 'norm1' of type 'LRN'
I1121 07:57:59.960860 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960866 8486 net.cpp:205] Created Layer norm1 (4)
I1121 07:57:59.960870 8486 net.cpp:577] norm1 <- pool1
I1121 07:57:59.960876 8486 net.cpp:547] norm1 -> norm1
I1121 07:57:59.960887 8486 net.cpp:265] Setting up norm1
I1121 07:57:59.960893 8486 net.cpp:272] TEST Top shape for layer 4 'norm1' 10 96 27 27 (699840)
I1121 07:57:59.960896 8486 layer_factory.hpp:172] Creating layer 'conv2' of type 'Convolution'
I1121 07:57:59.960901 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.960911 8486 net.cpp:205] Created Layer conv2 (5)
I1121 07:57:59.960916 8486 net.cpp:577] conv2 <- norm1
I1121 07:57:59.960918 8486 net.cpp:547] conv2 -> conv2
I1121 07:57:59.963378 8486 net.cpp:265] Setting up conv2
I1121 07:57:59.963397 8486 net.cpp:272] TEST Top shape for layer 5 'conv2' 10 256 27 27 (1866240)
I1121 07:57:59.963405 8486 layer_factory.hpp:172] Creating layer 'relu2' of type 'ReLU'
I1121 07:57:59.963410 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.963415 8486 net.cpp:205] Created Layer relu2 (6)
I1121 07:57:59.963418 8486 net.cpp:577] relu2 <- conv2
I1121 07:57:59.963421 8486 net.cpp:532] relu2 -> conv2 (in-place)
I1121 07:57:59.963426 8486 net.cpp:265] Setting up relu2
I1121 07:57:59.963429 8486 net.cpp:272] TEST Top shape for layer 6 'relu2' 10 256 27 27 (1866240)
I1121 07:57:59.963433 8486 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I1121 07:57:59.963435 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.963444 8486 net.cpp:205] Created Layer pool2 (7)
I1121 07:57:59.963448 8486 net.cpp:577] pool2 <- conv2
I1121 07:57:59.963449 8486 net.cpp:547] pool2 -> pool2
I1121 07:57:59.963455 8486 net.cpp:265] Setting up pool2
I1121 07:57:59.963459 8486 net.cpp:272] TEST Top shape for layer 7 'pool2' 10 256 13 13 (432640)
I1121 07:57:59.963462 8486 layer_factory.hpp:172] Creating layer 'norm2' of type 'LRN'
I1121 07:57:59.963465 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.963474 8486 net.cpp:205] Created Layer norm2 (8)
I1121 07:57:59.963479 8486 net.cpp:577] norm2 <- pool2
I1121 07:57:59.963481 8486 net.cpp:547] norm2 -> norm2
I1121 07:57:59.963495 8486 net.cpp:265] Setting up norm2
I1121 07:57:59.963515 8486 net.cpp:272] TEST Top shape for layer 8 'norm2' 10 256 13 13 (432640)
I1121 07:57:59.963518 8486 layer_factory.hpp:172] Creating layer 'conv3' of type 'Convolution'
I1121 07:57:59.963521 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.963527 8486 net.cpp:205] Created Layer conv3 (9)
I1121 07:57:59.963531 8486 net.cpp:577] conv3 <- norm2
I1121 07:57:59.963534 8486 net.cpp:547] conv3 -> conv3
I1121 07:57:59.970557 8486 net.cpp:265] Setting up conv3
I1121 07:57:59.970577 8486 net.cpp:272] TEST Top shape for layer 9 'conv3' 10 384 13 13 (648960)
I1121 07:57:59.970583 8486 layer_factory.hpp:172] Creating layer 'relu3' of type 'ReLU'
I1121 07:57:59.970587 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.970593 8486 net.cpp:205] Created Layer relu3 (10)
I1121 07:57:59.970597 8486 net.cpp:577] relu3 <- conv3
I1121 07:57:59.970599 8486 net.cpp:532] relu3 -> conv3 (in-place)
I1121 07:57:59.970604 8486 net.cpp:265] Setting up relu3
I1121 07:57:59.970607 8486 net.cpp:272] TEST Top shape for layer 10 'relu3' 10 384 13 13 (648960)
I1121 07:57:59.970610 8486 layer_factory.hpp:172] Creating layer 'conv4' of type 'Convolution'
I1121 07:57:59.970613 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.970620 8486 net.cpp:205] Created Layer conv4 (11)
I1121 07:57:59.970623 8486 net.cpp:577] conv4 <- conv3
I1121 07:57:59.970626 8486 net.cpp:547] conv4 -> conv4
I1121 07:57:59.975913 8486 net.cpp:265] Setting up conv4
I1121 07:57:59.975934 8486 net.cpp:272] TEST Top shape for layer 11 'conv4' 10 384 13 13 (648960)
I1121 07:57:59.975940 8486 layer_factory.hpp:172] Creating layer 'relu4' of type 'ReLU'
I1121 07:57:59.975944 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.975948 8486 net.cpp:205] Created Layer relu4 (12)
I1121 07:57:59.975952 8486 net.cpp:577] relu4 <- conv4
I1121 07:57:59.975955 8486 net.cpp:532] relu4 -> conv4 (in-place)
I1121 07:57:59.975960 8486 net.cpp:265] Setting up relu4
I1121 07:57:59.975962 8486 net.cpp:272] TEST Top shape for layer 12 'relu4' 10 384 13 13 (648960)
I1121 07:57:59.975966 8486 layer_factory.hpp:172] Creating layer 'conv5' of type 'Convolution'
I1121 07:57:59.975970 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.975980 8486 net.cpp:205] Created Layer conv5 (13)
I1121 07:57:59.975982 8486 net.cpp:577] conv5 <- conv4
I1121 07:57:59.975986 8486 net.cpp:547] conv5 -> conv5
I1121 07:57:59.979518 8486 net.cpp:265] Setting up conv5
I1121 07:57:59.979535 8486 net.cpp:272] TEST Top shape for layer 13 'conv5' 10 256 13 13 (432640)
I1121 07:57:59.979544 8486 layer_factory.hpp:172] Creating layer 'relu5' of type 'ReLU'
I1121 07:57:59.979548 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.979552 8486 net.cpp:205] Created Layer relu5 (14)
I1121 07:57:59.979555 8486 net.cpp:577] relu5 <- conv5
I1121 07:57:59.979559 8486 net.cpp:532] relu5 -> conv5 (in-place)
I1121 07:57:59.979563 8486 net.cpp:265] Setting up relu5
I1121 07:57:59.979566 8486 net.cpp:272] TEST Top shape for layer 14 'relu5' 10 256 13 13 (432640)
I1121 07:57:59.979571 8486 layer_factory.hpp:172] Creating layer 'pool5' of type 'Pooling'
I1121 07:57:59.979574 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.979579 8486 net.cpp:205] Created Layer pool5 (15)
I1121 07:57:59.979583 8486 net.cpp:577] pool5 <- conv5
I1121 07:57:59.979585 8486 net.cpp:547] pool5 -> pool5
I1121 07:57:59.979593 8486 net.cpp:265] Setting up pool5
I1121 07:57:59.979598 8486 net.cpp:272] TEST Top shape for layer 15 'pool5' 10 256 6 6 (92160)
I1121 07:57:59.979600 8486 layer_factory.hpp:172] Creating layer 'fc6' of type 'InnerProduct'
I1121 07:57:59.979604 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:57:59.979629 8486 net.cpp:205] Created Layer fc6 (16)
I1121 07:57:59.979635 8486 net.cpp:577] fc6 <- pool5
I1121 07:57:59.979638 8486 net.cpp:547] fc6 -> fc6
I1121 07:58:00.286357 8486 net.cpp:265] Setting up fc6
I1121 07:58:00.286404 8486 net.cpp:272] TEST Top shape for layer 16 'fc6' 10 4096 (40960)
I1121 07:58:00.286417 8486 layer_factory.hpp:172] Creating layer 'relu6' of type 'ReLU'
I1121 07:58:00.286425 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.286439 8486 net.cpp:205] Created Layer relu6 (17)
I1121 07:58:00.286443 8486 net.cpp:577] relu6 <- fc6
I1121 07:58:00.286450 8486 net.cpp:532] relu6 -> fc6 (in-place)
I1121 07:58:00.286458 8486 net.cpp:265] Setting up relu6
I1121 07:58:00.286460 8486 net.cpp:272] TEST Top shape for layer 17 'relu6' 10 4096 (40960)
I1121 07:58:00.286463 8486 layer_factory.hpp:172] Creating layer 'drop6' of type 'Dropout'
I1121 07:58:00.286468 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.286476 8486 net.cpp:205] Created Layer drop6 (18)
I1121 07:58:00.286479 8486 net.cpp:577] drop6 <- fc6
I1121 07:58:00.286482 8486 net.cpp:532] drop6 -> fc6 (in-place)
I1121 07:58:00.286502 8486 net.cpp:265] Setting up drop6
I1121 07:58:00.286510 8486 net.cpp:272] TEST Top shape for layer 18 'drop6' 10 4096 (40960)
I1121 07:58:00.286514 8486 layer_factory.hpp:172] Creating layer 'fc7' of type 'InnerProduct'
I1121 07:58:00.286517 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.286528 8486 net.cpp:205] Created Layer fc7 (19)
I1121 07:58:00.286530 8486 net.cpp:577] fc7 <- fc6
I1121 07:58:00.286535 8486 net.cpp:547] fc7 -> fc7
I1121 07:58:00.422643 8486 net.cpp:265] Setting up fc7
I1121 07:58:00.422691 8486 net.cpp:272] TEST Top shape for layer 19 'fc7' 10 4096 (40960)
I1121 07:58:00.422703 8486 layer_factory.hpp:172] Creating layer 'relu7' of type 'ReLU'
I1121 07:58:00.422709 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.422719 8486 net.cpp:205] Created Layer relu7 (20)
I1121 07:58:00.422724 8486 net.cpp:577] relu7 <- fc7
I1121 07:58:00.422730 8486 net.cpp:532] relu7 -> fc7 (in-place)
I1121 07:58:00.422735 8486 net.cpp:265] Setting up relu7
I1121 07:58:00.422739 8486 net.cpp:272] TEST Top shape for layer 20 'relu7' 10 4096 (40960)
I1121 07:58:00.422742 8486 layer_factory.hpp:172] Creating layer 'drop7' of type 'Dropout'
I1121 07:58:00.422746 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.422753 8486 net.cpp:205] Created Layer drop7 (21)
I1121 07:58:00.422756 8486 net.cpp:577] drop7 <- fc7
I1121 07:58:00.422760 8486 net.cpp:532] drop7 -> fc7 (in-place)
I1121 07:58:00.422765 8486 net.cpp:265] Setting up drop7
I1121 07:58:00.422767 8486 net.cpp:272] TEST Top shape for layer 21 'drop7' 10 4096 (40960)
I1121 07:58:00.422770 8486 layer_factory.hpp:172] Creating layer 'fc8' of type 'InnerProduct'
I1121 07:58:00.422775 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.422782 8486 net.cpp:205] Created Layer fc8 (22)
I1121 07:58:00.422791 8486 net.cpp:577] fc8 <- fc7
I1121 07:58:00.422796 8486 net.cpp:547] fc8 -> fc8
I1121 07:58:00.455878 8486 net.cpp:265] Setting up fc8
I1121 07:58:00.455909 8486 net.cpp:272] TEST Top shape for layer 22 'fc8' 10 1000 (10000)
I1121 07:58:00.455917 8486 layer_factory.hpp:172] Creating layer 'prob' of type 'Softmax'
I1121 07:58:00.455924 8486 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1121 07:58:00.455938 8486 net.cpp:205] Created Layer prob (23)
I1121 07:58:00.455942 8486 net.cpp:577] prob <- fc8
I1121 07:58:00.455947 8486 net.cpp:547] prob -> prob
I1121 07:58:00.455967 8486 net.cpp:265] Setting up prob
I1121 07:58:00.455971 8486 net.cpp:272] TEST Top shape for layer 23 'prob' 10 1000 (10000)
I1121 07:58:00.456001 8486 net.cpp:343] prob does not need backward computation.
I1121 07:58:00.456005 8486 net.cpp:343] fc8 does not need backward computation.
I1121 07:58:00.456009 8486 net.cpp:343] drop7 does not need backward computation.
I1121 07:58:00.456014 8486 net.cpp:343] relu7 does not need backward computation.
I1121 07:58:00.456017 8486 net.cpp:343] fc7 does not need backward computation.
I1121 07:58:00.456022 8486 net.cpp:343] drop6 does not need backward computation.
I1121 07:58:00.456027 8486 net.cpp:343] relu6 does not need backward computation.
I1121 07:58:00.456030 8486 net.cpp:343] fc6 does not need backward computation.
I1121 07:58:00.456033 8486 net.cpp:343] pool5 does not need backward computation.
I1121 07:58:00.456037 8486 net.cpp:343] relu5 does not need backward computation.
I1121 07:58:00.456040 8486 net.cpp:343] conv5 does not need backward computation.
I1121 07:58:00.456046 8486 net.cpp:343] relu4 does not need backward computation.
I1121 07:58:00.456049 8486 net.cpp:343] conv4 does not need backward computation.
I1121 07:58:00.456053 8486 net.cpp:343] relu3 does not need backward computation.
I1121 07:58:00.456056 8486 net.cpp:343] conv3 does not need backward computation.
I1121 07:58:00.456061 8486 net.cpp:343] norm2 does not need backward computation.
I1121 07:58:00.456068 8486 net.cpp:343] pool2 does not need backward computation.
I1121 07:58:00.456073 8486 net.cpp:343] relu2 does not need backward computation.
I1121 07:58:00.456076 8486 net.cpp:343] conv2 does not need backward computation.
I1121 07:58:00.456080 8486 net.cpp:343] norm1 does not need backward computation.
I1121 07:58:00.456084 8486 net.cpp:343] pool1 does not need backward computation.
I1121 07:58:00.456089 8486 net.cpp:343] relu1 does not need backward computation.
I1121 07:58:00.456095 8486 net.cpp:343] conv1 does not need backward computation.
I1121 07:58:00.456099 8486 net.cpp:343] data does not need backward computation.
I1121 07:58:00.456102 8486 net.cpp:385] This network produces output prob
I1121 07:58:00.456123 8486 net.cpp:408] Top memory (TEST) required for data: 68681400 diff: 68681400
I1121 07:58:00.456131 8486 net.cpp:411] Bottom memory (TEST) required for data: 68641400 diff: 68641400
I1121 07:58:00.456136 8486 net.cpp:414] Shared (in-place) memory (TEST) by data: 26658560 diff: 26658560
I1121 07:58:00.456147 8486 net.cpp:417] Parameters memory (TEST) required for data: 243860896 diff: 243860896
I1121 07:58:00.456152 8486 net.cpp:420] Parameters shared memory (TEST) by data: 0 diff: 0
I1121 07:58:00.456156 8486 net.cpp:426] Network initialization done.
I1121 07:58:00.721652 8486 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I1121 07:58:00.721698 8486 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W1121 07:58:00.721701 8486 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I1121 07:58:00.721717 8486 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I1121 07:58:01.268941 8486 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I1121 07:58:01.273867 8486 net.cpp:1143] Copying source layer data Type:Data #blobs=0
I1121 07:58:01.273890 8486 net.cpp:1143] Copying source layer conv1 Type:Convolution #blobs=2
I1121 07:58:01.274197 8486 net.cpp:1143] Copying source layer relu1 Type:ReLU #blobs=0
I1121 07:58:01.274206 8486 net.cpp:1143] Copying source layer pool1 Type:Pooling #blobs=0
I1121 07:58:01.274209 8486 net.cpp:1143] Copying source layer norm1 Type:LRN #blobs=0
I1121 07:58:01.274212 8486 net.cpp:1143] Copying source layer conv2 Type:Convolution #blobs=2
I1121 07:58:01.276559 8486 net.cpp:1143] Copying source layer relu2 Type:ReLU #blobs=0
I1121 07:58:01.276607 8486 net.cpp:1143] Copying source layer pool2 Type:Pooling #blobs=0
I1121 07:58:01.276610 8486 net.cpp:1143] Copying source layer norm2 Type:LRN #blobs=0
I1121 07:58:01.276613 8486 net.cpp:1143] Copying source layer conv3 Type:Convolution #blobs=2
I1121 07:58:01.283421 8486 net.cpp:1143] Copying source layer relu3 Type:ReLU #blobs=0
I1121 07:58:01.283442 8486 net.cpp:1143] Copying source layer conv4 Type:Convolution #blobs=2
I1121 07:58:01.288511 8486 net.cpp:1143] Copying source layer relu4 Type:ReLU #blobs=0
I1121 07:58:01.288528 8486 net.cpp:1143] Copying source layer conv5 Type:Convolution #blobs=2
I1121 07:58:01.291949 8486 net.cpp:1143] Copying source layer relu5 Type:ReLU #blobs=0
I1121 07:58:01.291966 8486 net.cpp:1143] Copying source layer pool5 Type:Pooling #blobs=0
I1121 07:58:01.291970 8486 net.cpp:1143] Copying source layer fc6 Type:InnerProduct #blobs=2
I1121 07:58:01.579213 8486 net.cpp:1143] Copying source layer relu6 Type:ReLU #blobs=0
I1121 07:58:01.579233 8486 net.cpp:1143] Copying source layer drop6 Type:Dropout #blobs=0
I1121 07:58:01.579237 8486 net.cpp:1143] Copying source layer fc7 Type:InnerProduct #blobs=2
I1121 07:58:01.706956 8486 net.cpp:1143] Copying source layer relu7 Type:ReLU #blobs=0
I1121 07:58:01.706976 8486 net.cpp:1143] Copying source layer drop7 Type:Dropout #blobs=0
I1121 07:58:01.706979 8486 net.cpp:1143] Copying source layer fc8 Type:InnerProduct #blobs=2
I1121 07:58:01.738294 8486 net.cpp:1135] Ignoring source layer loss
(10, 3, 227, 227)
Loading file: ../101_ObjectCategories/cougar_face/image_0033.jpg
Classifying 1 inputs.
Done in 1.09 s.
Saving results into ../result.npy
user0034@10-111-34-14:~/caffe$ python show_result.py data/ilsvrc12/synset_words.txt result.npy
#1 | n02125311 cougar, puma, catamount, mountain lion, painter, panther, Felis concolor | 99.3%
#2 | n02114712 red wolf, maned wolf, Canis rufus, Canis niger | 0.1%
#3 | n02129165 lion, king of beasts, Panthera leo | 0.1%
user0034@10-111-34-14:~/caffe$
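The per-layer "TEST Top shape" lines in the log follow directly from the convolution and pooling geometry declared in the prototxt. A minimal sketch of the two size formulas (Caffe floors convolution output sizes and ceils pooling output sizes), reproducing the 227 → 55 → 27 → 13 → 6 spatial chain seen above:

```python
import math

def conv_out(n, kernel, stride=1, pad=0):
    # Convolution output size: floor((n + 2*pad - kernel) / stride) + 1
    return (n + 2 * pad - kernel) // stride + 1

def pool_out(n, kernel, stride, pad=0):
    # Pooling output size: ceil((n + 2*pad - kernel) / stride) + 1
    return math.ceil((n + 2 * pad - kernel) / stride) + 1

s = conv_out(227, 11, stride=4)   # conv1: 55
s = pool_out(s, 3, 2)             # pool1: 27
s = conv_out(s, 5, pad=2)         # conv2: 27
s = pool_out(s, 3, 2)             # pool2: 13
s = conv_out(s, 3, pad=1)         # conv3: 13 (conv4 and conv5 keep 13)
s = pool_out(s, 3, 2)             # pool5: 6
print(s)                          # 6
```

Each batch of 10 images therefore reaches fc6 as 10 blobs of 256 x 6 x 6 = 9216 values, matching the pool5 shape "10 256 6 6 (92160)" in the log.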
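The "Parameters memory (TEST) required ... 243860896" line can be checked by counting weights and biases per learnable layer; grouped convolutions (group: 2) connect each output channel to only half of the input channels. A sketch of that bookkeeping, with layer dimensions taken from the prototxt above:

```python
# Weights + biases per learnable CaffeNet layer; group: 2 halves the
# input channels seen by each filter.
params = {
    'conv1': 96 * 3 * 11 * 11 + 96,
    'conv2': 256 * (96 // 2) * 5 * 5 + 256,     # group: 2
    'conv3': 384 * 256 * 3 * 3 + 384,
    'conv4': 384 * (384 // 2) * 3 * 3 + 384,    # group: 2
    'conv5': 256 * (384 // 2) * 3 * 3 + 256,    # group: 2
    'fc6': 4096 * (256 * 6 * 6) + 4096,         # pool5 flattened to 9216 inputs
    'fc7': 4096 * 4096 + 4096,
    'fc8': 1000 * 4096 + 1000,
}
total = sum(params.values())
print(total)      # 60965224 parameters
print(total * 4)  # 243860896 bytes at 4 bytes per FLOAT parameter
```

At the FLOAT math type the log reports, 60,965,224 parameters x 4 bytes reproduces the logged 243,860,896 bytes exactly.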
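show_result.py is not part of stock Caffe, so the following is only a hedged sketch of what such a script plausibly does: classify.py saves a (num_inputs, 1000) array of softmax probabilities, and each line of synset_words.txt names one ILSVRC12 class. The real file paths are shown in comments; a tiny synthetic distribution stands in for them so the sketch is self-contained:

```python
import numpy as np

def top_k(probs, labels, k=3):
    # Indices of the k largest probabilities, highest first.
    order = np.argsort(probs)[::-1][:k]
    return [(labels[i], float(probs[i])) for i in order]

# With the real files from the session you would use something like:
#   probs  = np.load('result.npy')[0]            # (1000,) softmax output
#   labels = [l.strip() for l in open('data/ilsvrc12/synset_words.txt')]
# Synthetic stand-ins (hypothetical values, not the actual result.npy):
probs = np.array([0.003, 0.993, 0.001, 0.002, 0.001])
labels = ['red wolf', 'cougar', 'lion', 'lynx', 'leopard']
for rank, (label, p) in enumerate(top_k(probs, labels), start=1):
    print(f"#{rank} | {label} | {p:.1%}")
```

With the real arrays this prints rows in the same "#rank | synset | percent" shape as the session's output.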