chibi@2204:~/caffe$ cd python; python classify.py --raw_scale 255 ../101_ObjectCategories/panda/image_0001.jpg ../result.npy; cd ..
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1023 14:11:32.482437 3056 gpu_memory.cpp:82] GPUMemory::Manager initialized
CPU mode
W1023 14:11:32.582196 3056 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W1023 14:11:32.582401 3056 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W1023 14:11:32.582407 3056 _caffe.cpp:175] Net('/home/chibi/caffe/python/../models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I1023 14:11:32.582656 3056 net.cpp:86] Initializing net from parameters:
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I1023 14:11:32.582851 3056 net.cpp:116] Using FLOAT as default forward math type
I1023 14:11:32.582857 3056 net.cpp:122] Using FLOAT as default backward math type
I1023 14:11:32.582861 3056 layer_factory.hpp:172] Creating layer 'data' of type 'Input'
I1023 14:11:32.582865 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.582877 3056 net.cpp:205] Created Layer data (0)
I1023 14:11:32.582883 3056 net.cpp:547] data -> data
I1023 14:11:32.582898 3056 net.cpp:265] Setting up data
I1023 14:11:32.582902 3056 net.cpp:272] TEST Top shape for layer 0 'data' 10 3 227 227 (1545870)
I1023 14:11:32.582914 3056 layer_factory.hpp:172] Creating layer 'conv1' of type 'Convolution'
I1023 14:11:32.582918 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.582945 3056 net.cpp:205] Created Layer conv1 (1)
I1023 14:11:32.582952 3056 net.cpp:577] conv1 <- data
I1023 14:11:32.582957 3056 net.cpp:547] conv1 -> conv1
I1023 14:11:32.583283 3056 net.cpp:265] Setting up conv1
I1023 14:11:32.583287 3056 net.cpp:272] TEST Top shape for layer 1 'conv1' 10 96 55 55 (2904000)
I1023 14:11:32.583302 3056 layer_factory.hpp:172] Creating layer 'relu1' of type 'ReLU'
I1023 14:11:32.583307 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.583312 3056 net.cpp:205] Created Layer relu1 (2)
I1023 14:11:32.583315 3056 net.cpp:577] relu1 <- conv1
I1023 14:11:32.583318 3056 net.cpp:532] relu1 -> conv1 (in-place)
I1023 14:11:32.583323 3056 net.cpp:265] Setting up relu1
I1023 14:11:32.583326 3056 net.cpp:272] TEST Top shape for layer 2 'relu1' 10 96 55 55 (2904000)
I1023 14:11:32.583330 3056 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I1023 14:11:32.583335 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.583341 3056 net.cpp:205] Created Layer pool1 (3)
I1023 14:11:32.583345 3056 net.cpp:577] pool1 <- conv1
I1023 14:11:32.583349 3056 net.cpp:547] pool1 -> pool1
I1023 14:11:32.583359 3056 net.cpp:265] Setting up pool1
I1023 14:11:32.583362 3056 net.cpp:272] TEST Top shape for layer 3 'pool1' 10 96 27 27 (699840)
I1023 14:11:32.583366 3056 layer_factory.hpp:172] Creating layer 'norm1' of type 'LRN'
I1023 14:11:32.583370 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.583377 3056 net.cpp:205] Created Layer norm1 (4)
I1023 14:11:32.583381 3056 net.cpp:577] norm1 <- pool1
I1023 14:11:32.583385 3056 net.cpp:547] norm1 -> norm1
I1023 14:11:32.583393 3056 net.cpp:265] Setting up norm1
I1023 14:11:32.583396 3056 net.cpp:272] TEST Top shape for layer 4 'norm1' 10 96 27 27 (699840)
I1023 14:11:32.583401 3056 layer_factory.hpp:172] Creating layer 'conv2' of type 'Convolution'
I1023 14:11:32.583405 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.583411 3056 net.cpp:205] Created Layer conv2 (5)
I1023 14:11:32.583415 3056 net.cpp:577] conv2 <- norm1
I1023 14:11:32.583418 3056 net.cpp:547] conv2 -> conv2
I1023 14:11:32.586221 3056 net.cpp:265] Setting up conv2
I1023 14:11:32.586226 3056 net.cpp:272] TEST Top shape for layer 5 'conv2' 10 256 27 27 (1866240)
I1023 14:11:32.586233 3056 layer_factory.hpp:172] Creating layer 'relu2' of type 'ReLU'
I1023 14:11:32.586236 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.586241 3056 net.cpp:205] Created Layer relu2 (6)
I1023 14:11:32.586244 3056 net.cpp:577] relu2 <- conv2
I1023 14:11:32.586248 3056 net.cpp:532] relu2 -> conv2 (in-place)
I1023 14:11:32.586252 3056 net.cpp:265] Setting up relu2
I1023 14:11:32.586256 3056 net.cpp:272] TEST Top shape for layer 6 'relu2' 10 256 27 27 (1866240)
I1023 14:11:32.586261 3056 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I1023 14:11:32.586263 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.586269 3056 net.cpp:205] Created Layer pool2 (7)
I1023 14:11:32.586272 3056 net.cpp:577] pool2 <- conv2
I1023 14:11:32.586277 3056 net.cpp:547] pool2 -> pool2
I1023 14:11:32.586282 3056 net.cpp:265] Setting up pool2
I1023 14:11:32.586285 3056 net.cpp:272] TEST Top shape for layer 7 'pool2' 10 256 13 13 (432640)
I1023 14:11:32.586290 3056 layer_factory.hpp:172] Creating layer 'norm2' of type 'LRN'
I1023 14:11:32.586303 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.586310 3056 net.cpp:205] Created Layer norm2 (8)
I1023 14:11:32.586314 3056 net.cpp:577] norm2 <- pool2
I1023 14:11:32.586318 3056 net.cpp:547] norm2 -> norm2
I1023 14:11:32.586333 3056 net.cpp:265] Setting up norm2
I1023 14:11:32.586336 3056 net.cpp:272] TEST Top shape for layer 8 'norm2' 10 256 13 13 (432640)
I1023 14:11:32.586344 3056 layer_factory.hpp:172] Creating layer 'conv3' of type 'Convolution'
I1023 14:11:32.586347 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.586355 3056 net.cpp:205] Created Layer conv3 (9)
I1023 14:11:32.586359 3056 net.cpp:577] conv3 <- norm2
I1023 14:11:32.586364 3056 net.cpp:547] conv3 -> conv3
I1023 14:11:32.593693 3056 net.cpp:265] Setting up conv3
I1023 14:11:32.593698 3056 net.cpp:272] TEST Top shape for layer 9 'conv3' 10 384 13 13 (648960)
I1023 14:11:32.593704 3056 layer_factory.hpp:172] Creating layer 'relu3' of type 'ReLU'
I1023 14:11:32.593708 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.593713 3056 net.cpp:205] Created Layer relu3 (10)
I1023 14:11:32.593717 3056 net.cpp:577] relu3 <- conv3
I1023 14:11:32.593720 3056 net.cpp:532] relu3 -> conv3 (in-place)
I1023 14:11:32.593724 3056 net.cpp:265] Setting up relu3
I1023 14:11:32.593727 3056 net.cpp:272] TEST Top shape for layer 10 'relu3' 10 384 13 13 (648960)
I1023 14:11:32.593732 3056 layer_factory.hpp:172] Creating layer 'conv4' of type 'Convolution'
I1023 14:11:32.593735 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.593742 3056 net.cpp:205] Created Layer conv4 (11)
I1023 14:11:32.593746 3056 net.cpp:577] conv4 <- conv3
I1023 14:11:32.593750 3056 net.cpp:547] conv4 -> conv4
I1023 14:11:32.599104 3056 net.cpp:265] Setting up conv4
I1023 14:11:32.599109 3056 net.cpp:272] TEST Top shape for layer 11 'conv4' 10 384 13 13 (648960)
I1023 14:11:32.599117 3056 layer_factory.hpp:172] Creating layer 'relu4' of type 'ReLU'
I1023 14:11:32.599120 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.599125 3056 net.cpp:205] Created Layer relu4 (12)
I1023 14:11:32.599129 3056 net.cpp:577] relu4 <- conv4
I1023 14:11:32.599133 3056 net.cpp:532] relu4 -> conv4 (in-place)
I1023 14:11:32.599136 3056 net.cpp:265] Setting up relu4
I1023 14:11:32.599140 3056 net.cpp:272] TEST Top shape for layer 12 'relu4' 10 384 13 13 (648960)
I1023 14:11:32.599144 3056 layer_factory.hpp:172] Creating layer 'conv5' of type 'Convolution'
I1023 14:11:32.599148 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.599154 3056 net.cpp:205] Created Layer conv5 (13)
I1023 14:11:32.599159 3056 net.cpp:577] conv5 <- conv4
I1023 14:11:32.599161 3056 net.cpp:547] conv5 -> conv5
I1023 14:11:32.602730 3056 net.cpp:265] Setting up conv5
I1023 14:11:32.602736 3056 net.cpp:272] TEST Top shape for layer 13 'conv5' 10 256 13 13 (432640)
I1023 14:11:32.602743 3056 layer_factory.hpp:172] Creating layer 'relu5' of type 'ReLU'
I1023 14:11:32.602747 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.602751 3056 net.cpp:205] Created Layer relu5 (14)
I1023 14:11:32.602756 3056 net.cpp:577] relu5 <- conv5
I1023 14:11:32.602759 3056 net.cpp:532] relu5 -> conv5 (in-place)
I1023 14:11:32.602763 3056 net.cpp:265] Setting up relu5
I1023 14:11:32.602766 3056 net.cpp:272] TEST Top shape for layer 14 'relu5' 10 256 13 13 (432640)
I1023 14:11:32.602771 3056 layer_factory.hpp:172] Creating layer 'pool5' of type 'Pooling'
I1023 14:11:32.602774 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.602782 3056 net.cpp:205] Created Layer pool5 (15)
I1023 14:11:32.602785 3056 net.cpp:577] pool5 <- conv5
I1023 14:11:32.602788 3056 net.cpp:547] pool5 -> pool5
I1023 14:11:32.602794 3056 net.cpp:265] Setting up pool5
I1023 14:11:32.602797 3056 net.cpp:272] TEST Top shape for layer 15 'pool5' 10 256 6 6 (92160)
I1023 14:11:32.602802 3056 layer_factory.hpp:172] Creating layer 'fc6' of type 'InnerProduct'
I1023 14:11:32.602806 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.602838 3056 net.cpp:205] Created Layer fc6 (16)
I1023 14:11:32.602841 3056 net.cpp:577] fc6 <- pool5
I1023 14:11:32.602844 3056 net.cpp:547] fc6 -> fc6
I1023 14:11:32.907660 3056 net.cpp:265] Setting up fc6
I1023 14:11:32.907689 3056 net.cpp:272] TEST Top shape for layer 16 'fc6' 10 4096 (40960)
I1023 14:11:32.907704 3056 layer_factory.hpp:172] Creating layer 'relu6' of type 'ReLU'
I1023 14:11:32.907711 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.907722 3056 net.cpp:205] Created Layer relu6 (17)
I1023 14:11:32.907728 3056 net.cpp:577] relu6 <- fc6
I1023 14:11:32.907733 3056 net.cpp:532] relu6 -> fc6 (in-place)
I1023 14:11:32.907738 3056 net.cpp:265] Setting up relu6
I1023 14:11:32.907742 3056 net.cpp:272] TEST Top shape for layer 17 'relu6' 10 4096 (40960)
I1023 14:11:32.907745 3056 layer_factory.hpp:172] Creating layer 'drop6' of type 'Dropout'
I1023 14:11:32.907750 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.907756 3056 net.cpp:205] Created Layer drop6 (18)
I1023 14:11:32.907763 3056 net.cpp:577] drop6 <- fc6
I1023 14:11:32.907768 3056 net.cpp:532] drop6 -> fc6 (in-place)
I1023 14:11:32.907773 3056 net.cpp:265] Setting up drop6
I1023 14:11:32.907775 3056 net.cpp:272] TEST Top shape for layer 18 'drop6' 10 4096 (40960)
I1023 14:11:32.907779 3056 layer_factory.hpp:172] Creating layer 'fc7' of type 'InnerProduct'
I1023 14:11:32.907783 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:32.907792 3056 net.cpp:205] Created Layer fc7 (19)
I1023 14:11:32.907795 3056 net.cpp:577] fc7 <- fc6
I1023 14:11:32.907799 3056 net.cpp:547] fc7 -> fc7
I1023 14:11:33.049005 3056 net.cpp:265] Setting up fc7
I1023 14:11:33.049031 3056 net.cpp:272] TEST Top shape for layer 19 'fc7' 10 4096 (40960)
I1023 14:11:33.049046 3056 layer_factory.hpp:172] Creating layer 'relu7' of type 'ReLU'
I1023 14:11:33.049050 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:33.049060 3056 net.cpp:205] Created Layer relu7 (20)
I1023 14:11:33.049065 3056 net.cpp:577] relu7 <- fc7
I1023 14:11:33.049070 3056 net.cpp:532] relu7 -> fc7 (in-place)
I1023 14:11:33.049077 3056 net.cpp:265] Setting up relu7
I1023 14:11:33.049079 3056 net.cpp:272] TEST Top shape for layer 20 'relu7' 10 4096 (40960)
I1023 14:11:33.049083 3056 layer_factory.hpp:172] Creating layer 'drop7' of type 'Dropout'
I1023 14:11:33.049088 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:33.049093 3056 net.cpp:205] Created Layer drop7 (21)
I1023 14:11:33.049098 3056 net.cpp:577] drop7 <- fc7
I1023 14:11:33.049103 3056 net.cpp:532] drop7 -> fc7 (in-place)
I1023 14:11:33.049108 3056 net.cpp:265] Setting up drop7
I1023 14:11:33.049111 3056 net.cpp:272] TEST Top shape for layer 21 'drop7' 10 4096 (40960)
I1023 14:11:33.049115 3056 layer_factory.hpp:172] Creating layer 'fc8' of type 'InnerProduct'
I1023 14:11:33.049120 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:33.049127 3056 net.cpp:205] Created Layer fc8 (22)
I1023 14:11:33.049131 3056 net.cpp:577] fc8 <- fc7
I1023 14:11:33.049134 3056 net.cpp:547] fc8 -> fc8
I1023 14:11:33.082494 3056 net.cpp:265] Setting up fc8
I1023 14:11:33.082520 3056 net.cpp:272] TEST Top shape for layer 22 'fc8' 10 1000 (10000)
I1023 14:11:33.082531 3056 layer_factory.hpp:172] Creating layer 'prob' of type 'Softmax'
I1023 14:11:33.082537 3056 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I1023 14:11:33.082551 3056 net.cpp:205] Created Layer prob (23)
I1023 14:11:33.082557 3056 net.cpp:577] prob <- fc8
I1023 14:11:33.082562 3056 net.cpp:547] prob -> prob
I1023 14:11:33.082581 3056 net.cpp:265] Setting up prob
I1023 14:11:33.082584 3056 net.cpp:272] TEST Top shape for layer 23 'prob' 10 1000 (10000)
I1023 14:11:33.082628 3056 net.cpp:343] prob does not need backward computation.
I1023 14:11:33.082635 3056 net.cpp:343] fc8 does not need backward computation.
I1023 14:11:33.082639 3056 net.cpp:343] drop7 does not need backward computation.
I1023 14:11:33.082643 3056 net.cpp:343] relu7 does not need backward computation.
I1023 14:11:33.082646 3056 net.cpp:343] fc7 does not need backward computation.
I1023 14:11:33.082650 3056 net.cpp:343] drop6 does not need backward computation.
I1023 14:11:33.082654 3056 net.cpp:343] relu6 does not need backward computation.
I1023 14:11:33.082659 3056 net.cpp:343] fc6 does not need backward computation.
I1023 14:11:33.082664 3056 net.cpp:343] pool5 does not need backward computation.
I1023 14:11:33.082667 3056 net.cpp:343] relu5 does not need backward computation.
I1023 14:11:33.082671 3056 net.cpp:343] conv5 does not need backward computation.
I1023 14:11:33.082676 3056 net.cpp:343] relu4 does not need backward computation.
I1023 14:11:33.082680 3056 net.cpp:343] conv4 does not need backward computation.
I1023 14:11:33.082684 3056 net.cpp:343] relu3 does not need backward computation.
I1023 14:11:33.082690 3056 net.cpp:343] conv3 does not need backward computation.
I1023 14:11:33.082693 3056 net.cpp:343] norm2 does not need backward computation.
I1023 14:11:33.082698 3056 net.cpp:343] pool2 does not need backward computation.
I1023 14:11:33.082701 3056 net.cpp:343] relu2 does not need backward computation.
I1023 14:11:33.082706 3056 net.cpp:343] conv2 does not need backward computation.
I1023 14:11:33.082710 3056 net.cpp:343] norm1 does not need backward computation.
I1023 14:11:33.082715 3056 net.cpp:343] pool1 does not need backward computation.
I1023 14:11:33.082720 3056 net.cpp:343] relu1 does not need backward computation.
I1023 14:11:33.082722 3056 net.cpp:343] conv1 does not need backward computation.
I1023 14:11:33.082727 3056 net.cpp:343] data does not need backward computation.
I1023 14:11:33.082731 3056 net.cpp:385] This network produces output prob
I1023 14:11:33.082751 3056 net.cpp:408] Top memory (TEST) required for data: 68681400 diff: 68681400
I1023 14:11:33.082756 3056 net.cpp:411] Bottom memory (TEST) required for data: 68641400 diff: 68641400
I1023 14:11:33.082758 3056 net.cpp:414] Shared (in-place) memory (TEST) by data: 26658560 diff: 26658560
I1023 14:11:33.082762 3056 net.cpp:417] Parameters memory (TEST) required for data: 243860896 diff: 42272
I1023 14:11:33.082767 3056 net.cpp:420] Parameters shared memory (TEST) by data: 0 diff: 0
I1023 14:11:33.082770 3056 net.cpp:426] Network initialization done.
I1023 14:11:33.257426 3056 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I1023 14:11:33.257464 3056 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W1023 14:11:33.257468 3056 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I1023 14:11:33.257498 3056 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I1023 14:11:33.424485 3056 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I1023 14:11:33.431674 3056 net.cpp:1143] Copying source layer data Type:Data #blobs=0
I1023 14:11:33.431681 3056 net.cpp:1143] Copying source layer conv1 Type:Convolution #blobs=2
I1023 14:11:33.431927 3056 net.cpp:1143] Copying source layer relu1 Type:ReLU #blobs=0
I1023 14:11:33.431931 3056 net.cpp:1143] Copying source layer pool1 Type:Pooling #blobs=0
I1023 14:11:33.431934 3056 net.cpp:1143] Copying source layer norm1 Type:LRN #blobs=0
I1023 14:11:33.431937 3056 net.cpp:1143] Copying source layer conv2 Type:Convolution #blobs=2
I1023 14:11:33.434249 3056 net.cpp:1143] Copying source layer relu2 Type:ReLU #blobs=0
I1023 14:11:33.434254 3056 net.cpp:1143] Copying source layer pool2 Type:Pooling #blobs=0
I1023 14:11:33.434257 3056 net.cpp:1143] Copying source layer norm2 Type:LRN #blobs=0
I1023 14:11:33.434262 3056 net.cpp:1143] Copying source layer conv3 Type:Convolution #blobs=2
I1023 14:11:33.440801 3056 net.cpp:1143] Copying source layer relu3 Type:ReLU #blobs=0
I1023 14:11:33.440807 3056 net.cpp:1143] Copying source layer conv4 Type:Convolution #blobs=2
I1023 14:11:33.445713 3056 net.cpp:1143] Copying source layer relu4 Type:ReLU #blobs=0
I1023 14:11:33.445719 3056 net.cpp:1143] Copying source layer conv5 Type:Convolution #blobs=2
I1023 14:11:33.448989 3056 net.cpp:1143] Copying source layer relu5 Type:ReLU #blobs=0
I1023 14:11:33.448994 3056 net.cpp:1143] Copying source layer pool5 Type:Pooling #blobs=0
I1023 14:11:33.448998 3056 net.cpp:1143] Copying source layer fc6 Type:InnerProduct #blobs=2
I1023 14:11:33.727607 3056 net.cpp:1143] Copying source layer relu6 Type:ReLU #blobs=0
I1023 14:11:33.727632 3056 net.cpp:1143] Copying source layer drop6 Type:Dropout #blobs=0
I1023 14:11:33.727636 3056 net.cpp:1143] Copying source layer fc7 Type:InnerProduct #blobs=2
I1023 14:11:33.851478 3056 net.cpp:1143] Copying source layer relu7 Type:ReLU #blobs=0
I1023 14:11:33.851503 3056 net.cpp:1143] Copying source layer drop7 Type:Dropout #blobs=0
I1023 14:11:33.851506 3056 net.cpp:1143] Copying source layer fc8 Type:InnerProduct #blobs=2
I1023 14:11:33.881742 3056 net.cpp:1135] Ignoring source layer loss
(10, 3, 227, 227)
Loading file: ../101_ObjectCategories/panda/image_0001.jpg
Classifying 1 inputs.
Done in 0.44 s.
Saving results into ../result.npy
chibi@2204:~/caffe$ python show_result.py data/ilsvrc12/synset_words.txt result.npy
#1 | n02510455 giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca | 98.3%
#2 | n02500267 indri, indris, Indri indri, Indri brevicaudatus | 1.1%
#3 | n02497673 Madagascar cat, ring-tailed lemur, Lemur catta | 0.1%
chibi@2204:~/caffe$
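
The deprecation warning near the top of the log points at the newer pycaffe constructor that names the weights file explicitly. The sketch below is not the classify.py / show_result.py pair used in the transcript; it is a minimal, hedged illustration of the same flow done directly from Python: load the net with the suggested Net(deploy, phase, weights=...) call, preprocess one image the way classify.py does with --raw_scale 255, run a forward pass, and print the top-3 synsets in the same "#rank | synset | percent" format. The file name classify_panda.py and all paths (including the ilsvrc_2012_mean.npy location, which is classify.py's default in a standard checkout) are assumptions; it is meant to be run from the caffe/ checkout root.

    #!/usr/bin/env python
    # classify_panda.py -- minimal sketch of the flow shown in the transcript above.
    # Assumed to be run from the caffe/ checkout root; all paths are assumptions.
    import numpy as np
    import caffe

    deploy  = 'models/bvlc_reference_caffenet/deploy.prototxt'
    weights = 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'
    synsets = 'data/ilsvrc12/synset_words.txt'
    image   = '101_ObjectCategories/panda/image_0001.jpg'

    caffe.set_mode_cpu()  # the run in the transcript was in CPU mode
    # Non-deprecated constructor suggested by the warning in the log:
    net = caffe.Net(deploy, caffe.TEST, weights=weights)

    # Preprocessing equivalent to classify.py with --raw_scale 255:
    # HWC float in [0,1] -> CHW, scale to [0,255], RGB -> BGR, subtract the mean.
    transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
    transformer.set_transpose('data', (2, 0, 1))
    transformer.set_raw_scale('data', 255)
    transformer.set_channel_swap('data', (2, 1, 0))
    mean = np.load('python/caffe/imagenet/ilsvrc_2012_mean.npy').mean(1).mean(1)
    transformer.set_mean('data', mean)  # per-channel ImageNet mean

    img = caffe.io.load_image(image)           # HxWx3 float image in [0,1]
    net.blobs['data'].reshape(1, 3, 227, 227)  # one image instead of a batch of 10
    net.blobs['data'].data[...] = transformer.preprocess('data', img)

    prob = net.forward()['prob'][0]            # 1000-way softmax output

    labels = [line.strip() for line in open(synsets)]
    for rank, idx in enumerate(prob.argsort()[::-1][:3], start=1):
        print('#%d | %s | %.1f%%' % (rank, labels[idx], 100 * prob[idx]))

Reshaping the data blob to a single image avoids paying for the batch of 10 that the deploy prototxt declares; classify.py instead feeds oversampled crops, so the exact percentages it reports may differ slightly from this single-crop sketch.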