chibi@2204:~/caffe$ cd python; python classify.py --raw_scale 255 ../101_ObjectCategories/airplanes/image_0001.jpg ../result.npy; cd ..
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0618 07:34:43.743258 2140 gpu_memory.cpp:82] GPUMemory::Manager initialized
CPU mode
W0618 07:34:43.913614 2140 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W0618 07:34:43.913893 2140 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W0618 07:34:43.913905 2140 _caffe.cpp:175] Net('/home/chibi/caffe/python/../models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0618 07:34:43.917467 2140 net.cpp:86] Initializing net from parameters:
name: "CaffeNet"
state { phase: TEST level: 0 }
layer { name: "data" type: "Input" top: "data" input_param { shape { dim: 10 dim: 3 dim: 227 dim: 227 } } }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" convolution_param { num_output: 96 kernel_size: 11 stride: 4 } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 2 } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" convolution_param { num_output: 384 pad: 1 kernel_size: 3 } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" convolution_param { num_output: 256 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" inner_product_param { num_output: 4096 } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "drop6" type: "Dropout" bottom: "fc6" top: "fc6" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" inner_product_param { num_output: 4096 } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "drop7" type: "Dropout" bottom: "fc7" top: "fc7" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc8" type: "InnerProduct" bottom: "fc7" top: "fc8" inner_product_param { num_output: 1000 } }
layer { name: "prob" type: "Softmax" bottom: "fc8" top: "prob" }
I0618 07:34:43.918591 2140 net.cpp:116] Using FLOAT as default forward math type
I0618 07:34:43.918618 2140 net.cpp:122] Using FLOAT as default backward math type
I0618 07:34:43.918635 2140 layer_factory.hpp:172] Creating layer 'data' of type 'Input'
I0618
07:34:43.918651 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.919373 2140 net.cpp:205] Created Layer data (0) I0618 07:34:43.919404 2140 net.cpp:547] data -> data I0618 07:34:43.920433 2140 net.cpp:265] Setting up data I0618 07:34:43.920457 2140 net.cpp:272] TEST Top shape for layer 0 'data' 10 3 227 227 (1545870) I0618 07:34:43.920493 2140 layer_factory.hpp:172] Creating layer 'conv1' of type 'Convolution' I0618 07:34:43.920508 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.921234 2140 net.cpp:205] Created Layer conv1 (1) I0618 07:34:43.921259 2140 net.cpp:577] conv1 <- data I0618 07:34:43.921276 2140 net.cpp:547] conv1 -> conv1 I0618 07:34:43.925314 2140 net.cpp:265] Setting up conv1 I0618 07:34:43.925354 2140 net.cpp:272] TEST Top shape for layer 1 'conv1' 10 96 55 55 (2904000) I0618 07:34:43.926621 2140 layer_factory.hpp:172] Creating layer 'relu1' of type 'ReLU' I0618 07:34:43.926656 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.926688 2140 net.cpp:205] Created Layer relu1 (2) I0618 07:34:43.926712 2140 net.cpp:577] relu1 <- conv1 I0618 07:34:43.926749 2140 net.cpp:532] relu1 -> conv1 (in-place) I0618 07:34:43.926779 2140 net.cpp:265] Setting up relu1 I0618 07:34:43.926800 2140 net.cpp:272] TEST Top shape for layer 2 'relu1' 10 96 55 55 (2904000) I0618 07:34:43.926831 2140 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling' I0618 07:34:43.926854 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.926888 2140 net.cpp:205] Created Layer pool1 (3) I0618 07:34:43.926911 2140 net.cpp:577] pool1 <- conv1 I0618 07:34:43.926934 2140 net.cpp:547] pool1 -> pool1 I0618 07:34:43.927464 2140 net.cpp:265] Setting up pool1 I0618 07:34:43.927497 2140 net.cpp:272] TEST Top shape for layer 3 'pool1' 10 96 27 27 (699840) I0618 07:34:43.927528 2140 layer_factory.hpp:172] Creating layer 'norm1' of type 'LRN' I0618 07:34:43.927552 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.927604 2140 net.cpp:205] Created Layer norm1 (4) I0618 07:34:43.927629 2140 net.cpp:577] norm1 <- pool1 I0618 07:34:43.927655 2140 net.cpp:547] norm1 -> norm1 I0618 07:34:43.927955 2140 net.cpp:265] Setting up norm1 I0618 07:34:43.927989 2140 net.cpp:272] TEST Top shape for layer 4 'norm1' 10 96 27 27 (699840) I0618 07:34:43.928020 2140 layer_factory.hpp:172] Creating layer 'conv2' of type 'Convolution' I0618 07:34:43.928045 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.928094 2140 net.cpp:205] Created Layer conv2 (5) I0618 07:34:43.928119 2140 net.cpp:577] conv2 <- norm1 I0618 07:34:43.928145 2140 net.cpp:547] conv2 -> conv2 I0618 07:34:43.936348 2140 net.cpp:265] Setting up conv2 I0618 07:34:43.936370 2140 net.cpp:272] TEST Top shape for layer 5 'conv2' 10 256 27 27 (1866240) I0618 07:34:43.936396 2140 layer_factory.hpp:172] Creating layer 'relu2' of type 'ReLU' I0618 07:34:43.936411 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.936430 2140 net.cpp:205] Created Layer relu2 (6) I0618 07:34:43.936446 2140 net.cpp:577] relu2 <- conv2 I0618 07:34:43.936463 2140 net.cpp:532] relu2 -> conv2 (in-place) I0618 07:34:43.936481 2140 net.cpp:265] Setting up relu2 I0618 07:34:43.936496 2140 net.cpp:272] TEST 
Top shape for layer 6 'relu2' 10 256 27 27 (1866240) I0618 07:34:43.936514 2140 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling' I0618 07:34:43.936528 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.936553 2140 net.cpp:205] Created Layer pool2 (7) I0618 07:34:43.936569 2140 net.cpp:577] pool2 <- conv2 I0618 07:34:43.936584 2140 net.cpp:547] pool2 -> pool2 I0618 07:34:43.936615 2140 net.cpp:265] Setting up pool2 I0618 07:34:43.936631 2140 net.cpp:272] TEST Top shape for layer 7 'pool2' 10 256 13 13 (432640) I0618 07:34:43.936650 2140 layer_factory.hpp:172] Creating layer 'norm2' of type 'LRN' I0618 07:34:43.936666 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.936697 2140 net.cpp:205] Created Layer norm2 (8) I0618 07:34:43.936713 2140 net.cpp:577] norm2 <- pool2 I0618 07:34:43.936729 2140 net.cpp:547] norm2 -> norm2 I0618 07:34:43.936789 2140 net.cpp:265] Setting up norm2 I0618 07:34:43.936806 2140 net.cpp:272] TEST Top shape for layer 8 'norm2' 10 256 13 13 (432640) I0618 07:34:43.936826 2140 layer_factory.hpp:172] Creating layer 'conv3' of type 'Convolution' I0618 07:34:43.936842 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.936877 2140 net.cpp:205] Created Layer conv3 (9) I0618 07:34:43.936893 2140 net.cpp:577] conv3 <- norm2 I0618 07:34:43.936910 2140 net.cpp:547] conv3 -> conv3 I0618 07:34:43.949374 2140 net.cpp:265] Setting up conv3 I0618 07:34:43.949388 2140 net.cpp:272] TEST Top shape for layer 9 'conv3' 10 384 13 13 (648960) I0618 07:34:43.949404 2140 layer_factory.hpp:172] Creating layer 'relu3' of type 'ReLU' I0618 07:34:43.949411 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.949421 2140 net.cpp:205] Created Layer relu3 (10) I0618 07:34:43.949431 2140 net.cpp:577] relu3 <- conv3 I0618 07:34:43.949440 2140 net.cpp:532] relu3 -> conv3 (in-place) I0618 07:34:43.949448 2140 net.cpp:265] Setting up relu3 I0618 07:34:43.949455 2140 net.cpp:272] TEST Top shape for layer 10 'relu3' 10 384 13 13 (648960) I0618 07:34:43.949465 2140 layer_factory.hpp:172] Creating layer 'conv4' of type 'Convolution' I0618 07:34:43.949471 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.949487 2140 net.cpp:205] Created Layer conv4 (11) I0618 07:34:43.949496 2140 net.cpp:577] conv4 <- conv3 I0618 07:34:43.949504 2140 net.cpp:547] conv4 -> conv4 I0618 07:34:43.956202 2140 net.cpp:265] Setting up conv4 I0618 07:34:43.956213 2140 net.cpp:272] TEST Top shape for layer 11 'conv4' 10 384 13 13 (648960) I0618 07:34:43.956224 2140 layer_factory.hpp:172] Creating layer 'relu4' of type 'ReLU' I0618 07:34:43.956230 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.956239 2140 net.cpp:205] Created Layer relu4 (12) I0618 07:34:43.956248 2140 net.cpp:577] relu4 <- conv4 I0618 07:34:43.956255 2140 net.cpp:532] relu4 -> conv4 (in-place) I0618 07:34:43.956264 2140 net.cpp:265] Setting up relu4 I0618 07:34:43.956269 2140 net.cpp:272] TEST Top shape for layer 12 'relu4' 10 384 13 13 (648960) I0618 07:34:43.956277 2140 layer_factory.hpp:172] Creating layer 'conv5' of type 'Convolution' I0618 07:34:43.956283 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.956296 2140 net.cpp:205] Created Layer 
conv5 (13) I0618 07:34:43.956303 2140 net.cpp:577] conv5 <- conv4 I0618 07:34:43.956311 2140 net.cpp:547] conv5 -> conv5 I0618 07:34:43.960109 2140 net.cpp:265] Setting up conv5 I0618 07:34:43.960119 2140 net.cpp:272] TEST Top shape for layer 13 'conv5' 10 256 13 13 (432640) I0618 07:34:43.960129 2140 layer_factory.hpp:172] Creating layer 'relu5' of type 'ReLU' I0618 07:34:43.960135 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.960142 2140 net.cpp:205] Created Layer relu5 (14) I0618 07:34:43.960148 2140 net.cpp:577] relu5 <- conv5 I0618 07:34:43.960155 2140 net.cpp:532] relu5 -> conv5 (in-place) I0618 07:34:43.960161 2140 net.cpp:265] Setting up relu5 I0618 07:34:43.960167 2140 net.cpp:272] TEST Top shape for layer 14 'relu5' 10 256 13 13 (432640) I0618 07:34:43.960175 2140 layer_factory.hpp:172] Creating layer 'pool5' of type 'Pooling' I0618 07:34:43.960179 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.960187 2140 net.cpp:205] Created Layer pool5 (15) I0618 07:34:43.960193 2140 net.cpp:577] pool5 <- conv5 I0618 07:34:43.960199 2140 net.cpp:547] pool5 -> pool5 I0618 07:34:43.960211 2140 net.cpp:265] Setting up pool5 I0618 07:34:43.960216 2140 net.cpp:272] TEST Top shape for layer 15 'pool5' 10 256 6 6 (92160) I0618 07:34:43.960223 2140 layer_factory.hpp:172] Creating layer 'fc6' of type 'InnerProduct' I0618 07:34:43.960229 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:43.960268 2140 net.cpp:205] Created Layer fc6 (16) I0618 07:34:43.960274 2140 net.cpp:577] fc6 <- pool5 I0618 07:34:43.960280 2140 net.cpp:547] fc6 -> fc6 I0618 07:34:44.241643 2140 net.cpp:265] Setting up fc6 I0618 07:34:44.241663 2140 net.cpp:272] TEST Top shape for layer 16 'fc6' 10 4096 (40960) I0618 07:34:44.241691 2140 layer_factory.hpp:172] Creating layer 'relu6' of type 'ReLU' I0618 07:34:44.241699 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.241709 2140 net.cpp:205] Created Layer relu6 (17) I0618 07:34:44.241730 2140 net.cpp:577] relu6 <- fc6 I0618 07:34:44.241736 2140 net.cpp:532] relu6 -> fc6 (in-place) I0618 07:34:44.241761 2140 net.cpp:265] Setting up relu6 I0618 07:34:44.241767 2140 net.cpp:272] TEST Top shape for layer 17 'relu6' 10 4096 (40960) I0618 07:34:44.241786 2140 layer_factory.hpp:172] Creating layer 'drop6' of type 'Dropout' I0618 07:34:44.241804 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.241816 2140 net.cpp:205] Created Layer drop6 (18) I0618 07:34:44.241822 2140 net.cpp:577] drop6 <- fc6 I0618 07:34:44.241827 2140 net.cpp:532] drop6 -> fc6 (in-place) I0618 07:34:44.241833 2140 net.cpp:265] Setting up drop6 I0618 07:34:44.241838 2140 net.cpp:272] TEST Top shape for layer 18 'drop6' 10 4096 (40960) I0618 07:34:44.241844 2140 layer_factory.hpp:172] Creating layer 'fc7' of type 'InnerProduct' I0618 07:34:44.241849 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.241858 2140 net.cpp:205] Created Layer fc7 (19) I0618 07:34:44.241863 2140 net.cpp:577] fc7 <- fc6 I0618 07:34:44.241883 2140 net.cpp:547] fc7 -> fc7 I0618 07:34:44.366037 2140 net.cpp:265] Setting up fc7 I0618 07:34:44.366057 2140 net.cpp:272] TEST Top shape for layer 19 'fc7' 10 4096 (40960) I0618 07:34:44.366084 2140 layer_factory.hpp:172] Creating layer 'relu7' of type 
'ReLU' I0618 07:34:44.366091 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.366114 2140 net.cpp:205] Created Layer relu7 (20) I0618 07:34:44.366122 2140 net.cpp:577] relu7 <- fc7 I0618 07:34:44.366142 2140 net.cpp:532] relu7 -> fc7 (in-place) I0618 07:34:44.366163 2140 net.cpp:265] Setting up relu7 I0618 07:34:44.366168 2140 net.cpp:272] TEST Top shape for layer 20 'relu7' 10 4096 (40960) I0618 07:34:44.366187 2140 layer_factory.hpp:172] Creating layer 'drop7' of type 'Dropout' I0618 07:34:44.366194 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.366205 2140 net.cpp:205] Created Layer drop7 (21) I0618 07:34:44.366223 2140 net.cpp:577] drop7 <- fc7 I0618 07:34:44.366228 2140 net.cpp:532] drop7 -> fc7 (in-place) I0618 07:34:44.366250 2140 net.cpp:265] Setting up drop7 I0618 07:34:44.366256 2140 net.cpp:272] TEST Top shape for layer 21 'drop7' 10 4096 (40960) I0618 07:34:44.366262 2140 layer_factory.hpp:172] Creating layer 'fc8' of type 'InnerProduct' I0618 07:34:44.366268 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.366276 2140 net.cpp:205] Created Layer fc8 (22) I0618 07:34:44.366281 2140 net.cpp:577] fc8 <- fc7 I0618 07:34:44.366287 2140 net.cpp:547] fc8 -> fc8 I0618 07:34:44.396910 2140 net.cpp:265] Setting up fc8 I0618 07:34:44.396929 2140 net.cpp:272] TEST Top shape for layer 22 'fc8' 10 1000 (10000) I0618 07:34:44.396956 2140 layer_factory.hpp:172] Creating layer 'prob' of type 'Softmax' I0618 07:34:44.396963 2140 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT I0618 07:34:44.396976 2140 net.cpp:205] Created Layer prob (23) I0618 07:34:44.396997 2140 net.cpp:577] prob <- fc8 I0618 07:34:44.397017 2140 net.cpp:547] prob -> prob I0618 07:34:44.398972 2140 net.cpp:265] Setting up prob I0618 07:34:44.398984 2140 net.cpp:272] TEST Top shape for layer 23 'prob' 10 1000 (10000) I0618 07:34:44.399032 2140 net.cpp:343] prob does not need backward computation. I0618 07:34:44.399053 2140 net.cpp:343] fc8 does not need backward computation. I0618 07:34:44.399060 2140 net.cpp:343] drop7 does not need backward computation. I0618 07:34:44.399081 2140 net.cpp:343] relu7 does not need backward computation. I0618 07:34:44.399087 2140 net.cpp:343] fc7 does not need backward computation. I0618 07:34:44.399092 2140 net.cpp:343] drop6 does not need backward computation. I0618 07:34:44.399114 2140 net.cpp:343] relu6 does not need backward computation. I0618 07:34:44.399120 2140 net.cpp:343] fc6 does not need backward computation. I0618 07:34:44.399139 2140 net.cpp:343] pool5 does not need backward computation. I0618 07:34:44.399144 2140 net.cpp:343] relu5 does not need backward computation. I0618 07:34:44.399163 2140 net.cpp:343] conv5 does not need backward computation. I0618 07:34:44.399168 2140 net.cpp:343] relu4 does not need backward computation. I0618 07:34:44.399187 2140 net.cpp:343] conv4 does not need backward computation. I0618 07:34:44.399194 2140 net.cpp:343] relu3 does not need backward computation. I0618 07:34:44.399214 2140 net.cpp:343] conv3 does not need backward computation. I0618 07:34:44.399219 2140 net.cpp:343] norm2 does not need backward computation. I0618 07:34:44.399225 2140 net.cpp:343] pool2 does not need backward computation. I0618 07:34:44.399245 2140 net.cpp:343] relu2 does not need backward computation. 
I0618 07:34:44.399263 2140 net.cpp:343] conv2 does not need backward computation. I0618 07:34:44.399268 2140 net.cpp:343] norm1 does not need backward computation. I0618 07:34:44.399273 2140 net.cpp:343] pool1 does not need backward computation. I0618 07:34:44.399302 2140 net.cpp:343] relu1 does not need backward computation. I0618 07:34:44.399307 2140 net.cpp:343] conv1 does not need backward computation. I0618 07:34:44.399313 2140 net.cpp:343] data does not need backward computation. I0618 07:34:44.399318 2140 net.cpp:385] This network produces output prob I0618 07:34:44.399353 2140 net.cpp:408] Top memory (TEST) required for data: 68681400 diff: 68681400 I0618 07:34:44.399358 2140 net.cpp:411] Bottom memory (TEST) required for data: 68641400 diff: 68641400 I0618 07:34:44.399363 2140 net.cpp:414] Shared (in-place) memory (TEST) by data: 26658560 diff: 26658560 I0618 07:34:44.399369 2140 net.cpp:417] Parameters memory (TEST) required for data: 243860896 diff: 42272 I0618 07:34:44.399389 2140 net.cpp:420] Parameters shared memory (TEST) by data: 0 diff: 0 I0618 07:34:44.399394 2140 net.cpp:426] Network initialization done. I0618 07:34:44.992411 2140 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel I0618 07:34:44.992434 2140 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters. W0618 07:34:44.992440 2140 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields. I0618 07:34:44.992478 2140 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel I0618 07:34:45.273731 2140 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter I0618 07:34:45.284523 2140 net.cpp:1143] Copying source layer data Type:Data #blobs=0 I0618 07:34:45.284535 2140 net.cpp:1143] Copying source layer conv1 Type:Convolution #blobs=2 I0618 07:34:45.284868 2140 net.cpp:1143] Copying source layer relu1 Type:ReLU #blobs=0 I0618 07:34:45.284874 2140 net.cpp:1143] Copying source layer pool1 Type:Pooling #blobs=0 I0618 07:34:45.284879 2140 net.cpp:1143] Copying source layer norm1 Type:LRN #blobs=0 I0618 07:34:45.284884 2140 net.cpp:1143] Copying source layer conv2 Type:Convolution #blobs=2 I0618 07:34:45.287062 2140 net.cpp:1143] Copying source layer relu2 Type:ReLU #blobs=0 I0618 07:34:45.287070 2140 net.cpp:1143] Copying source layer pool2 Type:Pooling #blobs=0 I0618 07:34:45.287074 2140 net.cpp:1143] Copying source layer norm2 Type:LRN #blobs=0 I0618 07:34:45.287079 2140 net.cpp:1143] Copying source layer conv3 Type:Convolution #blobs=2 I0618 07:34:45.292937 2140 net.cpp:1143] Copying source layer relu3 Type:ReLU #blobs=0 I0618 07:34:45.292944 2140 net.cpp:1143] Copying source layer conv4 Type:Convolution #blobs=2 I0618 07:34:45.297277 2140 net.cpp:1143] Copying source layer relu4 Type:ReLU #blobs=0 I0618 07:34:45.297283 2140 net.cpp:1143] Copying source layer conv5 Type:Convolution #blobs=2 I0618 07:34:45.300266 2140 net.cpp:1143] Copying source layer relu5 Type:ReLU #blobs=0 I0618 07:34:45.300271 2140 net.cpp:1143] Copying source layer pool5 Type:Pooling #blobs=0 I0618 07:34:45.300276 2140 net.cpp:1143] Copying source layer fc6 Type:InnerProduct #blobs=2 I0618 07:34:45.536832 2140 
net.cpp:1143] Copying source layer relu6 Type:ReLU #blobs=0
I0618 07:34:45.536841 2140 net.cpp:1143] Copying source layer drop6 Type:Dropout #blobs=0
I0618 07:34:45.536846 2140 net.cpp:1143] Copying source layer fc7 Type:InnerProduct #blobs=2
I0618 07:34:45.643069 2140 net.cpp:1143] Copying source layer relu7 Type:ReLU #blobs=0
I0618 07:34:45.643076 2140 net.cpp:1143] Copying source layer drop7 Type:Dropout #blobs=0
I0618 07:34:45.643081 2140 net.cpp:1143] Copying source layer fc8 Type:InnerProduct #blobs=2
I0618 07:34:45.669266 2140 net.cpp:1135] Ignoring source layer loss
(10, 3, 227, 227)
Loading file: ../101_ObjectCategories/airplanes/image_0001.jpg
Classifying 1 inputs.
Done in 0.22 s.
Saving results into ../result.npy
chibi@2204:~/caffe$ python show_result.py data/ilsvrc12/synset_words.txt result.npy
#1 | n04552348 warplane, military plane | 85.0%
#2 | n04008634 projectile, missile | 5.4%
#3 | n02690373 airliner | 5.0%
chibi@2204:~/caffe$
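
For reference, the classify.py invocation at the top of this transcript corresponds roughly to the following pycaffe calls. This is a minimal sketch, assuming the stock caffe.Classifier API and the default BVLC CaffeNet files; the mean-file path, image_dims, and channel_swap values are the classify.py defaults and may need adjusting on another setup.

# Rough sketch of what "python classify.py --raw_scale 255 ..." does via pycaffe.
# Paths mirror the ones printed in the log above; adjust for your checkout.
import numpy as np
import caffe

caffe.set_mode_cpu()  # the log shows "CPU mode"

model_def = 'models/bvlc_reference_caffenet/deploy.prototxt'
weights   = 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'
# Per-channel ImageNet mean, collapsed from the 3x256x256 mean image.
mean = np.load('python/caffe/imagenet/ilsvrc_2012_mean.npy').mean(1).mean(1)

# raw_scale=255 rescales caffe.io.load_image's [0, 1] output to the [0, 255]
# range the reference model was trained on; channel_swap converts RGB to BGR.
net = caffe.Classifier(model_def, weights,
                       mean=mean,
                       channel_swap=(2, 1, 0),
                       raw_scale=255,
                       image_dims=(256, 256))

image = caffe.io.load_image('101_ObjectCategories/airplanes/image_0001.jpg')
predictions = net.predict([image])   # shape (1, 1000): softmax over ImageNet classes
np.save('result.npy', predictions)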
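
show_result.py is not part of the Caffe distribution, so its contents are assumed here; a minimal script that reproduces the top-3 listing above from synset_words.txt and result.npy could look like this.

#!/usr/bin/env python
# Hypothetical reconstruction of show_result.py: print the top-k classes
# stored in result.npy using the ILSVRC2012 synset labels.
import sys
import numpy as np

def main(synset_path, result_path, top_k=3):
    # One label per line, e.g. "n04552348 warplane, military plane"
    with open(synset_path) as f:
        labels = [line.strip() for line in f if line.strip()]

    # classify.py saves an array of shape (num_images, num_classes)
    probs = np.load(result_path)
    for p in probs:
        top = p.argsort()[::-1][:top_k]
        for rank, idx in enumerate(top, start=1):
            print('#%d | %s | %.1f%%' % (rank, labels[idx], 100.0 * p[idx]))

if __name__ == '__main__':
    main(sys.argv[1], sys.argv[2])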