chibi@1604:~/caffe$ cd python; python classify.py --raw_scale 255 ../101_ObjectCategories/panda/image_0008.jpg ../result.npy; cd ..
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0813 11:54:48.226882 6317 gpu_memory.cpp:82] GPUMemory::Manager initialized
CPU mode
W0813 11:54:48.327240 6317 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W0813 11:54:48.327347 6317 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W0813 11:54:48.327350 6317 _caffe.cpp:175] Net('../models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0813 11:54:48.328655 6317 net.cpp:86] Initializing net from parameters:
name: "CaffeNet"
state { phase: TEST level: 0 }
layer { name: "data" type: "Input" top: "data" input_param { shape { dim: 10 dim: 3 dim: 227 dim: 227 } } }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" convolution_param { num_output: 96 kernel_size: 11 stride: 4 } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 2 } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" convolution_param { num_output: 384 pad: 1 kernel_size: 3 } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" convolution_param { num_output: 256 pad: 1 kernel_size: 3 group: 2 } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" inner_product_param { num_output: 4096 } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "drop6" type: "Dropout" bottom: "fc6" top: "fc6" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" inner_product_param { num_output: 4096 } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "drop7" type: "Dropout" bottom: "fc7" top: "fc7" dropout_param { dropout_ratio: 0.5 } }
layer { name: "fc8" type: "InnerProduct" bottom: "fc7" top: "fc8" inner_product_param { num_output: 1000 } }
layer { name: "prob" type: "Softmax" bottom: "fc8" top: "prob" }
I0813 11:54:48.328794 6317 net.cpp:116] Using FLOAT as default forward math type
I0813 11:54:48.328799 6317 net.cpp:122] Using FLOAT as default backward math type
I0813 11:54:48.328801 6317 layer_factory.hpp:172] Creating layer 'data' of type 'Input'
I0813 11:54:48.328806 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.328814 6317 net.cpp:205] Created Layer data (0)
I0813 11:54:48.328819 6317 net.cpp:547] data -> data
I0813 11:54:48.328830 6317 net.cpp:265] Setting up data
I0813 11:54:48.328835 6317 net.cpp:272] TEST Top shape for layer 0 'data' 10 3 227 227 (1545870)
I0813 11:54:48.328840 6317 layer_factory.hpp:172] Creating layer 'conv1' of type 'Convolution'
I0813 11:54:48.328845 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.328862 6317 net.cpp:205] Created Layer conv1 (1)
I0813 11:54:48.328866 6317 net.cpp:577] conv1 <- data
I0813 11:54:48.328869 6317 net.cpp:547] conv1 -> conv1
I0813 11:54:48.329103 6317 net.cpp:265] Setting up conv1
I0813 11:54:48.329108 6317 net.cpp:272] TEST Top shape for layer 1 'conv1' 10 96 55 55 (2904000)
I0813 11:54:48.329116 6317 layer_factory.hpp:172] Creating layer 'relu1' of type 'ReLU'
I0813 11:54:48.329119 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.329124 6317 net.cpp:205] Created Layer relu1 (2)
I0813 11:54:48.329128 6317 net.cpp:577] relu1 <- conv1
I0813 11:54:48.329130 6317 net.cpp:532] relu1 -> conv1 (in-place)
I0813 11:54:48.329136 6317 net.cpp:265] Setting up relu1
I0813 11:54:48.329140 6317 net.cpp:272] TEST Top shape for layer 2 'relu1' 10 96 55 55 (2904000)
I0813 11:54:48.329144 6317 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I0813 11:54:48.329146 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.329152 6317 net.cpp:205] Created Layer pool1 (3)
I0813 11:54:48.329155 6317 net.cpp:577] pool1 <- conv1
I0813 11:54:48.329159 6317 net.cpp:547] pool1 -> pool1
I0813 11:54:48.329167 6317 net.cpp:265] Setting up pool1
I0813 11:54:48.329171 6317 net.cpp:272] TEST Top shape for layer 3 'pool1' 10 96 27 27 (699840)
I0813 11:54:48.329175 6317 layer_factory.hpp:172] Creating layer 'norm1' of type 'LRN'
I0813 11:54:48.329180 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.329185 6317 net.cpp:205] Created Layer norm1 (4)
I0813 11:54:48.329188 6317 net.cpp:577] norm1 <- pool1
I0813 11:54:48.329191 6317 net.cpp:547] norm1 -> norm1
I0813 11:54:48.329196 6317 net.cpp:265] Setting up norm1
I0813 11:54:48.329200 6317 net.cpp:272] TEST Top shape for layer 4 'norm1' 10 96 27 27 (699840)
I0813 11:54:48.329203 6317 layer_factory.hpp:172] Creating layer 'conv2' of type 'Convolution'
I0813 11:54:48.329208 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.329213 6317 net.cpp:205] Created Layer conv2 (5)
I0813 11:54:48.329216 6317 net.cpp:577] conv2 <- norm1
I0813 11:54:48.329219 6317 net.cpp:547] conv2 -> conv2
I0813 11:54:48.331080 6317 net.cpp:265] Setting up conv2
I0813 11:54:48.331085 6317 net.cpp:272] TEST Top shape for layer 5 'conv2' 10 256 27 27 (1866240)
I0813 11:54:48.331090 6317 layer_factory.hpp:172] Creating layer 'relu2' of type 'ReLU'
I0813 11:54:48.331094 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.331097 6317 net.cpp:205] Created Layer relu2 (6)
I0813 11:54:48.331100 6317 net.cpp:577] relu2 <- conv2
I0813 11:54:48.331104 6317 net.cpp:532] relu2 -> conv2 (in-place)
I0813 11:54:48.331108 6317 net.cpp:265] Setting up relu2
I0813 11:54:48.331111 6317 net.cpp:272] TEST Top shape for layer 6 'relu2' 10 256 27 27 (1866240)
I0813 11:54:48.331115 6317 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I0813 11:54:48.331118 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.331122 6317 net.cpp:205] Created Layer pool2 (7)
I0813 11:54:48.331125 6317 net.cpp:577] pool2 <- conv2
I0813 11:54:48.331130 6317 net.cpp:547] pool2 -> pool2
I0813 11:54:48.331135 6317 net.cpp:265] Setting up pool2
I0813 11:54:48.331137 6317 net.cpp:272] TEST Top shape for layer 7 'pool2' 10 256 13 13 (432640)
I0813 11:54:48.331141 6317 layer_factory.hpp:172] Creating layer 'norm2' of type 'LRN'
I0813 11:54:48.331144 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.331152 6317 net.cpp:205] Created Layer norm2 (8)
I0813 11:54:48.331156 6317 net.cpp:577] norm2 <- pool2
I0813 11:54:48.331159 6317 net.cpp:547] norm2 -> norm2
I0813 11:54:48.331163 6317 net.cpp:265] Setting up norm2
I0813 11:54:48.331172 6317 net.cpp:272] TEST Top shape for layer 8 'norm2' 10 256 13 13 (432640)
I0813 11:54:48.331176 6317 layer_factory.hpp:172] Creating layer 'conv3' of type 'Convolution'
I0813 11:54:48.331179 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.331184 6317 net.cpp:205] Created Layer conv3 (9)
I0813 11:54:48.331187 6317 net.cpp:577] conv3 <- norm2
I0813 11:54:48.331192 6317 net.cpp:547] conv3 -> conv3
I0813 11:54:48.336570 6317 net.cpp:265] Setting up conv3
I0813 11:54:48.336575 6317 net.cpp:272] TEST Top shape for layer 9 'conv3' 10 384 13 13 (648960)
I0813 11:54:48.336581 6317 layer_factory.hpp:172] Creating layer 'relu3' of type 'ReLU'
I0813 11:54:48.336585 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.336588 6317 net.cpp:205] Created Layer relu3 (10)
I0813 11:54:48.336591 6317 net.cpp:577] relu3 <- conv3
I0813 11:54:48.336596 6317 net.cpp:532] relu3 -> conv3 (in-place)
I0813 11:54:48.336599 6317 net.cpp:265] Setting up relu3
I0813 11:54:48.336603 6317 net.cpp:272] TEST Top shape for layer 10 'relu3' 10 384 13 13 (648960)
I0813 11:54:48.336606 6317 layer_factory.hpp:172] Creating layer 'conv4' of type 'Convolution'
I0813 11:54:48.336611 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.336616 6317 net.cpp:205] Created Layer conv4 (11)
I0813 11:54:48.336618 6317 net.cpp:577] conv4 <- conv3
I0813 11:54:48.336622 6317 net.cpp:547] conv4 -> conv4
I0813 11:54:48.340675 6317 net.cpp:265] Setting up conv4
I0813 11:54:48.340680 6317 net.cpp:272] TEST Top shape for layer 11 'conv4' 10 384 13 13 (648960)
I0813 11:54:48.340685 6317 layer_factory.hpp:172] Creating layer 'relu4' of type 'ReLU'
I0813 11:54:48.340689 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.340708 6317 net.cpp:205] Created Layer relu4 (12)
I0813 11:54:48.340713 6317 net.cpp:577] relu4 <- conv4
I0813 11:54:48.340714 6317 net.cpp:532] relu4 -> conv4 (in-place)
I0813 11:54:48.340718 6317 net.cpp:265] Setting up relu4
I0813 11:54:48.340723 6317 net.cpp:272] TEST Top shape for layer 12 'relu4' 10 384 13 13 (648960)
I0813 11:54:48.340724 6317 layer_factory.hpp:172] Creating layer 'conv5' of type 'Convolution'
I0813 11:54:48.340728 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.340734 6317 net.cpp:205] Created Layer conv5 (13)
I0813 11:54:48.340736 6317 net.cpp:577] conv5 <- conv4
I0813 11:54:48.340739 6317 net.cpp:547] conv5 -> conv5
I0813 11:54:48.343433 6317 net.cpp:265] Setting up conv5
I0813 11:54:48.343438 6317 net.cpp:272] TEST Top shape for layer 13 'conv5' 10 256 13 13 (432640)
I0813 11:54:48.343443 6317 layer_factory.hpp:172] Creating layer 'relu5' of type 'ReLU'
I0813 11:54:48.343446 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.343449 6317 net.cpp:205] Created Layer relu5 (14)
I0813 11:54:48.343451 6317 net.cpp:577] relu5 <- conv5
I0813 11:54:48.343454 6317 net.cpp:532] relu5 -> conv5 (in-place)
I0813 11:54:48.343458 6317 net.cpp:265] Setting up relu5
I0813 11:54:48.343462 6317 net.cpp:272] TEST Top shape for layer 14 'relu5' 10 256 13 13 (432640)
I0813 11:54:48.343466 6317 layer_factory.hpp:172] Creating layer 'pool5' of type 'Pooling'
I0813 11:54:48.343468 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.343473 6317 net.cpp:205] Created Layer pool5 (15)
I0813 11:54:48.343477 6317 net.cpp:577] pool5 <- conv5
I0813 11:54:48.343480 6317 net.cpp:547] pool5 -> pool5
I0813 11:54:48.343487 6317 net.cpp:265] Setting up pool5
I0813 11:54:48.343490 6317 net.cpp:272] TEST Top shape for layer 15 'pool5' 10 256 6 6 (92160)
I0813 11:54:48.343494 6317 layer_factory.hpp:172] Creating layer 'fc6' of type 'InnerProduct'
I0813 11:54:48.343497 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.343510 6317 net.cpp:205] Created Layer fc6 (16)
I0813 11:54:48.343514 6317 net.cpp:577] fc6 <- pool5
I0813 11:54:48.343520 6317 net.cpp:547] fc6 -> fc6
I0813 11:54:48.573362 6317 net.cpp:265] Setting up fc6
I0813 11:54:48.573391 6317 net.cpp:272] TEST Top shape for layer 16 'fc6' 10 4096 (40960)
I0813 11:54:48.573410 6317 layer_factory.hpp:172] Creating layer 'relu6' of type 'ReLU'
I0813 11:54:48.573415 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.573423 6317 net.cpp:205] Created Layer relu6 (17)
I0813 11:54:48.573428 6317 net.cpp:577] relu6 <- fc6
I0813 11:54:48.573432 6317 net.cpp:532] relu6 -> fc6 (in-place)
I0813 11:54:48.573438 6317 net.cpp:265] Setting up relu6
I0813 11:54:48.573441 6317 net.cpp:272] TEST Top shape for layer 17 'relu6' 10 4096 (40960)
I0813 11:54:48.573444 6317 layer_factory.hpp:172] Creating layer 'drop6' of type 'Dropout'
I0813 11:54:48.573449 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.573455 6317 net.cpp:205] Created Layer drop6 (18)
I0813 11:54:48.573458 6317 net.cpp:577] drop6 <- fc6
I0813 11:54:48.573462 6317 net.cpp:532] drop6 -> fc6 (in-place)
I0813 11:54:48.573467 6317 net.cpp:265] Setting up drop6
I0813 11:54:48.573470 6317 net.cpp:272] TEST Top shape for layer 18 'drop6' 10 4096 (40960)
I0813 11:54:48.573474 6317 layer_factory.hpp:172] Creating layer 'fc7' of type 'InnerProduct'
I0813 11:54:48.573477 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.573483 6317 net.cpp:205] Created Layer fc7 (19)
I0813 11:54:48.573487 6317 net.cpp:577] fc7 <- fc6
I0813 11:54:48.573490 6317 net.cpp:547] fc7 -> fc7
I0813 11:54:48.675756 6317 net.cpp:265] Setting up fc7
I0813 11:54:48.675781 6317 net.cpp:272] TEST Top shape for layer 19 'fc7' 10 4096 (40960)
I0813 11:54:48.675801 6317 layer_factory.hpp:172] Creating layer 'relu7' of type 'ReLU'
I0813 11:54:48.675806 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.675814 6317 net.cpp:205] Created Layer relu7 (20)
I0813 11:54:48.675818 6317 net.cpp:577] relu7 <- fc7
I0813 11:54:48.675822 6317 net.cpp:532] relu7 -> fc7 (in-place)
I0813 11:54:48.675827 6317 net.cpp:265] Setting up relu7
I0813 11:54:48.675829 6317 net.cpp:272] TEST Top shape for layer 20 'relu7' 10 4096 (40960)
I0813 11:54:48.675833 6317 layer_factory.hpp:172] Creating layer 'drop7' of type 'Dropout'
I0813 11:54:48.675837 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.675844 6317 net.cpp:205] Created Layer drop7 (21)
I0813 11:54:48.675848 6317 net.cpp:577] drop7 <- fc7
I0813 11:54:48.675849 6317 net.cpp:532] drop7 -> fc7 (in-place)
I0813 11:54:48.675855 6317 net.cpp:265] Setting up drop7
I0813 11:54:48.675858 6317 net.cpp:272] TEST Top shape for layer 21 'drop7' 10 4096 (40960)
I0813 11:54:48.675861 6317 layer_factory.hpp:172] Creating layer 'fc8' of type 'InnerProduct'
I0813 11:54:48.675864 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.675870 6317 net.cpp:205] Created Layer fc8 (22)
I0813 11:54:48.675887 6317 net.cpp:577] fc8 <- fc7
I0813 11:54:48.675890 6317 net.cpp:547] fc8 -> fc8
I0813 11:54:48.700850 6317 net.cpp:265] Setting up fc8
I0813 11:54:48.700870 6317 net.cpp:272] TEST Top shape for layer 22 'fc8' 10 1000 (10000)
I0813 11:54:48.700888 6317 layer_factory.hpp:172] Creating layer 'prob' of type 'Softmax'
I0813 11:54:48.700894 6317 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0813 11:54:48.700906 6317 net.cpp:205] Created Layer prob (23)
I0813 11:54:48.700911 6317 net.cpp:577] prob <- fc8
I0813 11:54:48.700915 6317 net.cpp:547] prob -> prob
I0813 11:54:48.700930 6317 net.cpp:265] Setting up prob
I0813 11:54:48.700933 6317 net.cpp:272] TEST Top shape for layer 23 'prob' 10 1000 (10000)
I0813 11:54:48.700958 6317 net.cpp:343] prob does not need backward computation.
I0813 11:54:48.700963 6317 net.cpp:343] fc8 does not need backward computation.
I0813 11:54:48.700966 6317 net.cpp:343] drop7 does not need backward computation.
I0813 11:54:48.700969 6317 net.cpp:343] relu7 does not need backward computation.
I0813 11:54:48.700973 6317 net.cpp:343] fc7 does not need backward computation.
I0813 11:54:48.700978 6317 net.cpp:343] drop6 does not need backward computation.
I0813 11:54:48.700981 6317 net.cpp:343] relu6 does not need backward computation.
I0813 11:54:48.700985 6317 net.cpp:343] fc6 does not need backward computation.
I0813 11:54:48.700997 6317 net.cpp:343] pool5 does not need backward computation.
I0813 11:54:48.701000 6317 net.cpp:343] relu5 does not need backward computation.
I0813 11:54:48.701004 6317 net.cpp:343] conv5 does not need backward computation.
I0813 11:54:48.701009 6317 net.cpp:343] relu4 does not need backward computation.
I0813 11:54:48.701012 6317 net.cpp:343] conv4 does not need backward computation.
I0813 11:54:48.701015 6317 net.cpp:343] relu3 does not need backward computation.
I0813 11:54:48.701018 6317 net.cpp:343] conv3 does not need backward computation.
I0813 11:54:48.701023 6317 net.cpp:343] norm2 does not need backward computation.
I0813 11:54:48.701027 6317 net.cpp:343] pool2 does not need backward computation.
I0813 11:54:48.701031 6317 net.cpp:343] relu2 does not need backward computation.
I0813 11:54:48.701035 6317 net.cpp:343] conv2 does not need backward computation.
I0813 11:54:48.701038 6317 net.cpp:343] norm1 does not need backward computation.
I0813 11:54:48.701042 6317 net.cpp:343] pool1 does not need backward computation.
I0813 11:54:48.701046 6317 net.cpp:343] relu1 does not need backward computation.
I0813 11:54:48.701050 6317 net.cpp:343] conv1 does not need backward computation.
I0813 11:54:48.701054 6317 net.cpp:343] data does not need backward computation.
I0813 11:54:48.701057 6317 net.cpp:385] This network produces output prob
I0813 11:54:48.701071 6317 net.cpp:408] Top memory (TEST) required for data: 68681400 diff: 68681400
I0813 11:54:48.701076 6317 net.cpp:411] Bottom memory (TEST) required for data: 68641400 diff: 68641400
I0813 11:54:48.701077 6317 net.cpp:414] Shared (in-place) memory (TEST) by data: 26658560 diff: 26658560
I0813 11:54:48.701081 6317 net.cpp:417] Parameters memory (TEST) required for data: 243860896 diff: 243860896
I0813 11:54:48.701084 6317 net.cpp:420] Parameters shared memory (TEST) by data: 0 diff: 0
I0813 11:54:48.701086 6317 net.cpp:426] Network initialization done.
I0813 11:54:48.774591 6317 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0813 11:54:48.774614 6317 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W0813 11:54:48.774619 6317 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0813 11:54:48.774633 6317 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0813 11:54:48.893983 6317 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I0813 11:54:48.897544 6317 net.cpp:1143] Copying source layer data Type:Data #blobs=0
I0813 11:54:48.897552 6317 net.cpp:1143] Copying source layer conv1 Type:Convolution #blobs=2
I0813 11:54:48.897891 6317 net.cpp:1143] Copying source layer relu1 Type:ReLU #blobs=0
I0813 11:54:48.897895 6317 net.cpp:1143] Copying source layer pool1 Type:Pooling #blobs=0
I0813 11:54:48.897898 6317 net.cpp:1143] Copying source layer norm1 Type:LRN #blobs=0
I0813 11:54:48.897902 6317 net.cpp:1143] Copying source layer conv2 Type:Convolution #blobs=2
I0813 11:54:48.900985 6317 net.cpp:1143] Copying source layer relu2 Type:ReLU #blobs=0
I0813 11:54:48.901010 6317 net.cpp:1143] Copying source layer pool2 Type:Pooling #blobs=0
I0813 11:54:48.901012 6317 net.cpp:1143] Copying source layer norm2 Type:LRN #blobs=0
I0813 11:54:48.901015 6317 net.cpp:1143] Copying source layer conv3 Type:Convolution #blobs=2
I0813 11:54:48.909281 6317 net.cpp:1143] Copying source layer relu3 Type:ReLU #blobs=0
I0813 11:54:48.909286 6317 net.cpp:1143] Copying source layer conv4 Type:Convolution #blobs=2
I0813 11:54:48.915896 6317 net.cpp:1143] Copying source layer relu4 Type:ReLU #blobs=0
I0813 11:54:48.915901 6317 net.cpp:1143] Copying source layer conv5 Type:Convolution #blobs=2
I0813 11:54:48.920101 6317 net.cpp:1143] Copying source layer relu5 Type:ReLU #blobs=0
I0813 11:54:48.920106 6317 net.cpp:1143] Copying source layer pool5 Type:Pooling #blobs=0
I0813 11:54:48.920109 6317 net.cpp:1143] Copying source layer fc6 Type:InnerProduct #blobs=2
I0813 11:54:49.272478 6317 net.cpp:1143] Copying source layer relu6 Type:ReLU #blobs=0
I0813 11:54:49.272502 6317 net.cpp:1143] Copying source layer drop6 Type:Dropout #blobs=0
I0813 11:54:49.272505 6317 net.cpp:1143] Copying source layer fc7 Type:InnerProduct #blobs=2
I0813 11:54:49.431002 6317 net.cpp:1143] Copying source layer relu7 Type:ReLU #blobs=0
I0813 11:54:49.431025 6317 net.cpp:1143] Copying source layer drop7 Type:Dropout #blobs=0
I0813 11:54:49.431027 6317 net.cpp:1143] Copying source layer fc8 Type:InnerProduct #blobs=2
I0813 11:54:49.468545 6317 net.cpp:1135] Ignoring source layer loss
(10, 3, 227, 227)
Loading file: ../101_ObjectCategories/panda/image_0008.jpg
Classifying 1 inputs.
Done in 0.56 s.
Saving results into ../result.npy
chibi@1604:~/caffe$ python show_result.py data/ilsvrc12/synset_words.txt result.npy
#1 | n02510455 giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca | 99.1%
#2 | n02488702 colobus, colobus monkey | 0.2%
#3 | n02500267 indri, indris, Indri indri, Indri brevicaudatus | 0.1%
chibi@1604:~/caffe$
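The DEPRECATION WARNING near the top of the log comes from classify.py building the network through the old Python interface; the log itself recommends the Net(..., weights=...) constructor. The sketch below shows roughly what the same CPU-mode classification looks like when written against that recommended form. It is a hedged illustration only: the mean-file path, the single-image batch size, and the preprocessing choices are assumptions, not an excerpt from classify.py.

# Hedged sketch: classify one image with the non-deprecated
# Net(..., weights=...) constructor that the deprecation warning recommends.
# Paths and preprocessing details are assumptions, not taken from classify.py.
import numpy as np
import caffe

caffe.set_mode_cpu()

net = caffe.Net('models/bvlc_reference_caffenet/deploy.prototxt',
                caffe.TEST,
                weights='models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')

# Preprocessing equivalent to "--raw_scale 255": load the image as [0,1] RGB,
# transpose to CxHxW, rescale to [0,255], and swap channels to BGR.
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))
transformer.set_raw_scale('data', 255)
transformer.set_channel_swap('data', (2, 1, 0))
# classify.py also subtracts the ImageNet mean by default; assumed file path below.
mu = np.load('python/caffe/imagenet/ilsvrc_2012_mean.npy').mean(1).mean(2)
transformer.set_mean('data', mu)

image = caffe.io.load_image('101_ObjectCategories/panda/image_0008.jpg')
net.blobs['data'].reshape(1, 3, 227, 227)   # one image instead of the 10-crop batch in deploy.prototxt
net.blobs['data'].data[...] = transformer.preprocess('data', image)

prob = net.forward()['prob'][0]             # 1000-way softmax output of layer 'prob'
np.save('result.npy', prob)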
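show_result.py itself is not shown in the transcript, but its output format (rank, WordNet synset ID with labels, probability) suggests it simply maps the probabilities saved in result.npy onto the lines of data/ilsvrc12/synset_words.txt. The following minimal sketch reproduces that behaviour; treat it as a hypothetical reconstruction, not the actual script.

# Hypothetical reconstruction of show_result.py: print the top-3 ImageNet
# classes for the class probabilities stored in result.npy.
import sys
import numpy as np

synset_file, result_file = sys.argv[1], sys.argv[2]   # e.g. data/ilsvrc12/synset_words.txt result.npy

with open(synset_file) as f:
    labels = [line.strip() for line in f]              # e.g. "n02510455 giant panda, panda, ..."

probs = np.load(result_file).flatten()                 # one row of 1000 class probabilities
top = probs.argsort()[::-1][:3]                        # indices of the 3 highest-scoring classes

for rank, idx in enumerate(top, start=1):
    print('#%d | %s | %.1f%%' % (rank, labels[idx], 100.0 * probs[idx]))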