chibi@2204:~/caffe$ cd python; python classify.py --raw_scale 255 ../101_ObjectCategories/cougar_face/image_0003.jpg ../result.npy; cd ..
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0215 05:49:07.636334 21885 gpu_memory.cpp:82] GPUMemory::Manager initialized
CPU mode
W0215 05:49:07.779239 21885 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W0215 05:49:07.779465 21885 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W0215 05:49:07.779487 21885 _caffe.cpp:175] Net('/home/chibi/caffe/python/../models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0215 05:49:07.779786 21885 net.cpp:86] Initializing net from parameters:
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I0215 05:49:07.781092 21885 net.cpp:116] Using FLOAT as default forward math type
I0215 05:49:07.781116 21885 net.cpp:122] Using FLOAT as default backward math type
I0215 05:49:07.781129 21885 layer_factory.hpp:172] Creating layer 'data' of type 'Input'
I0215 05:49:07.781141 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.781174 21885 net.cpp:205] Created Layer data (0)
I0215 05:49:07.781193 21885 net.cpp:547] data -> data
I0215 05:49:07.781225 21885 net.cpp:265] Setting up data
I0215 05:49:07.781240 21885 net.cpp:272] TEST Top shape for layer 0 'data' 10 3 227 227 (1545870)
I0215 05:49:07.781263 21885 layer_factory.hpp:172] Creating layer 'conv1' of type 'Convolution'
I0215 05:49:07.781275 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.781327 21885 net.cpp:205] Created Layer conv1 (1)
I0215 05:49:07.781342 21885 net.cpp:577] conv1 <- data
I0215 05:49:07.781356 21885 net.cpp:547] conv1 -> conv1
I0215 05:49:07.781752 21885 net.cpp:265] Setting up conv1
I0215 05:49:07.781770 21885 net.cpp:272] TEST Top shape for layer 1 'conv1' 10 96 55 55 (2904000)
I0215 05:49:07.781802 21885 layer_factory.hpp:172] Creating layer 'relu1' of type 'ReLU'
I0215 05:49:07.781816 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.781832 21885 net.cpp:205] Created Layer relu1 (2)
I0215 05:49:07.781844 21885 net.cpp:577] relu1 <- conv1
I0215 05:49:07.781857 21885 net.cpp:532] relu1 -> conv1 (in-place)
I0215 05:49:07.781872 21885 net.cpp:265] Setting up relu1
I0215 05:49:07.781884 21885 net.cpp:272] TEST Top shape for layer 2 'relu1' 10 96 55 55 (2904000)
I0215 05:49:07.781898 21885 layer_factory.hpp:172] Creating layer 'pool1' of type 'Pooling'
I0215 05:49:07.781910 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.781924 21885 net.cpp:205] Created Layer pool1 (3)
I0215 05:49:07.781936 21885 net.cpp:577] pool1 <- conv1
I0215 05:49:07.781951 21885 net.cpp:547] pool1 -> pool1
I0215 05:49:07.781973 21885 net.cpp:265] Setting up pool1
I0215 05:49:07.781987 21885 net.cpp:272] TEST Top shape for layer 3 'pool1' 10 96 27 27 (699840)
I0215 05:49:07.782001 21885 layer_factory.hpp:172] Creating layer 'norm1' of type 'LRN'
I0215 05:49:07.782014 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.782032 21885 net.cpp:205] Created Layer norm1 (4)
I0215 05:49:07.782044 21885 net.cpp:577] norm1 <- pool1
I0215 05:49:07.782056 21885 net.cpp:547] norm1 -> norm1
I0215 05:49:07.782070 21885 net.cpp:265] Setting up norm1
I0215 05:49:07.782083 21885 net.cpp:272] TEST Top shape for layer 4 'norm1' 10 96 27 27 (699840)
I0215 05:49:07.782095 21885 layer_factory.hpp:172] Creating layer 'conv2' of type 'Convolution'
I0215 05:49:07.782109 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.782131 21885 net.cpp:205] Created Layer conv2 (5)
I0215 05:49:07.782145 21885 net.cpp:577] conv2 <- norm1
I0215 05:49:07.782158 21885 net.cpp:547] conv2 -> conv2
I0215 05:49:07.785022 21885 net.cpp:265] Setting up conv2
I0215 05:49:07.785055 21885 net.cpp:272] TEST Top shape for layer 5 'conv2' 10 256 27 27 (1866240)
I0215 05:49:07.785077 21885 layer_factory.hpp:172] Creating layer 'relu2' of type 'ReLU'
I0215 05:49:07.785090 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.785105 21885 net.cpp:205] Created Layer relu2 (6)
I0215 05:49:07.785117 21885 net.cpp:577] relu2 <- conv2
I0215 05:49:07.785130 21885 net.cpp:532] relu2 -> conv2 (in-place)
I0215 05:49:07.785146 21885 net.cpp:265] Setting up relu2
I0215 05:49:07.785159 21885 net.cpp:272] TEST Top shape for layer 6 'relu2' 10 256 27 27 (1866240)
I0215 05:49:07.785176 21885 layer_factory.hpp:172] Creating layer 'pool2' of type 'Pooling'
I0215 05:49:07.785187 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.785202 21885 net.cpp:205] Created Layer pool2 (7)
I0215 05:49:07.785215 21885 net.cpp:577] pool2 <- conv2
I0215 05:49:07.785228 21885 net.cpp:547] pool2 -> pool2
I0215 05:49:07.785241 21885 net.cpp:265] Setting up pool2
I0215 05:49:07.785254 21885 net.cpp:272] TEST Top shape for layer 7 'pool2' 10 256 13 13 (432640)
I0215 05:49:07.785266 21885 layer_factory.hpp:172] Creating layer 'norm2' of type 'LRN'
I0215 05:49:07.785279 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.785300 21885 net.cpp:205] Created Layer norm2 (8)
I0215 05:49:07.785312 21885 net.cpp:577] norm2 <- pool2
I0215 05:49:07.785327 21885 net.cpp:547] norm2 -> norm2
I0215 05:49:07.785367 21885 net.cpp:265] Setting up norm2
I0215 05:49:07.785378 21885 net.cpp:272] TEST Top shape for layer 8 'norm2' 10 256 13 13 (432640)
I0215 05:49:07.785391 21885 layer_factory.hpp:172] Creating layer 'conv3' of type 'Convolution'
I0215 05:49:07.785405 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.785424 21885 net.cpp:205] Created Layer conv3 (9)
I0215 05:49:07.785436 21885 net.cpp:577] conv3 <- norm2
I0215 05:49:07.785450 21885 net.cpp:547] conv3 -> conv3
I0215 05:49:07.793702 21885 net.cpp:265] Setting up conv3
I0215 05:49:07.793742 21885 net.cpp:272] TEST Top shape for layer 9 'conv3' 10 384 13 13 (648960)
I0215 05:49:07.793767 21885 layer_factory.hpp:172] Creating layer 'relu3' of type 'ReLU'
I0215 05:49:07.793779 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.793797 21885 net.cpp:205] Created Layer relu3 (10)
I0215 05:49:07.793808 21885 net.cpp:577] relu3 <- conv3
I0215 05:49:07.793821 21885 net.cpp:532] relu3 -> conv3 (in-place)
I0215 05:49:07.793834 21885 net.cpp:265] Setting up relu3
I0215 05:49:07.793846 21885 net.cpp:272] TEST Top shape for layer 10 'relu3' 10 384 13 13 (648960)
I0215 05:49:07.793859 21885 layer_factory.hpp:172] Creating layer 'conv4' of type 'Convolution'
I0215 05:49:07.793874 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.793889 21885 net.cpp:205] Created Layer conv4 (11)
I0215 05:49:07.793901 21885 net.cpp:577] conv4 <- conv3
I0215 05:49:07.793912 21885 net.cpp:547] conv4 -> conv4
I0215 05:49:07.800000 21885 net.cpp:265] Setting up conv4
I0215 05:49:07.800041 21885 net.cpp:272] TEST Top shape for layer 11 'conv4' 10 384 13 13 (648960)
I0215 05:49:07.800061 21885 layer_factory.hpp:172] Creating layer 'relu4' of type 'ReLU'
I0215 05:49:07.800074 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.800091 21885 net.cpp:205] Created Layer relu4 (12)
I0215 05:49:07.800103 21885 net.cpp:577] relu4 <- conv4
I0215 05:49:07.800117 21885 net.cpp:532] relu4 -> conv4 (in-place)
I0215 05:49:07.800129 21885 net.cpp:265] Setting up relu4
I0215 05:49:07.800141 21885 net.cpp:272] TEST Top shape for layer 12 'relu4' 10 384 13 13 (648960)
I0215 05:49:07.800153 21885 layer_factory.hpp:172] Creating layer 'conv5' of type 'Convolution'
I0215 05:49:07.800179 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.800204 21885 net.cpp:205] Created Layer conv5 (13)
I0215 05:49:07.800218 21885 net.cpp:577] conv5 <- conv4
I0215 05:49:07.800231 21885 net.cpp:547] conv5 -> conv5
I0215 05:49:07.804414 21885 net.cpp:265] Setting up conv5
I0215 05:49:07.804447 21885 net.cpp:272] TEST Top shape for layer 13 'conv5' 10 256 13 13 (432640)
I0215 05:49:07.804469 21885 layer_factory.hpp:172] Creating layer 'relu5' of type 'ReLU'
I0215 05:49:07.804481 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.804497 21885 net.cpp:205] Created Layer relu5 (14)
I0215 05:49:07.804510 21885 net.cpp:577] relu5 <- conv5
I0215 05:49:07.804526 21885 net.cpp:532] relu5 -> conv5 (in-place)
I0215 05:49:07.804543 21885 net.cpp:265] Setting up relu5
I0215 05:49:07.804554 21885 net.cpp:272] TEST Top shape for layer 14 'relu5' 10 256 13 13 (432640)
I0215 05:49:07.804567 21885 layer_factory.hpp:172] Creating layer 'pool5' of type 'Pooling'
I0215 05:49:07.804579 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.804594 21885 net.cpp:205] Created Layer pool5 (15)
I0215 05:49:07.804607 21885 net.cpp:577] pool5 <- conv5
I0215 05:49:07.804620 21885 net.cpp:547] pool5 -> pool5
I0215 05:49:07.804634 21885 net.cpp:265] Setting up pool5
I0215 05:49:07.804646 21885 net.cpp:272] TEST Top shape for layer 15 'pool5' 10 256 6 6 (92160)
I0215 05:49:07.804658 21885 layer_factory.hpp:172] Creating layer 'fc6' of type 'InnerProduct'
I0215 05:49:07.804672 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:07.804718 21885 net.cpp:205] Created Layer fc6 (16)
I0215 05:49:07.804731 21885 net.cpp:577] fc6 <- pool5
I0215 05:49:07.804744 21885 net.cpp:547] fc6 -> fc6
I0215 05:49:08.195231 21885 net.cpp:265] Setting up fc6
I0215 05:49:08.197454 21885 net.cpp:272] TEST Top shape for layer 16 'fc6' 10 4096 (40960)
I0215 05:49:08.197484 21885 layer_factory.hpp:172] Creating layer 'relu6' of type 'ReLU'
I0215 05:49:08.197496 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.197526 21885 net.cpp:205] Created Layer relu6 (17)
I0215 05:49:08.197542 21885 net.cpp:577] relu6 <- fc6
I0215 05:49:08.197556 21885 net.cpp:532] relu6 -> fc6 (in-place)
I0215 05:49:08.197571 21885 net.cpp:265] Setting up relu6
I0215 05:49:08.197582 21885 net.cpp:272] TEST Top shape for layer 17 'relu6' 10 4096 (40960)
I0215 05:49:08.197594 21885 layer_factory.hpp:172] Creating layer 'drop6' of type 'Dropout'
I0215 05:49:08.197608 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.197624 21885 net.cpp:205] Created Layer drop6 (18)
I0215 05:49:08.197636 21885 net.cpp:577] drop6 <- fc6
I0215 05:49:08.197649 21885 net.cpp:532] drop6 -> fc6 (in-place)
I0215 05:49:08.197664 21885 net.cpp:265] Setting up drop6
I0215 05:49:08.197676 21885 net.cpp:272] TEST Top shape for layer 18 'drop6' 10 4096 (40960)
I0215 05:49:08.197690 21885 layer_factory.hpp:172] Creating layer 'fc7' of type 'InnerProduct'
I0215 05:49:08.197701 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.197714 21885 net.cpp:205] Created Layer fc7 (19)
I0215 05:49:08.197727 21885 net.cpp:577] fc7 <- fc6
I0215 05:49:08.197739 21885 net.cpp:547] fc7 -> fc7
I0215 05:49:08.391813 21885 net.cpp:265] Setting up fc7
I0215 05:49:08.394188 21885 net.cpp:272] TEST Top shape for layer 19 'fc7' 10 4096 (40960)
I0215 05:49:08.394220 21885 layer_factory.hpp:172] Creating layer 'relu7' of type 'ReLU'
I0215 05:49:08.394234 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.394256 21885 net.cpp:205] Created Layer relu7 (20)
I0215 05:49:08.394269 21885 net.cpp:577] relu7 <- fc7
I0215 05:49:08.394282 21885 net.cpp:532] relu7 -> fc7 (in-place)
I0215 05:49:08.394296 21885 net.cpp:265] Setting up relu7
I0215 05:49:08.394308 21885 net.cpp:272] TEST Top shape for layer 20 'relu7' 10 4096 (40960)
I0215 05:49:08.394320 21885 layer_factory.hpp:172] Creating layer 'drop7' of type 'Dropout'
I0215 05:49:08.394331 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.394345 21885 net.cpp:205] Created Layer drop7 (21)
I0215 05:49:08.394357 21885 net.cpp:577] drop7 <- fc7
I0215 05:49:08.394368 21885 net.cpp:532] drop7 -> fc7 (in-place)
I0215 05:49:08.394383 21885 net.cpp:265] Setting up drop7
I0215 05:49:08.394397 21885 net.cpp:272] TEST Top shape for layer 21 'drop7' 10 4096 (40960)
I0215 05:49:08.394409 21885 layer_factory.hpp:172] Creating layer 'fc8' of type 'InnerProduct'
I0215 05:49:08.394423 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.394438 21885 net.cpp:205] Created Layer fc8 (22)
I0215 05:49:08.394449 21885 net.cpp:577] fc8 <- fc7
I0215 05:49:08.394462 21885 net.cpp:547] fc8 -> fc8
I0215 05:49:08.466801 21885 net.cpp:265] Setting up fc8
I0215 05:49:08.466835 21885 net.cpp:272] TEST Top shape for layer 22 'fc8' 10 1000 (10000)
I0215 05:49:08.466853 21885 layer_factory.hpp:172] Creating layer 'prob' of type 'Softmax'
I0215 05:49:08.466861 21885 layer_factory.hpp:184] Layer's types are Ftype:FLOAT Btype:FLOAT Fmath:FLOAT Bmath:FLOAT
I0215 05:49:08.466881 21885 net.cpp:205] Created Layer prob (23)
I0215 05:49:08.466890 21885 net.cpp:577] prob <- fc8
I0215 05:49:08.466899 21885 net.cpp:547] prob -> prob
I0215 05:49:08.466925 21885 net.cpp:265] Setting up prob
I0215 05:49:08.466933 21885 net.cpp:272] TEST Top shape for layer 23 'prob' 10 1000 (10000)
I0215 05:49:08.466975 21885 net.cpp:343] prob does not need backward computation.
I0215 05:49:08.466985 21885 net.cpp:343] fc8 does not need backward computation.
I0215 05:49:08.467003 21885 net.cpp:343] drop7 does not need backward computation.
I0215 05:49:08.467010 21885 net.cpp:343] relu7 does not need backward computation.
I0215 05:49:08.467017 21885 net.cpp:343] fc7 does not need backward computation.
I0215 05:49:08.467024 21885 net.cpp:343] drop6 does not need backward computation.
I0215 05:49:08.467031 21885 net.cpp:343] relu6 does not need backward computation.
I0215 05:49:08.467037 21885 net.cpp:343] fc6 does not need backward computation.
I0215 05:49:08.467051 21885 net.cpp:343] pool5 does not need backward computation.
I0215 05:49:08.467057 21885 net.cpp:343] relu5 does not need backward computation.
I0215 05:49:08.467064 21885 net.cpp:343] conv5 does not need backward computation.
I0215 05:49:08.467072 21885 net.cpp:343] relu4 does not need backward computation.
I0215 05:49:08.467077 21885 net.cpp:343] conv4 does not need backward computation.
I0215 05:49:08.467084 21885 net.cpp:343] relu3 does not need backward computation.
I0215 05:49:08.467108 21885 net.cpp:343] conv3 does not need backward computation.
I0215 05:49:08.467118 21885 net.cpp:343] norm2 does not need backward computation.
I0215 05:49:08.467126 21885 net.cpp:343] pool2 does not need backward computation.
I0215 05:49:08.467133 21885 net.cpp:343] relu2 does not need backward computation.
I0215 05:49:08.467140 21885 net.cpp:343] conv2 does not need backward computation.
I0215 05:49:08.467155 21885 net.cpp:343] norm1 does not need backward computation.
I0215 05:49:08.467167 21885 net.cpp:343] pool1 does not need backward computation.
I0215 05:49:08.467175 21885 net.cpp:343] relu1 does not need backward computation.
I0215 05:49:08.467181 21885 net.cpp:343] conv1 does not need backward computation.
I0215 05:49:08.467188 21885 net.cpp:343] data does not need backward computation.
I0215 05:49:08.467195 21885 net.cpp:385] This network produces output prob
I0215 05:49:08.467231 21885 net.cpp:408] Top memory (TEST) required for data: 68681400 diff: 68681400
I0215 05:49:08.467240 21885 net.cpp:411] Bottom memory (TEST) required for data: 68641400 diff: 68641400
I0215 05:49:08.467247 21885 net.cpp:414] Shared (in-place) memory (TEST) by data: 26658560 diff: 26658560
I0215 05:49:08.467270 21885 net.cpp:417] Parameters memory (TEST) required for data: 243860896 diff: 42272
I0215 05:49:08.467278 21885 net.cpp:420] Parameters shared memory (TEST) by data: 0 diff: 0
I0215 05:49:08.467285 21885 net.cpp:426] Network initialization done.
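The "Parameters memory ... 243860896" figure logged above can be reproduced from the layer definitions: it is the total weight-plus-bias count of the learned layers times 4 bytes, since the net runs with FLOAT math. A sketch of the count (grouped convolutions with `group: 2` see only half the input channels):

```python
def conv_params(n_out, n_in, k, group=1):
    # weights: n_out * (n_in / group) * k * k, plus one bias per output channel
    return n_out * (n_in // group) * k * k + n_out

def fc_params(n_out, n_in):
    # InnerProduct: full weight matrix plus biases
    return n_out * n_in + n_out

total = (conv_params(96, 3, 11)
         + conv_params(256, 96, 5, group=2)
         + conv_params(384, 256, 3)
         + conv_params(384, 384, 3, group=2)
         + conv_params(256, 384, 3, group=2)
         + fc_params(4096, 256 * 6 * 6)   # fc6 reads the 256x6x6 pool5 blob
         + fc_params(4096, 4096)
         + fc_params(1000, 4096))

print(total)       # 60965224 parameters
print(total * 4)   # 243860896 bytes, matching the logged figure
```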
I0215 05:49:09.089401 21885 upgrade_proto.cpp:43] Attempting to upgrade input file specified using deprecated transformation parameters: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0215 05:49:09.089432 21885 upgrade_proto.cpp:46] Successfully upgraded file specified using deprecated data transformation parameters.
W0215 05:49:09.089443 21885 upgrade_proto.cpp:48] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0215 05:49:09.089466 21885 upgrade_proto.cpp:52] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/chibi/caffe/python/../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0215 05:49:09.589708 21885 upgrade_proto.cpp:60] Successfully upgraded file specified using deprecated V1LayerParameter
I0215 05:49:09.613269 21885 net.cpp:1143] Copying source layer data Type:Data #blobs=0
I0215 05:49:09.613298 21885 net.cpp:1143] Copying source layer conv1 Type:Convolution #blobs=2
I0215 05:49:09.613582 21885 net.cpp:1143] Copying source layer relu1 Type:ReLU #blobs=0
I0215 05:49:09.613592 21885 net.cpp:1143] Copying source layer pool1 Type:Pooling #blobs=0
I0215 05:49:09.613600 21885 net.cpp:1143] Copying source layer norm1 Type:LRN #blobs=0
I0215 05:49:09.613607 21885 net.cpp:1143] Copying source layer conv2 Type:Convolution #blobs=2
I0215 05:49:09.615955 21885 net.cpp:1143] Copying source layer relu2 Type:ReLU #blobs=0
I0215 05:49:09.615968 21885 net.cpp:1143] Copying source layer pool2 Type:Pooling #blobs=0
I0215 05:49:09.615978 21885 net.cpp:1143] Copying source layer norm2 Type:LRN #blobs=0
I0215 05:49:09.615983 21885 net.cpp:1143] Copying source layer conv3 Type:Convolution #blobs=2
I0215 05:49:09.631708 21885 net.cpp:1143] Copying source layer relu3 Type:ReLU #blobs=0
I0215 05:49:09.631732 21885 net.cpp:1143] Copying source layer conv4 Type:Convolution #blobs=2
I0215 05:49:09.638783 21885 net.cpp:1143] Copying source layer relu4 Type:ReLU #blobs=0
I0215 05:49:09.642316 21885 net.cpp:1143] Copying source layer conv5 Type:Convolution #blobs=2
I0215 05:49:09.646821 21885 net.cpp:1143] Copying source layer relu5 Type:ReLU #blobs=0
I0215 05:49:09.646855 21885 net.cpp:1143] Copying source layer pool5 Type:Pooling #blobs=0
I0215 05:49:09.646867 21885 net.cpp:1143] Copying source layer fc6 Type:InnerProduct #blobs=2
I0215 05:49:10.045526 21885 net.cpp:1143] Copying source layer relu6 Type:ReLU #blobs=0
I0215 05:49:10.045575 21885 net.cpp:1143] Copying source layer drop6 Type:Dropout #blobs=0
I0215 05:49:10.045588 21885 net.cpp:1143] Copying source layer fc7 Type:InnerProduct #blobs=2
I0215 05:49:10.229210 21885 net.cpp:1143] Copying source layer relu7 Type:ReLU #blobs=0
I0215 05:49:10.229346 21885 net.cpp:1143] Copying source layer drop7 Type:Dropout #blobs=0
I0215 05:49:10.229403 21885 net.cpp:1143] Copying source layer fc8 Type:InnerProduct #blobs=2
I0215 05:49:10.260283 21885 net.cpp:1135] Ignoring source layer loss
(10, 3, 227, 227)
Loading file: ../101_ObjectCategories/cougar_face/image_0003.jpg
Classifying 1 inputs.
Done in 1.03 s.
Saving results into ../result.npy
chibi@2204:~/caffe$ python show_result.py data/ilsvrc12/synset_words.txt result.npy
#1 | n02125311 cougar, puma, catamount, mountain lion, painter, panther, Felis concolor | 97.9%
#2 | n01877812 wallaby, brush kangaroo |  0.7%
#3 | n02124075 Egyptian cat |  0.3%
chibi@2204:~/caffe$
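The final step above (`show_result.py`, a local script not shown in this transcript) maps the saved probability vector to the top synsets. A self-contained sketch of that ranking step, presumably what such a script does after `np.load("result.npy")` and reading `synset_words.txt`; the labels and probabilities here are synthetic stand-ins, not the real 1000-class data:

```python
def top_k(probs, labels, k=3):
    # Pair each label with its probability and rank by descending probability.
    ranked = sorted(zip(labels, probs), key=lambda t: t[1], reverse=True)
    return ranked[:k]

# Hypothetical stand-in for the 1000-entry synset list and result.npy row.
labels = ["Egyptian cat", "dog", "cougar, puma", "wallaby, brush kangaroo"]
probs = [0.003, 0.001, 0.979, 0.007]

for rank, (name, p) in enumerate(top_k(probs, labels), 1):
    print(f"#{rank} | {name} | {p:.1%}")
```

Run on the synthetic data, this prints the cougar class first at 97.9%, mirroring the transcript's output format.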