Reprinted from: http://blog.csdn.net/kkk584520/article/details/52721838 (blog: http://blog.csdn.net/kkk584520). The blog's content is based on the new book《深度学习:21天实战Caffe》(Deep Learning: 21 Days of Hands-On Caffe); readers are welcome to discuss the book's exercise answers in the comments. Now, on to the article.

When working with Caffe, a need like this comes up all the time: an existing Layer does not fit my use case; I need such-and-such a feature that the stock code does not implement, or it is implemented but too slow and I have a better version.

Scheme 1: the quick-and-dirty fix — swap it out in place

If you are unhappy with the ConvolutionLayer implementation, just edit the two files directly: $CAFFE_ROOT/include/caffe/layers/conv_layer.hpp and $CAFFE_ROOT/src/caffe/layers/conv_layer.cpp (or conv_layer.cu), replacing the im2col + GEMM pipeline with your own implementation, for example one based on the Winograd algorithm.
Pros: fast iteration; no deep knowledge of the Caffe framework required. Crude, but quick and effective.

Cons: the code is hard to maintain and can never be merged into the caffe master branch, and it confuses everyone else who uses the code — the effect is roughly that of #define TRUE false.

Scheme 2: a slightly gentler fix — different faces for different users
Similar to Scheme 1, except that a preprocessor macro decides which implementation gets compiled. For example, keep the default ConvolutionLayer implementation and add a section like this to the code:

```cpp
#ifdef SWITCH_MY_IMPLEMENTATION
// your implementation
#else
// default implementation
#endif
```

Code that wants to use the Layer with your implementation then adds the macro definition:

```cpp
#define SWITCH_MY_IMPLEMENTATION
```

while code that leaves the macro undefined still gets the stock implementation.

Pros: you can switch flexibly between the old and new implementations.
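To see the mechanism in isolation, here is a minimal standalone sketch of the same pattern (a hypothetical toggle_demo.cpp, not part of Caffe); a -D flag at build time decides which branch is compiled in:

```cpp
// toggle_demo.cpp -- standalone illustration of a compile-time implementation switch.
// Build the custom variant with:  g++ -DSWITCH_MY_IMPLEMENTATION toggle_demo.cpp -o demo
// Build the default variant with: g++ toggle_demo.cpp -o demo
#include <cstdio>

void convolve() {
#ifdef SWITCH_MY_IMPLEMENTATION
  // stand-in for your replacement (e.g. Winograd-based) convolution
  std::puts("custom convolution");
#else
  // stand-in for the stock im2col + GEMM convolution
  std::puts("default convolution");
#endif
}

int main() {
  convolve();
  return 0;
}
```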
Cons: every switch between implementations requires a recompile.

Scheme 3: the elegant way — a long and winding road

Suppose the same functionality has several Layer implementations and you want to switch between them flexibly, without recompiling. How can that be done?

This is where the ProtoBuffer machinery becomes indispensable.
First, split your implementation into a declaration part and a definition part, just like a normal Layer class, placed in a .hpp and in a .cpp/.cu respectively. Give the Layer a new name that distinguishes it from the stock implementation. Put the .hpp under $CAFFE_ROOT/include/caffe/layers/ and the .cpp and .cu under $CAFFE_ROOT/src/caffe/layers/; that way, running make in $CAFFE_ROOT automatically adds these files to the build, sparing you the chore of adjusting compiler options by hand.
Second, add a new LayerParameter entry in $CAFFE_ROOT/src/caffe/proto/caffe.proto, so that when you write train.prototxt, test.prototxt, or deploy.prototxt you can include a description of the new Layer. This makes it easy to modify the network structure and to substitute other Layers with the same function.
Last — and this is the step most easily overlooked — register the new Layer's creator function with the Layer factory. Otherwise you will hit an error like this at runtime:

```
F1002 01:51:22.656038 1954701312 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: AllPass (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, ImageData, InfogainLoss, InnerProduct, Input, LRN, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Pooling, Power, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile, WindowData)
*** Check failure stack trace: ***
    0x10243154e  google::LogMessage::Fail()
    0x102430c53  google::LogMessage::SendToLog()
    0x1024311a9  google::LogMessage::Flush()
    0x1024344d7  google::LogMessageFatal::~LogMessageFatal()
    0x10243183b  google::LogMessageFatal::~LogMessageFatal()
    0x102215356  caffe::LayerRegistry::CreateLayer()
    0x102233ccf  caffe::Net::Init()
    0x102235996  caffe::Net::Net()
    0x102118d8b  time()
    0x102119c9a  main
    0x7fff851285ad  start
    0x4  (unknown)
Abort trap: 6
```

Below, a real example walks through the whole Scheme 3 workflow.
Here we implement a new Layer named AllPassLayer. As the name suggests, it is an "all-pass" Layer: the term is borrowed from the all-pass filter in signal processing, which passes a signal from input to output without distortion.
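In formulas (my notation; the post states this only in words): the forward pass copies the input to the output, and by the chain rule the backward pass copies the gradient unchanged:

$$ y_i = x_i, \qquad \frac{\partial L}{\partial x_i} = \frac{\partial L}{\partial y_i} \cdot \frac{\partial y_i}{\partial x_i} = \frac{\partial L}{\partial y_i}. $$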
Admittedly this Layer is of no practical use by itself, but building your own processing on top of it is very easy. The choice is also deliberate for the experiment: the all-pass layer's Forward/Backward functions are so simple that the reader needs no background in calculus or differentiation, and the layer can be inserted anywhere in an existing network without affecting training or prediction accuracy.

First, the header file:

```cpp
#ifndef CAFFE_ALL_PASS_LAYER_HPP_
#define CAFFE_ALL_PASS_LAYER_HPP_

#include <vector>

#include "caffe/blob.hpp"
#include "caffe/layer.hpp"
#include "caffe/proto/caffe.pb.h"
#include "caffe/layers/neuron_layer.hpp"

namespace caffe {

template <typename Dtype>
class AllPassLayer : public NeuronLayer<Dtype> {
 public:
  explicit AllPassLayer(const LayerParameter& param)
      : NeuronLayer<Dtype>(param) {}
  virtual inline const char* type() const { return "AllPass"; }

 protected:
  virtual void Forward_cpu(const vector<Blob<Dtype>*>& bottom,
      const vector<Blob<Dtype>*>& top);
  virtual void Forward_gpu(const vector<Blob<Dtype>*>& bottom,
      const vector<Blob<Dtype>*>& top);
  virtual void Backward_cpu(const vector<Blob<Dtype>*>& top,
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
  virtual void Backward_gpu(const vector<Blob<Dtype>*>& top,
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
};

}  // namespace caffe

#endif  // CAFFE_ALL_PASS_LAYER_HPP_
```

Then the source file:

```cpp
#include <algorithm>
#include <vector>
#include <iostream>

#include "caffe/layers/all_pass_layer.hpp"

using namespace std;
#define DEBUG_AP(str) cout << str << endl

namespace caffe {

template <typename Dtype>
void AllPassLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const Dtype* bottom_data = bottom[0]->cpu_data();
  Dtype* top_data = top[0]->mutable_cpu_data();
  const int count = bottom[0]->count();
  for (int i = 0; i < count; ++i) {
    top_data[i] = bottom_data[i];  // copy the input straight through
  }
  DEBUG_AP("Here is All Pass Layer, forwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key());
}

template <typename Dtype>
void AllPassLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    const Dtype* top_diff = top[0]->cpu_diff();
    Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
    const int count = bottom[0]->count();
    for (int i = 0; i < count; ++i) {
      bottom_diff[i] = top_diff[i];  // pass the gradient straight through
    }
  }
  DEBUG_AP("Here is All Pass Layer, backwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key());
}

#ifdef CPU_ONLY
STUB_GPU(AllPassLayer);
#endif

INSTANTIATE_CLASS(AllPassLayer);
REGISTER_LAYER_CLASS(AllPass);

}  // namespace caffe
```

For time reasons I did not implement the GPU-mode forward and backward, so the example in this post supports CPU_ONLY mode only.
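For completeness, here is a minimal sketch of what the missing .cu file could look like (my addition, not from the original post; it relies on caffe_copy from caffe/util/math_functions.hpp, which dispatches to cudaMemcpy for device pointers):

```cpp
// all_pass_layer.cu -- hypothetical GPU path for AllPassLayer.
// The STUB_GPU in all_pass_layer.cpp is guarded by #ifdef CPU_ONLY,
// so it does not conflict with these definitions in a GPU build.
#include <vector>

#include "caffe/layers/all_pass_layer.hpp"
#include "caffe/util/math_functions.hpp"

namespace caffe {

template <typename Dtype>
void AllPassLayer<Dtype>::Forward_gpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  // Identity forward: one device-to-device copy, no custom kernel needed.
  caffe_copy(bottom[0]->count(), bottom[0]->gpu_data(),
             top[0]->mutable_gpu_data());
}

template <typename Dtype>
void AllPassLayer<Dtype>::Backward_gpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    // Identity backward: copy the top gradient to the bottom gradient.
    caffe_copy(top[0]->count(), top[0]->gpu_diff(),
               bottom[0]->mutable_gpu_diff());
  }
}

INSTANTIATE_LAYER_GPU_FUNCS(AllPassLayer);

}  // namespace caffe
```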
Next, edit caffe.proto: find the LayerParameter message and add one entry for the new layer:

```protobuf
message LayerParameter {
  optional string name = 1; // the layer name
  optional string type = 2; // the layer type
  repeated string bottom = 3; // the name of each bottom blob
  repeated string top = 4; // the name of each top blob

  // The train / test phase for computation.
  optional Phase phase = 10;

  // The amount of weight to assign each top blob in the objective.
  // Each layer assigns a default value, usually of either 0 or 1,
  // to each top blob.
  repeated float loss_weight = 5;

  // Specifies training parameters (multipliers on global learning constants,
  // and the name and other settings used for weight sharing).
  repeated ParamSpec param = 6;

  // The blobs containing the numeric parameters of the layer.
  repeated BlobProto blobs = 7;

  // Specifies on which bottoms the backpropagation should be skipped.
  // The size must be either 0 or equal to the number of bottoms.
  repeated bool propagate_down = 11;

  // Rules controlling whether and when a layer is included in the network,
  // based on the current NetState. You may specify a non-zero number of rules
  // to include OR exclude, but not both. If no include or exclude rules are
  // specified, the layer is always included. If the current NetState meets
  // ANY (i.e., one or more) of the specified rules, the layer is
  // included/excluded.
  repeated NetStateRule include = 8;
  repeated NetStateRule exclude = 9;

  // Parameters for data pre-processing.
  optional TransformationParameter transform_param = 100;

  // Parameters shared by loss layers.
  optional LossParameter loss_param = 101;

  // Layer type-specific parameters.
  //
  // Note: certain layers may have more than one computational engine
  // for their implementation. These layers include an Engine type and
  // engine parameter for selecting the implementation.
  // The default for the engine is set by the ENGINE switch at compile-time.
  optional AccuracyParameter accuracy_param = 102;
  optional ArgMaxParameter argmax_param = 103;
  optional BatchNormParameter batch_norm_param = 139;
  optional BiasParameter bias_param = 141;
  optional ConcatParameter concat_param = 104;
  optional ContrastiveLossParameter contrastive_loss_param = 105;
  optional ConvolutionParameter convolution_param = 106;
  optional CropParameter crop_param = 144;
  optional DataParameter data_param = 107;
  optional DropoutParameter dropout_param = 108;
  optional DummyDataParameter dummy_data_param = 109;
  optional EltwiseParameter eltwise_param = 110;
  optional ELUParameter elu_param = 140;
  optional EmbedParameter embed_param = 137;
  optional ExpParameter exp_param = 111;
  optional FlattenParameter flatten_param = 135;
  optional HDF5DataParameter hdf5_data_param = 112;
  optional HDF5OutputParameter hdf5_output_param = 113;
  optional HingeLossParameter hinge_loss_param = 114;
  optional ImageDataParameter image_data_param = 115;
  optional InfogainLossParameter infogain_loss_param = 116;
  optional InnerProductParameter inner_product_param = 117;
  optional InputParameter input_param = 143;
  optional LogParameter log_param = 134;
  optional LRNParameter lrn_param = 118;
  optional MemoryDataParameter memory_data_param = 119;
  optional MVNParameter mvn_param = 120;
  optional PoolingParameter pooling_param = 121;
  optional PowerParameter power_param = 122;
  optional PReLUParameter prelu_param = 131;
  optional PythonParameter python_param = 130;
  optional ReductionParameter reduction_param = 136;
  optional ReLUParameter relu_param = 123;
  optional ReshapeParameter reshape_param = 133;
  optional ScaleParameter scale_param = 142;
  optional SigmoidParameter sigmoid_param = 124;
  optional SoftmaxParameter softmax_param = 125;
  optional SPPParameter spp_param = 132;
  optional SliceParameter slice_param = 126;
  optional TanHParameter tanh_param = 127;
  optional ThresholdParameter threshold_param = 128;
  optional TileParameter tile_param = 138;
  optional WindowDataParameter window_data_param = 129;
  optional AllPassParameter all_pass_param = 155;
}
```

Take care that the new field number does not clash with the number used by any existing Layer.

Still in caffe.proto, add the declaration of AllPassParameter; its position is arbitrary. I defined a single parameter, which can be used to read a preset value from the prototxt:

```protobuf
message AllPassParameter {
  optional float key = 1 [default = 0];
}
```

In the C++ code, the statement

```cpp
this->layer_param_.all_pass_param().key()
```

reads the preset value from the prototxt.

Run make clean in $CAFFE_ROOT, then make all again. If you want it to compile on the first attempt, keep your code disciplined and stay alert to the common sources of errors.

Everything is ready except the prototxt. That part is easy: we write the simplest possible deploy.prototxt — no data layer and no softmax layer, just for fun:

```protobuf
name: "AllPassTest"
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 10 dim: 3 dim: 227 dim: 227 } }
}
layer {
  name: "ap"
  type: "AllPass"
  bottom: "data"
  top: "conv1"
  all_pass_param {
    key: 12.88
  }
}
```

Note that the value written after type must be the class name you declared in the .hpp, minus the trailing "Layer".
Above, the key parameter is preset to 12.88 — and yes, you thought of Liu Xiang, didn't you?

To verify that the Layer can be created and runs forward and backward correctly, we run the caffe time command against the prototxt we just wrote:

```
$ ./build/tools/caffe.bin time -model deploy.prototxt
I1002 02:03:41.667682 1954701312 caffe.cpp:312] Use CPU.
I1002 02:03:41.671360 1954701312 net.cpp:49] Initializing net from parameters:
name: "AllPassTest"
state {
  phase: TRAIN
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "ap"
  type: "AllPass"
  bottom: "data"
  top: "conv1"
  all_pass_param {
    key: 12.88
  }
}
I1002 02:03:41.671463 1954701312 layer_factory.hpp:77] Creating layer data
I1002 02:03:41.671484 1954701312 net.cpp:91] Creating Layer data
I1002 02:03:41.671499 1954701312 net.cpp:399] data -> data
I1002 02:03:41.671555 1954701312 net.cpp:141] Setting up data
I1002 02:03:41.671566 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
I1002 02:03:41.671592 1954701312 net.cpp:156] Memory required for data: 6183480
I1002 02:03:41.671605 1954701312 layer_factory.hpp:77] Creating layer ap
I1002 02:03:41.671620 1954701312 net.cpp:91] Creating Layer ap
I1002 02:03:41.671630 1954701312 net.cpp:425] ap <- data
I1002 02:03:41.671644 1954701312 net.cpp:399] ap -> conv1
I1002 02:03:41.671663 1954701312 net.cpp:141] Setting up ap
I1002 02:03:41.671674 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
I1002 02:03:41.671685 1954701312 net.cpp:156] Memory required for data: 12366960
I1002 02:03:41.671695 1954701312 net.cpp:219] ap does not need backward computation.
I1002 02:03:41.671705 1954701312 net.cpp:219] data does not need backward computation.
I1002 02:03:41.671710 1954701312 net.cpp:261] This network produces output conv1
I1002 02:03:41.671720 1954701312 net.cpp:274] Network initialization done.
I1002 02:03:41.671746 1954701312 caffe.cpp:320] Performing Forward
Here is All Pass Layer, forwarding.
12.88
I1002 02:03:41.679689 1954701312 caffe.cpp:325] Initial loss: 0
I1002 02:03:41.679714 1954701312 caffe.cpp:326] Performing Backward
I1002 02:03:41.679738 1954701312 caffe.cpp:334] *** Benchmark begins ***
I1002 02:03:41.679746 1954701312 caffe.cpp:335] Testing for 50 iterations.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.681139 1954701312 caffe.cpp:363] Iteration: 1 forward-backward time: 1 ms.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.682394 1954701312 caffe.cpp:363] Iteration: 2 forward-backward time: 1 ms.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.683653 1954701312 caffe.cpp:363] Iteration: 3 forward-backward time: 1 ms.
[... iterations 4 through 48 elided; each forward-backward pass took 1-2 ms ...]
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.749826 1954701312 caffe.cpp:363] Iteration: 49 forward-backward time: 1 ms.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.751124 1954701312 caffe.cpp:363] Iteration: 50 forward-backward time: 1 ms.
I1002 02:03:41.751147 1954701312 caffe.cpp:366] Average time per layer:
I1002 02:03:41.751157 1954701312 caffe.cpp:369] data forward: 0.00108 ms.
I1002 02:03:41.751183 1954701312 caffe.cpp:372] data backward: 0.001 ms.
I1002 02:03:41.751194 1954701312 caffe.cpp:369] ap forward: 1.37884 ms.
I1002 02:03:41.751205 1954701312 caffe.cpp:372] ap backward: 0.01156 ms.
I1002 02:03:41.751220 1954701312 caffe.cpp:377] Average Forward pass: 1.38646 ms.
I1002 02:03:41.751231 1954701312 caffe.cpp:379] Average Backward pass: 0.0144 ms.
I1002 02:03:41.751240 1954701312 caffe.cpp:381] Average Forward-Backward: 1.42 ms.
I1002 02:03:41.751250 1954701312 caffe.cpp:383] Total Time: 71 ms.
I1002 02:03:41.751260 1954701312 caffe.cpp:384] *** Benchmark ends ***
```

As the log shows, the Layer is created correctly, loads its preset parameter, and executes its forward and backward functions as expected.
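caffe time only shows that the Layer runs; to confirm numerically that it really is all-pass, here is a minimal standalone check (my addition, not from the original post; the file name check_all_pass.cpp is hypothetical, and it assumes Caffe has been built as above and this file is compiled and linked against libcaffe):

```cpp
// check_all_pass.cpp -- fill the input blob, run Forward, and verify that
// the AllPass layer's output equals its input element for element.
#include <cstdio>

#include <caffe/caffe.hpp>

int main() {
  caffe::Caffe::set_mode(caffe::Caffe::CPU);
  caffe::Net<float> net("deploy.prototxt", caffe::TEST);

  // Fill the "data" blob with a known pattern.
  caffe::Blob<float>* data = net.blob_by_name("data").get();
  for (int i = 0; i < data->count(); ++i) {
    data->mutable_cpu_data()[i] = 0.5f * i;
  }

  net.Forward();

  // The layer writes its output to the "conv1" blob (see deploy.prototxt).
  const caffe::Blob<float>* out = net.blob_by_name("conv1").get();
  for (int i = 0; i < out->count(); ++i) {
    if (out->cpu_data()[i] != data->cpu_data()[i]) {
      std::printf("mismatch at element %d\n", i);
      return 1;
    }
  }
  std::printf("all %d elements passed through unchanged\n", out->count());
  return 0;
}
```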