sigmoid cross-entropy loss: normalize loss by different schemes
author Evan Shelhamer <shelhamer@imaginarynumber.net>
Thu, 17 Nov 2016 04:39:42 +0000 (20:39 -0800)
committer Evan Shelhamer <shelhamer@imaginarynumber.net>
Thu, 17 Nov 2016 06:21:41 +0000 (22:21 -0800)
commit 3d62e3cc9da66dbf3328567d0f30d5183b318d81
tree 0ff08d3a512b0f5a1a469d287203a944ee8eff9b
parent c6ab96596d9eae01c2c403487dc8be8e3edc8fbb
sigmoid cross-entropy loss: normalize loss by different schemes

sig-ce loss now handles all the same normalization schemes as the softmax loss;
refer to #3296 for more detail.

this preserves the default normalization for sig-ce loss: batch size.
include/caffe/layers/sigmoid_cross_entropy_loss_layer.hpp
src/caffe/layers/sigmoid_cross_entropy_loss_layer.cpp
src/caffe/layers/sigmoid_cross_entropy_loss_layer.cu
src/caffe/proto/caffe.proto
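
For reference, "normalize loss by different schemes" means dividing the summed
loss by a count chosen per normalization mode, the same way the softmax loss
does it (#3296). The sketch below is a minimal, self-contained illustration of
that selection logic under the modes defined in caffe.proto's LossParameter
(FULL, VALID, BATCH_SIZE, NONE); get_normalizer, outer_num, inner_num, and
valid_count are illustrative names, not necessarily the identifiers used in
this commit.

    // Sketch of per-mode loss normalization, modeled on the softmax loss.
    #include <algorithm>
    #include <cstdio>

    enum NormalizationMode { FULL, VALID, BATCH_SIZE, NONE };

    // outer_num: batch size; inner_num: spatial sites per example;
    // valid_count: number of non-ignored targets, or -1 if not tracked.
    float get_normalizer(NormalizationMode mode, int outer_num, int inner_num,
                         int valid_count) {
      float normalizer;
      switch (mode) {
        case FULL:        // every prediction counts
          normalizer = float(outer_num * inner_num);
          break;
        case VALID:       // only non-ignored targets count
          normalizer = (valid_count == -1) ? float(outer_num * inner_num)
                                           : float(valid_count);
          break;
        case BATCH_SIZE:  // the preserved default for sig-ce loss
          normalizer = float(outer_num);
          break;
        case NONE:        // report the raw sum
          normalizer = 1.0f;
          break;
      }
      // Guard against dividing by zero when a batch has no valid targets.
      return std::max(1.0f, normalizer);
    }

    int main() {
      float loss_sum = 64.0f;  // hypothetical summed loss over a batch of 8
      printf("BATCH_SIZE-normalized: %f\n",
             loss_sum / get_normalizer(BATCH_SIZE, 8, 8, -1));
      printf("FULL-normalized: %f\n",
             loss_sum / get_normalizer(FULL, 8, 8, -1));
      return 0;
    }

With BATCH_SIZE the loss above divides by 8 regardless of spatial size, while
FULL divides by all 64 predictions; keeping BATCH_SIZE as the default leaves
existing sig-ce loss values unchanged.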