From 15456286da9378df6af796496ced5313ad7169f4 Mon Sep 17 00:00:00 2001
From: Jonathan L Long
Date: Sat, 6 Sep 2014 21:39:56 -0700
Subject: [PATCH] [docs] tutorial/layers: brief descriptions of some loss
 layers

---
 docs/tutorial/layers.md | 12 +++++++++---
 1 file changed, 9 insertions(+), 3 deletions(-)

diff --git a/docs/tutorial/layers.md b/docs/tutorial/layers.md
index 11ca70e..036d9ad 100644
--- a/docs/tutorial/layers.md
+++ b/docs/tutorial/layers.md
@@ -123,17 +123,21 @@ Loss drives learning by comparing an output to a target and assigning cost to mi
 
 #### Softmax
 
-`SOFTMAX_LOSS`
+* LayerType: `SOFTMAX_LOSS`
+
+The softmax loss layer computes the multinomial logistic loss of the softmax of its inputs. It's conceptually identical to a softmax layer followed by a multinomial logistic loss layer, but provides a more numerically stable gradient.
 
 #### Sum-of-Squares / Euclidean
 
-`EUCLIDEAN_LOSS`
+* LayerType: `EUCLIDEAN_LOSS`
+
+The Euclidean loss layer computes the sum of squares of differences of its two inputs, $$\frac 1 {2N} \sum_{i=1}^N \| x^1_i - x^2_i \|_2^2$$.
 
 #### Hinge / Margin
 
 * LayerType: `HINGE_LOSS`
 * CPU implementation: `./src/caffe/layers/hinge_loss_layer.cpp`
-* CUDA GPU implementation: `NOT_AVAILABLE`
+* CUDA GPU implementation: none yet
 * Parameters (`HingeLossParameter hinge_loss_param`)
     - Optional
         - `norm` [default L1]: the norm used. Currently L1, L2
@@ -164,6 +168,8 @@ Loss drives learning by comparing an output to a target and assigning cost to mi
     }
   }
 
+The hinge loss layer computes a one-vs-all hinge or squared hinge loss.
+
 #### Sigmoid Cross-Entropy
 
 `SIGMOID_CROSS_ENTROPY_LOSS`
-- 
2.7.4
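
For reference alongside the descriptions this patch adds, below is a minimal NumPy sketch of the three losses as described above: the fused softmax loss (stabilized with the log-sum-exp shift, which is the numerical advantage the softmax description refers to), the Euclidean loss per the given formula, and the one-vs-all hinge loss with its L1/L2 norm option. This is an illustration under assumptions, not Caffe's implementation (which lives under `./src/caffe/layers/`); the N x K score layout, function names, and toy inputs are invented for the sketch.

    # Minimal NumPy sketch of the losses described above; the function names,
    # N x K score layout, and toy data are assumptions for illustration only.
    import numpy as np

    def softmax_loss(scores, labels):
        # Fused softmax + multinomial logistic loss. Shifting by the row max
        # before exponentiating (the log-sum-exp trick) is what makes the
        # fused form more stable than softmax followed by a separate log.
        shifted = scores - scores.max(axis=1, keepdims=True)
        log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
        n = scores.shape[0]
        return -log_probs[np.arange(n), labels].mean()

    def euclidean_loss(x1, x2):
        # (1 / 2N) * sum_i ||x1_i - x2_i||_2^2, as in the formula above.
        return ((x1 - x2) ** 2).sum() / (2.0 * x1.shape[0])

    def hinge_loss(scores, labels, norm="L1"):
        # One-vs-all hinge: each class score is treated as a binary
        # classifier with target +1 for the true class, -1 for the rest.
        n = scores.shape[0]
        targets = -np.ones_like(scores)
        targets[np.arange(n), labels] = 1.0
        margins = np.maximum(0.0, 1.0 - targets * scores)
        if norm == "L2":
            margins = margins ** 2  # squared hinge
        return margins.sum() / n

    scores = np.array([[1.0, 0.7, -1.0], [0.2, 0.9, 0.6]])
    labels = np.array([0, 1])
    print(softmax_loss(scores, labels))                   # ~0.716
    print(euclidean_loss(scores, np.zeros_like(scores)))  # 0.925
    print(hinge_loss(scores, labels))                     # 2.3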