From a003d7287686f0963aae38fa97f20d66f113aefc Mon Sep 17 00:00:00 2001
From: "A. Unique TensorFlower"
Date: Tue, 29 May 2018 11:32:25 -0700
Subject: [PATCH] Add note to bijector_impl.py explaining that
 `log_det_jacobian` is `log(*abs*(det(Jacobian)))`.

PiperOrigin-RevId: 198428995
---
 tensorflow/python/ops/distributions/bijector_impl.py | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/tensorflow/python/ops/distributions/bijector_impl.py b/tensorflow/python/ops/distributions/bijector_impl.py
index 969553b..b65e64d 100644
--- a/tensorflow/python/ops/distributions/bijector_impl.py
+++ b/tensorflow/python/ops/distributions/bijector_impl.py
@@ -160,13 +160,20 @@ class Bijector(object):
 
   3. `log_det_jacobian(x)`
 
-     "The log of the determinant of the matrix of all first-order partial
-     derivatives of the inverse function."
+     "The log of the absolute value of the determinant of the matrix of all
+     first-order partial derivatives of the inverse function."
 
      Useful for inverting a transformation to compute one probability in
      terms of another. Geometrically, the Jacobian determinant is the volume
      of the transformation and is used to scale the probability.
 
+     We take the absolute value of the determinant before log to avoid NaN
+     values. Geometrically, a negative determinant corresponds to an
+     orientation-reversing transformation. It is ok for us to discard the
+     sign of the determinant because we only integrate everywhere-nonnegative
+     functions (probability densities) and the correct orientation is always
+     the one that produces a nonnegative integrand.
+
      By convention, transformations of random variables are named in terms of
      the forward transformation. The forward transformation creates samples,
      the inverse is useful for computing probabilities.
-- 
2.7.4
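The rationale in the docstring note above can be illustrated with a minimal, self-contained sketch (plain Python, not the TensorFlow `Bijector` API; the matrix `A` and helper `det2` are hypothetical names chosen for this example). An orientation-reversing linear map has a negative Jacobian determinant, so taking the log directly fails, while `log(abs(det))` gives the well-defined scale factor used in the change-of-variables formula:

```python
import math

# Hypothetical 2x2 linear bijector y = A @ x. Swapping coordinates is
# orientation-reversing, so det(A) is negative.
A = [[0.0, 1.0],
     [1.0, 0.0]]

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

d = det2(A)  # -1.0: the volume scale is |d| = 1, only the orientation flips.

# math.log(d) would raise ValueError here (log of a negative number);
# in floating-point graph code the same mistake surfaces as NaN.
log_det_jacobian = math.log(abs(d))  # 0.0 -- log(|-1|) = log(1)
```

Since a probability density is everywhere nonnegative, the density transformation `p_Y(y) = p_X(x) * |det(dx/dy)|` only ever needs this absolute value, which is why the sign can be discarded safely.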