Description
It would be nice to have an `AsymmetricSoftLaplace` distribution, combining features of `AsymmetricLaplace` and `SoftLaplace`:
- A (loc, scale) family amenable to `LocScaleReparam`
- Learnable skew
- Concave log density, leading to unimodal posteriors
- Infinitely differentiable, leading to smoother gradients in SVI and HMC and tighter Gaussian approximations
One approach is to use the density

    p(x) = const / (exp(x/k) + exp(-k*x))

however, according to Wolfram Alpha, the normalizing constant and CDF involve `torch.special.hyp2f1()`, which is not yet implemented.
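Even without the normalizer, the log density of this candidate is easy to evaluate stably up to a constant, which is all that HMC and reparameterized SVI need. A minimal numpy sketch (the function name is hypothetical; the formula is the one above):

```python
import numpy as np

def soft_asym_laplace_unnorm_logpdf(x, k):
    """Unnormalized log density, log p(x) up to an additive constant,
    for p(x) = const / (exp(x/k) + exp(-k*x)).

    The skew parameter k > 0 trades off the two exponential tail
    rates (right tail ~ exp(-x/k), left tail ~ exp(k*x)); k = 1
    gives the symmetric soft Laplace.
    """
    # -log(exp(x/k) + exp(-k*x)), computed stably via logaddexp.
    # logaddexp of two linear functions is convex, so this is concave.
    return -np.logaddexp(x / k, -k * x)
```

Since the negated log density is a logaddexp of linear functions of x, it is convex, confirming the concave-log-density property claimed above.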
Another approach might be to sum two Gamma random variables pointing in opposite directions, with tunable smoothness (e.g. a shared `alpha` parameter). This has an easy `.rsample()` and, I believe, a tractable `.log_prob()`, but the density is not infinitely differentiable and requires a smoothness parameter.
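The easy sampler for this second approach can be sketched with numpy; the function name and (alpha, rate, rate) parameterization below are illustrative, not an existing Pyro API:

```python
import numpy as np

def gamma_difference_rsample(alpha, left_rate, right_rate,
                             loc=0.0, size=1000, rng=None):
    """Sample X = loc + G_right / right_rate - G_left / left_rate,
    where G_right, G_left ~ Gamma(alpha, 1) are independent.

    alpha is the shared smoothness parameter: alpha = 1 makes both
    terms Exponential, recovering an asymmetric Laplace; larger
    alpha gives a smoother density.
    """
    rng = np.random.default_rng() if rng is None else rng
    g_right = rng.gamma(alpha, size=size)  # right-pointing Gamma
    g_left = rng.gamma(alpha, size=size)   # left-pointing Gamma
    return loc + g_right / right_rate - g_left / left_rate
```

In a torch implementation the same construction would be reparameterizable, since `Gamma.rsample()` is pathwise differentiable.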
A third possibility is to Gaussian-blur a standard `AsymmetricLaplace`. This has tractable `.log_prob()` and `.rsample()` and a tunable smoothness parameter with a plausible default (i.e. half the variance is due to each of the asymmetric Laplace and normal terms).
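The `.rsample()` for this third approach is just an asymmetric Laplace draw plus Gaussian noise. A numpy sketch using the suggested default, where the Gaussian contributes the same variance as the asymmetric Laplace part (names and parameterization are illustrative, not the Pyro distribution API):

```python
import numpy as np

def blurred_asym_laplace_rsample(loc, left_rate, right_rate,
                                 size=1000, rng=None):
    """Sample from an AsymmetricLaplace convolved with a Gaussian.

    Uses the plausible default smoothness: the Gaussian variance
    equals the asymmetric Laplace variance, so each term contributes
    half of the total variance.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Asymmetric Laplace as a difference of two exponentials.
    al = (rng.exponential(1.0 / right_rate, size)
          - rng.exponential(1.0 / left_rate, size))
    # Match the Gaussian std to the asymmetric Laplace std.
    sigma = np.sqrt(1.0 / right_rate**2 + 1.0 / left_rate**2)
    return loc + al + sigma * rng.normal(size=size)
```

Both terms are pathwise differentiable, so the torch version would support reparameterized gradients; the `.log_prob()` would be a Gaussian-Laplace convolution, expressible with `erfc` terms as in the exponentially modified Gaussian.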