Yesterday I ran into an equation that equates a linear term with an exponential:

\displaystyle \alpha x+\beta=\delta e^{x}

It doesn’t take long to figure out that there is no analytical solution, and so I set out to write some crappy numerical code. After wasting some time on a fixed-point iteration that did not really work, it occurred to me that I most probably wasn’t the first person out there trying to solve such a simple equation. Indeed not.

The equation above has a solution in terms of a special function called Lambert’s W, and a nicer-looking one in terms of its cousin, the generalised log (introduced by D. Kalman here).

Just like {\log x} is the inverse of {\exp x}, {\mbox{glog}\,x} is the inverse of {x^{-1}\exp x}, and Lambert’s {W} is the inverse of {x\exp x}. Neither glog nor W has a closed-form expression in terms of elementary functions, but fast implementations of {W} are available (for R, it’s in the gsl package), and:

\displaystyle \mbox{glog}\left(x\right)=-W\left(-1/x\right)
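
As a quick sanity check of that identity (a sketch using lambert_W0 from the gsl package; the composition is real-valued for {x\ge e} or {x<0}):

require(gsl)
x <- 5                  # any x >= e works on the principal branch
y <- -lambert_W0(-1/x)  # y = glog(x)
exp(y)/y                # recovers x up to rounding: 5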

In terms of the generalised log function, the solution to the equation is:

\displaystyle x=\mbox{glog}\left(\frac{\alpha}{\delta}\exp\left(\frac{\beta}{\alpha}\right)\right)-\frac{\beta}{\alpha}

The (easy) proof is on page 5 of Kalman’s article. Here’s some R code:

require(gsl)

# Solve alpha*x + beta = delta*exp(x) for x via the principal
# branch of Lambert's W (lambert_W0 from the gsl package)
solve.lexpeq <- function(alpha, beta, delta)
{
  v <- beta/alpha
  # glog((alpha/delta)*exp(v)) - v, rewritten via glog(x) = -W(-1/x)
  -lambert_W0(-(delta/alpha)*exp(-v)) - v
}
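
A quick check that the root comes back correctly (the parameter values here are arbitrary):

x <- solve.lexpeq(alpha = 2, beta = 3, delta = 1)
2*x + 3 - exp(x)  # residual: ~0 up to floating point

One caveat worth noting: when {\alpha} and {\delta} have the same sign the equation can have two real roots; lambert_W0 is the principal branch, and the second root, when it exists, comes from gsl’s lambert_Wm1 instead.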


So where does this turn up in statistics? Well, one example is finding the maximum a posteriori (MAP) estimate of a Poisson mean, if you put a Gaussian prior on the log of the mean.
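
To see why, write the mean as {\lambda=e^{\theta}} with a prior {\theta\sim N(\mu,\sigma^{2})}. For counts {x_{1},\dots,x_{n}} with {S=\sum_{i}x_{i}}, the log-posterior is {S\theta-ne^{\theta}-(\theta-\mu)^{2}/(2\sigma^{2})} up to a constant, and setting its derivative to zero gives

\displaystyle -\frac{\theta}{\sigma^{2}}+\left(S+\frac{\mu}{\sigma^{2}}\right)=ne^{\theta}

which is exactly the equation above with {\alpha=-1/\sigma^{2}}, {\beta=S+\mu/\sigma^{2}} and {\delta=n}. Here is a minimal sketch building on solve.lexpeq (the function name and the default prior parameters are mine):

poisson.map <- function(x, mu = 0, sigma = 1)
{
  theta <- solve.lexpeq(alpha = -1/sigma^2,
                        beta  = sum(x) + mu/sigma^2,
                        delta = length(x))
  exp(theta)  # MAP of theta, mapped back to the mean scale
}
poisson.map(c(3, 1, 4, 1, 5))  # shrinks the log of the sample mean towards mu

Since {\alpha<0} here, the argument handed to lambert_W0 is positive, so the principal branch applies and the root is unique.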
