November 27, 2012

Gaussian integral of an error function

In Surely You're Joking, Mr. Feynman!, Richard Feynman mentions a useful technique he used for evaluating integrals, namely differentiating under the integral sign. I will show here how this trick works for calculating the Gaussian integral of an error function. Averages over Gaussian distributions are omnipresent in physics, and the error function is simply the primitive of the Gaussian, making the calculation relatively easy (and the result quite elegant). Nevertheless, Mathematica (version 8) cannot perform this integral, and I could not find it in Gradshteyn & Ryzhik. I needed it to describe the interaction of a phase front with an external field; see the paper here.

Let us define: \[ I(\alpha, \beta, \gamma ) = \int_{-\infty}^{\infty} \text{d}x \exp (-\alpha x^2) \,\text{erf}(\beta x + \gamma) \] with \( \alpha, \beta \, \text{and}\, \gamma\) real and \( \alpha \) positive. For \( \gamma = 0\) the integrand is an odd function, so \( I(\alpha, \beta, 0 ) = 0\). Differentiating under the integral sign with respect to \( \gamma \) gives \[ I' (\gamma) = \frac{\partial}{\partial \gamma} I(\alpha, \beta, \gamma) = \frac{2}{\sqrt{\pi}} \int_{-\infty}^{\infty} \text{d}x \exp (-\alpha x^2)\, \exp \left [-(\beta x + \gamma)^2\right ] \] which is a simple Gaussian integral: \( \displaystyle I' (\gamma) = \frac{2}{\sqrt{\alpha + \beta^2}} \exp \left ( - \frac{\alpha \gamma ^2}{\alpha + \beta^2}\right )\)
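Completing the square in the exponent makes this last step explicit: \[ -\alpha x^2 - (\beta x + \gamma)^2 = -(\alpha + \beta^2) \left ( x + \frac{\beta \gamma}{\alpha + \beta^2} \right )^2 - \frac{\alpha \gamma^2}{\alpha + \beta^2} \] so the \( x \) integration contributes a factor \( \sqrt{\pi/(\alpha + \beta^2)} \), which combines with the prefactor \( 2/\sqrt{\pi} \) to give the expression above.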

Finally, integrating back from \( 0 \) to \( \gamma \) (the substitution \( t = u \sqrt{\alpha/(\alpha + \beta^2)} \) reduces the integral to the definition of the error function), \[ I(\alpha, \beta, \gamma ) = \int_{0}^{\gamma} \text{d}u \, I' (u) = \sqrt{\vphantom{\beta}\frac{\pi}{\alpha}} \,\text{erf} \; \left ( \gamma \sqrt{\frac{\vphantom{\beta} \alpha}{\alpha + \beta^2}} \right )\] The reader can check that all derivatives exist and all integrals converge. What happens if we replace the linear term in the error function with a quadratic one?
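As a quick numerical sanity check, here is a minimal Python sketch (not part of the original derivation) comparing the closed form against direct quadrature with SciPy; the parameter values are arbitrary test choices:

    # Numerical check of I(alpha, beta, gamma) against the closed form.
    import numpy as np
    from scipy.integrate import quad
    from scipy.special import erf

    def I_numeric(alpha, beta, gamma):
        """Evaluate I by direct numerical quadrature over the real line."""
        integrand = lambda x: np.exp(-alpha * x**2) * erf(beta * x + gamma)
        value, _ = quad(integrand, -np.inf, np.inf)
        return value

    def I_closed(alpha, beta, gamma):
        """The closed-form result derived above."""
        return np.sqrt(np.pi / alpha) * erf(gamma * np.sqrt(alpha / (alpha + beta**2)))

    alpha, beta, gamma = 1.3, 0.7, -0.4  # arbitrary test point (alpha > 0)
    print(I_numeric(alpha, beta, gamma), I_closed(alpha, beta, gamma))
    # The two printed values should agree to within the quadrature tolerance.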

1 comment:

  1. Is there an analytical solution if the integral runs from minus infinity to zero? Can you share the steps?
