How do I approximate the Jacobian and Hessian of a function numerically?

(Updated in late 2017 because there have been a lot of updates in this space.)

Your best bet is probably automatic differentiation. There are now many packages for this, because it's the standard approach in deep learning:

  • Autograd works transparently with most numpy code. It's pure-Python, requires almost no code changes for typical functions, and is reasonably fast.
  • There are many deep-learning-oriented libraries that can do this. Some of the most popular are TensorFlow, PyTorch, Theano, Chainer, and MXNet. Each will require you to rewrite your function in their kind-of-like-numpy-but-needlessly-different API, and in return will give you GPU support and a bunch of deep-learning-oriented features that you may or may not care about.
  • FuncDesigner is an older package I haven't used whose website is currently down.

Another option is to approximate it with finite differences, basically just evaluating (f(x + eps) - f(x - eps)) / (2 * eps) (but obviously with more effort put into it than that). This will probably be slower and less accurate than the other approaches, especially in moderately high dimensions, but is fully general and requires no code changes. numdifftools seems to be the standard Python package for this.
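
To make the finite-difference idea concrete, here is a bare-bones central-difference sketch in plain numpy (the function, evaluation point, and step sizes are illustrative choices; a real package like numdifftools chooses steps far more carefully):

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Central-difference Jacobian of f: R^n -> R^m at x."""
    x = np.asarray(x, dtype=float)
    f0 = np.atleast_1d(f(x))
    J = np.empty((f0.size, x.size))
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        J[:, i] = (np.atleast_1d(f(x + step)) - np.atleast_1d(f(x - step))) / (2 * eps)
    return J

def hessian_fd(f, x, eps=1e-4):
    """Hessian of a scalar f, as finite differences of a finite-difference gradient."""
    grad = lambda y: jacobian_fd(f, y, eps).ravel()
    return jacobian_fd(grad, np.asarray(x, dtype=float), eps)

f = lambda x: x[0] ** 2 + np.sin(x[1])
J = jacobian_fd(f, [0.0, 0.0])  # approx [[0., 1.]]
H = hessian_fd(f, [0.0, 0.0])   # approx [[2., 0.], [0., 0.]]
```

Note the cost: each Jacobian needs 2n function evaluations, and this naive nested-difference Hessian needs on the order of 4n^2, which is why finite differences scale poorly with dimension.
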

You could also attempt to find fully symbolic derivatives with SymPy, but this will be a relatively manual process.
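
For example, SymPy can produce both objects symbolically (the functions below are invented examples), and `lambdify` then turns the result into a numeric callable:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Jacobian of a vector-valued function R^2 -> R^2
f = sp.Matrix([x**2 * y, sp.sin(x) + y])
J = f.jacobian([x, y])          # Matrix([[2*x*y, x**2], [cos(x), 1]])

# Hessian of a scalar function
g = x**2 * y + y**3
H = sp.hessian(g, [x, y])       # Matrix([[2*y, 2*x], [2*x, 6*y]])

# Evaluate numerically via lambdify
J_num = sp.lambdify((x, y), J, 'numpy')
```

The manual part is expressing your function in SymPy's symbolic terms in the first place, which is impractical for functions with loops, branching, or external calls.
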


If you're restricted to just SciPy, the most convenient way I found was scipy.misc.derivative, called within the appropriate loops, with lambdas to curry the function of interest.
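
A sketch of that loop-plus-lambda pattern for a gradient (note that scipy.misc.derivative was later deprecated and removed from SciPy, so this only runs on older versions; `f` and the evaluation point are invented examples):

```python
import numpy as np
from scipy.misc import derivative  # removed in recent SciPy releases

def f(x):
    return x[0] ** 2 + np.sin(x[1])

def gradient(f, x, dx=1e-6):
    x = np.asarray(x, dtype=float)

    def partial(i):
        # curry f down to its i-th coordinate with a lambda,
        # then take a 1-D derivative along that coordinate
        return derivative(
            lambda t: f(np.concatenate([x[:i], [t], x[i + 1:]])),
            x[i], dx=dx)

    return np.array([partial(i) for i in range(x.size)])

grad = gradient(f, [1.0, 0.0])  # approx [2., 1.]
```

A Hessian follows the same pattern with a second, nested loop (using `n=2` for the diagonal terms), but at that point numdifftools is the better tool.
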