Adept: A fast automatic differentiation library for C++
Adept (Automatic Differentiation using Expression Templates) is a
software library that enables algorithms written in C and C++ to be
differentiated. It uses an operator overloading approach, so
very little code modification is required. Differentiation can be
performed in forward mode, reverse mode (to compute the adjoint),
or the full Jacobian matrix can be computed. Moreover, the way that
expression templates have been used, together with several other
important optimizations, means that reverse-mode differentiation is
significantly faster than the leading libraries that provide
equivalent functionality, and less memory is used. In fact, Adept is
typically only around 10-25% slower than an adjoint code you might
write by hand, a task that is very time consuming and error-prone.
For further details, read the Hogan (2014) paper listed below.
Adept provides the following functionality:
- Full Jacobian matrix. Given the non-linear
function y = f(x) coded in C or C++, Adept will
compute the matrix H = ∂y/∂x, where the
element at row i and column j of H
is H_i,j = ∂y_i/∂x_j. This
matrix will be computed much more rapidly and accurately than if you
simply recompute the function multiple times, perturbing each element
of x one by one. The Jacobian matrix is used in minimization
algorithms such as Gauss-Newton and Levenberg-Marquardt.
- Reverse-mode differentiation. This is a key component in
optimization problems where a non-linear function needs to be
minimized but the state vector x is too large for it to make
sense to compute the full Jacobian matrix. Atmospheric data
assimilation is the canonical example in meteorology. Given a
non-linear function y = f(x) and a vector of
adjoints ∂J/∂y (where J is the scalar
function to be minimized), Adept will compute the vector of adjoints
∂J/∂x. Formally, ∂J/∂x is given by the matrix-vector
product H^T ∂J/∂y, but it is
computed here without computing the full Jacobian
matrix H. The adjoint may then be used in a gradient-based
minimization scheme such as quasi-Newton.
- Forward-mode differentiation. Given the non-linear
function y=f(x) and a vector of perturbations
δx, Adept will compute the corresponding vector
δy arising from a linearization of the
function f. Formally, δy is given by the
matrix-vector product Hδx, but it is computed
here without computing the full Jacobian matrix H. Note that
Adept is optimized for the reverse case, so it might not be as fast
(and will certainly not be as economical in memory) in the forward
mode as libraries written especially for that purpose.
Note that at present Adept is missing some functionality that you may require:
- Differentiation is first-order only: it cannot directly compute
higher-order derivatives such as the Hessian matrix.
- Limited support for complex numbers; no support for mathematical
functions of complex numbers, and expressions involving operations
on complex numbers are not optimized.
- Your code must operate on variables individually: they can be
stored in C-style arrays or std::vector types, but Adept
does not work fully with containers that allow operations on entire
arrays, such as the std::valarray type.
- C and C++ only; this library could not be written in Fortran since
the language provides no template capability.
It is hoped that future versions will remedy these limitations (and
it is hoped that a future version of Fortran will support
templates). We are, however, working actively on the following
features, which will be available in the next version:
- Windows and Mac support: mainly this means specifying thread-local
variables appropriately, or disabling this feature completely for
compilers that do not support it.
- OpenMP parallelization of Jacobian calculations (we are also
working on CUDA parallelization but the results are not quite so
encouraging at this stage).
- Paper: Hogan, R. J., 2014: Fast reverse-mode automatic
differentiation using expression templates in C++. ACM
Trans. Math. Softw., 40,
- Documentation: User Guide (PDF)
- Talk to data assimilation research group, University of
- Talk at 14th Euro Automatic Differentiation Workshop,
Oxford, December 2013: Powerpoint
If you use Adept in a publication, please cite the Hogan (2014)
paper. This is a request and not a condition of the license.
The paper uses algorithms from
the Multiscatter package to
evaluate Adept against other automatic differentiation libraries.
Current version: adept-1.0.tar.gz (3
September 2013). The main changes since version 0.9 are:
- LIFO requirement removed in allocation of active objects in
memory: more efficient for codes that don't deallocate objects in
the reverse order from their allocation.
- Interface change: independent variables no longer need to be
initialized using set_value; they can now be initialized
using the ordinary assignment operator (=) provided that
the new_recording function is called immediately after
they are initialized (see section 5.2 of the User Guide).
- Ability to interface to code components that compute their own
Jacobian (e.g. those written in Fortran).
- More test programs, including one that interfaces to
the GNU Scientific
Library to perform a real minimization.
- C++ exceptions are thrown when an error occurs.
- Recording of derivative information can be "paused" so that the
same function can be called from within the program both with and
without automatic differentiation.
- Source code can be compiled twice to produce two versions of a
function, one with automatic differentiation and one without.
- A detailed User Guide.
Previous version: adept-0.9 (2 May 2012).
If you use the code and have any comments, queries, requests or
bug-fixes then please
contact Robin Hogan. I'm
also interested to know of any uses of the code - then I can also keep
you updated on changes, bug-fixes etc.
This library is released under
the GNU General
Public License (GPL). This means that you can use and modify the
library for any purpose, but that if you distribute a software
package that incorporates the library or a modified version of it,
you must release the source code for the entire software package
under the conditions of the GNU GPL. If you would like to use Adept
under other terms, please
contact Robin Hogan.