API reference

Docstrings

ImplicitDifferentiation.AbstractLinearSolver (Type)
AbstractLinearSolver

All linear solvers used within an ImplicitFunction must satisfy this interface.

It can be useful to roll your own solver if you need more fine-grained control over convergence, speed, or behavior in case of singularity. Check out the source code of IterativeLinearSolver and DirectLinearSolver for implementation examples.

Required methods

  • presolve(linear_solver, A, y): Returns a matrix-like object A for which it is cheaper to solve several linear systems with different vectors b of type similar to y (a typical example would be to perform LU factorization).
  • solve(linear_solver, A, b): Returns a vector x satisfying Ax = b. If the linear system has not been solved to satisfaction, every element of x should be a NaN of the appropriate floating point type.
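For illustration, here is a minimal sketch of a custom solver satisfying this interface. It assumes that presolve and solve can be extended from the ImplicitDifferentiation namespace and that the operator A supports the backslash operator; a real implementation would typically do something smarter in presolve.

    using ImplicitDifferentiation
    import ImplicitDifferentiation: presolve, solve

    # Hypothetical solver that defers all the work to the backslash operator
    struct NaiveBackslashSolver <: ImplicitDifferentiation.AbstractLinearSolver end

    # presolve: no factorization here, simply pass A through unchanged
    presolve(::NaiveBackslashSolver, A, y) = A

    # solve: return x such that Ax = b, or a vector of NaNs if the solve fails
    function solve(::NaiveBackslashSolver, A, b)
        try
            return A \ b
        catch
            return fill(convert(float(eltype(b)), NaN), size(b))
        end
    end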
ImplicitDifferentiation.DirectLinearSolver (Type)
DirectLinearSolver

An implementation of AbstractLinearSolver using the built-in backslash operator.

Fields

  • verbose::Bool: Whether to display a warning when the solver fails (defaults to true)
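As a sketch (the linear_solver keyword below is assumed from the fields of ImplicitFunction, not verified against the constructor), the direct solver can be swapped in for the default iterative one:

    # Hypothetical: use dense backslash instead of the default GMRES solver
    implicit = ImplicitFunction(forward, conditions; linear_solver=DirectLinearSolver())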
ImplicitDifferentiation.ImplicitFunction (Type)
ImplicitFunction{F,C,L,B}

Wrapper for an implicit function defined by a forward mapping that computes y and a set of conditions c.

An ImplicitFunction object behaves like a function, and every call is differentiable with respect to the first argument x. When a derivative is queried, the Jacobian of y is computed using the implicit function theorem:

∂/∂y c(x, y(x)) * ∂/∂x y(x) = -∂/∂x c(x, y(x))

This requires solving a linear system A * J = -B where A = ∂c/∂y, B = ∂c/∂x and J = ∂y/∂x.

Fields

  • forward::F: a callable, does not need to be compatible with automatic differentiation
  • conditions::C: a callable, must be compatible with automatic differentiation
  • linear_solver::L: an instance of an AbstractLinearSolver subtype, defines how the linear system will be solved
  • conditions_backend::B: either nothing or an AbstractDifferentiation.AbstractBackend object, defines how the conditions will be differentiated within the implicit function theorem

There are two possible signatures for forward and conditions, which must be consistent with one another:

  1. Standard: forward(x, args...; kwargs...) = y and conditions(x, y, args...; kwargs...) = c
  2. Byproduct: forward(x, args...; kwargs...) = (y, z) and conditions(x, y, z, args...; kwargs...) = c.

In both cases, x, y and c must be arrays, with size(y) = size(c). In the second case, the byproduct z can be an arbitrary object generated by forward. The positional arguments args... and keyword arguments kwargs... must be the same for both forward and conditions.

Warning

The byproduct z and the other positional arguments args... beyond x are considered constant for differentiation purposes.
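As a minimal usage sketch of the standard signature, assuming Zygote as the reverse-mode backend and the ImplicitFunction(forward, conditions) constructor with default options, consider the componentwise square root defined implicitly by y .^ 2 .- x = 0:

    using ImplicitDifferentiation, Zygote

    # Standard signature: forward(x) = y and conditions(x, y) = c with size(y) == size(c)
    forward(x) = sqrt.(x)             # stand-in for a routine autodiff cannot see through
    conditions(x, y) = y .^ 2 .- x    # zero exactly when y = sqrt.(x)

    implicit = ImplicitFunction(forward, conditions)

    x = rand(3)
    implicit(x)                       # same result as forward(x)
    Zygote.jacobian(implicit, x)[1]   # ≈ Diagonal(0.5 ./ sqrt.(x)) by the implicit function theorem

Here the Jacobian is obtained without differentiating through forward itself: only conditions needs to be autodiff-compatible.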

ImplicitDifferentiation.ImplicitFunction (Method)
(implicit::ImplicitFunction)(x::AbstractArray, args...; kwargs...)

Return implicit.forward(x, args...; kwargs...), which can be either an array y or a tuple (y, z).

This call is differentiable.

ImplicitDifferentiation.IterativeLinearSolver (Type)
IterativeLinearSolver

An implementation of AbstractLinearSolver using Krylov.gmres, set as the default for ImplicitFunction.

Fields

  • verbose::Bool: Whether to display a warning when the solver fails and returns NaNs (defaults to true)
  • accept_inconsistent::Bool: Whether to accept approximate least squares solutions for inconsistent systems, or fail and return NaNs (defaults to false)

Note

If you find that your implicit gradients contain NaNs, try using this solver with accept_inconsistent=true. However, beware that the implicit function theorem does not cover the case of inconsistent linear systems AJ = -B, so it is unclear what the result will mean.
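For instance (the constructor keywords below are assumptions based on the fields listed above), the solver can be reconfigured and passed back to ImplicitFunction:

    # Hypothetical: accept least-squares solutions instead of returning NaNs
    solver = IterativeLinearSolver(; accept_inconsistent=true)
    implicit = ImplicitFunction(forward, conditions; linear_solver=solver)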

ChainRulesCore.rrule (Function)
rrule(rc, implicit, x, args...; kwargs...)

Custom reverse rule for an ImplicitFunction, to ensure compatibility with reverse mode autodiff.

On Julia ≥ 1.9, this rule is only available when ChainRulesCore.jl is loaded (as a package extension); on earlier Julia versions it is always available.

We compute the vector-Jacobian product Jᵀv by solving Aᵀu = v and setting Jᵀv = -Bᵀu. Positional and keyword arguments are passed to both implicit.forward and implicit.conditions.
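Continuing the hypothetical square-root example from the ImplicitFunction entry above, reverse mode exercises this rule whenever a scalar loss is pulled back through the implicit function:

    using Zygote

    # Gradient of a scalar loss of y: one transposed linear solve per pullback
    Zygote.gradient(x -> sum(implicit(x)), rand(3))[1]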
