Tutorial

We present a typical workflow with DifferentiationInterfaceTest.jl, building on the tutorial of the DifferentiationInterface.jl documentation (which we encourage you to read first).

julia> using DifferentiationInterface, DifferentiationInterfaceTest
julia> import ForwardDiff, Enzyme

Introduction

The AD backends we want to compare are ForwardDiff.jl and Enzyme.jl.

backends = [AutoForwardDiff(), AutoEnzyme(; mode=Enzyme.Reverse)]
2-element Vector{ADTypes.AbstractADType}:
 AutoForwardDiff()
 AutoEnzyme(mode=EnzymeCore.ReverseMode{false, EnzymeCore.FFIABI, false}())

To do that, we are going to take gradients of a simple function:

f(x::AbstractArray) = sum(sin, x)
f (generic function with 1 method)

Of course we know the true gradient mapping:

∇f(x::AbstractArray) = cos.(x)
∇f (generic function with 1 method)
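As a quick sanity check (not part of the original snippet), we can verify that the analytical gradient agrees with what DifferentiationInterface computes, using its `gradient(f, backend, x)` operator:

```julia
# Sanity check: the AD gradient should match the analytical one.
x = rand(Float32, 3)
DifferentiationInterface.gradient(f, AutoForwardDiff(), x) ≈ ∇f(x)  # expected: true
```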

DifferentiationInterfaceTest.jl relies on so-called "scenarios", which encapsulate the information needed for your tests:

  • the function f
  • the input x and output y of the function f
  • the reference output of the operator (here grad)
  • the number of arguments for f (either 1 or 2)
  • the behavior of the operator (either :inplace or :outofplace)

There is one scenario constructor per operator, and so here we will use GradientScenario:

xv = rand(Float32, 3)
xm = rand(Float64, 3, 2)
scenarios = [
    GradientScenario(f; x=xv, y=f(xv), grad=∇f(xv), nb_args=1, place=:inplace),
    GradientScenario(f; x=xm, y=f(xm), grad=∇f(xm), nb_args=1, place=:inplace)
];

Testing

The main entry point for testing is the function test_differentiation. It has many options, but the main ingredients are the following:

julia> test_differentiation(
           backends,  # the backends you want to compare
           scenarios,  # the scenarios you defined
           correctness=true,  # compares values against the reference
           type_stability=false,  # checks type stability with JET.jl
           detailed=true,  # prints a detailed test set
       )
Test Summary:                                                                | Pass  Total   Time
Testing correctness                                                          |   68     68  39.1s
  AutoForwardDiff()                                                          |   34     34   4.5s
    gradient                                                                 |   34     34   4.5s
      Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32          |   17     17   2.3s
      Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64          |   17     17   2.2s
  AutoEnzyme(mode=EnzymeCore.ReverseMode{false, EnzymeCore.FFIABI, false}()) |   34     34  34.5s
    gradient                                                                 |   34     34  34.5s
      Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32          |   17     17  32.9s
      Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64          |   17     17   1.6s

If you are too lazy to manually specify the reference, you can also provide an AD backend as the ref_backend keyword argument, which will serve as the ground truth for comparison.
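For instance, the call might look like the following sketch (hypothetical: it assumes scenarios built without a manual `grad` reference, with ForwardDiff serving as the ground truth for Enzyme):

```julia
# Sketch: use ForwardDiff as the reference instead of a hand-written gradient.
test_differentiation(
    [AutoEnzyme(; mode=Enzyme.Reverse)],
    scenarios;
    correctness=true,
    ref_backend=AutoForwardDiff(),  # reference values computed with ForwardDiff
)
```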

Benchmarking

Once you are confident that your backends give the correct answers, you probably want to compare their performance. This is made easy by the benchmark_differentiation function, whose syntax should feel familiar:

df = benchmark_differentiation(backends, scenarios);
12×11 DataFrame
 Row │ backend                                               scenario                                                       operator             calls  samples  evals  time      allocs   bytes    gc_fraction  compile_fraction
     │ Abstract…                                             Scenario…                                                      Symbol               Int64  Int64    Int64  Float64   Float64  Float64  Float64      Float64
─────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  prepare_gradient         0        1      1  4.168e-6     11.0    528.0          0.0               0.0
   2 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  value_and_gradient!      1    47032      1  8.0e-8        1.0     32.0          0.0               0.0
   3 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  gradient!                1    40458      1  6.9e-8        0.0      0.0          0.0               0.0
   4 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  prepare_gradient         0        1      1  6.202e-6     11.0   1776.0          0.0               0.0
   5 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  value_and_gradient!      1    35259      1  2.0e-7        5.0    192.0          0.0               0.0
   6 │ AutoForwardDiff()                                     Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  gradient!                1    30936      1  1.9e-7        4.0    160.0          0.0               0.0
   7 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  prepare_gradient         0        1      1  1.3e-7        0.0      0.0          0.0               0.0
   8 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  value_and_gradient!      1    57724      1  8.52e-7       9.0    192.0          0.0               0.0
   9 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Vector{Float32} -> Float32  gradient!                1    99062      1  6.9e-8        0.0      0.0          0.0               0.0
  10 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  prepare_gradient         0        1      1  2.1e-7        0.0      0.0          0.0               0.0
  11 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  value_and_gradient!      1    55679      1  9.02e-7       9.0    192.0          0.0               0.0
  12 │ AutoEnzyme(mode=ReverseMode{false, FFIABI, false}())  Scenario{:gradient,1,:inplace} f : Matrix{Float64} -> Float64  gradient!                1    96101      1  1.0e-7        0.0      0.0          0.0               0.0

The resulting object is a DataFrame from DataFrames.jl, whose columns correspond to the fields of DifferentiationBenchmarkDataRow:
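Since the result is an ordinary DataFrame, standard DataFrames.jl operations apply. As a sketch (assuming the `df` obtained above), one might isolate the `gradient!` rows and rank backends by runtime:

```julia
# Sketch of downstream analysis with DataFrames.jl (assumes `df` from above).
using DataFrames
grad = filter(row -> row.operator == :gradient!, df)  # keep only the gradient! rows
sort!(grad, :time)                                    # fastest first
select(grad, :backend, :scenario, :time, :allocs)     # columns of interest
```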