Derivative methods (gradients and Hessians)

When constructing a PEtabODEProblem, PEtab.jl supports several methods for computing gradients and Hessians. This page summarizes the available methods and their tunable options.

Gradient methods

PEtab.jl supports three gradient methods for PEtabODEProblem: forward-mode automatic differentiation (:ForwardDiff), forward-sensitivity equations (:ForwardEquations), and adjoint sensitivity analysis (:Adjoint). Introductions to the underlying mathematics and autodiff can be found in [20, 21]. Below is a brief overview.

  • :ForwardDiff: Uses ForwardDiff.jl to compute gradients with forward-mode automatic differentiation [22]. The main tuning option is chunksize (number of directional derivatives per pass). The default is usually good, but tuning can yield small speedups; this method is often fastest for small models [1, 23].

  • :ForwardEquations: Computes gradients via forward sensitivities by solving an expanded ODE system. The main option is sensealg. The default is sensealg=:ForwardDiff, which uses ForwardDiff-based sensitivity computation and is often the fastest. PEtab.jl also supports ForwardSensitivity() and ForwardDiffSensitivity() from SciMLSensitivity.jl for sensealg; see the SciMLSensitivity documentation for details and tunable options.

  • :Adjoint: Computes gradients via adjoint sensitivity analysis by solving an adjoint problem backward in time. Benchmarks often find adjoints most efficient for large models [24, 25]. The main option is sensealg, selecting a SciMLSensitivity adjoint algorithm (InterpolatingAdjoint, GaussAdjoint, or QuadratureAdjoint). See the SciMLSensitivity documentation for tunable options.
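As a sketch of how these options fit together, the snippet below selects the two gradient methods that do not require SciMLSensitivity. It assumes a PEtabModel named model has already been constructed; the keyword names follow the options described above, so treat this as illustrative rather than definitive:

```julia
using PEtab

# `model` is assumed to be an already constructed PEtabModel.

# Forward-mode automatic differentiation; often fastest for small models.
# chunksize sets the number of directional derivatives per pass (optional,
# the default is usually good).
prob_fd = PEtabODEProblem(model; gradient_method = :ForwardDiff,
                          chunksize = 10)

# Forward sensitivity equations; the default sensealg = :ForwardDiff
# (ForwardDiff-based sensitivities) is often the fastest choice.
prob_fe = PEtabODEProblem(model; gradient_method = :ForwardEquations,
                          sensealg = :ForwardDiff)
```

Both calls return a PEtabODEProblem whose gradient is then computed with the chosen method during parameter estimation.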

Using SciMLSensitivity methods

To use SciMLSensitivity-based methods (e.g. adjoints), load the package with using SciMLSensitivity before creating the PEtabODEProblem.
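For example, an adjoint-based problem might be set up as below. This is a hedged sketch: model stands in for an already constructed PEtabModel, and the sensealg value is one of the SciMLSensitivity adjoint algorithms listed above:

```julia
# Load SciMLSensitivity before creating the PEtabODEProblem, so that
# its adjoint algorithms (e.g. InterpolatingAdjoint) are available.
using SciMLSensitivity
using PEtab

# `model` is assumed to be an already constructed PEtabModel.
prob_adj = PEtabODEProblem(model; gradient_method = :Adjoint,
                           sensealg = InterpolatingAdjoint())
```

GaussAdjoint() or QuadratureAdjoint() can be substituted for InterpolatingAdjoint(); see the SciMLSensitivity documentation for the trade-offs between them.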

Hessian methods

Three Hessian methods are supported: forward-mode automatic differentiation (:ForwardDiff), a block approximation (:BlockForwardDiff), and a Gauss-Newton approximation (:GaussNewton). Below is a brief overview.

  • :ForwardDiff: Computes the full Hessian via forward-mode automatic differentiation with ForwardDiff.jl. The main tuning option is chunksize. This method has quadratic cost in the number of estimated parameters (O(n^2)), and is typically only feasible up to around n ≈ 20 parameters. When feasible, access to the full Hessian can improve convergence, especially in multi-start estimation [1].

  • :BlockForwardDiff: Computes a block-diagonal Hessian approximation using forward-mode automatic differentiation with ForwardDiff.jl. In many PEtab problems, parameters can be split into ODE parameters x_p and non-ODE parameters x_q. This method computes the Hessian block for each group and sets the cross-terms to zero:

    H_block = [ H_p   0  ]
              [  0   H_q ]
  • :GaussNewton: Approximates the Hessian using the Gauss–Newton method. It often performs better than (L)BFGS [24], but requires forward sensitivities (similar to :ForwardEquations). For models with many parameters (often >75), computing forward sensitivities can be too expensive; in that regime, (L)BFGS approximations are often the only practical option. For mathematical details, see [26].
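The Hessian method is selected analogously to the gradient method. The following sketch again assumes an already constructed PEtabModel named model, with keyword names following the options described above:

```julia
using PEtab

# `model` is assumed to be an already constructed PEtabModel.

# Full Hessian via forward-mode AD; typically feasible up to n ≈ 20
# estimated parameters.
prob_full = PEtabODEProblem(model; hessian_method = :ForwardDiff)

# Block-diagonal approximation: per-block Hessians for ODE and non-ODE
# parameters, with cross-terms set to zero.
prob_block = PEtabODEProblem(model; hessian_method = :BlockForwardDiff)

# Gauss-Newton approximation built from forward sensitivities.
prob_gn = PEtabODEProblem(model; hessian_method = :GaussNewton)
```

Since :GaussNewton relies on forward sensitivities, it inherits their cost: for models with many parameters (often >75), an (L)BFGS approximation inside the optimizer is usually the more practical choice.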

References

  1. S. Persson, F. Fröhlich, S. Grein, T. Loman, D. Ognissanti, V. Hasselgren, J. Hasenauer and M. Cvijovic. PEtab.jl: advancing the efficiency and utility of dynamic modelling. Bioinformatics 41, btaf497 (2025).

  20. F. Sapienza, J. Bolibar, F. Schäfer, B. Groenke, A. Pal, V. Boussange, P. Heimbach, G. Hooker, F. Pérez, P.-O. Persson and others. Differentiable Programming for Differential Equations: A Review, arXiv preprint arXiv:2406.09699 (2024).

  21. M. Blondel and V. Roulet. The elements of differentiable programming, arXiv preprint arXiv:2403.14606 (2024).

  22. J. Revels, M. Lubin and T. Papamarkou. Forward-mode automatic differentiation in Julia, arXiv preprint arXiv:1607.07892 (2016).

  23. R. Mester, A. Landeros, C. Rackauckas and K. Lange. Differential methods for assessing sensitivity in biological models. PLoS computational biology 18, e1009598 (2022).

  24. F. Fröhlich, B. Kaltenbacher, F. J. Theis and J. Hasenauer. Scalable parameter estimation for genome-scale biochemical reaction networks. PLoS computational biology 13, e1005331 (2017).

  25. Y. Ma, V. Dixit, M. J. Innes, X. Guo and C. Rackauckas. A comparison of automatic differentiation and continuous sensitivity analysis for derivatives of differential equation solutions. In: 2021 IEEE High Performance Extreme Computing Conference (HPEC) (IEEE, 2021); pp. 1–9.

  26. A. Raue, B. Steiert, M. Schelker, C. Kreutz, T. Maiwald, H. Hass, J. Vanlier, C. Tönsing, L. Adlung, R. Engesser and others. Data2Dynamics: a modeling environment tailored to parameter estimation in dynamical systems. Bioinformatics 31, 3558–3560 (2015).