Optimization algorithms and recommendations

For calibrate and calibrate_multistart, PEtab.jl supports optimizers from three popular packages: Optim.jl, Ipopt.jl, and Fides.jl. This page summarizes the available algorithms and provides recommendations by model size.

When choosing an optimization algorithm, the no-free-lunch principle applies: performance is problem-dependent and no method is universally best. Still, benchmark studies have identified methods that often work well for ODE models in biology (and likely beyond) [2-4]. In brief, the choice is driven by model size: for small models an accurate Hessian (or a good approximation) is often feasible to compute, while for larger models it typically is not (see gradient and Hessian support). Accordingly, we recommend:

  • Small models (<10 ODEs and <20 estimated parameters): Optim.IPNewton() with the exact Hessian (hessian_method = :ForwardDiff in the PEtabODEProblem); a configuration sketch is given after this list.

  • Medium-sized models (≈10-20 ODEs and <75 estimated parameters): Fides.CustomHessian() with a Gauss-Newton Hessian approximation (hessian_method = :GaussNewton in the PEtabODEProblem). Gauss-Newton often outperforms the (L)BFGS approximation in this regime [1].

  • Large models (>20 ODEs or >75 estimated parameters): (L)BFGS methods such as Ipopt or Fides.BFGS.
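
For example, a minimal sketch of the small-model recommendation (assuming model is a PEtabModel and x0 is a start guess, as in the examples below; pairing the exact Hessian with gradient_method = :ForwardDiff is one reasonable choice):

julia
using Optim
# Sketch: exact ForwardDiff Hessian paired with the interior-point
# Newton method, as recommended for small models.
petab_prob = PEtabODEProblem(
    model;
    gradient_method = :ForwardDiff,
    hessian_method = :ForwardDiff,
)
res = calibrate(petab_prob, x0, Optim.IPNewton())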

Fides

Fides.jl is a trust-region Newton method for box-constrained optimization [1]. It performs particularly well with a Gauss-Newton Hessian approximation, but it also has built-in support for other approximations (e.g., BFGS, SR1; see the Fides documentation).

Fides evaluates the objective, gradient, and Hessian in each iteration, enabling reuse of intermediate quantities. Since the Gauss–Newton Hessian is based on forward sensitivities (which can also be used to compute the gradient), a recommended PEtabODEProblem configuration is:

julia
using Fides
petab_prob = PEtabODEProblem(
    model;
    gradient_method = :ForwardEquations,
    hessian_method = :GaussNewton,
    reuse_sensitivities = true,
)
res = calibrate(
    petab_prob, x0, Fides.CustomHessian();
    options = FidesOptions(maxiter = 1000)
)

Fides solver options are set with FidesOptions (see the Fides documentation). Fides also provides built-in Hessian approximations such as BFGS:

julia
res = calibrate(petab_prob, x0, Fides.BFGS())

Optim.jl

PEtab.jl supports three algorithms from Optim.jl: LBFGS, BFGS, and IPNewton (interior-point Newton).

Solver settings are passed via options = Optim.Options(...) (see the Optim.jl documentation). For example, to run Optim.LBFGS() for up to 10_000 iterations:

julia
using Optim
res = calibrate(
    petab_prob, x0, Optim.LBFGS();
    options = Optim.Options(iterations = 10_000)
)
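
The same optimizers can also be used with calibrate_multistart for multi-start parameter estimation. A hedged sketch (assuming calibrate_multistart takes the number of starts as its third positional argument; 50 starts is an arbitrary choice):

julia
using Optim
# Sketch: 50 local optimizations from different start guesses,
# each limited to 10_000 iterations.
res_ms = calibrate_multistart(
    petab_prob, Optim.LBFGS(), 50;
    options = Optim.Options(iterations = 10_000)
)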

Ipopt

Ipopt is an interior-point Newton method for nonlinear optimization [5]. In PEtab.jl, Ipopt can either use the Hessian provided by the PEtabODEProblem or an L-BFGS Hessian approximation via IpoptOptimizer:

PEtab.IpoptOptimizer Type
julia
IpoptOptimizer(LBFGS::Bool)

Set up the Ipopt interior-point Newton optimizer for parameter estimation.

Ipopt can be configured to use either the Hessian method from the PEtabODEProblem (LBFGS=false) or an LBFGS scheme (LBFGS=true). For setting other Ipopt options, see IpoptOptions.

See also calibrate and calibrate_multistart.

Description

Ipopt is an interior-point Newton method for constrained nonlinear optimization problems. More information on the algorithm can be found in [1].

  1. Wächter and Biegler, Mathematical programming, pp 25-57 (2006)

Ipopt exposes many solver options. A commonly used subset can be set via IpoptOptions:

PEtab.IpoptOptions Type
julia
IpoptOptions(; kwargs...)

Options for parameter estimation with IpoptOptimizer.

More details on the options can be found in the Ipopt documentation.

See also IpoptOptimizer, calibrate, and calibrate_multistart.

Keyword Arguments

  • print_level = 0: Output verbosity level (valid values are 0 ≤ print_level ≤ 12)

  • max_iter = 1000: Maximum number of iterations

  • tol = 1e-8: Relative convergence tolerance

  • acceptable_tol = 1e-6: Acceptable relative convergence tolerance

  • max_wall_time = 1e20: Maximum wall time (in seconds) the optimization is allowed to run

  • acceptable_obj_change_tol = 1e20: Acceptance stopping criterion based on objective function change


For example, to run Ipopt for 1000 iterations using the L-BFGS Hessian approximation:

julia
using Ipopt
res = calibrate(
    petab_prob, x0, IpoptOptimizer(true);
    options = IpoptOptions(max_iter = 1000)
)

For details and the full option list, see the Ipopt documentation and the original publication [5].
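
Alternatively, Ipopt can use the Hessian supplied by the PEtabODEProblem. A sketch, assuming petab_prob was built with a Hessian method (e.g. hessian_method = :GaussNewton as in the Fides example above):

julia
using Ipopt
# Sketch: Ipopt with the problem-supplied Hessian instead of L-BFGS,
# printing moderately verbose solver output.
res = calibrate(
    petab_prob, x0, IpoptOptimizer(false);
    options = IpoptOptions(max_iter = 1000, print_level = 5)
)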

Note

To use Ipopt, load Ipopt.jl with using Ipopt before running parameter estimation.

References

  1. F. Fröhlich and P. K. Sorger. Fides: Reliable trust-region optimization for parameter estimation of ordinary differential equation models. PLoS Computational Biology 18, e1010322 (2022).

  2. A. Raue, M. Schilling, J. Bachmann, A. Matteson, M. Schelke, D. Kaschek, S. Hug, C. Kreutz, B. D. Harms, F. J. Theis and others. Lessons learned from quantitative dynamical modeling in systems biology. PLoS ONE 8, e74335 (2013).

  3. H. Hass, C. Loos, E. Raimundez-Alvarez, J. Timmer, J. Hasenauer and C. Kreutz. Benchmark problems for dynamic modeling of intracellular processes. Bioinformatics 35, 3073–3082 (2019).

  4. A. F. Villaverde, F. Fröhlich, D. Weindl, J. Hasenauer and J. R. Banga. Benchmarking optimization methods for parameter estimation in large kinetic models. Bioinformatics 35, 830–838 (2019).

  5. A. Wächter and L. T. Biegler. On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming. Mathematical Programming 106, 25–57 (2006).