Available and Recommended Optimization Algorithms

For the calibrate and calibrate_multistart functions, PEtab.jl supports optimization algorithms from several popular optimization packages: Optim.jl, Ipopt.jl, and Fides.py. This page provides information on each package, as well as recommendations.

When choosing an optimization algorithm, it is important to keep the no-free-lunch principle in mind: while an algorithm may work well for one problem, there is no universally best method. Nevertheless, benchmark studies have identified algorithms that often perform well for ODE models in biology (and likely beyond) [2, 3, 5]. In particular, the best algorithm to use depends on the size of the parameter estimation problem. This is because the problem considered here is a non-linear continuous optimization problem, and for such problems, access to a good Hessian approximation improves performance. The problem size, in turn, dictates which type of Hessian approximation can feasibly be computed (see this page for more details). Following this, we recommend:

  • For small models (fewer than 10 ODEs and fewer than 20 parameters to estimate), where computing the full Hessian is often computationally feasible, the IPNewton() method from Optim.jl (see the sketch after this list).
  • For medium-sized models (roughly more than 10 ODEs and fewer than 75 parameters), where a Gauss-Newton Hessian can be computed, Fides. The Gauss-Newton Hessian approximation typically outperforms the more common (L)-BFGS approximation, and benchmarks have shown that Fides performs well with such an approximation [1]. If Fides is difficult to install, Optim.BFGS also performs well.
  • For large models (more than 20 ODEs and more than 75 parameters to estimate), where even a Gauss-Newton approximation is too computationally expensive, an (L)-BFGS optimizer such as Ipopt or Optim.BFGS is recommended.
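
To make the first recommendation concrete, below is a minimal sketch of calibrating a small model with IPNewton; it assumes a PEtabODEProblem petab_prob and a start guess x0 already exist:

using Optim
# Interior-point Newton: recommended for small models where the full
# Hessian is computationally feasible to evaluate
res = calibrate(petab_prob, x0, IPNewton())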

Optim.jl

PEtab.jl supports three optimization algorithms from Optim.jl: LBFGS, BFGS, and IPNewton (interior-point Newton). Options for these algorithms can be specified via Optim.Options(), and a complete list of available options can be found in the Optim.jl documentation. For example, to use LBFGS with 10,000 iterations, do:

using Optim
res = calibrate(petab_prob, x0, Optim.LBFGS();
                options=Optim.Options(iterations = 10000))

If no options are provided, the default ones are used:

Optim.Options(iterations = 1000,
              show_trace = false,
              allow_f_increases = true,
              successive_f_tol = 3,
              f_tol = 1e-8,
              g_tol = 1e-6,
              x_tol = 0.0)

For more details on each algorithm and tunable options, see the Optim.jl documentation.

Ipopt

Ipopt is an interior-point Newton method for nonlinear optimization [10]. In PEtab.jl, Ipopt can be configured to use either the Hessian from the PEtabODEProblem or an LBFGS Hessian approximation, through the IpoptOptimizer:

PEtab.IpoptOptimizer (Type)
IpoptOptimizer(LBFGS::Bool)

Set up the Ipopt interior-point Newton optimizer for parameter estimation.

Ipopt can be configured to use either the Hessian method from the PEtabODEProblem (LBFGS=false) or an LBFGS scheme (LBFGS=true). For setting other Ipopt options, see IpoptOptions.

See also calibrate and calibrate_multistart.

Description

Ipopt is an interior-point Newton method for constrained non-linear optimization problems. More information on the algorithm can be found in [1].

  1. Wächter and Biegler, Mathematical programming, pp 25-57 (2006)
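
For instance, to have Ipopt use the Hessian method from the PEtabODEProblem rather than an LBFGS approximation, a minimal sketch (again assuming petab_prob and x0 already exist) is:

using Ipopt
# LBFGS=false: use the Hessian method defined by the PEtabODEProblem
res = calibrate(petab_prob, x0, IpoptOptimizer(false))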

Ipopt offers a wide range of options (perhaps too many, in the words of the authors). A subset of these options can be specified using IpoptOptions:

PEtab.IpoptOptions (Type)
IpoptOptions(; kwargs...)

Options for parameter estimation with IpoptOptimizer.

More details on the options can be found in the Ipopt documentation.

See also IpoptOptimizer, calibrate, and calibrate_multistart.

Keyword Arguments

  • print_level = 0: Output verbosity level (valid values are 0 ≤ print_level ≤ 12)
  • max_iter = 1000: Maximum number of iterations
  • tol = 1e-8: Relative convergence tolerance
  • acceptable_tol = 1e-6: Acceptable relative convergence tolerance
  • max_wall_time = 1e20: Maximum wall-clock time (in seconds) the optimization is allowed to run
  • acceptable_obj_change_tol = 1e20: Acceptance stopping criterion based on the change in the objective function

For example, to use Ipopt with 10,000 iterations and the LBFGS Hessian approximation, do:

using Ipopt
res = calibrate(petab_prob, x0, IpoptOptimizer(true); 
                options=IpoptOptions(max_iter = 10000))

For more information on Ipopt and its available options, see the Ipopt documentation and the original publication [10].

Note

To use Ipopt, the Ipopt.jl package must be loaded with using Ipopt before running parameter estimation.

Fides

Fides.py is a trust-region Newton method designed for box-constrained optimization problems [1]. It is particularly efficient when the Hessian is approximated using the Gauss-Newton method.

The only drawback with Fides is that it is a Python package, but fortunately, it can be used from PEtab.jl through PyCall. To this end, you must build PyCall with a Python environment that has Fides installed:

using PyCall
# Path to the Python executable with Fides installed
path_python_exe = "path_python"
ENV["PYTHON"] = path_python_exe
# Rebuild PyCall against the Fides Python environment (restart Julia
# afterwards for the new build to take effect)
import Pkg
Pkg.build("PyCall")

Fides supports several Hessian approximations, which can be specified in the Fides constructor:

PEtab.Fides (Type)
Fides(hessian_method; verbose::Bool=false)

Set up the Fides box-constrained Newton trust-region optimizer for parameter estimation.

See also calibrate and calibrate_multistart.

Arguments

  • hessian_method: Method for computing the Hessian. Allowed options are:
    • nothing: The Hessian computed by the PEtabODEProblem is used.
    • :BB: Broyden's "bad" method.
    • :BFGS: Broyden-Fletcher-Goldfarb-Shanno update strategy.
    • :BG: Broyden's "good" method.
    • :Broyden: Broyden-class update scheme.
    • :SR1: Symmetric Rank 1 update.
    • :SSM: Structured Secant Method.
    • :TSSM: Totally Structured Secant Method.
  • verbose: Whether to print progress during the parameter estimation.

Description

Fides is a Newton trust-region optimizer for box-constrained optimization problems. More information on the algorithm can be found in [1]. Fides particularly excels when the full Hessian is too computationally expensive to compute, but a Gauss-Newton Hessian approximation can be computed (for more details, see the documentation). In addition to supporting user-provided Hessians via the PEtabODEProblem, it supports several Hessian approximation methods. For a more extensive description than the one above, see the Fides documentation.

  1. Fröhlich and Sorger, PLoS computational biology, pp e1010322 (2022)
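
For example, to have Fides maintain its own BFGS Hessian approximation instead of using the Hessian from the PEtabODEProblem, a minimal sketch (assuming petab_prob and x0 already exist) is:

using PyCall
# :BFGS makes Fides build an internal BFGS approximation of the Hessian
res = calibrate(petab_prob, x0, Fides(:BFGS))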

A notable feature of Fides is that in each optimization step, the objective, gradient, and Hessian are computed simultaneously. This opens up the possibility of efficiently reusing computed quantities, especially when the Hessian is computed via the Gauss-Newton approximation: the forward sensitivities required for the Gauss-Newton Hessian can also be reused to compute the gradient. Hence, a good PEtabODEProblem configuration for Fides with Gauss-Newton is:

petab_prob = PEtabODEProblem(model; gradient_method = :ForwardEquations, 
                             hessian_method = :GaussNewton,
                             reuse_sensitivities = true)

Given this setup, the Hessian method from the PEtabODEProblem can be used to run Fides for up to 1,000 iterations with:

using PyCall
res = calibrate(petab_prob, x0, Fides(nothing);
                options=py"{'maxiter' : 1000}")

As shown above, Fides options are specified using a Python dictionary. Available options and their default values can be found in the Fides documentation, and more information on the algorithm is given in the original publication [1].

References

[1]
F. Fröhlich and P. K. Sorger. Fides: Reliable trust-region optimization for parameter estimation of ordinary differential equation models. PLoS computational biology 18, e1010322 (2022).
[2]
A. Raue, M. Schilling, J. Bachmann, A. Matteson, M. Schelke, D. Kaschek, S. Hug, C. Kreutz, B. D. Harms, F. J. Theis and others. Lessons learned from quantitative dynamical modeling in systems biology. PLoS ONE 8, e74335 (2013).
[3]
H. Hass, C. Loos, E. Raimundez-Alvarez, J. Timmer, J. Hasenauer and C. Kreutz. Benchmark problems for dynamic modeling of intracellular processes. Bioinformatics 35, 3073–3082 (2019).
[5]
A. F. Villaverde, F. Fröhlich, D. Weindl, J. Hasenauer and J. R. Banga. Benchmarking optimization methods for parameter estimation in large kinetic models. Bioinformatics 35, 830–838 (2019).
[10]
A. Wächter and L. T. Biegler. On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming. Mathematical programming 106, 25–57 (2006).