Local Solvers

Local solvers are algorithms that find a local minimum of a function. In globalsearch-rs, they are provided through the `cobyla` and `argmin` crates and are used to refine the points in the reference set.

Available Local Solvers

The following local solvers are available in globalsearch-rs:

  • COBYLA (default)
  • L-BFGS
  • Nelder-Mead
  • Steepest Descent
  • Trust Region
  • Newton-CG
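
To choose one of these solvers, the local solver type is set when building the OQNLP parameters. Below is a minimal sketch assuming the `OQNLPParams` field and `LocalSolverType` variant names from the crate's README; verify them against the docs for your version.

```rust
use globalsearch::types::{LocalSolverType, OQNLPParams};

fn main() {
    // Select Nelder-Mead as the local solver; all other OQNLP parameters
    // keep their defaults. Field and variant names are assumptions taken
    // from the crate's README.
    let params = OQNLPParams {
        local_solver_type: LocalSolverType::NelderMead,
        ..OQNLPParams::default()
    };
    let _ = params; // in real use, pass to OQNLP::new(problem, params)
}
```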

Requirements Comparison

Below is a comparison of the requirements for each local solver:

| Solver | Gradient | Hessian | Line Search | Notes |
|--------|----------|---------|-------------|-------|
| COBYLA | No | No | No | Derivative-free, constraint support |
| L-BFGS | Yes | No | Yes | Optional L1 regularization via OWL-QN |
| Nelder-Mead | No | No | No | Derivative-free |
| Steepest Descent | Yes | No | Yes | First-order derivatives |
| Trust Region | Yes | Yes | No | Uses trust-region radius updates |
| Newton-CG | Yes | Yes | Yes | Conjugate-gradient Newton steps |

Detailed Overview of Each Solver

COBYLA

COBYLA (Constrained Optimization BY Linear Approximation) is a derivative-free optimization method that is particularly useful for problems with nonlinear constraints. It works by iteratively constructing linear approximations of the objective function and the constraints. This algorithm requires that the `Problem` struct implements the `objective` method; if the problem has constraints, the `Problem` struct must also implement the `constraints` method.

COBYLA is the only local solver in globalsearch-rs that supports constraints and bounds.
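
As a sketch of those requirements, the hypothetical problem below implements `objective` and `variable_bounds` (names taken from the crate's README) together with a `constraints` method. The exact `constraints` signature is an assumption here; consult the `Problem` trait definition in your version of the crate.

```rust
use globalsearch::problem::Problem;
use globalsearch::types::EvaluationError;
use ndarray::{array, Array1, Array2};

#[derive(Debug, Clone)]
struct ConstrainedProblem; // hypothetical problem used for illustration

impl Problem for ConstrainedProblem {
    // f(x) = (x0 - 1)^2 + (x1 - 2)^2
    fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
        Ok((x[0] - 1.0).powi(2) + (x[1] - 2.0).powi(2))
    }

    // Variable bounds, one [lower, upper] row per dimension.
    fn variable_bounds(&self) -> Array2<f64> {
        array![[-5.0, 5.0], [-5.0, 5.0]]
    }

    // COBYLA treats a constraint as satisfied when it evaluates to >= 0.
    // NOTE: the exact signature of `constraints` is an assumption; check
    // the `Problem` trait definition in your version of globalsearch-rs.
    fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64> {
        // Hypothetical constraint: x0 + x1 >= 1
        vec![|x: &[f64], _: &mut ()| x[0] + x[1] - 1.0]
    }
}
```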

Parameters for COBYLA

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `max_iter` | `usize` | Maximum number of iterations | 300 |
| `initial_step_size` | `f64` | Initial step size | 1.0 |
| `ftol_rel` | `f64` | Relative tolerance for the objective function | 1e-6 |
| `ftol_abs` | `f64` | Absolute tolerance for the objective function | 1e-8 |
| `xtol_rel` | `f64` | Relative tolerance for the solution | 0 (disabled) |
| `xtol_abs` | `f64` | Absolute tolerance for the solution | 0 (disabled) |
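
A hedged configuration sketch follows; the builder and setter names mirror the parameter table above and are assumptions, so verify them against the crate docs for your version.

```rust
use globalsearch::local_solver::builders::COBYLABuilder;
use globalsearch::types::{LocalSolverType, OQNLPParams};

fn main() {
    // Builder, setter, and field names are assumptions based on the table.
    let cobyla = COBYLABuilder::default()
        .max_iter(500)          // raise the iteration cap from 300
        .initial_step_size(0.5) // smaller first step than the 1.0 default
        .ftol_rel(1e-8)         // tighter relative objective tolerance
        .build();

    let _params = OQNLPParams {
        local_solver_type: LocalSolverType::COBYLA,
        local_solver_config: cobyla,
        ..OQNLPParams::default()
    };
}
```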

L-BFGS

The L-BFGS (Limited-memory Broyden-Fletcher-Goldfarb-Shanno) algorithm is a quasi-Newton method that is popular for large-scale optimization problems and known for its efficiency and effectiveness. It also provides optional L1 regularization via the Orthant-Wise Limited-memory Quasi-Newton (OWL-QN) method.

This algorithm requires that the `Problem` struct implements the `objective` and `gradient` methods. It also requires a line search method, specified in the `OQNLPParams` struct, which is used to find the step size for the L-BFGS algorithm.
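
For a concrete picture of those requirements, here is a minimal sketch of a `Problem` with an analytic gradient. The trait and method names follow the crate's README; treat them as assumptions for your version.

```rust
use globalsearch::problem::Problem;
use globalsearch::types::EvaluationError;
use ndarray::{array, Array1, Array2};

#[derive(Debug, Clone)]
struct Quadratic; // hypothetical problem: f(x) = x0^2 + 2 x1^2

impl Problem for Quadratic {
    fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
        Ok(x[0].powi(2) + 2.0 * x[1].powi(2))
    }

    // Analytic gradient: [2 x0, 4 x1]. Required by gradient-based
    // solvers such as L-BFGS and Steepest Descent.
    fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
        Ok(array![2.0 * x[0], 4.0 * x[1]])
    }

    fn variable_bounds(&self) -> Array2<f64> {
        array![[-5.0, 5.0], [-5.0, 5.0]]
    }
}
```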

Parameters for L-BFGS

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `max_iter` | `usize` | Maximum number of iterations | 300 |
| `tolerance_grad` | `f64` | Gradient tolerance | `sqrt(EPSILON)` |
| `tolerance_cost` | `f64` | Cost tolerance | `EPSILON` |
| `history_size` | `usize` | Number of past updates to the position (x) and gradient (∇f) used to approximate the inverse Hessian | 10 |
| `l1_coefficient` | `Option<f64>` | L1 regularization coefficient (OWL-QN) to penalize high-value coefficients | `None` |
| `line_search_params` | `LineSearchParams` | Line search parameters | `LineSearchParams::default()` |
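
A hedged configuration sketch; builder and setter names mirror the table above and are assumptions.

```rust
use globalsearch::local_solver::builders::LBFGSBuilder;

fn main() {
    // Setter names are assumptions based on the parameter table.
    let _lbfgs = LBFGSBuilder::default()
        .max_iter(500)             // more iterations than the 300 default
        .history_size(15)          // keep 15 (x, ∇f) pairs instead of 10
        .l1_coefficient(Some(0.1)) // enable OWL-QN with an L1 penalty
        .build();
}
```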

Nelder-Mead

The Nelder-Mead algorithm is a simplex-based, derivative-free optimization method that is well suited to non-smooth functions. It is simple to implement and widely used in practice.

This algorithm requires that the `Problem` struct implements the `objective` method.

Parameters for Nelder-Mead

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `simplex_delta` | `f64` | Step size for generating the simplex from a given point | 0.1 |
| `sd_tolerance` | `f64` | Sample standard deviation tolerance | `EPSILON` |
| `max_iter` | `usize` | Maximum number of iterations | 300 |
| `alpha` | `f64` | Reflection coefficient | 1.0 |
| `gamma` | `f64` | Expansion coefficient | 2.0 |
| `rho` | `f64` | Contraction coefficient | 0.5 |
| `sigma` | `f64` | Shrinkage coefficient | 0.5 |
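
A hedged configuration sketch; builder and setter names mirror the table above and are assumptions.

```rust
use globalsearch::local_solver::builders::NelderMeadBuilder;

fn main() {
    // Setter names are assumptions based on the parameter table.
    let _nelder_mead = NelderMeadBuilder::default()
        .simplex_delta(0.25) // larger initial simplex than the 0.1 default
        .max_iter(500)
        .gamma(2.5)          // more aggressive expansion steps
        .build();
}
```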

Steepest Descent

The Steepest Descent algorithm is a first-order optimization method that repeatedly moves along the negative gradient, the direction of steepest descent. It is simple to implement and suitable for smooth functions.

This algorithm requires that the `Problem` struct implements the `objective` and `gradient` methods.

It also requires a line search method, specified in the `OQNLPParams` struct, which is used to find the step size for the Steepest Descent algorithm.

Parameters for Steepest Descent

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `max_iter` | `usize` | Maximum number of iterations | 300 |
| `line_search_params` | `LineSearchParams` | Line search parameters | `LineSearchParams::default()` |
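
Since the line search is the only tunable besides the iteration cap, a brief hedged sketch; the builder name and setters are assumptions based on the tables above.

```rust
use globalsearch::local_solver::builders::{LineSearchParams, SteepestDescentBuilder};

fn main() {
    // Setter names are assumptions based on the parameter table.
    let _steepest_descent = SteepestDescentBuilder::default()
        .max_iter(1000) // steepest descent often needs many iterations
        .line_search_params(LineSearchParams::default())
        .build();
}
```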

Trust Region

The Trust Region algorithm approximates the objective function with a local model inside a region whose radius is expanded or shrunk based on how well the model predicts actual progress. It is particularly effective for non-convex functions and is widely used in practice.

This algorithm requires that the `Problem` struct implements the `objective`, `gradient`, and `hessian` methods.
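
Building on the gradient sketch from the L-BFGS section, here is the same hypothetical problem extended with a `hessian` method; the same hedges apply (trait and method names follow the README and are assumptions for your version).

```rust
use globalsearch::problem::Problem;
use globalsearch::types::EvaluationError;
use ndarray::{array, Array1, Array2};

#[derive(Debug, Clone)]
struct Quadratic; // hypothetical problem: f(x) = x0^2 + 2 x1^2

impl Problem for Quadratic {
    fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
        Ok(x[0].powi(2) + 2.0 * x[1].powi(2))
    }

    fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
        Ok(array![2.0 * x[0], 4.0 * x[1]])
    }

    // Constant Hessian for this quadratic: diag(2, 4). Required by
    // second-order solvers such as Trust Region and Newton-CG.
    fn hessian(&self, _x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError> {
        Ok(array![[2.0, 0.0], [0.0, 4.0]])
    }

    fn variable_bounds(&self) -> Array2<f64> {
        array![[-5.0, 5.0], [-5.0, 5.0]]
    }
}
```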

Parameters for Trust Region

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `trust_region_radius_method` | `TrustRegionRadiusMethod` | Trust region radius method | `Cauchy` |
| `radius` | `f64` | Initial trust region radius | 1.0 |
| `max_radius` | `f64` | Maximum trust region radius | 100.0 |
| `eta` | `f64` | Trust region update parameter | 0.125 |
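
A hedged configuration sketch; builder and setter names mirror the table above and are assumptions.

```rust
use globalsearch::local_solver::builders::TrustRegionBuilder;

fn main() {
    // Setter names are assumptions based on the parameter table.
    let _trust_region = TrustRegionBuilder::default()
        .radius(0.5)      // start with a smaller trust region
        .max_radius(50.0) // cap growth below the 100.0 default
        .eta(0.1)         // slightly looser acceptance threshold
        .build();
}
```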

Newton-CG

The Newton-CG (Newton-Conjugate Gradient) algorithm is a second-order optimization method that combines Newton's method with the conjugate gradient method: the Newton step is computed approximately by CG rather than by factorizing the Hessian, which makes it suitable for large-scale problems. It is particularly effective for nonlinear optimization problems and is widely used in practice.

This algorithm requires that the `Problem` struct implements the `objective`, `gradient`, and `hessian` methods.

It also requires a line search method, specified in the `OQNLPParams` struct, which is used to find the step size for the Newton-CG algorithm.

Parameters for Newton-CG

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `max_iter` | `usize` | Maximum number of iterations | 300 |
| `curvature_threshold` | `f64` | Curvature threshold | 0.0 |
| `tolerance` | `f64` | Tolerance for the objective function | `EPSILON` |
| `line_search_params` | `LineSearchParams` | Line search parameters | `LineSearchParams::default()` |
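
A hedged configuration sketch; builder and setter names mirror the table above and are assumptions.

```rust
use globalsearch::local_solver::builders::NewtonCGBuilder;

fn main() {
    // Setter names are assumptions based on the parameter table.
    let _newton_cg = NewtonCGBuilder::default()
        .max_iter(500)
        .curvature_threshold(1e-8) // reject CG directions with near-zero curvature
        .build();
}
```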