nonlinear-optimization

Various iterative algorithms for optimization of nonlinear functions.

https://github.com/meteficha/nonlinear-optimization

Latest on Hackage: 0.3.12.1

This package is not currently in any snapshots. If you're interested in using it, we recommend adding it to Stackage Nightly. Doing so will make builds more reliable, and allow stackage.org to host generated Haddocks.

LicenseRef-GPL licensed by Felipe A. Lessa (Haskell code), William W. Hager and Hongchao Zhang (CG_DESCENT code).
Maintained by Felipe A. Lessa

This library implements numerical algorithms to optimize nonlinear functions. Here, optimization means finding a minimum of the function. Currently all algorithms guarantee only that a local minimum will be found, not a global one.

Almost any continuously differentiable function f : R^n -> R may be optimized by this library. Any further restrictions are listed in the modules that need them.

We use the vector package to represent vectors and matrices, although it would be easy to use something like hmatrix instead.

Currently only the CG_DESCENT method of Hager and Zhang is implemented.
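
As a concrete illustration of the interface, here is a minimal sketch of minimizing a simple quadratic with the CG_DESCENT implementation. The names used (`optimize`, `defaultParameters`, `VFunction`, `VGradient` from `Numeric.Optimization.Algorithms.HagerZhang05`) follow the Hackage documentation; check the Haddocks of the version you install before relying on the exact signatures.

```haskell
import qualified Data.Vector.Unboxed as VU
import Numeric.Optimization.Algorithms.HagerZhang05
  (optimize, defaultParameters, Function (VFunction), Gradient (VGradient))

-- Objective: a simple convex quadratic with its minimum at (1, 2).
f :: VU.Vector Double -> Double
f v = (x - 1) ** 2 + (y - 2) ** 2
  where x = v VU.! 0
        y = v VU.! 1

-- Its gradient, written by hand.
g :: VU.Vector Double -> VU.Vector Double
g v = VU.fromList [2 * (x - 1), 2 * (y - 2)]
  where x = v VU.! 0
        y = v VU.! 1

main :: IO ()
main = do
  let x0      = VU.fromList [0, 0]  -- initial guess
      gradTol = 1e-6                -- stop once the gradient norm is below this
  (xmin, _result, _stats) <-
    optimize defaultParameters gradTol x0 (VFunction f) (VGradient g) Nothing
  print xmin  -- should be close to [1.0, 2.0]
```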

If you want to use automatic differentiation to avoid hand-writing gradient functions, you can use the nonlinear-optimization-ad or nonlinear-optimization-backprop packages.
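
Those wrapper packages provide the most convenient route; as a rough sketch of the idea, the gradient can also be derived with the ad package's `grad` function and passed to `optimize` directly (this assumes the ad package and the names from the example above; it is not the wrappers' own API).

```haskell
import qualified Data.Vector.Unboxed as VU
import Numeric.AD (grad)
import Numeric.Optimization.Algorithms.HagerZhang05
  (optimize, defaultParameters, Function (VFunction), Gradient (VGradient))

-- The Rosenbrock function, written polymorphically over the number type
-- so that the ad package can differentiate it.
rosenbrock :: Num a => [a] -> a
rosenbrock [x, y] = (1 - x) ^ (2 :: Int) + 100 * (y - x * x) ^ (2 :: Int)
rosenbrock _      = error "rosenbrock: expected a 2-dimensional point"

main :: IO ()
main = do
  let fv = rosenbrock . VU.toList                     -- objective on vectors
      gv = VU.fromList . grad rosenbrock . VU.toList  -- gradient via AD
      x0 = VU.fromList [-1.2, 1.0]                    -- standard starting point
  (xmin, _result, _stats) <-
    optimize defaultParameters 1e-6 x0 (VFunction fv) (VGradient gv) Nothing
  print xmin  -- should be close to [1.0, 1.0]
```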