A (Partial) List of Optimizers in Matlab, Python, and Julia
Matlab
The best optimizer in Matlab for most of our problems (nonlinear, differentiable) is fmincon. It is easy to use, robust, and has a wide variety of options. If you have a nonlinear differentiable problem that is failing with fmincon, this usually means that you've formulated the problem poorly or incorrectly. If your problem is nondifferentiable, then ga is one of the best options. Other options for specific needs are discussed below.
We have university licenses to Matlab and the Optimization Toolbox. This toolbox provides the following methods:
- fminsearch, gradient-free, nonlinear unconstrained, Nelder-Mead simplex method.
- fminunc, gradient-based, nonlinear unconstrained, includes a quasi-newton and a trust-region method.
- fmincon, gradient-based, nonlinear constrained, includes an interior-point, sqp, active-set, and trust-region-reflective method.
- linprog, linear programming problems.
- intlinprog, mixed-integer linear programming problems.
- quadprog, quadratic programming problems.
Global Optimization Toolbox
We do not have a university license for the Global Optimization Toolbox. There are a few options for use:
- We have a limited number of floating licenses for the Global Optimization Toolbox on the CAEDM servers. If you go through Citrix and use Matlab 2018a you should be able to access the toolbox.
- Mathworks offers a one-month trial version of Matlab and any toolboxes (long enough for the gradient-free assignments in this class).
- You can make a personal purchase.
- Use one of the third-party tools discussed in the next section.
The Global Optimization Toolbox has the following methods (all of these are gradient-free approaches):
- patternsearch, pattern search solver for derivative-free optimization, constrained or unconstrained
- ga, genetic algorithm solver for mixed-integer or continuous-variable optimization, constrained or unconstrained
- gamultiobj, multiobjective genetic algorithm
- particleswarm, particle swarm solver for derivative-free unconstrained optimization or optimization with bounds
- simulannealbnd, simulated annealing solver for derivative-free unconstrained optimization or optimization with bounds
Third Party Gradient-Free Algorithms
Because we don’t have university-wide access to the Global Optimization Toolbox, I list here a number of third-party options contributed by the user community. These are simplistic implementations. Based on past experience, they should work fine for the homework, but may not be good enough for your project or anything of similar complexity.
- NGPM, a Matlab implementation of NSGA-II, a well-known genetic algorithm.
- GODLIKE, a basic genetic algorithm, differential evolution, particle swarm, and adaptive simulated annealing method
Convex Optimization
- CVX, a nice modeling language for disciplined convex problems; includes some free solvers. Developed at Stanford, works within Matlab. Interfaces with some commercial solvers like Gurobi.
- Gurobi, an excellent commercial optimizer for disciplined convex problems, or mixed-integer "convex" problems. Free for academic use. Can be used with CVX or through other interfaces (Python, R, C, C++, etc.)
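Since Gurobi can also be driven from Python, here is a minimal sketch of a small linear program using its gurobipy interface. The problem itself (variables, coefficients, bounds) is purely illustrative, and the code assumes Gurobi and a valid license are already installed:

```python
import gurobipy as gp
from gurobipy import GRB

# Illustrative LP: maximize x + 2y subject to two linear constraints.
m = gp.Model("example_lp")
x = m.addVar(lb=0.0, name="x")
y = m.addVar(lb=0.0, name="y")
m.setObjective(x + 2 * y, GRB.MAXIMIZE)
m.addConstr(x + y <= 4.0, name="c0")
m.addConstr(x + 3 * y <= 6.0, name="c1")
m.optimize()

if m.Status == GRB.OPTIMAL:
    print(x.X, y.X, m.ObjVal)
```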
Python
The easiest options to start out with are the ones in SciPy, because you already have them. However, in my experience none of the optimizers in SciPy are particularly good. They should be sufficient for the homework problems, and may even work well enough for some of your projects, but if you try to do problems of even modest complexity you will likely find them wanting. Better options are discussed below, but they require more work to get set up.
SciPy
SciPy provides direct access to several optimizers, or you can use the minimize function described below to more easily switch between different options.
- scipy.optimize.minimize provides an interface to several optimization methods. Some of the more useful ones are:
- SLSQP, a simple SQP implementation.
- BFGS and CG, simple BFGS quasi-Newton and conjugate gradient implementations (unconstrained)
- Nelder-Mead, a gradient-free Nelder-Mead simplex method.
- COBYLA, a gradient-free method using successive linear approximations.
- scipy.optimize.differential_evolution, a differential evolution method (effectively a real-encoded genetic algorithm). A short usage sketch for minimize and differential_evolution follows this list.
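As a concrete illustration, here is a minimal sketch of both interfaces. The toy objective and constraint are my own, not from the course:

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

# Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 >= 1.
def obj(x):
    return x[0] ** 2 + x[1] ** 2

# SciPy's "ineq" constraint convention is g(x) >= 0.
cons = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]

res = minimize(obj, x0=np.array([2.0, 2.0]), method="SLSQP", constraints=cons)
print(res.x, res.fun)  # expect roughly [0.5, 0.5] and 0.5

# The same objective with the gradient-free differential evolution solver.
# It takes box bounds rather than the constraint dictionaries above, so this
# run is unconstrained within the box.
res_de = differential_evolution(obj, bounds=[(-2.0, 2.0), (-2.0, 2.0)], seed=0)
print(res_de.x, res_de.fun)
```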
pyOptSparse
pyOptSparse is not an optimizer, but rather a wrapper to a dozen or so optimizers. This allows you to define your problem once and easily switch between solvers without having to worry about the different conventions used by the various optimizers. There are some solver options here that are much better than the ones in SciPy. SNOPT, for example, is one we use a lot in our research problems. Some of these, like SNOPT, are not freely available. I have purchased or obtained department licenses where needed. If you are a BYU student and want access to one of the licensed optimizers (SNOPT, NLPQLP, or IPOPT with the better linear solvers), please come see me. Please note that the optimizers SNOPT, NLPQLP, and our version of IPOPT are for academic use only, and you should cite the authors if used in any publications. Setup of pyOptSparse requires a C and a Fortran compiler. A minimal usage sketch follows the list below.
Some of the available optimizers include:
- SNOPT, SQP method, effective for small- and large-scale problems. I have purchased a department license.
- NLPQLP, quadratic Lagrangian approximation with linear constraints, non-monotone line search to help with numerical instabilities. I have obtained an academic license.
- IPOPT, an interior point method. The source code is freely available, but not the best linear solvers (a free one exists, but it is not very good). I have obtained academic licenses to the better linear solvers.
- PSQP, another basic SQP implementation
- FSQP, a feasible SQP implementation
- SLSQP, same as in SciPy
- CONMIN, a basic feasible direction method
- NSGA2, a well-known genetic algorithm
- ALPSO, particle swarm method
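To give a flavor of the define-once interface, here is a minimal sketch following the pyOptSparse documentation conventions. It assumes a working installation with the bundled SLSQP solver, and the toy problem is my own:

```python
from pyoptsparse import Optimization, OPT

# Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 >= 1.
def objfunc(xdict):
    x = xdict["x"]
    funcs = {
        "obj": x[0] ** 2 + x[1] ** 2,  # objective, keyed by its name
        "con": x[0] + x[1],            # constraint, keyed by its group name
    }
    fail = False
    return funcs, fail

# Define the problem once; switching solvers only changes the OPT(...) call.
optProb = Optimization("toy problem", objfunc)
optProb.addVarGroup("x", 2, lower=-10.0, upper=10.0, value=1.0)
optProb.addConGroup("con", 1, lower=1.0)
optProb.addObj("obj")

opt = OPT("SLSQP")             # swap "SLSQP" for "SNOPT", "IPOPT", etc. if licensed
sol = opt(optProb, sens="FD")  # finite-difference gradients for simplicity
print(sol)
```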
Convex Optimization
See the notes above on Gurobi.
Julia
Julia has excellent support for algorithmic differentiation (see, for example, ForwardDiff.jl and ReverseDiff.jl). It also has great support for convex optimization and unconstrained optimization, but support for gradient-based optimization with nonlinear constraints is lacking (there is good support for third-party commercial solvers, similar to the Python situation above). Here are some options:
- Snopt (mentioned above), with a Julia wrapper that we wrote: Snopt.jl. This is what we use in our research lab.
- JuMP with Ipopt.jl is a decent alternative.
- Optim.jl is a nice package for native Julia solvers. It has good support for gradient-free methods (Nelder-Mead, simulated annealing, particle swarm) and unconstrained gradient-based methods (conjugate gradient, L-BFGS). There is one algorithm for nonlinear constrained problems, but it requires the Hessian.
Convex Optimization
Julia has many great options here, including JuMP with Gurobi, MOSEK, etc., or Convex.jl.