A (Partial) List of Optimizers in Matlab, Python, and Julia

Matlab

The best optimizer in Matlab for most of our problems (nonlinear, differentiable) is fmincon. It is easy to use, robust, and has a wide variety of options. If you have a nonlinear, differentiable problem that is failing with fmincon, that usually means you've formulated the problem poorly or incorrectly. If your problem is nondifferentiable, then ga is one of the best options. Other options for specific needs are discussed below.
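For example, a minimal fmincon call might look like the following sketch (the objective and constraint are made up for illustration):

```matlab
% Minimize a simple quadratic subject to one nonlinear inequality constraint.
obj     = @(x) x(1)^2 + x(2)^2;            % objective function
nonlcon = @(x) deal(1 - x(1)*x(2), []);    % c(x) <= 0, no equality constraints
x0 = [2; 2];                               % starting point
options = optimoptions('fmincon', 'Display', 'iter');
[xopt, fopt] = fmincon(obj, x0, [], [], [], [], [], [], nonlcon, options);
```

The empty arrays are placeholders for linear constraints and bounds, which this example does not use.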

Optimization Toolbox

We have university licenses to Matlab and the Optimization Toolbox. This toolbox provides the following methods (among others):

- fmincon: gradient-based, constrained nonlinear optimization
- fminunc: gradient-based, unconstrained nonlinear optimization
- linprog / intlinprog: linear and mixed-integer linear programming
- quadprog: quadratic programming
- lsqlin / lsqnonlin: linear and nonlinear least squares
- fminimax / fgoalattain: minimax and goal-attainment problems

Global Optimization Toolbox

We do not have a university license for this toolbox, though there are a few options for obtaining access (e.g., purchasing a personal or student license directly from MathWorks).

The Global Optimization Toolbox has the following methods (all of these are gradient-free approaches):

- ga: genetic algorithm
- gamultiobj: multiobjective genetic algorithm
- particleswarm: particle swarm optimization
- patternsearch: pattern (direct) search
- simulannealbnd: simulated annealing
- surrogateopt: surrogate-based optimization
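If you do gain access, the calling conventions mirror fmincon, but no gradients are needed. A minimal ga call might look like this sketch (the objective is made up for illustration):

```matlab
% ga needs only an objective handle and the number of design variables.
obj   = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;  % Rosenbrock, as an example
nvars = 2;
lb = [-2, -2];  ub = [2, 2];                        % bound constraints
[xopt, fopt] = ga(obj, nvars, [], [], [], [], lb, ub);
```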

Third Party Gradient-Free Algorithms

Because we don’t have university-wide access to the Global Optimization Toolbox, I list here a number of third-party options contributed by the user community. These are simplistic implementations. Based on past experience, they should work fine for the homework, but may not be good enough for your project or anything of similar complexity.

Convex Optimization

If your problem is convex (e.g., linear or quadratic programming), Gurobi is one of the best solvers available. It is commercial, but free academic licenses are available. CVX is a popular Matlab modeling framework that makes it easy to pose convex problems and hand them off to solvers like Gurobi.

Python

The easiest options to start with are the ones in SciPy, because you already have them. However, in my experience, none of the optimizers in SciPy are particularly good. They should be sufficient for the homework problems, and may even work well enough for some of your projects, but for problems of even modest complexity you will likely find them wanting. Better options are discussed below, but they require more work to set up.

SciPy

SciPy provides direct access to several optimizers, or you can use the minimize function described below to more easily switch between different options.
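As an example, here is a minimal sketch using minimize (the objective and constraint are made up for illustration; note SciPy's convention that inequality constraints are written as g(x) >= 0):

```python
import numpy as np
from scipy.optimize import minimize

def obj(x):
    return x[0]**2 + x[1]**2

# inequality constraints use scipy's g(x) >= 0 convention
constraints = [{"type": "ineq", "fun": lambda x: x[0] * x[1] - 1.0}]

res = minimize(obj, x0=np.array([2.0, 2.0]), method="SLSQP",
               constraints=constraints)
print(res.x, res.fun)
```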

pyOptSparse

pyOptSparse is not an optimizer, but rather a wrapper around a dozen or so optimizers. It allows you to define your problem once and easily switch between solvers without having to worry about the different conventions used by the various optimizers. Some of the solvers available here are much better than the ones in SciPy. SNOPT, for example, is one we use a lot in our research problems. Some of these, like SNOPT, are not freely available; I have purchased or obtained department licenses where needed. If you are a BYU student and want access to one of these restricted optimizers, please come see me. Please note that SNOPT, NLPQLP, and our version of IPOPT are for academic use only, and you should cite the authors if they are used in any publications. Setup of pyOptSparse requires a C and a Fortran compiler.
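To give a feel for the workflow, here is a rough sketch of the pyOptSparse pattern (the problem itself is made up, and details may vary with your installed version): the problem is defined once, and only the optimizer name changes when switching solvers.

```python
from pyoptsparse import OPT, Optimization

def objfunc(xdict):
    x = xdict["x"]
    funcs = {"obj": x[0]**2 + x[1]**2,   # objective value
             "con": x[0] * x[1]}         # constraint value
    fail = False
    return funcs, fail

opt_prob = Optimization("example", objfunc)
opt_prob.addVarGroup("x", 2, lower=-10.0, upper=10.0, value=2.0)
opt_prob.addConGroup("con", 1, lower=1.0)       # enforce x1*x2 >= 1
opt_prob.addObj("obj")

opt = OPT("SLSQP")              # swap in "SNOPT", "IPOPT", etc.
sol = opt(opt_prob, sens="FD")  # finite-difference gradients
print(sol)
```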

Some of the available optimizers include:

- SNOPT: sequential quadratic programming (SQP), well suited to large, sparse problems
- IPOPT: interior-point method
- NLPQLP: SQP
- SLSQP: SQP (the same algorithm exposed by SciPy)
- CONMIN: method of feasible directions
- NSGA2: multiobjective genetic algorithm
- ALPSO: augmented Lagrangian particle swarm

Convex Optimization

See the notes above on Gurobi.

Julia

Julia has excellent support for algorithmic differentiation (see, for example, ForwardDiff.jl and ReverseDiff.jl). It also has great support for convex optimization and unconstrained optimization, but support for gradient-based optimization with nonlinear constraints is lacking (there is good support for third-party commercial solvers, similar to the Python situation above). Some options are discussed below.
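As a quick illustration of the algorithmic differentiation just mentioned, here is a minimal ForwardDiff.jl sketch (the function f is made up for illustration):

```julia
using ForwardDiff

f(x) = x[1]^2 + sin(x[2])

x = [1.0, 2.0]
g = ForwardDiff.gradient(f, x)   # exact gradient, no finite differencing
H = ForwardDiff.hessian(f, x)    # exact Hessian
```

ReverseDiff.jl follows a similar pattern and is generally preferred when there are many inputs and few outputs.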

Convex Optimization

Julia has tons of great options here, including JuMP (paired with a solver such as Gurobi or Mosek) and Convex.jl.
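For instance, a small problem in Convex.jl might look like the following sketch (the problem itself is made up, and I assume the free SCS solver here; a licensed solver like Gurobi or Mosek can be dropped in the same way):

```julia
using Convex, SCS

x = Variable(2)
problem = minimize(sumsquares(x), [sum(x) == 1, x >= 0])
solve!(problem, SCS.Optimizer)   # swap in another solver's Optimizer if licensed
println(evaluate(x))             # optimal point
```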