Scipy.optimize minimize is taking too long


Solution 1

Your tolerance should be set to whatever accuracy you actually need. Loosening it just tells the optimiser to stop sooner; it doesn't speed up the work done per iteration. That said, running to a tighter tolerance than you need is a waste of your time.

Possible ways to reduce the time required are as follows:

  • Use a different optimiser
  • Use a different gradient finding method
  • Speed up your objective function
  • Reduce the number of design variables
  • Choose a better initial guess
  • Use parallel processing

Gradient methods

As you are using finite differences, you need (1 + the number of design variables) evaluations of your objective function to get the full gradient, so roughly 1501 evaluations per iteration in your case.

As ev-br said, if you can derive the Jacobian analytically then this isn't needed. Given that you have 1500 design variables, I'm guessing that isn't easy, but if your objective function allows it, automatic differentiation might be an option. I've had some experience with AlgoPy, which you could look at.
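
However you obtain the gradient, whether by hand or via automatic differentiation, it is passed through the jac argument. A minimal sketch using the names from your question, where objGrad is a hypothetical function you would have to write yourself:

    import numpy as np
    from scipy.optimize import minimize

    def objGrad(x):
        # hypothetical analytical gradient of objFun: one entry per design variable
        return np.zeros_like(x)   # placeholder - replace with your actual derivatives

    results = minimize(objFun, initialVals, method='SLSQP', jac=objGrad,
                       bounds=bnds, constraints=cons, tol=toler)

With jac supplied, SLSQP calls objGrad once per gradient instead of making roughly 1501 finite-difference calls to objFun.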

Objective function speed

Due to the high number of objective function evaluations, this may be the easiest approach. Once again, see ev-br's answer for things like compiling with Cython and generally reducing complexity. You could time parts of the code using timeit to see whether changes are beneficial.
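
For instance, a quick way to time a single objective evaluation, using the names from your question:

    import timeit

    # average the cost of one objective call over 100 evaluations at the start point
    t = timeit.timeit(lambda: objFun(initialVals), number=100)
    print('mean objFun call: %.4f s' % (t / 100))

Even a small saving per call is multiplied by the ~1500 evaluations needed for every finite-difference gradient.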

Design variables

Reducing the number of design variables linearly lowers the objective function calls needed for the finite difference. Do all your variables change significantly? Could some be fixed at a set value? Can you derive some as a function of others?
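
As an illustrative sketch only (the split into fixed and free variables below is an assumption about your problem), fixing a subset of variables can be done by wrapping the objective so the optimiser only sees the free ones:

    import numpy as np

    fixed_values = np.array(initialVals, dtype=float)   # full-length vector of defaults
    free_idx = np.arange(500, 1500)                      # indices the optimiser may change (assumption)

    def objFunReduced(x_free):
        # rebuild the full design vector before calling the original objective
        x = fixed_values.copy()
        x[free_idx] = x_free
        return objFun(x)

    x0_free = fixed_values[free_idx]   # smaller starting vector for minimize

Bounds and constraints would need the same re-indexing.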

Initial Guess

Depending on your problem, you may be able to select a starting point that is 'closer' to the final solution. You may also be able to 'restart' your optimisation from a previous result.
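
Restarting is just a matter of feeding a previous solution back in as the new starting point; for example, a coarse first pass followed by a refinement (the 1e-2 tolerance here is only an example value):

    # coarse pass with a loose tolerance, then refine from the point it found
    coarse = minimize(objFun, initialVals, method='SLSQP',
                      bounds=bnds, constraints=cons, tol=1e-2)
    refined = minimize(objFun, coarse.x, method='SLSQP',
                       bounds=bnds, constraints=cons, tol=toler)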

Parallelisation

The finite difference evaluations don't have to be done in order, so you could write your own finite difference function and run the calls in parallel using the multiprocessing module. The effectiveness of this depends on your system and the number of cores available.
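
A minimal sketch of such a parallel forward-difference gradient (the step size and pool size here are assumptions, objFun must be defined at module level so the worker processes can import it, and on Windows the call needs to sit under an if __name__ == '__main__': guard):

    import numpy as np
    from multiprocessing import Pool

    def _perturbed(args):
        # objective with design variable i nudged by step h
        x, i, h = args
        xp = x.copy()
        xp[i] += h
        return objFun(xp)

    def parallel_grad(x, h=1e-6, processes=4):
        # forward-difference gradient with the perturbed evaluations spread over a pool
        x = np.asarray(x, dtype=float)
        f0 = objFun(x)
        with Pool(processes) as pool:
            f_plus = pool.map(_perturbed, [(x, i, h) for i in range(x.size)])
        return (np.array(f_plus) - f0) / h

    # pass it to minimize via jac=parallel_grad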

Solution 2

Here's what I'd do:

  • profile the minimization (a cProfile sketch follows this list). From your output it seems that evaluating the function is the bottleneck; check whether that's the case. If it is, then:
  • see if you can compute the jacobian with paper and pencil or a CAS system. Use it instead of finite differences.
  • see if you can speed up the function itself (mathematical simplifications, numpy vectorization, cython)
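
For the profiling step, a quick way to see where the time goes, reusing the call from your question (cProfile.run executes the string in the __main__ namespace, so these names need to be defined there):

    import cProfile

    # profile the whole minimisation and sort the report by cumulative time
    cProfile.run("minimize(objFun, initialVals, method='SLSQP', "
                 "bounds=bnds, constraints=cons, tol=toler)",
                 sort='cumtime')

If most of the cumulative time sits inside your objective function, the jacobian and speed-up points above are where the effort should go.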

Comments

  • RussellB. almost 2 years

    I am running a constrained optimization problem with about 1500 variables and it is taking over 30 minutes to run....

    If I loosen the tolerance to 1, the minimization completes in about five minutes, but that doesn't seem like a good way to speed things up.

    from scipy.optimize import minimize
    
    results = minimize(objFun, initialVals, method='SLSQP', bounds = bnds, constraints=cons, tol = toler)
    
    print(results)
    
    fun: -868.72033130318198
    jac: array([ 0.,  0.,  0., ...,  0.,  0.,  0.])
    message: 'Optimization terminated successfully.'
    nfev: 1459
    nit: 1
    njev: 1
    status: 0
    success: True
    x: array([ 0.,  0.,  0., ...,  1.,  1.,  1.])
    

    Any suggestions would be appreciated.

  • RussellB. almost 8 years
    How would I go about incorporating AlgoPy-generated derivatives into the minimize function? As my initial guesses are pretty good, calculating the Jacobian separately seems like the best option. I could probably cut down on the number of variables, so I'll look into that too.
  • Wokpak almost 8 years
    That's probably best as its own question, and I haven't used it for a while, but generally automatic differentiation doesn't produce a symbolic / coded derivative function; rather, for given inputs, it gives you the derivatives at those inputs by tracing through your original function. So I think your gradient function would be a wrapper around the auto-diff code that calls the objective function.
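    A rough sketch of that wrapper, assuming AlgoPy's UTPM interface (treat the exact calls as an assumption and check the AlgoPy docs), might look like:

    import algopy

    def objGrad(x):
        # evaluate objFun on AlgoPy's UTPM type so derivatives come out alongside the value
        # (requires objFun to use only operations AlgoPy supports)
        x_alg = algopy.UTPM.init_jacobian(x)
        return algopy.UTPM.extract_jacobian(objFun(x_alg))

    # then pass jac=objGrad to minimize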
  • RussellB. almost 8 years
    Ok. I've calculated the derivatives by hand and put them in a function that returns the Jacobian at each point... we'll see how that works.