
%% Cell type:markdown id: tags:

# Optimizers in SciPy

This notebook is a very brief introduction to SciPy optimizers, documenting the example appendix/scipy_optim.py.

%% Cell type:markdown id: tags:

There are several optimizers in SciPy, in the module `scipy.optimize`. You can simply install it with `pip install scipy`.

You can find the user manual of this module at https://docs.scipy.org/doc/scipy/tutorial/optimize.html#tutorial-sqlsp.

In this series of notebooks, we mostly use BFGS, a quasi-Newton unconstrained algorithm, and SLSQP, a sequential QP solver accepting both equality and inequality constraints.

We will then need the two `fmin` functions from the `scipy.optimize` module, as well as `numpy` to represent algebraic vectors.

%% Cell type:code id: tags:

``` python

# %load appendix/generated/scipy_optim_import

import numpy as np

from scipy.optimize import fmin_bfgs, fmin_slsqp

```

%% Cell type:markdown id: tags:

They generally follow a similar API, taking as main arguments the cost function `f` to optimize and the initial guess `x0`, and optionally a callback function `callback` and some constraints.

The cost objective should be defined as a function mapping a point $x$ of the parameter space to a real value $f(x)$. Here is a simple polynomial example for $x \in \mathbb{R}^2$:

%% Cell type:code id: tags:

``` python

# %load appendix/generated/scipy_optim_cost

def cost(x):
    '''Cost f(x,y) = x^2 + 2y^2 - 2xy - 2x'''
    x0 = x[0]
    x1 = x[1]
    return -1 * (2 * x0 * x1 + 2 * x0 - x0**2 - 2 * x1**2)

```

%% Cell type:markdown id: tags:

The callback takes the same signature but returns nothing: it works only by side effect, for example printing something, displaying some information in a viewer or on a plot, or storing data in a logger. Here is, for example, a callback written as the functor of an object (i.e. a class implementing `__call__`), which can be used to adjust its behavior or to store some data.
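The body of the corresponding code cell is mostly lost here; the following is a minimal sketch of such a functor callback, consistent with the `===CBK===` traces in the output below. The class name `CallbackLogger`, the iteration-counter attribute, and the initial guess `x0` are assumptions, not necessarily the notebook's originals.

``` python
import numpy as np
from scipy.optimize import fmin_bfgs


def cost(x):
    '''Cost f(x,y) = x^2 + 2y^2 - 2xy - 2x'''
    return x[0]**2 + 2 * x[1]**2 - 2 * x[0] * x[1] - 2 * x[0]


class CallbackLogger:
    '''Functor invoked by the solver after each iteration with the current iterate.'''

    def __init__(self):
        self.niter = 1  # internal state: iteration counter

    def __call__(self, x):
        print('===CBK=== {0:4d}   {1: 3.6f}   {2: 3.6f}'.format(self.niter, x[0], x[1]))
        self.niter += 1


x0 = np.array([0.0, 0.0])  # assumed initial guess
xopt_bfgs = fmin_bfgs(cost, x0, callback=CallbackLogger())
```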

``` python
print('\n *** Xopt in BFGS = %s \n\n\n\n' % str(xopt_bfgs))
```

%% Output

===CBK=== 1 1.010000 -0.000000

===CBK=== 2 2.014799 1.009848

===CBK=== 3 2.000000 1.000000

Optimization terminated successfully.

Current function value: -2.000000

Iterations: 3

Function evaluations: 12

Gradient evaluations: 4

*** Xopt in BFGS = [1.99999977 0.99999994]

%% Cell type:markdown id: tags:

In that case, the gradients of the cost are computed by BFGS using finite differences, which is neither very accurate nor cheap: each gradient estimate costs one extra evaluation of the cost per dimension. If you can provide the derivatives yourself, it will greatly improve the result. Still, as a first draft, finite differencing is generally not too bad.
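For the polynomial cost above, the gradient has a simple closed form and can be passed to BFGS through the `fprime` argument of `fmin_bfgs`. This exact cell is not part of the original notebook; it is a sketch of what providing the analytical gradient looks like.

``` python
import numpy as np
from scipy.optimize import fmin_bfgs


def cost(x):
    '''Cost f(x,y) = x^2 + 2y^2 - 2xy - 2x'''
    return x[0]**2 + 2 * x[1]**2 - 2 * x[0] * x[1] - 2 * x[0]


def cost_grad(x):
    '''Analytical gradient [df/dx, df/dy] of the cost above.'''
    return np.array([2 * x[0] - 2 * x[1] - 2,
                     4 * x[1] - 2 * x[0]])


# With fprime given, BFGS no longer needs finite differences.
xopt = fmin_bfgs(cost, np.zeros(2), fprime=cost_grad, disp=False)
```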

Now, SLSQP can also handle explicit constraints. Equality and inequality constraints must be given separately, each as a function mapping the parameter $x$ to a vector stacking all the constraint values: these values must be zero for equalities, and nonnegative for inequalities.

We introduce here, as an example, two sets of polynomial constraints.
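The corresponding code cell is missing from this chunk; the sketch below illustrates the pattern with `fmin_slsqp`, whose `f_eqcons` and `f_ieqcons` arguments take exactly such vector-valued functions. The constraint functions `constraint_eq` and `constraint_ineq` here are illustrative stand-ins, not the notebook's originals.

``` python
import numpy as np
from scipy.optimize import fmin_slsqp


def cost(x):
    '''Cost f(x,y) = x^2 + 2y^2 - 2xy - 2x'''
    return x[0]**2 + 2 * x[1]**2 - 2 * x[0] * x[1] - 2 * x[0]


def constraint_eq(x):
    '''Equality constraints, required to be zero: here x + y - 3 = 0.'''
    return np.array([x[0] + x[1] - 3.0])


def constraint_ineq(x):
    '''Inequality constraints, required to be nonnegative: here 2.5 - x >= 0.'''
    return np.array([2.5 - x[0]])


xopt_slsqp = fmin_slsqp(cost, np.array([-1.0, 1.0]),
                        f_eqcons=constraint_eq,
                        f_ieqcons=constraint_ineq,
                        disp=False)
```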