SciPy, optimization

SciPy (pronounced "sigh pie") is a mathematical package built on the NumPy extension of Python. With SciPy, an interactive Python session turns into a complete environment for data science and the prototyping of complex systems, comparable to MATLAB, IDL, Octave, R-Lab, and SciLab. Today I want to talk briefly about how to use some well-known optimization algorithms from the scipy.optimize package. More detailed and up-to-date help on its functions can always be obtained with the help() command or with Shift+Tab.
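For example, a minimal way to pull up that help from a script or notebook:

from scipy.optimize import minimize

# Prints the full minimize docstring to the console;
# in Jupyter, Shift+Tab on the function name shows the same text.
help(minimize)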

Introduction

To save myself and the readers from digging through primary sources, links to descriptions of the methods will mostly point to Wikipedia. As a rule, that information is enough to understand the methods in general terms and the conditions for applying them. To understand the essence of the mathematical methods, follow the links to more authoritative publications, which can be found at the end of each article or with your favorite search engine.

So, the scipy.optimize module includes implementations of the following procedures:

  1. Constrained and unconstrained minimization of scalar functions of several variables (minimize) using various algorithms (Nelder-Mead simplex, BFGS, Newton conjugate gradient, COBYLA, and SLSQP)
  2. Global optimization (for example: basinhopping, differential_evolution)
  3. Minimization of residuals in the least-squares sense (least_squares) and curve fitting with nonlinear least squares (curve_fit)
  4. Minimization of scalar functions of one variable (minimize_scalar) and root finding for scalar functions (root_scalar)
  5. Multidimensional solvers for systems of equations (root) using various algorithms (hybrid Powell, Levenberg-Marquardt, or large-scale methods such as Newton-Krylov).

In this article we will consider only the first item from this entire list.
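The remaining items will not appear below, so purely for orientation here is a minimal sketch of what calling a few of those routines looks like; the toy target functions are my own illustrative choices, not from the original article:

import numpy as np
from scipy.optimize import differential_evolution, curve_fit, minimize_scalar, root

# 2. Global optimization: differential evolution on a simple bowl-shaped function
res_de = differential_evolution(lambda x: x[0]**2 + x[1]**2, bounds=[(-5, 5), (-5, 5)])

# 3. Nonlinear least squares: fit a and b of a*exp(b*t) to noisy data with curve_fit
t = np.linspace(0, 1, 50)
y = 2.0 * np.exp(0.5 * t) + 0.01 * np.random.randn(t.size)
popt, pcov = curve_fit(lambda t, a, b: a * np.exp(b * t), t, y)

# 4. Minimization of a scalar function of one variable
res_scalar = minimize_scalar(lambda x: (x - 2.0)**2)

# 5. Roots of a system of nonlinear equations (hybrid Powell method)
res_root = root(lambda v: [v[0] + 0.5*(v[0] - v[1])**3 - 1.0,
                           0.5*(v[1] - v[0])**3 + v[1]],
                x0=[0.0, 0.0], method='hybr')

print(res_de.x, popt, res_scalar.x, res_root.x)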

Unconstrained minimization of a scalar function of several variables

The minimize function from the scipy.optimize package provides a general interface for solving constrained and unconstrained minimization problems for scalar functions of several variables. To demonstrate how it works, we will need a suitable function of several variables, which we will minimize in several different ways.

For these purposes, the Rosenbrock function of N variables is a perfect fit; it has the form:

f(x) = \sum_{i=1}^{N-1} \left[ 100 \, (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right]

Even though the Rosenbrock function and its Jacobian and Hessian matrices (the first and second derivatives, respectively) are already defined in the scipy.optimize package, we will define them ourselves.

import numpy as np

def rosen(x):
    """The Rosenbrock function"""
    return np.sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0, axis=0)
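As noted above, SciPy already ships these as ready-made helpers (scipy.optimize.rosen, rosen_der, rosen_hess), so the hand-written version can be cross-checked against them:

from scipy.optimize import rosen as scipy_rosen

# Our implementation should agree with SciPy's built-in Rosenbrock function
x_test = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
assert np.allclose(rosen(x_test), scipy_rosen(x_test))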

For clarity, let us plot in 3D the values of the Rosenbrock function of two variables.

Plotting code

from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib.ticker import LinearLocator, FormatStrFormatter

# Set up the 3D figure
fig = plt.figure(figsize=[15, 10])
ax = fig.add_subplot(projection='3d')

# Set the viewing angle
ax.view_init(45, 30)

# Create the data for the plot
X = np.arange(-2, 2, 0.1)
Y = np.arange(-1, 3, 0.1)
X, Y = np.meshgrid(X, Y)
Z = rosen(np.array([X, Y]))

# Draw the surface
surf = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm)
plt.show()

[Figure: 3D surface plot of the Rosenbrock function of two variables]

Knowing in advance that the minimum is 0 and is reached at x_i = 1, let us look at examples of how to determine the minimum value of the Rosenbrock function using various scipy.optimize procedures.

The Nelder-Mead simplex method

Let there be an initial point x0 in 5-dimensional space. Let us find the minimum point of the Rosenbrock function closest to it using the Nelder-Mead simplex algorithm (the algorithm is specified as the value of the method parameter):

from scipy.optimize import minimize
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='nelder-mead',
    options={'xatol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 339
         Function evaluations: 571
[1. 1. 1. 1. 1.]

The simplex method is the simplest way to minimize an explicitly defined and reasonably smooth function. It does not require computing the derivatives of the function; it is enough to specify only its values. The Nelder-Mead method is a good choice for simple minimization problems. However, since it does not use gradient estimates, it may take longer to find the minimum.

Powell's method

Another optimization algorithm in which only function values are computed is Powell's method. To use it, set method='powell' when calling the minimize function.

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='powell',
    options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 1622
[1. 1. 1. 1. 1.]

Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm

To reach convergence faster, the BFGS procedure uses the gradient of the objective function. The gradient can be specified as a function or estimated using first-order finite differences. Either way, the BFGS method usually requires fewer function calls than the simplex method.
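For instance, if jac is omitted, minimize estimates the gradient for BFGS by finite differences; a minimal sketch (expect a few extra function evaluations compared with the analytic-gradient run shown further below):

from scipy.optimize import minimize

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
# No jac argument: the gradient is approximated numerically,
# at the cost of additional function evaluations per iteration.
res_fd = minimize(rosen, x0, method='BFGS', options={'disp': True})
print(res_fd.x)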

Let us find the gradient of the Rosenbrock function in analytical form:

\frac{\partial f}{\partial x_j} = \sum_{i=1}^{N-1} 200 (x_{i+1} - x_i^2)(\delta_{i+1,j} - 2 x_i \delta_{i,j}) - 2 (1 - x_i)\delta_{i,j}

= 200 (x_j - x_{j-1}^2) - 400 x_j (x_{j+1} - x_j^2) - 2 (1 - x_j)

This expression is valid for the derivatives with respect to all variables except the first and the last, which are defined as:

\frac{\partial f}{\partial x_0} = -400 x_0 (x_1 - x_0^2) - 2 (1 - x_0)

\frac{\partial f}{\partial x_{N-1}} = 200 (x_{N-1} - x_{N-2}^2)

Let us look at the Python function that computes this gradient:

def rosen_der(x):
    """Gradient of the Rosenbrock function"""
    xm = x[1:-1]
    xm_m1 = x[:-2]
    xm_p1 = x[2:]
    der = np.zeros_like(x)
    der[1:-1] = 200*(xm - xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1 - xm)
    der[0] = -400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0])
    der[-1] = 200*(x[-1] - x[-2]**2)
    return der

The function that computes the gradient is specified as the value of the jac parameter of the minimize function, as shown below.

res = minimize(rosen, x0, method='BFGS', jac=rosen_der, options={'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 25
         Function evaluations: 30
         Gradient evaluations: 30
[1.00000004 1.0000001  1.00000021 1.00000044 1.00000092]

Newton conjugate gradient algorithm (Newton-CG)

Newton's conjugate gradient algorithm is a modified Newton's method.
Newton's method is based on approximating a function in a local region by a polynomial of the second degree:

f(x) \approx f(x_0) + \nabla f(x_0) \cdot (x - x_0) + \tfrac{1}{2} (x - x_0)^T H(x_0) (x - x_0)

where H(x_0) is the matrix of second derivatives (the Hessian matrix).
If the Hessian is positive definite, the local minimum of this function can be found by setting the gradient of the quadratic form to zero. The result is the expression:

x_{\mathrm{opt}} = x_0 - H^{-1} \nabla f(x_0)

The inverse of the Hessian is computed using the conjugate gradient method. An example of using this method to minimize the Rosenbrock function is given below. To use the Newton-CG method, you must specify a function that computes the Hessian.
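As a self-contained illustration of that update rule (before wiring the analytic Hessian into Newton-CG below), here is a hedged sketch of a few plain Newton iterations on the two-variable Rosenbrock function, with its gradient and Hessian written out by hand; note that pure Newton steps without a line search are not guaranteed to converge from every starting point:

import numpy as np

def grad2(x):
    # Gradient of the two-variable Rosenbrock function
    return np.array([-400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0]),
                     200*(x[1] - x[0]**2)])

def hess2(x):
    # Hessian of the two-variable Rosenbrock function
    return np.array([[1200*x[0]**2 - 400*x[1] + 2, -400*x[0]],
                     [-400*x[0], 200.0]])

x = np.array([1.3, 0.7])
for _ in range(10):
    # x_new = x - H(x)^{-1} grad f(x); solve the linear system instead of inverting H
    x = x - np.linalg.solve(hess2(x), grad2(x))
print(x)  # approaches [1., 1.], the true minimum, within a few iterations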
The Hessian of the Rosenbrock function in analytical form is:

H_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j} = 200 (\delta_{i,j} - 2 x_{i-1} \delta_{i-1,j}) - 400 x_i (\delta_{i+1,j} - 2 x_i \delta_{i,j}) - 400 \delta_{i,j} (x_{i+1} - x_i^2) + 2 \delta_{i,j}

= (202 + 1200 x_i^2 - 400 x_{i+1}) \delta_{i,j} - 400 x_i \delta_{i+1,j} - 400 x_{i-1} \delta_{i-1,j}

where i, j \in [1, N-2], and the indices i, j \in [0, N-1] define the N \times N matrix.

The remaining non-zero elements of the matrix are:

\frac{\partial^2 f}{\partial x_0^2} = 1200 x_0^2 - 400 x_1 + 2

\frac{\partial^2 f}{\partial x_0 \partial x_1} = \frac{\partial^2 f}{\partial x_1 \partial x_0} = -400 x_0

\frac{\partial^2 f}{\partial x_{N-1} \partial x_{N-2}} = \frac{\partial^2 f}{\partial x_{N-2} \partial x_{N-1}} = -400 x_{N-2}

\frac{\partial^2 f}{\partial x_{N-1}^2} = 200

For example, in five-dimensional space (N = 5), the Hessian matrix of the Rosenbrock function has a banded form:

H = \begin{bmatrix}
1200 x_0^2 - 400 x_1 + 2 & -400 x_0 & 0 & 0 & 0 \\
-400 x_0 & 202 + 1200 x_1^2 - 400 x_2 & -400 x_1 & 0 & 0 \\
0 & -400 x_1 & 202 + 1200 x_2^2 - 400 x_3 & -400 x_2 & 0 \\
0 & 0 & -400 x_2 & 202 + 1200 x_3^2 - 400 x_4 & -400 x_3 \\
0 & 0 & 0 & -400 x_3 & 200
\end{bmatrix}

The code that computes this Hessian, together with the code for minimizing the Rosenbrock function by the conjugate gradient (Newton) method:

def rosen_hess(x):
    x = np.asarray(x)
    H = np.diag(-400*x[:-1],1) - np.diag(400*x[:-1],-1)
    diagonal = np.zeros_like(x)
    diagonal[0] = 1200*x[0]**2-400*x[1]+2
    diagonal[-1] = 200
    diagonal[1:-1] = 202 + 1200*x[1:-1]**2 - 400*x[2:]
    H = H + np.diag(diagonal)
    return H

res = minimize(rosen, x0, method='Newton-CG', 
               jac=rosen_der, hess=rosen_hess,
               options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 24
[1.         1.         1.         0.99999999 0.99999999]

Example with a function for the product of the Hessian and an arbitrary vector

In real-world problems, computing and storing the entire Hessian matrix can require significant time and memory. In that case there is actually no need to specify the Hessian matrix itself, because the minimization procedure only needs a vector equal to the product of the Hessian with another arbitrary vector. From a computational point of view it is therefore much preferable to define right away a function that returns the result of the product of the Hessian with an arbitrary vector.

Consider the hess function, which takes the minimization vector as its first argument and an arbitrary vector as its second argument (together with the other arguments of the function being minimized). In our case, computing the product of the Hessian of the Rosenbrock function with an arbitrary vector is not very difficult. If p is an arbitrary vector, the product H(x) p has the form:

H(x) p = \begin{bmatrix}
(1200 x_0^2 - 400 x_1 + 2) p_0 - 400 x_0 p_1 \\
\vdots \\
-400 x_{i-1} p_{i-1} + (202 + 1200 x_i^2 - 400 x_{i+1}) p_i - 400 x_i p_{i+1} \\
\vdots \\
-400 x_{N-2} p_{N-2} + 200 p_{N-1}
\end{bmatrix}

The function that computes the product of the Hessian with an arbitrary vector is passed as the value of the hessp argument to the minimize function:

def rosen_hess_p(x, p):
    x = np.asarray(x)
    Hp = np.zeros_like(x)
    Hp[0] = (1200*x[0]**2 - 400*x[1] + 2)*p[0] - 400*x[0]*p[1]
    Hp[1:-1] = (-400*x[:-2]*p[:-2] + (202 + 1200*x[1:-1]**2 - 400*x[2:])*p[1:-1]
                - 400*x[1:-1]*p[2:])
    Hp[-1] = -400*x[-2]*p[-2] + 200*p[-1]
    return Hp

res = minimize(rosen, x0, method='Newton-CG',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'xtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 66

Conjugate gradient trust region algorithm (Newton)

Poor conditioning of the Hessian matrix and wrong search directions can make Newton's conjugate gradient algorithm ineffective. In such cases, preference is given to the trust-region method of Newton conjugate gradients.

Example with the definition of the Hessian matrix:

res = minimize(rosen, x0, method='trust-ncg',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 19
[1. 1. 1. 1. 1.]

Example with the Hessian-vector product function:

res = minimize(rosen, x0, method='trust-ncg', 
                jac=rosen_der, hessp=rosen_hess_p, 
                options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 0
[1. 1. 1. 1. 1.]

Krylov-type methods

Like the trust-ncg method, Krylov-type methods are well suited for solving large-scale problems because they use only matrix-vector products. Their essence is to solve a subproblem in a trust region bounded by a truncated Krylov subspace. For indefinite problems this method is preferable, since it uses a smaller number of nonlinear iterations thanks to a smaller number of matrix-vector products per subproblem, compared with the trust-ncg method. In addition, the quadratic subproblem is solved more accurately than with the trust-ncg method.
Example with the definition of the Hessian matrix:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 18

print(res.x)

    [1. 1. 1. 1. 1.]

Example with the Hessian-vector product function:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 0

print(res.x)

    [1. 1. 1. 1. 1.]

Nearly exact trust-region algorithm

All of these methods (Newton-CG, trust-ncg, and trust-krylov) are well suited for solving large-scale problems (with thousands of variables). This is because the underlying conjugate gradient algorithm implies an approximate computation of the inverse Hessian matrix. The solution is found iteratively, without explicitly expanding the Hessian. Since only a function for the product of the Hessian with an arbitrary vector needs to be defined, this algorithm is especially good for working with sparse (banded) matrices. This yields low memory costs and significant time savings.

For medium-sized problems, the cost of storing and factorizing the Hessian is not critical. This means a solution can be obtained in fewer iterations by solving the trust-region subproblems almost exactly. To do this, certain nonlinear equations are solved iteratively for each quadratic subproblem. Such a solution usually requires 3 or 4 Cholesky decompositions of the Hessian matrix. As a result, the method converges in fewer iterations and requires fewer objective function evaluations than the other trust-region methods implemented. This algorithm involves defining the complete Hessian matrix and does not support the Hessian-vector product function.

Example with minimization of the Rosenbrock function:

res = minimize(rosen, x0, method='trust-exact',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
res.x

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 13
         Function evaluations: 14
         Gradient evaluations: 13
         Hessian evaluations: 14

array([1., 1., 1., 1., 1.])

Perhaps we will stop here. In the next article I will try to tell the most interesting things about constrained minimization, applying minimization to approximation problems, minimizing a function of one variable, arbitrary minimizers, and finding the roots of a system of equations using the scipy.optimize package.

Source: https://docs.scipy.org/doc/scipy/reference/

Source: www.habr.com
