SciPy, optimization

SciPy (pronounced "sigh pie") is a mathematical application package built on the NumPy extension of Python. With SciPy, an interactive Python session turns into a complete data-science and complex-system-prototyping environment of the same kind as MATLAB, IDL, Octave, R-Lab and SciLab. Today I want to talk briefly about how to use some well-known optimization algorithms from the scipy.optimize package. More detailed and up-to-date help on using these functions can always be obtained with the help() command or with Shift+Tab.
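
For example, the built-in documentation for the minimize function used throughout this article can be opened like this:

from scipy.optimize import minimize

# Prints the docstring of minimize, including the supported methods and options
help(minimize)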

Introduction

To save you and other readers from hunting down and reading primary sources, the links describing the methods will mostly point to Wikipedia. As a rule, this information is enough to understand the methods in general terms and the conditions for applying them. To grasp the essence of the mathematical methods, follow the links to more authoritative publications, which can be found at the end of each article or via your favorite search engine.

So, the scipy.optimize module includes implementations of the following procedures:

  1. Constrained and unconstrained minimization of scalar functions of several variables (minimize) using various algorithms (Nelder-Mead simplex, BFGS, Newton conjugate gradients, COBYLA and SLSQP)
  2. Global optimization (for example, basinhopping, differential_evolution)
  3. Minimization of residuals by least squares (least_squares) and curve-fitting algorithms based on nonlinear least squares (curve_fit)
  4. Minimization of scalar functions of one variable (minimize_scalar) and root finding (root_scalar) (a short sketch follows this list)
  5. Multidimensional solvers for systems of equations (root) using various algorithms (hybrid Powell, Levenberg-Marquardt, or large-scale methods such as Newton-Krylov).
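
As a brief illustration of item 4 (the items other than the first are not covered further in this article), scalar minimization and scalar root finding can be used like this; the quadratic below and its bracketing interval are arbitrary examples:

from scipy.optimize import minimize_scalar, root_scalar

# Minimum of a one-variable function: (x - 2)^2 has its minimum at x = 2
print(minimize_scalar(lambda x: (x - 2)**2).x)                 # ~2.0

# Root of a one-variable function on a bracketing interval: sqrt(2)
print(root_scalar(lambda x: x**2 - 2, bracket=[0, 2]).root)    # ~1.4142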

In this article we will consider only the first item of this list.

Unconstrained minimization of a scalar function of several variables

The minimize function from the scipy.optimize package provides a general interface for solving constrained and unconstrained minimization problems for scalar functions of several variables. To demonstrate how it works, we will need a suitable function of several variables, which we will minimize in different ways.

For these purposes, the Rosenbrock function of N variables is well suited, which has the form:

f(x) = \sum_{i=0}^{N-2} \left[ 100\,(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right]

Even though the Rosenbrock function and its Jacobian and Hessian matrices (the first and second derivatives, respectively) are already defined in the scipy.optimize package, we will define them ourselves.

import numpy as np

def rosen(x):
    """The Rosenbrock function"""
    return np.sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0, axis=0)
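
A quick sanity check (not part of the original listing): the function vanishes at x = (1, ..., 1), which is exactly the minimum the solvers below will find.

# The Rosenbrock function is zero at its global minimum x = (1, ..., 1)
print(rosen(np.ones(5)))   # 0.0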

For clarity, let's plot the values of the Rosenbrock function of two variables in 3D.

Plotting code

from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib.ticker import LinearLocator, FormatStrFormatter

# Set up the 3D figure
fig = plt.figure(figsize=[15, 10])
ax = fig.add_subplot(projection='3d')  # fig.gca(projection='3d') was removed in newer matplotlib

# Set the viewing angle
ax.view_init(45, 30)

# Create the data for the plot
X = np.arange(-2, 2, 0.1)
Y = np.arange(-1, 3, 0.1)
X, Y = np.meshgrid(X, Y)
Z = rosen(np.array([X,Y]))

# Draw the surface
surf = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm)
plt.show()

[Figure: 3D surface plot of the two-variable Rosenbrock function]

Knowing in advance that the minimum is 0 and is reached at x_i = 1, let's look at examples of how to find the minimum value of the Rosenbrock function using the various scipy.optimize procedures.

The Nelder-Mead simplex method

Let there be an initial point x0 in 5-dimensional space. Let's find the minimum point of the Rosenbrock function closest to it using the Nelder-Mead simplex algorithm (the algorithm is specified as the value of the method parameter):

from scipy.optimize import minimize
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='nelder-mead',
    options={'xatol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 339
         Function evaluations: 571
[1. 1. 1. 1. 1.]

The simplex method is the simplest way to minimize an explicitly defined and reasonably smooth function. It does not require computing the derivatives of the function; it is enough to supply only its values. The Nelder-Mead method is a good choice for simple minimization problems. However, since it does not use gradient estimates, it may take longer to find the minimum.

Powell's method

Another optimization algorithm in which only function values are computed is Powell's method. To use it, set method='powell' in the minimize function.

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='powell',
    options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 1622
[1. 1. 1. 1. 1.]

Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm

To achieve faster convergence to a solution, the BFGS procedure uses the gradient of the objective function. The gradient can be specified as a function or computed using first-order differences. Either way, the BFGS method usually requires fewer function calls than the simplex method.
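
As a minimal sketch of the second option, the jac argument can simply be omitted; SciPy then approximates the gradient by finite differences, at the cost of extra function evaluations:

from scipy.optimize import minimize

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
# No jac is given, so BFGS estimates the gradient with finite differences
res = minimize(rosen, x0, method='BFGS', options={'disp': True})
print(res.x)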

Let's find the gradient of the Rosenbrock function in analytical form:

\frac{\partial f}{\partial x_j} = \sum_i 200\,(x_i - x_{i-1}^2)(\delta_{i,j} - 2 x_{i-1}\delta_{i-1,j}) - 2\,(1 - x_{i-1})\,\delta_{i-1,j}

= 200\,(x_j - x_{j-1}^2) - 400\,x_j\,(x_{j+1} - x_j^2) - 2\,(1 - x_j)

This expression is valid for the derivatives with respect to all variables except the first and the last, which are defined as:

\frac{\partial f}{\partial x_0} = -400\,x_0\,(x_1 - x_0^2) - 2\,(1 - x_0)

\frac{\partial f}{\partial x_{N-1}} = 200\,(x_{N-1} - x_{N-2}^2)

Let's look at the Python function that computes this gradient:

def rosen_der(x):
    xm = x[1:-1]
    xm_m1 = x[:-2]
    xm_p1 = x[2:]
    der = np.zeros_like(x)
    der[1:-1] = 200*(xm - xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1 - xm)
    der[0] = -400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0])
    der[-1] = 200*(x[-1] - x[-2]**2)
    return der

The gradient-computing function is specified as the value of the jac parameter of the minimize function, as shown below.

res = minimize(rosen, x0, method='BFGS', jac=rosen_der, options={'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 25
         Function evaluations: 30
         Gradient evaluations: 30
[1.00000004 1.0000001  1.00000021 1.00000044 1.00000092]
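
An optional verification (assuming the rosen and rosen_der definitions above): scipy.optimize.check_grad compares the analytic gradient with a finite-difference approximation and returns the norm of the difference, which should be close to zero.

from scipy.optimize import check_grad

# Norm of the difference between rosen_der and a numerical gradient at x0
print(check_grad(rosen, rosen_der, x0))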

Conjugate gradient algorithm (Newton)

The Newton conjugate gradient algorithm is a modified Newton's method.
Newton's method is based on approximating a function in a local region by a second-degree polynomial:

f(x) \approx f(x_0) + \nabla f(x_0)\cdot(x - x_0) + \frac{1}{2}(x - x_0)^T H(x_0)(x - x_0)

where H(x_0) is the matrix of second derivatives (the Hessian matrix).
If the Hessian is positive definite, then the local minimum of this function can be found by setting the gradient of the quadratic form equal to zero. The result is the expression:

x_{\mathrm{opt}} = x_0 - H^{-1}\nabla f

The inverse of the Hessian is computed using the conjugate gradient method. An example of using this method to minimize the Rosenbrock function is given below. To use the Newton-CG method, you must specify a function that computes the Hessian.
The Hessian of the Rosenbrock function in analytical form is:

H_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j} = 200\,(\delta_{i,j} - 2 x_{i-1}\delta_{i-1,j}) - 400\,x_i\,(\delta_{i+1,j} - 2 x_i\delta_{i,j}) - 400\,\delta_{i,j}\,(x_{i+1} - x_i^2) + 2\,\delta_{i,j}

= (202 + 1200\,x_i^2 - 400\,x_{i+1})\,\delta_{i,j} - 400\,x_i\,\delta_{i+1,j} - 400\,x_{i-1}\,\delta_{i-1,j}

where the indices i, j run over [1, N-2], defining the N x N matrix H.

The other nonzero entries of the matrix are:

\frac{\partial^2 f}{\partial x_0^2} = 1200\,x_0^2 - 400\,x_1 + 2

\frac{\partial^2 f}{\partial x_0 \partial x_1} = -400\,x_0

\frac{\partial^2 f}{\partial x_{N-1} \partial x_{N-2}} = -400\,x_{N-2}

\frac{\partial^2 f}{\partial x_{N-1}^2} = 200

For example, for N = 5 the Hessian matrix of the Rosenbrock function has the banded form:

H = \begin{pmatrix}
1200 x_0^2 - 400 x_1 + 2 & -400 x_0 & 0 & 0 & 0 \\
-400 x_0 & 202 + 1200 x_1^2 - 400 x_2 & -400 x_1 & 0 & 0 \\
0 & -400 x_1 & 202 + 1200 x_2^2 - 400 x_3 & -400 x_2 & 0 \\
0 & 0 & -400 x_2 & 202 + 1200 x_3^2 - 400 x_4 & -400 x_3 \\
0 & 0 & 0 & -400 x_3 & 200
\end{pmatrix}

The code that computes this Hessian, together with the code for minimizing the Rosenbrock function by the conjugate gradient (Newton) method:

def rosen_hess(x):
    x = np.asarray(x)
    H = np.diag(-400*x[:-1],1) - np.diag(400*x[:-1],-1)
    diagonal = np.zeros_like(x)
    diagonal[0] = 1200*x[0]**2-400*x[1]+2
    diagonal[-1] = 200
    diagonal[1:-1] = 202 + 1200*x[1:-1]**2 - 400*x[2:]
    H = H + np.diag(diagonal)
    return H

res = minimize(rosen, x0, method='Newton-CG', 
               jac=rosen_der, hess=rosen_hess,
               options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 24
[1.         1.         1.         0.99999999 0.99999999]
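
To connect the code with the formula above, here is an illustrative sketch (not part of the original example) of a single plain Newton step computed with the derivatives defined earlier; Newton-CG solves the same linear system only approximately, using conjugate gradients, and adds safeguards such as a line search:

# One exact Newton step: x1 = x0 - H(x0)^{-1} grad f(x0)
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
newton_step = np.linalg.solve(rosen_hess(x0), rosen_der(x0))
x1 = x0 - newton_step
print(x1, rosen(x1))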

Example with a definition of the Hessian-vector product function

In real-world problems, computing and storing the entire Hessian matrix can require significant time and memory resources. In that case there is actually no need to specify the Hessian matrix itself, because the minimization procedure only needs a vector equal to the product of the Hessian with another arbitrary vector. From a computational point of view it is therefore much better to define right away a function that returns the result of the product of the Hessian with an arbitrary vector.

Consider a function that takes the minimization vector as its first argument and an arbitrary vector as its second argument (along with the other arguments of the function being minimized). In our case, computing the product of the Hessian of the Rosenbrock function with an arbitrary vector is not very difficult. If p is an arbitrary vector, then the product H(x) p has the form:

H(x)\,p = \begin{pmatrix}
(1200 x_0^2 - 400 x_1 + 2)\,p_0 - 400 x_0\,p_1 \\
\vdots \\
-400 x_{i-1}\,p_{i-1} + (202 + 1200 x_i^2 - 400 x_{i+1})\,p_i - 400 x_i\,p_{i+1} \\
\vdots \\
-400 x_{N-2}\,p_{N-2} + 200\,p_{N-1}
\end{pmatrix}

The function that computes the product of the Hessian with an arbitrary vector is passed as the value of the hessp argument of the minimize function:

def rosen_hess_p(x, p):
    x = np.asarray(x)
    Hp = np.zeros_like(x)
    Hp[0] = (1200*x[0]**2 - 400*x[1] + 2)*p[0] - 400*x[0]*p[1]
    Hp[1:-1] = (-400*x[:-2]*p[:-2] + (202 + 1200*x[1:-1]**2 - 400*x[2:])*p[1:-1]
                - 400*x[1:-1]*p[2:])
    Hp[-1] = -400*x[-2]*p[-2] + 200*p[-1]
    return Hp

res = minimize(rosen, x0, method='Newton-CG',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'xtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 66
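
A quick consistency check (not part of the original example): the product returned by rosen_hess_p should match multiplying the explicit Hessian from rosen_hess by the same vector.

# Compare the explicit Hessian-vector product with rosen_hess_p
p = np.arange(5.0)
print(np.allclose(rosen_hess(x0) @ p, rosen_hess_p(x0, p)))   # True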

Trust-region conjugate gradient algorithm (Newton)

A poorly conditioned Hessian matrix and wrong search directions can make Newton's conjugate gradient algorithm ineffective. In such cases, preference is given to the trust-region method of Newton conjugate gradients.

Example with a definition of the Hessian matrix:

res = minimize(rosen, x0, method='trust-ncg',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 19
[1. 1. 1. 1. 1.]

Example with the Hessian-vector product function:

res = minimize(rosen, x0, method='trust-ncg', 
                jac=rosen_der, hessp=rosen_hess_p, 
                options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 0
[1. 1. 1. 1. 1.]

Krylov-type methods

Like the trust-ncg method, Krylov-type methods are well suited for solving large-scale problems because they use only matrix-vector products. Their essence is solving the problem in a trust region bounded by a truncated Krylov subspace. For indefinite problems it is better to use this method, since it needs fewer nonlinear iterations thanks to the smaller number of matrix-vector products per subproblem compared with the trust-ncg method. In addition, the solution of the quadratic subproblem is found more accurately than with the trust-ncg method.
Example with a definition of the Hessian matrix:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 18

print(res.x)

    [1. 1. 1. 1. 1.]

Example with the Hessian-vector product function:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 0

print(res.x)

    [1. 1. 1. 1. 1.]

Algorithm for a nearly exact solution in the trust region

All of these methods (Newton-CG, trust-ncg and trust-krylov) are well suited for solving large-scale problems (with thousands of variables). This is because the underlying conjugate gradient algorithm implies an approximate computation of the inverse Hessian matrix. The solution is found iteratively, without an explicit expansion of the Hessian. Since you only need to define a function for the product of the Hessian with an arbitrary vector, this algorithm is especially good for working with sparse (banded) matrices. This gives low memory costs and significant time savings.

For medium-size problems, the cost of storing and factorizing the Hessian is not critical. This means that a solution can be obtained in fewer iterations by solving the trust-region subproblems almost exactly. To do this, some nonlinear equations are solved iteratively for each quadratic subproblem. Such a solution usually requires 3 or 4 Cholesky decompositions of the Hessian matrix. As a result, the method converges in fewer iterations and requires fewer objective-function evaluations than the other trust-region methods implemented. This algorithm uses only the complete Hessian matrix and does not support working with the Hessian-vector product function.

Example of minimizing the Rosenbrock function:

res = minimize(rosen, x0, method='trust-exact',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
res.x

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 13
         Function evaluations: 14
         Gradient evaluations: 13
         Hessian evaluations: 14

array([1., 1., 1., 1., 1.])
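
To wrap up, here is a small illustrative comparison (not from the original article) of the Hessian-based methods shown above, run with default tolerances; the exact counts will vary between SciPy versions:

# Compare iteration and evaluation counts for the Hessian-based methods
for method in ['Newton-CG', 'trust-ncg', 'trust-krylov', 'trust-exact']:
    r = minimize(rosen, x0, method=method, jac=rosen_der, hess=rosen_hess)
    print(method, 'nit =', r.nit, 'nfev =', r.nfev, 'njev =', r.njev)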

We will probably stop here. In the next article I will try to cover the most interesting aspects of constrained minimization, applying minimization to approximation problems, minimizing functions of a single variable, arbitrary minimizers, and finding the roots of a system of equations with the scipy.optimize package.

Source: https://docs.scipy.org/doc/scipy/reference/

Source: www.habr.com
