SciPy: optimization

SciPy (pronounced "sigh pie") is a mathematical application package based on the NumPy extension of Python. With SciPy, an interactive Python session becomes a complete environment for data science and prototyping of complex systems, on par with MATLAB, IDL, Octave, R-Lab, and SciLab. Today I want to talk briefly about how to use some well-known optimization algorithms from the scipy.optimize package. You can always get detailed, up-to-date help on a function with the help() command or with Shift+Tab.
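
For example, a minimal way to pull up the built-in documentation for the main entry point used below (nothing beyond a standard SciPy install is assumed here):

from scipy import optimize

# Print the signature and docstring of the minimize function
help(optimize.minimize)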

Introduction

To save you and the readers from searching and reading primary sources, links to descriptions of the methods will mostly point to Wikipedia. As a rule, this information is sufficient to understand the methods in general terms and the conditions for their application. To understand the essence of the mathematical methods, follow the links to more authoritative publications, which can be found at the end of each article or in your favorite search engine.

So, the scipy.optimize module includes implementations of the following procedures:

  1. Constrained and unconstrained minimization of scalar functions of several variables (minimize) using various algorithms (Nelder-Mead simplex, BFGS, Newton conjugate gradient, COBYLA, and SLSQP)
  2. Global optimization (e.g., basinhopping, differential_evolution)
  3. Minimization of least-squares residuals (least_squares) and curve-fitting algorithms based on least squares (curve_fit)
  4. Minimization of scalar functions of one variable (minimize_scalar) and root finding (root_scalar); see the short sketch after this list
  5. Multivariate solvers for systems of equations (root) using various algorithms (hybrid Powell, Levenberg-Marquardt, or large-scale methods such as Newton-Krylov).
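
As a quick taste of item 4 (the rest of this article covers only item 1), here is a minimal sketch of minimize_scalar and root_scalar on toy functions; the function choices are illustrative, not from the original article:

from scipy.optimize import minimize_scalar, root_scalar

# Minimize a univariate function (Brent's method is the default)
res = minimize_scalar(lambda x: (x - 2.0)**2)
print(res.x)     # close to 2.0

# Find a root of x**3 - 1 bracketed on [0, 2]
sol = root_scalar(lambda x: x**3 - 1.0, bracket=[0, 2], method='brentq')
print(sol.root)  # close to 1.0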

In this article we will consider only the first item from this whole list.

Unconstrained minimization of a function of several variables

The minimize function from the scipy.optimize package provides a common interface for solving constrained and unconstrained minimization problems for scalar functions of several variables. To demonstrate how it works, we will need a suitable function of several variables, which we will minimize in different ways.

For these purposes, the Rosenbrock function of N variables fits perfectly. It has the form:

    f(x) = \sum_{i=0}^{N-2} \left[ 100\,(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right], \qquad x = (x_0, \ldots, x_{N-1})

Despite the fact that the Rosenbrock function and its Jacobian and Hessian matrices (the first and second derivatives, respectively) are already defined in the scipy.optimize package, we will define it ourselves.

import numpy as np

def rosen(x):
    """The Rosenbrock function"""
    return np.sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0, axis=0)
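
As a quick sanity check (an addition, not in the original), the function should vanish at its known global minimum x = (1, ..., 1):

# The global minimum of the Rosenbrock function is 0 at x = (1, ..., 1)
print(rosen(np.ones(5)))   # 0.0
print(rosen(np.zeros(5)))  # 4.0: four (1 - 0)**2 terms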

For clarity, let us plot the values of the Rosenbrock function of two variables in 3D.

Plotting code

from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib.ticker import LinearLocator, FormatStrFormatter

# Set up the 3D plot
fig = plt.figure(figsize=[15, 10])
ax = fig.add_subplot(projection='3d')

# Set the viewing angle
ax.view_init(45, 30)

# Create the data for the plot
X = np.arange(-2, 2, 0.1)
Y = np.arange(-1, 3, 0.1)
X, Y = np.meshgrid(X, Y)
Z = rosen(np.array([X,Y]))

# Draw the surface
surf = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm)
plt.show()

[3D surface plot of the Rosenbrock function of two variables]

Knowing in advance that the minimum is 0 at x_i = 1, let us look at examples of how to determine the minimum value of the Rosenbrock function using various scipy.optimize procedures.

Nelder-Mead simplex method

Let x0 be an initial point in 5-dimensional space. Let us find the minimum point of the Rosenbrock function closest to it using the Nelder-Mead simplex algorithm (the algorithm is specified as the value of the method parameter):

from scipy.optimize import minimize
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='nelder-mead',
    options={'xatol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 339
         Function evaluations: 571
[1. 1. 1. 1. 1.]

The simplex method is the simplest way to minimize a fairly well-behaved, smooth function. It does not require computing the derivatives of the function; it is enough to be able to evaluate its values. The Nelder-Mead method is a good choice for simple minimization problems. However, since it uses no gradient estimates, it may take longer to find the minimum.

Powell's method

Another optimization algorithm that evaluates only function values is Powell's method. To use it, set method='powell' in the minimize function.

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='powell',
    options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 1622
[1. 1. 1. 1. 1.]

Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm

To converge to a solution faster, the BFGS procedure uses the gradient of the objective function. The gradient can be supplied as a function, or it can be estimated using first-order differences. In either case, the BFGS method typically requires fewer function calls than the simplex method.
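
To illustrate the fallback just mentioned: if jac is omitted, minimize approximates the gradient by finite differences at the cost of extra function evaluations (a small sketch reusing rosen and x0 from above):

# No jac argument: BFGS falls back to a finite-difference gradient estimate
res = minimize(rosen, x0, method='BFGS', options={'disp': True})
print(res.x)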

Let us find the gradient of the Rosenbrock function in analytic form:

    \frac{\partial f}{\partial x_j} = \sum_{i=0}^{N-2} \left[ 200\,(x_{i+1} - x_i^2)\,(\delta_{i+1,j} - 2 x_i \delta_{i,j}) - 2\,(1 - x_i)\,\delta_{i,j} \right]

    \frac{\partial f}{\partial x_j} = 200\,(x_j - x_{j-1}^2) - 400\,x_j\,(x_{j+1} - x_j^2) - 2\,(1 - x_j)

This expression is valid for the derivatives with respect to all variables except the first and the last, which are defined as:

    \frac{\partial f}{\partial x_0} = -400\,x_0\,(x_1 - x_0^2) - 2\,(1 - x_0)

    \frac{\partial f}{\partial x_{N-1}} = 200\,(x_{N-1} - x_{N-2}^2)

Let us look at the Python function that computes this gradient:

def rosen_der(x):
    xm = x[1:-1]
    xm_m1 = x[:-2]
    xm_p1 = x[2:]
    der = np.zeros_like(x)
    der[1:-1] = 200*(xm - xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1 - xm)
    der[0] = -400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0])
    der[-1] = 200*(x[-1] - x[-2]**2)
    return der
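
Before handing an analytic gradient to the optimizer, it is worth checking it against a finite-difference approximation; scipy.optimize.check_grad does exactly that (a verification step added here, not part of the original article):

from scipy.optimize import check_grad

# Norm of the difference between the analytic gradient and a
# finite-difference approximation; should be close to zero
print(check_grad(rosen, rosen_der, np.array([1.3, 0.7, 0.8, 1.9, 1.2])))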

The gradient-computing function is specified as the value of the jac parameter of the minimize function, as shown below.

res = minimize(rosen, x0, method='BFGS', jac=rosen_der, options={'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 25
         Function evaluations: 30
         Gradient evaluations: 30
[1.00000004 1.0000001  1.00000021 1.00000044 1.00000092]

Newton conjugate gradient algorithm (Newton-CG)

The Newton conjugate gradient algorithm is a modified Newton's method.
Newton's method is based on approximating a function in a local region by a second-order Taylor polynomial:

    f(x) \approx f(x_0) + \nabla f(x_0) \cdot (x - x_0) + \tfrac{1}{2} (x - x_0)^{\mathsf T} H(x_0)\,(x - x_0)

where H(x_0) is the matrix of second derivatives (the Hessian matrix).
If the Hessian is positive definite, the local minimum of this function can be found by equating the gradient of the quadratic form to zero. The result is the expression:

    x_{\mathrm{opt}} = x_0 - H^{-1} \nabla f

The inverse Hessian is evaluated using the conjugate gradient method. An example of minimizing the Rosenbrock function with this method is given below. To use the Newton-CG method, you must specify a function that computes the Hessian.
The Hessian of the Rosenbrock function in analytic form is:

    H_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j} = 200\,(\delta_{i,j} - 2 x_{i-1} \delta_{i-1,j}) - 400\,x_i\,(\delta_{i+1,j} - 2 x_i \delta_{i,j}) - 400\,\delta_{i,j}\,(x_{i+1} - x_i^2) + 2\,\delta_{i,j}

    H_{ij} = (202 + 1200 x_i^2 - 400 x_{i+1})\,\delta_{i,j} - 400\,x_i\,\delta_{i+1,j} - 400\,x_{i-1}\,\delta_{i-1,j}

where i, j ∈ [1, N − 2] and x ∈ ℝ^N define the N×N matrix.

The other non-zero entries of the matrix are:

    \frac{\partial^2 f}{\partial x_0^2} = 1200\,x_0^2 - 400\,x_1 + 2

    \frac{\partial^2 f}{\partial x_0 \partial x_1} = \frac{\partial^2 f}{\partial x_1 \partial x_0} = -400\,x_0

    \frac{\partial^2 f}{\partial x_{N-1} \partial x_{N-2}} = \frac{\partial^2 f}{\partial x_{N-2} \partial x_{N-1}} = -400\,x_{N-2}

    \frac{\partial^2 f}{\partial x_{N-1}^2} = 200

For example, in five-dimensional space (N = 5), the Hessian matrix of the Rosenbrock function has the banded form:

    H = \begin{bmatrix}
        1200 x_0^2 - 400 x_1 + 2 & -400 x_0 & 0 & 0 & 0 \\
        -400 x_0 & 202 + 1200 x_1^2 - 400 x_2 & -400 x_1 & 0 & 0 \\
        0 & -400 x_1 & 202 + 1200 x_2^2 - 400 x_3 & -400 x_2 & 0 \\
        0 & 0 & -400 x_2 & 202 + 1200 x_3^2 - 400 x_4 & -400 x_3 \\
        0 & 0 & 0 & -400 x_3 & 200
    \end{bmatrix}

The code that computes this Hessian, along with the code for minimizing the Rosenbrock function by the conjugate gradient (Newton-CG) method, is below:

def rosen_hess(x):
    x = np.asarray(x)
    H = np.diag(-400*x[:-1],1) - np.diag(400*x[:-1],-1)
    diagonal = np.zeros_like(x)
    diagonal[0] = 1200*x[0]**2-400*x[1]+2
    diagonal[-1] = 200
    diagonal[1:-1] = 202 + 1200*x[1:-1]**2 - 400*x[2:]
    H = H + np.diag(diagonal)
    return H

res = minimize(rosen, x0, method='Newton-CG', 
               jac=rosen_der, hess=rosen_hess,
               options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 24
[1.         1.         1.         0.99999999 0.99999999]
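
To connect the Newton update formula above with the code, here is a bare-bones Newton iteration written directly with rosen_der and rosen_hess (purely illustrative, with no line search or trust-region safeguards, so it can misbehave for unlucky starting points):

# Plain Newton iteration: x_{k+1} = x_k - H(x_k)^{-1} grad f(x_k)
x = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
for _ in range(20):
    x = x - np.linalg.solve(rosen_hess(x), rosen_der(x))
print(x)  # typically settles near (1, ..., 1) from this starting point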

Example with a function for the product of the Hessian and an arbitrary vector

In real-world problems, computing and storing the entire Hessian matrix can require significant time and memory resources. In that case there is actually no need to specify the Hessian matrix itself, because the minimization procedure only requires a vector equal to the product of the Hessian with another arbitrary vector. From a computational point of view, it is therefore much better to define a function that returns the product of the Hessian with an arbitrary vector.

Consider the hessp function, which takes the vector being minimized as its first argument and an arbitrary vector as its second argument (along with the other arguments of the function being minimized). In our case, computing the product of the Hessian of the Rosenbrock function with an arbitrary vector is not very difficult. If p is an arbitrary vector, the product H(x)·p has the form:

    H(x)\,p = \begin{bmatrix}
        (1200 x_0^2 - 400 x_1 + 2)\,p_0 - 400\,x_0\,p_1 \\
        \vdots \\
        -400\,x_{i-1}\,p_{i-1} + (202 + 1200 x_i^2 - 400 x_{i+1})\,p_i - 400\,x_i\,p_{i+1} \\
        \vdots \\
        -400\,x_{N-2}\,p_{N-2} + 200\,p_{N-1}
    \end{bmatrix}

The function that computes the product of the Hessian with an arbitrary vector is passed as the value of the hessp argument to the minimize function:

def rosen_hess_p(x, p):
    x = np.asarray(x)
    Hp = np.zeros_like(x)
    Hp[0] = (1200*x[0]**2 - 400*x[1] + 2)*p[0] - 400*x[0]*p[1]
    Hp[1:-1] = (-400*x[:-2]*p[:-2] + (202+1200*x[1:-1]**2-400*x[2:])*p[1:-1]
                - 400*x[1:-1]*p[2:])
    Hp[-1] = -400*x[-2]*p[-2] + 200*p[-1]
    return Hp
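
A quick consistency check (an addition to the original): the hand-coded product should match multiplying the dense Hessian from rosen_hess by the same vector:

# The specialized Hessian-vector product should agree with the dense product
p = np.arange(1.0, 6.0)
x_test = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
print(np.allclose(rosen_hess(x_test) @ p, rosen_hess_p(x_test, p)))  # True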

res = minimize(rosen, x0, method='Newton-CG',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'xtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 66

Trust-region Newton conjugate gradient algorithm (trust-ncg)

Poor conditioning of the Hessian matrix and wrong search directions can cause Newton's conjugate gradient algorithm to be ineffective. In such cases, preference is given to the trust-region method combined with Newton conjugate gradients.

Example with the Hessian matrix defined:

res = minimize(rosen, x0, method='trust-ncg',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 19
[1. 1. 1. 1. 1.]

Example with a Hessian-times-arbitrary-vector product function:

res = minimize(rosen, x0, method='trust-ncg', 
                jac=rosen_der, hessp=rosen_hess_p, 
                options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 0
[1. 1. 1. 1. 1.]

Krylov-type trust-region method (trust-krylov)

Like the trust-ncg method, Krylov-type methods are well suited to large-scale problems, because they use only matrix-vector products. Their essence is to solve the problem within a trust region bounded by a truncated Krylov subspace. For difficult problems this method is preferable, because it uses fewer nonlinear iterations thanks to a smaller number of matrix-vector products per subproblem compared with the trust-ncg method. In addition, the solution of the quadratic subproblem is found more accurately than with the trust-ncg method.
Example with the Hessian matrix defined:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 18

print(res.x)

    [1. 1. 1. 1. 1.]

Example with a Hessian-times-arbitrary-vector product function:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 0

print(res.x)

    [1. 1. 1. 1. 1.]

Nearly exact trust-region algorithm (trust-exact)

All of these methods (Newton-CG, trust-ncg, and trust-krylov) are well suited to solving large-scale problems (with thousands of variables). This is because the conjugate gradient algorithm underlying them implies an approximate solution of the trust-region subproblem (or an approximate inversion of the Hessian) by iteration, without an explicit Hessian factorization. Since only the product of the Hessian with an arbitrary vector needs to be defined, these algorithms are especially good for working with sparse (band-diagonal) Hessians: they have low memory costs and give significant time savings.

For medium-sized problems, where the storage and factorization costs of the Hessian are not critical, it is possible to obtain a solution in fewer iterations by solving the trust-region subproblems almost exactly. To do this, certain nonlinear equations are solved iteratively for each quadratic subproblem. Such a solution usually requires 3 or 4 Cholesky factorizations of the Hessian matrix. As a result, the method converges in fewer iterations and requires fewer objective-function evaluations than the other implemented trust-region methods. This algorithm only supports specifying the complete Hessian matrix; it does not support the Hessian-times-arbitrary-vector product option.

Example with minimization of the Rosenbrock function:

res = minimize(rosen, x0, method='trust-exact',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
res.x

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 13
         Function evaluations: 14
         Gradient evaluations: 13
         Hessian evaluations: 14

array([1., 1., 1., 1., 1.])

Perhaps we will stop there. In the next article I will try to describe the most interesting aspects of constrained minimization, applications of minimization to approximation problems, minimizing a function of one variable, arbitrary minimizers, and finding the roots of a system of equations using the scipy.optimize package.

Source: https://docs.scipy.org/doc/scipy/reference/

Source: will.com
