SciPy, optimization

SciPy (pronounced "sigh pie") is a package of mathematical applications built on the NumPy extension of Python. With SciPy, an interactive Python session turns into a complete environment for data processing and prototyping of complex systems, comparable to MATLAB, IDL, Octave, R-Lab, and SciLab. Today I want to talk briefly about how to use some well-known optimization procedures from the scipy.optimize package. Detailed and up-to-date help on using its functions can always be obtained with the help() command or with Shift+Tab.
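
For instance, a minimal illustration (assuming SciPy is installed):

from scipy.optimize import minimize

# Print the full, current docstring for minimize (the same text Shift+Tab shows in Jupyter)
help(minimize)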

Introduction

To spare both the author and the readers from digging through and reading primary sources, links to descriptions of the methods will mostly point to Wikipedia. As a rule, that information is enough to understand the methods in general terms and the conditions for applying them. To understand the essence of the mathematical methods, follow the links to more authoritative publications, which can be found at the end of each article or in your favorite search engine.

So, the scipy.optimize module includes implementations of the following procedures (a brief illustrative sketch of items 3 and 4 follows the list):

  1. Conditional and unconditional minimization of scalar functions of several variables (minimize) using various algorithms (Nelder-Mead simplex, BFGS, Newton conjugate gradient, COBYLA, and SLSQP)
  2. Global optimization (for example, basinhopping, differential_evolution)
  3. Minimization of least-squares residuals (least_squares) and curve-fitting algorithms based on nonlinear least squares (curve_fit)
  4. Minimization of scalar functions of one variable (minimize_scalar) and root finding (root_scalar)
  5. Multidimensional solvers for systems of equations (root) using various algorithms (hybrid Powell, Levenberg-Marquardt, or large-scale methods such as Newton-Krylov).
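
Just to illustrate a couple of these interfaces before moving on, here is a small sketch of items 3 and 4 with made-up example data (the function names are the real scipy.optimize entry points; the data is invented):

import numpy as np
from scipy.optimize import minimize_scalar, root_scalar, curve_fit

# Item 4: minimize a scalar function of one variable and find a root of another
res = minimize_scalar(lambda x: (x - 2.0)**2 + 1.0)
print(res.x)                      # close to 2.0

sol = root_scalar(lambda x: x**3 - 1.0, bracket=[0.0, 2.0])
print(sol.root)                   # close to 1.0

# Item 3: fit y = a*x + b to noisy made-up data by nonlinear least squares
xdata = np.linspace(0.0, 1.0, 20)
ydata = 3.0*xdata + 0.5 + 0.01*np.random.randn(20)
popt, pcov = curve_fit(lambda x, a, b: a*x + b, xdata, ydata)
print(popt)                       # approximately [3.0, 0.5]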

In this article we will consider only the first item from this entire list.

Unconditional minimization of a scalar function of several variables

The minimize function from the scipy.optimize package provides a common interface for solving problems of conditional and unconditional minimization of scalar functions of several variables. To demonstrate how it works, we will need a suitable function of several variables, which we will minimize in various ways.

For these purposes, the Rosenbrock function of N variables is perfect. It has the form:

f(x) = \sum_{i=1}^{N-1} \left[ 100\,(x_i - x_{i-1}^2)^2 + (1 - x_{i-1})^2 \right], \qquad x = (x_0, \ldots, x_{N-1})

Despite the fact that the Rosenbrock function and its Jacobian and Hessian matrices (the first and second derivatives, respectively) are already defined in the scipy.optimize package, we will define it ourselves.

import numpy as np

def rosen(x):
    """The Rosenbrock function"""
    return np.sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0, axis=0)

For clarity, let's plot the values of the Rosenbrock function of two variables in 3D.

Plotting code

from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot as plt
from matplotlib import cm

# Set up the 3D figure
fig = plt.figure(figsize=[15, 10])
ax = fig.add_subplot(projection='3d')  # fig.gca(projection='3d') was removed in newer Matplotlib

# Set the viewing angle
ax.view_init(45, 30)

# Create the data for the plot
X = np.arange(-2, 2, 0.1)
Y = np.arange(-1, 3, 0.1)
X, Y = np.meshgrid(X, Y)
Z = rosen(np.array([X, Y]))

# Draw the surface
surf = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm)
plt.show()

[Figure: 3D surface plot of the Rosenbrock function of two variables]

Knowing in advance that the minimum is 0 at x_i = 1, let's look at examples of how to determine the minimum value of the Rosenbrock function using various scipy.optimize procedures.
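
A quick check of the function defined above confirms this:

# The global minimum of the Rosenbrock function: f(1, 1, ..., 1) = 0
print(rosen(np.ones(5)))   # 0.0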

Nelder-Mead simplex method

Let there be an initial point x0 in 5-dimensional space. Let's find the minimum of the Rosenbrock function closest to it using the Nelder-Mead simplex algorithm (the algorithm is specified as the value of the method parameter):

from scipy.optimize import minimize
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='nelder-mead',
    options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 339
         Function evaluations: 571
[1. 1. 1. 1. 1.]

The simplex method is the simplest way to minimize an explicitly defined and reasonably smooth function. It does not require computing the derivatives of the function; it is enough to specify only its values. The Nelder-Mead method is a good choice for simple minimization problems. However, since it does not use gradient estimates, it may take longer to find the minimum.

Powell's method

Another optimization algorithm that evaluates only function values is Powell's method. To use it, set method='powell' in the minimize function.

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='powell',
    options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 1622
[1. 1. 1. 1. 1.]

Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm

To achieve faster convergence, the BFGS procedure uses the gradient of the objective function. The gradient can be specified as a function or computed using first-order finite differences. In either case, the BFGS method usually requires fewer function calls than the simplex method.
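
As a minimal sketch of the case where no analytical gradient is supplied, so minimize estimates it by finite differences:

from scipy.optimize import minimize

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
# Without jac, BFGS approximates the gradient numerically,
# at the cost of extra function evaluations per iteration.
res = minimize(rosen, x0, method='BFGS', options={'disp': True})
print(res.x)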

Let's find the gradient of the Rosenbrock function in analytical form:

\frac{\partial f}{\partial x_j} = \sum_{i=1}^{N-1} \left[ 200\,(x_i - x_{i-1}^2)(\delta_{i,j} - 2 x_{i-1}\delta_{i-1,j}) - 2\,(1 - x_{i-1})\,\delta_{i-1,j} \right]

= 200\,(x_j - x_{j-1}^2) - 400\,x_j\,(x_{j+1} - x_j^2) - 2\,(1 - x_j)

This expression is valid for the derivatives with respect to all variables except the first and the last, which are defined as:

\frac{\partial f}{\partial x_0} = -400\,x_0\,(x_1 - x_0^2) - 2\,(1 - x_0)

\frac{\partial f}{\partial x_{N-1}} = 200\,(x_{N-1} - x_{N-2}^2)

Let's look at a Python function that computes this gradient:

def rosen_der(x):
    xm = x[1:-1]
    xm_m1 = x[:-2]
    xm_p1 = x[2:]
    der = np.zeros_like(x)
    der[1:-1] = 200*(xm - xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1 - xm)
    der[0] = -400*x[0]*(x[1] - x[0]**2) - 2*(1 - x[0])
    der[-1] = 200*(x[-1] - x[-2]**2)
    return der

The gradient-computing function is specified as the value of the jac parameter of the minimize function, as shown below.

res = minimize(rosen, x0, method='BFGS', jac=rosen_der, options={'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 25
         Function evaluations: 30
         Gradient evaluations: 30
[1.00000004 1.0000001  1.00000021 1.00000044 1.00000092]

Newton conjugate gradient algorithm (Newton-CG)

The Newton conjugate gradient algorithm is a modification of Newton's method.
Newton's method is based on approximating a function in a local region by a second-degree polynomial:

f(x) \approx f(x_0) + \nabla f(x_0) \cdot (x - x_0) + \frac{1}{2}\,(x - x_0)^T H(x_0)\,(x - x_0)

where H(x_0) is the matrix of second derivatives (the Hessian matrix).
If the Hessian is positive definite, then the local minimum of this function can be found by setting the gradient of the quadratic form to zero. The result is the expression:

x_{\mathrm{opt}} = x_0 - H^{-1}\,\nabla f
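
To make this expression concrete, here is a small sketch with a made-up quadratic function (not the Rosenbrock function): a single Newton step x_0 - H^{-1} \nabla f lands exactly at the minimizer of a quadratic.

# f(x) = 0.5 * x^T A x - b^T x, with constant Hessian A and gradient A x - b
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])             # positive definite Hessian
b = np.array([1.0, 1.0])

x0 = np.array([5.0, -7.0])             # arbitrary starting point
grad = A @ x0 - b                      # gradient at x0
x_opt = x0 - np.linalg.solve(A, grad)  # Newton step: x0 - H^{-1} * grad

print(x_opt)                           # identical to the exact minimizer A^{-1} b
print(np.linalg.solve(A, b))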

The inverse Hessian is computed using the conjugate gradient method. An example of using this method to minimize the Rosenbrock function is given below. To use the Newton-CG method, you must specify a function that computes the Hessian.
The Hessian of the Rosenbrock function in analytical form is:

H_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j} = 200\,(\delta_{i,j} - 2 x_{i-1}\delta_{i-1,j}) - 400\,x_i\,(\delta_{i+1,j} - 2 x_i \delta_{i,j}) - 400\,\delta_{i,j}\,(x_{i+1} - x_i^2) + 2\,\delta_{i,j}

= (202 + 1200\,x_i^2 - 400\,x_{i+1})\,\delta_{i,j} - 400\,x_i\,\delta_{i+1,j} - 400\,x_{i-1}\,\delta_{i-1,j}

where i, j \in [1, N-2], with the indices i, j \in [0, N-1] defining the N \times N matrix.

The other non-zero entries of the matrix are:

\frac{\partial^2 f}{\partial x_0^2} = 1200\,x_0^2 - 400\,x_1 + 2

\frac{\partial^2 f}{\partial x_0 \partial x_1} = \frac{\partial^2 f}{\partial x_1 \partial x_0} = -400\,x_0

\frac{\partial^2 f}{\partial x_{N-1} \partial x_{N-2}} = \frac{\partial^2 f}{\partial x_{N-2} \partial x_{N-1}} = -400\,x_{N-2}

\frac{\partial^2 f}{\partial x_{N-1}^2} = 200

For example, in five-dimensional space (N = 5) the Hessian matrix of the Rosenbrock function has a banded form:

H = \begin{pmatrix}
1200 x_0^2 - 400 x_1 + 2 & -400 x_0 & 0 & 0 & 0 \\
-400 x_0 & 202 + 1200 x_1^2 - 400 x_2 & -400 x_1 & 0 & 0 \\
0 & -400 x_1 & 202 + 1200 x_2^2 - 400 x_3 & -400 x_2 & 0 \\
0 & 0 & -400 x_2 & 202 + 1200 x_3^2 - 400 x_4 & -400 x_3 \\
0 & 0 & 0 & -400 x_3 & 200
\end{pmatrix}

The code that computes this Hessian, together with the code for minimizing the Rosenbrock function using the conjugate gradient (Newton-CG) method:

def rosen_hess(x):
    x = np.asarray(x)
    H = np.diag(-400*x[:-1],1) - np.diag(400*x[:-1],-1)
    diagonal = np.zeros_like(x)
    diagonal[0] = 1200*x[0]**2-400*x[1]+2
    diagonal[-1] = 200
    diagonal[1:-1] = 202 + 1200*x[1:-1]**2 - 400*x[2:]
    H = H + np.diag(diagonal)
    return H

res = minimize(rosen, x0, method='Newton-CG', 
               jac=rosen_der, hess=rosen_hess,
               options={'xtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 24
[1.         1.         1.         0.99999999 0.99999999]

An example with the product of the Hessian and an arbitrary vector

In real-world problems, computing and storing the entire Hessian matrix can require significant time and memory. In that case there is no need to specify the Hessian matrix itself, because the minimization procedure only needs a vector equal to the product of the Hessian with another arbitrary vector. So, from a computational point of view, it is much better to define right away a function that returns the product of the Hessian with an arbitrary vector.

Consider the hess function, which takes the minimization vector as its first argument and an arbitrary vector as its second argument (along with the other arguments of the function being minimized). In our case, computing the product of the Hessian of the Rosenbrock function with an arbitrary vector is not very difficult. If p is an arbitrary vector, then the product H(x)Β·p has the form:

H(x)\,p = \begin{pmatrix}
(1200 x_0^2 - 400 x_1 + 2)\,p_0 - 400 x_0\,p_1 \\
\vdots \\
-400 x_{i-1}\,p_{i-1} + (202 + 1200 x_i^2 - 400 x_{i+1})\,p_i - 400 x_i\,p_{i+1} \\
\vdots \\
-400 x_{N-2}\,p_{N-2} + 200\,p_{N-1}
\end{pmatrix}

The function that computes the product of the Hessian with an arbitrary vector is passed as the value of the hessp argument of the minimize function:

def rosen_hess_p(x, p):
    x = np.asarray(x)
    Hp = np.zeros_like(x)
    Hp[0] = (1200*x[0]**2 - 400*x[1] + 2)*p[0] - 400*x[0]*p[1]
    Hp[1:-1] = (-400*x[:-2]*p[:-2] + (202 + 1200*x[1:-1]**2 - 400*x[2:])*p[1:-1]
                - 400*x[1:-1]*p[2:])
    Hp[-1] = -400*x[-2]*p[-2] + 200*p[-1]
    return Hp

res = minimize(rosen, x0, method='Newton-CG',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'xtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 24
         Function evaluations: 33
         Gradient evaluations: 56
         Hessian evaluations: 66

Newton conjugate gradient trust-region algorithm (trust-ncg)

An ill-conditioned Hessian matrix and wrong search directions can cause Newton's conjugate gradient algorithm to be inefficient. In such cases, preference is given to the trust-region method of Newton conjugate gradients.

An example with the Hessian matrix defined explicitly:

res = minimize(rosen, x0, method='trust-ncg',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 19
[1. 1. 1. 1. 1.]

An example with the product of the Hessian and an arbitrary vector:

res = minimize(rosen, x0, method='trust-ncg', 
                jac=rosen_der, hessp=rosen_hess_p, 
                options={'gtol': 1e-8, 'disp': True})
print(res.x)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 20
         Function evaluations: 21
         Gradient evaluations: 20
         Hessian evaluations: 0
[1. 1. 1. 1. 1.]

Krylov-type methods

Like the trust-ncg method, Krylov-type methods are well suited to solving large-scale problems because they use only matrix-vector products. Their essence is to solve the problem in a trust region bounded by a truncated Krylov subspace. For indefinite problems it is better to use this method, because it needs fewer nonlinear iterations thanks to a smaller number of matrix-vector products per subproblem, compared with the trust-ncg method. In addition, the solution of the quadratic subproblem is found more accurately than with the trust-ncg method.

An example with the Hessian matrix defined explicitly:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 18

print(res.x)

    [1. 1. 1. 1. 1.]

An example with the product of the Hessian and an arbitrary vector:

res = minimize(rosen, x0, method='trust-krylov',
               jac=rosen_der, hessp=rosen_hess_p,
               options={'gtol': 1e-8, 'disp': True})

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 20
         Gradient evaluations: 20
         Hessian evaluations: 0

print(res.x)

    [1. 1. 1. 1. 1.]

Nearly exact trust-region solution algorithm (trust-exact)

All of these methods (Newton-CG, trust-ncg, and trust-krylov) are well suited to solving large-scale problems (with thousands of variables). This is because the underlying conjugate gradient algorithm implies an approximate computation of the inverse Hessian matrix. The solution is found iteratively, without explicitly expanding the Hessian. Since you only need to define a function for the product of the Hessian with an arbitrary vector, this algorithm is especially good for working with sparse (band-diagonal) matrices. This provides low memory costs and significant time savings.
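
As an illustration of that claim, a hedged sketch (reusing the rosen, rosen_der, and rosen_hess_p functions defined above; the problem size here is made up):

# Matrix-free Newton-CG for N = 3000 variables: only Hessian-vector products are
# evaluated, so the 3000 x 3000 Hessian matrix is never formed or stored.
N = 3000
x0_big = np.full(N, 1.2)
res_big = minimize(rosen, x0_big, method='Newton-CG',
                   jac=rosen_der, hessp=rosen_hess_p,
                   options={'xtol': 1e-8})
print(res_big.success, res_big.nit)
print(res_big.x[:5])   # expected to be close to [1. 1. 1. 1. 1.]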

For medium-sized problems, the cost of storing and factoring the Hessian is not critical. This means that a solution can be obtained in fewer iterations by solving the trust-region subproblems almost exactly. To do this, some nonlinear equations are solved iteratively for each quadratic subproblem. Such a solution usually requires 3 or 4 Cholesky decompositions of the Hessian matrix. As a result, the method converges in fewer iterations and requires fewer objective function evaluations than the other trust-region methods implemented. This algorithm involves only the full Hessian matrix and does not support replacing it with the product of the Hessian and an arbitrary vector.

An example with minimization of the Rosenbrock function:

res = minimize(rosen, x0, method='trust-exact',
               jac=rosen_der, hess=rosen_hess,
               options={'gtol': 1e-8, 'disp': True})
res.x

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 13
         Function evaluations: 14
         Gradient evaluations: 13
         Hessian evaluations: 14

array([1., 1., 1., 1., 1.])

Perhaps we will stop there. In the next article I will try to cover the most interesting things about conditional minimization, the application of minimization to solving approximation problems, minimizing a function of one variable, arbitrary minimizers, and finding the roots of a system of equations using the scipy.optimize package.

Source: https://docs.scipy.org/doc/scipy/reference/

Source: www.hab.com
