SciPy, optimization with constraints

SciPy (pronounced "sigh pie") is a package of applied mathematical routines built on compiled C and Fortran libraries. SciPy turns your interactive Python session into a full-fledged data-processing and prototyping environment on par with MATLAB, IDL, Octave, R, or SciLab.

In this article we will look at the basic techniques of mathematical programming: solving constrained optimization problems for a scalar function of several variables with the scipy.optimize package. Unconstrained optimization algorithms were covered in the previous article. More detailed and up-to-date reference on the scipy functions can always be obtained with the help() command, with Shift+Tab, or in the official documentation.
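
As a quick illustration (our own addition, not part of the original article), the built-in help for minimize() can be pulled up like this:

from scipy.optimize import minimize

# Prints the full signature of minimize(), the list of supported
# methods, and the meaning of every argument.
help(minimize)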

Introduction

A unified interface for solving both constrained and unconstrained optimization problems in the scipy.optimize package is provided by the minimize() function. It is well known, however, that there is no universal method for solving every problem, so choosing an adequate method, as usual, falls on the shoulders of the researcher.
The appropriate optimization algorithm is selected with the function argument minimize(..., method="").
For the constrained optimization of a function of several variables, implementations of the following methods are available:

  • trust-constr — searches for a local minimum within a trust region. Wiki article, article on Habr;
  • SLSQP — sequential quadratic programming with constraints, a Newtonian method for solving the Lagrange system. Wiki article;
  • TNC — Truncated Newton Constrained, limits the number of iterations, good for nonlinear functions with a large number of independent variables. Wiki article;
  • L-BFGS-B — a method from the Broyden–Fletcher–Goldfarb–Shanno family, implemented with reduced memory consumption thanks to loading only some of the vectors of the Hessian matrix. Wiki article, article on Habr;
  • COBYLA — Constrained Optimization By Linear Approximation, constrained optimization with linear approximation (no gradient computation). Wiki article.

Depending on the chosen method, the conditions and constraints of the problem are specified differently:

  • a Bounds class object — for the L-BFGS-B, TNC, SLSQP, trust-constr methods (see the sketch after this list);
  • a list of tuples (min, max) — for the same methods L-BFGS-B, TNC, SLSQP, trust-constr;
  • an object or a list of objects LinearConstraint, NonlinearConstraint — for the COBYLA, SLSQP, trust-constr methods;
  • a dictionary or a list of dictionaries {'type':str, 'fun':callable, 'jac':callable,opt, 'args':sequence,opt} — for the COBYLA, SLSQP methods.
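
As a small sketch of the first two options (the quadratic function and the box here are our own illustration, not from the original article), the same box constraint can be passed either as a Bounds object or as a list of (min, max) tuples:

import numpy as np
from scipy.optimize import minimize, Bounds

# A simple quadratic with unconstrained minimum at (1, 2.5),
# which lies outside the box [0, 2] x [0, 2].
def f(x):
    return (x[0] - 1)**2 + (x[1] - 2.5)**2

x0 = np.array([2.0, 0.0])

# Option 1: a Bounds object (all lower bounds first, then all upper bounds).
res1 = minimize(f, x0, method='L-BFGS-B', bounds=Bounds([0, 0], [2, 2]))

# Option 2: the same box as a list of (min, max) tuples.
res2 = minimize(f, x0, method='L-BFGS-B', bounds=[(0, 2), (0, 2)])

print(res1.x, res2.x)  # both converge to approximately [1., 2.]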

Outline of the article:
1) consider the conditional optimization algorithm in a trust region (method="trust-constr") with constraints specified as Bounds, LinearConstraint, and NonlinearConstraint objects;
2) consider sequential least squares programming (method="SLSQP") with constraints specified as dictionaries {'type', 'fun', 'jac', 'args'};
3) walk through an example of optimizing production output, using a web studio as the example.

Conditional optimization with method="trust-constr"

The implementation of the trust-constr method is based on EQSQP for problems with equality constraints and on TRIP (trust-region interior point) for problems with inequality constraints. Both methods are implemented as algorithms that search for a local minimum within a trust region and are well suited to large-scale problems.

The mathematical statement of the local-minimum search problem:

$$\min_x f(x)$$

$$c^l \le c(x) \le c^u, \qquad x^l \le x \le x^u$$

For a strict equality constraint, the lower bound is set equal to the upper bound, $c^l_j = c^u_j$.
For a one-sided constraint, the upper or lower bound is set to np.inf with the appropriate sign.
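
A minimal sketch of our own illustrating both rules on a toy constraint function:

import numpy as np
from scipy.optimize import NonlinearConstraint

g = lambda x: 2*x[0] + x[1]

# g(x) <= 1: the lower bound is set to -np.inf
upper_only = NonlinearConstraint(g, -np.inf, 1)

# g(x) >= 1: the upper bound is set to np.inf
lower_only = NonlinearConstraint(g, 1, np.inf)

# g(x) == 1: the lower bound equals the upper bound
equality = NonlinearConstraint(g, 1, 1)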
Now suppose we need to find the minimum of the well-known Rosenbrock function of two variables:

$$f(x_0, x_1) = 100(x_1 - x_0^2)^2 + (1 - x_0)^2$$
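
SciPy ships this function (together with its derivatives) ready-made as scipy.optimize.rosen; as a quick sanity check of the formula (our own addition):

import numpy as np
from scipy.optimize import rosen

def rosen2(x):
    # Rosenbrock function of two variables, exactly as written above.
    return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2

x = np.array([0.5, 0.0])
print(rosen2(x), rosen(x))  # both print 6.5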

In this case, the following constraints are imposed on its domain of definition:

$$x_0 + 2 x_1 \le 1$$

$$x_0^2 + x_1 \le 1$$

$$x_0^2 - x_1 \le 1$$

$$2 x_0 + x_1 = 1$$

$$0 \le x_0 \le 1$$

$$-0.5 \le x_1 \le 2.0$$

In our case there is a unique solution at the point $[x_0, x_1] = [0.4149, 0.1701]$, at which only the first and fourth constraints are active.
Let's go through the constraints from bottom to top and see how they can be written in scipy.
The constraints $0 \le x_0 \le 1$ and $-0.5 \le x_1 \le 2.0$ are defined using a Bounds object.

from scipy.optimize import Bounds
bounds = Bounds([0, -0.5], [1.0, 2.0])

The constraints $x_0 + 2 x_1 \le 1$ and $2 x_0 + x_1 = 1$ — let's write them in linear form:

$$\begin{bmatrix} -\infty \\ 1 \end{bmatrix} \le \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} x_0 \\ x_1 \end{bmatrix} \le \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$

Let's define these constraints as a LinearConstraint object:

import numpy as np
from scipy.optimize import LinearConstraint
linear_constraint = LinearConstraint([[1, 2], [2, 1]], [-np.inf, 1], [1, 1])

And finally, the nonlinear constraint in matrix form:

$$c(x) = \begin{bmatrix} x_0^2 + x_1 \\ x_0^2 - x_1 \end{bmatrix} \le \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$

We define the Jacobian matrix for this constraint and the linear combination of the constraint Hessians with an arbitrary vector $v$:

$$J(x) = \begin{bmatrix} 2 x_0 & 1 \\ 2 x_0 & -1 \end{bmatrix}$$

$$H(x, v) = \sum_i v_i \nabla^2 c_i(x) = v_0 \begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix} + v_1 \begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix}$$

Now we can define the nonlinear constraint as a NonlinearConstraint object:

from scipy.optimize import NonlinearConstraint

def cons_f(x):
    # values of the two nonlinear constraints: [x0^2 + x1, x0^2 - x1]
    return [x[0]**2 + x[1], x[0]**2 - x[1]]

def cons_J(x):
    # Jacobian matrix of cons_f
    return [[2*x[0], 1], [2*x[0], -1]]

def cons_H(x, v):
    # linear combination of the constraint Hessians weighted by v
    return v[0]*np.array([[2, 0], [0, 0]]) + v[1]*np.array([[2, 0], [0, 0]])

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess=cons_H)

If the dimension is large, the matrices can also be specified in sparse form:

from scipy.sparse import csc_matrix

def cons_H_sparse(x, v):
    return v[0]*csc_matrix([[2, 0], [0, 0]]) + v[1]*csc_matrix([[2, 0], [0, 0]])

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1,
                                            jac=cons_J, hess=cons_H_sparse)

or as a LinearOperator object:

from scipy.sparse.linalg import LinearOperator

def cons_H_linear_operator(x, v):
    def matvec(p):
        return np.array([p[0]*2*(v[0]+v[1]), 0])
    return LinearOperator((2, 2), matvec=matvec)

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1,
                                jac=cons_J, hess=cons_H_linear_operator)

When computing the Hessian matrix $\nabla^2 c(x)$ requires significant effort, a HessianUpdateStrategy class can be used instead. The following strategies are available: BFGS and SR1.

from scipy.optimize import BFGS

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess=BFGS())

The Hessian can also be approximated using finite differences:

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess='2-point')

The Jacobian matrix of the constraints can also be approximated by finite differences. In that case, however, the Hessian cannot be computed by finite differences as well; it must be defined as a function or via a HessianUpdateStrategy class.

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac='2-point', hess=BFGS())

The solution of the optimization problem itself looks like this:

from scipy.optimize import minimize
from scipy.optimize import rosen, rosen_der, rosen_hess, rosen_hess_prod

x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=rosen_hess,
                constraints=[linear_constraint, nonlinear_constraint],
                options={'verbose': 1}, bounds=bounds)
print(res.x)

`gtol` termination condition is satisfied.
Number of iterations: 12, function evaluations: 8, CG iterations: 7, optimality: 2.99e-09, constraint violation: 1.11e-16, execution time: 0.033 s.
[0.41494531 0.17010937]

If necessary, the function computing the Hessian can be defined using the LinearOperator class:

def rosen_hess_linop(x):
    def matvec(p):
        return rosen_hess_prod(x, p)
    return LinearOperator((2, 2), matvec=matvec)

res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=rosen_hess_linop,
                 constraints=[linear_constraint, nonlinear_constraint],
                 options={'verbose': 1}, bounds=bounds)

print(res.x)

or as the product of the Hessian with an arbitrary vector, via the hessp parameter:

res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hessp=rosen_hess_prod,
                constraints=[linear_constraint, nonlinear_constraint],
                options={'verbose': 1}, bounds=bounds)
print(res.x)

Alternatively, the first and second derivatives of the function being optimized can be approximated. For example, the Hessian can be approximated with the SR1 class (a quasi-Newton approximation), and the gradient with finite differences:

from scipy.optimize import SR1
res = minimize(rosen, x0, method='trust-constr',  jac="2-point", hess=SR1(),
               constraints=[linear_constraint, nonlinear_constraint],
               options={'verbose': 1}, bounds=bounds)
print(res.x)

Conditional optimization with method="SLSQP"

The SLSQP method is designed to solve function-minimization problems of the form:

$$\min_x F(x)$$

$$c_j(x) = 0, \quad j \in \mathcal{E}$$

$$c_j(x) \ge 0, \quad j \in \mathcal{I}$$

$$lb_i \le x_i \le ub_i, \quad i = 1, \dots, N$$

where $\mathcal{E}$ and $\mathcal{I}$ are the sets of indices of the expressions describing the equality and inequality constraints, and $[lb_i, ub_i]$ are the sets of lower and upper bounds on the function's domain of definition.

Linear and nonlinear constraints are described as dictionaries with the keys type, fun, and jac.

ineq_cons = {'type': 'ineq',
             'fun': lambda x: np.array([1 - x[0] - 2*x[1],
                                        1 - x[0]**2 - x[1],
                                        1 - x[0]**2 + x[1]]),
             'jac': lambda x: np.array([[-1.0, -2.0],
                                        [-2*x[0], -1.0],
                                        [-2*x[0], 1.0]])
            }

eq_cons = {'type': 'eq',
           'fun': lambda x: np.array([2*x[0] + x[1] - 1]),
           'jac': lambda x: np.array([2.0, 1.0])
          }

The search for the minimum is then performed as follows:

x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='SLSQP', jac=rosen_der,
               constraints=[eq_cons, ineq_cons], options={'ftol': 1e-9, 'disp': True},
               bounds=bounds)

print(res.x)

Optimization terminated successfully.    (Exit mode 0)
            Current function value: 0.34271757499419825
            Iterations: 4
            Function evaluations: 5
            Gradient evaluations: 4
[0.41494475 0.1701105 ]

A production optimization example

In inextricable connection with the transition to the fifth technological paradigm, let's look at production optimization using the example of a web studio that brings us a small but steady income. Let's imagine ourselves as the director of a "galley" that produces three types of products:

  • x0 — selling landing pages, from 10 t.r. (thousand rubles) apiece;
  • x1 — corporate websites, from 20 t.r.;
  • x2 — online stores, from 30 t.r.

Our friendly work collective consists of four juniors, two middles, and one senior. Their monthly working-time fund:

  • juniors: 4 * 150 = 600 man-hours,
  • middles: 2 * 150 = 300 man-hours,
  • senior: 150 man-hours.

Let the first available junior spend (10, 20, 30) hours on the development and deployment of one site of type (x0, x1, x2), a middle (7, 15, 20) hours, and the senior (5, 10, 15) hours of the best time of his life.

Like any normal director, we want to maximize monthly profit. The first step to success is to write down the objective function value as the sum of income from the products produced per month:

def value(x):
    return - 10*x[0] - 20*x[1] - 30*x[2]

This is not an error: when searching for a maximum, the objective function is minimized with the opposite sign.

The next step is to forbid our employees from overworking and to introduce constraints on working hours:

$$10 x_0 + 20 x_1 + 30 x_2 \le 600$$
$$7 x_0 + 15 x_1 + 20 x_2 \le 300$$
$$5 x_0 + 10 x_1 + 15 x_2 \le 150$$

which is equivalent to:

$$600 - 10 x_0 - 20 x_1 - 30 x_2 \ge 0$$
$$300 - 7 x_0 - 15 x_1 - 20 x_2 \ge 0$$
$$150 - 5 x_0 - 10 x_1 - 15 x_2 \ge 0$$

ineq_cons = {'type': 'ineq',
             'fun': lambda x: np.array([600 - 10*x[0] - 20*x[1] - 30*x[2],
                                        300 - 7*x[0] - 15*x[1] - 20*x[2],
                                        150 - 5*x[0] - 10*x[1] - 15*x[2]])
            }

A purely formal constraint: output volumes must be non-negative:

bnds = Bounds([0, 0, 0], [np.inf, np.inf, np.inf])

And finally, the rosiest assumption of all: thanks to low prices and high quality, a queue of satisfied customers is permanently lined up for us. We can therefore choose the monthly production volumes ourselves by solving the constrained optimization problem with scipy.optimize:

x0 = np.array([10, 10, 10])
res = minimize(value, x0, method='SLSQP', constraints=ineq_cons, bounds=bnds)
print(res.x)

[7.85714286 5.71428571 3.57142857]

Let's round to whole numbers and compute the monthly load on the rowers under the optimal product mix x = (8, 6, 3), double-checked in code right after the list:

  • juniors: 8 * 10 + 6 * 20 + 3 * 30 = 290 man-hours;
  • middles: 8 * 7 + 6 * 15 + 3 * 20 = 206 man-hours;
  • senior: 8 * 5 + 6 * 10 + 3 * 15 = 145 man-hours.
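
The same check in code (a small sketch of our own; the matrix rows are the hours per product for juniors, middles, and the senior):

x = np.array([8, 6, 3])                # rounded production plan
hours = np.array([[10, 20, 30],        # junior hours per product type
                  [7, 15, 20],         # middle hours
                  [5, 10, 15]])        # senior hours
fund = np.array([600, 300, 150])       # monthly man-hour funds

load = hours @ x
print(load)         # [290 206 145]
print(load / fund)  # [0.483 0.687 0.967] -- juniors, middles, senior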

Bottom line: for the director to receive his well-earned maximum, it is optimal to produce 8 landing pages, 6 corporate sites, and 3 online stores per month. The senior will then have to plow away without looking up from the machine, the middles' load will be roughly 2/3, and the juniors' less than half.

Conclusion

This article covers the basic techniques of working with the scipy.optimize package as applied to constrained minimization problems. Personally, I use scipy purely for academic purposes, which is why the example given here is of such a jocular nature.

Plenty of theory and toy examples can be found, for example, in I. L. Akulich's book "Mathematical Programming in Examples and Problems". A more hardcore application of scipy.optimize, building a 3D structure from a set of images (article on Habr), can be found in the scipy-cookbook.

The primary source of documentation is docs.scipy.org; anyone wishing to contribute to the translation of this and other sections of scipy is welcome on GitHub.

Thanks to mephistophees for participating in the preparation of this publication.

Source: www.hab.com
