SciPy, optimization with constraints

SciPy (pronounced "sigh pie") is a numpy-based mathematics package that also includes C and Fortran libraries. SciPy turns your interactive Python session into a complete data science environment in the spirit of MATLAB, IDL, Octave, R, or SciLab.

In this article we look at the basic techniques for solving mathematical optimization problems: constrained minimization of a scalar function of several variables using the scipy.optimize package. Unconstrained optimization algorithms were covered in the previous article. Detailed and up-to-date help on scipy functions can always be obtained with the help() command, Shift+Tab, or in the official documentation.

Introduction

A unified interface for solving both constrained and unconstrained optimization problems in the scipy.optimize package is provided by the minimize() function. However, it is well known that there is no universal method for solving all problems, so the choice of an adequate method, as always, falls on the shoulders of the researcher.
The appropriate optimization algorithm is specified via the function argument minimize(..., method="").
For constrained optimization of a function of several variables, implementations of the following methods are available:

  • trust-constr — search for a local minimum within a trust region. Wiki article, Habré article;
  • SLSQP — sequential quadratic programming with constraints, a Newton-type method for solving the Lagrange system. Wiki article;
  • TNC — Truncated Newton Constrained, a limited number of iterations, good for nonlinear functions with a large number of independent variables. Wiki article;
  • L-BFGS-B — a method from the Broyden-Fletcher-Goldfarb-Shanno family, implemented with reduced memory consumption thanks to partial loading of vectors from the Hessian matrix. Wiki article, Habré article;
  • COBYLA — Constrained Optimization By Linear Approximation, constrained optimization with linear approximation (no gradient computation). Wiki article.
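
Whichever method is chosen, the call shape stays the same; here is a minimal sketch (the objective function and numbers are illustrative, not from the article):

```python
import numpy as np
from scipy.optimize import minimize

# Toy convex objective with its minimum at (3, -1).
def f(x):
    return (x[0] - 3)**2 + (x[1] + 1)**2

x0 = np.array([0.0, 0.0])

# The algorithm is selected purely via the `method` argument of minimize().
for method in ('TNC', 'L-BFGS-B', 'SLSQP'):
    res = minimize(f, x0, method=method)
    print(method, res.x)  # each converges near [ 3. -1.]
```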

Depending on the chosen method, the conditions and constraints of the problem are specified differently:

  • a Bounds class object — for the L-BFGS-B, TNC, SLSQP, trust-constr methods;
  • a sequence of (min, max) pairs — for the same methods L-BFGS-B, TNC, SLSQP, trust-constr;
  • an object or a list of objects LinearConstraint, NonlinearConstraint — for the COBYLA, SLSQP, trust-constr methods;
  • a dictionary or a list of dictionaries {'type': str, 'fun': callable, 'jac': callable (optional), 'args': sequence (optional)} — for the COBYLA, SLSQP methods.
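
As a quick illustration of the second form above, bounds can be passed as a plain sequence of (min, max) pairs, where None (or np.inf) means unbounded on that side; the toy objective below is illustrative, not from the article:

```python
import numpy as np
from scipy.optimize import minimize

# Unconstrained minimum would be at (2, -1); the box clips x0 to [0, 1].
def f(x):
    return (x[0] - 2)**2 + (x[1] + 1)**2

# Bounds as a sequence of (min, max) pairs: 0 <= x0 <= 1, x1 <= 0.
res = minimize(f, x0=[0.0, 0.0], method='L-BFGS-B',
               bounds=[(0, 1), (None, 0)])
print(res.x)  # close to [ 1. -1.]
```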

Article outline:
1) Consider the use of a constrained optimization algorithm in a trust region (method="trust-constr") with constraints specified as Bounds, LinearConstraint, and NonlinearConstraint objects;
2) Consider sequential least squares programming (method="SLSQP") with constraints specified as dictionaries {'type', 'fun', 'jac', 'args'};
3) Work through an example of optimizing production output using the example of a web studio.

Constrained optimization with method="trust-constr"

The implementation of the trust-constr method is based on EQSQP for problems with equality-type constraints and on TRIP for problems with inequality-type constraints. Both methods search for a local minimum within a trust region and are well suited for large-scale problems.

The mathematical formulation of the minimum search problem in general form:

min f(x)

c_l ≤ c(x) ≤ c_u

x_l ≤ x ≤ x_u

For a strict equality constraint, the lower bound is set equal to the upper bound, c_l = c_u.
For a one-sided constraint, the upper or lower bound is set to np.inf with the corresponding sign.
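
A small sketch of both conventions (the specific bound values are illustrative):

```python
import numpy as np
from scipy.optimize import Bounds

# Two-sided bound on x0 (0 <= x0 <= 1) and a one-sided bound on x1:
# only a lower bound x1 >= -0.5, so the upper bound is set to np.inf.
one_sided = Bounds([0.0, -0.5], [1.0, np.inf])

# A strict equality constraint x0 = 1 is encoded by making the lower
# bound equal to the upper bound.
equality = Bounds([1.0, -0.5], [1.0, np.inf])

print(one_sided.ub[1])                   # inf
print(equality.lb[0] == equality.ub[0])  # True
```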
Let us find the minimum of the well-known two-dimensional Rosenbrock function:

f(x0, x1) = 100·(x1 − x0²)² + (1 − x0)²

In this case, the following constraints are imposed on its domain of definition:

x0² + x1 ≤ 1

x0² − x1 ≤ 1

x0 + 2·x1 ≤ 1

2·x0 + x1 = 1

0 ≤ x0 ≤ 1

−0.5 ≤ x1 ≤ 2.0

In our case there is a unique solution at the point x = (0.4149, 0.1701), at which only the first and fourth constraints are active.
Let us go through the constraints from the bottom up and see how they can be written in scipy.
We define the constraints 0 ≤ x0 ≤ 1 and −0.5 ≤ x1 ≤ 2.0 using a Bounds object.

from scipy.optimize import Bounds
bounds = Bounds([0, -0.5], [1.0, 2.0])

We write the constraints x0 + 2·x1 ≤ 1 and 2·x0 + x1 = 1 in matrix form:

[−∞, 1] ≤ [[1, 2], [2, 1]]·x ≤ [1, 1]

We define these constraints as a LinearConstraint object:

import numpy as np
from scipy.optimize import LinearConstraint
linear_constraint = LinearConstraint([[1, 2], [2, 1]], [-np.inf, 1], [1, 1])

And finally, the nonlinear constraint in matrix form:

[−∞, −∞] ≤ [x0² + x1, x0² − x1] ≤ [1, 1]

We define the Jacobian matrix of this constraint and the linear combination of the Hessians of its components with an arbitrary vector v:

J(x) = [[2·x0, 1], [2·x0, −1]]

H(x, v) = v0·[[2, 0], [0, 0]] + v1·[[2, 0], [0, 0]]

Now we can define the nonlinear constraint as a NonlinearConstraint object:

from scipy.optimize import NonlinearConstraint

def cons_f(x):
    return [x[0]**2 + x[1], x[0]**2 - x[1]]

def cons_J(x):
    return [[2*x[0], 1], [2*x[0], -1]]

def cons_H(x, v):
    return v[0]*np.array([[2, 0], [0, 0]]) + v[1]*np.array([[2, 0], [0, 0]])

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess=cons_H)

For large problems, the matrices can also be defined in sparse form:

from scipy.sparse import csc_matrix

def cons_H_sparse(x, v):
    return v[0]*csc_matrix([[2, 0], [0, 0]]) + v[1]*csc_matrix([[2, 0], [0, 0]])

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1,
                                            jac=cons_J, hess=cons_H_sparse)

or as a LinearOperator object:

from scipy.sparse.linalg import LinearOperator

def cons_H_linear_operator(x, v):
    def matvec(p):
        return np.array([p[0]*2*(v[0]+v[1]), 0])
    return LinearOperator((2, 2), matvec=matvec)

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1,
                                jac=cons_J, hess=cons_H_linear_operator)

When computing the Hessian matrix requires significant effort, the HessianUpdateStrategy class can be used. The following strategies are available: BFGS and SR1.

from scipy.optimize import BFGS

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess=BFGS())

The Hessian can also be approximated using finite differences:

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess='2-point')

The Jacobian matrix of the constraints can also be approximated using finite differences. In that case, however, the Hessian matrix cannot be computed by finite differences as well: the Hessian must then be defined as a function or via the HessianUpdateStrategy class.

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac='2-point', hess=BFGS())

The solution of the optimization problem looks like this:

from scipy.optimize import minimize
from scipy.optimize import rosen, rosen_der, rosen_hess, rosen_hess_prod

x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=rosen_hess,
                constraints=[linear_constraint, nonlinear_constraint],
                options={'verbose': 1}, bounds=bounds)
print(res.x)

`gtol` termination condition is satisfied.
Number of iterations: 12, function evaluations: 8, CG iterations: 7, optimality: 2.99e-09, constraint violation: 1.11e-16, execution time: 0.033 s.
[0.41494531 0.17010937]

If necessary, the function computing the Hessian can be defined using the LinearOperator class:

def rosen_hess_linop(x):
    def matvec(p):
        return rosen_hess_prod(x, p)
    return LinearOperator((2, 2), matvec=matvec)

res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=rosen_hess_linop,
                 constraints=[linear_constraint, nonlinear_constraint],
                 options={'verbose': 1}, bounds=bounds)

print(res.x)

or as the product of the Hessian with an arbitrary vector via the hessp parameter:

res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hessp=rosen_hess_prod,
                constraints=[linear_constraint, nonlinear_constraint],
                options={'verbose': 1}, bounds=bounds)
print(res.x)

Alternatively, the first and second derivatives of the function being optimized can be approximated. For example, the Hessian can be approximated with the SR1 class (a quasi-Newton approximation), and the gradient by finite differences.

from scipy.optimize import SR1
res = minimize(rosen, x0, method='trust-constr',  jac="2-point", hess=SR1(),
               constraints=[linear_constraint, nonlinear_constraint],
               options={'verbose': 1}, bounds=bounds)
print(res.x)

Constrained optimization with method="SLSQP"

The SLSQP method is designed to solve problems of minimizing a function in the form:

min f(x)

c_j(x) = 0,  j ∈ E

c_j(x) ≥ 0,  j ∈ I

lb ≤ x ≤ ub

where E and I are the sets of indices of the expressions describing the equality and inequality constraints, and lb, ub are the sets of lower and upper bounds on the function's domain.

Linear and nonlinear constraints are described as dictionaries with the keys type, fun, and jac.

ineq_cons = {'type': 'ineq',
             'fun': lambda x: np.array([1 - x[0] - 2*x[1],
                                        1 - x[0]**2 - x[1],
                                        1 - x[0]**2 + x[1]]),
             'jac': lambda x: np.array([[-1.0, -2.0],
                                        [-2*x[0], -1.0],
                                        [-2*x[0], 1.0]])
            }

eq_cons = {'type': 'eq',
           'fun': lambda x: np.array([2*x[0] + x[1] - 1]),
           'jac': lambda x: np.array([2.0, 1.0])
          }

The search for the minimum is performed as follows:

x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='SLSQP', jac=rosen_der,
               constraints=[eq_cons, ineq_cons], options={'ftol': 1e-9, 'disp': True},
               bounds=bounds)

print(res.x)

Optimization terminated successfully.    (Exit mode 0)
            Current function value: 0.34271757499419825
            Iterations: 4
            Function evaluations: 5
            Gradient evaluations: 4
[0.41494475 0.1701105 ]

An optimization example

On the subject of the transition to the fifth technological paradigm, let us look at production optimization using the example of a web studio that brings us a small but steady income. Imagine ourselves as the director of a galley that produces three kinds of products:

  • x0 — selling landing pages, from 10 thousand rubles each
  • x1 — corporate websites, from 20 thousand rubles each
  • x2 — online stores, from 30 thousand rubles each

Our friendly working crew consists of four juniors, two middles, and one senior. Their monthly working-time budget:

  • juniors: 4 * 150 = 600 person-hours,
  • middles: 2 * 150 = 300 person-hours,
  • senior: 150 person-hours.

Let the first available junior spend (10, 20, 30) hours on developing and deploying one site of type (x0, x1, x2), a middle (7, 15, 20), and the senior (5, 10, 15) hours of the best time of his life.

Like any proper director, we want to maximize the monthly profit. The first step to success is to write down the objective function value as the sum of income from the products produced per month:

def value(x):
    return - 10*x[0] - 20*x[1] - 30*x[2]

This is not a mistake: when searching for a maximum, the objective function is minimized with the opposite sign.

The next step is to forbid our employees from overworking by introducing constraints on the working hours:

10·x0 + 20·x1 + 30·x2 ≤ 600
7·x0 + 15·x1 + 20·x2 ≤ 300
5·x0 + 10·x1 + 15·x2 ≤ 150

Which is equivalent to:

600 − 10·x0 − 20·x1 − 30·x2 ≥ 0
300 − 7·x0 − 15·x1 − 20·x2 ≥ 0
150 − 5·x0 − 10·x1 − 15·x2 ≥ 0

ineq_cons = {'type': 'ineq',
             'fun': lambda x: np.array([600 - 10*x[0] - 20*x[1] - 30*x[2],
                                        300 - 7*x[0] - 15*x[1] - 20*x[2],
                                        150 - 5*x[0] - 10*x[1] - 15*x[2]])
            }

A formal constraint: the output of products must be non-negative:

bnds = Bounds([0, 0, 0], [np.inf, np.inf, np.inf])

And finally, the rosiest assumption: thanks to our low prices and high quality, a queue of satisfied customers is always lined up for us. We can therefore choose the monthly production volumes ourselves by solving the constrained optimization problem with scipy.optimize:

x0 = np.array([10, 10, 10])
res = minimize(value, x0, method='SLSQP', constraints=ineq_cons, bounds=bnds)
print(res.x)

[7.85714286 5.71428571 3.57142857]

Let us freely round everything to whole numbers and compute the monthly load of the rowers for a near-optimal distribution of products x = (8, 6, 3):

  • juniors: 8 * 10 + 6 * 20 + 3 * 30 = 290 person-hours;
  • middles: 8 * 7 + 6 * 15 + 3 * 20 = 206 person-hours;
  • senior: 8 * 5 + 6 * 10 + 3 * 15 = 145 person-hours.
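
The three totals above can be reproduced and checked against the budgets with a few lines of numpy; this check is an addition, not part of the original article:

```python
import numpy as np

x = np.array([8, 6, 3])             # rounded production plan
hours = np.array([[10, 20, 30],     # junior hours per product type
                  [7, 15, 20],      # middle hours per product type
                  [5, 10, 15]])     # senior hours per product type
budget = np.array([600, 300, 150])  # monthly person-hour budgets

load = hours @ x
print(load)                          # [290 206 145]
print(bool(np.all(load <= budget)))  # True: nobody overworks
profit = np.array([10, 20, 30]) @ x  # thousand rubles per month
print(profit)                        # 290
```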

Conclusion: for the director to receive his well-deserved maximum, it is enough to produce 8 landing pages, 6 corporate sites, and 3 online stores per month. In that case the senior has to plow without looking up from the machine, the middles' load comes to roughly 2/3, and the juniors' to less than half.

Conclusion

This article covers the basic techniques for working with the scipy.optimize package, which is used to solve constrained minimization problems. Personally I use scipy for purely academic purposes, which is why the example given is so tongue-in-cheek.

Plenty of theory and real-world examples can be found, for example, in I.L. Akulich's book "Mathematical Programming in Examples and Problems". A more hardcore application of scipy.optimize — building a 3D structure from a set of images (Habré article) — can be found in the scipy-cookbook.

The main source of information is docs.scipy.org; those wishing to help with the translation of this and other sections of scipy are welcome on GitHub.

Thanks to mephistophees for contributing to the preparation of this publication.

Source: www.habr.com
