Solving the simple linear regression equation

This article discusses several ways to determine the mathematical equation of a simple (paired) regression line.

All the methods of solving the equation discussed here are based on the least squares method. Let us denote the methods as follows:

  • Analytical solution
  • Gradient descent
  • Stochastic gradient descent

For each method of solving the equation of the regression line, the article provides various functions, which are mainly divided into those written without using the NumPy library and those that use NumPy for the calculations. It is believed that skillful use of NumPy reduces computational costs.

All code in the article is written in Python 2.7 using Jupyter Notebook. The source code and the file with the sample data are posted on GitHub.

The article is aimed mostly at beginners and at those who have already started, little by little, to master a very broad area of artificial intelligence: machine learning.

To illustrate the material, we use a very simple example.

Example conditions

We have five values characterizing the dependence of Y on X (Table No. 1):

Table No. 1 "Example conditions"

We will assume that the values x are the months of the year, and y is the revenue in the corresponding month. In other words, revenue depends on the month of the year, and x is the only attribute on which revenue depends.

The example is rather conventional, both from the standpoint of the conditional dependence of revenue on the month of the year and from the standpoint of the number of values, which is very small. However, such a simplification will make it possible to explain, as they say, on one's fingers, material that beginners do not always absorb easily. And the simplicity of the numbers will allow those who wish to solve the example on paper without significant labor costs.

Let us assume that the dependence given in the example can be approximated quite well by the mathematical equation of a simple (paired) regression line of the form:

\begin{equation*}
y = a + bx
\end{equation*}

where x is the month in which the revenue was received, y is the revenue corresponding to that month, and a and b are the regression coefficients of the estimated line.

Note that the coefficient b is often called the slope or gradient of the estimated line; it represents the amount by which y changes when x changes.
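To make the meaning of the slope concrete, here is a two-line check in Python (the coefficient values are made up for illustration, not taken from the article's data):

```python
# Hypothetical coefficients chosen for illustration:
a, b = 2.0, 3.0
f = lambda x: a + b * x

# Increasing x by one unit changes f(x) by exactly the slope b.
delta = f(5) - f(4)
print(delta)
```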

Obviously, our task in the example is to select coefficients a and b in the equation such that the deviations of our calculated monthly revenue values from the true answers, i.e., the values given in the sample, are minimal.

The least squares method

According to the least squares method, the deviations should be calculated by squaring them. This technique avoids the mutual cancellation of deviations that have opposite signs. For example, if in one case the deviation is +5 (plus five) and in another −5 (minus five), the sum of the deviations cancels out and equals 0 (zero). It is also possible not to square the deviations but to use the absolute value instead; then all deviations will be positive and will accumulate. We will not dwell on this point in detail, but simply note that, for convenience of calculation, it is customary to square the deviations.
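The cancellation effect described above is easy to verify in a few lines of Python (the deviation values +5 and −5 are taken from the example in the text):

```python
# Two opposite deviations of the same size:
deviations = [5, -5]

plain_sum = sum(deviations)                     # signs cancel out
squared_sum = sum(d ** 2 for d in deviations)   # squaring keeps both
absolute_sum = sum(abs(d) for d in deviations)  # absolute value also works

print(plain_sum, squared_sum, absolute_sum)
```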

This is what the formula looks like with which we will determine the smallest sum of squared deviations (errors):

\begin{equation*}
ERR(a, b) = \sum\limits_{i=1}^{n}\big(f(x_i) - y_i\big)^2
\end{equation*}

where f(x_i) is the function that estimates the true answers (that is, the revenue we calculated),

y_i are the true answers (the revenue given in the sample),

i is the sample index (the number of the month in which the deviation is determined).

Let us differentiate the function, define the first-order partial derivative equations, and then move on to the analytical solution. But first, let us take a short digression on what differentiation is and recall the geometric meaning of the derivative.

Differentiation

Differentiation is the operation of finding the derivative of a function.

What is the derivative used for? The derivative of a function characterizes the rate of change of the function and tells us its direction. If the derivative at a given point is positive, the function increases; otherwise, it decreases. And the greater the absolute value of the derivative, the higher the rate of change of the function's values and the steeper the slope of the function's graph.

For example, in a Cartesian coordinate system, a derivative value at the point M(0,0) equal to +25 means that at this point, when the value of x is shifted to the right by one conventional unit, the value of y increases by 25 conventional units. On the graph this looks like a rather steep rise in the values of y from the given point.

Another example. A derivative value of −0.1 means that when x is shifted by one conventional unit, the value of y decreases by only 0.1 conventional unit. At the same time, on the graph of the function, we can observe a barely noticeable downward slope. Drawing an analogy with a mountain, it is as if we are very slowly descending a gentle slope, unlike the previous example, where we had to climb very steep peaks :)
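The two examples above can be reproduced with a finite-difference sketch (the function f(x) = x**2 and the probe points are illustrative choices, not from the article):

```python
def numeric_derivative(f, x, h=1e-6):
    # central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 2  # its derivative is f'(x) = 2x

steep = numeric_derivative(f, 3.0)     # positive: function rising steeply
gentle = numeric_derivative(f, -0.05)  # about -0.1: barely noticeable descent
print(steep, gentle)
```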

So, after differentiating the function ERR(a, b) with respect to the coefficients a and b, we define the first-order partial derivative equations. Having determined the equations, we obtain a system of two equations; by solving it we can select values of the coefficients a and b for which the values of the corresponding derivatives change by very small amounts at the given points, and in the case of the analytical solution do not change at all. In other words, the error function at the found coefficients reaches a minimum, since the values of the partial derivatives at those points equal zero.

So, according to the rules of differentiation, the first-order partial derivative equation with respect to the coefficient a takes the form:

\begin{equation*}
\frac{\partial ERR}{\partial a} = 2\sum\limits_{i=1}^{n}(a + bx_i - y_i)
\end{equation*}

The first-order partial derivative equation with respect to b takes the form:

\begin{equation*}
\frac{\partial ERR}{\partial b} = 2\sum\limits_{i=1}^{n}x_i(a + bx_i - y_i)
\end{equation*}

As a result, we obtain a system of equations that has a fairly simple analytical solution:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0 \\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}
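As a sanity check on this system, here is a short sketch that solves it in closed form for made-up data and confirms that both equations vanish at the solution (the x and y values are illustrative, not the article's sample):

```python
# Made-up sample (x = month number, y = revenue-like values):
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 3.9, 6.1, 8.0, 10.1]

n = len(x)
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sx2 = sum(xi ** 2 for xi in x)

# Closed-form solution of the system:
#   n*a  + b*sx  = sy
#   a*sx + b*sx2 = sxy
det = n * sx2 - sx * sx
a = (sx2 * sy - sx * sxy) / det
b = (n * sxy - sy * sx) / det

# At the solution, both equations hold (are numerically zero):
eq1 = n * a + b * sx - sy
eq2 = a * sx + b * sx2 - sxy
print(a, b, eq1, eq2)
```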

Before solving the equation, let us load the data, check that it loaded correctly, and format it.

Loading and preparing the data

It should be noted that for the analytical solution, and subsequently for gradient and stochastic gradient descent, we will use the code in two variants: using the NumPy library and without it, which will require appropriate data formatting (see the code).

Data loading and preparation code

# import all the libraries we need
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import math
import pylab as pl
import random

# display plots in Jupyter
%matplotlib inline

# set the plot size
from pylab import rcParams
rcParams['figure.figsize'] = 12, 6

# turn off Anaconda warnings
import warnings
warnings.simplefilter('ignore')

# load the values
table_zero = pd.read_csv('data_example.txt', header=0, sep='\t')

# look at the table info and at the table itself
print table_zero.info()
print '********************************************'
print table_zero
print '********************************************'

# prepare the data without using NumPy

x_us = []
[x_us.append(float(i)) for i in table_zero['x']]
print x_us
print type(x_us)
print '********************************************'

y_us = []
[y_us.append(float(i)) for i in table_zero['y']]
print y_us
print type(y_us)
print '********************************************'

# prepare the data using NumPy

x_np = table_zero[['x']].values
print x_np
print type(x_np)
print x_np.shape
print '********************************************'

y_np = table_zero[['y']].values
print y_np
print type(y_np)
print y_np.shape
print '********************************************'

Visualization

Now that we have, first, loaded the data, second, checked that it loaded correctly, and, finally, formatted it, let us carry out the first visualization. The pairplot method of the seaborn library is often used for this. In our example, because of the small numbers, there is no point in using seaborn. We will use the regular matplotlib library and simply look at the scatter plot.

Scatter plot code

print 'Chart No. 1 "Dependence of revenue on the month of the year"'

plt.plot(x_us,y_us,'o',color='green',markersize=16)
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.show()

Chart No. 1 "Dependence of revenue on the month of the year"

Analytical solution

Let us use standard tools in Python and solve the system of equations:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0 \\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}

According to Cramer's rule, we find the general determinant, as well as the determinants with respect to a and with respect to b; then, dividing the determinant with respect to a by the general determinant, we find the coefficient a, and in the same way we find the coefficient b.
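The procedure just described can be sketched compactly with NumPy (the data values are illustrative, not the article's sample file):

```python
import numpy as np

# Illustrative data:
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.5, 3.1, 4.4, 6.2, 7.4])

# Normal-equation system A @ [a, b] = c
A = np.array([[len(x), x.sum()],
              [x.sum(), (x ** 2).sum()]])
c = np.array([y.sum(), (x * y).sum()])

# Cramer's rule: replace one column of A with c for each unknown
det = np.linalg.det(A)
a = np.linalg.det(np.column_stack((c, A[:, 1]))) / det
b = np.linalg.det(np.column_stack((A[:, 0], c))) / det

print(a, b)
```

A direct solver (np.linalg.solve) gives the same answer and is what one would normally use; Cramer's rule is shown here only because it mirrors the hand calculation.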

Analytical solution code

# define a function for calculating the coefficients a and b by Cramer's rule
def Kramer_method (x,y):
        # sum of the values (all the months)
    sx = sum(x)
        # sum of the true answers (revenue for the whole period)
    sy = sum(y)
        # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x[i]*y[i]) for i in range(len(x))]
    sxy = sum(list_xy)
        # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x[i]**2) for i in range(len(x))]
    sx_sq = sum(list_x_sq)
        # number of values
    n = len(x)
        # general determinant
    det = sx_sq*n - sx*sx
        # determinant with respect to a
    det_a = sx_sq*sy - sx*sxy
        # the sought parameter a
    a = (det_a / det)
        # determinant with respect to b
    det_b = sxy*n - sy*sx
        # the sought parameter b
    b = (det_b / det)
        # control values (check); a is the intercept, b the slope
    check1 = (n*a + b*sx - sy)
    check2 = (a*sx + b*sx_sq - sxy)
    return [round(a,4), round(b,4)]

# run the function and record the correct answers
ab_us = Kramer_method(x_us,y_us)
a_us = ab_us[0]
b_us = ab_us[1]
print '\033[1m' + '\033[4m' + "Optimal values of the coefficients a and b:"  + '\033[0m' 
print 'a =', a_us
print 'b =', b_us
print

# define a function for calculating the sum of squared errors
def errors_sq_Kramer_method(answers,x,y):
    list_errors_sq = []
    for i in range(len(x)):
        err = (answers[0] + answers[1]*x[i] - y[i])**2
        list_errors_sq.append(err)
    return sum(list_errors_sq)

# run the function and record the error value
error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)
print '\033[1m' + '\033[4m' + "Sum of squared deviations" + '\033[0m'
print error_sq
print

# measure the calculation time
# print '\033[1m' + '\033[4m' + "Execution time for calculating the sum of squared deviations:" + '\033[0m'
# %timeit error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)

Here is what we got:

So, the values of the coefficients have been found and the sum of squared deviations has been determined. Let us draw a straight line on the scatter diagram in accordance with the coefficients found.

Regression line code

# define a function to form an array of calculated revenue values
def sales_count(ab,x,y):
    line_answers = []
    [line_answers.append(ab[0]+ab[1]*x[i]) for i in range(len(x))]
    return line_answers

# draw the plots
print 'Chart No. 2 "True and calculated answers"'
plt.plot(x_us,y_us,'o',color='green',markersize=16, label = '$True$ $answers$')
plt.plot(x_us, sales_count(ab_us,x_us,y_us), color='red',lw=4,
         label='$Function: a + bx,$ $where$ $a='+str(round(ab_us[0],2))+',$ $b='+str(round(ab_us[1],2))+'$')
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.legend(loc=1, prop={'size': 16})
plt.show()

Chart No. 2 "True and calculated answers"

You can look at the deviation chart for each month. In our case we will not derive any significant practical value from it, but we will satisfy our curiosity about how well the simple linear regression equation characterizes the dependence of revenue on the month of the year.

Deviation chart code

# define a function to form an array of deviations in percent
def error_per_month(ab,x,y):
    sales_c = sales_count(ab,x,y)
    errors_percent = []
    for i in range(len(x)):
        errors_percent.append(100*(sales_c[i]-y[i])/y[i])
    return errors_percent

# draw the chart
print 'Chart No. 3 "Monthly deviations, %"'
plt.gca().bar(x_us, error_per_month(ab_us,x_us,y_us), color='brown')
plt.xlabel('Months', size=16)
plt.ylabel('Calculation error, %', size=16)
plt.show()

Chart No. 3 "Deviations, %"

Not perfect, but we have completed our task.

Let us now write a function that determines the coefficients a and b using the NumPy library. More precisely, we will write two functions: one using a pseudoinverse matrix (not recommended in practice, since the process is computationally complex and unstable), and another using a matrix equation.

Analytical Solution Code (NumPy)

# first, add a column with a constant value of 1. 
# This column is needed so that the coefficient a does not have to be handled separately
vector_1 = np.ones((x_np.shape[0],1))
x_np = table_zero[['x']].values # just in case, bring the vector x_np back to its original format
x_np = np.hstack((vector_1,x_np))

# check that everything was done correctly
print vector_1[0:3]
print x_np[0:3]
print '***************************************'
print

# write a function that determines the values of the coefficients a and b using the pseudoinverse matrix
def pseudoinverse_matrix(X, y):
    # set an explicit format for the feature matrix
    X = np.matrix(X)
    # determine the transposed matrix
    XT = X.T
    # determine the square matrix
    XTX = XT*X
    # determine the pseudoinverse matrix
    inv = np.linalg.pinv(XTX)
    # set an explicit format for the answer matrix
    y = np.matrix(y)
    # find the vector of weights
    return (inv*XT)*y

# run the function
ab_np = pseudoinverse_matrix(x_np, y_np)
print ab_np
print '***************************************'
print

# write a function that uses a matrix equation for the solution
def matrix_equation(X,y):
    a = np.dot(X.T, X)
    b = np.dot(X.T, y)
    return np.linalg.solve(a, b)

# run the function
ab_np = matrix_equation(x_np,y_np)
print ab_np

Let us compare the time spent determining the coefficients a and b by the three methods presented.

Coefficient calculation timing code

print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients without the NumPy library:" + '\033[0m'
%timeit ab_us = Kramer_method(x_us,y_us)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the pseudoinverse matrix:" + '\033[0m'
%timeit ab_np = pseudoinverse_matrix(x_np, y_np)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the matrix equation:" + '\033[0m'
%timeit ab_np = matrix_equation(x_np, y_np)


On a small amount of data, the "self-written" function comes out ahead: the one that finds the coefficients by Cramer's method.

Now you can move on to other ways of finding the coefficients a and b.

Gradient Descent

First, let us define what a gradient is. Simply put, the gradient is a segment that indicates the direction of the steepest growth of a function. By analogy with climbing a mountain, the gradient points to where the steepest ascent to the top lies. Developing the mountain example, we recall that we actually need the steepest descent in order to reach the lowland as quickly as possible, that is, the minimum: the place where the function neither increases nor decreases. At this point the derivative equals zero. Therefore, we need not the gradient but the antigradient. To find the antigradient, you just need to multiply the gradient by −1 (minus one).
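A minimal numeric sketch of stepping along the antigradient on a one-dimensional function with a single minimum (the function, starting point, and step size are illustrative choices, not the article's setup):

```python
# Toy loss with a single minimum at w = 3
def grad(w):
    # derivative of (w - 3)**2
    return 2.0 * (w - 3.0)

w = 0.0   # arbitrary starting point
lr = 0.1  # illustrative step size

for _ in range(200):
    w -= lr * grad(w)  # step along the ANTI-gradient (gradient times -1)

print(w)  # converges toward 3
```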

Let us note the fact that a function can have several minima, and, having descended into one of them using the algorithm described below, we will not be able to find another minimum, which may be lower than the one found. Relax, this is no threat to us! In our case we are dealing with a single minimum, since our function ERR(a, b) on the graph is an ordinary parabola. And as we all should know very well from our school mathematics course, a parabola has only one minimum.

After we have found out why we need the gradient, and also that the gradient is a segment, that is, a vector with given coordinates, which are precisely the coefficients a and b, we can apply gradient descent.

Before starting, I suggest reading just a few lines about the descent algorithm:

  • We determine, in a pseudo-random way, the initial coordinates of the coefficients a and b. In our example we will define the coefficients near zero. This is a common practice, but each case may have a practice of its own.
  • From the coordinate a we subtract the value of the first-order partial derivative at the point a. So, if the derivative is positive, the function increases; therefore, by subtracting the value of the derivative, we move in the direction opposite to growth, that is, in the direction of descent. If the derivative is negative, the function at this point decreases, and by subtracting the value of the derivative we again move in the direction of descent.
  • We carry out a similar operation with the coordinate b: subtract the value of the partial derivative at the point b.
  • In order not to jump over the minimum and fly off into deep space, it is necessary to set the step size in the direction of descent. In general, one could write a whole article about how to set the step correctly and how to change it during descent in order to reduce computational costs. But now we have a slightly different task ahead of us, and we will establish the step size using the scientific method of "poking around," or, as they say in common parlance, empirically.
  • Once we have subtracted the values of the derivatives from the given coordinates a and b, we obtain new coordinates a and b. We take the next step (subtraction) already from the calculated coordinates. And so the cycle starts again and again, until the required convergence is achieved.

That's it! Now we are ready to go in search of the deepest gorge of the Mariana Trench. Let's begin.

Gradient descent code

# write a gradient descent function without using the NumPy library. 
# The function takes as input the value ranges x,y, the step length (default=0.1), and the tolerance
def gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001):
    # sum of the values (all the months)
    sx = sum(x_us)
    # sum of the true answers (revenue for the whole period)
    sy = sum(y_us)
    # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x_us[i]*y_us[i]) for i in range(len(x_us))]
    sxy = sum(list_xy)
    # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x_us[i]**2) for i in range(len(x_us))]
    sx_sq = sum(list_x_sq)
    # number of values
    num = len(x_us)
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; we use the values 1 and 0 to start
    # after the descent completes, we will discard the starting values
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than the tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = [a,b]
        errors.append(errors_sq_Kramer_method(ab,x_us,y_us))
    return (ab),(errors[2:])

# record the array of values 
list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print

We sank to the very bottom of the Mariana Trench, and there we found all the same coefficient values a and b, which is exactly what was to be expected.

Let's take another dive, only this time our deep-sea vehicle will be filled with different technology, namely the NumPy library.

Gradient descent code (NumPy)

# before defining a gradient descent function using the NumPy library, 
# let us write a function for the sum of squared deviations, also using NumPy
def error_square_numpy(ab,x_np,y_np):
    y_pred = np.dot(x_np,ab)
    error = y_pred - y_np
    return np.sum(error**2)

# write a gradient descent function using the NumPy library. 
# The function takes as input the value ranges x,y, the step length (default=0.1), and the tolerance
def gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001):
    # sum of the values (all the months)
    sx = float(sum(x_np[:,1]))
    # sum of the true answers (revenue for the whole period)
    sy = float(sum(y_np))
    # sum of the products of the values and the true answers
    sxy = x_np*y_np
    sxy = float(sum(sxy[:,1]))
    # sum of the squared values
    sx_sq = float(sum(x_np[:,1]**2))
    # number of values
    num = float(x_np.shape[0])
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; we use the values 1 and 0 to start
    # after the descent completes, we will discard the starting values
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than the tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = np.array([[a],[b]])
        errors.append(error_square_numpy(ab,x_np,y_np))
    return (ab),(errors[2:])

# record the array of values 
list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print

The coefficient values a and b are unchanged.

Let's look at how the error changed during gradient descent, that is, how the sum of squared deviations changed at each step.

Code for plotting the sum of squared deviations

print 'Chart No. 4 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_gradient_descence[1])), list_parametres_gradient_descence[1], color='red', lw=3)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 4 "Sum of squared deviations during gradient descent"

On the chart we see that the error decreases with each step, and after a certain number of iterations we observe an almost horizontal line.

Finally, let us estimate the difference in code execution time:

Gradient descent timing code

print '\033[1m' + '\033[4m' + "Execution time of gradient descent without the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)

Perhaps we are doing something wrong, but again the simple "self-written" function that does not use the NumPy library beats the calculation time of the function that uses NumPy.

But we are not standing still; we are moving on to study another exciting way of solving the simple linear regression equation. Meet it!

Stochastic gradient descent

To quickly understand the principle of stochastic gradient descent, it is better to consider how it differs from ordinary gradient descent. In the case of gradient descent, in the equations of the derivatives with respect to a and b, we used the sums of the values of all the features and of the true answers available in the sample (that is, the sums of all x_i and y_i). In stochastic gradient descent, we will not use all the values available in the sample; instead, we pseudo-randomly select the so-called sample index and use its values.

For example, if the index is determined to be number 3 (three), then we take the values x_3 and y_3, substitute them into the derivative equations, and determine new coordinates. Then, having determined the coordinates, we again pseudo-randomly determine the sample index, substitute the values corresponding to the index into the partial derivative equations, and determine the coordinates a and b anew, and so on, until convergence. At first glance, it may not seem like this could work at all, but it does. True, it is worth noting that the error does not decrease with every step, but there is certainly a downward tendency.
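A stripped-down sketch of this index-sampling update (the data, seed, and step size are illustrative; random.seed only makes the run reproducible):

```python
import random
random.seed(0)

# Made-up sample; its least-squares fit is roughly a = 0.14, b = 1.96
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

a, b, lr = 0.0, 0.0, 0.05

for _ in range(2000):
    ind = random.randrange(len(x))     # pseudo-random sample index
    err = (a + b * x[ind]) - y[ind]    # error on that single observation
    a -= lr * err                      # partial derivative w.r.t. a is err
    b -= lr * err * x[ind]             # partial derivative w.r.t. b is err * x_i

print(a, b)  # hovers near the least-squares solution, with some noise
```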

What are the advantages of stochastic gradient descent over the ordinary one? If the sample size is very large and measured in tens of thousands of values, then it is much easier to process, say, a random thousand of them than the entire sample. This is where stochastic gradient descent comes into play. In our case, of course, we will not notice much of a difference.

Let's look at the code.

Stochastic gradient descent code

# define the stochastic gradient step function
def stoch_grad_step_usual(vector_init, x_us, ind, y_us, l):
#     pick the x value that corresponds to the random value of the ind parameter
#     (see the stoch_grad_descent_usual function)
    x = x_us[ind]
#     compute the y value (revenue) that corresponds to the chosen x value
    y_pred = vector_init[0] + vector_init[1]*x_us[ind]
#     compute the error of the estimated revenue relative to the value in the sample
    error = y_pred - y_us[ind]
#     determine the first coordinate of the gradient, for coefficient a
    grad_a = error
#     determine the second coordinate, for coefficient b
    grad_b = x_us[ind]*error
#     compute the new vector of coefficients
    vector_new = [vector_init[0]-l*grad_a, vector_init[1]-l*grad_b]
    return vector_new


# define the stochastic gradient descent function
def stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800):
#     set the initial coefficient values before the function starts working
    vector_init = [float(random.uniform(-0.5, 0.5)), float(random.uniform(-0.5, 0.5))]
    errors = []
#     start the descent loop
#     the loop runs for a fixed number of steps (steps)
    for i in range(steps):
        ind = random.choice(range(len(x_us)))
        new_vector = stoch_grad_step_usual(vector_init, x_us, ind, y_us, l)
        vector_init = new_vector
        errors.append(errors_sq_Kramer_method(vector_init,x_us,y_us))
    return (vector_init),(errors)


# record the array of values
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])


We look closely at the coefficients and catch ourselves asking "How can this be?" We got different values of the coefficients a and b. Perhaps stochastic gradient descent found better parameters for the equation? Unfortunately not. It is enough to look at the sum of squared deviations to see that with the new coefficient values the error is larger. We are in no hurry to despair. Let's plot how the error changes.

Code for plotting the sum of squared deviations in stochastic gradient descent

print 'Chart No.5 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 5 "Sum of squared deviations during stochastic gradient descent"


Looking at the chart, everything falls into place, and now we will fix everything.

So what happened? The following happened. When we pick a month at random, it is for that chosen month that our algorithm tries to reduce the error in calculating revenue. Then we pick another month and repeat the calculation, but now we reduce the error for the second chosen month. Recall that the first two months deviate significantly from the line of the simple linear regression equation. This means that whenever either of these two months is picked, by reducing its error our algorithm seriously increases the error for the sample as a whole. So what should we do? The answer is simple: we need to reduce the descent step. After all, with a smaller step the error will also stop "jumping" up and down. Or rather, the "jumping" will not stop, but it will not happen as quickly :) Let's check.

Code to run SGD with a smaller step

# run the function, reducing the step by a factor of 100 and increasing the number of steps accordingly
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])

print 'Chart No.6 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()


Chart No. 6 "Sum of squared deviations during stochastic gradient descent (80 thousand steps)"


The coefficients have improved, but they are still not ideal. Hypothetically, this could be fixed as follows. For example, over the last 1000 iterations we select the coefficient values with which the minimum error was achieved. True, for this we would also have to record the coefficient values themselves. We will not do this, but will instead pay attention to the chart. It looks smooth, and the error seems to decrease evenly. In fact this is not true. Let's look at the first 1000 iterations and compare them with the last 1000.
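The hypothetical fix mentioned above, recording the coefficients and keeping the pair with the smallest error, could look roughly like this (a sketch with made-up sample data, written as a self-contained function rather than a patch to the article's listings):

```python
import random

def sse(a, b, x_us, y_us):
    # sum of squared deviations for the given coefficients
    return sum((a + b * x - y) ** 2 for x, y in zip(x_us, y_us))

def sgd_keep_best(x_us, y_us, l=0.001, steps=80000):
    a = random.uniform(-0.5, 0.5)
    b = random.uniform(-0.5, 0.5)
    best = (sse(a, b, x_us, y_us), a, b)
    for i in range(steps):
        ind = random.choice(range(len(x_us)))
        error = (a + b * x_us[ind]) - y_us[ind]
        a -= l * error
        b -= l * error * x_us[ind]
        err = sse(a, b, x_us, y_us)
        if err < best[0]:
            # remember the best pair of coefficients seen so far
            best = (err, a, b)
    return best
```

The returned error is never worse than that of the final iterate, at the price of one extra error evaluation per step.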

SGD chart code (first 1000 steps)

print 'Chart No.7 "Sum of squared deviations step by step. First 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][:1000])), 
         list_parametres_stoch_gradient_descence[1][:1000], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

print 'Chart No.8 "Sum of squared deviations step by step. Last 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][-1000:])), 
         list_parametres_stoch_gradient_descence[1][-1000:], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 7 "Sum of squared deviations SGD (first 1000 steps)"


Chart No. 8 "Sum of squared deviations SGD (last 1000 steps)"


At the beginning of the descent we observe a fairly uniform and steep decrease in the error. In the last iterations we see that the error circles around a value of about 1.475, and at some moments even equals this optimal value, but then it still goes up... I repeat: you can record the values of the coefficients a and b and then select those for which the error is minimal. However, we ran into a more serious problem: we had to take 80 thousand steps (see the code) to obtain values close to the optimum. And this already contradicts the idea of saving computation time with stochastic gradient descent relative to ordinary gradient descent. What can be corrected and improved? It is not hard to notice that in the first iterations we descend confidently, so we should keep a large step in the first iterations and reduce the step as we move forward. We will not do this in this article; it is already long enough. Those who wish can think for themselves how to do it; it is not difficult :)
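The step-shrinking idea can be sketched as a small variation of the descent loop (a hypothetical modification, not code from the article; the 1/t schedule used here is just one common choice, and the sample data is made up):

```python
import random

def stoch_grad_descent_decay(x_us, y_us, l0=0.1, steps=800):
    # random starting coefficients, as in the plain version
    a = random.uniform(-0.5, 0.5)
    b = random.uniform(-0.5, 0.5)
    for i in range(steps):
        # large steps early, smaller and smaller steps later
        l = l0 / (1.0 + i)
        ind = random.choice(range(len(x_us)))
        error = (a + b * x_us[ind]) - y_us[ind]
        a -= l * error
        b -= l * error * x_us[ind]
    return a, b
```

Schedules that decay more slowly (for example `l0 / sqrt(1 + i)`) often work better in practice, since a 1/t schedule can freeze the descent before it reaches the optimum.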

Now let's perform stochastic gradient descent using the numpy library (and not stumble over the stones we identified earlier)

Stochastic gradient descent code (NumPy)

# first, write the gradient step function
def stoch_grad_step_numpy(vector_init, X, ind, y, l):
    x = X[ind]
    y_pred = np.dot(x,vector_init)
    err = y_pred - y[ind]
    grad_a = err
    grad_b = x[1]*err
    return vector_init - l*np.array([grad_a, grad_b])

# define the stochastic gradient descent function
def stoch_grad_descent_numpy(X, y, l=0.1, steps = 800):
    vector_init = np.array([[np.random.randint(X.shape[0])], [np.random.randint(X.shape[0])]])
    errors = []
    for i in range(steps):
        ind = np.random.randint(X.shape[0])
        new_vector = stoch_grad_step_numpy(vector_init, X, ind, y, l)
        vector_init = new_vector
        errors.append(error_square_numpy(vector_init,X,y))
    return (vector_init), (errors)

# record the array of values
list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])
print


The values turned out to be almost the same as in the descent without numpy. However, this is logical.

Let's find out how long these stochastic gradient descents took us.

Code for determining the SGD computation time (80 thousand steps)

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent without the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)


The further into the forest, the darker the clouds: once again the "self-written" formula shows the best result. All this suggests that there must be even subtler ways of using the numpy library that genuinely speed up computations. We will not cover them in this article. You will have something to think about in your free time :)
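One direction such fine-tuning could take (a hedged sketch using standard NumPy idioms, not techniques from the article) is to vectorize whole-sample arithmetic, for example computing the sum of squared deviations in two calls instead of a Python-level loop:

```python
import numpy as np

# made-up sample in the same shape as the article's example
x_np = np.array([1., 2., 3., 4., 5.])
y_np = np.array([25., 30., 46., 45., 64.])

# design matrix: a column of ones for the intercept, then the feature
X = np.column_stack([np.ones_like(x_np), x_np])
coef = np.array([0.5, 9.0])  # hypothetical coefficients [a, b]

# residuals and the sum of squared deviations, no Python-level loop
errors = np.dot(X, coef) - y_np
sse = float(np.dot(errors, errors))
```

Because `np.dot` runs in compiled code, this scales far better than summing squared residuals one element at a time.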

Let's summarize

Before summing up, I would like to answer a question that has probably already occurred to our dear reader. Why, in fact, all this "torture" with descents, why do we need to walk up and down the mountain (mostly down) to find the coveted lowland, if we have in our hands such a powerful and simple device, in the form of the analytical solution, which instantly teleports us to the right place?

The answer to this question lies on the surface. Here we have examined a very simple example, in which the true answer y depends on a single feature x. You do not see this often in real life, so let's imagine that we have 2, 30, 50 or more features. Add to this thousands, or even tens of thousands, of values for each feature. In this case the analytical solution may fail the test and break down. In turn, gradient descent and its variations will slowly but surely bring us closer to the goal, the minimum of the function. And do not worry about speed: there are ways to set and tune the step length (that is, the speed).
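For intuition, the stochastic update generalizes to many features with almost no extra code. A sketch with made-up data (the weights, sample size, and learning rate are all invented for the example; the first column of X is ones for the intercept):

```python
import numpy as np

rng = np.random.RandomState(42)

# made-up sample: 1000 observations, 3 features plus an intercept column
n, k = 1000, 3
X = np.column_stack([np.ones(n), rng.rand(n, k)])
true_w = np.array([2.0, -1.0, 0.5, 3.0])
y = X.dot(true_w) + 0.01 * rng.randn(n)

# plain SGD: one random observation per step, whatever the number of features
w = np.zeros(k + 1)
for i in range(20000):
    ind = rng.randint(n)
    error = X[ind].dot(w) - y[ind]
    # the gradient for one observation is simply error * X[ind]
    w -= 0.05 * error * X[ind]
```

Each step still touches a single row of X, which is exactly why the method stays cheap as the number of observations and features grows.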

And now the actual brief summary.

Firstly, I hope that the material presented in the article will help beginning "data scientists" understand how to solve simple (and not only simple) linear regression equations.

Secondly, we examined several ways to solve the equation. Now, depending on the situation, we can choose the one best suited to the problem.

Thirdly, we saw the power of the additional settings, namely the gradient descent step length. This parameter cannot be neglected. As noted above, to reduce the cost of computation, the step length should be changed during the descent.

Fourthly, in our case the "home-written" functions showed the best timing results. This is probably due to not making the most professional use of the numpy library's capabilities. But be that as it may, the following conclusion suggests itself. On the one hand, it is sometimes worth questioning established opinions, and on the other hand, it is not always worth complicating everything: on the contrary, sometimes the simpler way of solving a problem is more effective. And since our goal was to analyze three ways of solving a simple linear regression equation, the use of "self-written" functions was quite enough for us.

Literature (or something like that)

1. Linear regression

http://statistica.ru/theory/osnovy-lineynoy-regressii/

2. Least squares method

mathprofi.ru/metod_naimenshih_kvadratov.html

3. Partial derivatives

www.mathprofi.ru/chastnye_proizvodnye_primery.html

4. Gradient

mathprofi.ru/proizvodnaja_po_napravleniju_i_gradient.html

5. Gradient descent

habr.com/en/post/471458

habr.com/en/post/307312

artemarakcheev.com//2017-12-31/linear_regression

6. NumPy library

docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.linalg.solve.html

docs.scipy.org/doc/numpy-1.10.0/reference/generated/numpy.linalg.pinv.html

pythonworld.ru/numpy/2.html

Source: www.habr.com
