Solving the simple linear regression equation

The article discusses several ways to determine the mathematical equation of a simple (paired) regression line.

All the methods discussed here are based on the least squares method. Let us denote the methods as follows:

  • Analytical solution
  • Gradient descent
  • Stochastic gradient descent

For each method of solving the equation of the line, the article provides various functions, which are mainly divided into those written without using the NumPy library and those that use NumPy for the calculations. It is believed that skillful use of NumPy reduces computational costs.

All the code in the article is written in Python 2.7 using a Jupyter Notebook. The source code and the file with the sample data are posted on GitHub.

The article is aimed mostly at beginners and at those who have already gradually started mastering a very broad area of artificial intelligence: machine learning.

To illustrate the material, we use a very simple example.

Example conditions

We have five values that characterize the dependence of Y on X (Table 1):

Table 1 "Example conditions"


We will assume that the values $x_i$ are the months of the year and $y_i$ is the revenue in the corresponding month. In other words, revenue depends on the month of the year, and $x_i$ is the only feature on which the revenue depends.

The example is conditional in two respects: in terms of the assumed dependence of revenue on the month of the year, and in terms of the number of values, of which there are very few. However, such a simplification makes it possible to explain, as they say, on one's fingers, material that beginners do not always absorb easily. The simplicity of the numbers will also allow anyone who wishes to work the example out on paper without significant effort.

Let us assume that the dependence given in the example can be approximated reasonably well by the mathematical equation of a simple (paired) linear regression line of the form:

$$y = a + bx$$

where $x$ is the month in which the revenue was received, $y$ is the revenue corresponding to that month, and $a$ and $b$ are the regression coefficients of the estimated line.

Note that the coefficient $b$ is often called the slope or gradient of the estimated line; it represents the amount by which $y$ changes when $x$ changes by one unit.

Obviously, our task in this example is to choose coefficients $a$ and $b$ in the equation for which the deviations of our calculated monthly revenue values from the true answers, that is, from the values presented in the sample, are minimal.

The least squares method

According to the least squares method, the deviations should be squared. This technique avoids the mutual cancellation of deviations that have opposite signs. For example, if in one case the deviation is +5 (plus five) and in another it is -5 (minus five), then the sum of the deviations cancels out and equals 0 (zero). Instead of squaring, one could also take the absolute value of each deviation; then all deviations would be positive and would accumulate. We will not dwell on this point in detail, but simply note that for the convenience of the calculations it is customary to square the deviations.
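A minimal illustration of that point (the numbers here are hypothetical and are not the sample data from Table 1):

# hypothetical deviations with opposite signs
deviations = [+5, -5]

print(sum(deviations))                  # 0: raw deviations cancel each other out
print(sum(d ** 2 for d in deviations))  # 50: squared deviations accumulate
print(sum(abs(d) for d in deviations))  # 10: absolute deviations also accumulate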

This is what the formula looks like with the help of which we will determine the smallest sum of squared deviations (errors):

$$ERR = \sum\limits_{i=1}^{n} \left(f(x_i) - y_i\right)^2 \rightarrow \min$$

where $f(x_i) = a + bx_i$ is the function that approximates the true answers (that is, the revenue we calculated),

$y_i$ are the true answers (the revenue given in the sample),

$i$ is the sample index (the number of the month in which the deviation is determined).

Let us differentiate the function, define the partial derivative equations, and then we will be ready to move on to the analytical solution. But first, let's take a short digression on what differentiation is and recall the geometric meaning of the derivative.

Differentiation

Differentiation is the operation of finding the derivative of a function.

What is the derivative for? The derivative of a function characterizes the rate of change of the function and tells us its direction. If the derivative at a given point is positive, the function is increasing; otherwise, the function is decreasing. The greater the absolute value of the derivative, the higher the rate of change of the function values and the steeper the slope of the function's graph.

For example, in a Cartesian coordinate system, a derivative value of +25 at the point M(0,0) means that at this point, when $x$ is shifted to the right by one conventional unit, $y$ increases by 25 conventional units. On the graph this looks like a rather steep rise in the values of $y$ from the given point.

Another example. A derivative value of -0.1 means that when $x$ is shifted by one conventional unit, $y$ decreases by only 0.1 conventional unit. At the same time, on the graph of the function we can observe a barely noticeable downward slope. Drawing an analogy with a mountain, it is as if we are very slowly walking down a gentle slope, unlike the previous example, where we had to climb very steep peaks :)
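To make the sign and magnitude of the derivative tangible, here is a minimal sketch with a finite-difference estimate; the function f below is a hypothetical example and is not part of the article's problem:

# a hypothetical function used only to illustrate the meaning of the derivative
def f(x):
    return x ** 2

def numerical_derivative(func, x, h=1e-6):
    # central finite-difference estimate of the derivative at the point x
    return (func(x + h) - func(x - h)) / (2 * h)

print(numerical_derivative(f, 3.0))    # about +6: positive, f is increasing at x = 3
print(numerical_derivative(f, -0.05))  # about -0.1: small and negative, f decreases slowly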

Thus, having differentiated the function ERR with respect to the coefficients $a$ and $b$, we define the equations of the first-order partial derivatives. Having defined these equations, we obtain a system of two equations, by solving which we can choose values of the coefficients $a$ and $b$ at which the values of the corresponding partial derivatives change by a very, very small amount at the given points, and in the case of an analytical solution do not change at all. In other words, the error function reaches a minimum at the found coefficients, since the values of the partial derivatives at these points are equal to zero.

So, according to the rules of differentiation, the first-order partial derivative equation with respect to the coefficient $a$ takes the form:

$$\frac{\partial ERR}{\partial a} = 2\sum\limits_{i=1}^{n}\left(a + bx_i - y_i\right) = 2\left(na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i\right)$$

The first-order partial derivative equation with respect to $b$ takes the form:

$$\frac{\partial ERR}{\partial b} = 2\sum\limits_{i=1}^{n}x_i\left(a + bx_i - y_i\right) = 2\left(a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i\right)$$

As a result, we obtain a system of equations that has a fairly simple analytical solution:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0
\\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}
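As a sanity check of the derivation, one can let a computer algebra system take the same partial derivatives. A minimal sketch using sympy (an extra dependency that is not used anywhere else in the article):

import sympy as sp

a, b = sp.symbols('a b')
n = 5  # number of observations, as in our example
x = sp.symbols('x1:6')  # symbols x1..x5
y = sp.symbols('y1:6')  # symbols y1..y5

# the sum of squared deviations ERR(a, b)
err = sum((a + b * x[i] - y[i]) ** 2 for i in range(n))

# first-order partial derivatives with respect to a and b
print(sp.expand(sp.diff(err, a)))  # expands to 2*(n*a + b*(x1+..+x5) - (y1+..+y5))
print(sp.expand(sp.diff(err, b)))  # expands to 2*(a*(x1+..+x5) + b*(x1**2+..+x5**2) - (x1*y1+..+x5*y5))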

Before solving the equations, let's first load the data, check that it was loaded correctly, and format it.

Loading and formatting the data

It should be noted that, since for the analytical solution, and subsequently for gradient and stochastic gradient descent, we will use the code in two variants, with the NumPy library and without it, we will need appropriately formatted data (see the code).

Data loading and processing code

# import all the libraries we need
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import math
import pylab as pl
import random

# display plots inside Jupyter
%matplotlib inline

# set the figure size
from pylab import rcParams
rcParams['figure.figsize'] = 12, 6

# turn off Anaconda warnings
import warnings
warnings.simplefilter('ignore')

# load the values
table_zero = pd.read_csv('data_example.txt', header=0, sep='\t')

# look at the information about the table and at the table itself
print table_zero.info()
print '********************************************'
print table_zero
print '********************************************'

# prepare the data without using NumPy

x_us = []
[x_us.append(float(i)) for i in table_zero['x']]
print x_us
print type(x_us)
print '********************************************'

y_us = []
[y_us.append(float(i)) for i in table_zero['y']]
print y_us
print type(y_us)
print '********************************************'

# prepare the data using NumPy

x_np = table_zero[['x']].values
print x_np
print type(x_np)
print x_np.shape
print '********************************************'

y_np = table_zero[['y']].values
print y_np
print type(y_np)
print y_np.shape
print '********************************************'

Visualization

Now that we have, firstly, loaded the data, secondly, checked that it loaded correctly, and finally formatted it, let's do the first visualization. The pairplot method of the Seaborn library is often used for this. In our example, because of the small numbers, there is no point in using Seaborn. We will use the regular matplotlib library and just look at the scatter plot.

Scatter plot code

print 'Chart No. 1 "Dependence of revenue on the month of the year"'

plt.plot(x_us,y_us,'o',color='green',markersize=16)
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.show()

Chart No. 1 "Dependence of revenue on the month of the year"


Analytical solution

Let's use the most common tools in Python and solve the system of equations:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0
\\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}

According to Cramer's rule, we find the general determinant, as well as the determinants with respect to $a$ and with respect to $b$; then, dividing the determinant with respect to $a$ by the general determinant, we find the coefficient $a$, and we find the coefficient $b$ in the same way.
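Written out for our system, Cramer's rule gives exactly the determinants computed in the code below:

$$\Delta = \begin{vmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{vmatrix},\qquad a = \frac{1}{\Delta}\begin{vmatrix} \sum y_i & \sum x_i \\ \sum x_iy_i & \sum x_i^2 \end{vmatrix},\qquad b = \frac{1}{\Delta}\begin{vmatrix} n & \sum y_i \\ \sum x_i & \sum x_iy_i \end{vmatrix}$$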

Analytical solution code

# define a function for calculating the coefficients a and b using Cramer's rule
def Kramer_method (x,y):
        # sum of the values (all months)
    sx = sum(x)
        # sum of the true answers (revenue for the whole period)
    sy = sum(y)
        # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x[i]*y[i]) for i in range(len(x))]
    sxy = sum(list_xy)
        # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x[i]**2) for i in range(len(x))]
    sx_sq = sum(list_x_sq)
        # number of values
    n = len(x)
        # general determinant
    det = sx_sq*n - sx*sx
        # determinant with respect to a
    det_a = sx_sq*sy - sx*sxy
        # the sought parameter a
    a = (det_a / det)
        # determinant with respect to b
    det_b = sxy*n - sy*sx
        # the sought parameter b
    b = (det_b / det)
        # control values (substitution into the normal equations should give 0)
    check1 = (n*a + b*sx - sy)
    check2 = (a*sx + b*sx_sq - sxy)
    return [round(a,4), round(b,4)]

# run the function and write down the correct answers
ab_us = Kramer_method(x_us,y_us)
a_us = ab_us[0]
b_us = ab_us[1]
print '\033[1m' + '\033[4m' + "Optimal values of the coefficients a and b:"  + '\033[0m' 
print 'a =', a_us
print 'b =', b_us
print

# define a function for calculating the sum of squared errors
def errors_sq_Kramer_method(answers,x,y):
    list_errors_sq = []
    for i in range(len(x)):
        err = (answers[0] + answers[1]*x[i] - y[i])**2
        list_errors_sq.append(err)
    return sum(list_errors_sq)

# run the function and write down the error value
error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)
print '\033[1m' + '\033[4m' + "Sum of squared deviations" + '\033[0m'
print error_sq
print

# measure the calculation time
# print '\033[1m' + '\033[4m' + "Time to calculate the sum of squared deviations:" + '\033[0m'
# %timeit error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)

Here is what we got:


So, the values of the coefficients have been found and the sum of squared deviations has been established. Let's draw a straight line on the scatter plot in accordance with the found coefficients.

Regression line code

# define a function to form an array of the calculated revenue values
def sales_count(ab,x,y):
    line_answers = []
    [line_answers.append(ab[0]+ab[1]*x[i]) for i in range(len(x))]
    return line_answers

# build the charts
print 'Chart No. 2 "True and calculated answers"'
plt.plot(x_us,y_us,'o',color='green',markersize=16, label = '$True$ $answers$')
plt.plot(x_us, sales_count(ab_us,x_us,y_us), color='red',lw=4,
         label='$Function: a + bx,$ $where$ $a='+str(round(ab_us[0],2))+',$ $b='+str(round(ab_us[1],2))+'$')
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.legend(loc=1, prop={'size': 16})
plt.show()

Chart No. 2 "True and calculated answers"


You can also look at the deviation chart for each month. In our case we will not derive any significant practical value from it, but we will satisfy our curiosity about how well the simple linear regression equation describes the dependence of revenue on the month of the year.

Deviation chart code

# define a function to form an array of deviations in percent
def error_per_month(ab,x,y):
    sales_c = sales_count(ab,x,y)
    errors_percent = []
    for i in range(len(x)):
        errors_percent.append(100*(sales_c[i]-y[i])/y[i])
    return errors_percent

# build the chart
print 'Chart No. 3 "Deviations by month, %"'
plt.gca().bar(x_us, error_per_month(ab_us,x_us,y_us), color='brown')
plt.xlabel('Months', size=16)
plt.ylabel('Calculation error, %', size=16)
plt.show()

Chart No. 3 "Deviations by month, %"


Not perfect, but we have completed our task.

Let us now write a function that uses the NumPy library to determine the coefficients $a$ and $b$. More precisely, we will write two functions: one using the pseudoinverse matrix (not recommended in practice, since the process is computationally expensive and unstable), the other using the matrix equation.
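In matrix form, with $X$ the feature matrix (a column of ones and the column of months) and $y$ the vector of true answers, the two variants implemented below are:

$$\begin{pmatrix} a \\ b \end{pmatrix} = \left(X^{T}X\right)^{+}X^{T}y \qquad\text{and}\qquad \left(X^{T}X\right)\begin{pmatrix} a \\ b \end{pmatrix} = X^{T}y,$$

where the first expression uses the pseudoinverse and the second is the matrix equation solved directly.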

Analytical solution code (NumPy)

# to begin with, add a column with an unchanging value of 1. 
# This column is needed so that we do not have to treat the coefficient a separately
vector_1 = np.ones((x_np.shape[0],1))
x_np = table_zero[['x']].values # just in case, bring the vector x_np back to its original format
x_np = np.hstack((vector_1,x_np))

# check that we did everything correctly
print vector_1[0:3]
print x_np[0:3]
print '***************************************'
print

# write a function that determines the values of the coefficients a and b using the pseudoinverse matrix
def pseudoinverse_matrix(X, y):
    # set the explicit format of the feature matrix
    X = np.matrix(X)
    # determine the transposed matrix
    XT = X.T
    # determine the square matrix
    XTX = XT*X
    # determine the pseudoinverse matrix
    inv = np.linalg.pinv(XTX)
    # set the explicit format of the answer matrix
    y = np.matrix(y)
    # find the vector of weights
    return (inv*XT)*y

# run the function
ab_np = pseudoinverse_matrix(x_np, y_np)
print ab_np
print '***************************************'
print

# write a function that uses the matrix equation for the solution
def matrix_equation(X,y):
    a = np.dot(X.T, X)
    b = np.dot(X.T, y)
    return np.linalg.solve(a, b)

# run the function
ab_np = matrix_equation(x_np,y_np)
print ab_np

Let's compare the time spent determining the coefficients $a$ and $b$ with the three methods presented.

Code for measuring the calculation time

print '\033[1m' + '\033[4m' + "Time to calculate the coefficients without using the NumPy library:" + '\033[0m'
%timeit ab_us = Kramer_method(x_us,y_us)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Time to calculate the coefficients using the pseudoinverse matrix:" + '\033[0m'
%timeit ab_np = pseudoinverse_matrix(x_np, y_np)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Time to calculate the coefficients using the matrix equation:" + '\033[0m'
%timeit ab_np = matrix_equation(x_np, y_np)


On a small amount of data, the "self-written" function, which finds the coefficients by Cramer's method, comes out ahead.

Now you can move on to other ways of finding the coefficients $a$ and $b$.

Gradient descent

First, let's define what a gradient is. Simply put, the gradient is a vector that indicates the direction of the steepest growth of a function. By analogy with climbing a mountain, the direction the gradient points in is the direction of the steepest ascent to the top of the mountain. Developing the mountain example, we recall that we actually need the steepest descent in order to reach the lowland as quickly as possible, that is, the minimum: the place where the function neither increases nor decreases. At this point the derivative equals zero. Therefore, we need not the gradient but the antigradient. To find the antigradient, you just need to multiply the gradient by -1 (minus one).

Let us also take into account that a function can have several minima, and having descended into one of them by the algorithm proposed below, we will not be able to find another minimum, which may be lower than the one found. Let's relax, this is not a threat to us! In our case we are dealing with a single minimum, since the graph of our function ERR is an ordinary parabola (a paraboloid in the two coefficients). And as we all should know very well from the school mathematics course, a parabola has only one minimum.

After we have found out why we need the gradient, and also that the gradient is a vector with given coordinates, which here are the partial derivatives with respect to the coefficients $a$ and $b$, we can implement gradient descent.

Before starting, I suggest reading just a few sentences about the descent algorithm (the update formulas are written out right after the list):

  • We determine, in a pseudo-random way, the starting coordinates of the coefficients $a$ and $b$. In our example we will define the coefficients near zero. This is a common practice, but each case may have its own.
  • From the coordinate $a$ we subtract the value of the first-order partial derivative at the point $a$. So, if the derivative is positive, the function is increasing at this point; therefore, by subtracting the value of the derivative, we move in the direction opposite to the growth, that is, in the direction of descent. If the derivative is negative, the function is decreasing at this point, and by subtracting the value of the derivative we again move in the direction of descent.
  • We carry out a similar operation with the coordinate $b$: we subtract the value of the partial derivative at the point $b$.
  • In order not to jump over the minimum and not fly off into deep space, it is necessary to set the step size in the direction of descent. In general, a whole article could be written about how to set the step correctly and how to change it during the descent in order to reduce computational costs. But now we have a slightly different task, and we will establish the step size by the scientific method of "poking around", or, as they say in common parlance, empirically.
  • Once we have subtracted the values of the partial derivatives from the given coordinates $a$ and $b$, we obtain new coordinates $a$ and $b$. We take the next step (subtraction) from the already-calculated coordinates, and so the cycle starts again and again, until the required convergence is achieved.
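Written out as update formulas, and matching the code below ($\ell$ is the step length; the constant factor 2 from the derivatives is absorbed into the step, and the sums are additionally divided by $n$ exactly as in the code), each iteration performs:

$$a := a - \ell\,\frac{1}{n}\left(na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i\right),\qquad b := b - \ell\,\frac{1}{n}\left(a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i\right)$$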

That's it! Now we are ready to go in search of the deepest gorge of the Mariana Trench. Let's get started.

Gradient descent code

# write a gradient descent function without using the NumPy library. 
# The function takes as input the x and y values, the step length (default=0.1), and the allowable error (tolerance)
def gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = sum(x_us)
    # sum of the true answers (revenue for the whole period)
    sy = sum(y_us)
    # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x_us[i]*y_us[i]) for i in range(len(x_us))]
    sxy = sum(list_xy)
    # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x_us[i]**2) for i in range(len(x_us))]
    sx_sq = sum(list_x_sq)
    # number of values
    num = len(x_us)
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; to start, we use the values 1 and 0
    # after the descent is finished, the starting values will be dropped
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one is less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = [a,b]
        errors.append(errors_sq_Kramer_method(ab,x_us,y_us))
    return (ab),(errors[2:])

# write down the array of values 
list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)


print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print



print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print


We dived to the very bottom of the Mariana Trench and found there the very same values of $a$ and $b$, which, in fact, was to be expected.

Let's take another dive, only this time our deep-sea vehicle will be filled with different technology, namely the NumPy library.

Gradient descent code (NumPy)

# before defining the gradient descent function that uses the NumPy library, 
# let's write a function for the sum of squared deviations that also uses NumPy
def error_square_numpy(ab,x_np,y_np):
    y_pred = np.dot(x_np,ab)
    error = y_pred - y_np
    return sum((error)**2)

# write a gradient descent function using the NumPy library. 
# The function takes as input the x and y values, the step length (default=0.1), and the allowable error (tolerance)
def gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = float(sum(x_np[:,1]))
    # sum of the true answers (revenue for the whole period)
    sy = float(sum(y_np))
    # sum of the products of the values and the true answers
    sxy = x_np*y_np
    sxy = float(sum(sxy[:,1]))
    # sum of the squared values
    sx_sq = float(sum(x_np[:,1]**2))
    # number of values
    num = float(x_np.shape[0])
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; to start, we use the values 1 and 0
    # after the descent is finished, the starting values will be dropped
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one is less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = np.array([[a],[b]])
        errors.append(error_square_numpy(ab,x_np,y_np))
    return (ab),(errors[2:])

# write down the array of values 
list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print

The values of the coefficients $a$ and $b$ are unchanged.

Let's look at how the error changed during the gradient descent, that is, how the sum of squared deviations changed with each step.

Code for plotting the sums of squared deviations

print 'Chart No. 4 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_gradient_descence[1])), list_parametres_gradient_descence[1], color='red', lw=3)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 4 "Sum of squared deviations during gradient descent"


On the chart we see that with each step the error decreases, and after a certain number of iterations we observe an almost horizontal line.

Finally, let's estimate the difference in code execution time:

Code for measuring the gradient descent calculation time

print '\033[1m' + '\033[4m' + "Execution time of gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)


Perhaps we are doing something wrong, but once again the simple "home-written" function that does not use the NumPy library beats the calculation time of the function that uses NumPy.

But we are not standing still; we are moving on to study another exciting way to solve the simple linear regression equation. Meet it!

Stochastic gradient descent

To quickly understand how stochastic gradient descent works, it is better to determine how it differs from ordinary gradient descent. In the case of gradient descent, in the equations of the derivatives with respect to $a$ and $b$ we used the sums of the values of all the features and of all the true answers available in the sample (that is, the sums of all $x_i$ and $y_i$). In stochastic gradient descent we will not use all the values available in the sample; instead we pseudo-randomly select the so-called sample index and use the values corresponding to it.

For example, if the index is determined to be number 3 (three), then we take the values $x_3$ and $y_3$, substitute them into the derivative equations, and determine new coordinates. Then, having determined the coordinates, we again pseudo-randomly determine a sample index, substitute the values corresponding to that index into the partial derivative equations, and determine the coordinates $a$ and $b$ in the new way, and so on until we converge. At first glance it may seem that this could not work at all, but it does. It is true that the error does not decrease with every step, but there is certainly a tendency. The corresponding single-observation update formulas are written out below.
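For one randomly chosen index $i$, the updates reduce to the following formulas, which is exactly what the step function in the code below computes:

$$a := a - \ell\,(a + bx_i - y_i),\qquad b := b - \ell\,x_i\,(a + bx_i - y_i)$$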

What are the advantages of stochastic gradient descent over the ordinary kind? If our sample size is very large and measured in tens of thousands of values, then it is much easier to process, say, a random thousand of them than the entire sample. This is where stochastic gradient descent comes into play. In our case, of course, we will not notice much of a difference.

Let's look at the code.

Stochastic gradient descent code

# define the stochastic gradient step function
def stoch_grad_step_usual(vector_init, x_us, ind, y_us, l):
#     select the x value that corresponds to the random value of the parameter ind 
# (see the function stoch_grad_descent_usual)
    x = x_us[ind]
#     calculate the y value (revenue) that corresponds to the selected x value
    y_pred = vector_init[0] + vector_init[1]*x_us[ind]
#     calculate the error of the predicted revenue relative to the one presented in the sample
    error = y_pred - y_us[ind]
#     determine the first coordinate of the gradient ab
    grad_a = error
#     determine the second coordinate of ab
    grad_b = x_us[ind]*error
#     calculate the new vector of coefficients
    vector_new = [vector_init[0]-l*grad_a, vector_init[1]-l*grad_b]
    return vector_new


# define the stochastic gradient descent function
def stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800):
#     set the initial values of the coefficients for the very start of the function
    vector_init = [float(random.uniform(-0.5, 0.5)), float(random.uniform(-0.5, 0.5))]
    errors = []
#     start the descent loop
# the loop runs for a set number of steps (steps)
    for i in range(steps):
        ind = random.choice(range(len(x_us)))
        new_vector = stoch_grad_step_usual(vector_init, x_us, ind, y_us, l)
        vector_init = new_vector
        errors.append(errors_sq_Kramer_method(vector_init,x_us,y_us))
    return (vector_init),(errors)


# write down the array of values 
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])


We look carefully at the coefficients and catch ourselves asking the question "How can this be?". We obtained different values of the coefficients $a$ and $b$. Perhaps stochastic gradient descent found better parameters for the equation? Unfortunately, no. It is enough to look at the sum of squared deviations to see that with the new values of the coefficients the error is larger. We are in no hurry to despair. Let's plot how the error changed.

Code for plotting the sum of squared deviations in stochastic gradient descent

print 'Chart No. 5 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 5 "Sum of squared deviations during stochastic gradient descent"


Looking at the chart, everything falls into place, and now we will fix everything.

So what happened? The following happened. When we randomly select a month, it is for that selected month that our algorithm seeks to reduce the error in calculating the revenue. Then we select another month and repeat the calculation, but we reduce the error for the second selected month. Now recall that the first two months deviate significantly from the line of the simple linear regression equation. This means that when either of these two months is selected, by reducing the error for it, our algorithm seriously increases the error for the whole sample. So what should we do? The answer is simple: we need to reduce the descent step. After all, by reducing the descent step, the error will also stop "jumping" up and down. Or rather, the error will not stop "jumping", but it will not do it so quickly :) Let's check.

Code for running SGD with a smaller step

# run the function, reducing the step by a factor of 100 and increasing the number of steps accordingly 
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print



print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])

print 'Chart No. 6 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()


Chart No. 6 "Sum of squared deviations during stochastic gradient descent (80 thousand steps)"


The coefficients have improved, but they are still not ideal. Hypothetically, this can be corrected as follows. For example, from the last 1000 iterations we select the values of the coefficients with which the minimum error was obtained. True, for this we would also have to record the values of the coefficients themselves. We will not do that; instead, let's pay attention to the chart. It looks smooth and the error seems to decrease evenly. In fact, that is not true. Let's look at the first 1000 iterations and compare them with the last 1000.

SGD chart code (first and last 1000 steps)

print 'Chart No. 7 "Sum of squared deviations step by step. First 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][:1000])), 
         list_parametres_stoch_gradient_descence[1][:1000], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

print 'Chart No. 8 "Sum of squared deviations step by step. Last 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][-1000:])), 
         list_parametres_stoch_gradient_descence[1][-1000:], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 7 "Sum of squared deviations for SGD (first 1000 steps)"


Chart No. 8 "Sum of squared deviations for SGD (last 1000 steps)"


At the very beginning of the descent we see a fairly uniform and steep decrease in the error. In the last iterations we see that the error keeps circling around the value of 1.475, at some moments even equaling this optimal value, but then it still goes up... I repeat: you can record the values of the coefficients $a$ and $b$ and then select those for which the error is minimal. However, we had a more serious problem: we had to take 80 thousand steps (see the code) to get values close to the optimal ones. And this already contradicts the idea of saving computation time with stochastic gradient descent relative to ordinary gradient descent. What can be corrected and improved? It is not hard to notice that in the first iterations we are confidently going downhill, and therefore we should leave a large step in the first iterations and reduce the step as we move forward. We will not do this in this article; it is already too long. Those who wish can think for themselves how to do it; it is not difficult :) A small sketch of the idea follows below.
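As a purely illustrative sketch of that suggestion (it is not part of the original code), here is a variant of stochastic descent with a shrinking step that reuses stoch_grad_step_usual and errors_sq_Kramer_method defined above; the schedule l0 / (1 + decay * i) and its constants are arbitrary choices, and the function also remembers the best coefficients seen so far:

# sketch only: SGD with a step that shrinks as the iterations progress
def stoch_grad_descent_decay(x_us, y_us, l0=0.1, decay=0.001, steps=80000):
    vector_init = [float(random.uniform(-0.5, 0.5)), float(random.uniform(-0.5, 0.5))]
    errors = []
    best_vector, best_error = list(vector_init), float('inf')
    for i in range(steps):
        # large step at the start, progressively smaller step later on
        l = l0 / (1.0 + decay * i)
        ind = random.choice(range(len(x_us)))
        vector_init = stoch_grad_step_usual(vector_init, x_us, ind, y_us, l)
        err = errors_sq_Kramer_method(vector_init, x_us, y_us)
        errors.append(err)
        # remember the coefficients with the smallest error seen so far
        if err < best_error:
            best_error, best_vector = err, list(vector_init)
    return best_vector, errors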

Now let's perform stochastic gradient descent using the NumPy library (and let's not stumble over the stones we identified earlier).

Stochastic gradient descent code (NumPy)

# to begin with, write the gradient step function
def stoch_grad_step_numpy(vector_init, X, ind, y, l):
    x = X[ind]
    y_pred = np.dot(x,vector_init)
    err = y_pred - y[ind]
    grad_a = err
    grad_b = x[1]*err
    return vector_init - l*np.array([grad_a, grad_b])

# define the stochastic gradient descent function
def stoch_grad_descent_numpy(X, y, l=0.1, steps = 800):
    vector_init = np.array([[np.random.randint(X.shape[0])], [np.random.randint(X.shape[0])]])
    errors = []
    for i in range(steps):
        ind = np.random.randint(X.shape[0])
        new_vector = stoch_grad_step_numpy(vector_init, X, ind, y, l)
        vector_init = new_vector
        errors.append(error_square_numpy(vector_init,X,y))
    return (vector_init), (errors)

# write down the array of values 
list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print


print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print



print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])
print


The values turned out to be almost the same as when descending without using NumPy. However, this is logical.

Let's find out how long stochastic gradient descent took us.

Code for measuring the SGD calculation time (80 thousand steps)

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)


The deeper into the forest, the darker the clouds: once again the "self-written" formula shows the better result. All this suggests that there must be even more subtle ways of using the NumPy library that really do speed up computational operations. We will not cover them in this article. You will have something to think about in your spare time :)
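For the curious, here is a minimal sketch of one such direction (my illustration, not taken from the article): a fully vectorized full-batch gradient descent in which the whole gradient is computed with matrix products instead of element-by-element sums, reusing the x_np array (with its column of ones) and y_np from above:

# sketch only: full-batch gradient descent with the gradient as matrix products
def gradient_descent_vectorized(X, y, l=0.1, tolerance=1e-12, max_steps=100000):
    n = X.shape[0]
    ab = np.zeros((X.shape[1], 1))            # start from zero coefficients
    errors = [1.0, 0.0]
    for _ in range(max_steps):
        grad = X.T.dot(X.dot(ab) - y) / n     # gradient of the mean squared error
        ab = ab - l * grad
        errors.append(float(np.sum((X.dot(ab) - y) ** 2)))
        if abs(errors[-1] - errors[-2]) < tolerance:
            break
    return ab, errors[2:]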

Summing up

Before summing up, I would like to answer a question that most likely arose in our dear reader: why, in fact, all this "torment" with descents, why do we need to walk up and down the mountain (mostly down) in order to find the treasured lowland, if we have in our hands such a powerful and simple device, in the form of the analytical solution, which instantly teleports us to the right place?

The answer to this question lies on the surface. Here we have considered a very simple example, in which the true answer $y_i$ depends on a single feature $x_i$. You do not see this often in real life, so let's imagine that we have 2, 30, 50 or more features. Add to this thousands, or even tens of thousands, of values for each feature. In this case the analytical solution may not withstand the test and may fail. In turn, gradient descent and its variations will slowly but surely bring us closer to the goal, the minimum of the function. And don't worry about the speed; we will probably still look at ways that allow us to set and regulate the step length (that is, the speed).

And now the brief summary itself.

Firstly, I hope that the material presented in the article will help beginning "data scientists" understand how to solve simple (and not only simple) linear regression equations.

Secondly, we have examined several ways to solve the equation. Now, depending on the situation, we can choose the one that is best suited to solving the problem at hand.

Thirdly, we saw the power of additional settings, namely the gradient descent step length. This parameter should not be neglected. As noted above, in order to reduce computational costs, the step length should be changed during the descent.

Fourthly, in our case the "home-written" functions showed the best timing results. This is probably due to not the most skillful use of the capabilities of the NumPy library. But be that as it may, the following conclusion suggests itself: on the one hand, it is sometimes worth questioning established opinions, and on the other hand, it is not always worth complicating everything; on the contrary, sometimes a simpler way of solving a problem is more effective. And since our goal was to analyze three ways of solving a simple linear regression equation, the use of "self-written" functions was quite enough for us.

References (or something like that)

1. Linear regression

http://statistica.ru/theory/osnovy-lineynoy-regressii/

2. Least squares method

mathprofi.ru/metod_naimenshih_kvadratov.html

3. Partial derivatives

www.mathprofi.ru/chastnye_proizvodnye_primery.html

4. Gradient

mathprofi.ru/proizvodnaja_po_napravleniju_i_gradient.html

5. Gradient descent

habr.com/en/post/471458

habr.com/en/post/307312

artemarakcheev.com//2017-12-31/linear_regression

6. The NumPy library

docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.linalg.solve.html

docs.scipy.org/doc/numpy-1.10.0/reference/generated/numpy.linalg.pinv.html

pythonworld.ru/numpy/2.html

Source: www.habr.com
