The article discusses several ways of determining the mathematical equation of a simple (pairwise) regression line.
All methods of solving the equation discussed here are based on the least squares method. Let us denote the methods as follows:
- Analytical solution
- Gradient descent
- Stochastic gradient descent
For each method of solving the equation of the line, the article provides various functions, which are mainly divided into those written without using the NumPy library and those that use NumPy for the calculations. It is believed that skillful use of NumPy reduces computing costs.
All code in the article is written in Python 2.7 using Jupyter Notebook. The source code and a file with the sample data are published together with the article.
The article is aimed primarily at beginners and those who have gradually begun to master a very broad area of artificial intelligence: machine learning.
To illustrate the material, we use a very simple example.
Example conditions
We have five observations describing the dependence of Y on X (Table No. 1):
Table No. 1 "Example conditions"
We will assume that the values x are months of the year and y is the revenue in the corresponding month. In other words, revenue depends on the month of the year, and the month is the only feature on which revenue depends.
The example is far from ideal, both in terms of the conditional dependence of revenue on the month of the year and in terms of the number of observations, of which there are very few. However, such a simplification will make it possible, as they say, to explain (though not always easily) the material that beginners are absorbing. And the simplicity of the numbers will allow anyone who wishes to work the example out on paper without significant labor costs.
Let us assume that the dependence given in the example can be approximated by the mathematical equation of a simple (pairwise) regression line of the form:

\begin{equation*}
f(x_i) = a + bx_i
\end{equation*}

where $x_i$ is the month in which the revenue was received, $y_i$ is the revenue in that month, and $a$ and $b$ are the regression coefficients of the estimated line.
Note that the coefficient $b$ is often called the slope or gradient of the estimated line; it represents the amount by which $f(x_i)$ changes when $x$ changes by one.
Obviously, our task in the example is to select coefficients $a$ and $b$ in the equation such that the deviations of our calculated monthly revenue values from the true answers, i.e. the values given in the example, are minimal.
The least squares method
According to the least squares method, the deviations should be calculated by squaring them. This technique avoids mutual cancellation of deviations that have opposite signs. For example, if in one case the deviation is +5 (plus five) and in another -5 (minus five), then the sum of the deviations cancels out and comes to 0 (zero). Instead of squaring, one could take the absolute value of each deviation, and then all deviations will be positive and will accumulate. We will not dwell on this point in detail, but simply note that, for convenience of the calculations, it is customary to square the deviations.
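The cancellation effect is easy to check directly; a minimal sketch with two made-up deviations (not from the article's dataset):

```python
# Toy illustration: why deviations are squared before summing.
deviations = [5, -5]

raw_sum = sum(deviations)                  # opposite signs cancel: 0
abs_sum = sum(abs(d) for d in deviations)  # absolute values accumulate: 10
sq_sum = sum(d ** 2 for d in deviations)   # squares accumulate and stay smooth: 50

print(raw_sum, abs_sum, sq_sum)
```

Squaring, unlike the absolute value, also gives a smooth function that is easy to differentiate, which matters for what follows.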
This is what the formula looks like with which we will determine the smallest sum of squared deviations (errors):

\begin{equation*}
ERR = \sum\limits_{i=1}^n (f(x_i) - y_i)^2 \rightarrow \min
\end{equation*}

where $f(x_i)$ is the function of estimated answers (that is, the revenue we calculated),
$y_i$ are the true answers (the revenue given in the example),
and $i$ is the sample index (the number of the month in which the deviation is determined).
Let us differentiate the function, define the first-order partial derivative equations, and be ready to move on to the analytical solution. But first, let us take a short digression on what differentiation is and recall the geometric meaning of the derivative.
Differentiation
Differentiation is the operation of finding the derivative of a function.
What is the derivative used for? The derivative of a function characterizes the rate of change of the function and tells us its direction. If the derivative at a given point is positive, the function increases; otherwise, the function decreases. And the greater the absolute value of the derivative, the higher the rate of change of the function's values, and the steeper the slope of the function's graph.
For example, in a Cartesian coordinate system, a derivative value of +25 at the point M(0, 0) means that at this point, when the value of x shifts one conventional unit to the right, the value of y increases by 25 conventional units. On the graph this looks like a rather steep rise in values from that point.
Another example. A derivative value of -0.1 means that when x shifts by one conventional unit, the value of y decreases by only 0.1 conventional units. At the same time, on the graph of the function, we can observe a barely noticeable downward slope. Drawing an analogy with a mountain, it is as if we are very slowly descending a gentle slope, unlike the previous example, where we had to climb very steep peaks :)
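The geometric meaning is easy to probe numerically; a small sketch in which the function and the probe points are made up for illustration:

```python
# Approximate the derivative numerically with a central difference:
# f'(x) ~ (f(x + h) - f(x - h)) / (2h)
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 2  # a parabola, the same shape as our error function

print(derivative(f, 3.0))    # ~6: positive, the function is increasing here
print(derivative(f, -0.05))  # ~-0.1: slightly negative, a gentle decline
print(derivative(f, 0.0))    # ~0: the minimum of the parabola
```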
So, after differentiating the error function with respect to the coefficients $a$ and $b$, we define the first-order partial derivative equations. Having determined the equations, we obtain a system of two equations, by solving which we can select values of the coefficients $a$ and $b$ at which the values of the corresponding derivatives change by a very small amount at the given points, and in the case of the analytical solution do not change at all. In other words, the error function at the found coefficients reaches a minimum, since the values of the partial derivatives at these points are equal to zero.
So, according to the rules of differentiation, the first-order partial derivative equation with respect to the coefficient $a$ takes the form:

\begin{equation*}
\frac{\partial ERR}{\partial a} = 2\left(na + b\sum\limits_{i=1}^n x_i - \sum\limits_{i=1}^n y_i\right)
\end{equation*}

The first-order partial derivative equation with respect to $b$ takes the form:

\begin{equation*}
\frac{\partial ERR}{\partial b} = 2\sum\limits_{i=1}^n x_i\left(a + bx_i - y_i\right)
\end{equation*}

As a result, we obtain a system of equations that has a fairly simple analytical solution:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^n x_i - \sum\limits_{i=1}^n y_i = 0 \\
a\sum\limits_{i=1}^n x_i + b\sum\limits_{i=1}^n x_i^2 - \sum\limits_{i=1}^n x_i y_i = 0
\end{cases}
\end{equation*}
Before solving the equation, let us load the data, check that it loaded correctly, and format it.
Loading and formatting the data
It should be noted that for the analytical solution, and subsequently for gradient and stochastic gradient descent, we will use the code in two variants: using the NumPy library and without using it, which requires appropriate data formatting (see the code).
Data loading and processing code
# import all the libraries we need
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import math
import pylab as pl
import random
# display charts in Jupyter
%matplotlib inline
# set the chart size
from pylab import rcParams
rcParams['figure.figsize'] = 12, 6
# turn off Anaconda warnings
import warnings
warnings.simplefilter('ignore')
# load the values
table_zero = pd.read_csv('data_example.txt', header=0, sep='\t')
# look at the table info and at the table itself
print table_zero.info()
print '********************************************'
print table_zero
print '********************************************'
# prepare the data without using NumPy
x_us = []
[x_us.append(float(i)) for i in table_zero['x']]
print x_us
print type(x_us)
print '********************************************'
y_us = []
[y_us.append(float(i)) for i in table_zero['y']]
print y_us
print type(y_us)
print '********************************************'
# prepare the data using NumPy
x_np = table_zero[['x']].values
print x_np
print type(x_np)
print x_np.shape
print '********************************************'
y_np = table_zero[['y']].values
print y_np
print type(y_np)
print y_np.shape
print '********************************************'
Visualization
Now, after we have, first, loaded the data, second, checked that it loaded correctly, and finally formatted it, we will carry out the first visualization. The method usually used for this is the pairplot of the seaborn library. In our example, because of the small number of values, there is no point in using seaborn. We will use the regular matplotlib library and just look at the scatterplot.
Scatterplot code
print 'Chart No. 1 "Sales dependence on the month of the year"'
plt.plot(x_us,y_us,'o',color='green',markersize=16)
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.show()
Chart No. 1 "Sales dependence on the month of the year"
Analytical solution
Let us use standard Python tools and solve the system of equations:

\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^n x_i - \sum\limits_{i=1}^n y_i = 0 \\
a\sum\limits_{i=1}^n x_i + b\sum\limits_{i=1}^n x_i^2 - \sum\limits_{i=1}^n x_i y_i = 0
\end{cases}
\end{equation*}
According to Cramer's rule, we find the general determinant, as well as the determinants for $a$ and $b$; then, dividing the corresponding determinant by the general determinant, we find the coefficient $a$, and similarly we find the coefficient $b$.
Analytical solution code
# define a function for calculating the coefficients a and b by Cramer's rule
def Kramer_method (x,y):
    # sum of the values (all months)
    sx = sum(x)
    # sum of the true answers (revenue for the whole period)
    sy = sum(y)
    # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x[i]*y[i]) for i in range(len(x))]
    sxy = sum(list_xy)
    # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x[i]**2) for i in range(len(x))]
    sx_sq = sum(list_x_sq)
    # number of values
    n = len(x)
    # general determinant
    det = sx_sq*n - sx*sx
    # determinant for a
    det_a = sx_sq*sy - sx*sxy
    # required parameter a
    a = (det_a / det)
    # determinant for b
    det_b = sxy*n - sy*sx
    # required parameter b
    b = (det_b / det)
    # control values (check)
    check1 = (n*a + b*sx - sy)
    check2 = (a*sx + b*sx_sq - sxy)
    return [round(a,4), round(b,4)]
# run the function and record the correct answers
ab_us = Kramer_method(x_us,y_us)
a_us = ab_us[0]
b_us = ab_us[1]
print '\033[1m' + '\033[4m' + "Optimal values of the coefficients a and b:" + '\033[0m'
print 'a =', a_us
print 'b =', b_us
print
# define a function for calculating the sum of squared errors
def errors_sq_Kramer_method(answers,x,y):
    list_errors_sq = []
    for i in range(len(x)):
        err = (answers[0] + answers[1]*x[i] - y[i])**2
        list_errors_sq.append(err)
    return sum(list_errors_sq)
# run the function and record the error value
error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)
print '\033[1m' + '\033[4m' + "Sum of squared deviations" + '\033[0m'
print error_sq
print
# measure the calculation time
# print '\033[1m' + '\033[4m' + "Execution time for calculating the sum of squared deviations:" + '\033[0m'
# %timeit error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)
Here is what we got:
So, the values of the coefficients have been found, and the sum of squared deviations has been determined. Let us draw a straight line on the scatter plot in accordance with the found coefficients.
Regression line code
# define a function for forming an array of calculated revenue values
def sales_count(ab,x,y):
line_answers = []
[line_answers.append(ab[0]+ab[1]*x[i]) for i in range(len(x))]
return line_answers
# draw the charts
print 'Chart No. 2 "Correct and calculated answers"'
plt.plot(x_us,y_us,'o',color='green',markersize=16, label = '$True$ $answers$')
plt.plot(x_us, sales_count(ab_us,x_us,y_us), color='red',lw=4,
label='$Function: a + bx,$ $where$ $a='+str(round(ab_us[0],2))+',$ $b='+str(round(ab_us[1],2))+'$')
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.legend(loc=1, prop={'size': 16})
plt.show()
Chart No. 2 "Correct and calculated answers"
You can also look at the deviation for each month. In our case, we will not derive any significant practical value from it, but we will satisfy our curiosity about how well the simple linear regression equation characterizes the dependence of revenue on the month of the year.
Deviation chart code
# define a function for forming an array of deviations in percent
def error_per_month(ab,x,y):
sales_c = sales_count(ab,x,y)
errors_percent = []
for i in range(len(x)):
errors_percent.append(100*(sales_c[i]-y[i])/y[i])
return errors_percent
# draw the chart
print 'Chart No. 3 "Monthly deviations, %"'
plt.gca().bar(x_us, error_per_month(ab_us,x_us,y_us), color='brown')
plt.xlabel('Months', size=16)
plt.ylabel('Calculation error, %', size=16)
plt.show()
Chart No. 3 "Deviations, %"
Not perfect, but we have completed our task.
Let us write a function that determines the coefficients $a$ and $b$ using the NumPy library. More precisely, we will write two functions: one using the pseudoinverse matrix (not recommended in practice, since the process is computationally complex and unstable), the other using the matrix equation.
Analytical Solution Code (NumPy)
# to begin with, add a column whose values do not change and are equal to 1.
# This column is needed so that the coefficient a does not have to be handled separately
vector_1 = np.ones((x_np.shape[0],1))
x_np = table_zero[['x']].values # just in case, bring the vector x_np back to its initial form
x_np = np.hstack((vector_1,x_np))
# check that everything was done correctly
print vector_1[0:3]
print x_np[0:3]
print '***************************************'
print
# write a function that determines the values of the coefficients a and b using the pseudoinverse matrix
def pseudoinverse_matrix(X, y):
    # set the explicit format of the feature matrix
    X = np.matrix(X)
    # determine the transposed matrix
    XT = X.T
    # determine the square matrix
    XTX = XT*X
    # determine the pseudoinverse matrix
    inv = np.linalg.pinv(XTX)
    # set the explicit format of the answer matrix
    y = np.matrix(y)
    # find the vector of weights
    return (inv*XT)*y
# run the function
ab_np = pseudoinverse_matrix(x_np, y_np)
print ab_np
print '***************************************'
print
# write a function that uses the matrix equation for the solution
def matrix_equation(X,y):
    a = np.dot(X.T, X)
    b = np.dot(X.T, y)
    return np.linalg.solve(a, b)
# run the function
ab_np = matrix_equation(x_np,y_np)
print ab_np
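As a sanity check (not part of the original listing), the same coefficients can be recovered with NumPy's built-in least-squares helpers. The data below is a hypothetical stand-in for the article's table; substitute the real x_us / y_us values to check against the sample file:

```python
import numpy as np

# made-up stand-in data (x = month number, y = revenue)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([6.0, 9.0, 15.0, 19.0, 26.0])

# np.polyfit with degree 1 fits a least-squares line and returns [slope, intercept]
b, a = np.polyfit(x, y, 1)

# the same answer via the normal equations (X^T X) w = X^T y
X = np.hstack((np.ones((x.shape[0], 1)), x.reshape(-1, 1)))
w = np.linalg.solve(X.T.dot(X), X.T.dot(y))

print(a, b)  # intercept and slope from polyfit
print(w)     # [intercept, slope] from the matrix equation; the pairs should match
```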
Let us compare the time spent determining the coefficients $a$ and $b$ by the 3 methods presented.
Calculation time measurement code
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients without using the NumPy library:" + '\033[0m'
%timeit ab_us = Kramer_method(x_us,y_us)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the pseudoinverse matrix:" + '\033[0m'
%timeit ab_np = pseudoinverse_matrix(x_np, y_np)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the matrix equation:" + '\033[0m'
%timeit ab_np = matrix_equation(x_np, y_np)
With a small amount of data, the "self-written" function comes out ahead: the one that finds the coefficients using Cramer's method.
Now you can move on to other ways of finding the coefficients $a$ and $b$.
Gradient Descent
First, let us define what a gradient is. Simply put, the gradient is a segment that indicates the direction of the fastest growth of a function. By analogy with climbing a mountain, the gradient points to where the steepest climb to the top of the mountain is. Developing the mountain example, we recall that we actually need the steepest descent in order to reach the lowland as quickly as possible, that is, the minimum: the place where the function neither increases nor decreases. At this point the derivative is equal to zero. Therefore, we need not the gradient, but the antigradient. To find the antigradient, you just need to multiply the gradient by -1 (minus one).
Let us pay attention to the fact that a function can have several minima, and having descended into one of them using the algorithm described below, we will not be able to find another minimum, which may be lower than the one found. Relax, this is no threat to us! In our case we are dealing with a single minimum, since our function on the graph is an ordinary parabola. And as we all should know very well from our school mathematics course, a parabola has only one minimum.
After we have found out why we need the gradient, and also that the gradient is a segment, that is, a vector with given coordinates, which are precisely the coefficients $a$ and $b$, we can implement gradient descent.
Before starting, I suggest reading just a few sentences about the descent algorithm:
- We determine, in a pseudo-random way, the starting coordinates of the coefficients $a$ and $b$. In our example, we will define the coefficients near zero. This is a common practice, but each case may have its own practice.
- From the coordinate $a$ we subtract the value of the first-order partial derivative at the point $a$. So, if the derivative is positive, then the function increases. Therefore, by subtracting the value of the derivative, we will move in the direction opposite to growth, that is, in the direction of descent. If the derivative is negative, then the function at this point decreases, and by subtracting the value of the derivative we again move in the direction of descent.
- We carry out a similar operation with the coordinate $b$: we subtract the value of the partial derivative at the point $b$.
- In order not to jump over the minimum and fly off into deep space, it is necessary to set the step size in the direction of descent. In general, a whole article could be written about how to set the step correctly and how to change it during the descent in order to reduce computational costs. But now we have a slightly different task ahead of us, and we will establish the step size using the scientific method of "poking around" or, as they say in common parlance, empirically.
- Once we have subtracted the values of the derivatives from the given coordinates $a$ and $b$, we get new coordinates $a$ and $b$. We take the next step (subtraction) from the already calculated coordinates. And so the cycle runs again and again, until the required convergence is achieved.
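The loop described above can be sketched on a toy one-dimensional function before applying it to our two coefficients. The function f(x) = (x - 3)^2 and all step values here are illustrative assumptions, not the article's task:

```python
# Toy gradient descent on f(x) = (x - 3)**2, whose derivative is 2*(x - 3):
# repeatedly step against the derivative until the updates become negligible.
def descend(x=0.0, l=0.1, tolerance=1e-12, max_steps=10000):
    for _ in range(max_steps):
        new_x = x - l * 2 * (x - 3)  # subtract the derivative scaled by the step
        if abs(new_x - x) < tolerance:
            return new_x
        x = new_x
    return x

print(descend())  # converges to ~3, the single minimum of the parabola
```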
That's it! Now we are ready to go in search of the deepest gorge of the Mariana Trench. Let's get started.
Gradient descent code
# write a gradient descent function without using the NumPy library.
# The function takes as input the ranges of x,y values, the step size (default=0.1), and the allowable error (tolerance)
def gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = sum(x_us)
    # sum of the true answers (revenue for the whole period)
    sy = sum(y_us)
    # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x_us[i]*y_us[i]) for i in range(len(x_us))]
    sxy = sum(list_xy)
    # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x_us[i]**2) for i in range(len(x_us))]
    sx_sq = sum(list_x_sq)
    # number of values
    num = len(x_us)
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; to start, we use the values 1 and 0
    # after the descent is completed, we will remove the starting values
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = [a,b]
        errors.append(errors_sq_Kramer_method(ab,x_us,y_us))
    return (ab),(errors[2:])
# record the array of values
list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)
print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print
print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print
print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print
We dived to the very bottom of the Mariana Trench and found there all the same coefficient values $a$ and $b$, which is exactly what was to be expected.
Let us make another dive, only this time our deep-sea vehicle will be filled with other technologies, namely the NumPy library.
Gradient descent code (NumPy)
# before defining a function for gradient descent using the NumPy library,
# let us write a function for determining the sum of squared deviations that also uses NumPy
def error_square_numpy(ab,x_np,y_np):
    y_pred = np.dot(x_np,ab)
    error = y_pred - y_np
    return sum((error)**2)
# write a gradient descent function using the NumPy library.
# The function takes as input the ranges of x,y values, the step size (default=0.1), and the allowable error (tolerance)
def gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = float(sum(x_np[:,1]))
    # sum of the true answers (revenue for the whole period)
    sy = float(sum(y_np))
    # sum of the products of the values and the true answers
    sxy = x_np*y_np
    sxy = float(sum(sxy[:,1]))
    # sum of the squared values
    sx_sq = float(sum(x_np[:,1]**2))
    # number of values
    num = float(x_np.shape[0])
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; to start, we use the values 1 and 0
    # after the descent is completed, we will remove the starting values
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = np.array([[a],[b]])
        errors.append(error_square_numpy(ab,x_np,y_np))
    return (ab),(errors[2:])
# record the array of values
list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)
print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print
print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print
print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print
The coefficient values $a$ and $b$ are unchanged.
Let us look at how the error changed during gradient descent, that is, how the sum of squared deviations changed with each step.
Code for plotting the sum of squared deviations
print 'Chart No. 4 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_gradient_descence[1])), list_parametres_gradient_descence[1], color='red', lw=3)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()
Chart No. 4 "Sum of squared deviations during gradient descent"
On the chart we see that with each step the error decreases, and after a certain number of iterations we observe an almost horizontal line.
Finally, let us estimate the difference in code execution time:
Gradient descent calculation time code
print '\033[1m' + '\033[4m' + "Execution time of gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time of gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)
Perhaps we are doing something wrong, but once again the simple "home-written" function that does not use the NumPy library beats the calculation time of the function using NumPy.
But we are not standing still; we are moving on to study another exciting way to solve the simple linear regression equation. Meet!
Stochastic gradient descent
To grasp the principle of stochastic gradient descent faster, it is better to see how it differs from ordinary gradient descent. In the case of gradient descent, in the equations of the derivatives with respect to $a$ and $b$ we used the sums of the values of all the features and of all the true answers available in the sample (that is, the sums of all $x_i$ and $y_i$). In stochastic gradient descent, we do not use all the values available in the sample; instead, we pseudo-randomly select the so-called sample index and use only its values.
For example, if the index is determined to be number 3 (three), then we take the corresponding values $x_3$ and $y_3$, substitute them into the derivative equations, and determine new coordinates. Then, having determined the coordinates, we again pseudo-randomly determine the sample index, substitute the values corresponding to the index into the partial derivative equations, and determine the coordinates $a$ and $b$ anew, and so on, until convergence is reached. At first glance, it may seem that this cannot possibly work, but it does. True, it is worth noting that the error does not decrease with every step, but there is certainly a downward tendency.
What are the advantages of stochastic gradient descent over the ordinary kind? If our sample size is very large and measured in tens of thousands of values, then it is much easier to process, say, a random thousand of them than the whole sample. This is where stochastic gradient descent comes into play. In our case, of course, we will not notice much of a difference.
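The difference boils down to the update itself: one randomly chosen observation per step instead of the sums over the whole sample. A minimal sketch with made-up data (the variable names and values are illustrative, not the article's):

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# made-up observations, roughly y = 5x
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [6.0, 9.0, 15.0, 19.0, 26.0]

a, b, l = 0.0, 0.0, 0.01
for _ in range(20000):
    i = random.choice(range(len(x)))  # pseudo-random sample index
    error = (a + b * x[i]) - y[i]     # residual on that single observation
    a -= l * error                    # per-point gradient with respect to a
    b -= l * error * x[i]             # per-point gradient with respect to b

print(a, b)  # hovers near the least-squares solution rather than settling exactly
```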
Let us look at the code.
Stochastic gradient descent code
# define the stochastic gradient step function
def stoch_grad_step_usual(vector_init, x_us, ind, y_us, l):
    # select the x value that corresponds to the random value of the parameter ind
    # (see the function stoch_grad_descent_usual)
    x = x_us[ind]
    # calculate the y value (revenue) that corresponds to the selected x value
    y_pred = vector_init[0] + vector_init[1]*x_us[ind]
    # calculate the error of the calculated revenue relative to the one presented in the sample
    error = y_pred - y_us[ind]
    # determine the first coordinate of the gradient ab
    grad_a = error
    # determine the second coordinate of ab
    grad_b = x_us[ind]*error
    # calculate the new vector of coefficients
    vector_new = [vector_init[0]-l*grad_a, vector_init[1]-l*grad_b]
    return vector_new
# define the stochastic gradient descent function
def stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800):
    # for the very start of the function, set the initial values of the coefficients
    vector_init = [float(random.uniform(-0.5, 0.5)), float(random.uniform(-0.5, 0.5))]
    errors = []
    # start the descent loop
    # the loop is designed for a certain number of steps (steps)
    for i in range(steps):
        ind = random.choice(range(len(x_us)))
        new_vector = stoch_grad_step_usual(vector_init, x_us, ind, y_us, l)
        vector_init = new_vector
        errors.append(errors_sq_Kramer_method(vector_init,x_us,y_us))
    return (vector_init),(errors)
# record the array of values
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800)
print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print
print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print
print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])
We look carefully at the coefficients and catch ourselves asking "How can this be?". We got different coefficient values $a$ and $b$. Maybe stochastic gradient descent has found more optimal parameters of the equation? Unfortunately not. It is enough to look at the sum of squared deviations to see that with the new values of the coefficients, the error is larger. We are in no hurry to despair. Let us build a chart of the error change.
Code for plotting the sum of squared deviations in stochastic gradient descent
print 'Chart No. 5 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()
Chart No. 5 "Sum of squared deviations during stochastic gradient descent"
Looking at the chart, everything falls into place, and now we will fix everything.
So what happened? The following happened. When we select a month randomly, it is for the selected month that our algorithm seeks to reduce the error in calculating the revenue. Then we select another month and repeat the calculation, but we reduce the error for the second selected month. Now remember that the first two months deviate significantly from the line of the simple linear regression equation. This means that when either of these two months is selected, by reducing the error of each of them, our algorithm seriously increases the error for the whole sample. So what to do? The answer is simple: you need to reduce the descent step. Indeed, by reducing the descent step, the error will also stop "jumping" up and down. Or rather, the "jumping" of the error will not stop, but it will not happen as quickly :) Let us check.
Code for running SGD with smaller increments
# run the function, reducing the step by a factor of 100 and increasing the number of steps accordingly
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)
print ' 33[1m' + ' 33[4m' + "ΠΠ½Π°ΡΠ΅Π½ΠΈΡ ΠΊΠΎΡΡΡΠΈΡΠΈΠ΅Π½ΡΠΎΠ² a ΠΈ b:" + ' 33[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print
print ' 33[1m' + ' 33[4m' + "Π‘ΡΠΌΠΌΠ° ΠΊΠ²Π°Π΄ΡΠ°ΡΠΎΠ² ΠΎΡΠΊΠ»ΠΎΠ½Π΅Π½ΠΈΠΉ:" + ' 33[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print
print ' 33[1m' + ' 33[4m' + "ΠΠΎΠ»ΠΈΡΠ΅ΡΡΠ²ΠΎ ΠΈΡΠ΅ΡΠ°ΡΠΈΠΉ Π² ΡΡΠΎΡ
Π°ΡΡΠΈΡΠ΅ΡΠΊΠΎΠΌ Π³ΡΠ°Π΄ΠΈΠ΅Π½ΡΠ½ΠΎΠΌ ΡΠΏΡΡΠΊΠ΅:" + ' 33[0m'
print len(list_parametres_stoch_gradient_descence[1])
print 'ΠΡΠ°ΡΠΈΠΊ β6 "Π‘ΡΠΌΠΌΠ° ΠΊΠ²Π°Π΄ΡΠ°ΡΠΎΠ² ΠΎΡΠΊΠ»ΠΎΠ½Π΅Π½ΠΈΠΉ ΠΏΠΎ-ΡΠ°Π³ΠΎΠ²ΠΎ"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()
Chart No. 6 "Sum of squared deviations during stochastic gradient descent (80 thousand steps)"
The coefficients have improved, but they are still not ideal. Hypothetically, this could be fixed as follows. We select, for example, from the last 1000 iterations the coefficient values with which the smallest error was achieved. True, for this we would also have to record the coefficient values themselves. We will not do that; instead, let's pay attention to the chart. It looks smooth, and the error seems to decrease evenly. In fact, this is not true. Let's look at the first 1000 iterations and compare them with the last 1000.
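The "remember the best coefficients seen so far" idea just mentioned can be sketched in a few lines. Everything below is our own illustration, not code from the article: the sample values, the function names, and the hyperparameters are all assumptions, and the factor of 2 in the gradient is absorbed into the step length.

```python
import random

# hypothetical sample data standing in for x_us / y_us from the article
x_us = [1, 2, 3, 4, 5]
y_us = [1.2, 2.1, 2.9, 4.2, 4.8]

def sum_sq_error(a, b, x, y):
    # total squared deviation of the line a + b*x from the data
    return sum((a + b * xi - yi) ** 2 for xi, yi in zip(x, y))

def sgd_track_best(x, y, l=0.01, steps=5000, seed=42):
    random.seed(seed)
    a, b = 0.0, 0.0
    best = (sum_sq_error(a, b, x, y), a, b)
    for _ in range(steps):
        i = random.randrange(len(x))
        err = (a + b * x[i]) - y[i]   # error on one randomly chosen sample
        a -= l * err                  # update a along the per-sample gradient
        b -= l * err * x[i]           # update b along the per-sample gradient
        e = sum_sq_error(a, b, x, y)
        if e < best[0]:
            best = (e, a, b)          # remember the best coefficients seen so far
    return best

best_err, best_a, best_b = sgd_track_best(x_us, y_us)
```

Because the returned triple is the lowest error ever encountered together with its coefficients, the "jumping" of the late iterations no longer costs us anything.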
Code for the SGD charts (first and last 1000 steps)
print 'Chart No. 7 "Sum of squared deviations step by step. First 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][:1000])),
list_parametres_stoch_gradient_descence[1][:1000], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()
print 'Chart No. 8 "Sum of squared deviations step by step. Last 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][-1000:])),
list_parametres_stoch_gradient_descence[1][-1000:], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()
Chart No. 7 "Sum of squared deviations SGD (first 1000 steps)"
Chart No. 8 "Sum of squared deviations SGD (last 1000 steps)"
At the very beginning of the descent we observe a fairly uniform and steep decrease in the error. In the last iterations we see that the error circles around a value of 1.475, at some moments even equalling this optimal value, but then still goes up... I repeat, you could record the values of the coefficients a and b, and then select those for which the error is smallest. However, we ran into a more serious problem: we had to take 80 thousand steps (see the code) to get values close to optimal. And this already contradicts the idea of saving computation time with stochastic gradient descent relative to gradient descent. What can be corrected and improved? It is not hard to notice that in the first iterations we are confidently going down, and therefore we should leave a large step in the first iterations and reduce the step as we move forward. We will not do this in this article, which is already too long. Those who wish can think for themselves how to do it; it is not difficult :)
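The step-shrinking idea left here as an exercise can be sketched as follows. This is only our illustration: the sample data and the 1/(1 + k*t) decay schedule are assumptions on our part, not something the article prescribes.

```python
import random

# hypothetical sample data standing in for x_us / y_us
x_us = [1, 2, 3, 4, 5]
y_us = [1.2, 2.1, 2.9, 4.2, 4.8]

def sgd_decaying_step(x, y, l0=0.05, k=0.0005, steps=20000, seed=0):
    random.seed(seed)
    a, b = 0.0, 0.0
    for t in range(steps):
        l = l0 / (1.0 + k * t)        # large steps early, small steps late
        i = random.randrange(len(x))
        err = (a + b * x[i]) - y[i]   # error on one randomly chosen sample
        a -= l * err
        b -= l * err * x[i]
    return a, b

a, b = sgd_decaying_step(x_us, y_us)
```

With the shrinking step, the late iterations can no longer "jump" far from the minimum, which is exactly the behavior the fixed small step had to buy with 80 thousand iterations.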
Now let's do stochastic gradient descent using the NumPy library (and let's not stumble over the stones we identified earlier).
Stochastic gradient descent code (NumPy)
# first, let's write the gradient step function
def stoch_grad_step_numpy(vector_init, X, ind, y, l):
    x = X[ind]
    y_pred = np.dot(x, vector_init)
    err = y_pred - y[ind]
    grad_a = err
    grad_b = x[1] * err
    return vector_init - l * np.array([grad_a, grad_b])

# now let's define the stochastic gradient descent function
def stoch_grad_descent_numpy(X, y, l=0.1, steps = 800):
    vector_init = np.array([[np.random.randint(X.shape[0])], [np.random.randint(X.shape[0])]])
    errors = []
    for i in range(steps):
        ind = np.random.randint(X.shape[0])
        new_vector = stoch_grad_step_numpy(vector_init, X, ind, y, l)
        vector_init = new_vector
        errors.append(error_square_numpy(vector_init, X, y))
    return (vector_init), (errors)
# write down the array of values
list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)
print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print
print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print
print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])
print
The values turned out to be almost the same as in the descent without NumPy. However, that makes sense.
Let's find out how long the stochastic gradient descents took us.
Code to determine the SGD computation time (80 thousand steps)
print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)
The deeper into the woods, the darker the trees: once again, the "self-written" formula shows the best result. All this suggests that there must be even more subtle ways of using the NumPy library that really do speed up computation. We will not learn about them in this article. There will be something to think about in your free time :)
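A side note on the measurements themselves: the %timeit magic used above only works inside IPython/Jupyter. Outside a notebook, the standard timeit module gives comparable measurements. The toy function below merely stands in for the descent functions, which we do not reproduce here; its body is an assumption for the sketch.

```python
import timeit

def toy_descent(steps=1000):
    # dummy loop standing in for a "self-written" descent function
    a = 0.0
    for _ in range(steps):
        a += 0.001 * (1.0 - a)
    return a

# number=100 repeats the call 100 times and returns the total elapsed seconds
elapsed = timeit.timeit(lambda: toy_descent(), number=100)
```

Unlike %timeit, timeit.timeit returns the elapsed time as a plain float, which is convenient when you want to compare implementations programmatically rather than read the output by eye.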
Let's summarize
Ndisati ndapfupikisa, ndinoda kupindura mubvunzo ungangobva kumuverengi wedu anodiwa. Sei, chaizvoizvo, "kutambudzwa" kwakadaro nemadzinza, nei tichida kufamba tichikwira nekudzika mugomo (kunyanya pasi) kuti tiwane nzvimbo yakaderera yakaderera, kana tine mumaoko edu chigadziro chakasimba uye chiri nyore, fomu yemhinduro yekuongorora, iyo inotitumira ipapo ipapo kunzvimbo chaiyo?
The answer to this question lies on the surface. We have examined a very simple example, in which the true answer y depends on a single feature x. You do not see this often in life, so let us imagine that we have 2, 30, 50 or more features. Add to this thousands, or even tens of thousands, of values for each feature. In this case, the analytical solution may not withstand the test and fail. In turn, gradient descent and its variations will slowly but surely bring us closer to the goal, the minimum of the function. And do not worry about speed: we will probably look later at ways that let us set and regulate the step length (that is, the speed).
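To make the many-features argument concrete, here is a sketch (our own, not code from the article) of the same gradient descent written for an arbitrary number of features using NumPy matrix operations. The synthetic, noiseless data and all names are assumptions.

```python
import numpy as np

rng = np.random.RandomState(0)
n, d = 100, 5                                      # 100 observations, 5 features
X = np.hstack([np.ones((n, 1)), rng.rand(n, d)])   # a column of ones plays the role of the intercept
true_w = rng.rand(d + 1)
y = X.dot(true_w)                                  # noiseless target, so the minimum is exactly true_w

def grad_descent_many_features(X, y, l=0.1, steps=20000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = 2.0 / len(y) * X.T.dot(X.dot(w) - y)   # gradient of the mean squared error
        w -= l * grad
    return w

w = grad_descent_many_features(X, y)
```

The update line is identical whether there are 2 features or 2000; only the shapes of X and w change, which is exactly why descent methods remain practical as the problem grows.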
And now the actual short summary.
First, I hope that the material presented in the article will help beginning "data scientists" understand how to solve simple (and not only simple) linear regression equations.
Second, we examined several ways of solving the equation. Now, depending on the situation, we can choose the one best suited to the problem.
Third, we saw the power of additional settings, namely the gradient descent step length. This parameter cannot be neglected. As noted above, to reduce the cost of computation, the step length should be changed during the descent.
Fourth, in our case, the "home-written" functions showed the best timing results. This is probably due to not making the most professional use of the capabilities of the NumPy library. But be that as it may, the following conclusion suggests itself. On the one hand, it is sometimes worth questioning established opinions, and on the other hand, it is not always worth complicating everything; on the contrary, sometimes the simpler way of solving a problem is more effective. And since our goal was to analyze three ways of solving a simple linear regression equation, the use of "self-written" functions was quite enough for us.
Literature (or something like that)
1. Linear regression
2. Least squares method
3. Derivative
4. Gradient
5. Gradient descent
6. NumPy library