Solving the Simple Linear Regression Equation

The article discusses several ways of determining the mathematical equation of a simple (paired) regression line.

All methods of solving the equation considered here are based on the least squares method. We denote the methods as follows:

  • Analytical solution
  • Gradient descent
  • Stochastic gradient descent

For each of the methods of solving the equation of the straight line, the article provides various functions, which are mainly divided into those written without using the NumPy library and those that use NumPy for the calculations. It is believed that skillful use of NumPy will reduce computational costs.

All the code in the article is written in python 2.7 using Jupyter Notebook. The source code and the file with the sample data are posted on Github.

The article is aimed both at beginners and at those who have already gradually begun to master the study of a very broad section of artificial intelligence: machine learning.

Let's use a very simple example to illustrate the material.

Example conditions

We have five values characterizing the dependence of Y on X (Table No. 1):

Table No. 1 "Example conditions"

We will assume that the values $x$ are the month of the year and $y$ is the revenue in that month. In other words, revenue depends on the month of the year, and $x$ is the only feature on which the revenue depends.

The example is quite conventional, both in terms of revenue depending on the month of the year and in terms of the number of values, of which there are very few. However, such a simplification will make it possible to explain, as they say, on one's fingers, the material that beginners absorb not always with ease. The simplicity of the numbers will also allow those who wish to work the example out on "paper" without significant labor costs.

Let's assume that the dependence given in the example can be approximated quite well by the mathematical equation of a simple (paired) regression line of the form:

$$y = a + bx$$

where $x$ is the month in which the revenue was received, $y$ is the revenue corresponding to that month, and $a$ and $b$ are the regression coefficients of the estimated line.

Note that the coefficient $b$ is often called the slope or gradient of the estimated line; it is the amount by which $y$ changes when $x$ changes by one unit.

Obviously, our task in the example is to choose such coefficients $a$ and $b$ in the equation that the deviations of our computed monthly revenue values from the true answers, i.e. the values given in the sample, are minimal.

Least squares method

According to the least squares method, the deviations should be squared. This technique avoids mutual cancellation of deviations that have opposite signs. For example, if in one case the deviation is +5 (plus five) and in another -5 (minus five), then the sum of the deviations cancels out and equals 0 (zero). Instead of squaring the deviations, one could use the absolute value; then all deviations would be positive and would accumulate. We will not dwell on this point in detail, but simply note that, for the convenience of calculations, it is customary to square the deviations.
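
A tiny sketch (not from the article) of the +5/-5 example above: raw deviations cancel out, while squared or absolute deviations accumulate:

# deviations of opposite signs from the +5/-5 example
deviations = [5, -5]

print(sum(deviations))                    # 0  -> the deviations cancel out
print(sum(d**2 for d in deviations))      # 50 -> squared deviations accumulate
print(sum(abs(d) for d in deviations))    # 10 -> absolute deviations accumulate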

Here is what the formula looks like with the help of which we will find the smallest sum of squared deviations (errors):

$$ERR(a, b) = \sum\limits_{i=1}^{n}(f(x_i) - y_i)^2 \rightarrow \min$$

where $f(x_i) = a + bx_i$ is the function of the estimated answers (that is, the revenue we calculated),

$y_i$ are the true answers (the revenue given in the sample),

$i$ is the sample index (the number of the month in which the deviation is determined).

Let's differentiate the function, define the first-order partial derivative equations, and prepare to move on to the analytical solution. But first, let's take a short digression about what differentiation is and recall the geometric meaning of the derivative.

Differentiation

Differentiation is the operation of finding the derivative of a function.

What is the derivative for? The derivative of a function characterizes the rate of change of the function and indicates its direction. If the derivative at a given point is positive, the function is increasing; otherwise, the function is decreasing. And the greater the absolute value of the derivative, the higher the rate of change of the function's values, and the steeper the slope of the function's graph.

For example, under the conditions of a Cartesian coordinate system, a derivative value at the point M(0,0) equal to +25 means that at this point, when the value of $x$ is shifted to the right by one conventional unit, the value of $y$ increases by 25 conventional units. On the graph, this looks like a fairly steep rise in the values of $y$ from the given point.

Another example. A derivative value of -0.1 means that when $x$ is shifted by one conventional unit, the value of $y$ decreases by only 0.1 conventional units. At the same time, on the graph of the function, we observe a barely noticeable downward slope. Drawing an analogy with a mountain, it is as if we are very slowly descending a gentle slope, unlike the previous example, where we had to climb very steep peaks :)
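
As a quick illustration (an assumed example, not from the article), the derivative can be approximated numerically with a finite difference; its sign gives the direction of the function, its magnitude the steepness:

# central-difference approximation of the derivative f'(x) (assumed sketch)
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2*h)

f = lambda x: x**2   # an arbitrary test function

print(derivative(f, 3.0))     # ~ 6.0  -> increasing, fairly steep
print(derivative(f, -0.05))   # ~ -0.1 -> decreasing, barely noticeable slope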

So, after differentiating the function $ERR(a, b)$ with respect to the coefficients $a$ and $b$, we define the first-order partial derivative equations. Having defined the equations, we obtain a system of two equations, by solving which we can choose such values of the coefficients $a$ and $b$ at which the values of the corresponding derivatives change by very tiny amounts at the given points, and in the case of the analytical solution do not change at all. In other words, the error function at the found coefficients reaches its minimum, since the values of the partial derivatives at these points equal zero.

So, according to the rules of differentiation, the first-order partial derivative equation with respect to the coefficient $a$ takes the form:

$$\frac{\partial ERR}{\partial a} = 2\sum\limits_{i=1}^{n}(a + bx_i - y_i)$$

The first-order partial derivative equation with respect to $b$ takes the form:

$$\frac{\partial ERR}{\partial b} = 2\sum\limits_{i=1}^{n}x_i(a + bx_i - y_i)$$

As a result, we have a system of equations that has a fairly simple analytical solution:

$$\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0 \\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}$$

Before solving the equation, let's preload the data, check that it is loaded correctly, and format it.

Loading and formatting the data

It should be noted that, since for the analytical solution, and subsequently for gradient and stochastic gradient descent, we will use the code in two versions: using the NumPy library and without it, we will need appropriate data formatting (see the code).

Data loading and processing code

# import all the libraries we need
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import math
import pylab as pl
import random

# display the plots in Jupyter
%matplotlib inline

# set the plot size
from pylab import rcParams
rcParams['figure.figsize'] = 12, 6

# turn off Anaconda warnings
import warnings
warnings.simplefilter('ignore')

# load the values
table_zero = pd.read_csv('data_example.txt', header=0, sep='\t')

# look at the table info and at the table itself
print table_zero.info()
print '********************************************'
print table_zero
print '********************************************'

# prepare the data without using NumPy

x_us = []
[x_us.append(float(i)) for i in table_zero['x']]
print x_us
print type(x_us)
print '********************************************'

y_us = []
[y_us.append(float(i)) for i in table_zero['y']]
print y_us
print type(y_us)
print '********************************************'

# prepare the data using NumPy

x_np = table_zero[['x']].values
print x_np
print type(x_np)
print x_np.shape
print '********************************************'

y_np = table_zero[['y']].values
print y_np
print type(y_np)
print y_np.shape
print '********************************************'

Visualization

So, after we have, firstly, loaded the data, secondly, checked that it loaded correctly, and finally formatted it, we will carry out the first visualization. The pairplot method of the Seaborn library is often used for this. In our example, because of the small numbers, there is no point in using Seaborn. We will use the regular Matplotlib library and just look at the scatterplot.

Scatterplot code

print 'Chart #1 "Dependence of revenue on the month of the year"'

plt.plot(x_us,y_us,'o',color='green',markersize=16)
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.show()

Chart No. 1 "Dependence of revenue on the month of the year"

Analytical solution

We will use the most common tools in python and solve the system of equations:

$$\begin{equation*}
\begin{cases}
na + b\sum\limits_{i=1}^{n}x_i - \sum\limits_{i=1}^{n}y_i = 0 \\
a\sum\limits_{i=1}^{n}x_i + b\sum\limits_{i=1}^{n}x_i^2 - \sum\limits_{i=1}^{n}x_iy_i = 0
\end{cases}
\end{equation*}$$

According to Cramer's rule, we find the general determinant, as well as the determinants with respect to $a$ and with respect to $b$; then, dividing the determinant with respect to $a$ by the general determinant, we find the coefficient $a$, and in the same way we find the coefficient $b$.

Analytical solution code

# define a function for calculating the coefficients a and b by Cramer's rule
def Kramer_method (x,y):
        # sum of the values (all months)
    sx = sum(x)
        # sum of the true answers (revenue for the whole period)
    sy = sum(y)
        # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x[i]*y[i]) for i in range(len(x))]
    sxy = sum(list_xy)
        # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x[i]**2) for i in range(len(x))]
    sx_sq = sum(list_x_sq)
        # number of values
    n = len(x)
        # general determinant
    det = sx_sq*n - sx*sx
        # determinant with respect to a
    det_a = sx_sq*sy - sx*sxy
        # sought parameter a
    a = (det_a / det)
        # determinant with respect to b
    det_b = sxy*n - sy*sx
        # sought parameter b
    b = (det_b / det)
        # control values (check; both should be close to zero)
    check1 = (n*a + b*sx - sy)
    check2 = (a*sx + b*sx_sq - sxy)
    return [round(a,4), round(b,4)]

# run the function and write down the correct answers
ab_us = Kramer_method(x_us,y_us)
a_us = ab_us[0]
b_us = ab_us[1]
print '\033[1m' + '\033[4m' + "Optimal values of the coefficients a and b:" + '\033[0m'
print 'a =', a_us
print 'b =', b_us
print

# define a function for calculating the sum of squared errors
def errors_sq_Kramer_method(answers,x,y):
    list_errors_sq = []
    for i in range(len(x)):
        err = (answers[0] + answers[1]*x[i] - y[i])**2
        list_errors_sq.append(err)
    return sum(list_errors_sq)

# run the function and write down the error value
error_sq = errors_sq_Kramer_method(ab_us,x_us,y_us)
print '\033[1m' + '\033[4m' + "Sum of squared deviations" + '\033[0m'
print error_sq
print

# measure the calculation time
# print '\033[1m' + '\033[4m' + "Execution time for calculating the sum of squared deviations:" + '\033[0m'
# %timeit error_sq = errors_sq_Kramer_method(ab,x_us,y_us)

Here is what we got:

[Output: optimal values of the coefficients a and b, sum of squared deviations]

So, the values of the coefficients have been found and the sum of squared deviations has been determined. Let's draw a straight line on the scatterplot in accordance with the found coefficients.

Regression line code

# define a function for building the array of calculated revenue values
def sales_count(ab,x,y):
    line_answers = []
    [line_answers.append(ab[0]+ab[1]*x[i]) for i in range(len(x))]
    return line_answers

# build the charts
print 'Chart #2 "True and calculated answers"'
plt.plot(x_us,y_us,'o',color='green',markersize=16, label = '$True$ $answers$')
plt.plot(x_us, sales_count(ab_us,x_us,y_us), color='red',lw=4,
         label='$Function: a + bx,$ $where$ $a='+str(round(ab_us[0],2))+',$ $b='+str(round(ab_us[1],2))+'$')
plt.xlabel('$Months$', size=16)
plt.ylabel('$Sales$', size=16)
plt.legend(loc=1, prop={'size': 16})
plt.show()

Chart No. 2 "True and calculated answers"

You can also look at the deviation chart for each month. In our case, we will not derive any significant practical value from it, but we will satisfy our curiosity about how well the simple linear regression equation characterizes the dependence of revenue on the month of the year.

Deviation chart code

# define a function for building the array of deviations in percent
def error_per_month(ab,x,y):
    sales_c = sales_count(ab,x,y)
    errors_percent = []
    for i in range(len(x)):
        errors_percent.append(100*(sales_c[i]-y[i])/y[i])
    return errors_percent

# build the chart
print 'Chart #3 "Deviations by month, %"'
plt.gca().bar(x_us, error_per_month(ab_us,x_us,y_us), color='brown')
plt.xlabel('Months', size=16)
plt.ylabel('Calculation error, %', size=16)
plt.show()

Chart No. 3 "Deviations by month, %"

Not perfect, but we have completed our task.

Let's write a function that finds the coefficients $a$ and $b$ using the NumPy library. More precisely, we will write two functions: one using the pseudoinverse matrix (not recommended in practice, since the process is computationally complex and unstable), the other using the matrix equation.

Analytical solution code (NumPy)

# to begin with, add a column with an unchanging value of 1. 
# This column is needed so that the coefficient a is not processed separately
vector_1 = np.ones((x_np.shape[0],1))
x_np = table_zero[['x']].values # just in case, bring the vector x_np back to its original format
x_np = np.hstack((vector_1,x_np))

# check that everything was done correctly
print vector_1[0:3]
print x_np[0:3]
print '***************************************'
print

# write a function that determines the values of the coefficients a and b using the pseudoinverse matrix
def pseudoinverse_matrix(X, y):
    # set the explicit format of the feature matrix
    X = np.matrix(X)
    # determine the transposed matrix
    XT = X.T
    # determine the square matrix
    XTX = XT*X
    # determine the pseudoinverse matrix
    inv = np.linalg.pinv(XTX)
    # set the explicit format of the answer matrix
    y = np.matrix(y)
    # find the vector of weights
    return (inv*XT)*y

# run the function
ab_np = pseudoinverse_matrix(x_np, y_np)
print ab_np
print '***************************************'
print

# write a function that uses the matrix equation for the solution
def matrix_equation(X,y):
    a = np.dot(X.T, X)
    b = np.dot(X.T, y)
    return np.linalg.solve(a, b)

# run the function
ab_np = matrix_equation(x_np,y_np)
print ab_np

Let's compare the time it took to find the coefficients $a$ and $b$ with the three presented methods.

Computation time measurement code

print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients without using the NumPy library:" + '\033[0m'
%timeit ab_us = Kramer_method(x_us,y_us)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the pseudoinverse matrix:" + '\033[0m'
%timeit ab_np = pseudoinverse_matrix(x_np, y_np)
print '***************************************'
print
print '\033[1m' + '\033[4m' + "Execution time for calculating the coefficients using the matrix equation:" + '\033[0m'
%timeit ab_np = matrix_equation(x_np, y_np)

[Output: %timeit results for the three methods]

On a small amount of data, the "self-written" function that finds the coefficients by Cramer's method comes out ahead.

Now you can move on to other ways of finding the coefficients $a$ and $b$.

Gradient descent

First, let's define what a gradient is. In simple terms, the gradient is a segment that indicates the direction of the steepest growth of a function. By analogy with climbing a mountain, where the gradient points is where the steepest ascent to the top of the mountain lies. Developing the mountain example, remember that we actually need the steepest descent in order to reach the lowland as quickly as possible, that is, the minimum: the place where the function neither increases nor decreases. At that point the derivative equals zero. Therefore, we need not the gradient but the anti-gradient. To find the anti-gradient, you just need to multiply the gradient by -1 (minus one).

Let us note that a function can have several minima, and having descended into one of them by the algorithm described below, we will not be able to find another minimum, which may be lower than the one found. Relax, we are not in danger! In our case we are dealing with a single minimum, since our function $ERR(a, b)$ is graphically an ordinary paraboloid. And as we all should know very well from the school mathematics course, a parabola has only one minimum.

Having found out why we need the gradient, and that the gradient is a segment, that is, a vector with given coordinates, which are precisely the coefficients $a$ and $b$, we can implement gradient descent.

Before starting, I suggest reading just a few sentences about the descent algorithm:

  • We determine pseudo-randomly the coordinates of the coefficients $a$ and $b$. In our example, we will define the coefficients near zero. This is a common practice, but each case may have its own practice.
  • From the coordinate $a$ we subtract the value of the first-order partial derivative at the point $a$. So, if the derivative is positive, the function is increasing. Therefore, by subtracting the value of the derivative, we move in the direction opposite to growth, that is, in the direction of descent. If the derivative is negative, the function at this point is decreasing, and by subtracting the value of the derivative we again move in the direction of descent.
  • We carry out a similar operation with the coordinate $b$: we subtract the value of the first-order partial derivative at the point $b$.
  • In order not to jump over the minimum and not fly off into deep space, it is necessary to set the step size in the direction of descent. On the whole, one could write a whole article about how to set the step correctly and how to change it during the descent so as to reduce computational costs. But now we have a slightly different task before us, and we will establish the step size by the scientific method of "poking", or, as they say in common parlance, empirically.
  • Once we have moved from the given coordinates $a$ and $b$ by subtracting the values of the derivatives, we obtain new coordinates $a$ and $b$. We take the next step (subtraction) already from the calculated coordinates. And so the cycle starts again and again, until the required convergence is achieved (a minimal toy sketch of this loop follows right after the list).
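
To make the list above concrete, here is a minimal toy sketch (not the article's implementation) that descends along a single parabola f(x) = (x - 3)^2 with a fixed, empirically chosen step:

import random

# a toy convex function and its derivative (assumed example)
f = lambda x: (x - 3)**2
df = lambda x: 2*(x - 3)

x = random.uniform(-0.5, 0.5)   # pseudo-random start near zero
l = 0.1                         # step length, chosen "by poke"

# repeat the subtraction step until we are close enough to the minimum
for i in range(100):
    x = x - l*df(x)             # move against the derivative, i.e. downhill

print(x)   # ~3.0, the single minimum of f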

That's it! Now we are ready to go in search of the deepest gorge of the Mariana Trench. Let's begin.

Gradient descent code

# write a gradient descent function without using the NumPy library. 
# The function takes as input the ranges of x,y values, the step length (default=0.1) and the allowable error (tolerance)
def gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = sum(x_us)
    # sum of the true answers (revenue for the whole period)
    sy = sum(y_us)
    # sum of the products of the values and the true answers
    list_xy = []
    [list_xy.append(x_us[i]*y_us[i]) for i in range(len(x_us))]
    sxy = sum(list_xy)
    # sum of the squared values
    list_x_sq = []
    [list_x_sq.append(x_us[i]**2) for i in range(len(x_us))]
    sx_sq = sum(list_x_sq)
    # number of values
    num = len(x_us)
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; the values 1 and 0 are used to start the loop
    # after the descent is complete, the starting values will be dropped
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = [a,b]
        errors.append(errors_sq_Kramer_method(ab,x_us,y_us))
    return (ab),(errors[2:])

# write down the array of values 
list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print

[Output: coefficient values, sum of squared deviations, iteration count]

We dived to the very bottom of the Mariana Trench and found there all the same values of the coefficients $a$ and $b$, which, in fact, is what should have been expected.

Let's dive once more, only this time our deep-sea vehicle will be filled with different technology, namely the NumPy library.

Gradient descent code (NumPy)

# before defining the gradient descent function using the NumPy library, 
# let's write a function for determining the sum of squared deviations that also uses NumPy
def error_square_numpy(ab,x_np,y_np):
    y_pred = np.dot(x_np,ab)
    error = y_pred - y_np
    return sum((error)**2)

# write a gradient descent function using the NumPy library. 
# The function takes as input the ranges of x,y values, the step length (default=0.1) and the allowable error (tolerance)
def gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001):
    # sum of the values (all months)
    sx = float(sum(x_np[:,1]))
    # sum of the true answers (revenue for the whole period)
    sy = float(sum(y_np))
    # sum of the products of the values and the true answers
    sxy = x_np*y_np
    sxy = float(sum(sxy[:,1]))
    # sum of the squared values
    sx_sq = float(sum(x_np[:,1]**2))
    # number of values
    num = float(x_np.shape[0])
    # initial values of the coefficients, determined pseudo-randomly
    a = float(random.uniform(-0.5, 0.5))
    b = float(random.uniform(-0.5, 0.5))
    # create an array of errors; the values 1 and 0 are used to start the loop
    # after the descent is complete, the starting values will be dropped
    errors = [1,0]
    # start the descent loop
    # the loop runs until the deviation of the last sum-of-squares error from the previous one becomes less than tolerance
    while abs(errors[-1]-errors[-2]) > tolerance:
        a_step = a - l*(num*a + b*sx - sy)/num
        b_step = b - l*(a*sx + b*sx_sq - sxy)/num
        a = a_step
        b = b_step
        ab = np.array([[a],[b]])
        errors.append(error_square_numpy(ab,x_np,y_np))
    return (ab),(errors[2:])

# write down the array of values 
list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_gradient_descence[0][0],3)
print 'b =', round(list_parametres_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in gradient descent:" + '\033[0m'
print len(list_parametres_gradient_descence[1])
print

[Output: coefficient values, sum of squared deviations, iteration count]

The values of the coefficients $a$ and $b$ are unchanged.

Let's see how the error changed during gradient descent, that is, how the sum of squared deviations changed with each step.

Sum of squared deviations plot code

print 'Chart #4 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_gradient_descence[1])), list_parametres_gradient_descence[1], color='red', lw=3)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 4 "Sum of squared deviations during gradient descent"

On the chart, we see that the error decreases with each step, and after a certain number of iterations we observe an almost horizontal line.

Finally, let's estimate the difference in code execution time:

Gradient descent computation time code

print '\033[1m' + '\033[4m' + "Execution time of gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_usual(x_us,y_us,l=0.1,tolerance=0.000000000001)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_gradient_descence = gradient_descent_numpy(x_np,y_np,l=0.1,tolerance=0.000000000001)

[Output: %timeit results for both gradient descent implementations]

Perhaps we are doing something wrong, but again the simple "self-written" function that does not use the NumPy library beats the calculation time of the function using NumPy.

But we are not standing still; we are moving on to study another exciting way of solving the simple linear regression equation. Meet it!

Stochastic gradient descent

In order to quickly understand how stochastic gradient descent works, it is better to describe its differences from ordinary gradient descent. In the case of gradient descent, we used in the derivative equations with respect to $a$ and $b$ the sums of the values of all the features and of the true answers available in the sample (that is, the sums of all $x_i$ and $y_i$). In stochastic gradient descent, we will not use all the values available in the sample; instead, we pseudo-randomly choose the so-called sample index and use its values.

For example, if the index turns out to be number 3 (three), then we take the values $x_3$ and $y_3$, substitute them into the derivative equations, and determine new coordinates. Then, having determined the coordinates, we again pseudo-randomly determine a sample index, substitute the values corresponding to that index into the partial derivative equations, and determine the coordinates $a$ and $b$ in a new way, and so on, until convergence sets in. At first glance it may seem that this could not work at all, but it does. True, it is worth noting that the error does not decrease with every single step, but there is certainly a tendency.

What are the advantages of stochastic gradient descent over the ordinary one? If our sample size is very large and measured in tens of thousands of values, then it is much easier to process, say, a random thousand of them than the whole sample. This is where stochastic gradient descent comes into play. In our case, of course, we will not notice a big difference.
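
As an illustration of that idea, here is a small assumed sketch (not from the article's code) that picks a random thousand points out of a large sample instead of processing the whole sample:

import random

# made-up large sample standing in for tens of thousands of values
x_big = range(100000)
y_big = [2*x + 1 for x in x_big]

# pseudo-randomly pick the indices of a "mini-batch" of one thousand points
batch = random.sample(range(len(x_big)), 1000)

# only these points would participate in one descent step
x_batch = [x_big[i] for i in batch]
y_batch = [y_big[i] for i in batch]
print(len(x_batch))   # 1000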

Let's look at the code.

Stochastic gradient descent code

# define the stochastic gradient step function
def stoch_grad_step_usual(vector_init, x_us, ind, y_us, l):
#     choose the x value that corresponds to the random value of the parameter ind 
# (see the function stoch_grad_descent_usual)
    x = x_us[ind]
#     calculate the y value (revenue) that corresponds to the chosen x value
    y_pred = vector_init[0] + vector_init[1]*x_us[ind]
#     calculate the error of the estimated revenue relative to the one presented in the sample
    error = y_pred - y_us[ind]
#     determine the first coordinate of the gradient ab
    grad_a = error
#     determine the second coordinate ab
    grad_b = x_us[ind]*error
#     calculate the new vector of coefficients
    vector_new = [vector_init[0]-l*grad_a, vector_init[1]-l*grad_b]
    return vector_new


# define the stochastic gradient descent function
def stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800):
#     at the very start of the function, set the initial values of the coefficients
    vector_init = [float(random.uniform(-0.5, 0.5)), float(random.uniform(-0.5, 0.5))]
    errors = []
#     start the descent loop
# the loop is designed for a certain number of steps (steps)
    for i in range(steps):
        ind = random.choice(range(len(x_us)))
        new_vector = stoch_grad_step_usual(vector_init, x_us, ind, y_us, l)
        vector_init = new_vector
        errors.append(errors_sq_Kramer_method(vector_init,x_us,y_us))
    return (vector_init),(errors)


# write down the array of values 
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.1, steps = 800)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])

[Output: coefficient values, sum of squared deviations, iteration count]

We look closely at the coefficients and catch ourselves asking the question "How can this be?". We got different values of the coefficients $a$ and $b$. Maybe stochastic gradient descent has found more optimal parameters for the equation? Unfortunately, no. It is enough to look at the sum of squared deviations and see that with the new values of the coefficients, the error is larger. We are in no hurry to despair. Let's plot how the error changed.

Code for plotting the sum of squared deviations in stochastic gradient descent

print 'Chart #5 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 5 "Sum of squared deviations during stochastic gradient descent"

After looking at the chart, everything falls into place, and now we will fix everything.

So what happened? The following happened. When we randomly select a month, it is for the selected month that our algorithm seeks to reduce the error in calculating the revenue. Then we select another month and repeat the calculation, but we reduce the error for the second selected month. Now remember that the first two months deviate significantly from the line of the simple linear regression equation. This means that when either of these two months is selected, by reducing the error for each of them, our algorithm seriously increases the error over the whole sample. So what to do? The answer is simple: we need to reduce the descent step. Indeed, with a smaller descent step the error will also stop "jumping" up and down. Or rather, the "jumping" error will not stop, but it will not do it as fast :) Let's check.

Code for running SGD with a smaller step

# run the function, reducing the step by a factor of 100 and increasing the number of steps accordingly 
list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])

print 'Chart #6 "Sum of squared deviations step by step"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1])), list_parametres_stoch_gradient_descence[1], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

[Output: coefficient values, sum of squared deviations, iteration count]

Chart No. 6 "Sum of squared deviations for stochastic gradient descent (80k steps)"

The values have improved, but they are still not quite right. Hypothetically, this can be corrected as follows. For example, we choose from the last 1000 iterations the values of the coefficients with which the minimal error was obtained. True, for this we would also have to record the values of the coefficients themselves. We will not do this, but rather pay attention to the chart. It looks smooth, and the error seems to decrease evenly. Actually, this is not true. Let's look at the first 1000 iterations and compare them with the last ones.
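
Before looking at the charts, a hedged sketch of the bookkeeping just mentioned and skipped in the article (the history list here is a made-up stand-in for per-step results):

import random

# made-up stand-in for per-step descent results: (coefficients, error) pairs
history = [([random.uniform(0, 2), random.uniform(0, 2)], random.uniform(1, 10))
           for i in range(1000)]

# keep the coefficients from the last 1000 iterations with the smallest error
best_ab, best_err = min(history, key=lambda pair: pair[1])
print(best_ab)
print(best_err)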

SGD chart code (first and last 1000 steps)

print 'Chart #7 "Sum of squared deviations step by step. First 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][:1000])), 
         list_parametres_stoch_gradient_descence[1][:1000], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

print 'Chart #8 "Sum of squared deviations step by step. Last 1000 iterations"'
plt.plot(range(len(list_parametres_stoch_gradient_descence[1][-1000:])), 
         list_parametres_stoch_gradient_descence[1][-1000:], color='red', lw=2)
plt.xlabel('Steps (Iteration)', size=16)
plt.ylabel('Sum of squared deviations', size=16)
plt.show()

Chart No. 7 "Sum of squared deviations for SGD (first 1000 steps)"

Chart No. 8 "Sum of squared deviations for SGD (last 1000 steps)"

At the very beginning of the descent, we observe a fairly uniform and steep decrease in the error. In the last iterations, we see that the error circles around and around the value of 1.475, at some moments even equaling this optimal value, but then it still goes up... I repeat: one could record the values of the coefficients $a$ and $b$ and then choose those for which the error is minimal. However, we had a more serious problem: we had to take 80 thousand steps (see the code) to get values close to the optimal ones. And this already contradicts the idea of saving computation time with stochastic gradient descent relative to ordinary gradient descent. What can be corrected and improved? It is not hard to notice that in the first iterations we confidently go down, and therefore we should leave a large step in the first iterations and reduce the step as we move on. We will not do this in this article; it has already dragged on. Those who wish can think for themselves how to do it, it is not difficult :) One possible sketch of such a step schedule follows below.
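
As a hint for that exercise, a minimal sketch (an assumption, not the article's method) of a step length that starts large and decays with the iteration number:

# hypothetical decay schedule: a large step at the start of the descent,
# a small one near the minimum
l_start = 0.1
for i in range(10):
    l = l_start / (1.0 + i)   # 0.1, 0.05, 0.033, ...
    print(l)                  # the step the i-th stochastic step would use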

Now let's run stochastic gradient descent using the NumPy library (and let's not stumble over the stones we identified earlier).

Stochastic gradient descent code (NumPy)

# to begin with, write the gradient step function
def stoch_grad_step_numpy(vector_init, X, ind, y, l):
    x = X[ind]
    y_pred = np.dot(x,vector_init)
    err = y_pred - y[ind]
    grad_a = err
    grad_b = x[1]*err
    return vector_init - l*np.array([grad_a, grad_b])

# define the stochastic gradient descent function
def stoch_grad_descent_numpy(X, y, l=0.1, steps = 800):
    # initial values of the coefficients, determined pseudo-randomly
    vector_init = np.array([[np.random.randint(X.shape[0])], [np.random.randint(X.shape[0])]])
    errors = []
    for i in range(steps):
        ind = np.random.randint(X.shape[0])
        new_vector = stoch_grad_step_numpy(vector_init, X, ind, y, l)
        vector_init = new_vector
        errors.append(error_square_numpy(vector_init,X,y))
    return (vector_init), (errors)

# write down the array of values 
list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)

print '\033[1m' + '\033[4m' + "Values of the coefficients a and b:" + '\033[0m'
print 'a =', round(list_parametres_stoch_gradient_descence[0][0],3)
print 'b =', round(list_parametres_stoch_gradient_descence[0][1],3)
print

print '\033[1m' + '\033[4m' + "Sum of squared deviations:" + '\033[0m'
print round(list_parametres_stoch_gradient_descence[1][-1],3)
print

print '\033[1m' + '\033[4m' + "Number of iterations in stochastic gradient descent:" + '\033[0m'
print len(list_parametres_stoch_gradient_descence[1])
print
print

[Output: coefficient values, sum of squared deviations, iteration count]

The values turned out to be almost the same as in the descent without using NumPy. However, this is logical.

Let's find out how much time the stochastic gradient descents took us.

Code for measuring SGD computation time (80k steps)

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent without using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_usual(x_us, y_us, l=0.001, steps = 80000)
print '***************************************'
print

print '\033[1m' + '\033[4m' + "Execution time of stochastic gradient descent using the NumPy library:" + '\033[0m'
%timeit list_parametres_stoch_gradient_descence = stoch_grad_descent_numpy(x_np, y_np, l=0.001, steps = 80000)

[Output: %timeit results for both SGD implementations]

The further into the forest, the darker the clouds: again the "self-written" function shows the best result. All this suggests that there must be even more subtle ways of using the NumPy library that really do speed up computational operations. We will not learn about them in this article. Something to think about in your free time :)
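
As one hedged hint at where that speedup usually hides (an assumption, not something shown in the article): the per-element arithmetic can be moved into whole-array NumPy expressions, so that a full gradient is computed without any Python-level loop:

import numpy as np

# hypothetical sketch: a fully vectorized gradient for y ~ a + b*x.
# X is an n x 2 matrix whose first column is ones, ab is the 2 x 1 weight vector.
def full_gradient(X, y, ab):
    # residuals for all n points at once
    error = np.dot(X, ab) - y
    # gradient of the mean squared error: (2/n) * X^T * error
    return 2.0 * np.dot(X.T, error) / X.shape[0]

# toy usage with made-up values, shapes matching x_np and y_np from above
X = np.hstack((np.ones((5, 1)), np.arange(1.0, 6.0).reshape(5, 1)))
y = np.array([[2.0], [4.1], [5.9], [8.2], [10.1]])
ab = np.zeros((2, 1))
print(full_gradient(X, y, ab))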

Summing up

Before summing up, I would like to answer a question that has most likely arisen in our dear reader. Why, in fact, such "torture" with descents, why do we need to walk up and down the mountain (mostly down) to find the treasured lowland, if we have in our hands such a powerful and simple device, in the form of the analytical solution, which instantly teleports us to the right place?

The answer to this question lies on the surface. We have just analyzed a very simple example, in which the true answer $y$ depends on a single feature $x$. You do not see this often in life, so imagine that we have 2, 30, 50 or more features. Add to this thousands, even tens of thousands, of values for each feature. In this case, the analytical solution may not pass the test and fail. In turn, gradient descent and its variations will slowly but surely bring us closer to the goal, the minimum of the function. And do not worry about speed: we will probably still look at ways that allow us to set and regulate the step length (that is, the speed).
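
To make that argument concrete, here is a minimal toy sketch (assumed data, not from the article) of the same descent loop handling 50 features at once; the error shrinks steadily step by step, and tuning the step length would speed it up further:

import numpy as np

# toy sketch: gradient descent with many features (assumed example)
np.random.seed(0)
n, k = 10000, 50
X = np.hstack((np.ones((n, 1)), np.random.rand(n, k)))  # ones column + 50 features
true_w = np.random.rand(k + 1, 1)
y = np.dot(X, true_w)            # noiseless toy targets

w = np.zeros((k + 1, 1))         # start near zero, as before
l = 0.01                         # step chosen "by poke"
print(float(np.sum((np.dot(X, w) - y)**2)))   # error before the descent
for i in range(2000):
    error = np.dot(X, w) - y
    w -= l * 2.0 * np.dot(X.T, error) / n     # one vectorized descent step
print(float(np.sum((np.dot(X, w) - y)**2)))   # error after: far smaller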

And now a brief summary.

Firstly, I hope that the material presented in the article will help beginning "data scientists" understand how to solve equations of simple (and not only simple) linear regression.

Secondly, we examined several ways of solving the equation. Now, depending on the situation, we can choose the one that best suits the task at hand.

Thirdly, we saw the power of additional settings, namely the gradient descent step length. This parameter should not be neglected. As noted above, in order to reduce computational costs, the step length should be changed during the descent.

Fourthly, in our case, the "self-written" functions showed the best time results for the calculations. This is probably due to not the most professional use of the NumPy library's capabilities. But be that as it may, the following conclusion suggests itself. On the one hand, it is sometimes worth questioning established opinions, and on the other hand, it is not always worth complicating everything; on the contrary, sometimes a simpler way of solving the problem is more effective. And since our goal was to analyze three approaches to solving a simple linear regression equation, the use of "self-written" functions was quite enough for us.


Source: www.habr.com
