Linear Regression - Code

2019-11-24 · Python · Programming · Machine Learning · 546 words · 4 min read

I have written about linear regression before, but that post only covered the basic theory. In this series of articles, I will publish Python code for the machine learning algorithms.

The code comes in two pieces: the first is implemented with scikit-learn, and the second is implemented from scratch by myself.

If you have only recently started learning linear regression, I recommend reading Linear Regression first.

First code

import numpy as np
from sklearn.linear_model import LinearRegression
import matplotlib.pyplot as plt

dots = 200
X = np.linspace(-2 * np.pi, 2 * np.pi, dots) # 200 evenly spaced points [independent variable] in the range [-2pi, 2pi]
y = np.sin(X) + 0.2 * np.random.rand(dots) - 0.1 # Dependent variable: sin(X) plus uniform noise in [-0.1, 0.1]

# Reshape the arrays into the 2D shape (n_samples, n_features) that scikit-learn expects
X = X.reshape(-1, 1)
y = y.reshape(-1, 1)

# Train the model
model = LinearRegression()
model.fit(X,y)

print(model.score(X,y)) # R^2, the coefficient of determination

# Plot
plt.title("Linear Regression")
plt.scatter(X,y)
plt.plot(X,model.predict(X))
plt.show()
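Besides score, a fitted scikit-learn model exposes the learned slope and intercept through its coef_ and intercept_ attributes. A minimal sketch on a toy dataset (the data here is invented for illustration, chosen so that y = 2x + 1 exactly):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data (invented for illustration): y = 2x + 1 exactly
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

model = LinearRegression()
model.fit(X, y)

print(model.coef_[0])    # slope, close to 2.0
print(model.intercept_)  # intercept, close to 1.0
```

Because the toy data lies exactly on a line, the score on it is 1.0; with the noisy sine data above, the fit is much worse.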

Second code

import numpy as np
import matplotlib.pyplot as plt

def average(x): # Calculate the average of an array
    suma = 0
    for i in range(0,len(x)):
        suma += x[i]
    return (suma / len(x))

def intersum(x,y): # Calculate the sum of products of two arrays
    suma = 0
    for i in range(0,len(x)):
        suma += x[i] * y[i]
    return suma

def squaresum(x): # Calculate the sum of squares of an array
    suma = 0
    for i in range(0,len(x)):
        suma += x[i] * x[i]
    return suma

class LinearRegression:
    # y = ax + b
    # The coefficients are derived from the least squares method
    def __init__(self):
        self.a = 0
        self.b = 0
        self.fitted = False

    def v(self): # Raise an error if the model has not been fitted yet
        if not self.fitted:
            raise Exception("Please train the model first")

    def fit(self,x,y): # Calculate the coefficients of the linear model
        ax = average(x)
        ay = average(y)
        nx = [] # x centered around its mean
        ny = [] # y centered around its mean (kept separate so y is not modified in place)
        for i in range(0,len(x)):
            nx.append(x[i] - ax)
            ny.append(y[i] - ay)

        a = intersum(nx,ny) / squaresum(nx)
        b = ay - a * ax
        print("y = " + str(a) + "x + " + str(b))
        self.a = a
        self.b = b
        self.fitted = True

    def predict(self,x): # Use the model to predict y
        self.v()
        result = []
        for i in range(0,len(x)):
            result.append(self.a * x[i] + self.b)
        return result

    def plot(self,x,y): # Plot
        self.v()
        plt.title("Linear Regression")
        plt.scatter(x,y)
        plt.plot(x,self.predict(x))
        plt.show()

    def score(self,x,y): # Calculate the score of the model [the coefficient of determination, R^2]
        self.v()
        ay = average(y)
        ny = self.predict(x)
        ryd = []
        ryl = []
        for i in range(0,len(ny)):
            ryd.append(y[i] - ny[i])
            ryl.append(y[i] - ay)

        SSTotal = squaresum(ryl)
        SSresid = squaresum(ryd)
        return (1 - SSresid / SSTotal)

dots = 200
X = np.linspace(-2 * np.pi, 2 * np.pi, dots)
y = np.sin(X) + 0.2 * np.random.rand(dots) - 0.1

model = LinearRegression()
model.fit(X,y)
print(model.score(X,y))
model.plot(X,y)
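The hand-written fit above computes the standard closed-form least-squares solution, which can also be written in a few vectorized NumPy lines. A sketch for cross-checking (the function name fit_closed_form is my own, not from the post):

```python
import numpy as np

# Closed-form simple linear regression:
#   a = sum((x - xbar) * (y - ybar)) / sum((x - xbar)^2)
#   b = ybar - a * xbar
# These are the same formulas the class above implements with explicit loops.
def fit_closed_form(x, y):
    xbar, ybar = x.mean(), y.mean()
    a = ((x - xbar) * (y - ybar)).sum() / ((x - xbar) ** 2).sum()
    b = ybar - a * xbar
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0  # noiseless line, so the fit should recover a=2, b=1
a, b = fit_closed_form(x, y)
print(a, b)  # 2.0 1.0
```

On noiseless data the recovered coefficients match the generating line exactly; on the noisy sine data they should agree with the loop-based class to within floating-point error.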

WRITTEN BY: Serence

The blog of a programmer and literary youth!

This article is licensed under CC BY-NC-SA 4.0. Please credit the source when reposting!
