## Preface

### Calculus Basics

Power rule, for $f\left( x \right) =cx^{n}$:

$\frac{\partial}{\partial x}f\left( x \right) =cnx^{n-1}$

$\frac{\partial}{\partial x_0}\sum_{i=0}^n{F\left( x_i \right)} =\sum_{i=0}^n{\frac{\partial}{\partial x_0}F\left( x_i \right)}$

Chain rule, for $J\left( x \right) =g\left( f\left( x \right) \right)$:

$\frac{\partial}{\partial x}J\left( x \right) =g'\left( f\left( x \right) \right) \cdot f'\left( x \right)$

Partial derivatives, for $f\left( x,y \right) =ax^{n}+by^{m}$:

$\frac{\partial}{\partial x}f\left( x,y \right) =anx^{n-1}$

$\frac{\partial}{\partial y}f\left( x,y \right) =bmy^{m-1}$
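As a quick sanity check, the power rule above can be verified numerically with a central difference; the values $c=3$, $n=4$, $x_0=2$ below are arbitrary example choices, not from the notes:

```python
# Numerically verify the power rule d/dx (c * x^n) = c * n * x^(n-1)
# with a central difference approximation.
def central_diff(f, x, h=1e-6):
    """Symmetric finite-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

c, n = 3.0, 4          # example coefficient and exponent
f = lambda x: c * x ** n
x0 = 2.0

numeric = central_diff(f, x0)
analytic = c * n * x0 ** (n - 1)   # power rule: 3 * 4 * 2^3 = 96
print(numeric, analytic)           # both ≈ 96.0
```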

## Main Text

### Cost Function

$J\left( \theta \right) =\frac{1}{2m}\sum_{i=1}^m{\left( h_{\theta}\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right)^2}$
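A minimal sketch of this cost function in plain Python, assuming a single-feature hypothesis $h_\theta(x)=\theta_0+\theta_1 x$; the sample data is an invented example:

```python
# Cost function J(theta) = (1/2m) * sum((h_theta(x_i) - y_i)^2)
def hypothesis(theta, x):
    """Single-feature linear hypothesis h_theta(x) = theta0 + theta1 * x."""
    return theta[0] + theta[1] * x

def cost(theta, xs, ys):
    m = len(xs)
    return sum((hypothesis(theta, x) - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]             # exactly y = 2x
print(cost([0.0, 2.0], xs, ys))  # → 0.0, a perfect fit has zero cost
```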

### Gradient Descent

$\theta _j=\theta _j-\alpha \cdot \mathrm{Gradient}$

$\mathrm{Gradient}=\frac{\partial}{\partial \theta _j}J\left( \theta \right)$

$\theta _j=\theta _j-\alpha \frac{\partial}{\partial \theta _j}J\left( \theta \right)$

$\theta _0=\theta _0-\frac{\alpha}{m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right)}$

$\theta _j=\theta _j-\frac{\alpha}{m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right) x_{j}^{\left( i \right)}}$

$$\begin{aligned} \frac{\partial}{\partial \theta _j}J\left( \theta \right) &=\frac{\partial}{\partial \theta _j}\frac{1}{2m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right)^2}\\ &=\frac{1}{2m}\sum_{i=1}^m{\frac{\partial}{\partial \theta _j}\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right)^2}\\ &=\frac{2}{2m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right) \frac{\partial}{\partial \theta _j}\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right)}\\ &=\frac{1}{m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right) \frac{\partial}{\partial \theta _j}\left( \sum_{k=0}^n{\theta _kx_{k}^{\left( i \right)}}-y^{\left( i \right)} \right)}\\ &=\frac{1}{m}\sum_{i=1}^m{\left( h\left( x^{\left( i \right)} \right) -y^{\left( i \right)} \right) x_{j}^{\left( i \right)}} \end{aligned}$$
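The update rules above can be sketched as batch gradient descent for a single-feature model; the learning rate, iteration count, and sample data below are arbitrary illustrative choices:

```python
# Batch gradient descent for h(x) = theta0 + theta1 * x, using the
# gradients derived above: (1/m) * sum(error) and (1/m) * sum(error * x).
def gradient_descent(xs, ys, alpha=0.1, iters=1000):
    m = len(xs)
    t0, t1 = 0.0, 0.0
    for _ in range(iters):
        errs = [(t0 + t1 * x) - y for x, y in zip(xs, ys)]
        g0 = sum(errs) / m                              # dJ/d theta_0
        g1 = sum(e * x for e, x in zip(errs, xs)) / m   # dJ/d theta_1
        t0, t1 = t0 - alpha * g0, t1 - alpha * g1       # simultaneous update
    return t0, t1

xs = [1.0, 2.0, 3.0]
ys = [3.0, 5.0, 7.0]             # generated from y = 1 + 2x
print(gradient_descent(xs, ys))  # converges to ≈ (1.0, 2.0)
```

Note that both parameters are updated simultaneously from the same `errs`, matching the standard formulation of the algorithm.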

## Supplement

### Multivariate Linear Regression

$h\left( x \right) =\theta _0+\theta _1x_1+\theta _2x_2+\theta _3x_3+\cdots +\theta _nx_n$

$h_{\theta}\left( x \right) =\left[ \theta _0\ \theta _1\ \cdots \ \theta _n \right] \left[ \begin{array}{c} x_0\\ x_1\\ \vdots\\ x_n\\ \end{array} \right]$, where $x_0=1$.

$J\left( \theta \right) =\frac{1}{2m}\left( X\theta -\vec{y} \right)^{T}\left( X\theta -\vec{y} \right)$
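A sketch of this vectorized cost in plain Python lists (no linear-algebra library), assuming $X$ carries a leading column of ones for $\theta_0$; the data is an invented example:

```python
# Vectorized cost J = (1/2m) * (X·theta - y)^T (X·theta - y),
# computed row by row with plain Python lists.
def vectorized_cost(X, theta, y):
    m = len(X)
    # residual vector r = X·theta - y
    r = [sum(xij * tj for xij, tj in zip(row, theta)) - yi
         for row, yi in zip(X, y)]
    return sum(ri * ri for ri in r) / (2 * m)   # r^T r / (2m)

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # each row is [x_0=1, x_1]
y = [3.0, 5.0, 7.0]                        # generated from y = 1 + 2x
print(vectorized_cost(X, [1.0, 2.0], y))   # → 0.0 at the true parameters
```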

### Solving a Linear Model with Ordinary Least Squares (OLS)

The fitted line: $\hat{y}_i=a_0+a_1x_i$

The objective $\varphi$ is the sum of squared residuals between the observed $y_i$ and the fitted $\hat{y}_i$:

$\varphi =\sum{\left( y_i-\hat{y}_i \right)^2}$

$\varphi =\sum{\left( y_i-a_0-a_1x_i \right)^2}$

Minimizing $\varphi$ requires setting both partial derivatives to zero:

$\frac{\partial \varphi}{\partial a_0}=\sum{2\left( a_0+a_1x_i-y_i \right)}=0$

$\frac{\partial \varphi}{\partial a_1}=\sum{2x_i\left( a_0+a_1x_i-y_i \right)}=0$

$na_0+\left( \sum{x_i} \right) a_1=\sum{y_i}\quad \text{①}$

$\left( \sum{x_i} \right) a_0+\left( \sum{x_{i}^{2}} \right) a_1=\sum{x_iy_i}\quad \text{②}$

$a_0=\frac{\sum{x_{i}^{2}}\sum{y_i}-\sum{x_i}\sum{x_iy_i}}{n\sum{x_{i}^{2}}-\left( \sum{x_i} \right)^2}$

$a_1=\frac{n\sum{x_iy_i}-\sum{x_i}\sum{y_i}}{n\sum{x_{i}^{2}}-\left( \sum{x_i} \right)^2}$
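The closed-form solutions for $a_0$ and $a_1$ translate directly into code; the data points below are an invented example:

```python
# Closed-form OLS for a line y = a0 + a1*x, using the solutions
# of normal equations ① and ②.
def ols_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    d = n * sxx - sx * sx            # shared denominator n*Σx² - (Σx)²
    a0 = (sxx * sy - sx * sxy) / d
    a1 = (n * sxy - sx * sy) / d
    return a0, a1

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]   # roughly y = 1 + 2x with noise
a0, a1 = ols_line(xs, ys)
print(a0, a1)
```

Unlike gradient descent, this gives the exact minimizer in one pass, at the cost of requiring the denominator $n\sum x_i^2-(\sum x_i)^2$ to be nonzero (i.e. the $x_i$ must not all be equal).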