
## 1. Multiple Features

$x_{j}^{(i)}$: the $j$-th element of the $i$-th training example (row $i$, column $j$)
$x^{(i)}$: the $i$-th training example (row $i$)
$m$: the number of training examples (rows)
$n$: the number of features (columns)

$$h_{\theta}(x) = \theta_{0} + \theta_{1}x_{1} + \theta_{2}x_{2} + \theta_{3}x_{3} + \cdots + \theta_{n}x_{n}$$

$$h_{\theta}(x) = \begin{bmatrix} \theta_{0} & \theta_{1} & \cdots & \theta_{n} \end{bmatrix} \begin{bmatrix} x_{0}\\ x_{1}\\ \vdots \\ x_{n} \end{bmatrix} = \theta^{T}x$$
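The vectorized form $h_{\theta}(x) = \theta^{T}x$ can be computed for all $m$ examples at once with a matrix product. A minimal sketch with hypothetical data (note the leading column of ones, so $x_{0} = 1$ pairs with the intercept $\theta_{0}$):

```python
import numpy as np

# Hypothetical data: m = 3 examples, n = 2 features.
# The first column is all ones so that x_0 = 1 matches theta_0.
X = np.array([[1.0, 2.0, 3.0],
              [1.0, 4.0, 5.0],
              [1.0, 6.0, 7.0]])      # shape (m, n + 1)
theta = np.array([0.5, 1.0, 0.5])   # shape (n + 1,)

# h_theta(x) = theta^T x, evaluated for every row of X at once.
h = X @ theta
print(h)  # [ 4.  7. 10.]
```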

## 2. Gradient Descent for Multiple Variables

$$\theta_{j} := \theta_{j} - \alpha \frac{1}{m} \sum_{i=1}^{m}(h_{\theta}(x^{(i)})-y^{(i)})x^{(i)}_{j}$$
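The update rule above applies to every $\theta_{j}$ simultaneously, and the sum over examples vectorizes as $X^{T}(X\theta - y)$. A sketch on hypothetical data generated from $y = 1 + 2x$, so the parameters should approach $[1, 2]$:

```python
import numpy as np

def gradient_descent_step(theta, X, y, alpha):
    """One simultaneous update of all theta_j.

    X: (m, n+1) design matrix with x_0 = 1; y: (m,) targets.
    """
    m = X.shape[0]
    # (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i), for all j at once
    grad = (X.T @ (X @ theta - y)) / m
    return theta - alpha * grad

# Hypothetical data: y = 1 + 2x exactly.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])

theta = np.zeros(2)
for _ in range(2000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)
print(theta)  # approaches [1. 2.]
```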

## 3. Gradient Descent in Practice I - Feature Scaling

### 2. Mean normalization

$x_{i} = \frac{x_{i} - \mu_{i}}{s_{i}}$
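Here $\mu_{i}$ is the mean of feature $i$ and $s_{i}$ is its range (max minus min; the standard deviation also works). A sketch on hypothetical values:

```python
import numpy as np

x = np.array([89.0, 72.0, 94.0, 69.0])  # hypothetical feature values
mu = x.mean()          # mu_i: the feature's mean (81.0 here)
s = x.max() - x.min()  # s_i: the feature's range (25.0 here)

x_norm = (x - mu) / s
print(x_norm)  # [ 0.32 -0.36  0.52 -0.48]
```

After normalization every feature lies in roughly the same small interval, so gradient descent takes a more direct path to the minimum.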

## 4. Gradient Descent in Practice II - Learning Rate

Try values of $\alpha$ such as 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, and 1 (roughly a 3x step between candidates), and pick the one for which the cost decreases fastest without diverging.
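This trial can be scripted: run a fixed number of iterations for each candidate $\alpha$ and compare the final cost $J(\theta)$. A sketch with hypothetical data; a too-small $\alpha$ barely moves, while a too-large one diverges:

```python
import numpy as np

def cost(theta, X, y):
    """Squared-error cost J(theta) = (1/2m) * sum (h - y)^2."""
    m = X.shape[0]
    r = X @ theta - y
    return r @ r / (2 * m)

# Hypothetical data: y = 1 + 2x exactly.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])

costs = {}
for alpha in [0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0]:
    theta = np.zeros(2)
    for _ in range(100):  # same budget for every candidate
        theta = theta - alpha * (X.T @ (X @ theta - y)) / X.shape[0]
    costs[alpha] = cost(theta, X, y)
    print(f"alpha={alpha}: J(theta)={costs[alpha]:.4f}")
```

On this data the mid-range values drive $J(\theta)$ down quickly, while $\alpha = 1$ overshoots and the cost blows up.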

## 5. Features and Polynomial Regression

GitHub Repo: Halfrost-Field

Follow: halfrost · GitHub
