
Coursera ML (6) - Neural Networks Representation

2017-04-16  mmmwhy

Neural network model. More at: iii.run


Neural Networks Model

A single neuron model: logistic unit

$$\begin{bmatrix}x_0 \newline x_1 \newline x_2 \newline \end{bmatrix}\rightarrow\begin{bmatrix}\ \ \ \newline \end{bmatrix}\rightarrow h_\theta(x)$$
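A single logistic unit computes $h_\theta(x) = g(\theta^T x)$ with the sigmoid $g(z) = 1/(1+e^{-z})$. A minimal NumPy sketch (the values of `x` and `theta` below are hypothetical, with the bias term $x_0 = 1$ included in `x`):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) activation g(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_unit(x, theta):
    """Single neuron: h_theta(x) = g(theta^T x), with x including the bias x_0 = 1."""
    return sigmoid(theta @ x)

x = np.array([1.0, 2.0, 3.0])        # [x_0, x_1, x_2], bias x_0 = 1
theta = np.array([-1.0, 0.5, 0.25])  # hypothetical weights
print(logistic_unit(x, theta))       # a scalar in (0, 1)
```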


Neural Networks

$$\begin{bmatrix}x_0 \newline x_1 \newline x_2 \newline x_3\end{bmatrix}\rightarrow\begin{bmatrix}a_1^{(2)} \newline a_2^{(2)} \newline a_3^{(2)} \newline \end{bmatrix}\rightarrow h_\theta(x)$$


其中
$$\begin{align}& a_i^{(j)} = \text{"activation" of unit $i$ in layer $j$} \newline& \Theta^{(j)} = \text{matrix of weights controlling function mapping from layer $j$ to layer $j+1$}\end{align}$$

Calculation from one layer to the next


In the picture above, the network maps layer $j$ to layer $j+1$, where layer $j$ has 3 (+1 bias) units and layer $j+1$ has 3 units. Let $s_j = 3$ and $s_{j+1} = 3$.
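Spelling out this mapping, each unit in layer $j+1$ applies the sigmoid $g$ to a weighted sum of the layer-$j$ activations; in vectorized form (the standard formulation in the Coursera notes):

$$a_i^{(j+1)} = g\left(\sum_{k=0}^{s_j} \Theta_{ik}^{(j)} a_k^{(j)}\right), \qquad \text{or, vectorized, } \quad a^{(j+1)} = g\left(\Theta^{(j)} a^{(j)}\right)$$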

Multiclass Classification

Classifier

Each $y^{(i)}$ represents a different image corresponding to either a car, pedestrian, truck, or motorcycle. The inner layers each provide us with some new information, which leads to our final hypothesis function. The setup looks like the sketch below:
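In the same bracket notation used above (hidden-layer sizes are illustrative), the output layer has four units, one per class, and each $y^{(i)}$ is the corresponding one-hot vector:

$$\begin{bmatrix}x_0 \newline x_1 \newline \cdots \newline x_n\end{bmatrix}\rightarrow\begin{bmatrix}a_0^{(2)} \newline a_1^{(2)} \newline \cdots \end{bmatrix}\rightarrow\begin{bmatrix}a_0^{(3)} \newline a_1^{(3)} \newline \cdots \end{bmatrix}\rightarrow\cdots\rightarrow\begin{bmatrix}h_\Theta(x)_1 \newline h_\Theta(x)_2 \newline h_\Theta(x)_3 \newline h_\Theta(x)_4\end{bmatrix}$$

$$y^{(i)} \in \left\{ \begin{bmatrix}1 \newline 0 \newline 0 \newline 0\end{bmatrix}, \begin{bmatrix}0 \newline 1 \newline 0 \newline 0\end{bmatrix}, \begin{bmatrix}0 \newline 0 \newline 1 \newline 0\end{bmatrix}, \begin{bmatrix}0 \newline 0 \newline 0 \newline 1\end{bmatrix} \right\}$$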


Summary

If a network has $s_j$ units in layer $j$ and $s_{j+1}$ units in layer $j+1$, then $\Theta^{(j)}$ has dimension $s_{j+1} \times (s_j + 1)$; the $+1$ accounts for the bias unit. Example: layer 1 has 2 input nodes and layer 2 has 4 activation nodes. The dimension of $\Theta^{(1)}$ is going to be 4×3, since $s_j = 2$ and $s_{j+1} = 4$, so $s_{j+1} \times (s_j + 1) = 4 \times 3$.
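A minimal NumPy sketch of one forward-propagation step with these dimensions (the input and weight values are hypothetical; a bias unit is prepended before each multiplication):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_layer(a_prev, Theta):
    """Map layer j to layer j+1: prepend the bias unit, then apply g(Theta * a)."""
    a_prev = np.concatenate(([1.0], a_prev))  # add bias unit a_0 = 1
    return sigmoid(Theta @ a_prev)

# Layer 1 has s_1 = 2 input units, layer 2 has s_2 = 4 units,
# so Theta1 has dimension s_2 x (s_1 + 1) = 4 x 3.
x = np.array([0.5, -1.2])        # 2 input features (hypothetical)
Theta1 = np.random.randn(4, 3)   # 4 x 3 weight matrix
a2 = forward_layer(x, Theta1)
print(a2.shape)                  # (4,)
```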


Large parts of the notes in this section are adapted from shawnau.github.io.
