August 2, 2016 (Week 4: Neural Networks, Part 2)

2016-08-04 · by 上海王尔德

Next, let's understand it mathematically:

[Image: Neuron Network model1.png]

Here is a picture of a typical neural network. There are 4 layers: Layer 1 is the input layer; Layers 2 and 3 are hidden layers, which compute intermediate features that lead us to a more meaningful (abstract) representation; Layer 4 is the output layer.

[Image: answer1.png]

We want to compute the values in Layer 2. How do we do that? Suppose we are already given a 3x4 matrix Theta(1). Then a(2) is just g(Theta(1) * a(1)), done!
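As a quick sketch of that one-line computation (the actual course code is in Octave; the input values and weights below are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    # Logistic activation g(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes matching the text: 3 inputs plus a bias unit,
# and 3 units in Layer 2, so Theta(1) is a 3x4 matrix.
a1 = np.array([1.0, 0.5, -1.2, 0.3])  # a(1), with bias unit a0 = 1 prepended
Theta1 = np.random.randn(3, 4)        # Theta(1): maps Layer 1 -> Layer 2

a2 = sigmoid(Theta1 @ a1)             # a(2) = g(Theta(1) * a(1))
print(a2.shape)                       # one activation per Layer-2 unit
```

Note that the bias unit (a constant 1) is prepended to a(1) before the multiplication, which is why Theta(1) has 4 columns rather than 3.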

Next,

[Image: Neuron Network example of logic &&.png]

We can do even more complicated things with a neural network, such as logical operations.
By taking advantage of the sigmoid function, we can approximate any basic logic gate.
I believe the picture above explains everything!
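To make the picture concrete, here is a sketch of the logical AND gate built from a single sigmoid neuron. The weights (bias -30, each input +20) are the ones commonly used for this example in the course; the point is that the sigmoid saturates near 0 or 1, so the unit behaves like a gate:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# AND weights: g(-30 + 20*x1 + 20*x2) is close to 1 only when x1 = x2 = 1.
theta = np.array([-30.0, 20.0, 20.0])

def logical_and(x1, x2):
    a = np.array([1.0, x1, x2])       # prepend the bias unit
    return sigmoid(theta @ a) > 0.5   # threshold the activation

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, logical_and(x1, x2))
```

OR and NOT work the same way with different weights, and stacking such units in layers (as the hidden layers do) lets the network express compound expressions like XNOR.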
Finally,

[Image: Neuron Network Multicalss Classification.png]

Putting together all the cool things we've learned so far, it's time to see the true power of multiclass classification. How do we do that?
We simply encode the classes as distinct one-hot vectors like [1,0,0,0], [0,1,0,0], [0,0,1,0], etc.
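A minimal sketch of that encoding and how a prediction is read off (the output-layer activations below are made up for illustration):

```python
import numpy as np

# Hypothetical 4-class problem: each class label is a distinct unit vector,
# and the output layer has one unit per class.
labels = np.eye(4)                    # rows: [1,0,0,0], [0,1,0,0], ...

h = np.array([0.1, 0.05, 0.8, 0.02])  # hypothetical output-layer activations
predicted_class = int(np.argmax(h))   # pick the most activated output unit
print(predicted_class)
```

At prediction time the network outputs one activation per class, and we simply take the index of the largest one.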

You can find my complete code for this section at https://github.com/yhyu13/Coursera-Machine-Learning-Andrew-Ng (file name: predict.m).
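The predict.m file in the repo is Octave; the following is a hedged Python sketch of the same idea, a vectorized two-layer forward pass followed by argmax over classes. The layer sizes and random weights here are placeholders, not the actual course parameters:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(Theta1, Theta2, X):
    # Forward-propagate all m examples at once, then pick the
    # most activated output unit for each example.
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])   # add bias column to inputs
    a2 = sigmoid(a1 @ Theta1.T)            # hidden-layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])  # add bias column to Layer 2
    a3 = sigmoid(a2 @ Theta2.T)            # output-layer activations
    return np.argmax(a3, axis=1) + 1       # 1-indexed labels, Octave-style

# Smoke test with hypothetical sizes: 3 inputs, 5 hidden units, 3 classes.
Theta1 = np.random.randn(5, 4)  # 5 hidden units x (3 inputs + bias)
Theta2 = np.random.randn(3, 6)  # 3 classes x (5 hidden units + bias)
X = np.random.randn(2, 3)       # 2 examples
print(predict(Theta1, Theta2, X))
```

The `+ 1` mirrors Octave's 1-based indexing convention used in the course assignments.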
