Machine Learning

Neural networks

2020-01-28  spraysss

Layer

a^{(j)}_i is the "activation" of unit i in layer j
\Theta^{(j)} is the matrix of weights controlling the function mapping from layer j to layer j+1

a^{(2)}_1=g(\Theta_{10}^{(1)}x_0+\Theta_{11}^{(1)}x_1+\Theta_{12}^{(1)}x_2+\Theta_{13}^{(1)}x_3)
a^{(2)}_2=g(\Theta_{20}^{(1)}x_0+\Theta_{21}^{(1)}x_1+\Theta_{22}^{(1)}x_2+\Theta_{23}^{(1)}x_3)
a^{(2)}_3=g(\Theta_{30}^{(1)}x_0+\Theta_{31}^{(1)}x_1+\Theta_{32}^{(1)}x_2+\Theta_{33}^{(1)}x_3)
h_\Theta(x)=a^{(3)}_1=g(\Theta_{10}^{(2)}a^{(2)}_0+\Theta_{11}^{(2)}a^{(2)}_1+\Theta_{12}^{(2)}a^{(2)}_2+\Theta_{13}^{(2)}a^{(2)}_3)

If the network has s_j units in layer j and s_{j+1} units in layer j+1, then \Theta^{(j)} will be of dimension s_{j+1}\times (s_j+1).
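A minimal NumPy sketch of this forward pass, assuming a sigmoid activation g and a network with 3 input units, 3 hidden units, and 1 output unit (so \Theta^{(1)} is 3×4 and \Theta^{(2)} is 1×4); the weight values below are placeholders, not values from the text:

```python
import numpy as np

def g(z):
    """Sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes: s_1 = 3 inputs, s_2 = 3 hidden units, s_3 = 1 output unit.
# Theta1 maps layer 1 -> layer 2, shape s_2 x (s_1 + 1) = 3 x 4.
# Theta2 maps layer 2 -> layer 3, shape s_3 x (s_2 + 1) = 1 x 4.
Theta1 = np.random.randn(3, 4)   # placeholder weights
Theta2 = np.random.randn(1, 4)   # placeholder weights

x = np.array([0.5, -1.2, 3.0])   # one example with 3 features

a1 = np.concatenate(([1.0], x))      # add bias unit x_0 = 1
a2 = g(Theta1 @ a1)                  # activations a^{(2)}_1 .. a^{(2)}_3
a2 = np.concatenate(([1.0], a2))     # add bias unit a^{(2)}_0 = 1
h  = g(Theta2 @ a2)                  # h_Theta(x) = a^{(3)}_1

print(a2.shape, h.shape)             # (4,) (1,)
```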

Implementing XNOR with a neural network
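A sketch of the usual construction: one hidden unit computes roughly x_1 AND x_2, another computes roughly (NOT x_1) AND (NOT x_2), and the output unit ORs them, which gives XNOR. The weight values below are one common choice that makes the sigmoid saturate near 0 or 1; they are illustrative, not the only ones that work.

```python
import numpy as np

def g(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hidden layer: a_1 ~ x1 AND x2, a_2 ~ (NOT x1) AND (NOT x2)
Theta1 = np.array([[-30.0,  20.0,  20.0],
                   [ 10.0, -20.0, -20.0]])
# Output layer: h ~ a_1 OR a_2, i.e. x1 XNOR x2
Theta2 = np.array([[-10.0, 20.0, 20.0]])

def xnor(x1, x2):
    a1 = np.array([1.0, x1, x2])                   # input with bias unit
    a2 = np.concatenate(([1.0], g(Theta1 @ a1)))   # hidden activations with bias
    return g(Theta2 @ a2)[0]

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xnor(x1, x2)))         # prints 1, 0, 0, 1
```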

Multi-class classification with neural networks
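For K classes the output layer has K units, so h_\Theta(x) \in \mathbb{R}^K, the label y is written as a one-hot vector, and the predicted class is the index of the largest output. A hedged sketch with layer sizes and random weights chosen only for illustration:

```python
import numpy as np

def g(z):
    return 1.0 / (1.0 + np.exp(-z))

K = 4                              # e.g. 4 output classes
Theta1 = np.random.randn(5, 4)     # layer 1 (3 features) -> layer 2 (5 units): 5 x (3+1)
Theta2 = np.random.randn(K, 6)     # layer 2 -> output layer with K units: K x (5+1)

def predict(x):
    a1 = np.concatenate(([1.0], x))
    a2 = np.concatenate(([1.0], g(Theta1 @ a1)))
    h  = g(Theta2 @ a2)            # h_Theta(x) in R^K
    return np.argmax(h)            # predicted class index

x = np.array([0.1, -0.4, 2.3])
print(predict(x))                  # one of 0..K-1

# Training target for class 2, written as a one-hot vector:
y = np.zeros(K)
y[2] = 1.0
```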

