Chapter 2 - Neural Network Basics (Logistic Regression and Vectorization)

2017-10-29 · 人机分离机

2.1 Binary Classification
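In binary classification the input x (e.g. the pixels of an image) is mapped to a label y ∈ {0, 1}, "cat" vs. "non-cat" in the course's running example. The course stacks the m training examples as columns: X has shape (n_x, m) and the labels Y have shape (1, m). A minimal sketch of that layout (the values here are made up for illustration):

import numpy as np

# Hypothetical toy data: n_x = 3 features, m = 4 examples.
X = np.array([[0.1, 0.4, 0.7, 0.2],
              [0.5, 0.9, 0.3, 0.8],
              [0.2, 0.6, 0.5, 0.4]])   # each column is one example x^(i)
Y = np.array([[1, 0, 1, 0]])           # 0/1 label for each column of X
print(X.shape, Y.shape)                # (3, 4) (1, 4)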

2.2 Logistic Regression Model

[Figure: course2-3.jpg]
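The model: given x, the prediction is ŷ = σ(w^T x + b), where σ(z) = 1 / (1 + e^(-z)) squashes the linear score into (0, 1) so ŷ can be read as P(y = 1 | x). A minimal numpy sketch (the function names are mine, not from the course):

import numpy as np

def sigmoid(z):
    # σ(z) = 1 / (1 + e^(-z)), applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, x):
    # ŷ = σ(w^T x + b)
    return sigmoid(np.dot(w.T, x) + b)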

2.3 Logistic Regression Loss Function (Cost Function)
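The per-example loss and the cost averaged over all m examples:

L(ŷ, y) = -(y * log(ŷ) + (1 - y) * log(1 - ŷ))
J(w, b) = 1/m * Σ L(ŷ^(i), y^(i))

The loss is small when ŷ agrees with y, and unlike squared error it gives a convex cost for logistic regression, so gradient descent does not get stuck in local optima. In numpy, with A holding the predictions for all m examples, the cost is one line:

cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))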

2.4 Gradient Descent

[Figure: course2-5.jpg]
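Gradient descent repeatedly updates the parameters downhill on J with learning rate α:

w := w - α * dJ/dw
b := b - α * dJ/db

For logistic regression the per-example derivatives work out to dz = a - y, dw = x * dz, db = dz; section 2.6 below computes exactly these quantities for all m examples at once.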

2.5 Vectorization

Vectorized numpy operations replace explicit Python loops with optimized built-ins; the timing comparison below computes the same million-element dot product both ways.

import time
import numpy as np

a = np.random.rand(1000000)
b = np.random.rand(1000000)

tic = time.time()
c = np.dot(a,b)
toc = time.time()

print(c)
print("Vectorized version(向量化):" + str(1000*(toc-tic))+"ms")

c = 0
tic = time.time()
for i in range(1000000):
    c += a[i]*b[i]
toc = time.time()

print(c)
print("For loop(循环):" + str(1000*(toc-tic))+ "ms")
# 输出
249961.060873
Vectorized version(向量化):2.0003318786621094ms
249961.060873
For loop(循环):779.0098190307617ms
The same trick vectorizes logistic regression's forward step, computing all m linear scores at once:

Z = np.dot(w.T, X) + b

2.6 Highly Vectorized, Very Efficient Gradient Descent for Logistic Regression

[Figure: course2-6.jpg]
Z = w^T X + b = np.dot(w.T, X) + b
A = σ(Z)
dZ = A - Y
dw = 1/m * np.dot(X, dZ.T)
db = 1/m * np.sum(dZ)
w = w - α * dw
b = b - α * db
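Putting the whole recipe together, a runnable sketch of one iteration loop (the function name and defaults below are mine, not from the course):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, Y, alpha=0.01, num_iters=1000):
    # X: (n_x, m) inputs, Y: (1, m) labels in {0, 1}
    n_x, m = X.shape
    w = np.zeros((n_x, 1))
    b = 0.0
    for _ in range(num_iters):
        Z = np.dot(w.T, X) + b      # forward pass: all m linear scores, shape (1, m)
        A = sigmoid(Z)              # predictions ŷ for every example
        dZ = A - Y                  # dJ/dZ
        dw = np.dot(X, dZ.T) / m    # dJ/dw, shape (n_x, 1)
        db = np.sum(dZ) / m         # dJ/db
        w = w - alpha * dw          # gradient descent updates
        b = b - alpha * db
    return w, b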

2.7 Broadcasting in Python

import numpy as np
A = np.array([[56.0, 0.0, 4.4, 68.0],
              [1.2, 104.0, 52.0, 8.0],
              [1.8, 135.0, 99.0, 0.9]])
print(A)
[[  56.     0.     4.4   68. ]
 [   1.2  104.    52.     8. ]
 [   1.8  135.    99.     0.9]]
cal = A.sum(axis=0)                    # sum down the columns (axis=0): one total per column
print(cal)
[  59.   239.   155.4   76.9]
percentage = 100*A/cal.reshape(1,4)    # broadcasting: (3, 4) divided by (1, 4)
print(percentage)
[[ 94.91525424   0.           2.83140283  88.42652796]
 [  2.03389831  43.51464435  33.46203346  10.40312094]
 [  3.05084746  56.48535565  63.70656371   1.17035111]]
  1. axis tells the operation which axis to run along: in numpy, axis=0 is vertical (down the columns) and axis=1 is horizontal (across the rows), so A.sum(axis=0) produces one total per column.
  2. A/cal.reshape(1,4) invokes numpy's broadcasting mechanism: the (3, 4) matrix A is divided elementwise by the (1, 4) row vector cal, which numpy stretches across all three rows.
  3. The general numpy broadcasting rule: when an (m, n) array is combined elementwise (+, -, *, /) with a (1, n) or (m, 1) array, the smaller one is expanded by copying to (m, n) before the operation, as shown in the sketch after this list.
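A minimal sketch of that general rule (the values are arbitrary):

import numpy as np

v = np.array([[1.0], [2.0], [3.0]])       # shape (3, 1), a column vector
r = np.array([[10.0, 20.0, 30.0, 40.0]])  # shape (1, 4), a row vector
# (3, 1) + (1, 4) broadcasts both operands to (3, 4):
# v is copied across the 4 columns, r is copied down the 3 rows.
print(v + r)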