
《李宏毅·机器学习》 Homework 1: Hand-Written Linear Regression in Python, v1

2019-03-03  Spareribs

Looking at the assignment more carefully, what actually needs to be implemented is just the part circled in the red box on the assignment sheet. Part of it is fairly simple; the harder pieces are these three parts: the cost function, gradient descent, and the normal equation.

Code Analysis

Cost Function

J(\theta) =\frac{\sum ( X\theta - y)^2}{2m} \tag{1}

import numpy as np


def compute_cost(x, y, theta):
    """Cost function, summation form (1): sum of squared errors over 2m."""
    m = len(y)
    J = np.sum(np.square(x.dot(theta) - y)) / (2.0 * m)
    return J

J(\theta)=\frac{1}{2m}(X\theta - y)^T(X\theta - y) \tag{2}

def compute_cost_multi(X, y, theta):
    """Cost function, vectorized form (2): (X@theta - y)^T (X@theta - y) / (2m)."""
    m = len(y)
    diff = X.dot(theta) - y
    J = 1.0 / (2 * m) * diff.T.dot(diff)
    return J
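
As a quick sanity check, here is a minimal sketch with made-up toy data (not the assignment data); the leading column of ones is the intercept term that is normally prepended to the features before training:

# Toy data following y = 1 + 2*x, so theta = [1, 2] gives zero cost
x_raw = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
X = np.column_stack([np.ones_like(x_raw), x_raw])  # prepend the intercept column

print(compute_cost(X, y, np.array([1.0, 2.0])))        # -> 0.0
print(compute_cost_multi(X, y, np.array([0.0, 0.0])))  # -> about 13.83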

Gradient Descent

\theta = \theta - \frac{\alpha}{m} (X\theta - y)^T X \tag{3}

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent; records the cost after every update."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        # Update rule (3); re-binding theta avoids mutating the caller's array in place
        theta = theta - alpha / m * ((X.dot(theta) - y).T.dot(X))
        J_history[i] = compute_cost(X, y, theta)
    return theta, J_history
    

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Same update as gradient_descent, but records the vectorized cost (2)."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        theta = theta - alpha / m * ((X.dot(theta) - y).T.dot(X))
        J_history[i] = compute_cost_multi(X, y, theta)
    return theta, J_history
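
A minimal usage sketch on the same toy data as above (the learning rate and iteration count here are made-up values, not the ones from the assignment); J_history should decrease steadily when alpha is small enough:

theta0 = np.zeros(2)                      # start from the zero vector
theta_gd, J_history = gradient_descent(X, y, theta0, alpha=0.1, num_iters=2000)
print(theta_gd)                           # should approach [1.0, 2.0] on the toy data
print(J_history[0], J_history[-1])        # the cost should have dropped close to 0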
    

Normal Equation (Least Squares)

\theta=(X^TX)^{-1}X^Ty \tag{4}

def normal_eqn(X, y):
    """Closed-form solution (4); pinv keeps this stable even if X^T X is singular."""
    theta = np.linalg.pinv(X.T.dot(X)).dot(X.T).dot(y)
    return theta
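
On the same toy data the closed-form solution recovers the parameters in a single step, which also works as a cross-check of the gradient-descent result (again just a sketch under the toy-data assumptions above):

theta_exact = normal_eqn(X, y)
print(theta_exact)                                     # -> roughly [1.0, 2.0]
print(np.allclose(theta_exact, theta_gd, atol=1e-2))   # the two methods should agree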

What I really came to appreciate is that writing the implementation by hand goes in this order:

  1. First get comfortable with matrix arithmetic and the basic matrix operations in Python (NumPy).
  2. Then make sure you understand the theory and what needs to be done.
  3. Only then write the concrete implementation.
