Andrew Ng Machine Learning - ex5

2018-11-19  YANWeichuan
Regularized linear regression cost and gradient (linearRegCostFunction.m). The bias term theta(1) is excluded from the penalty:

J = 1 / (2 * m) * sum((X * theta - y) .^ 2) + lambda / (2 * m) * sum(theta(2:end) .^ 2);

% Copy theta and zero out the first entry so the bias is not regularized
grad_theta = theta;
grad_theta(1) = 0;
grad = 1 / m * (X' * (X * theta - y)) + lambda / m * grad_theta;

% Equivalent element-wise version:
% grad(1) = 1 / m * (X(:, 1)' * (X * theta - y));
% for j = 2 : size(theta, 1)
%     grad(j) = 1 / m * (X(:, j)' * (X * theta - y)) + lambda / m * theta(j);
% end
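Written out, the code above implements the standard regularized cost and gradient, with the bias term left out of the penalty:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta^T x^{(i)} - y^{(i)} \right)^2 + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2

\frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( \theta^T x^{(i)} - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \qquad (j \geq 1;\ \text{drop the } \tfrac{\lambda}{m}\theta_j \text{ term when } j = 0)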

Learning curves (learningCurve.m): for each training-set size i, fit theta on the first i examples, then measure the unregularized error (lambda = 0) on that subset and on the full validation set:

for i = 1 : m
    theta = trainLinearReg(X(1:i, :), y(1:i, :), lambda);
    error_train(i) = linearRegCostFunction(X(1:i, :), y(1:i, :), theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
    % Equivalently: error_val(i) = 1 / (2 * size(yval, 1)) * sum((Xval * theta - yval) .^ 2);
end
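To actually see the learning curves, a minimal plotting sketch (the labels are my own choice; ex5.m draws something similar):

plot(1:m, error_train, 1:m, error_val);
title('Learning curve for linear regression');
legend('Train', 'Cross Validation');
xlabel('Number of training examples');
ylabel('Error');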
Validation curve (validationCurve.m): train once per candidate lambda and record the errors, again evaluated with lambda = 0:

for i = 1 : length(lambda_vec)
    lambda = lambda_vec(i);
    theta = trainLinearReg(X, y, lambda);
    error_train(i) = linearRegCostFunction(X, y, theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
end
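lambda_vec comes from the exercise script; as far as I remember it is the roughly log-spaced grid below (treat the exact values as an assumption), and the two error vectors are then plotted against it:

lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';
plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');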
Polynomial feature mapping (polyFeatures.m): column i of X_poly holds X raised to the i-th power:

for i = 1 : p
    X_poly(:, i) = X .^ i;
end
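A quick worked example of the mapping (the input values here are made up for illustration):

X = [2; 3]; p = 3;
X_poly = zeros(numel(X), p);
for i = 1 : p
    X_poly(:, i) = X .^ i;
end
disp(X_poly)   % [2 4 8; 3 9 27]

In ex5 these features are then scaled with featureNormalize before training, since the higher powers differ from the raw values by orders of magnitude.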

[Figure: fitted curve with lambda = 0]

[Figure: training and validation error with lambda = 0]
