Machine Learning Notes (2) - Regression Analysis with sklearn

2017-10-23  by Spytensor

Part 1. Generalized Linear Models

1. Ordinary Least Squares
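Ordinary least squares fits the coefficient vector w of a linear model by minimizing the residual sum of squares between the observed targets y and the model's predictions:

```latex
\min_{w} \lVert X w - y \rVert_2^2
```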

Code example

from sklearn import linear_model
import numpy as np
import matplotlib.pyplot as plt

# Sample data: 12 (x, y) observation pairs, reshaped to column vectors
x = np.array([1.08, 1.12, 1.19, 1.28, 1.36, 1.48, 1.59, 1.68, 1.80, 1.87, 1.98, 2.07]).reshape(-1, 1)
y = np.array([2.25, 2.37, 2.40, 2.55, 2.64, 2.75, 2.92, 3.03, 3.14, 3.26, 3.36, 3.50]).reshape(-1, 1)

# Fit an ordinary least squares model
regr = linear_model.LinearRegression()
regr.fit(x, y)
print('The coef is: %f' % regr.coef_)
print('The intercept is: %f' % regr.intercept_)

# Plot the data points and the fitted line
Y = regr.predict(x)
plt.scatter(x, y)
plt.plot(x, Y)
plt.show()

The fitted plot:

(Figure: LR1.png, scatter of the data with the fitted regression line)
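Besides plotting, we can quantify the fit and predict new points with `score` (the R² coefficient of determination) and `predict`. A minimal sketch on the same data; the query point x = 1.50 is hypothetical, chosen just for illustration:

```python
from sklearn import linear_model
import numpy as np

x = np.array([1.08, 1.12, 1.19, 1.28, 1.36, 1.48, 1.59, 1.68, 1.80, 1.87, 1.98, 2.07]).reshape(-1, 1)
y = np.array([2.25, 2.37, 2.40, 2.55, 2.64, 2.75, 2.92, 3.03, 3.14, 3.26, 3.36, 3.50]).reshape(-1, 1)

regr = linear_model.LinearRegression().fit(x, y)

# R^2 on the training data (1.0 would be a perfect fit)
print('R^2: %.4f' % regr.score(x, y))

# Predict y at a new x value
x_new = np.array([[1.50]])
print('Prediction at x=1.50: %.4f' % regr.predict(x_new)[0, 0])
```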

2. Ridge Regression
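Ridge regression adds an L2 penalty on the coefficients to the least-squares objective; the penalty shrinks the weights and stabilizes the solution when the columns of X are nearly collinear, with α controlling the amount of shrinkage:

```latex
\min_{w} \lVert X w - y \rVert_2^2 + \alpha \lVert w \rVert_2^2
```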

Code example

import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model

# Create an ill-conditioned 10x10 Hilbert-style design matrix
x = 1. / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
y = np.ones(10)

# Sweep 200 regularization strengths from 1e-10 to 1e-2
n_alphas = 200
alphas = np.logspace(-10, -2, n_alphas)

# Fit a ridge model at each alpha and record its coefficients
coefs = []
for a in alphas:
    ridge = linear_model.Ridge(alpha=a, fit_intercept=False)
    ridge.fit(x, y)
    coefs.append(ridge.coef_)

# Plot each coefficient's path, with alpha decreasing left to right
ax = plt.gca()
ax.plot(alphas, coefs)
ax.set_xscale('log')
ax.set_xlim(ax.get_xlim()[::-1])
plt.xlabel('alpha')
plt.ylabel('weights')
plt.title('Ridge coefficients as a function of the regularization')
plt.axis('tight')
plt.show()
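In practice, alpha is usually chosen by cross-validation rather than inspected by eye; sklearn's `RidgeCV` automates this search. A minimal sketch on synthetic data (the candidate alpha grid and the true weights are assumptions made up for this example):

```python
import numpy as np
from sklearn import linear_model

rng = np.random.RandomState(0)
X = rng.randn(50, 3)
# Synthetic target: known weights plus a little Gaussian noise
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.randn(50)

# RidgeCV picks the best alpha from the candidates via cross-validation
reg = linear_model.RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0])
reg.fit(X, y)
print('Chosen alpha:', reg.alpha_)
print('Coefficients:', reg.coef_)
```

With so little noise, the recovered coefficients should land close to the true weights (1.5, -2.0, 0.5).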