Fundamentals of PCA & SVD
2025-10-29
斑马上树
1. Basic Principles
Dimensionality reduction: PCA mainly uncovers linear relationships among features, while t-SNE specializes in nonlinear dimensionality reduction.
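To make the contrast concrete, here is a minimal sketch of reducing the same data with both methods via sklearn; the toy dataset and the specific t-SNE parameters (`perplexity`, `init`) are illustrative assumptions, not part of the original article.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# hypothetical toy data: 100 samples, 4 features
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 4))

# PCA: a linear projection onto the directions of maximal variance
pca_2d = PCA(n_components=2).fit_transform(data)

# t-SNE: a nonlinear embedding that preserves local neighborhoods
tsne_2d = TSNE(n_components=2, perplexity=30,
               init="pca", random_state=0).fit_transform(data)

print(pca_2d.shape, tsne_2d.shape)  # both (100, 2)
```

Note that PCA yields a reusable linear map (you can `transform` new data), whereas t-SNE only produces an embedding of the fitted samples.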
Dimensionality reduction is normally used to visualize data and to find hidden information we don't otherwise see. It is also used for optimization; of course, this is under the assumption that the reduced dataset remains representative of the actual dataset. You can check this through the explained variance.
2. Python Implementation
sklearn version
import numpy as np
from sklearn.decomposition import PCA

data = np.random.randn(10, 4)
pca = PCA()  # keep all components
pca.fit(data)
print(pca.components_)  # principal axes, one per row
print(pca.explained_variance_ratio_)

# extract the first 3 principal components from the data
pca = PCA(3)
pca.fit(data)
lowd = pca.transform(data)  # project down to 3 dimensions
pca.inverse_transform(lowd)  # map back to the original space (lossy)
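On the SVD side of the title: sklearn's PCA is computed via an SVD of the centered data matrix, so the same result can be reproduced with `numpy.linalg.svd`. This is a sketch of that correspondence; the comparisons use absolute values because SVD components are only determined up to sign.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
data = rng.normal(size=(10, 4))

# SVD of the centered data: X_centered = U @ diag(S) @ Vt
# the rows of Vt are the principal axes
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

pca = PCA().fit(data)

# each SVD axis matches the PCA component up to sign (unit vectors, |dot| = 1)
for v_svd, v_pca in zip(Vt, pca.components_):
    assert np.allclose(abs(v_svd @ v_pca), 1.0)

# the explained variances are the squared singular values / (n_samples - 1)
print(np.allclose(S**2 / (len(data) - 1), pca.explained_variance_))
```

This is why PCA scales well in practice: the heavy lifting is a standard, well-optimized SVD rather than an explicit eigendecomposition of the covariance matrix.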