Linear Algebra Fundamentals

2018-11-01  JasonJe

Linear Algebra

Vectors

\vec{\text x} = \left[ \begin{matrix} x_1\\ x_2\\ \vdots \\ x_n\\ \end{matrix} \right]

1. Linear Dependence and Linear Independence

A set of vectors \vec{\text v_1}, \vec{\text v_2}, \cdots, \vec{\text v_n} is linearly dependent if there exist real numbers a_1, a_2, \cdots, a_n, not all zero, such that:

\sum_{i = 1} ^{n} a_i \vec{\text v_i} = \vec{\text 0}

That is, at least one of the vectors can be expressed as a linear combination of the others.

The set is linearly independent if and only if

\sum_{i = 1} ^{n} a_i\vec{\text v_i} = \vec{\text 0}

holds only when a_i = 0, i = 1, 2, \cdots, n. In that case, any vector \vec{\beta} that can be expressed as a linear combination of the vectors in the set has a unique such representation.
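A quick numerical check (a sketch, not part of the original text): stacking the vectors as rows and comparing the matrix rank with the number of vectors tests for linear dependence.

```python
import numpy as np

# rank < number of vectors  =>  the set is linearly dependent
v1 = np.array([1, 2, 3])
v2 = np.array([2, 4, 6])   # v2 = 2 * v1, so the set is dependent
v3 = np.array([1, 0, 0])

vectors = np.vstack([v1, v2, v3])
rank = np.linalg.matrix_rank(vectors)
is_dependent = rank < vectors.shape[0]  # True here, since rank == 2 < 3
```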

2. Dimension of a Vector Space

The maximum number of linearly independent vectors contained in a vector space is called the dimension of the vector space.

3. Dot Product (Inner Product) of Vectors

\vec{\text{u}}\cdot \vec{\text{v}} = u_{1}v_{1} + u_{2}v_{2} + \cdots + u_{n}v_{n} = |\vec{\text{u}}||\vec{\text{v}}| \cos(\vec{\text{u}}, \vec{\text{v}}) = \vec{\text{u}}^\text T\vec{\text{v}}

import numpy as np

u, v = np.array([1, 2, 3]), np.array([4, 5, 6])

uv = u.dot(v)      # method form
uv = np.dot(u, v)  # equivalent function form
uv
32

4. Cross Product (Outer Product) of 3-D Vectors

\vec{\text{w}} = \vec{\text{u}} \times \vec{\text{v}} = \left| \begin{matrix} \vec{\text{i}} & \vec{\text{j}} & \vec{\text{k}} \\ u_x & u_y & u_z \\ v_x & v_y & v_z \end{matrix} \right| \\ = (u_yv_z - u_zv_y)\vec{\text{i}} - (u_xv_z - u_zv_x)\vec{\text{j}} + (u_xv_y - u_yv_x)\vec{\text{k}}

where \vec{\text{i}}, \vec{\text{j}}, \vec{\text{k}} are the unit vectors along the x, y, and z axes.

\vec{\text{u}} = u_x\vec{\text{i}} + u_y\vec{\text{j}} + u_z\vec{\text{k}}, \\ \vec{\text{v}} = v_x\vec{\text{i}} + v_y\vec{\text{j}} + v_z\vec{\text{k}}

import numpy as np

u, v = np.array([1, 2, 3]), np.array([4, 5, 6])

uv = np.cross(u, v) # the result is orthogonal to both u and v
uv
array([-3,  6, -3])

5. Scalar Triple Product of 3-D Vectors

\left[ \vec{\text{u}}\,\vec{\text v}\,\vec{\text w} \right] = (\vec{\text u}\times\vec{\text v})\cdot \vec{\text w} = \vec{\text u} \cdot (\vec{\text v} \times \vec{\text w}) = \left| \begin{matrix} u_x & u_y & u_z \\ v_x & v_y & v_z \\ w_x & w_y & w_z \end{matrix} \right| = \left| \begin{matrix} u_x & v_x & w_x \\ u_y & v_y & w_y \\ u_z & v_z & w_z \end{matrix} \right|
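The two forms above can be checked numerically; a minimal sketch with example vectors (chosen arbitrarily here):

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])
w = np.array([7, 8, 10])

triple_cross = np.dot(np.cross(u, v), w)         # (u x v) . w
triple_det = np.linalg.det(np.array([u, v, w]))  # determinant form

# both expressions give the same value
```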

6. Dyadic Product of 3-D Vectors

\vec{\text u}\vec{ \text v} = \left[ \begin{matrix} u_xv_x & u_xv_y & u_xv_z \\ u_yv_x & u_yv_y & u_yv_z \\ u_zv_x & u_zv_y & u_zv_z \\ \end{matrix} \right]

It is also written as \vec{\text u} \otimes \vec{\text v} or \vec{\text u}\vec{\text v}^\text T.

import numpy as np

u, v = np.array([1, 2, 3]), np.array([4, 5, 6])

uv = np.outer(u, v)
uv
array([[ 4,  5,  6],
       [ 8, 10, 12],
       [12, 15, 18]])

7. Gram-Schmidt Orthogonalization

Let \alpha_1, \alpha_2, \dots, \alpha_m (m \leq n) be a linearly independent set of vectors in R^n. If we let

\beta_1 = \alpha_1 \\ \beta_2 = \alpha_2 - \frac{\left\langle \alpha_2, \beta_1\right\rangle}{\left\langle\beta_1, \beta_1\right\rangle}\beta_1 \\ \vdots \\ \beta_m = \alpha_m - \frac{\left\langle \alpha_m, \beta_1 \right\rangle}{\left\langle \beta_1, \beta_1 \right\rangle}\beta_1 - \frac{\left\langle \alpha_m, \beta_2 \right\rangle}{\left\langle \beta_2, \beta_2 \right\rangle}\beta_2 - \dots - \frac{\left\langle \alpha_m, \beta_{m-1} \right\rangle}{\left\langle \beta_{m-1}, \beta_{m -1}\right\rangle}\beta_{m-1}

then \beta_1, \beta_2, \dots, \beta_m is an orthogonal set of vectors. If we further let

e_i = \frac{\beta_i}{|| \beta_i ||}(i = 1, 2, \dots, m)

we obtain an orthonormal set of vectors e_1, e_2, \dots, e_m, and this set is equivalent to \alpha_1, \alpha_2, \dots, \alpha_m (it spans the same subspace).

import numpy as np

A = np.array([[1,1,6],  ## np.linalg.qr orthonormalizes the column vectors;
              [1,2,4],  ## here the third column is a linear combination of the
              [1,3,2]]) ## first two, which is why the last row of r is zero

q, r = np.linalg.qr(A)
q, r
(array([[-5.77350269e-01,  7.07106781e-01,  4.08248290e-01],
        [-5.77350269e-01,  5.55111512e-17, -8.16496581e-01],
        [-5.77350269e-01, -7.07106781e-01,  4.08248290e-01]]),
 array([[-1.73205081, -3.46410162, -6.92820323],
        [ 0.        , -1.41421356,  2.82842712],
        [ 0.        ,  0.        ,  0.        ]]))
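The projection formulas above can also be implemented directly (a sketch; unlike np.linalg.qr, which may flip signs, this follows the formulas literally):

```python
import numpy as np

def gram_schmidt(alphas):
    """Orthonormalize linearly independent vectors via the
    Gram-Schmidt projection formulas."""
    es = []
    for a in alphas:
        b = a.astype(float)
        for e in es:
            b -= np.dot(a, e) * e        # subtract the projection onto e
        es.append(b / np.linalg.norm(b)) # normalize beta_i to get e_i
    return np.array(es)

# example vectors (chosen to be linearly independent)
alphas = [np.array([1, 1, 1]), np.array([1, 2, 3]), np.array([1, 0, 0])]
E = gram_schmidt(alphas)   # rows of E form an orthonormal set
```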

Matrices

\text{A} = \left[ \begin{matrix} a_{11} & a_{12} & \dots & a_{1n}\\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots &\vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{matrix} \right]

1. Matrix Operations
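As a quick illustration (a sketch with arbitrary example matrices), the basic operations in NumPy:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

added = A + B      # element-wise addition
scaled = 2 * A     # scalar multiplication
product = A @ B    # matrix multiplication (rows of A times columns of B)
```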

2. \text{A}^T, \text{A}^{-1}, \text{A}^*

\text{A}^T: the transpose;
\text{A}^{-1}: the inverse of the matrix;
\text{A}^*: the adjugate matrix.


import numpy as np

A = np.mat([[-3, 2, -5], [-1, 0, -2], [3, -4, 1]])
print(A)
print(A.T) # transpose
print(A.I) # inverse
print(np.linalg.det(A)) # determinant
print(np.dot(np.linalg.det(A), A.I)) # adjugate matrix: A* = det(A) * A^(-1)
[[-3  2 -5]
 [-1  0 -2]
 [ 3 -4  1]]
[[-3 -1  3]
 [ 2  0 -4]
 [-5 -2  1]]
[[ 1.33333333 -3.          0.66666667]
 [ 0.83333333 -2.          0.16666667]
 [-0.66666667  1.         -0.33333333]]
-6.0
[[-8. 18. -4.]
 [-5. 12. -1.]
 [ 4. -6.  2.]]

3. Rank of a Matrix

import numpy as np
A = np.array([[1, 1], [2, 2]]) # rank = 1
# A = np.array([[1, 2], [3, 4]]) # rank = 2
A_rank = np.linalg.matrix_rank(A)
A_rank
1

4. Eigenvalues and Eigenvectors

import numpy as np

A = np.mat([[-1, 1, 0], [-4, 3, 0], [1, 0, 2]])
eigenvalues, eigenvectors = np.linalg.eig(A) # eigenvectors are the columns
eigenvalues, eigenvectors
(array([2., 1., 1.]), matrix([[ 0.        ,  0.40824829,  0.40824829],
         [ 0.        ,  0.81649658,  0.81649658],
         [ 1.        , -0.40824829, -0.40824829]]))
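Each eigenpair should satisfy \text{A}\vec{v} = \lambda\vec{v}; a quick verification of the result above:

```python
import numpy as np

A = np.array([[-1, 1, 0], [-4, 3, 0], [1, 0, 2]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# column i of eigenvectors is the eigenvector for eigenvalues[i]
ok = all(np.allclose(A @ eigenvectors[:, i], eigenvalues[i] * eigenvectors[:, i])
         for i in range(len(eigenvalues)))
```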

5. Similar Matrices

Matrices \text{A} and \text{B} are similar if there exists an invertible matrix \text{P} such that \text{B} = \text{P}^{-1}\text{A}\text{P}. Similar matrices have the same eigenvalues, trace, and determinant.
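As a sketch (with an arbitrarily chosen invertible P), similarity B = P^{-1}AP preserves the eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # any invertible matrix (chosen arbitrarily)

B = np.linalg.inv(P) @ A @ P    # B is similar to A

eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
# eig_A and eig_B agree
```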
