Tensors in torch

2020-04-17  JerryLoveCoding

A tensor is essentially the same kind of object as a numpy array, vector, or matrix, but it is designed with GPUs in mind: a tensor can live on a GPU, which speeds up computation.
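For example (a minimal sketch; it assumes PyTorch is installed and simply falls back to the CPU when no GPU is available):

import torch

# use the GPU if one is available, otherwise stay on the CPU
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.rand(2, 2, device=device)  # the tensor is created directly on that device
print(x.device)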

Defining a tensor

Defining a tensor in Torch is similar to defining a matrix in numpy. For example, to define a 5×3 tensor filled with zeros:

x = torch.zeros(5, 3)

Tensors can also be initialized in other ways, for example from a numpy array or with random values, as sketched below.
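A minimal sketch of a few common initialization forms (the shapes and values here are only illustrative):

import numpy as np
import torch

a = torch.tensor([[1, 2], [3, 4]])     # from a nested Python list
b = torch.from_numpy(np.ones((5, 3)))  # from a numpy array (shares memory with it)
c = torch.rand(5, 3)                   # uniform random values in [0, 1)
d = torch.randn(5, 3)                  # samples from a standard normal distribution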

A tensor has the following attributes (see the sketch after the list):

.data: the underlying data (using .detach() is recommended instead)
.dtype: the data type of the tensor, e.g. torch.float32
.shape: the shape of the tensor
.device: the device the tensor lives on
.requires_grad: whether gradients need to be computed for this tensor
.grad: the gradient of the data
.grad_fn: the function that created the tensor
.is_leaf: whether the tensor is a leaf node
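A small sketch of how these attributes look on a concrete tensor (the exact values depend on the run; the tiny backward pass is only there so that .grad and .grad_fn get populated):

import torch

x = torch.rand(2, 3, requires_grad=True)
y = (x * 2).sum()   # build a tiny graph so that .grad and .grad_fn are filled in
y.backward()

print(x.dtype)          # torch.float32
print(x.shape)          # torch.Size([2, 3])
print(x.device)         # cpu (or cuda:0 if the tensor lives on a GPU)
print(x.requires_grad)  # True
print(x.grad)           # gradient of y with respect to x (all 2s here)
print(x.grad_fn)        # None, because x was created directly by the user
print(x.is_leaf)        # True, for the same reason
print(y.grad_fn)        # <SumBackward0 ...>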

Tensor operations

import torch

a = torch.rand(2, 3)
print(a)
b = torch.cat((a, a), dim=0)  # concatenate along rows: result is 4x3
print(b)
c = torch.cat((a, a), dim=1)  # concatenate along columns: result is 2x6
print(c)

Output:
tensor([[0.6646, 0.8589, 0.5251],
        [0.2515, 0.4547, 0.6944]])
tensor([[0.6646, 0.8589, 0.5251],
        [0.2515, 0.4547, 0.6944],
        [0.6646, 0.8589, 0.5251],
        [0.2515, 0.4547, 0.6944]])
tensor([[0.6646, 0.8589, 0.5251, 0.6646, 0.8589, 0.5251],
        [0.2515, 0.4547, 0.6944, 0.2515, 0.4547, 0.6944]])
a = torch.rand(2, 3)
print(a)
b = torch.chunk(a, 2, 0)  # split into 2 chunks along dim 0: two 1x3 tensors
print(b)
c = torch.chunk(a, 3, 1)  # split into 3 chunks along dim 1: three 2x1 tensors
print(c)

Output:
tensor([[0.6142, 0.8066, 0.9073],
        [0.7619, 0.1243, 0.3439]])
(tensor([[0.6142, 0.8066, 0.9073]]), tensor([[0.7619, 0.1243, 0.3439]]))
(tensor([[0.6142],
        [0.7619]]), tensor([[0.8066],
        [0.1243]]), tensor([[0.9073],
        [0.3439]]))

Converting between torch tensors and numpy arrays:
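A minimal sketch of both directions; on the CPU both conversions share memory with the original data, so in-place changes are visible on both sides:

import torch

t = torch.ones(3)
n = t.numpy()             # tensor -> numpy array (shares memory)
t2 = torch.from_numpy(n)  # numpy array -> tensor (also shares memory)

t.add_(1)                 # the in-place update shows up in n and t2 as well
print(t, n, t2)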

Arithmetic in torch

a = torch.rand(3)
b = torch.rand(3)
print(a)
print(b)
c = torch.add(a, b)  # out-of-place: returns a new tensor, a and b are unchanged
b.add_(a)            # the trailing underscore means in-place: b itself is modified
print(b, c)

Output:
tensor([0.5786, 0.5824, 0.7091])
tensor([0.6576, 0.8690, 0.0564])
tensor([1.2362, 1.4514, 0.7655]) tensor([1.2362, 1.4514, 0.7655])