Study Notes: RNN

2020-07-22  IT_小马哥

Source for this RNN material: 刘二大人's videos

RNNCell

cell = torch.nn.RNNCell(input_size=input_size, hidden_size=hidden_size)
hidden = cell(input, hidden)
import torch

batch_size = 1
seq_len = 5
input_size = 4
hidden_size = 2

cell = torch.nn.RNNCell(input_size=input_size, hidden_size=hidden_size)

# dataset: (seq_len, batch_size, input_size); hidden: (batch_size, hidden_size)
dataset = torch.randn(seq_len, batch_size, input_size)
hidden = torch.zeros(batch_size, hidden_size)

for idx, inputs in enumerate(dataset):
    print('='*20, idx, '='*20)
    print('inputs shape:{0}|'.format(inputs.shape))

    # feed one time step; the returned hidden state is reused at the next step
    hidden = cell(inputs, hidden)
    print('hidden shape:{0}'.format(hidden.shape))
    print(hidden)
# Output:
==================== 0 ====================
inputs shape:torch.Size([1, 4])|
hidden shape:torch.Size([1, 2])
tensor([[ 0.5790, -0.7934]], grad_fn=<TanhBackward>)
==================== 1 ====================
inputs shape:torch.Size([1, 4])|
hidden shape:torch.Size([1, 2])
tensor([[ 0.6467, -0.4657]], grad_fn=<TanhBackward>)
==================== 2 ====================
inputs shape:torch.Size([1, 4])|
hidden shape:torch.Size([1, 2])
tensor([[0.2735, 0.6325]], grad_fn=<TanhBackward>)
==================== 3 ====================
inputs shape:torch.Size([1, 4])|
hidden shape:torch.Size([1, 2])
tensor([[0.0558, 0.4998]], grad_fn=<TanhBackward>)
==================== 4 ====================
inputs shape:torch.Size([1, 4])|
hidden shape:torch.Size([1, 2])
tensor([[-0.1049, -0.4924]], grad_fn=<TanhBackward>)

RNN overall architecture

An RNN simply unrolls an RNNCell over the sequence, passing the hidden state from one step to the next:

for x in X:
    h = rnncell(x, h)
RNN in PyTorch
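A minimal sketch (my addition; the parameter values simply mirror the RNNCell example above): `torch.nn.RNN` consumes the whole sequence in one call instead of looping over an RNNCell by hand. It returns both the hidden state at every time step (`out`) and the final hidden state for each layer (`hidden`).

```python
import torch

batch_size = 1
seq_len = 5
input_size = 4
hidden_size = 2
num_layers = 1

rnn = torch.nn.RNN(input_size=input_size, hidden_size=hidden_size,
                   num_layers=num_layers)

# Feed the whole sequence at once: (seq_len, batch_size, input_size)
inputs = torch.randn(seq_len, batch_size, input_size)
# Initial hidden state: (num_layers, batch_size, hidden_size)
hidden = torch.zeros(num_layers, batch_size, hidden_size)

# out: hidden state at every step, (seq_len, batch_size, hidden_size)
# hidden: final hidden state per layer, (num_layers, batch_size, hidden_size)
out, hidden = rnn(inputs, hidden)
print('out shape:', out.shape)        # torch.Size([5, 1, 2])
print('hidden shape:', hidden.shape)  # torch.Size([1, 1, 2])
```

Note that with one layer, the last row of `out` equals `hidden`: the final time step's hidden state is the same tensor reported as the layer's final state.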