3. NLP with Sequence Models

2020-12-06  Kevin不会创作

Table of Contents

Neural Networks for Sentiment Analysis

Neural Networks in Trax

For simple architectures like a 3-layer NN, you can use a serial model.

from trax import layers as tl
model = tl.Serial(tl.Dense(4), tl.Sigmoid(),
                  tl.Dense(4), tl.Sigmoid(),
                  tl.Dense(3), tl.Softmax())

RNN for Language Modeling

Recurrent Neural Networks

Gated Recurrent Unit

A Gated Recurrent Unit (GRU) has gates that control how much information to forget from the past and how much information to extract from the current input.
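As a rough sketch of how those gates interact (plain NumPy, with made-up dimensions; weight names and the omission of bias terms are my simplifications, not the course's implementation):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def gru_cell(x_t, h_prev, params):
    """One GRU step: gates decide what to keep from the past and the input."""
    z = np.concatenate([h_prev, x_t])
    # Update gate: how much of the new candidate replaces the old state.
    u = sigmoid(params["Wu"] @ z)
    # Reset (relevance) gate: how much past state feeds the candidate.
    r = sigmoid(params["Wr"] @ z)
    # Candidate hidden state from the (reset) past and the current input.
    h_tilde = np.tanh(params["Wh"] @ np.concatenate([r * h_prev, x_t]))
    # Blend old state and candidate according to the update gate.
    return (1 - u) * h_prev + u * h_tilde

rng = np.random.default_rng(0)
d_h, d_x = 4, 3  # hypothetical hidden and input sizes
params = {k: rng.normal(size=(d_h, d_h + d_x)) for k in ("Wu", "Wr", "Wh")}
h = gru_cell(rng.normal(size=d_x), np.zeros(d_h), params)
print(h.shape)  # -> (4,)
```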

(Figure: GRU)

Bi-directional RNNs

In bi-directional RNNs, the outputs draw on information from both the past and the future.
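A minimal sketch of that idea (plain NumPy with a vanilla tanh RNN cell and made-up sizes; real implementations would use GRU or LSTM cells): one pass reads left to right, another right to left, and each position's output concatenates both.

```python
import numpy as np

def rnn_step(x_t, h_prev, W):
    # Simplest possible recurrent cell (no bias, for brevity).
    return np.tanh(W @ np.concatenate([h_prev, x_t]))

def bidirectional_rnn(xs, W_fwd, W_bwd, d_h):
    # Forward pass reads the sequence left to right (past context)...
    h, fwd = np.zeros(d_h), []
    for x in xs:
        h = rnn_step(x, h, W_fwd)
        fwd.append(h)
    # ...backward pass reads it right to left (future context).
    h, bwd = np.zeros(d_h), []
    for x in reversed(xs):
        h = rnn_step(x, h, W_bwd)
        bwd.append(h)
    bwd.reverse()
    # Each output combines past and future information.
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

rng = np.random.default_rng(0)
d_x, d_h = 3, 4
xs = [rng.normal(size=d_x) for _ in range(5)]
W_fwd = rng.normal(size=(d_h, d_h + d_x))
W_bwd = rng.normal(size=(d_h, d_h + d_x))
outs = bidirectional_rnn(xs, W_fwd, W_bwd, d_h)
```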

(Figure: Bi-directional RNNs)

Deep RNNs

Deep RNNs stack more than one recurrent layer, which helps with complex tasks.
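The stacking can be sketched like this (plain NumPy, simple tanh cells, hypothetical sizes): each layer's hidden states become the input sequence for the layer above it.

```python
import numpy as np

def deep_rnn(xs, layer_weights, d_h):
    """Run a stack of simple recurrent layers over a sequence."""
    seq = xs
    for W in layer_weights:
        h, outs = np.zeros(d_h), []
        for x in seq:
            h = np.tanh(W @ np.concatenate([h, x]))
            outs.append(h)
        # The hidden states feed the next layer as its inputs.
        seq = outs
    return seq

rng = np.random.default_rng(0)
d_x, d_h = 3, 4
xs = [rng.normal(size=d_x) for _ in range(5)]
weights = [rng.normal(size=(d_h, d_h + d_x)),   # first layer sees raw inputs
           rng.normal(size=(d_h, d_h + d_h))]   # deeper layers see hidden states
outs = deep_rnn(xs, weights, d_h)
```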

(Figure: Deep RNNs)

LSTMs and Named Entity Recognition

Long short-term memory

Long short-term memory (LSTM) networks are the best-known solution to the vanishing gradient problem.
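A sketch of one LSTM step (plain NumPy, biases omitted, made-up dimensions): the key point is that the cell state is updated additively, which lets gradients flow through long sequences.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_cell(x_t, h_prev, c_prev, params):
    """One LSTM step with forget, input, and output gates."""
    z = np.concatenate([h_prev, x_t])
    f = sigmoid(params["Wf"] @ z)       # forget gate: what to drop from the cell
    i = sigmoid(params["Wi"] @ z)       # input gate: what new info to admit
    o = sigmoid(params["Wo"] @ z)       # output gate: what to expose as h_t
    c_tilde = np.tanh(params["Wc"] @ z)
    # Additive cell update: this is what keeps gradients from vanishing.
    c_t = f * c_prev + i * c_tilde
    h_t = o * np.tanh(c_t)
    return h_t, c_t

rng = np.random.default_rng(0)
d_h, d_x = 4, 3
params = {k: rng.normal(size=(d_h, d_h + d_x))
          for k in ("Wf", "Wi", "Wo", "Wc")}
h, c = lstm_cell(rng.normal(size=d_x), np.zeros(d_h), np.zeros(d_h), params)
```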

Named Entity Recognition

Named entity recognition (NER) is a fast and efficient way to scan text for certain kinds of information. NER systems locate and extract named entities from texts. Named entities can be anything from a place to an organization, to a person's name. They can even be times and dates.
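To make the output format concrete, here is a toy example (the sentence, the B-/I-/O tag scheme, and the helper function are my illustration, not taken from any particular NER system): each token gets a tag, and consecutive tags are grouped back into entities.

```python
def extract_entities(tokens, tags):
    """Group B-/I-/O token tags back into (label, text) entities."""
    entities, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):                 # beginning of a new entity
            if current:
                entities.append(current)
            current = (tag[2:], [tok])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)               # continuation of the entity
        else:                                    # "O": outside any entity
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(label, " ".join(words)) for label, words in entities]

tokens = ["Sharon", "flew", "to", "Miami", "last", "Friday"]
tags   = ["B-per",  "O",    "O",  "B-geo", "B-tim", "I-tim"]
found = extract_entities(tokens, tags)
print(found)  # -> [('per', 'Sharon'), ('geo', 'Miami'), ('tim', 'last Friday')]
```

This covers the kinds of entities mentioned above: a person, a place, and a date.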

Siamese Networks

A Siamese network is a neural network made up of two identical subnetworks whose outputs are merged at the end.
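The defining trick is that both branches share the same weights; the merge is often a similarity score. A minimal sketch (plain NumPy, a one-layer encoder and cosine similarity as the merge, all dimensions made up):

```python
import numpy as np

def encode(x, W):
    # Shared encoder: both branches use the *same* weight matrix W.
    return np.tanh(W @ x)

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def siamese(x1, x2, W):
    # Run each input through the identical encoder, then merge by similarity.
    v1, v2 = encode(x1, W), encode(x2, W)
    return cosine_similarity(v1, v2)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))  # hypothetical embedding size 4, input size 3
x = rng.normal(size=3)
score = siamese(x, x, W)     # identical inputs -> similarity of 1.0
```

Because the weights are shared, similar inputs are mapped to nearby vectors, which is what makes Siamese networks useful for tasks like duplicate-question detection.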
