Linguistically-Informed Self-Attention
2018-09-25
panda1942
- jointly predict parts of speech and predicates (a minimal sketch follows this item)
  - parts of speech: standard part-of-speech tagging, one tag per token.
  - predicates: predicate detection, a subtask of Semantic Role Labeling; it marks the predicates in a sentence, which are then used by the downstream SRL step.
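A minimal PyTorch sketch of how these two predictions can sit as plain linear classifiers on top of the shared encoder's per-token representations. The module name, hidden dimension, and tag-set sizes below are illustrative assumptions, not values taken from the paper.

```python
# Sketch: joint POS-tagging and predicate-detection heads over shared encoder output.
# hidden_dim / n_pos_tags and the class name are hypothetical, for illustration only.
import torch
import torch.nn as nn

class PosPredicateHeads(nn.Module):
    def __init__(self, hidden_dim=256, n_pos_tags=45):
        super().__init__()
        # POS tagging: one label per token.
        self.pos_classifier = nn.Linear(hidden_dim, n_pos_tags)
        # Predicate detection: binary decision per token (predicate / not predicate).
        self.predicate_classifier = nn.Linear(hidden_dim, 2)

    def forward(self, token_reprs):
        # token_reprs: (batch, seq_len, hidden_dim) from an intermediate encoder layer
        pos_logits = self.pos_classifier(token_reprs)               # (batch, seq_len, n_pos_tags)
        predicate_logits = self.predicate_classifier(token_reprs)   # (batch, seq_len, 2)
        return pos_logits, predicate_logits

# Usage on random features standing in for encoder output:
heads = PosPredicateHeads()
reprs = torch.randn(2, 10, 256)
pos_logits, pred_logits = heads(reprs)
print(pos_logits.shape, pred_logits.shape)  # torch.Size([2, 10, 45]) torch.Size([2, 10, 2])
```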
- perform parsing and attend to syntactic parse parents (see the sketch after this item)
  - Syntactically-informed self-attention, adapted from the deep biaffine parser of [2], performs dependency parsing. Note that this layer receives an explicit supervision signal: the attention matrix A_parse predicts each token's syntactic head, and Q_parse and K_parse are also used to compute the dependency label.
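Below is a rough PyTorch sketch of one syntactically-informed attention head under the biaffine formulation of [2]: Q_parse and K_parse are combined through a bilinear map to produce A_parse, which serves both as the attention weights of this head and, via a cross-entropy loss against gold heads, as the supervised head predictor. All names and dimensions are assumptions made for illustration; the dependency-label classifier is omitted.

```python
# Sketch: a single supervised attention head that predicts syntactic heads.
# hidden_dim / head_dim and the class name are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntacticAttentionHead(nn.Module):
    def __init__(self, hidden_dim=256, head_dim=64):
        super().__init__()
        self.q_parse = nn.Linear(hidden_dim, head_dim)
        self.k_parse = nn.Linear(hidden_dim, head_dim)
        self.v_parse = nn.Linear(hidden_dim, head_dim)
        # Biaffine transform U so that score[i, j] = q_i^T U k_j.
        self.U = nn.Parameter(torch.randn(head_dim, head_dim) * 0.01)

    def forward(self, token_reprs, gold_heads=None):
        # token_reprs: (batch, seq_len, hidden_dim)
        q = self.q_parse(token_reprs)                             # (batch, seq_len, head_dim)
        k = self.k_parse(token_reprs)
        v = self.v_parse(token_reprs)
        scores = torch.einsum('bid,de,bje->bij', q, self.U, k)    # (batch, seq_len, seq_len)
        a_parse = F.softmax(scores, dim=-1)                       # row i attends to token i's predicted head
        head_output = torch.matmul(a_parse, v)                    # used as one head of the multi-head layer

        loss = None
        if gold_heads is not None:
            # Supervision signal: each token's attention row should select its gold head.
            loss = F.cross_entropy(scores.reshape(-1, scores.size(-1)), gold_heads.reshape(-1))
        return head_output, a_parse, loss

# Usage: the predicted head of token i is the argmax over row i of A_parse.
layer = SyntacticAttentionHead()
reprs = torch.randn(2, 10, 256)
gold = torch.randint(0, 10, (2, 10))
out, a_parse, loss = layer(reprs, gold)
print(out.shape, a_parse.argmax(-1).shape, float(loss))
```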
- assigning semantic role labels (a sketch follows this list)
  - Given a predicate, predict each token's semantic role with respect to that predicate.
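A sketch of the role-labeling step as a bilinear scorer between the given predicate's representation and every token's representation, yielding one role distribution per (predicate, token) pair. The projections, role inventory size, and module names are hypothetical and only illustrate the shape of the classification.

```python
# Sketch: per-token semantic-role classification conditioned on one predicate.
# role_dim / n_roles and the class name are hypothetical.
import torch
import torch.nn as nn

class RoleLabeler(nn.Module):
    def __init__(self, hidden_dim=256, role_dim=128, n_roles=20):
        super().__init__()
        self.pred_proj = nn.Linear(hidden_dim, role_dim)   # predicate-side projection
        self.arg_proj = nn.Linear(hidden_dim, role_dim)    # argument-side projection
        # One bilinear map per role label: score[r] = pred^T W_r arg.
        self.bilinear = nn.Bilinear(role_dim, role_dim, n_roles)

    def forward(self, token_reprs, predicate_index):
        # token_reprs: (batch, seq_len, hidden_dim); predicate_index: (batch,)
        batch_idx = torch.arange(token_reprs.size(0))
        pred_repr = self.pred_proj(token_reprs[batch_idx, predicate_index])  # (batch, role_dim)
        arg_reprs = self.arg_proj(token_reprs)                               # (batch, seq_len, role_dim)
        seq_len = arg_reprs.size(1)
        pred_expanded = pred_repr.unsqueeze(1).expand(-1, seq_len, -1)       # broadcast over tokens
        role_logits = self.bilinear(pred_expanded.contiguous(), arg_reprs.contiguous())
        return role_logits  # (batch, seq_len, n_roles): role of each token w.r.t. the predicate

labeler = RoleLabeler()
reprs = torch.randn(2, 10, 256)
pred_idx = torch.tensor([3, 7])
print(labeler(reprs, pred_idx).shape)  # torch.Size([2, 10, 20])
```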
References:
- [1] Linguistically-Informed Self-Attention for Semantic Role Labeling
- [2] Deep Biaffine Attention for Neural Dependency Parsing
- [3] Deep Semantic Role Labeling: What Works and What’s Next
- http://www.hankcs.com/nlp/parsing/deep-biaffine-attention-for-neural-dependency-parsing.html
- https://blog.csdn.net/mingzai624/article/details/78061506