[NLP] BERT Model Compression

2021-11-22  nlpming

Reference Papers

-【Pre-trained Models Survey】Pre-trained Models for Natural Language Processing: A Survey
https://arxiv.org/pdf/2003.08271.pdf (by Prof. Xipeng Qiu; video lecture: https://www.bilibili.com/video/BV16K4y1475Z/)
-【BERT Compression Survey】Compressing Large-Scale Transformer-Based Models: A Case Study on BERT
https://arxiv.org/abs/2002.11985
-【ALBERT】ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (factorized embeddings and cross-layer parameter sharing; see the first sketch after this list)
https://arxiv.org/abs/1909.11942
-【DistilBERT】DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (knowledge distillation; see the second sketch after this list)
https://arxiv.org/abs/1910.01108
-【TinyBERT】TinyBERT: Distilling BERT for Natural Language Understanding
https://arxiv.org/abs/1909.10351
-【MobileBERT】MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
https://arxiv.org/abs/2004.02984
-【Q-BERT】Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
https://arxiv.org/abs/1909.05840
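
For reference, here is a minimal PyTorch sketch of ALBERT's factorized embedding parameterization: the full V×H embedding matrix is replaced by a V×E lookup followed by an E×H projection, shrinking the embedding parameters from V·H to V·E + E·H when E ≪ H. The class name and dimensions below are illustrative assumptions, not taken from the ALBERT codebase.

```python
import torch
import torch.nn as nn

class FactorizedEmbedding(nn.Module):
    """ALBERT-style factorized embedding (illustrative sketch).

    A full 30000 x 768 embedding has ~23.0M parameters; factorizing it into
    30000 x 128 plus 128 x 768 leaves ~3.9M.
    """
    def __init__(self, vocab_size=30000, embed_dim=128, hidden_dim=768):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, embed_dim)  # V * E parameters
        self.proj = nn.Linear(embed_dim, hidden_dim)         # E * H parameters

    def forward(self, input_ids):
        # Look up the small E-dim vectors, then project up to the hidden size.
        return self.proj(self.word_emb(input_ids))

emb = FactorizedEmbedding()
input_ids = torch.randint(0, 30000, (2, 16))  # 2 sequences of length 16
print(emb(input_ids).shape)                   # torch.Size([2, 16, 768])
```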
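
And a second sketch, of the soft-label knowledge-distillation objective that DistilBERT builds on: the student matches a temperature-softened teacher distribution (a KL term) in addition to the usual hard-label cross-entropy. The function name, `temperature`, and `alpha` weighting here are assumptions for illustration; TinyBERT additionally distills intermediate attention maps and hidden states, which is omitted.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soften both distributions with the temperature; scale the KL term by
    # T^2 so its gradient magnitude is comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Ordinary supervised cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a batch of 8 examples, 2-way classification.
student_logits = torch.randn(8, 2, requires_grad=True)
teacher_logits = torch.randn(8, 2)
labels = torch.randint(0, 2, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```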
