Note: This article is reposted from the official WeChat account of the HIT–iFLYTEK Joint Laboratory.

Cui Yiming, senior researcher and research director at the Joint Laboratory of HIT and iFLYTEK Research (HFL), was invited to give a tutorial at NLPCC 2020 titled "Revisiting Pre-trained Models for Natural Language Processing," covering the development of pre-trained language models as well as recent research topics. A download link for the slides is provided at the end of this post.

NLPCC 2020 Tutorials: http://tcci.ccf.org.cn/conference/2020/tutorials.php

Tutorial information

Title: Revisiting Pre-Trained Models for Natural Language Processing

Abstract: Pre-Trained Language Models (PLM) have become fundamental elements in recent research of natural language processing. In this tutorial, we will revisit the technical progress of text representations, i.e., from one-hot embedding to the recent PLMs. We will describe several popular PLMs (such as BERT, XLNet, RoBERT