Column: PaperWeekly
PaperWeekly is an academic platform for recommending, interpreting, discussing, and reporting on cutting-edge AI papers, dedicated to helping outstanding research from China and abroad gain wider circulation and recognition. Community: http://paperweek.ly | Weibo: @PaperWeekly

Today's arXiv Picks | 5 Recent Must-Read Papers on Transformers

PaperWeekly · Official Account · Research · 2021-08-25 17:51
About #Today's arXiv Picks: this is a column under "AI Academic Frontiers" in which the editors select high-quality papers from arXiv every day and push them to readers.

Fastformer: Additive Attention is All You Need
Category: NLP
Link: https://arxiv.org/abs/2108.09084
Abstract: Transformer is a powerful model for text understanding. However, it is inefficient due to its quadratic complexity to input sequence length. … In Fastformer, instead of modeling the pair-wise interactions between tokens, we first use additive attention mechanism to model global contexts.

Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models
Category: NLP
Link: https://arxiv.org/abs/2108.08877
Abstract: Sentence embeddings are broadly useful for language processing tasks. While T5 achieves impressive performance on language tasks cast as sequence-to-sequence mapping problems, it is unclear how to produce sentence embeddings from encode…
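The Fastformer abstract above replaces quadratic pair-wise attention with additive attention over global context vectors. Below is a minimal single-head PyTorch sketch of that idea, assuming a pooled global query is mixed element-wise into the keys and a pooled global key into the values; the layer names, scaling, and final residual are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Minimal single-head sketch of Fastformer-style additive attention.

    Instead of pair-wise token interactions (O(N^2)), every reduction
    here is a learned, softmax-weighted sum over tokens, so the cost is
    linear in sequence length.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.w_q = nn.Linear(dim, 1)   # scores tokens for the global query
        self.w_k = nn.Linear(dim, 1)   # scores tokens for the global key
        self.to_out = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)

        # Global query: additive-attention pooling over the query vectors.
        alpha = F.softmax(self.w_q(q).squeeze(-1) * self.scale, dim=-1)
        q_global = torch.einsum('bn,bnd->bd', alpha, q)

        # Mix the global query into each key (element-wise product).
        p = k * q_global.unsqueeze(1)

        # Global key: the same pooling over the mixed keys.
        beta = F.softmax(self.w_k(p).squeeze(-1) * self.scale, dim=-1)
        k_global = torch.einsum('bn,bnd->bd', beta, p)

        # Mix the global key into each value, project, residual on q
        # (an assumption in this sketch).
        u = v * k_global.unsqueeze(1)
        return self.to_out(u) + q
```

Because each attention step pools the whole sequence into one vector before mixing it back in, doubling the sequence length roughly doubles the cost instead of quadrupling it.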

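Sentence-T5 asks how to get a single sentence vector out of an encoder-decoder model like T5. One common baseline is mean-pooling the encoder's token states, sketched below with Hugging Face's T5EncoderModel; the function name and the choice of t5-base are illustrative assumptions, and the paper itself investigates several extraction strategies rather than prescribing this one.

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

def sentence_embedding(texts, model_name="t5-base"):
    # Hypothetical helper: mean-pools T5 encoder states into one
    # fixed-size vector per sentence (one of several possible strategies).
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = T5EncoderModel.from_pretrained(model_name)
    batch = tokenizer(texts, padding=True, truncation=True,
                      return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state  # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)   # zero out padding
    return (hidden * mask).sum(1) / mask.sum(1)

embs = sentence_embedding(["A test sentence.", "Another one."])
print(embs.shape)  # torch.Size([2, 768]) for t5-base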