Limitations of Transformers on Clinical Text Classification

IEEE Journal of Biomedical and Health Informatics (2021)

Cited 52 | Viewed 100
Abstract
Bidirectional Encoder Representations from Transformers (BERT) and BERT-based approaches are the current state-of-the-art in many natural language processing (NLP) tasks; however, their application to document classification on long clinical texts is limited. In this work, we introduce four methods to scale BERT, which by default can only handle input sequences up to approximately 400 words long, ...
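The abstract describes the core obstacle: BERT's fixed input budget is far shorter than a typical clinical note, so a long document must somehow be reduced or split before classification. A common generic workaround (not necessarily one of the four methods this paper introduces) is to slice the tokenized document into overlapping windows, classify each window, and aggregate the predictions. A minimal sketch of that windowing step, using hypothetical `max_len` and `stride` parameters and plain Python lists in place of a real tokenizer:

```python
def chunk_tokens(tokens, max_len=512, stride=256):
    """Split a long token sequence into overlapping windows of at most
    max_len tokens, advancing by stride tokens each time.

    This is a generic sliding-window sketch, not the method from the
    paper; max_len=512 reflects BERT's usual positional-embedding limit.
    """
    if len(tokens) <= max_len:
        return [tokens]
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # last window already covers the tail of the document
        start += stride
    return chunks


# Example: a 10-token document split into windows of 4 with stride 2.
windows = chunk_tokens(list(range(10)), max_len=4, stride=2)
```

Each window would then be fed to the encoder separately, with the per-window outputs pooled (e.g., max or mean over logits) into one document-level prediction.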
Keywords
Bit error rate,Task analysis,Cancer,MIMICs,Biological system modeling,Adaptation models,Data models