exBERT: Extending pre-trained models with domain-specific vocabulary under constrained training resources

Wen Tai, H. T. Kung, Xin Dong, Marcus Comiter, Chang Fu Kuo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

62 Scopus citations

Abstract

We introduce exBERT, a training method to extend BERT pre-trained models from a general domain to a new pre-trained model for a specific domain with a new additive vocabulary under constrained training resources (i.e., constrained computation and data). exBERT uses a small extension module to learn to adapt an augmenting embedding for the new domain in the context of the original BERT’s embedding of a general vocabulary. The exBERT training method is novel in learning the new vocabulary and the extension module while keeping the weights of the original BERT model fixed, resulting in a substantial reduction in required training resources. We pre-train exBERT with biomedical articles from ClinicalKey and PubMed Central, and study its performance on biomedical downstream benchmark tasks using the MTL-Bioinformatics-2016 dataset. We demonstrate that exBERT consistently outperforms prior approaches when using limited corpus and pre-training computation resources.
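The mechanism described in the abstract can be made concrete with a short sketch. The snippet below is a minimal, hypothetical PyTorch rendering of the idea, assuming Hugging Face transformers; the class name ExtensionEmbedding, the adapter architecture, the additive combination of embeddings, and the extension vocabulary size are illustrative assumptions, not the authors' released implementation. It only reflects the property stated above: the new embedding table and the small extension module are trainable, while the original BERT weights stay fixed.

```python
# Minimal sketch (not the authors' code) of the idea in the abstract: a small
# trainable extension embedding plus adapter is combined with the frozen
# embedding of the original BERT model. Assumes PyTorch and Hugging Face
# `transformers`; names, adapter shape, and the additive combination are
# illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class ExtensionEmbedding(nn.Module):
    """Augmenting embedding for the new domain vocabulary, adapted by a small
    extension module in the context of the original BERT embedding."""

    def __init__(self, bert: BertModel, ext_vocab_size: int, hidden_size: int = 768):
        super().__init__()
        self.bert_embeddings = bert.embeddings            # original embedding, kept frozen
        self.ext_embeddings = nn.Embedding(ext_vocab_size, hidden_size)  # trainable
        self.extension_module = nn.Sequential(            # small trainable adapter
            nn.Linear(hidden_size, hidden_size // 4),
            nn.GELU(),
            nn.Linear(hidden_size // 4, hidden_size),
        )

    def forward(self, input_ids: torch.LongTensor, ext_input_ids: torch.LongTensor):
        base = self.bert_embeddings(input_ids)             # general-domain embedding
        augment = self.extension_module(self.ext_embeddings(ext_input_ids))
        return base + augment                              # assumed combination scheme


bert = BertModel.from_pretrained("bert-base-uncased")
for p in bert.parameters():                                # original BERT weights stay fixed
    p.requires_grad = False

# ext_vocab_size is illustrative; in practice it would be the size of the new
# domain-specific (e.g., biomedical) vocabulary.
embedder = ExtensionEmbedding(bert, ext_vocab_size=17000)

# Toy usage: parallel id sequences over the original and extension vocabularies.
input_ids = torch.randint(0, bert.config.vocab_size, (1, 8))
ext_input_ids = torch.randint(0, 17000, (1, 8))
combined = embedder(input_ids, ext_input_ids)              # shape (1, 8, 768)
```

Because only the extension embedding and adapter receive gradients, the number of trainable parameters is a small fraction of full BERT pre-training, which is consistent with the constrained-resource setting the abstract emphasizes.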

Original language: English
Title of host publication: Findings of the Association for Computational Linguistics Findings of ACL
Subtitle of host publication: EMNLP 2020
Publisher: Association for Computational Linguistics (ACL)
Pages: 1433-1439
Number of pages: 7
ISBN (Electronic): 9781952148903
State: Published - 2020
Event: Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020 - Virtual, Online
Duration: 16 Nov 2020 → 20 Nov 2020

Publication series

Name: Findings of the Association for Computational Linguistics Findings of ACL: EMNLP 2020

Conference

Conference: Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020
City: Virtual, Online
Period: 16/11/20 → 20/11/20

Bibliographical note

Publisher Copyright:
©2020 Association for Computational Linguistics
