Domain Adapted Language Models

Published

Author

Type

Master's thesis

Program

Journal title

ISSN

Volume title

Publisher

Abstract

BERT is a recent neural network model that has proven itself a massive leap forward in natural language processing. Because training this massive model is so time-consuming, a pretrained BERT instance has been released as a high-performing starting point for further training on downstream tasks. The pretrained model was trained on general English text and may not be optimal for applications in specialist language domains. This study examines adapting the pretrained BERT model to the specialist language domain of legal text, with classification as the downstream task of interest. The study finds that domain adaptation is most beneficial when the task-specific dataset is small, in which case performance can approach that of a model pretrained from scratch on legal text. The study also presents practical guidelines for applying BERT in specialist language domains.
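The abstract describes a two-stage workflow: continue pretraining the general-purpose BERT checkpoint on in-domain legal text, then fine-tune the adapted model on the classification task of interest. The sketch below illustrates that workflow, assuming the Hugging Face transformers and datasets libraries; the checkpoint name, file paths, label count, and hyperparameters are illustrative placeholders, not values taken from the thesis.

```python
# Minimal sketch of domain adaptation followed by classification fine-tuning.
# File names ("legal_corpus.txt", "legal_task.csv"), the base checkpoint, and
# all hyperparameters are hypothetical placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# --- Stage 1: domain adaptation via continued masked-language-model training ---
legal_corpus = load_dataset("text", data_files={"train": "legal_corpus.txt"})["train"]
legal_corpus = legal_corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
mlm_trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="bert-legal-mlm", num_train_epochs=1),
    train_dataset=legal_corpus,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15),
)
mlm_trainer.train()
mlm_trainer.save_model("bert-legal-mlm")

# --- Stage 2: fine-tune the domain-adapted checkpoint on the classification task ---
# Assumes a CSV with "text" and "label" columns.
task_data = load_dataset("csv", data_files={"train": "legal_task.csv"})["train"]
task_data = task_data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

clf_model = AutoModelForSequenceClassification.from_pretrained("bert-legal-mlm", num_labels=2)
clf_trainer = Trainer(
    model=clf_model,
    args=TrainingArguments(output_dir="bert-legal-clf", num_train_epochs=3),
    train_dataset=task_data,
    tokenizer=tokenizer,  # enables dynamic padding of variable-length batches
)
clf_trainer.train()
```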

Description

Subject / keywords

natural language processing, BERT, transformer, domain adaptation, language model, classification

Citation
