Domain Adapted Language Models
dc.contributor.author | Jansson, Erik | |
dc.contributor.department | Chalmers tekniska högskola / Institutionen för data och informationsteknik | sv |
dc.contributor.examiner | Haghir Chehreghani, Morteza | |
dc.contributor.supervisor | Johansson, Richard | |
dc.date.accessioned | 2019-10-03T13:52:43Z | |
dc.date.available | 2019-10-03T13:52:43Z | |
dc.date.issued | 2019 | sv |
dc.date.submitted | 2019 | |
dc.description.abstract | BERT is a recent neural network model that has proven itself a massive leap forward in natural language processing. Because training such a large model from scratch is extremely time-consuming, a pretrained BERT instance has been released as a high-performing starting point for further training on downstream tasks. The pretrained model was trained on general English text and may not be optimal for applications in specialist language domains. This study examines adapting the pretrained BERT model to the specialist language domain of legal text, with classification as the downstream task of interest. The study finds that domain adaptation is most beneficial when the task-specific dataset is small, in which case performance can approach that of a model pretrained from scratch on legal text data. The study further presents practical guidelines for applying BERT in specialist language domains. | sv |
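The general recipe the abstract describes (continued masked-language-model pretraining of the released BERT checkpoint on in-domain legal text, followed by fine-tuning on the labeled classification task) can be sketched roughly as below. This is a minimal sketch under stated assumptions, not the thesis' actual pipeline: the Hugging Face transformers and datasets APIs, the file names legal_corpus.txt and legal_labels.csv, the label count, and all hyperparameters are illustrative assumptions.

    # Sketch of the two-step recipe: (1) adapt BERT to the legal domain with
    # further masked-LM training, (2) fine-tune the adapted model for classification.
    # File names, label count, and hyperparameters are illustrative assumptions.
    from transformers import (
        BertTokenizerFast,
        BertForMaskedLM,
        BertForSequenceClassification,
        DataCollatorForLanguageModeling,
        DataCollatorWithPadding,
        Trainer,
        TrainingArguments,
    )
    from datasets import load_dataset

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

    # Step 1: domain adaptation, i.e. continued masked-LM training on unlabeled legal text.
    legal_corpus = load_dataset("text", data_files={"train": "legal_corpus.txt"})["train"]
    legal_corpus = legal_corpus.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
        batched=True,
        remove_columns=["text"],
    )
    mlm_trainer = Trainer(
        model=BertForMaskedLM.from_pretrained("bert-base-uncased"),
        args=TrainingArguments(output_dir="bert-legal-adapted", num_train_epochs=1),
        train_dataset=legal_corpus,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
    )
    mlm_trainer.train()
    mlm_trainer.save_model("bert-legal-adapted")

    # Step 2: fine-tune the domain-adapted encoder on the labeled classification task
    # (assumed CSV with "text" and "label" columns; binary labels assumed).
    task_data = load_dataset("csv", data_files={"train": "legal_labels.csv"})["train"]
    task_data = task_data.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
        batched=True,
        remove_columns=["text"],
    )
    clf_trainer = Trainer(
        model=BertForSequenceClassification.from_pretrained("bert-legal-adapted", num_labels=2),
        args=TrainingArguments(output_dir="bert-legal-classifier", num_train_epochs=3),
        train_dataset=task_data,
        data_collator=DataCollatorWithPadding(tokenizer),
    )
    clf_trainer.train()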
dc.identifier.coursecode | DATX05 | sv |
dc.identifier.uri | https://hdl.handle.net/20.500.12380/300390 | |
dc.language.iso | eng | sv |
dc.setspec.uppsok | Technology | |
dc.subject | natural language processing | sv |
dc.subject | BERT | sv |
dc.subject | transformer | sv |
dc.subject | domain adaptation | sv |
dc.subject | language model | sv |
dc.subject | classification | sv |
dc.title | Domain Adapted Language Models | sv |
dc.type.degree | Examensarbete för masterexamen (Master's thesis) | sv |
dc.type.uppsok | H |