Ensemble model of Bidirectional Encoder Representations from Transformers for Named Entity Recognition

Published

Type

Master's thesis

Program

Journal title

ISSN

Volume title

Publisher

Abstract

Named entity recognition (NER) has been widely modeled with Bidirectional Encoder Representations from Transformers (BERT) in state-of-the-art implementations since BERT's appearance in 2018. Various BERT-based configurations currently hold four of the five top positions on the GLUE leaderboard, an acknowledged benchmark for natural language processing and understanding. Building on the BERT architecture, a range of NER model designs was investigated to predict entities in a comparatively small set of medical press releases. Early in the project, transfer learning with the publicly available CoNLL-2003 and BC5CDR datasets was found to boost the performance of all investigated model designs. Transfer learning was therefore implemented in the best named entity recognition system found, the separate submodel system described in Section 6.3.6. This final design consists of two submodels, each independently classifying a different subset of entity types. The CoNLL-2003 and BC5CDR datasets were used for transfer learning in the respective submodels before the medical press release data was introduced. The separate submodel system reached F1-scores of 0.79 (CoNLL-2003 submodel) and 0.78 (BC5CDR submodel). The effect of further pre-training a selection of publicly available BERT models on the medical press releases was also investigated, but was given less emphasis because the amount of available data was insufficient.
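The sketch below illustrates the separate submodel design described above, assuming the Hugging Face transformers library and a generic bert-base-cased checkpoint. The label sets, the example text, and the simple "prefer any non-O tag" merge rule are illustrative assumptions rather than the thesis's actual configuration, and the transfer learning and fine-tuning steps themselves are omitted.

# A sketch of the two-submodel NER system, using Hugging Face transformers.
# Assumptions: bert-base-cased as the base checkpoint, standard CoNLL-2003
# and BC5CDR label sets, and a simple non-O merge rule. Fine-tuning is
# omitted; as written, the classification heads are randomly initialised.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

BASE = "bert-base-cased"  # assumed base model, not confirmed by the thesis
tokenizer = AutoTokenizer.from_pretrained(BASE)

# Submodel A: entity subset from CoNLL-2003 (persons, organisations, ...).
CONLL_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
                "B-LOC", "I-LOC", "B-MISC", "I-MISC"]
model_a = AutoModelForTokenClassification.from_pretrained(
    BASE, num_labels=len(CONLL_LABELS))

# Submodel B: entity subset from BC5CDR (chemicals and diseases).
BC5CDR_LABELS = ["O", "B-Chemical", "I-Chemical", "B-Disease", "I-Disease"]
model_b = AutoModelForTokenClassification.from_pretrained(
    BASE, num_labels=len(BC5CDR_LABELS))

def predict(model, labels, text):
    """Tag one text with one submodel; returns (subword token, label) pairs."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        ids = model(**enc).logits.argmax(dim=-1)[0].tolist()
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    return [(tok, labels[i]) for tok, i in zip(tokens, ids)]

def merge(preds_a, preds_b):
    """Combine the two independent predictions: keep any non-O tag,
    preferring submodel A on conflicts (an assumed rule)."""
    return [(tok, la if la != "O" else lb)
            for (tok, la), (_, lb) in zip(preds_a, preds_b)]

text = "Pfizer announced a new treatment for hypertension."
entities = merge(predict(model_a, CONLL_LABELS, text),
                 predict(model_b, BC5CDR_LABELS, text))

Splitting the tag space across two submodels keeps each classification head small and lets each submodel be transfer-learned on the public dataset whose entity types it is responsible for, consistent with the design described in the abstract.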

Description

Subject/keywords

Transfer learning, natural language processing, named entity recognition, BERT, conditional random field

Citation

