Cardiff University at SemEval-2020 Task 6: fine-tuning BERT for domain-specific definition classification

Authors:
Luis Espinosa-Anke
Shelan S. Jeawak
Steven Schockaert
Source:
International Workshop on Semantic Evaluation (SemEval 2020), SemEval@COLING

Abstract

We describe the system submitted to SemEval-2020 Task 6, Subtask 1. The aim of this subtask is to predict whether a given sentence contains a definition. Unsurprisingly, we found that strong results can be achieved by fine-tuning a pre-trained BERT language model. In this paper, we analyze the performance of this strategy. Among other findings, we show that results can be improved by using a two-step fine-tuning process, in which the BERT model is first fine-tuned on the full training set and then further specialized towards a target domain.
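The two-step process described above can be illustrated with a minimal sketch using the HuggingFace transformers library. Note that the toy sentences, variable names, and hyperparameters below are hypothetical placeholders, not the paper's actual setup; the key point is simply that the second fine-tuning call continues from the weights produced by the first.

```python
import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # definition vs. non-definition
).to(device)

def fine_tune(sentences, labels, epochs=3, lr=2e-5, batch_size=16):
    # One fine-tuning pass over a labelled list of sentences.
    optimizer = AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for i in range(0, len(sentences), batch_size):
            batch = tokenizer(sentences[i:i + batch_size], padding=True,
                              truncation=True, return_tensors="pt").to(device)
            targets = torch.tensor(labels[i:i + batch_size]).to(device)
            loss = model(**batch, labels=targets).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Hypothetical toy data standing in for the task's training corpus.
full_train_sentences = [
    "A neuron is a cell that transmits nerve impulses.",
    "The experiment was repeated three times.",
]
full_train_labels = [1, 0]
domain_sentences = ["Mitosis is the process by which a cell divides."]
domain_labels = [1]

# Step 1: fine-tune BERT on the full training set.
fine_tune(full_train_sentences, full_train_labels)
# Step 2: continue fine-tuning the same weights on target-domain data only.
fine_tune(domain_sentences, domain_labels)
```

The design choice this sketch highlights is that step 2 warm-starts from the model already adapted in step 1, rather than fine-tuning a fresh BERT checkpoint on domain data alone.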

Details

Language:
English
Database:
OpenAIRE
Journal:
International Workshop on Semantic Evaluation (SemEval 2020), SemEval@COLING
Accession number:
edsair.doi.dedup.....7a8dbe850aa3c8691cc98b7f7198f0d3