End-to-end Named Entity Recognition and Relation Extraction using Pre-trained Language Models
The authors propose an end-to-end model for jointly extracting entities and their relations.
Previous approaches to this task either showed low predictive power or relied on external tools. The authors instead use BERT as a pre-trained encoder inside a single architecture with dedicated modules for NER and RE.
This paper's main contributions:
– an end-to-end approach that relies on no handcrafted features or external NLP tools
– fast training thanks to the use of pre-trained models
– results that match or exceed the state of the art for joint NER and RE on 5 datasets across 3 domains
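The joint setup can be sketched roughly like this (a minimal illustration in PyTorch: a tiny Transformer stands in for the pre-trained BERT encoder, and all module and parameter names are hypothetical, not taken from the authors' code). The NER module classifies each token, while the RE module scores every (head, tail) token pair for each relation type with a bilinear layer:

```python
# Minimal sketch of a joint NER + RE architecture (illustrative only).
# The real model uses BERT as the encoder; a small stand-in encoder
# keeps this example self-contained and runnable.
import torch
import torch.nn as nn

class JointNERandRE(nn.Module):
    def __init__(self, vocab_size=1000, hidden=64, num_ent_tags=5, num_rel_types=3):
        super().__init__()
        # Stand-in for a pre-trained encoder such as BERT
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True),
            num_layers=1,
        )
        # NER module: per-token tag classifier (e.g. a BIO scheme)
        self.ner_head = nn.Linear(hidden, num_ent_tags)
        # RE module: bilinear scorer over (head, tail) token pairs
        self.head_proj = nn.Linear(hidden, hidden)
        self.tail_proj = nn.Linear(hidden, hidden)
        self.rel_bilinear = nn.Bilinear(hidden, hidden, num_rel_types)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))   # (B, T, H)
        ner_logits = self.ner_head(h)             # (B, T, num_ent_tags)
        B, T, H = h.shape
        heads = self.head_proj(h).unsqueeze(2).expand(B, T, T, H)
        tails = self.tail_proj(h).unsqueeze(1).expand(B, T, T, H)
        # Score every (head, tail) pair for every relation type
        re_logits = self.rel_bilinear(heads.reshape(-1, H), tails.reshape(-1, H))
        return ner_logits, re_logits.view(B, T, T, -1)

model = JointNERandRE()
tokens = torch.randint(0, 1000, (2, 8))  # batch of 2 dummy sentences, 8 tokens each
ner_logits, re_logits = model(tokens)
print(ner_logits.shape)  # torch.Size([2, 8, 5])
print(re_logits.shape)   # torch.Size([2, 8, 8, 3])
```

Both heads share the same encoder, so entity and relation signals can inform each other during training, which is the point of the joint formulation.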
Paper: https://arxiv.org/abs/1912.13415
Code: https://github.com/bowang-lab/joint-ner-and-re
Unofficial code: https://github.com/BaderLab/saber/blob/development/saber/models/bert_for_ner_and_re.py
#deeplearning #nlp #transformer #NER #ER