Description

With advances in Machine Learning (ML), neural network-based methods, such as Convolutional and Recurrent Neural Networks, have been proposed to assist terminology curators in the development and maintenance of terminologies. Bidirectional Encoder Representations from Transformers (BERT), a new language representation model, obtains state-of-the-art results on a wide array of general English NLP tasks. We explore BERT’s applicability to medical terminology-related tasks. Utilizing the “next sentence prediction” capability of BERT, we show that the fine-tuning strategy of Transfer Learning (TL) from the BERT-Base model can address a challenging problem in automatic terminology enrichment: the insertion of new concepts. Adding a pre-training strategy further enhances the results. We apply our strategies to the two largest hierarchies of SNOMED CT, using one release as training data and the following release as test data. The two proposed TL models combined achieve average F1 scores of 0.83 and 0.86 for the two hierarchies, respectively.
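As a rough illustration of the fine-tuning strategy, the sketch below casts concept insertion as sentence-pair binary classification with the Hugging Face transformers library: a candidate parent concept name is paired with a new concept name, and the model predicts whether the new concept belongs under that parent. The pairing scheme, labels, and hyperparameters are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch, assuming pairs of (candidate parent name, new concept name)
# labeled 1 if the new concept should be inserted under the parent, else 0.
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import BertTokenizer, BertForSequenceClassification

class ConceptPairDataset(Dataset):
    def __init__(self, pairs, labels, tokenizer, max_len=64):
        # Encode each pair as a single BERT input: [CLS] parent [SEP] child [SEP]
        self.enc = tokenizer(
            [p for p, _ in pairs], [c for _, c in pairs],
            truncation=True, padding="max_length",
            max_length=max_len, return_tensors="pt",
        )
        self.labels = torch.tensor(labels)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.labels[i]
        return item

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Toy examples; in the paper, training pairs come from one SNOMED CT release
# and evaluation pairs from the following release.
pairs = [("Bacterial infectious disease", "Bacterial pneumonia"),
         ("Fracture of bone", "Bacterial pneumonia")]
labels = [1, 0]
loader = DataLoader(ConceptPairDataset(pairs, labels, tokenizer), batch_size=2)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for batch in loader:
    optimizer.zero_grad()
    loss = model(**batch).loss  # cross-entropy over the IS-A / not-IS-A labels
    loss.backward()
    optimizer.step()
```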

Learning Objective: Learn how to apply two Transfer Learning strategies (fine-tuning and pre-training) from BERT to place new concepts at the correct position in SNOMED CT.
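The pre-training strategy is commonly realized as continued masked-language-model (MLM) pre-training on in-domain text before fine-tuning. The sketch below assumes a plain-text file of SNOMED CT concept descriptions, one per line; the file name and hyperparameters are hypothetical.

```python
# Hedged sketch: continue MLM pre-training of BERT on SNOMED CT descriptions.
from transformers import (BertTokenizer, BertForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical corpus: one SNOMED CT concept description per line.
dataset = load_dataset("text", data_files={"train": "snomed_descriptions.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-snomed-mlm",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=tokenized,
    # Randomly masks 15% of tokens, the standard BERT MLM objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
)
trainer.train()
```

The resulting checkpoint would then serve as the starting point for the fine-tuning step sketched above.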

Authors:

Hao Liu (Presenter), New Jersey Institute of Technology
Yehoshua Perl, New Jersey Institute of Technology
James Geller, New Jersey Institute of Technology
