Bidirectional Encoder Representations from Transformers (BERT) Language Model for Sentiment Analysis task: Review

Ms. D. Deepa et al.

Abstract

The recent trend toward sentiment analysis has created new demand for models that understand the contextual representation of language. Among the various conventional machine learning and deep learning models, those that learn context are the most promising candidates for the sentiment classification task. BERT is a pre-trained language model for contextual embedding that has attracted considerable attention because of its deep analysis capability, the valuable linguistic knowledge captured in its intermediate layers, its training on large corpora, and its ability to be fine-tuned for any NLP task. Many researchers have adapted BERT for sentiment analysis by modifying the original architecture to obtain better classification accuracy. This article summarizes and reviews the BERT architecture and the performance observed when fine-tuning different layers and attention heads.
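
For orientation, the sketch below shows one common way to fine-tune BERT for sentiment classification with the Hugging Face `transformers` library. The checkpoint, toy data, hyperparameters, and layer-freezing choice are illustrative assumptions, not the specific setups surveyed in the article.

```python
# Minimal sketch (assumed setup, not the article's experiments):
# fine-tune bert-base-uncased for binary sentiment classification.
import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 0 = negative, 1 = positive (assumed mapping)
)

# Optionally freeze the lower encoder layers and fine-tune only the top ones,
# in the spirit of the layer-wise experiments the review discusses.
for layer in model.bert.encoder.layer[:8]:
    for param in layer.parameters():
        param.requires_grad = False

texts = ["The movie was wonderful!", "A dull, disappointing film."]  # toy examples
labels = torch.tensor([1, 0])

# Tokenize into the input IDs and attention masks BERT expects.
batch = tokenizer(texts, padding=True, truncation=True, max_length=128,
                  return_tensors="pt")

optimizer = AdamW(filter(lambda p: p.requires_grad, model.parameters()), lr=2e-5)
model.train()
for _ in range(3):  # a few fine-tuning steps on the toy batch
    optimizer.zero_grad()
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()

# Inference: the [CLS] representation feeds the classification head.
model.eval()
with torch.no_grad():
    logits = model(**batch).logits
print(logits.argmax(dim=-1))  # predicted sentiment labels
```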

Article Details

How to Cite
Deepa, D., et al. (2021). Bidirectional Encoder Representations from Transformers (BERT) Language Model for Sentiment Analysis task: Review. Turkish Journal of Computer and Mathematics Education (TURCOMAT), 12(7), 1708–1721. Retrieved from https://turcomat.org/index.php/turkbilmat/article/view/3055
Section
Research Articles