Survey: Using the BERT Model for Arabic Question Answering Systems
Abstract
In this paper, we address the community question answering problem using the BERT model, with a focus on Arabic question answering. Question answering (QA) is a natural language processing (NLP) task. We present a survey on language representation learning with the aim of consolidating common lessons learned across a variety of recent efforts that enable machine reading comprehension and natural language inference. BERT is notable for its simplicity of use: it can be fine-tuned with only light, task-specific additions rather than substantial architectural modifications. We highlight important considerations when interpreting recent contributions and choosing which model to use, and we discuss the strengths and weaknesses of the model and the challenges that researchers have faced.
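To illustrate the "light fine-tuning" property the abstract refers to, the sketch below shows extractive QA with a BERT-style model via the Hugging Face transformers library, where the only task-specific addition on top of pretrained BERT is a linear layer producing answer-span start/end logits. This is not the paper's implementation; the checkpoint name is a hypothetical placeholder, and any Arabic BERT QA checkpoint would be used the same way.

```python
# Minimal sketch of extractive QA with a BERT-style model (not the authors' code).
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_name = "arabic-bert-qa-checkpoint"  # hypothetical placeholder name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "..."  # an Arabic question
context = "..."   # an Arabic passage that may contain the answer

# Encode question and passage as a single sequence, as BERT QA models expect.
inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Select the most likely answer span from the start/end logits.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer_ids = inputs["input_ids"][0][start:end + 1]
print(tokenizer.decode(answer_ids, skip_special_tokens=True))
```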