Abstract: Numerous machine learning techniques aim to respond to queries posed in natural language. Earlier systems frequently relied on bag-of-words approaches to answer questions that the developers had predetermined. With this approach, programmers had to invest considerable time writing out the questions and their corresponding answers in advance. While helpful, such systems could not answer queries over the massive databases that chatbots require. Transformer-based models now predominate in natural language processing. Building on these transformers, the HuggingFace library introduced a BERT-based question-answering approach that reads through user-provided textual context and attempts to answer queries pertaining to that context. This technique has shown promise in answering intricate queries over lengthy documents. For instance, if a corporation's financial-year report is passed through the model, users can simply ask about a specific year or the profits made in that year, and the answer can be found in a matter of seconds without scrolling through the document.
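
As a brief illustration of this style of extractive question answering, the sketch below uses the HuggingFace transformers question-answering pipeline with a BERT model fine-tuned on SQuAD; the specific model name and the financial-report snippet are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of extractive question answering with HuggingFace transformers.
# The model name and example context are assumptions for illustration only.
from transformers import pipeline

# Load a BERT model fine-tuned for extractive QA (SQuAD-style).
qa_model = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

# The user supplies a textual context (e.g. an excerpt from a financial report)
# and asks a question about it.
context = (
    "In the fiscal year 2021 the company reported a revenue of 4.2 billion "
    "dollars and a net profit of 310 million dollars, an increase of 12 "
    "percent over the previous year."
)
question = "What was the net profit in 2021?"

# The model reads through the context and returns the answer span,
# a confidence score, and the character offsets of the span.
result = qa_model(question=question, context=context)
print(result["answer"], result["score"])
```

In this setup the model does not generate free-form text; it selects the span of the provided context that best answers the question, which is why the answer can be located in seconds even in a long document.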

Keywords: BERT, Question-Answering model, sequence-to-sequence model


DOI: 10.17148/IJARCCE.2022.11795
