Proceedings of the International Conference on Applications of Machine Intelligence and Data Analytics (ICAMIDA 2022)

A Review on BERT and Its Implementation in Various NLP Tasks

Authors
Vrishali Chakkarwar1, *, Sharvari Tamane2, Ankita Thombre1
1Department of Computer Science and Engineering, Government College of Engineering, Aurangabad, Maharashtra, India
2University Department of Information and Communication Technology MGM University, Aurangabad, Maharashtra, India
*Corresponding author. Email: vrush.a143@gmail.com
Available Online 1 May 2023.
DOI
10.2991/978-94-6463-136-4_12
Keywords
Bidirectional Encoder Representations from Transformers (BERT); Transformers; natural language processing (NLP); text summarization; text classification; sentence similarity; DistilBERT
Abstract

We present a detailed study of BERT, which stands for 'Bidirectional Encoder Representations from Transformers'. Natural language processing (NLP) is a key field in the development of intelligent systems, since many tasks require understanding the correct meaning of a sentence before producing an output. Languages are difficult for computers to understand because their meaning shifts with context. BERT is widely regarded as a breakthrough in enabling computers to understand the context of text, which has been the biggest hurdle in NLP tasks. It learns a language and its context in a way that closely resembles how a human understands the meaning of a sentence, and it is unique in its ability to learn from both the left and right context of a specific word in a sentence. The evolution of BERT marks a new era in the perception and understanding of natural language, and may lead computers to grasp natural language with better comprehension. The purpose of this paper is to provide a better understanding of the BERT language model and its implementation in various NLP tasks.
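The bidirectional behaviour described in the abstract — predicting a word from both its left and right context — can be illustrated with a minimal sketch using the Hugging Face `transformers` library. This example is not from the paper; the `fill-mask` pipeline and the public `bert-base-uncased` checkpoint are assumptions of this sketch.

```python
# Minimal sketch: BERT's masked-language-modelling head predicts a hidden
# word using context on BOTH sides of the mask.
# Assumes the Hugging Face `transformers` package and the public
# `bert-base-uncased` checkpoint are available (not part of the paper).
from transformers import pipeline

# The fill-mask pipeline wraps BERT's pretraining objective.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Words to the left AND right of [MASK] jointly constrain the prediction.
predictions = fill_mask("The bank raised interest [MASK] last quarter.")

for p in predictions[:3]:
    # Each prediction is a dict with the candidate token and its score.
    print(p["token_str"], round(p["score"], 3))
```

Because BERT conditions on the full sentence, the right-hand context ("last quarter") can disambiguate the masked slot in a way a purely left-to-right model could not.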

Copyright
© 2023 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of the International Conference on Applications of Machine Intelligence and Data Analytics (ICAMIDA 2022)
Series
Advances in Computer Science Research
Publication Date
1 May 2023
ISBN
978-94-6463-136-4
ISSN
2352-538X

Cite this article

TY  - CONF
AU  - Vrishali Chakkarwar
AU  - Sharvari Tamane
AU  - Ankita Thombre
PY  - 2023
DA  - 2023/05/01
TI  - A Review on BERT and Its Implementation in Various NLP Tasks
BT  - Proceedings of the International Conference on Applications of Machine Intelligence and Data Analytics (ICAMIDA 2022)
PB  - Atlantis Press
SP  - 112
EP  - 121
SN  - 2352-538X
UR  - https://doi.org/10.2991/978-94-6463-136-4_12
DO  - 10.2991/978-94-6463-136-4_12
ID  - Chakkarwar2023
ER  -