Abstract: Next word prediction (NWP), also called language modelling, is the task of predicting the word that follows a given sequence. It is a core problem of natural language processing (NLP) with many applications, and it remains a significant challenge. Paraphrase generation techniques, in turn, identify or generate phrases and sentences that convey similar meanings. This paper discusses the Recurrent Neural Network (RNN) and introduces a more effective variant, the Long Short-Term Memory (LSTM) network, to support next word prediction. It also applies Latent Semantic Analysis (LSA), a method that evaluates text mathematically by examining the relationships between terms within documents and between documents in a corpus, to support paraphrase generation. These models can demand a significant amount of computation, which makes them unsuitable for some kinds of applications. In conclusion, although tricky and application-dependent, proper setting of the learning rate can reduce the training time of the neural network.
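As a minimal illustrative sketch (not the paper's reference implementation), the snippet below trains a small Keras LSTM language model for next word prediction. The toy corpus, vocabulary size, embedding and hidden dimensions, and the learning rate are all assumed placeholder values; as the abstract notes, the learning rate in particular would need to be tuned per application.

```python
# Minimal next-word-prediction sketch with an LSTM (assumed toy corpus and hyperparameters).
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

corpus = [
    "next word prediction is a core problem of nlp",
    "the lstm model predicts the word that comes next",
]

# Build a word-level vocabulary over the toy corpus.
tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
vocab_size = len(tokenizer.word_index) + 1

# Turn every sentence prefix into an (input sequence, next word) training pair.
sequences = []
for line in corpus:
    ids = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(ids)):
        sequences.append(ids[: i + 1])

max_len = max(len(s) for s in sequences)
sequences = pad_sequences(sequences, maxlen=max_len, padding="pre")
X, y = sequences[:, :-1], sequences[:, -1]

# Small LSTM language model; the learning rate below is an assumed value.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 32),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(
    loss="sparse_categorical_crossentropy",
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    metrics=["accuracy"],
)
model.fit(X, y, epochs=100, verbose=0)

# Predict the most likely next word for a seed phrase.
seed = "the lstm model predicts the word that comes"
seed_ids = pad_sequences(
    tokenizer.texts_to_sequences([seed]), maxlen=max_len - 1, padding="pre"
)
next_id = int(np.argmax(model.predict(seed_ids, verbose=0), axis=-1)[0])
print([w for w, i in tokenizer.word_index.items() if i == next_id])
```

Integer targets with a sparse categorical cross-entropy loss are used here to avoid one-hot encoding the whole vocabulary, which keeps memory use low even as the vocabulary grows.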

Keywords: Latent semantic analysis, Paraphrasing, Next word prediction, Natural language processing, Recurrent neural network, LSTM, Character prediction.


DOI: 10.17148/IJARCCE.2022.11449
