Text Generation Using Different Recurrent Neural Networks
Abstract
Today, computers influence human life to a great extent. Natural language techniques have proven to be an efficient way for humans and computers to exchange information with little manual effort. Generative models reduce the need for laborious labelling of datasets. Text generation techniques can be applied to improve language models, machine translation, summarization, and captioning.
Text can be generated with Hidden Markov Models and Markov chains, but generating coherent whole sentences with them is difficult. We therefore use Recurrent Neural Networks (RNNs), along with their variants LSTM and GRU, to develop a language model that generates new text word by word automatically.
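For comparison, the Markov-chain baseline mentioned above can be sketched in a few lines of Python. This is a first-order (bigram) word model; the toy corpus, seed word, and output length are illustrative assumptions, not taken from the thesis:

```python
import random
from collections import defaultdict

# Toy corpus (assumption, for illustration only).
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Build a bigram transition table: word -> list of observed successors.
transitions = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    transitions[current].append(following)

def generate_markov(seed, length, rng):
    """Generate up to `length` words by repeatedly sampling a successor."""
    words = [seed]
    for _ in range(length - 1):
        successors = transitions.get(words[-1])
        if not successors:  # dead end: word has no observed successor
            break
        words.append(rng.choice(successors))
    return " ".join(words)

print(generate_markov("the", 8, random.Random(0)))
```

Because each word depends only on its immediate predecessor, such a model drifts quickly and rarely produces a globally coherent sentence, which is the limitation that motivates the RNN-based models.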
The research presented in this thesis focuses on building language models by training different RNNs. The proposed method works in two stages: in the first stage, a simple RNN, an LSTM, and a GRU are trained on different datasets; in the second stage, sampling is performed to generate output text. We consider five input datasets, and all three networks are trained on each one. Finally, the generated texts are compared to conclude which network produces the most realistic text. The variation of training loss with iterations is also examined for all datasets.
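The second stage, word-by-word sampling, can be sketched as follows for a simple RNN. This is a minimal pure-Python illustration: the vocabulary, hidden size, and weights are placeholder assumptions (in the actual pipeline the weights come from stage-one training), and the LSTM/GRU variants would differ only in how the hidden state is updated:

```python
import math
import random

vocab = ["the", "cat", "sat", "on", "mat", "<eos>"]  # toy vocabulary (assumption)
V, H = len(vocab), 4  # vocabulary size and hidden size (toy assumptions)

rng = random.Random(42)
# In the two-stage pipeline these weights are learned in stage one;
# here they are random placeholders so the sampling loop is runnable.
Wxh = [[rng.uniform(-0.5, 0.5) for _ in range(V)] for _ in range(H)]
Whh = [[rng.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(H)]
Why = [[rng.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(V)]

def step(x_index, h):
    """One simple-RNN step: h' = tanh(Wxh.x + Whh.h), logits = Why.h'.
    The input word is one-hot, so Wxh.x is just column x_index of Wxh."""
    h_new = [math.tanh(Wxh[i][x_index] + sum(Whh[i][j] * h[j] for j in range(H)))
             for i in range(H)]
    logits = [sum(Why[k][i] * h_new[i] for i in range(H)) for k in range(V)]
    return h_new, logits

def sample(seed_word, max_words):
    """Stage two: feed each sampled word back in until <eos> or max length."""
    h = [0.0] * H
    words = [seed_word]
    idx = vocab.index(seed_word)
    for _ in range(max_words - 1):
        h, logits = step(idx, h)
        m = max(logits)
        exps = [math.exp(l - m) for l in logits]  # numerically stable softmax
        probs = [e / sum(exps) for e in exps]
        idx = rng.choices(range(V), weights=probs)[0]
        if vocab[idx] == "<eos>":
            break
        words.append(vocab[idx])
    return " ".join(words)

print(sample("the", 10))
```

With trained weights, the softmax distribution at each step concentrates on plausible next words, which is how the networks in the thesis generate whole new text one word at a time.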
Description
Master of Engineering - CSE
