Story Scrambler - Automatic Text Generation Using Word Level RNN-LSTM




Dipti Pawade 1,*, Avani Sakhapara 1, Mansi Jain 1, Neha Jain 1, Krushi Gada 1

1. K. J. Somaiya College of Engineering, Department of IT, Mumbai

* Corresponding author.


Received: 27 Dec. 2017 / Revised: 10 Feb. 2018 / Accepted: 9 Mar. 2018 / Published: 8 Jun. 2018

Index Terms

Recurrent neural networks, Long short-term memory, Text generation, Deep learning


With the advent of artificial intelligence, the way technology assists humans has been completely transformed. From finance and medicine to music, gaming, and various other domains, it has gradually become an integral part of our lives. A neural network, a computer system modeled on the human brain, is one method of implementing artificial intelligence. In this paper, we have implemented a text generation system called Story Scrambler, based on a word-level recurrent neural network (RNN) with long short-term memory (LSTM). Our system aims to generate a new story from a set of input stories. For new story generation, we have considered two possibilities with respect to the nature of the input stories. First, we have considered stories with different storylines and characters. Second, we have worked with different volumes of the same story, where the storylines are related and the characters are similar. The results generated by the system are analyzed on parameters such as grammatical correctness, linkage of events, interest level, and uniqueness.
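A word-level system like the one described must first map a corpus of input stories to integer word indices and slice it into fixed-length training windows before any LSTM can be trained. The sketch below illustrates only that preprocessing step; the corpus, window size, and whitespace tokenization are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of word-level preprocessing for an RNN-LSTM text generator.
# Corpus, seq_len, and tokenization are illustrative assumptions.

def build_vocab(text):
    """Map each distinct word to an integer index (word-level, not character-level)."""
    words = text.lower().split()
    vocab = sorted(set(words))
    word_to_idx = {w: i for i, w in enumerate(vocab)}
    return words, word_to_idx

def make_training_windows(words, word_to_idx, seq_len=4):
    """Slide a fixed-length window over the corpus: each run of seq_len word
    indices is an input sequence, and the word that follows is its target."""
    ids = [word_to_idx[w] for w in words]
    xs, ys = [], []
    for i in range(len(ids) - seq_len):
        xs.append(ids[i:i + seq_len])
        ys.append(ids[i + seq_len])
    return xs, ys

corpus = "the cat sat on the mat and the cat slept on the mat"
words, word_to_idx = build_vocab(corpus)
xs, ys = make_training_windows(words, word_to_idx, seq_len=3)
print(len(word_to_idx))   # vocabulary size
print(xs[0], ys[0])       # first (input window, target) pair
```

At generation time, the trained LSTM would repeatedly predict a distribution over this vocabulary, sample the next word index, and append it to the window, mapping indices back to words to produce the new story.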

Cite This Paper

Dipti Pawade, Avani Sakhapara, Mansi Jain, Neha Jain, Krushi Gada, "Story Scrambler - Automatic Text Generation Using Word Level RNN-LSTM", International Journal of Information Technology and Computer Science (IJITCS), Vol.10, No.6, pp.44-53, 2018. DOI: 10.5815/ijitcs.2018.06.05

