Automated Generation of Folklore Short Stories Using T5 Transformer Model
DOI: https://doi.org/10.30871/jaic.v9i5.10582

Keywords: Folklore Generation, Transformer, T5, Reading Interest

Abstract
High reading interest plays an important role in increasing knowledge and fostering a stronger literacy culture. With growing access to information and technology, reading interest is also expected to improve through innovative and interactive platforms. However, traditional reading materials often fail to attract younger generations, who are more engaged with digital content. To address this challenge, one effort is to develop a modern platform that provides a collection of short stories enriched with cultural and educational values and tailored to appeal to contemporary readers. This study aims to design and implement a short story generation system using a Transformer-based language model, specifically T5 (Text-to-Text Transfer Transformer). The model is fine-tuned on a curated dataset of folktales from various regions, with the goal of producing relevant, engaging, and coherent narrative texts. The generation process is supported by pre-processing techniques that structure the data into narrative components such as introduction, conflict, climax, and resolution. The generated stories are then evaluated through human evaluation methods, including questionnaires and User Acceptance Testing (UAT), to assess their quality, coherence, engagement, and cultural relevance. This ensures that the system not only produces technically valid texts but also delivers narratives that are meaningful and enjoyable for readers. Ultimately, this study contributes to the promotion of literacy by presenting local wisdom and traditional values from diverse cultures through stories in a more modern, engaging, and accessible format for the younger generation.
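The pre-processing step described above can be illustrated with a minimal sketch: each folktale, already split into narrative components (introduction, conflict, climax, resolution), is converted into an (input, target) text pair of the kind T5's text-to-text format expects. The field names, task prefix, and prompt layout below are illustrative assumptions, not the authors' exact scheme.

```python
# Hypothetical pre-processing sketch: turn a structured folktale record
# into a text-to-text pair for T5 fine-tuning. The "generate story:"
# prefix and the title/region conditioning fields are assumptions.

COMPONENTS = ["introduction", "conflict", "climax", "resolution"]

def to_t5_pair(tale: dict) -> tuple[str, str]:
    """Build an (input, target) training pair from a structured folktale."""
    # The input text conditions generation on the story's title and region.
    source = f"generate story: title: {tale['title']} | region: {tale['region']}"
    # The target is the full narrative: components joined in story order.
    target = " ".join(tale[c] for c in COMPONENTS)
    return source, target
```

Pairs produced this way could then be tokenized and fed to a pretrained T5 checkpoint (e.g. via the Hugging Face Transformers library) for fine-tuning, with generation at inference time conditioned only on the input prefix.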
References
[1] S. Kasiyun, ‘Upaya Meningkatkan Minat Baca Sebagai Sarana Untuk Mencerdaskan Bangsa’, 2015. [Online]. Available: http://journal.unesa.ac.id/index.php/jpi
[2] H. Retno, ‘Miris, Minat Baca di Indonesia Menurut UNESCO Hanya 0.001 Persen’, Bandung, May 17, 2021.
[3] T. Nurrahim, ‘Orang Indonesia Makin Gemar Baca’, 2023. Accessed: Sep. 11, 2025.
[4] M. Shofiyullah, Pesan Moral Dalam Kumpulan Cerita Rakyat Nusantara Karya Yustitia Angelia Sebagai Bahan Ajar Pembelajaran Mengidentifikasi Nilai dan Isi Cerita. 2020.
[5] S. Chakraborty, ‘A Study of Folklore-Publication’, 2019. [Online]. Available: https://www.researchgate.net/publication/372492438
[6] H. A. Parhusip, ‘Machine Learning dengan Python’.
[7] E. Reiter and R. Dale, ‘Building Applied Natural Language Generation Systems’, Cambridge University Press, 1995.
[8] I. Dhall, S. Vashisth, and S. Saraswat, ‘Text Generation Using Long Short-Term Memory Networks’, in Lecture Notes in Networks and Systems, vol. 106, Springer, 2020, pp. 649–657. doi: 10.1007/978-981-15-2329-8_66.
[9] M. R. Assabil, N. Yusliani, and A. Darmawahyuni, ‘Text Generation using Long Short-Term Memory to Generate a LinkedIn Post’, Sriwijaya Journal of Informatic and Applications, vol. 4, no. 2, pp. 57–68, 2023, [Online]. Available: http://sjia.ejournal.unsri.ac.id
[10] S. Santhanam, ‘Context based Text-generation using LSTM networks’, Apr. 2020, [Online]. Available: http://arxiv.org/abs/2005.00048
[11] D. Hatta Fudholi, ‘Story Generator Bahasa Indonesia dengan Skip-Thoughts’, 2022.
[12] C. Angieta Winata, H. Santoso, and I. Wasito, ‘Word-Level Indonesian Story Generator With Markov Chain And Bidirectional GRU’, vol. 10, no. 4, pp. 14–26, 2023, [Online]. Available: http://jurnal.mdp.ac.id
[13] B. Hettige, V. Bandara, A. Sanja, R. Bandara, and H. Sanja, ‘Sibil AI: Children Story Generator in Sinhala Using Transformers’, 2022. [Online]. Available: https://www.researchgate.net/publication/364787507
[14] K. Q. Santoso, K. Perjuangan Mustadl’afin, L. Andriyanto, and I. Ardiyanto, ‘Generator Kisah Daerah Berbasis Bahasa Jawa dengan Finetuned GPT-2’, 2024.
[15] D. C. Senadeera and J. Ive, ‘Controlled Text Generation using T5 based Encoder-Decoder Soft Prompt Tuning and Analysis of the Utility of Generated Text in AI’, Dec. 2022, [Online]. Available: http://arxiv.org/abs/2212.02924
[16] google-research, ‘text-to-text-transfer-transformer’. Accessed: Jan. 06, 2025. [Online]. Available: https://github.com/google-research/text-to-text-transfer-transformer
[17] C. van der Lee, A. Gatt, E. van Miltenburg, and E. Krahmer, ‘Human evaluation of automatically generated text: Current trends and best practice guidelines’, Comput Speech Lang, vol. 67, May 2021, doi: 10.1016/j.csl.2020.101151.
License
Copyright (c) 2025 Evangelika Pirade, I Ketut Gede Darma Putra, Desy Purnami Singgih Putri

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).