Generating Indonesian Poems: A Fine-Tuning Approach Using Pretrained GPT-2 Models
Abstract
In recent years, text generation has become an important subfield of Natural Language Processing (NLP), attracting significant attention. Over the past decade, text generation technology has expanded into diverse application domains, especially creative ones such as poetry. Generating poetic content is a unique challenge, requiring linguistic knowledge, creativity, and originality in every poem. This study develops a text generator for Indonesian-language poetry by fine-tuning a pre-trained GPT-2 model released by the Flax community. A comparative analysis benchmarks the researcher's model against a baseline model developed by Muhammad Agung Hambali. The evaluation showed that the researcher's model outperformed the baseline, achieving a 73.68% improvement (i.e., reduction) in perplexity. Furthermore, a survey of 62 respondents was conducted to gauge the reception of the generated poems. The results indicated that the poems produced by the research model were received marginally better than those of the baseline model.
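The paper's exact pipeline is not reproduced here; the following is a minimal sketch of the fine-tuning approach the abstract describes, using the Hugging Face Transformers Trainer. The checkpoint name (flax-community/gpt2-small-indonesian), the corpus files (poems_train.txt, poems_val.txt), and all hyperparameters are illustrative assumptions, not values taken from the study. Perplexity is computed as the exponential of the evaluation cross-entropy loss, matching the metric reported above.

```python
"""Minimal GPT-2 fine-tuning sketch for Indonesian poetry (not the authors' exact setup)."""
import math

from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Assumed Flax-community checkpoint; the paper only says "pre-trained GPT-2
# model from the Flax community".
model_name = "flax-community/gpt2-small-indonesian"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical plain-text poem corpus, one file for training and one held out
# for the perplexity evaluation.
dataset = load_dataset("text", data_files={"train": "poems_train.txt",
                                           "validation": "poems_val.txt"})

def tokenize(batch):
    # Truncate each line/stanza to a fixed context length (assumed 128 tokens).
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling: mlm=False makes the collator copy input ids to
# labels, and the model shifts them internally for next-token prediction.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-poem-id",
    num_train_epochs=3,              # assumed hyperparameters
    per_device_train_batch_size=8,
    learning_rate=5e-5,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"],
                  data_collator=collator)
trainer.train()

# Perplexity = exp(cross-entropy loss) on the held-out poems.
eval_loss = trainer.evaluate(eval_dataset=tokenized["validation"])["eval_loss"]
print(f"perplexity: {math.exp(eval_loss):.2f}")

# After fine-tuning, poems can be sampled from a short prompt, e.g.:
ids = tokenizer("Hujan turun di senja", return_tensors="pt").input_ids
out = model.generate(ids, max_length=60, do_sample=True, top_p=0.95)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The causal-LM collator (mlm=False) is the standard choice for GPT-2 fine-tuning; a masked-LM objective would not match the autoregressive generation the study evaluates.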