KaoGPT: Studying the Performance of Text Generating Models
This paper was written for the final project of ECE C147, Winter 2023.
Abstract: We developed text-generation models, including an RNN, a decoder stack, an encoder-decoder, and a fine-tuned GPT-2, to emulate Professor Kao's lectures. Through experimentation, we found that fine-tuning GPT-2 led to a model that outperformed all others. However, given the limited dataset, the trained-from-scratch decoder stack performed surprisingly well. Our results offer insights into the strengths and limitations of various text-generation models, aiding researchers in selecting the most suitable model for their needs.