Publications

KaoGPT: Studying the Performance of Text Generating Models

Written:

This paper was written for the final project of ECE C147 Winter 2023.

Abstract: We developed text-generation models, including the RNN, decoder stack, encoder-decoder, and fine-tuned GPT-2, to emulate Professor Kao's lectures. Through experimentation, we found that fine-tuning GPT-2 led to a model that outperformed all others. However, given the limited dataset, the trained-from-scratch decoder stack performed surprisingly well. Our results offer insights into the strengths and limitations of various text-generation models, aiding researchers in selecting the most suitable model for their needs.

Finding an Analytical Solution for the Generalized Fermat Point

Written:

This paper is an Extended Essay in Mathematics written for the International Baccalaureate program.

Twitter Network Analysis of Lockdowns in Light of Covid-19

Written:

This paper explores public sentiment following COVID-19 lockdowns through network analysis of Twitter data.