This repo contains my NLP Module Labs
This project explores Transformer-based creative text generation with the GPT-2 large language model. It leverages a pre-trained model to produce coherent and engaging text continuations from user-provided prompts.
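A minimal sketch of this kind of prompt-based generation, assuming the Hugging Face transformers library and the standard "gpt2" checkpoint (the prompt and sampling settings are illustrative, not the project's actual configuration):

```python
# Minimal sketch: prompt-based text continuation with a pre-trained GPT-2 model
# (assumes the Hugging Face `transformers` library is installed).
from transformers import pipeline

# Load the pre-trained GPT-2 checkpoint into a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

prompt = "Once upon a time in a quiet mountain village,"
outputs = generator(prompt, max_length=80, num_return_sequences=2, do_sample=True)

for out in outputs:
    print(out["generated_text"])
```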
Machine learning project. Please refer to my presentation: https://github.com/gongl1/projectdemo3/blob/main/Pattern%20Patent_ML.pptx (python, transformers, gpt2, nlp, sk-learn)
Team project: generate recipes based on the ingredients available in the fridge.
A Linguistic Evaluation of Machine-Generated “Real” and “Fake” News
An Empirical Study of Multitask Learning to Improve Open Domain Dialogue Systems, NoDaLiDa 2023
I performed sentiment analysis on 50,000 IMDb movie reviews, determining whether each review is positive, negative, or neutral. I employed various NLP approaches, including lexicon-based methods, machine learning models, pre-trained language models (PLMs), and hybrid models, and assessed the performance of each type of model.
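A minimal sketch of one of the approach families mentioned above, a machine-learning baseline, assuming scikit-learn; the tiny in-line dataset and labels are purely illustrative, not the actual IMDb data pipeline:

```python
# Minimal sketch: TF-IDF features feeding a logistic-regression sentiment classifier
# (assumes scikit-learn; the tiny in-line dataset is purely illustrative).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "A wonderful, moving film.",
    "Terrible plot and wooden acting.",
    "I loved every minute.",
    "A complete waste of time.",
]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, labels)

print(model.predict(["An engaging and heartfelt story."]))
```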
Auto-generate tweets using a pre-trained GPT-2-based large language model (LLM) that runs offline.
Testing the use of Transformer models for various NLP tasks, leveraging the pre-trained BERT model from Hugging Face.
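A minimal sketch of one such task (masked-token prediction) with a pre-trained BERT checkpoint from Hugging Face; the "bert-base-uncased" checkpoint and the example sentence are assumptions for illustration:

```python
# Minimal sketch: masked-token prediction with a pre-trained BERT checkpoint
# (assumes the Hugging Face `transformers` library is installed).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the most likely tokens for the [MASK] position.
for candidate in fill_mask("The movie was absolutely [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```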
A Series on Optimizing Transformer-Based Models
In this repository, I implement the GPT architecture from scratch, provide the code for building it, and demonstrate how to train it.
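A compact sketch of the core building block such a from-scratch GPT implementation contains, causal (masked) multi-head self-attention, assuming PyTorch; the dimensions are illustrative and this is not the repository's actual code:

```python
# Compact sketch of causal (masked) self-attention, the core block of a
# from-scratch GPT implementation (assumes PyTorch; sizes are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, d_model=128, n_heads=4, max_len=256):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.proj = nn.Linear(d_model, d_model)
        # Lower-triangular mask so each position only attends to earlier positions.
        self.register_buffer("mask", torch.tril(torch.ones(max_len, max_len)))

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=-1)
        # Reshape to (batch, heads, time, head_dim).
        q, k, v = (t.view(B, T, self.n_heads, self.d_head).transpose(1, 2) for t in (q, k, v))
        att = (q @ k.transpose(-2, -1)) / (self.d_head ** 0.5)
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        out = (att @ v).transpose(1, 2).reshape(B, T, C)
        return self.proj(out)

x = torch.randn(2, 16, 128)            # (batch, sequence, embedding)
print(CausalSelfAttention()(x).shape)  # torch.Size([2, 16, 128])
```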
Generating playlist names using KoGPT2. KoGPT2 fine-tuning (cased).
Use GPT2 to generate funny semi-realistic Discord dialogues
Pre-ALPHA, a simple chatbox for locally hosting GPT-3/GPT-2 chat.