GROUP 4. This repository contains the implementation of a Transformer-based model for abstractive text summarization and a rule-based approach for extractive text summarization.
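The repository pairs a Transformer model for abstractive summarization with a rule-based extractive approach. As a rough illustration of the extractive half, here is a minimal word-frequency sentence-scoring sketch; the repository's actual rules, stopword list, and scoring may differ.

```python
import re
from collections import Counter

# Hypothetical minimal stopword list for illustration only.
STOPWORDS = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "for", "on", "it"}

def extractive_summary(text: str, num_sentences: int = 2) -> str:
    """Pick the highest-scoring sentences, preserving original order."""
    # Split into sentences on terminal punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Score words by document frequency, ignoring stopwords.
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(w for w in words if w not in STOPWORDS)
    # Score each sentence as the sum of its word frequencies.
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
        reverse=True,
    )
    # Keep the top sentences in their original order.
    keep = sorted(ranked[:num_sentences])
    return " ".join(sentences[i] for i in keep)
```

For example, `extractive_summary("BART is a transformer model. BART summarizes text well. Cats sleep.", 1)` keeps the sentence whose words are most frequent overall.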
Updated Jul 7, 2024 - Jupyter Notebook
🚇 Archive of daily ridership data from BART.
This project demonstrates text summarization using the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer model trained as a denoising autoencoder and is effective for text generation tasks such as summarization.
Stochastic tree ensembles (BART / XBART) for supervised learning and causal inference
Fine-Tuned LLM-Based FAQ Generation for University Admissions: A project involving the fine-tuning of state-of-the-art language models, including LLaMA-3 8b, LLaMA-2 7b, Mistral 7b, T5, and BART, leveraging QLoRA PEFT.
We examine whether LLMs can maintain consistency over extended, repeated text generation for 10 medical personas. We propose 5 novel plausibility metrics and an ontology of common LLM errors.
Point-and-click bartCause analysis and causal inference education
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
code for AAAI2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
Boundary Banter is a project focused on automating the generation of cricket news from live text commentary using advanced Natural Language Processing (NLP) techniques.
Our BART-based Natural Language Processing model generates answerable open questions. Built for a Language Technology Practical course.
The Role of Model Architecture and Scale in Predicting Molecular Properties: Insights from Fine-Tuning RoBERTa, BART, and LLaMA
This project aims to simplify and summarize scientific papers, convert them to an audio podcast, and create a PowerPoint presentation from the paper, helping researchers, academics, and students alike.
Instruction fine-tuning BART for Dialogue Summarization | IT4772E | NLP Project 20232
BrainHack 2024 competition repository for the TIL-AI category in the Novice track for Team dingdongs.
Document Summarizer using NLP and LLMs, built on the BART model.