Code and write-ups for the Red Dragon AI Advanced NLP Course.
Updated Oct 5, 2020 · Python
The objective of the project is to generate an abstractive summary of a longer article. The pipeline covers all preprocessing steps and then summarizes the whole article, which helps capture the key context of a long piece.
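The preprocessing step described above can be sketched in plain Python; the `preprocess_article` helper, its naive regex sentence splitter, and the `max_sentences` cutoff are illustrative assumptions, not the project's actual code:

```python
import re

def preprocess_article(text, max_sentences=30):
    """Minimal preprocessing sketch: normalize whitespace, split into
    sentences, and truncate long articles before they are handed to an
    abstractive summarization model. The regex splitter is deliberately
    naive and stands in for a proper sentence tokenizer."""
    text = re.sub(r"\s+", " ", text).strip()
    # Naive split on ., ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return sentences[:max_sentences]
```

An abstractive model would then consume the truncated sentence list; the truncation keeps inputs within a typical transformer context window.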
Implementation of Transformer, BERT and GPT models in both TensorFlow 2.0 and PyTorch.
A PyTorch implementation of a transformer network trained using back-translation.
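Back-translation augments parallel training data by translating monolingual target-language text back into the source language with a reverse model. A minimal sketch of that data-augmentation loop, where `tgt2src_model` is a hypothetical stand-in for the trained reverse translation model:

```python
def back_translate(monolingual_targets, tgt2src_model):
    """Back-translation sketch: a reverse (target-to-source) model
    generates a synthetic source sentence for each monolingual target
    sentence; the resulting (synthetic_source, target) pairs are added
    to the training data of the forward model.
    `tgt2src_model` is a hypothetical callable, not a real API."""
    return [(tgt2src_model(t), t) for t in monolingual_targets]
```

In practice the forward model is then retrained on the union of real and synthetic pairs, and the process can be iterated.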
Using Bayesian optimization via the Ax platform with the SAASBO model to optimize 23 hyperparameters simultaneously in 100 iterations (setting a new Matbench benchmark).
Implementation of a basic conversational agent (a.k.a. chatbot) using the PyTorch Transformer module.
Implementation of Transformer Pointer-Critic Deep Reinforcement Learning Algorithm
This repository contains my research work on building state-of-the-art next-basket recommendations using techniques such as autoencoders, TF-IDF, attention-based Bi-LSTMs, and Transformer networks.
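Of the techniques listed above, TF-IDF is the simplest to illustrate for basket data: each basket is treated as a document and each item as a term. A stdlib-only sketch (the `tfidf_scores` helper is hypothetical, not the repository's code):

```python
import math
from collections import Counter

def tfidf_scores(baskets):
    """TF-IDF sketch over purchase baskets: scores are high for items
    that are frequent within a basket but rare across baskets, which
    makes them useful signals for next-basket recommendation.
    Illustrative only; not the repository's actual pipeline."""
    n = len(baskets)
    # Document frequency: in how many baskets each item appears.
    df = Counter(item for basket in baskets for item in set(basket))
    scores = []
    for basket in baskets:
        tf = Counter(basket)
        scores.append({item: (count / len(basket)) * math.log(n / df[item])
                       for item, count in tf.items()})
    return scores
```

An item present in every basket (like `"milk"` below) scores zero, since it carries no discriminative information.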
[TPAMI 2023 ESI Highly Cited Paper] SePiCo: Semantic-Guided Pixel Contrast for Domain Adaptive Semantic Segmentation https://arxiv.org/abs/2204.08808
List of efficient attention modules.
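Efficient attention modules all approximate or sparsify standard scaled dot-product attention, which a plain-Python sketch makes concrete (illustrative only; real implementations are vectorized):

```python
import math

def attention(q, k, v):
    """Scaled dot-product attention in plain Python for clarity:
    weights = softmax(q k^T / sqrt(d)), output = weights v.
    The quadratic score matrix here is exactly what efficient
    attention variants try to avoid materializing."""
    d = len(q[0])
    scores = [[sum(qi * ki for qi, ki in zip(qr, kr)) / math.sqrt(d)
               for kr in k] for qr in q]
    out = []
    for row in scores:
        m = max(row)                       # subtract max for stability
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        w = [e / z for e in exps]          # softmax weights
        out.append([sum(wi * vr[j] for wi, vr in zip(w, v))
                    for j in range(len(v[0]))])
    return out
```

With identity-like queries and keys, each output row is weighted toward the matching value row, as the test below checks.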
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch