An open-source implementation of the paper ``A Structured Self-Attentive Sentence Embedding'' published by IBM and MILA.
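Several entries below implement the structured self-attention from this paper, which computes an annotation matrix A = softmax(W_s2 tanh(W_s1 H^T)) over the LSTM hidden states H and returns M = A H as a multi-view sentence embedding. A minimal NumPy sketch of that mechanism (the shapes and weight names are illustrative, not taken from any listed repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W_s1, W_s2):
    """Structured self-attention (Lin et al., ICLR 2017).

    H:    (n, 2u)   hidden states for n tokens (e.g. from a BiLSTM)
    W_s1: (d_a, 2u) and W_s2: (r, d_a) are learned projections.
    Returns M: (r, 2u) sentence embedding and A: (r, n) attention weights.
    """
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # each of the r rows sums to 1
    M = A @ H                                          # r weighted views of the sentence
    return M, A

# Toy example with random states and weights (illustrative sizes only).
rng = np.random.default_rng(0)
n, two_u, d_a, r = 6, 8, 5, 3
H = rng.standard_normal((n, two_u))
M, A = structured_self_attention(H,
                                 rng.standard_normal((d_a, two_u)),
                                 rng.standard_normal((r, d_a)))
```

In the paper, a Frobenius-norm penalty on A A^T - I is added to the loss so the r attention rows focus on different parts of the sentence.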
Deep Semantic Role Labeling with Self-Attention
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Simple Tensorflow Implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017)
NER for Chinese electronic medical records, using doc2vec, self-attention, and multi-head attention.
Implementation of the paper "A Structured Self-Attentive Sentence Embedding" published at ICLR 2017
My toy model for the natural language inference task.
MSG-GAN with self attention. For MSG-GAN head to -> https://github.com/akanimax/MSG-GAN
Mispronunciation detection code for jingju singing voice
Structured Self Attention implementation in tensorflow
Unofficial Implementation of Universal Transformer https://arxiv.org/abs/1807.03819
Adversarial Transfer Learning for Chinese Named Entity Recognition with Self-Attention Mechanism
Simple Tensorflow Implementation of the Transformer introduced in "Attention Is All You Need" (NIPS 2017)
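The Transformer repositories above are built on scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. A minimal NumPy sketch of that single operation (shapes are illustrative; real implementations batch this and add multiple heads):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from "Attention Is All You Need".

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns the attended output (n_q, d_v) and the weights (n_q, n_k).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity, scaled by sqrt(d_k)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)      # softmax over the keys
    return weights @ V, weights

# Toy example with random queries, keys, and values.
rng = np.random.default_rng(1)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Self-attention is the special case where Q, K, and V are all projections of the same sequence.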
LimbicAI is an AI that induces emotions like fear in AI.
Text classification using deep learning models in Pytorch
A variant of the Self Attention GAN named: FAGAN (Full Attention GAN)
What was the impact of Russian troll tweets on the 2016 presidential election poll results?
Korean chatbot using Tensorflow (the model is a Transformer)