《李宏毅深度学习教程》 (Hung-yi Lee Deep Learning Tutorial, recommended by Prof. Hung-yi Lee 👍). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
The GitHub repository for the paper "Informer" accepted by AAAI 2021.
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
Datasets, tools, and benchmarks for representation learning of code.
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, the attention mechanism, and entropy histograms. Both Cora (transductive) and PPI (inductive) examples are supported!
CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
Text classification using deep learning models in PyTorch
The implementation of DeBERTa
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
A PyTorch implementation of Speech Transformer, an end-to-end ASR model using a Transformer network, for Mandarin Chinese.
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Recent Transformer-based CV and related works.
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
A Structured Self-attentive Sentence Embedding
A list of efficient attention modules
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Implementing Stand-Alone Self-Attention in Vision Models using PyTorch
DSMIL: Dual-stream multiple instance learning networks for tumor detection in Whole Slide Images
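The common building block behind most of the repositories listed above is scaled dot-product self-attention from "Attention Is All You Need". The snippet below is a minimal sketch in PyTorch for orientation only: a single head, no masking, and the class name SelfAttention are simplifying assumptions of this sketch, not code taken from any repository above.

import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Minimal single-head scaled dot-product self-attention (no masking)."""

    def __init__(self, d_model: int):
        super().__init__()
        # Learned projections for queries, keys, and values.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Attention scores: (batch, seq_len, seq_len), scaled by sqrt(d_model).
        scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
        weights = scores.softmax(dim=-1)
        # Each output token is an attention-weighted sum of the value vectors.
        return weights @ v

# Usage: a batch of 2 sequences, 5 tokens each, 16-dimensional embeddings.
attn = SelfAttention(d_model=16)
out = attn(torch.randn(2, 5, 16))
print(out.shape)  # torch.Size([2, 5, 16])

The listed repositories extend this core in different directions, e.g. multi-head attention and masking (Transformer/Speech Transformer), attention over graph neighborhoods (GAT), and sparse or criss-cross attention patterns for efficiency (Informer, CCNet).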