309 Open Source Attention Software Projects
Free and open source attention code projects including engines, APIs, generators, and tools.
End-to-end variable-length captcha recognition using CNN + RNN + Attention/CTC (PyTorch implementation).
Self Attentive Tensorflow (191 ⭐)
Tensorflow implementation of "A Structured Self-Attentive Sentence Embedding"
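The core of this paper can be sketched in a few lines of NumPy: the embedding matrix A = softmax(W2 tanh(W1 Hᵀ)) gives r attention "hops" over the T token states, and M = A·H is the resulting sentence embedding matrix. This is an illustrative sketch with made-up shapes, not the repository's API:

```python
import numpy as np

def structured_self_attention(H, W1, W2):
    """Structured self-attentive embedding (Lin et al., ICLR 2017):
    A = softmax(W2 tanh(W1 H^T)) yields r attention hops over T tokens;
    M = A @ H is the (r, d) sentence embedding matrix."""
    S = W2 @ np.tanh(W1 @ H.T)                    # (r, T) unnormalized scores
    A = np.exp(S - S.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)            # softmax over tokens, per hop
    return A @ H, A

T, d, d_a, r = 6, 10, 8, 3                        # tokens, hidden, attention, hops
rng = np.random.default_rng(2)
H = rng.normal(size=(T, d))                       # stand-in for BiLSTM states
M, A = structured_self_attention(H, rng.normal(size=(d_a, d)), rng.normal(size=(r, d_a)))
```

Each row of A is a separate attention distribution, which is what lets the embedding capture several aspects of the sentence at once.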
Njunmt Tf (98 ⭐)
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Multihead Siamese Nets (167 ⭐)
Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
Pop Music Highlighter (82 ⭐)
"Pop Music Highlighter: Marking the Emotion Keypoints", TISMIR vol. 1, no. 1
Hey Jetson (169 ⭐)
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Mac Network (464 ⭐)
Implementation for the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018)
Nlp Models Tensorflow (1614 ⭐)
Gathers machine learning and Tensorflow deep learning models for NLP problems, 1.13 < Tensorflow < 2.0
Neat Vision (228 ⭐)
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (Framework-agnostic.)
Ntua Slp Semeval2018 (78 ⭐)
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Self Attentive Emb Tf (94 ⭐)
Simple Tensorflow Implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017)
Mask data and code for "Mask-guided Contrastive Attention Model for Person Re-Identification" (CVPR 2018).
Keras Utility Layer Collection (59 ⭐)
Collection of custom layers and utility functions for Keras which are missing in the main framework.
Code for the WWW'18 paper "DeepMove: Predicting Human Mobility with Attentional Recurrent Network".
Guided Attention Inference Network (221 ⭐)
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Soujanyaporia Multimodal Sentiment Analysis (219 ⭐)
Attention-based multimodal fusion for sentiment analysis
Pytorch Seq2seq (3504 ⭐)
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Flask + seq2seq [TensorFlow 1.0, PyTorch] online chatbot: https://mp.weixin.qq.com/s/VpiAmVSTin3ALA8MnzhCJA or https://ask.hellobi.com/blog/python_shequ/14486
Image Caption Generator (160 ⭐)
A neural network that generates captions for an image using a CNN and an RNN with beam search.
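Beam search, the decoding strategy this generator relies on, keeps only the k highest-scoring partial captions at each step instead of committing greedily to one. A minimal stand-alone sketch (the `step_fn` toy model and token values are hypothetical, not the repo's code):

```python
import math

def beam_search(step_fn, start, beam_width=3, max_len=4, eos=None):
    """Keep the beam_width partial sequences with the highest total
    log-probability. step_fn(seq) -> list of (token, prob) continuations."""
    beams = [([start], 0.0)]                      # (sequence, log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            if eos is not None and seq[-1] == eos:
                candidates.append((seq, logp))    # finished beams pass through
                continue
            for tok, p in step_fn(seq):
                candidates.append((seq + [tok], logp + math.log(p)))
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
    return beams[0][0]

# Toy "model": always prefers token 1 (p=0.6) over token 2 (p=0.4).
best = beam_search(lambda seq: [(1, 0.6), (2, 0.4)], start=0, beam_width=2)
# best == [0, 1, 1, 1, 1]
```

In a real captioner, `step_fn` would run the RNN decoder one step and return the top-k next-word probabilities.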
Semantic Aware Attention Based Deep Object Co Segmentation (59 ⭐)
Semantic Aware Attention Based Deep Object Co-segmentation
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
An Unofficial Pytorch Implementation of Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering
Speech Transformer (646 ⭐)
A PyTorch implementation of Speech Transformer, an end-to-end ASR model based on the Transformer network, for Mandarin Chinese.
Text Classification Models Pytorch (432 ⭐)
Implementations of state-of-the-art text classification models in PyTorch.
Nlp Tutorials (609 ⭐)
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Pyramid Attention Networks Pytorch (199 ⭐)
Implementation of Pyramid Attention Networks for Semantic Segmentation.
⚠️ [Deprecated] No longer maintained; please use the code in https://github.com/guoshnBJTU/ASTGCN-r-pytorch
A PyTorch Implementation of "Watch Your Step: Learning Node Embeddings via Graph Attention" (NeurIPS 2018).
A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019).
Nlp Journey (1398 ⭐)
Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Abd Net (286 ⭐)
[ICCV 2019] "ABD-Net: Attentive but Diverse Person Re-Identification" https://arxiv.org/abs/1908.01114
All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention.
A simple module that consistently outperforms self-attention and the Transformer model on the main NMT datasets, with SoTA performance.
Ccnet Pure Pytorch (151 ⭐)
Criss-Cross Attention (2d&3d) for Semantic Segmentation in pure Pytorch with a faster and more precise implementation.
Recurrent Independent Mechanisms (86 ⭐)
Implementation of the paper Recurrent Independent Mechanisms (https://arxiv.org/pdf/1909.10893.pdf)
Yolov4 Pytorch (1474 ⭐)
A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4, with PASCAL VOC and COCO.
Deep Learning and Machine Learning mini-projects. Current Project: Deepmind Attentive Reader (rc-data)
A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text
Rnn Nlu (471 ⭐)
A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling
Eval On Nn Of Rc (84 ⭐)
Empirical Evaluation on Current Neural Networks on Cloze-style Reading Comprehension
[TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver focus of attention (FoA) in a variety of real-world driving scenarios.
Tf Rnn Attention (749 ⭐)
Tensorflow implementation of attention mechanism for text classification tasks.
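The pattern this repository implements, attention pooling over RNN states for classification, reduces a variable-length sequence of hidden states to a single weighted-sum vector. A minimal NumPy sketch with illustrative shapes (not the repo's TensorFlow code):

```python
import numpy as np

def attention_pool(H, w):
    """Collapse RNN states H of shape (T, d) into one context vector
    using a learned scoring vector w of shape (d,): alpha = softmax(H @ w)."""
    scores = H @ w                                # (T,) one score per time step
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                          # attention distribution over steps
    return alpha @ H, alpha                       # weighted sum of states

T, d = 5, 16                                      # time steps, hidden size
rng = np.random.default_rng(1)
H = rng.normal(size=(T, d))                       # stand-in for RNN outputs
w = rng.normal(size=d)                            # learned attention vector
context, alpha = attention_pool(H, w)             # context feeds the classifier
```

The `context` vector then goes through a dense layer and softmax to produce class probabilities; `alpha` can be inspected to see which time steps the model attended to.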
Datastories Semeval2017 Task4 (191 ⭐)
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Jtkim Kaist Vad (685 ⭐)
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Attention Is All You Need Pytorch (6195 ⭐)
A PyTorch implementation of the Transformer model in "Attention is All You Need".
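The building block at the heart of the Transformer is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A self-contained NumPy sketch of that formula (shapes and values are illustrative, not taken from this repository):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

# 3 queries attending over 4 key/value pairs of dimension 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)    # out: (3, 8), w rows sum to 1
```

Multi-head attention runs this in parallel over several learned projections of Q, K, and V and concatenates the results.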
Attention Over Attention Tf Qa (58 ⭐)
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension".