Machine learning, in numpy
Updated Aug 19, 2020 - Python
Natural Language Processing Tutorial for Deep Learning Researchers
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Machine learning and TensorFlow deep learning models for NLP problems, 1.13 < TensorFlow < 2.0
Documents, papers and code related to Natural Language Processing, including Topic Model, Word Embedding, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc. All code is implemented in TensorFlow 2.0.
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Draw a leader line in your web page.
Tensorflow implementation of attention mechanism for text classification tasks.
A Tensorflow implementation of Spatial Transformer Networks.
Implementation of papers for text classification task on DBpedia
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Residual Attention Network for Image Classification
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling
A Structured Self-attentive Sentence Embedding
Implementation for the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018)
Bilinear attention networks for visual question answering
A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text
Chinese entity relation extraction, PyTorch, BiLSTM + attention
A PyTorch Implementation of "Recurrent Models of Visual Attention"
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Implementation of State-of-the-art Text Classification Models in Pytorch
BERT-NER (nert-bert) with Google BERT, https://github.com/google-research
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
CRNN with attention for OCR, with added Chinese recognition support
List of efficient attention modules
Pointer-generator reinforced seq2seq summarization in PyTorch
Four styles of encoder-decoder models in Python, Theano, Keras and Seq2Seq
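A recurring theme across the repositories above is the attention mechanism of "Attention Is All You Need". As a minimal sketch in NumPy (function names, shapes, and the random inputs are illustrative assumptions, not taken from any repository listed here), scaled dot-product attention can be written as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention.
    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights          # (n_q, d_v) context vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 5))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (2, 5) (2, 3)
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.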