Natural Language Processing Tutorial for Deep Learning Researchers
Updated Aug 15, 2020 - Jupyter Notebook
The paper says that 15% of the tokens in each sequence are selected, which reads as if exactly 15% of the tokens will be chosen for sure.
But in https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68,
every single token independently has a 15% chance of going through the follow-up masking procedure, so the number of selected tokens per sequence varies.
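A minimal sketch of the difference between the two readings (hypothetical helper functions for illustration, not the repo's actual code): selecting exactly 15% of the tokens gives a fixed count, while giving each token an independent 15% chance makes the count a Binomial random variable that only equals 15% on average.

```python
import random

def mask_exact(tokens, ratio=0.15):
    # Paper reading: choose exactly ratio * len(tokens) positions (rounded).
    n = max(1, round(len(tokens) * ratio))
    chosen = set(random.sample(range(len(tokens)), n))
    return ["[MASK]" if i in chosen else t for i, t in enumerate(tokens)]

def mask_per_token(tokens, ratio=0.15):
    # Implementation reading: each token is independently selected with
    # probability ratio, so the selected count varies from call to call.
    return ["[MASK]" if random.random() < ratio else t for t in tokens]

tokens = ["tok%d" % i for i in range(100)]

# Exact selection: always 15 of 100 tokens are masked.
assert mask_exact(tokens).count("[MASK]") == 15

# Per-token sampling: count is Binomial(100, 0.15) -- 15 only in expectation.
counts = [mask_per_token(tokens).count("[MASK]") for _ in range(1000)]
print(min(counts), max(counts), sum(counts) / len(counts))
```

For typical BERT sequence lengths the difference in masked-token counts is small in expectation, but the per-token version can occasionally mask noticeably more or fewer tokens than 15%.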