Natural Language Processing Tutorial for Deep Learning Researchers
Updated Oct 20, 2020 - Jupyter Notebook
The paper states: "Instead, the training data generator chooses 15% of tokens at random, e.g., in the sentence my dog is hairy it chooses hairy."
This wording implies that exactly 15% of the tokens will be chosen for sure.
However, in https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68,
every single token independently has a 15% chance of going through the follow-up masking procedure, so the actual fraction varies from sentence to sentence.
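The difference between the two readings can be sketched as follows. This is a minimal illustration, not the repository's actual code; the function names and the toy token list are made up for the example:

```python
import random

def choose_fixed_fraction(tokens, ratio=0.15):
    """Reading 1 (the paper's wording): exactly 15% of the
    positions are selected, so the count is deterministic."""
    k = max(1, int(len(tokens) * ratio))
    return set(random.sample(range(len(tokens)), k))

def choose_per_token(tokens, ratio=0.15):
    """Reading 2 (the BERT-pytorch dataset code): each position is
    selected independently with probability 15%, so the count
    varies between sentences and can even be zero."""
    return {i for i in range(len(tokens)) if random.random() < ratio}

# Example: on a 100-token sentence, reading 1 always selects 15
# positions, while reading 2 selects 15 only on average.
sentence = ["tok"] * 100
print(len(choose_fixed_fraction(sentence)))  # always 15
print(len(choose_per_token(sentence)))       # varies around 15
```

Under reading 2, the expected number of selected tokens is still 15% of the sentence length, which may be why both behaviors are described the same way informally.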
Hi, I am very interested in your project, and I wonder if you need contributors and how I could make my own contribution?
Is there a way to train a bidirectional RNN (such as an LSTM or GRU) in Trax nowadays?