Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
RoBERTa pre-trained models for Chinese (RoBERTa for Chinese)
GPT2 for Chinese chitchat (implements the MMI idea from DialoGPT)
Simple text generator built on an OpenAI GPT-2 PyTorch implementation
A PyTorch implementation of "Graph Wavelet Neural Network" (ICLR 2019)
Chinese news-title generation with GPT2: a Chinese GPT2 headline-generation project with very detailed code comments.
1. Use BERT, ALBERT, and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.
A PyTorch implementation of "Signed Graph Convolutional Network" (ICDM 2018).
Hey,
We would like to start using the new GitHub Discussions feature to better organize community questions.
If you want to ask us a question or get help from the community that uses AraBERT, please use the Discussions feature:
https://github.com/aub-mind/arabert/discussions
We intend to keep the Issues tab for bug reports and errors.
OpenAI GPT2 pre-training and sequence prediction implementation in TensorFlow 2.0
XLNet for language generation.
Guide: fine-tune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Hugging Face Transformers and DeepSpeed (see the sketch after this list)
Pre-training and fine-tuning framework for text generation
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning" (from logistic regression to BERT and GPT2)
Easy and Efficient Transformer: a scalable inference solution for large NLP models
State-of-the-art, faster natural language processing in TensorFlow 2.0.
Generating responses with pretrained XLNet and GPT-2 in PyTorch.
An open-source package for Chinese open-domain conversational chatbots (a Chinese chitchat dialogue system with one-click deployment of a WeChat chitchat bot)
PyTorch Implementation of OpenAI GPT-2
GPT-2 French demo | Démo française de GPT-2
Deep Learning Transformer models in MATLAB
GPT-2 Telegram Chat bot
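Many of the projects above, including the single-GPU fine-tuning guide, build on the same Hugging Face Transformers workflow. The sketch below is a minimal, illustrative causal-LM fine-tuning setup, not code taken from any listed repository; the model name, the WikiText slice, and the ds_config.json path are placeholder assumptions.

```python
# Minimal sketch: fine-tuning a GPT-2 checkpoint with Hugging Face Transformers.
# All names below ("gpt2", the WikiText slice, "ds_config.json") are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "gpt2"  # swap for "gpt2-xl" or "EleutherAI/gpt-neo-2.7B" if memory allows
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize a small text corpus into truncated blocks.
raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
train_ds = raw.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="gpt2-finetuned",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    fp16=True,  # requires a GPU
    # deepspeed="ds_config.json",  # uncomment and supply a ZeRO config to offload
    #                              # the larger 1.5B/2.7B checkpoints, as in the guide
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

With the small "gpt2" checkpoint this runs as-is; the DeepSpeed config line is the knob that makes the billion-parameter variants fit on a single GPU.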
Hi, I am very interested in your project and wonder whether you are looking for contributors, and how I could contribute.