transformer-xl
Here are 16 public repositories matching this topic...

End-to-end ASR/LM implementation with PyTorch (Python, updated Dec 21, 2020)

XLNet for language generation (Python, updated Jul 27, 2019)

Custom CUDA kernel for {2, 3}D relative attention with a PyTorch wrapper (Python, updated May 5, 2020)

[ACL '20] Highway Transformer demo code (Python, updated Jun 24, 2020)

Google Colab (Jupyter) notebooks for creating and training state-of-the-art music AI models and for generating music with Transformer technology (Google XLNet/Transformer-XL) (Jupyter Notebook, updated Oct 23, 2020)

Fair quantitative comparison of NLP embeddings from GloVe to RoBERTa, with sequential Bayesian optimization fine-tuning using Flair and SentEval; extends the HyperOpt library to log_b priors (Jupyter Notebook, updated Sep 4, 2019)

2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge (Python, updated Dec 8, 2020)

A refactored version of kimiyoung/transformer-xl/tf (Python, updated Jun 22, 2020)

Music and text generation with Transformer-XL (Python, updated Sep 30, 2020)

Google Colab based on a fork of the REMI repo; try the REMI Pop Music Transformer quickly and easily (Jupyter Notebook, updated Aug 10, 2020)

Search engine based on features extracted with a pretrained Transformer-XL (Python, updated Feb 1, 2020)

MahlerNet, a fully working state-of-the-art Transformer-XL music AI implementation by Elias Lousseief (Jupyter Notebook, updated Aug 22, 2020)

Code base for Transformer-XL on Finnish-language data (Python, updated Nov 6, 2020)

Scripts for training a language model and for generating nonsense text (Jupyter Notebook, updated Nov 19, 2020)


BART is a seq2seq model, but there may be applications where one would like to use only the pretrained BartDecoder in an EncoderDecoder setting with a "long" encoder, such as
This is already p