The Wayback Machine - http://web.archive.org/web/20201029025934/https://github.com/topics/xlnet

xlnet

Here are 62 public repositories matching this topic...

transformers
LysandreJik
LysandreJik commented Oct 22, 2020

🚀 Feature request

This is a documentation request in order to make it easier to find corresponding examples in the documentation.

Good first issue if you want to get acquainted with the docs and how to build docs using Sphinx!

Current issue

Here's the issue: currently, if one goes to an older documentation version to check the "examples" page, for example, [v2.6.0](https://huggin

Natural language processing (NLP): Xiaojiang robot (retrieval-based chit-chat chatbot), BERT sentence embeddings and similarity (Sentence Similarity), XLNET sentence embeddings and similarity (text xlnet embedding), text classification (Text classification), entity extraction (ner, bert+bilstm+crf), data augmentation (text augment, data enhance), synonymous-sentence and synonym generation, sentence main-part extraction (mainpart), Chinese short-text similarity, text feature engineering, keras-http-service invocation

  • Updated Oct 26, 2020
  • Python

Chinese long-text classification, short-sentence classification, multi-label classification, and sentence-pair similarity (Chinese Text Classification of Keras NLP, multi-label classify, or sentence classify, long or short); base classes for building word/character/sentence embedding layers (embeddings) and network layers (graph); FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, Bert, Xlnet, Albert, Attention, DeepMoji, HAN, CapsuleNet, Transformer-encode, Seq2seq, SWEM, LEAM, TextGCN

  • Updated Aug 26, 2020
  • Python
EricFillion
EricFillion commented Jan 9, 2020

No other language model performs as well as HappyROBERTA Large for masked-word prediction. We should encourage users to switch to HappyROBERTA Large by displaying a logger message whenever they select a suboptimal language model.

There are still some situations where a user may want to use another model, so we will keep them available.
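A minimal sketch of the proposed warning, assuming the standard `logging` module; the function name `warn_if_suboptimal` and the canonical model string `"HAPPYROBERTA-LARGE"` are hypothetical placeholders, not names from the actual codebase:

```python
import logging

logger = logging.getLogger(__name__)

# Assumed canonical identifier for the recommended model.
RECOMMENDED_MODEL = "HAPPYROBERTA-LARGE"

def warn_if_suboptimal(model_name: str) -> bool:
    """Log a warning when a model other than the recommended one is
    chosen for masked-word prediction. Returns True if a warning
    was emitted, False otherwise."""
    if model_name.upper() != RECOMMENDED_MODEL:
        logger.warning(
            "%s may underperform for masked-word prediction; "
            "consider using %s instead.",
            model_name, RECOMMENDED_MODEL,
        )
        return True
    return False
```

The check is case-insensitive and non-blocking: other models remain fully usable, and the message only nudges users toward the recommended default.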
