natural-language-processing
Natural language processing (NLP) is a field of computer science that studies the interactions between computers and human language. In the 1950s, Alan Turing published an article proposing a measure of intelligence, now called the Turing test. More recent techniques, such as deep learning, have produced state-of-the-art results in language modeling, parsing, and many other natural-language tasks.
Here are 9,837 public repositories matching this topic...
《动手学深度学习》 (Dive into Deep Learning): aimed at Chinese readers, runnable, and open for discussion. The Chinese and English editions are used for teaching at 300 universities across 55 countries.
Updated May 16, 2022 - Python
TensorFlow code and pre-trained models for BERT
Updated May 9, 2022 - Python
Learn how to responsibly deliver value with ML.
Updated May 13, 2022 - Jupyter Notebook
Chinese word segmentation, part-of-speech tagging, named entity recognition, dependency parsing, constituency parsing, semantic dependency parsing, semantic role labeling, coreference resolution, style transfer, semantic similarity, new word discovery, keyphrase extraction, automatic summarization, text classification and clustering, pinyin and simplified/traditional Chinese conversion — natural language processing.
Updated May 8, 2022 - Python
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Updated May 16, 2022 - Python
Oxford Deep NLP 2017 course
Updated Jun 12, 2017
Change tensor.data to tensor.detach() due to pytorch/pytorch#6990 (comment): tensor.detach() is more robust than tensor.data.
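A sketch of the difference (assuming a recent PyTorch; this example is not part of the original change): in-place edits through `.detach()` are visible to autograd's version counter and fail loudly, while edits through `.data` bypass tracking and silently corrupt gradients.

```python
import torch

# exp() saves its output for the backward pass, so in-place edits to that
# output matter for gradient correctness.
a = torch.ones(3, requires_grad=True)
b = a.exp()

# Editing through .detach(): the view shares the version counter, so the
# later backward() detects the modification and raises an error instead of
# using stale values.
b.detach().zero_()
try:
    b.sum().backward()
    detected = False
except RuntimeError:
    detected = True

# Editing through .data: the change bypasses version tracking, backward()
# succeeds, and the gradient is silently wrong (zeros instead of exp(1)).
a2 = torch.ones(3, requires_grad=True)
b2 = a2.exp()
b2.data.zero_()
b2.sum().backward()
silent_wrong = torch.equal(a2.grad, torch.zeros(3))
```

Here the `.detach()` path surfaces the corruption as a `RuntimeError`, while the `.data` path completes with an incorrect gradient.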
Updated May 17, 2022 - Python
Although the results look good in all TensorFlow plots and are consistent across frameworks, there is a small difference (more of a consistency issue): the training loss/accuracy plots appear to be sampled at fewer points, so they look straighter, smoother, and less wiggly than the PyTorch or MXNet plots.
This can be seen clearly in chapter 6 ([CNN Lenet](ht
Describe the bug
Streaming Datasets can't be pickled, so any interaction between them and multiprocessing results in a crash.
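The underlying failure mode can be illustrated with the standard library alone (an illustrative stand-in, not the datasets implementation): a streaming dataset holds a live generator, generators cannot be pickled, and multiprocessing pickles every object it sends to worker processes.

```python
import pickle

# A streaming dataset keeps a live generator over shards/records.
# The pickle protocol has no way to serialize generator state, and
# multiprocessing pickles every object it hands to worker processes.
stream = (record for record in [{"text": "a"}, {"text": "b"}])

try:
    pickle.dumps(stream)       # what multiprocessing does internally
    picklable = True
except TypeError:              # "cannot pickle 'generator' object"
    picklable = False
```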
Steps to reproduce the bug
import transformers
from transformers import Trainer, AutoModelForCausalLM, TrainingArguments
import datasets
ds = datasets.load_dataset('oscar', "unshuffled_deduplicated_en", split='train', streaming=True).with_format("

In gensim/models/fasttext.py:
model = FastText(
    vector_size=m.dim,
    window=m.ws,
    epochs=m.epoch,
    negative=m.neg,
    # FIXME: these next 2 lines read in unsupported FB FT modes (loss=3 softmax or loss=4 onevsall,
    # or model=3 supervi

A very simple framework for state-of-the-art Natural Language Processing (NLP)
Updated May 17, 2022 - Python
Is your feature request related to a problem? Please describe.
I typically use compressed datasets (e.g. gzipped) to save disk space. This works fine with AllenNLP during training because I can write my dataset reader to load the compressed data. However, the predict command opens the file and reads lines for the Predictor, which fails when it tries to load data from my compressed files.
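One common workaround (a stdlib sketch, not AllenNLP's API) is to open files through a helper that dispatches on the extension, so both plain and gzipped inputs can be read line by line:

```python
import gzip
import os
import tempfile

def open_maybe_compressed(path):
    """Open `path` for text reading, transparently handling .gz files."""
    if path.endswith(".gz"):
        return gzip.open(path, "rt", encoding="utf-8")
    return open(path, "r", encoding="utf-8")

# Write a gzipped file and read it back line by line.
tmp = os.path.join(tempfile.mkdtemp(), "data.jsonl.gz")
with gzip.open(tmp, "wt", encoding="utf-8") as f:
    f.write('{"text": "hello"}\n{"text": "world"}\n')

with open_maybe_compressed(tmp) as f:
    lines = [line.strip() for line in f]
```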
Checking the Python files in NLTK with "python -m doctest" reveals that many tests are failing. In many cases, the failures are just cosmetic discrepancies between the expected and the actual output, such as missing a blank line, or unescaped linebreaks. Other cases may be real bugs.
If these failures could be avoided, it would become possible to improve CI by running "python -m doctest" each t
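doctest can also be driven from code rather than via "python -m doctest", which makes it easy to collect pass/fail counts programmatically (a stdlib sketch with a hypothetical function):

```python
import doctest

def add(a, b):
    """Add two values.

    >>> add(2, 3)
    5
    >>> add('a', 'b')
    'ab'
    """
    return a + b

# Equivalent to what `python -m doctest` does for a module: find every
# example in the docstrings, run it, and compare against expected output.
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner(verbose=False)
for test in finder.find(add):
    runner.run(test)
failed, attempted = runner.summarize(verbose=False)
```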
Natural Language Processing Tutorial for Deep Learning Researchers
Updated Jul 25, 2021 - Jupyter Notebook
This repository contains code examples for the Stanford course: TensorFlow for Deep Learning Research.
Updated Dec 22, 2020 - Python
Drench yourself in Deep Learning, Reinforcement Learning, Machine Learning, Computer Vision, and NLP by learning from these exciting lectures!!
Updated Apr 10, 2022 - HTML
Stanford CoreNLP: A Java suite of core NLP tools.
Updated May 6, 2022 - Java
Data-centric declarative deep learning framework
Updated May 17, 2022 - Python
Web mining module for Python, with tools for scraping, natural language processing, machine learning, network analysis and visualization.
Updated Jul 9, 2021 - Python
A collection of machine learning examples and tutorials.
Updated Mar 24, 2022 - Python
Style and Grammar Checker for 25+ Languages
Updated May 17, 2022 - Java
Some ideas for figures to add to the PPT
- Linear regression, single-layer neural network
- Multilayer Perceptron with hidden layer
- Backpropagation
- Batch Normalization and alternatives
- Computational Graphs
- Dropout
- CNN - padding, stride, pooling,...
- LeNet
- AlexNet
- VGG
- GoogLeNet
- ResNet
- DenseNet
- Memory Net
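The first item above, linear regression as a single-layer network, can be sketched without any framework (plain Python, illustrative only):

```python
# Linear regression = a single-layer network: y_hat = w*x + b,
# trained by gradient descent on mean squared error.
def train_linear(xs, ys, lr=0.05, epochs=500):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from y = 3x + 1; training should recover w ~ 3, b ~ 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 4.0, 7.0, 10.0]
w, b = train_linear(xs, ys)
```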
Pre-trained and Reproduced Deep Learning Models (the official PaddlePaddle model zoo, containing deep learning models validated in academic research and industrial scenarios)
Updated May 13, 2022 - Python
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Updated Apr 3, 2022 - Python
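The core of that model, scaled dot-product attention softmax(QK^T / sqrt(d)) V, can be sketched in plain Python (illustrative only, not the linked repository's code):

```python
import math

def softmax(row):
    m = max(row)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(K[0])
    # Similarity score between every query and every key.
    scores = [[sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
              for q in Q]
    weights = [softmax(row) for row in scores]   # each row sums to 1
    # Each output row is a weighted average of the value vectors.
    return [[sum(w * v[j] for w, v in zip(row, V)) for j in range(len(V[0]))]
            for row in weights]

# The query matches the first key more closely, so the output leans
# toward the first value vector.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(Q, K, V)
```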
Official Stanford NLP Python Library for Many Human Languages
Updated May 17, 2022 - Python
Created by Alan Turing
- Wikipedia
Model description
Swin Transformer V2: Scaling Up Capacity and Resolution
repo origin: Swin Transformer V2
repo timm: Swin Transformer V2
All the pretrained models are ready.
Open sourc