Popular repositories
- Forked from fastai/fastai: The fast.ai deep learning library, lessons, and tutorials
3,389 contributions in the last year
Contribution activity
August 2020
- sgugger/test_nbs (Jupyter Notebook)
Created a pull request in huggingface/transformers that received 6 comments
Tf model outputs
This PR continues the work on model outputs and updates all TF models (except T5) to use them. Since the new output type is opt-in only (default to r…
+3,267 −2,444 • 6 comments
- Last fix for Ray HP search
- Add tokenizer to Trainer
- Update repo to isort v5
- Don't reset the dataset type + plug for rm unused columns
- Move threshold up for flaky test with Electra
- Add tests to Trainer
- Fix #6575
- Add hyperparameter search to Trainer
- Generation doc
- Trainer automatically drops unused columns in nlp datasets
- Adding PaddingDataCollator
- Fix #6428
- Activate check on the CI
- Move prediction_loss_only to TrainingArguments
- Fixes to make life easier with the nlp library
- Data collator with padding
- Fix links for open in colab
- Colab button
- Small docfile fixes
- Add a script to check all models are tested and documented
- Add DPR to models summary
- Dataset and DataCollator for BERT Next Sentence Prediction (NSP) task
- Fixed DataCollatorForLanguageModeling not accepting lists of lists
- Update repo to isort v5
- Add bibtex for new paper
- [Doc model summary] add MBart model summary
- [Docs model summaries] Add pegasus to docs
- Regression test for pegasus bugfix
- [Tests] fix attention masks in Tests
- TFTrainer dataset doc & fix evaluation bug
- Dataset and DataCollator for BERT Next Sentence Prediction (NSP) task.
- Add tests to Trainer
- [BartTokenizerFast] add prepare_seq2seq_batch
- [Tests common] Fix flaky test
- tf generation utils: remove unused kwargs
- Feed forward chunking others
- [EncoderDecoder] Add functionality to tie encoder decoder weights
- [examples/text-classification] update xnli-mt url
- Add hyperparameter search to Trainer
- fix incorrect codecov reports
- Fixed label datatype for STS-B
- [sched] polynomial_decay_schedule use default power=1.0
- Fix flaky ONNX tests
- skip onnx test until morgan comes back
- add BartConfig.force_bos_token_to_be_generated
- Some pull request reviews not shown.
Created an issue in huggingface/transformers that received 2 comments
Reformer now requires PyTorch 1.6.0
@patrickvonplaten, one of your recent PRs (#6244) on Reformer introduces a dependency on PyTorch 1.6.0 minimum by using torch.cuda.default_generators. We s…
2 comments

