Repositories

- text-to-text-transfer-transformer: Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
- dex-lang: Research language for array processing in the Haskell/ML family
- open-covid-19-data: Open-source aggregation pipeline for public COVID-19 data, including hospitalization/ICU/ventilator numbers for many countries
- bert: TensorFlow code and pre-trained models for BERT
- computation-thru-dynamics: Understanding computation in artificial and biological recurrent networks through the lens of dynamical systems
- tiny-differentiable-simulator: Tiny Differentiable Simulator, a header-only C++ physics library with zero dependencies
- fast-soft-sort: Fast differentiable sorting and ranking
- tapas: End-to-end neural table-text understanding models
- albert: ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- tensor2robot: Distributed machine learning infrastructure for large-scale robotics research
- arxiv-latex-cleaner: arXiv LaTeX Cleaner: easily clean the LaTeX code of your paper for submission to arXiv
- batch_rl: Offline reinforcement learning (a.k.a. batch reinforcement learning) on Atari 2600 games
- simclr: SimCLR, a simple framework for contrastive learning of visual representations
- seed_rl: SEED RL: Scalable and Efficient Deep-RL with Accelerated Central Inference; implements the IMPALA and R2D2 algorithms in TF2 with SEED's architecture
- xtreme: XTREME, a benchmark for evaluating the cross-lingual generalization ability of pre-trained multilingual models, covering 40 typologically diverse languages and nine tasks
- morph-net: Fast and simple resource-constrained learning of deep network structure
- language: Shared repository for open-sourced projects from the Google AI Language team
- big_transfer: Official repository for the paper "Big Transfer (BiT): General Visual Representation Learning"
- electra: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
- neural-structural-optimization: Neural reparameterization improves structural optimization
- bleurt: BLEURT, a metric for natural language generation based on transfer learning

