Trax — Deep Learning with Clear Code and Speed
Updated Sep 9, 2020 - Python
A Python toolbox to create adversarial examples that fool neural networks in PyTorch, TensorFlow, and JAX
Fast and Easy Infinite Neural Networks in Python
Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper.
JAX-based neural network library
PyTorch, TensorFlow, JAX and NumPy — all of them natively using the same code
Concise deep learning with JAX
Fast Differentiable Sorting and Ranking
Right now we use "twice_nll" as the fit objective, and the test statistic is a simple difference:
twice_nll_constrfit - twice_nll_globalfit
Instead, we should just do an NLL fit and compute the test statistic as
2 * (nll_constrfit - nll_globalfit)
This will require updating some reference numbers in the tests.
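The proposed change is purely algebraic: the factor of 2 moves from the fit objective into the test statistic, so the resulting value is unchanged. A minimal sketch with made-up NLL values (the names `nll_constrfit` and `nll_globalfit` follow the issue text; the numbers are illustrative only):

```python
import math

# Hypothetical NLL values for the constrained and global fits
# (illustrative numbers; real values come from the actual fits).
nll_constrfit = 12.75
nll_globalfit = 10.25

# Current formulation: the fit minimizes twice the NLL, and the
# test statistic is a plain difference of the two objectives.
twice_nll_constrfit = 2 * nll_constrfit
twice_nll_globalfit = 2 * nll_globalfit
q_current = twice_nll_constrfit - twice_nll_globalfit

# Proposed formulation: the fit minimizes the NLL itself, and the
# factor of 2 moves into the test statistic.
q_proposed = 2 * (nll_constrfit - nll_globalfit)

# The two formulations are algebraically identical.
assert math.isclose(q_current, q_proposed)
```

Because the test statistic itself is unchanged, only reference numbers tied to the raw fit objective (the NLL vs. twice the NLL) would need updating in the tests.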
A suite of benchmarks to test the sequential CPU and GPU performance of most popular high-performance libraries for Python.
Code for the paper "Learning Differential Equations that are Easy to Solve"
PyTorch and JAX code for the Madam optimiser.
Description
Update the README and add experiments.
Solution
Create an experiments or examples package to make this work.
Differentiable interface to FEniCS for JAX using dolfin-adjoint/pyadjoint
Collection of useful omnifocus applescripts
Differentiable interface to FEniCS for JAX
JAX implementations of core Deep RL algorithms
Google AI Princeton control framework
A JAX Implementation of the Twin Delayed DDPG Algorithm
JAX implementation of Graph Attention Networks
Graph Convolutional Networks in JAX
A Python 3 toolbox for neural receptive field estimation using splines and Gaussian priors.
small experiments with agents learning atari games, implemented in jax/numpy
minimal C-interpreter to play with. for learning purpose
Samplers from the paper "Stochastic Gradient MCMC with Repulsive Forces"
Since numpyro supports enumerating discrete latent variables, imputing missing values for discrete covariates should be possible (which would make numpyro suitable for many more applied projects!).
Since array shapes are altered when using parallel enumeration, it is not directly evident how to adapt the continuous-imputation example to discrete covariates; an example would be helpful.
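Conceptually, enumerating a missing discrete covariate means summing it out of the likelihood rather than sampling it. A minimal NumPy-only sketch of that marginalization — this is not numpyro API, and the model, prior, and numbers are all hypothetical:

```python
import numpy as np

def log_lik_y_given_x(y, x):
    # Hypothetical likelihood: y ~ Normal(mu[x], 1) with a per-class mean.
    mu = np.array([0.0, 2.0])[x]
    return -0.5 * (y - mu) ** 2 - 0.5 * np.log(2 * np.pi)

p_x = np.array([0.6, 0.4])  # assumed prior over the missing binary covariate x
y_obs = 1.3                 # observed response for the row where x is missing

# Enumerate x over its support {0, 1}. This adds an extra axis over the
# support, which is why array shapes change under parallel enumeration.
x_support = np.array([0, 1])
log_joint = np.log(p_x) + log_lik_y_given_x(y_obs, x_support)

# log p(y) = logsumexp_x [ log p(x) + log p(y | x) ]  (stable logsumexp)
m = log_joint.max()
log_marginal = m + np.log(np.exp(log_joint - m).sum())

# Posterior over the missing value, p(x | y): the imputation distribution.
posterior_x = np.exp(log_joint - log_marginal)
```

In numpyro the enumeration itself is handled by the inference machinery; the sketch only illustrates the marginalization that the requested discrete-imputation example would need to express.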