jax
Here are 358 public repositories matching this topic...
scipy.stats.mode
It would be great if you could add the JAX equivalent of scipy.stats.mode, which is currently unavailable.
A use case would be an ML classification task with ensembles, where multiple models give different predictions and we are interested in finding the most common one.
As an example, consider predictions to be a two-dimensional DeviceArray with shape = (number of model
Updated Jul 7, 2022
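A mode over ensemble predictions can be sketched in plain JAX, assuming the predictions are non-negative integer class labels and the number of classes is known in advance (the names `mode` and `num_classes` are illustrative, not a proposed API):

```python
import jax
import jax.numpy as jnp

def mode(predictions, num_classes):
    """Most frequent label per column of an (n_models, n_samples) int array."""
    # Count label occurrences for each sample across the model axis,
    # then take the argmax of the counts.
    counts = jax.vmap(lambda col: jnp.bincount(col, length=num_classes),
                      in_axes=1)(predictions)
    return jnp.argmax(counts, axis=1)
```

`jnp.bincount` is given a static `length` so the output shape is fixed and the function stays jit-compatible.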
Bidirectional RNN
Is there currently a way to train a bidirectional RNN (such as an LSTM or GRU) in trax?
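Independent of trax's own layers, the bidirectional idea can be sketched in plain JAX with two `jax.lax.scan` passes, one of them reversed. A toy tanh cell stands in for an LSTM/GRU here; all names are illustrative:

```python
import jax
import jax.numpy as jnp

def rnn_cell(params, h, x):
    # Toy tanh RNN cell; a stand-in for an LSTM/GRU cell.
    W, U, b = params
    h_new = jnp.tanh(x @ W + h @ U + b)
    return h_new, h_new  # (new carry, per-step output)

def bidirectional_rnn(params_fwd, params_bwd, xs, h0):
    # One scan forward and one backward over the (time, features) sequence;
    # reverse=True makes the second scan consume xs back-to-front while
    # keeping its stacked outputs aligned with the original time order.
    _, hs_fwd = jax.lax.scan(lambda h, x: rnn_cell(params_fwd, h, x), h0, xs)
    _, hs_bwd = jax.lax.scan(lambda h, x: rnn_cell(params_bwd, h, x), h0, xs,
                             reverse=True)
    return jnp.concatenate([hs_fwd, hs_bwd], axis=-1)
```

Each time step then carries both a forward and a backward hidden state, which is the essential structure of a bidirectional RNN regardless of the cell used.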
Deep learning operations reinvented (for pytorch, tensorflow, jax and others)
Updated Jul 5, 2022 - Python
Python code for "Probabilistic Machine learning" book by Kevin Murphy
Updated Jul 8, 2022 - Jupyter Notebook
Display Issues
TFDS is a collection of datasets ready to use with TensorFlow, Jax, ...
Updated Jul 8, 2022 - Python
Flax is a neural network library for JAX that is designed for flexibility.
Updated Jul 8, 2022 - Python
The Unified Machine Learning Framework
Updated Jul 8, 2022 - Python
A Python toolbox to create adversarial examples that fool neural networks in PyTorch, TensorFlow, and JAX
Updated May 30, 2022 - Python
JAX-based neural network library
Updated Jul 8, 2022 - Python
Fast and Easy Infinite Neural Networks in Python
Updated Jul 2, 2022 - Jupyter Notebook
Right now, qml.operation.expand_matrix is often called in a code block like:

    if wire_order is None or self.wires == Wires(wire_order):
        return canonical_matrix
    return expand_matrix(canonical_matrix, wires=self.wires, wire_order=wire_order)
see [pennylane/operation.py Line 587](https://github.com/PennyLaneAI/pennylane/blob/b6fc5380abea6215661704ebe2f5cb8e7a599635/pennylane/operation.p
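For intuition, the wire expansion that expand_matrix performs reduces, in the simple case of appending one wire at the end of the wire order, to a Kronecker product with the identity. This is a plain jax.numpy sketch of that special case, not PennyLane's implementation:

```python
import jax.numpy as jnp

# Single-qubit Pauli-X acting on wire 0.
X = jnp.array([[0., 1.], [1., 0.]])
I = jnp.eye(2)

# Expand X on wire 0 to a two-wire operator with wire order [0, 1]:
# the identity acts on the appended wire.
expanded = jnp.kron(X, I)
```

General wire orders additionally require permuting the tensor axes of the operator, which is the part expand_matrix has to handle beyond this sketch.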
Hi,
I am trying to use random_flax_module on a class that uses flax.linen.BatchNorm, which relies on mutable state. Is there an example of how to do that? Here is my code:
The model:
    class NSBlock(nn.Module):
        train: bool
        dim: int
        ks_1: int = 3
        ks_2: int = 3
        dl_1: int = 1
        dl_2: int = 1
        mp_ks: int = 3
        mp_st: int = 1

        @nn.compact
        def
Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper.
Updated Jun 22, 2022 - Python
Dear Brax team,
Since Brax is fully differentiable, I thought it would be possible to use it like DiffTaichi or GradSim for system identification (e.g. determining the mass of an object from a trajectory and known forces), but I couldn't find any example of this.
Do you happen to have a demo or any tips?
Off the top of my head, I would do something like this:
Let's say the task is to es
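As a minimal illustration of the gradient-based system-identification idea (plain JAX, not Brax's API): recover a point mass from an observed trajectory by differentiating through the rollout. The dynamics and all names here are a toy example:

```python
import jax
import jax.numpy as jnp

def simulate(mass, force=1.0, dt=0.1, steps=20):
    # Semi-implicit Euler rollout of a 1-D point mass under constant force.
    def step(state, _):
        pos, vel = state
        vel = vel + (force / mass) * dt
        pos = pos + vel * dt
        return (pos, vel), pos
    _, trajectory = jax.lax.scan(step, (0.0, 0.0), None, length=steps)
    return trajectory

# "Observed" trajectory generated with the true (unknown-to-us) mass.
true_mass = 2.0
observed = simulate(true_mass)

def loss(mass):
    # Squared error between the simulated and observed trajectories.
    return jnp.mean((simulate(mass) - observed) ** 2)

# Plain gradient descent on the mass estimate, through the simulator.
mass_estimate = 1.0
for _ in range(300):
    mass_estimate = mass_estimate - 0.5 * jax.grad(loss)(mass_estimate)
```

With Brax the rollout would come from its physics engine instead of the toy `step` above, but the structure (loss on trajectories, `jax.grad` through the simulator) is the same.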
TensorLy: Tensor Learning in Python.
Updated Jul 8, 2022 - Python
Scenic: A Jax Library for Computer Vision Research and Beyond
Updated Jul 8, 2022 - Python
A library for scientific machine learning and physics-informed learning
Updated Jul 9, 2022 - Python
A Graph Neural Network Library in Jax
Updated Jul 3, 2022 - Python
Callable PyTrees and filtered transforms => neural networks in JAX. https://docs.kidger.site/equinox/
Updated Jul 9, 2022 - Python
JAX - A curated list of resources https://github.com/google/jax
Updated Jun 28, 2022
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2021/Spring 2022
Updated Jul 8, 2022 - Jupyter Notebook
The goal of this library is to generate more helpful exception messages for matrix algebra expressions for numpy, pytorch, jax, tensorflow, keras, fastai.
Updated Apr 7, 2022 - Jupyter Notebook
PyTorch, TensorFlow, JAX and NumPy — all of them natively using the same code
Updated Jun 22, 2022 - Python
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/
Updated Jul 8, 2022 - Python
Official code for Score-Based Generative Modeling through Stochastic Differential Equations (ICLR 2021, Oral)
Updated Aug 26, 2021 - Jupyter Notebook