optimizer
Here are 306 public repositories matching this topic...
Linux System Optimizer and Monitoring - https://oguzhaninan.github.io/Stacer-Web (Updated Jun 13, 2021, C++)
On the Variance of the Adaptive Learning Rate and Beyond (Updated Feb 9, 2021, Python)
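The entry above is the RAdam (Rectified Adam) repository. As a minimal sketch of the idea, not the repository's code (the function name and defaults here are illustrative), the variance rectification for a single scalar parameter can be written in plain Python: while the approximated simple-moving-average length rho_t is small, the adaptive term is considered unreliable and the update falls back to plain momentum.

```python
import math

def radam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update for a single scalar parameter (illustrative sketch).

    When the approximated SMA length rho_t is small (early steps), the
    adaptive learning rate has high variance, so the update falls back
    to an un-adapted momentum step; otherwise a rectification term r
    scales the Adam-style update.
    """
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    rho_inf = 2 / (1 - beta2) - 1
    rho_t = rho_inf - 2 * t * beta2 ** t / (1 - beta2 ** t)
    if rho_t > 4:  # variance is tractable: apply the rectified adaptive step
        v_hat = math.sqrt(v / (1 - beta2 ** t))
        r = math.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf)
                      / ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        w = w - lr * r * m_hat / (v_hat + eps)
    else:          # too few effective samples: plain momentum update
        w = w - lr * m_hat
    return w, m, v
```

Running this on a simple quadratic f(w) = w^2 (so grad = 2w) shows the early steps taking the momentum branch before the rectified adaptive branch kicks in.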
An implementation of React v15.x that optimizes for small script size (Updated Mar 29, 2019, JavaScript)
GLSL optimizer based on Mesa's GLSL compiler; formerly used in Unity for mobile shader optimization. (Updated May 14, 2020, C++)
Merged into Gifsicle! (Updated May 27, 2019, C)
Virtual-machine Translation Intermediate Language (Updated May 29, 2021, C++)
WOFF format support
SAM: Sharpness-Aware Minimization (PyTorch) (Updated Jul 20, 2021, Python)
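SAM takes two gradient evaluations per step: it first climbs to the worst-case point within a small rho-ball around the weights, then applies the gradient computed *there* to the original weights, steering optimization toward flat minima. A minimal scalar sketch (not the repository's PyTorch implementation; `sam_step` and its defaults are hypothetical):

```python
def sam_step(w, grad_fn, lr=0.1, rho=0.05):
    """One Sharpness-Aware Minimization step for a scalar parameter.

    grad_fn(w) returns the loss gradient at w. The ascent step moves to
    the (approximate) worst case within an rho-ball; the descent step
    uses the gradient taken at that perturbed point.
    """
    g = grad_fn(w)
    e = rho * g / abs(g) if g != 0 else 0.0  # ascent to local worst case
    g_sharp = grad_fn(w + e)                 # gradient at perturbed point
    return w - lr * g_sharp                  # sharpness-aware descent
```

On f(w) = w^2 the iterates settle into a small neighborhood of the minimum whose size is governed by rho, which illustrates why SAM is paired with a final plain-SGD phase or small rho in practice.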
Scour - An SVG Optimizer / Cleaner (Updated Jul 19, 2021, Python)
Heimer is a simple cross-platform mind-map, diagram, and note-taking tool built with Qt. (Updated Jul 9, 2021, C++)
AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021) (Updated Jan 13, 2021, Python)
A tool to automate and optimize DraftKings and FanDuel lineup construction. (Updated Jul 11, 2021, Python)
Explores energy-efficient dataflow scheduling for neural networks. (Updated Aug 24, 2020, Python)
Easily optimize images using WP-CLI (Updated Jun 4, 2021, PHP)
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning (Updated May 22, 2021, Python)
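AdaHessian's second-order ingredient is an estimate of the Hessian diagonal obtained with Hutchinson's method: for Rademacher vectors z, E[z * (Hz)] equals diag(H), so only Hessian-vector products are needed. A self-contained sketch of that estimator (not the repository's code; `hutchinson_diag` and its signature are hypothetical):

```python
import random

def hutchinson_diag(hvp, dim, n_samples=50, seed=0):
    """Estimate the diagonal of a Hessian via Hutchinson's method.

    hvp(z) must return the Hessian-vector product H @ z as a list.
    For Rademacher z (entries +/-1), E[z * (H z)] = diag(H), so
    averaging elementwise products over samples estimates the diagonal.
    """
    rng = random.Random(seed)
    est = [0.0] * dim
    for _ in range(n_samples):
        z = [rng.choice((-1.0, 1.0)) for _ in range(dim)]
        hz = hvp(z)
        est = [e + zi * hzi / n_samples for e, zi, hzi in zip(est, z, hz)]
    return est
```

For a diagonal Hessian the estimate is exact for every sample (z_i^2 = 1), which makes the mechanism easy to verify by hand.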
An Artifact optimizer for Genshin Impact. (Updated Jul 24, 2021, TypeScript)
Implements the AdamW optimizer (https://arxiv.org/abs/1711.05101), a cosine learning-rate scheduler, and "Cyclical Learning Rates for Training Neural Networks" (https://arxiv.org/abs/1506.01186) for the PyTorch framework. (Updated Jul 14, 2019, Python)
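AdamW's distinguishing feature is decoupled weight decay: the decay is applied directly to the weights rather than folded into the gradient as L2 regularization, so it is not rescaled by Adam's adaptive denominator. A minimal scalar sketch under that definition (illustrative only; `adamw_step` and its defaults are not the repository's API):

```python
import math

def adamw_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01):
    """One AdamW update for a single scalar parameter.

    The weight_decay * w term is added outside the adaptive
    normalization, which is exactly what 'decoupled' means here.
    """
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)          # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)          # bias-corrected second moment
    w = w - lr * (m_hat / (math.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v
```

With a zero gradient the update reduces to pure decay, w -> w - lr * weight_decay * w, which makes the decoupling easy to check in isolation.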
LAMB Optimizer for Large Batch Training (TensorFlow version) (Updated Jan 17, 2020, Python)
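LAMB stabilizes very large-batch training by scaling each layer's Adam-style update by a trust ratio ||w|| / ||update||, keeping the effective step proportional to the layer's weight norm. A small pure-Python sketch for one vector parameter (not the TensorFlow repository's code; `lamb_update` and its defaults are hypothetical):

```python
import math

def lamb_update(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999,
                eps=1e-6, weight_decay=0.0):
    """One LAMB update for a vector parameter given as a Python list.

    Computes the bias-corrected Adam direction u, then rescales it by
    the layer-wise trust ratio ||w|| / ||u|| before stepping.
    """
    m = [beta1 * mi + (1 - beta1) * gi for mi, gi in zip(m, grad)]
    v = [beta2 * vi + (1 - beta2) * gi * gi for vi, gi in zip(v, grad)]
    m_hat = [mi / (1 - beta1 ** t) for mi in m]
    v_hat = [vi / (1 - beta2 ** t) for vi in v]
    u = [mh / (math.sqrt(vh) + eps) + weight_decay * wi
         for mh, vh, wi in zip(m_hat, v_hat, w)]
    w_norm = math.sqrt(sum(x * x for x in w))
    u_norm = math.sqrt(sum(x * x for x in u))
    trust = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    return [wi - lr * trust * ui for wi, ui in zip(w, u)], m, v
```

Because the update is renormalized to the weight norm, the step size along the Adam direction is lr * ||w|| regardless of how large the raw Adam update is.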
Videos of deep learning optimizers moving on 3D problem landscapes (Updated Apr 12, 2018, Jupyter Notebook)
ADAS (Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the derivative, fine-tunes the step size itself, aiming to make step-size scheduling unnecessary; it reports state-of-the-art training performance. (Updated Jan 19, 2021, C++)

