mxnet
Here are 561 public repositories matching this topic...
Visualizer for neural network, deep learning, and machine learning models
Updated Oct 14, 2020 - JavaScript
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
Updated Oct 14, 2020 - Python
ncnn is a high-performance neural network inference framework optimized for the mobile platform
Updated Oct 14, 2020 - C++
Bug Report
These tests were run on s390x, which is a big-endian architecture.
Failure log for helper_test.py
________________________________________________ TestHelperTensorFunctions.test_make_tensor ________________________________________________
self = <helper_test.TestHelperTensorFunctions testMethod=test_make_tensor>
def test_make_tensor(self): # type: () -> None
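A failure like this on s390x is consistent with serialization code that packs tensor data in native byte order, which differs between x86 (little-endian) and s390x (big-endian). A minimal sketch of an endian-safe round trip, using hypothetical helper names and Python's struct module:

```python
import struct

def float32_to_le_bytes(values):
    """Serialize float32 values as little-endian bytes regardless of host order."""
    # '<' forces little-endian; a bare 'f' format would use native order
    # and produce different bytes on s390x than on x86.
    return struct.pack('<%df' % len(values), *values)

def le_bytes_to_float32(raw):
    """Parse little-endian float32 bytes back into floats on any host."""
    return list(struct.unpack('<%df' % (len(raw) // 4), raw))
```

Forcing '<' in the format string makes the byte stream identical on both architectures, so a serialized tensor written on x86 also parses correctly on s390x.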
Interactive deep learning book with code, math, and discussions. Available in multiple frameworks.
Updated Oct 14, 2020 - Python
Face Analysis Project on MXNet
Updated Oct 14, 2020 - Python
Set up deep learning environment in a single command line.
Updated Aug 29, 2020 - Python
MMdnn is a set of tools to help users interoperate among different deep learning frameworks, e.g. for model conversion and visualization. Convert models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch, ONNX, and CoreML.
Updated Aug 14, 2020 - Python
Hi, thanks for the great code!
I wonder whether you have plans to support resuming from checkpoints for classification. Training on ImageNet takes a very long time and the process can be interrupted, but I haven't noticed any code related to "resume" in scripts/classification/train_imagenet.py.
Maybe @hetong007? Thanks in advance.
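For reference, the usual resume pattern is framework-agnostic: periodically persist the epoch counter plus model and optimizer state, and on startup reload them if a checkpoint exists. A minimal sketch with hypothetical helper names (a real MXNet script would save parameters via the framework's own save/load APIs rather than pickle):

```python
import os
import pickle

def save_checkpoint(path, epoch, state):
    """Persist training progress so an interrupted run can continue later."""
    with open(path, 'wb') as f:
        pickle.dump({'epoch': epoch, 'state': state}, f)

def load_checkpoint(path):
    """Return (start_epoch, state); a fresh run starts at epoch 0 with empty state."""
    if not os.path.exists(path):
        return 0, {}
    with open(path, 'rb') as f:
        ckpt = pickle.load(f)
    # Resume at the epoch *after* the last completed one.
    return ckpt['epoch'] + 1, ckpt['state']
```

The training loop then starts iterating from start_epoch instead of 0, and calls save_checkpoint at the end of each epoch.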
Introductory deep learning tutorials and excellent articles (Deep Learning Tutorial)
Updated Oct 12, 2020 - Jupyter Notebook
This project reproduces the book Dive Into Deep Learning (www.d2l.ai), adapting the code from MXNet into PyTorch.
Updated Sep 23, 2020 - Jupyter Notebook
Conversion of deep learning models between different deep learning frameworks.
Updated Oct 4, 2020
A Simple and Versatile Framework for Object Detection and Instance Recognition
Updated Sep 17, 2020 - Python
Resuming training
How do I resume training for text classification?
I have the same hardware environment and the same network, but I could not reproduce your result; I get almost half of it. Any best practices or experience to share? Thanks very much! For BytePS with 1 instance and 8 GPUs, I see a similar test result.
Well, the Gumbel distribution is magical. Given a sequence of K logits, i.e., "\log a_1, \log a_2, ..., \log a_K", and K independent standard Gumbel random variables, i.e., "g_1, g_2, ..., g_K", we have
\argmax_i (\log a_i + g_i) ~ Categorical(a_i / \sum_j a_j)
This gives you a very simple way to sample from the categorical distribution.
Updated Oct 14, 2020 - Python
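The identity quoted above is the Gumbel-max trick, and it is easy to verify numerically. A minimal sketch in pure Python, drawing each g_i as -log(-log(U)) with U ~ Uniform(0, 1):

```python
import math
import random

def gumbel_max_sample(logits, rng=random):
    """One draw from Categorical(a_i / sum(a)), given logits log a_i."""
    # Standard Gumbel noise: g = -log(-log(U)), U ~ Uniform(0, 1).
    perturbed = [l - math.log(-math.log(rng.random())) for l in logits]
    # The argmax of the noise-perturbed logits is a categorical sample.
    return max(range(len(perturbed)), key=perturbed.__getitem__)
```

Averaging many draws for logits log(0.7), log(0.2), log(0.1) recovers empirical frequencies close to 0.7 / 0.2 / 0.1, with no softmax normalization needed.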
In this repository, I will share some useful notes and references about deploying deep learning-based models in production.
Updated Sep 29, 2020
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state-of-the-art reinforcement learning algorithms
Updated Oct 7, 2020 - Python
Sandbox for training convolutional networks for computer vision
Updated Oct 8, 2020 - Python
AI on Hadoop
Updated May 20, 2020 - Java
Machine Learning University: Accelerated Natural Language Processing Class
Updated Sep 5, 2020 - Jupyter Notebook
nGraph - open source C++ library, compiler and runtime for Deep Learning
Updated Oct 6, 2020 - C++
A library for training and deploying machine learning models on Amazon SageMaker
Updated Oct 13, 2020 - Python
Companion code for the book 《深度学习与计算机视觉》 (Deep Learning and Computer Vision)
Updated Jan 24, 2020 - Python
YOLO Model
Description
Implement a YOLO model and add it to the DJL model zoo
References
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Updated Oct 14, 2020 - Python
Description
This is a documentation bug. The parameters of the API
mxnet.test_utils.check_numeric_gradient are not consistent between the signature and the Parameters section. There is a parameter check_eps in the Parameters section, but it is not in the signature. Link to document: https://mxnet.apache.org/versions/1.6/api/python/docs/api/mxnet/test_utils/index.html#mxnet.test_utils.check_numeric_gra