Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
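The Mish paper defines the activation as f(x) = x · tanh(softplus(x)). A minimal pure-Python sketch of that formula (function names are my own, not from the repository):

```python
import math

def softplus(x):
    # Numerically stable softplus: ln(1 + e^x)
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    # Mish(x) = x * tanh(softplus(x)), as defined in the paper
    return x * math.tanh(softplus(x))
```

Like ReLU, Mish is unbounded above and approximately identity for large positive inputs, but it is smooth and allows small negative values through, which the paper argues aids regularization.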
Updated Jul 24, 2021 - Jupyter Notebook
Rethinking Image Inpainting via a Mutual Encoder Decoder with Feature Equalizations. ECCV 2020 Oral
PyTorch implementation of Sinusoidal Representation Networks (SIREN)
PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
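A SIREN layer computes y = sin(w0 · (Wx + b)) instead of applying ReLU or tanh to the affine output. A dependency-free 1-layer sketch (the function name is my own; w0 = 30 is the frequency factor recommended in the SIREN paper):

```python
import math

def sine_layer(x, weight, bias, w0=30.0):
    # SIREN layer: y_i = sin(w0 * (sum_j W_ij * x_j + b_i))
    # w0 scales the input frequencies; the paper suggests 30 for the first layer.
    return [math.sin(w0 * (sum(w * xi for w, xi in zip(row, x)) + b))
            for row, b in zip(weight, bias)]
```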
All the code files related to the deep learning course from PadhAI
AReLU: Attention-based-Rectified-Linear-Unit
Unofficial implementation of 'Implicit Neural Representations with Periodic Activation Functions'
Implementing activation functions from scratch in Tensorflow.
Intro to Deep Learning by National Research University Higher School of Economics
Image to Image Translation using Conditional GANs (Pix2Pix) implemented using Tensorflow 2.0
Reservoir computing library for .NET. Enables ESN, LSM and hybrid RNNs using analog and spiking neurons working together.
A PyTorch implementation of funnel activation https://arxiv.org/pdf/2007.11824.pdf
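Funnel activation (FReLU) replaces ReLU's max(x, 0) with max(x, T(x)), where T is a learnable spatial filter (a depthwise convolution in the paper). A 1D pure-Python sketch of that idea (names and the 1D simplification are mine, not the linked implementation):

```python
def funnel_activation_1d(x, kernel):
    # FReLU-style funnel activation (1D sketch): y_i = max(x_i, T(x)_i),
    # where T is a spatial filter (a learnable depthwise conv in the paper).
    k = len(kernel) // 2
    out = []
    for i in range(len(x)):
        t = sum(kernel[j + k] * x[i + j]
                for j in range(-k, k + 1)
                if 0 <= i + j < len(x))  # zero padding at the borders
        out.append(max(x[i], t))
    return out
```

With an all-zero kernel, T(x) = 0 and the funnel activation reduces exactly to ReLU.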
The "Activation Function Demo" is a demo for implementing activation functions with the method proposed in the paper "Design Space Exploration of Neural Network Activation Function Circuits"
Official PyTorch implementation of the paper "ProbAct: A Probabilistic Activation Function for Deep Neural Networks"
Online binary classification content control to filter and flag for inappropriate NSFW content
Korean OCR Model Design (한글 OCR 모델 설계)
Multilayer neural network framework implementation, used for classification and regression task. Can use multiple activation functions with backpropagation based on autograd library. Contains polynomial activation function for regression task.
**Deep Learning** intro; more projects to be wrapped up here
3D visualization of common activation functions
Avoiding the vanishing gradients problem by adding random noise and batch normalization
BSc Thesis at FER-2019/20 led by doc. dr. sc. Marko Čupić
Awesome papers on Neural Networks and Deep Learning
This is a Keras implementation of the paper "LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function for Neural Networks" - https://arxiv.org/abs/1901.05894
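LiSHT is defined as f(x) = x · tanh(x): a non-parametric, linearly scaled hyperbolic tangent that is symmetric and non-negative. A one-line sketch of the formula (the function name is my own):

```python
import math

def lisht(x):
    # LiSHT(x) = x * tanh(x): symmetric (f(-x) = f(x)) and non-negative,
    # approaching |x| for large |x|.
    return x * math.tanh(x)
```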
Email Systems Account Propagation & Services REST API + OpenPGP
Robustness of Deep Neural Networks using Trainable Activation Functions
Simple Tutorial to Explain the Pros and Cons of Sigmoid Activation Function
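The standard pros/cons of sigmoid come down to its derivative: σ'(x) = σ(x)(1 − σ(x)) peaks at 0.25 and vanishes for large |x|, which is the root of the vanishing-gradient problem. A small sketch illustrating this (function names are mine):

```python
import math

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative sigma'(x) = sigma(x) * (1 - sigma(x));
    # at most 0.25 (at x = 0), and nearly zero once |x| is large (saturation).
    s = sigmoid(x)
    return s * (1.0 - s)
```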
A small walk-through to show why ReLU is non-linear!
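The usual demonstration that ReLU is non-linear: a linear map must satisfy f(a + b) = f(a) + f(b), and ReLU visibly does not. A tiny sketch of that check (my own illustration, not the linked walk-through):

```python
def relu(x):
    # ReLU(x) = max(0, x): linear on each half-line, but not linear overall
    return max(0.0, x)

# Additivity fails: relu(-1 + 2) = 1, but relu(-1) + relu(2) = 0 + 2 = 2.
```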
Corruption-Robust Image Classification with a new Activation Function. Our proposed activation function is inspired by the human visual system and a classic signal-processing fix for data corruption.