ncnn is a high-performance neural network inference framework optimized for the mobile platform
Updated Nov 30, 2020 - C++
Runtime type system for IO decoding/encoding
Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
Grakn Core: The Knowledge Graph
An easy to use PyTorch to TensorRT converter
TensorFlow template application for deep learning
Is there any place with updated info on how to cross-compile for Raspbian Buster?
The current documentation is not working:
https://github.com/openvinotoolkit/openvino/blob/2020/build-instruction.md#cross-compilation-using-docker
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
Acceleration package for neural networks on multi-core CPUs
DELTA is a deep learning based natural language and speech processing platform.
Bounds check and call [] operator
Pytorch-Named-Entity-Recognition-with-BERT
'max_request_size' seems to refer to bytes, not MB.
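The distinction matters in practice: if the limit is interpreted as bytes, a value meant as megabytes is off by a factor of 1024². A minimal sketch, assuming a hypothetical `request_allowed` check (not part of any real server's API):

```python
# Hypothetical illustration of the bytes-vs-MB confusion described above.
MB = 1024 * 1024

def request_allowed(payload_len: int, max_request_size: int) -> bool:
    # max_request_size is interpreted in BYTES, not MB.
    return payload_len <= max_request_size

# Passing 10, intending "10 MB", actually rejects anything over 10 bytes:
assert not request_allowed(1_000, 10)
# The intended limit must be expressed in bytes:
assert request_allowed(1_000, 10 * MB)
```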
High-efficiency floating-point neural network inference operators for mobile, server, and Web
Hi, I am very interested in your project, and wonder whether you need contributors and how I could make my own contribution.
TensorFlow models accelerated with NVIDIA TensorRT
Embedded and mobile deep learning research resources
Shape and dimension inference (Keras-like) for PyTorch layers and neural networks
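Shape inference for convolutional layers boils down to a standard formula: for input size n, kernel size k, padding p, and stride s, the output size is floor((n + 2p - k)/s) + 1. A minimal sketch (the function name is ours, not from any particular library):

```python
def conv_output_size(n: int, k: int, p: int = 0, s: int = 1) -> int:
    """Output spatial size of a convolution: floor((n + 2p - k)/s) + 1."""
    return (n + 2 * p - k) // s + 1

# "Same" padding with a 3x3 kernel, stride 1, keeps the size:
assert conv_output_size(32, k=3, p=1, s=1) == 32
# Stride 2 with no padding roughly halves it:
assert conv_output_size(32, k=3, p=0, s=2) == 15
```

Shape-inference tools apply this per dimension through the whole network so layer sizes need not be computed by hand.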
Lua Language Server coded in Lua
TensorFlow examples in C, C++, Go and Python without bazel but with cmake and FindTensorFlow.cmake
A REST API for Caffe using Docker and Go
Package for causal inference in graphs and in the pairwise settings. Tools for graph structure recovery and dependencies are included.
LightSeq: A High Performance Inference Library for Sequence Processing and Generation
Fast implementation of BERT inference directly on NVIDIA (CUDA, CUBLAS) and Intel MKL
Train a state-of-the-art yolov3 object detector from scratch!
Hello, dear MediaPipe team.
I want to run hand-pose inference with the MediaPipe model and my own model.
I have my own TFLite models; they work on the RGB bitmap.
I am trying to query the RGB bitmap from the input frame with a data packet.
My code is