lru-cache
Here are 293 public repositories matching this topic...
Speedier server-side rendering with component caching in React 16
Updated Mar 25, 2019 - JavaScript
Simple and reliable LRU cache for C++ based on a hash map and a linked list
Updated Jul 29, 2020 - C++
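The hash-map-plus-linked-list design mentioned above is the classic O(1) LRU structure: the map gives constant-time lookup, and a doubly linked list keeps entries in recency order so both "touch" and "evict oldest" are constant time. A minimal Python sketch of the technique (class and method names are my own, not this repository's API):

```python
class _Node:
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    """Hash map for O(1) lookup + doubly linked list for O(1) recency updates."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                          # key -> _Node
        self.head = _Node()                    # sentinel: most-recent end
        self.tail = _Node()                    # sentinel: least-recent end
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key, default=None):
        node = self.map.get(key)
        if node is None:
            return default
        self._unlink(node)                     # move to most-recent position
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev               # evict least recently used
            self._unlink(lru)
            del self.map[lru.key]
        node = _Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

With capacity 2, putting `a` and `b`, reading `a`, then putting `c` evicts `b`, since the read refreshed `a`'s recency.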
The simple generic LRU memory/disk cache for Android written in Kotlin
Updated Dec 14, 2020 - Kotlin
A powerful caching library for Python, with TTL support and multiple algorithm options.
Updated Dec 19, 2020 - Python
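"TTL support" as described above typically means each entry records an expiry time and stale entries are dropped when touched. A minimal sketch of that idea under those assumptions (the class and method names are illustrative, not this library's API):

```python
import time


class TTLCache:
    """Dict-backed cache that expires entries `ttl` seconds after insertion."""

    def __init__(self, ttl):
        self.ttl = ttl
        self.store = {}  # key -> (value, expires_at)

    def put(self, key, value):
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self.store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # stale: evict lazily on read
            del self.store[key]
            return default
        return value
```

Real libraries usually combine this with a size-bounded eviction policy (LRU, LFU, etc.) so the cache cannot grow without limit between reads.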
A lightning-fast cache manager for Node with a least-recently-used policy.
Updated Feb 16, 2020 - JavaScript
Elara DB is an easy to use, lightweight key-value database that can also be used as a fast in-memory cache. Manipulate data structures in-memory, encrypt database files and export data.
Updated Jun 29, 2021 - Python
LRU, type-safe, thread-safe memory cache class in Swift
Updated Apr 29, 2021 - Swift
Eventually consistent distributed in-memory cache Go library
Updated May 1, 2019 - Go
A simple, thread-safe, fixed-size LRU cache based on the HashLRU algorithm
Updated Jun 5, 2021 - Go
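The HashLRU algorithm referenced above approximates LRU with two plain hash maps instead of a linked list: inserts and touched entries go into a "new" map, and once it reaches the size limit it becomes the "old" map and the previous old generation is discarded wholesale. A Python sketch of that scheme (names are mine; this repository is in Go):

```python
class HashLRU:
    """Approximate LRU via two maps; at most 2 * size entries are retained."""

    def __init__(self, size):
        self.size = size
        self.new = {}  # current generation: recently touched entries
        self.old = {}  # previous generation: candidates for discard

    def _promote(self, key, value):
        self.new[key] = value
        if len(self.new) >= self.size:
            self.old = self.new  # rotate: new generation becomes old
            self.new = {}        # previous old generation is dropped

    def put(self, key, value):
        self._promote(key, value)

    def get(self, key, default=None):
        if key in self.new:
            return self.new[key]
        if key in self.old:
            value = self.old.pop(key)  # resurrect into the new generation
            self._promote(key, value)
            return value
        return default
```

The trade-off: no pointer bookkeeping and O(1) operations on plain maps, at the cost of batch (rather than exact) eviction and up to twice the nominal capacity held in memory.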
Expands functools features (lru_cache) to classes: methods, classmethods, staticmethods, and even (unofficial) hybrid methods.
Updated May 27, 2021 - Python
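For context on what such libraries extend: the stock functools.lru_cache memoizes a plain function behind a bounded LRU store, and applying it directly to methods has caveats (it keys on `self`, keeping instances alive), which is the gap projects like the one above aim to fill. Standard usage of the stdlib decorator for reference:

```python
from functools import lru_cache


@lru_cache(maxsize=128)
def fib(n):
    """Memoized Fibonacci: repeated subproblems hit the cache."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)


print(fib(30))           # exponential recursion collapses to linear work
print(fib.cache_info())  # hit/miss/currsize statistics
```

`cache_info()` and `cache_clear()` are attached to the wrapped function, which makes cache behavior observable and resettable in tests.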
Open version of common Go libraries useful to many projects.
Updated Feb 9, 2021 - Go
A C++11 simulator for a variety of CDN caching policies.
Updated Dec 3, 2020 - C++
Expirable Go LRC/LRU cache without goroutines
Updated May 13, 2020 - Go
Derive the optimal cache hit ratio for Internet request traces with variable object sizes.
Updated Dec 9, 2019 - C++
LRU cache for Python. Uses Redis as the backend. Provides a dictionary-like object as well as a method decorator. pip install redis-lru
Updated Jan 23, 2020 - Python
Python's cachetools ported to D (dlang)
Updated Mar 21, 2020 - D


Currently we don't have any mechanism to limit the maximum number of clients that can be handled simultaneously.
This feature should be designed carefully. Here is a clue: https://redis.io/topics/clients#maximum-number-of-clients