If you work with Flask and have an API under heavy load, you can set up a cache for your endpoints. One of the tools for this is Flask-Cache, which can work with different cache backends like Memcached, Redis, Simple, etc. To install it:
pip install Flask-Cache
To set it up:
from flask import Flask
from flask_cache import Cache  # "flask.ext.*" imports were removed in modern Flask

app = Flask(__name__)

# Check the "Configuring Flask-Cache" section of the docs for more details
cache_config = {
    "CACHE_TYPE": "redis",
    "CACHE_REDIS_HOST": "127.0.0.1",
    "CACHE_REDIS_PORT": 6379,
    "CACHE_REDIS_DB": 3
}
cache = Cache(app, config=cache_config)
Another option is implementing a custom Redis backend:

#: the_app/custom.py
from werkzeug.contrib.cache import BaseCache

class RedisCache(BaseCache):
    def __init__(self, servers, default_timeout=500):
        # connect to the Redis servers here
        pass

def redis(app, config, args, kwargs):
    # factory function: Flask-Cache calls it to build the cache object
    args.append(app.config['REDIS_SERVERS'])
    return RedisCache(*args, **kwargs)
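For illustration, here is a self-contained, stdlib-only sketch of the interface such a backend class implements (get/set/delete with per-key timeouts). It keeps data in a dict instead of talking to Redis, and all names are hypothetical:

```python
import time

class SimpleRedisLikeCache:
    """Illustrative stand-in for a cache backend: stores values in a
    dict with per-key expiry (a real backend would talk to Redis)."""

    def __init__(self, default_timeout=500):
        self.default_timeout = default_timeout
        self._data = {}

    def set(self, key, value, timeout=None):
        # remember the value together with its expiry timestamp
        timeout = self.default_timeout if timeout is None else timeout
        self._data[key] = (value, time.time() + timeout)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.time() >= expires:
            # entry is stale: drop it and report a miss
            del self._data[key]
            return None
        return value

    def delete(self, key):
        self._data.pop(key, None)
```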
Now, to cache a function's result, use the memoize decorator:
import random

@cache.memoize(timeout=50)
def big_foo(a, b):
    return a + b + random.randrange(0, 1000)
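To see why this helps, here is a stdlib-only sketch of the idea behind such a decorator (not Flask-Cache's actual implementation, and all names are illustrative): build a key from the function name and its arguments, cache the result, and reuse it until the timeout expires.

```python
import random
import time

_store = {}  # key -> (result, expiry timestamp)

def memoize(timeout):
    def decorator(fn):
        def wrapper(*args):
            key = (fn.__name__, args)
            entry = _store.get(key)
            now = time.time()
            if entry is not None and now < entry[1]:
                return entry[0]  # cache hit: skip the computation
            result = fn(*args)
            _store[key] = (result, now + timeout)
            return result
        return wrapper
    return decorator

@memoize(timeout=50)
def big_foo(a, b):
    return a + b + random.randrange(0, 1000)
```

Within the timeout, repeated calls with the same arguments return the stored value instead of drawing a new random number.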
#python #flask #cache #redis #memcached
Those who are looking to scale their cache servers (Redis/Memcached) horizontally can use twemproxy, a fast, light-weight proxy created by Twitter:
https://github.com/twitter/twemproxy
#cache #redis #memcached #twemproxy
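A twemproxy instance is driven by a YAML config that maps a listening address to a pool of backend servers. A minimal sketch for a Redis pool (pool name and values are illustrative; see the project README for all options):

```yaml
alpha:
  listen: 127.0.0.1:22121     # clients connect here instead of Redis directly
  hash: fnv1a_64
  distribution: ketama        # consistent hashing across the pool
  redis: true                 # speak the Redis protocol (memcached otherwise)
  auto_eject_hosts: true
  server_retry_timeout: 2000
  server_failure_limit: 1
  servers:
    - 127.0.0.1:6379:1
    - 127.0.0.1:6380:1
```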