
[Experience Share] Thundering Herd Mitigation (memcached redis)

Posted 2016-12-20 09:44:18
  "Modified memcached cache backend"import timefrom threading import localfrom django.core.cache.backends.base import BaseCache, InvalidCacheBackendErrorfrom django.utils.hashcompat import sha_constructorfrom django.utils import importlibfrom django.utils.encoding import smart_strfrom django.conf import settingstry:import pylibmc as memcacheNotFoundError = memcache.NotFoundusing_pylibmc = Trueexcept ImportError:using_pylibmc = Falsetry:import memcacheNotFoundError = ValueErrorexcept ImportError:raise InvalidCacheBackendError('Memcached cache backend requires ' + 'either the "pylibmc" or "memcache" library')# Flavor is used amongst multiple apps to differentiate the "flavor" of the# environment. Examples of flavors are 'prod', 'staging', 'dev', and 'test'.FLAVOR = getattr(settings, 'FLAVOR', '')CACHE_VERSION = str(getattr(settings, 'CACHE_VERSION', 1))CACHE_BEHAVIORS = getattr(settings, 'CACHE_BEHAVIORS', {'hash': 'crc'})CACHE_KEY_MODULE = getattr(settings, 'CACHE_KEY_MODULE', 'newcache')CACHE_HERD_TIMEOUT = getattr(settings, 'CACHE_HERD_TIMEOUT', 60)class Marker(object):passMARKER = Marker()def get_key(key):"""Returns a hashed, versioned, flavored version of the string that was input."""hashed = sha_constructor(smart_str(key)).hexdigest()return ''.join((FLAVOR, '-', CACHE_VERSION, '-', hashed))key_func = importlib.import_module(CACHE_KEY_MODULE).get_keyclass CacheClass(BaseCache):def __init__(self, server, params):super(CacheClass, self).__init__(params)self._servers = server.split(';')self._use_binary = bool(params.get('binary'))self._local = local()@propertydef _cache(self):"""Implements transparent thread-safe access to a memcached client."""client = getattr(self._local, 'client', None)if client:return client# Use binary mode if it's both supported and requestedif using_pylibmc and self._use_binary:client = memcache.Client(self._servers, binary=True)else:client = memcache.Client(self._servers)# If we're using pylibmc, set the behaviors according to settingsif 
using_pylibmc:client.behaviors = CACHE_BEHAVIORSself._local.client = clientreturn clientdef _pack_value(self, value, timeout):"""Packs a value to include a marker (to indicate that it's a packedvalue), the value itself, and the value's timeout information."""herd_timeout = (timeout or self.default_timeout) + int(time.time())return (MARKER, value, herd_timeout)def _unpack_value(self, value, default=None):"""Unpacks a value and returns a tuple whose first element is the value,and whose second element is whether it needs to be herd refreshed."""try:marker, unpacked, herd_timeout = valueexcept (ValueError, TypeError):return value, Falseif not isinstance(marker, Marker):return value, Falseif herd_timeout < int(time.time()):return unpacked, Truereturn unpacked, Falsedef _get_memcache_timeout(self, timeout):"""Memcached deals with long (> 30 days) timeouts in a specialway. Call this function to obtain a safe value for your timeout."""if timeout is None:timeout = self.default_timeoutif timeout > 2592000: # 60*60*24*30, 30 days# See http://code.google.com/p/memcached/wiki/FAQ# "You can set expire times up to 30 days in the future. After that# memcached interprets it as a date, and will expire the item after# said date. 
This is a simple (but obscure) mechanic."## This means that we have to switch to absolute timestamps.timeout += int(time.time())return timeoutdef add(self, key, value, timeout=None, herd=True):# If the user chooses to use the herd mechanism, then encode some# timestamp information into the object to be persisted into memcachedif herd and timeout != 0:packed = self._pack_value(value, timeout)real_timeout = timeout + CACHE_HERD_TIMEOUTelse:packed, real_timeout = value, timeoutreturn self._cache.add(key_func(key), packed,self._get_memcache_timeout(real_timeout))def get(self, key, default=None):encoded_key = key_func(key)packed = self._cache.get(encoded_key)if packed is None:return defaultval, refresh = self._unpack_value(packed)# If the cache has expired according to the embedded timeout, then# shove it back into the cache for a while, but act as if it was a# cache miss.if refresh:self._cache.set(encoded_key, val,self._get_memcache_timeout(CACHE_HERD_TIMEOUT))return defaultreturn valdef set(self, key, value, timeout=None, herd=True):# If the user chooses to use the herd mechanism, then encode some# timestamp information into the object to be persisted into memcachedif herd and timeout != 0:packed = self._pack_value(value, timeout)real_timeout = timeout + CACHE_HERD_TIMEOUTelse:packed, real_timeout = value, timeoutreturn self._cache.set(key_func(key), packed,self._get_memcache_timeout(real_timeout))def delete(self, key):self._cache.delete(key_func(key))def get_many(self, keys):# First, map all of the keys through our key functionrvals = map(key_func, keys)packed_resp = self._cache.get_multi(rvals)resp = {}reinsert = {}for key, packed in packed_resp.iteritems():# If it was a miss, treat it as a miss to our response & continueif packed is None:resp[key] = packedcontinueval, refresh = self._unpack_value(packed)if refresh:reinsert[key] = valresp[key] = Noneelse:resp[key] = val# If there are values to re-insert for a short period of time, then do# so now.if 
reinsert:self._cache.set_multi(reinsert,self._get_memcache_timeout(CACHE_HERD_TIMEOUT))# Build a reverse map of encoded keys to the original keys, so that# the returned dict's keys are what users expect (in that they match# what the user originally entered)reverse = dict(zip(rvals, keys))return dict(((reverse[k], v) for k, v in resp.iteritems()))def close(self, **kwargs):self._cache.disconnect_all()def incr(self, key, delta=1):try:return self._cache.incr(key_func(key), delta)except NotFoundError:raise ValueError("Key '%s' not found" % (key,))def decr(self, key, delta=1):try:return self._cache.decr(key_func(key), delta)except NotFoundError:raise ValueError("Key '%s' not found" % (key,))def set_many(self, data, timeout=None, herd=True):if herd and timeout != 0:safe_data = dict(((key_func(k), self._pack_value(v, timeout))for k, v in data.iteritems()))else:safe_data = dict(((key_func(k), v) for k, v in data.iteritems()))self._cache.set_multi(safe_data, self._get_memcache_timeout(timeout))def delete_many(self, keys):self._cache.delete_multi(map(key_func, keys))def clear(self):self._cache.flush_all()
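The herd-mitigation trick in this backend is easy to lose inside the Django plumbing, so here is a standalone sketch of just that idea. All names here (`pack`, `unpack`, `herd_get`, `HERD_TIMEOUT`) are hypothetical, and a plain dict stands in for memcached: a value is stored with an embedded "soft" expiry that is earlier than the real cache expiry; when the soft expiry passes, the first reader re-inserts the stale value for a short grace period and reports a miss, so only that one reader recomputes while everyone else keeps serving the stale copy.

```python
import time

HERD_TIMEOUT = 60  # grace period (seconds) during which the stale value is served


class Marker:
    """Sentinel type marking a packed (marker, value, soft_expiry) tuple."""

MARKER = Marker()


def pack(value, timeout):
    # Store an absolute soft-expiry timestamp alongside the value.
    return (MARKER, value, int(time.time()) + timeout)


def unpack(packed):
    # Returns (value, needs_refresh). Anything that is not a marked
    # 3-tuple is passed through unchanged.
    try:
        marker, value, soft_expiry = packed
    except (TypeError, ValueError):
        return packed, False
    if not isinstance(marker, Marker):
        return packed, False
    return value, soft_expiry < int(time.time())


cache = {}  # stand-in for memcached


def herd_get(key, default=None):
    packed = cache.get(key)
    if packed is None:
        return default
    value, refresh = unpack(packed)
    if refresh:
        # Re-insert the stale value for HERD_TIMEOUT seconds and report a
        # miss, so exactly this caller goes to the database to recompute.
        cache[key] = pack(value, HERD_TIMEOUT)
        return default
    return value


cache["k"] = pack("fresh", 300)
print(herd_get("k"))            # soft expiry in the future: "fresh"

cache["k"] = pack("stale", -1)  # soft expiry already in the past
print(herd_get("k"))            # None: this caller must refresh
print(herd_get("k"))            # "stale" served during the grace period
```

The real backend does the same thing, except the real memcached expiry is pushed `CACHE_HERD_TIMEOUT` seconds past the soft expiry so the stale value is still present when the soft expiry fires.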

Thread URL: https://www.yunweiku.com/thread-316816-1-1.html