
This is only tangentially related to:

> you avoid computing/fetching the same value twice

But this problem comes up in reverse proxies too. In Nginx, you can set the `proxy_cache_lock`[0] directive to achieve the same effect (avoiding double requests).

[0]: https://nginx.org/en/docs/http/ngx_http_proxy_module.html#pr...
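
For anyone who hasn't used it, a minimal sketch of that setup (the cache path, zone name, and upstream name are placeholders, not anything specific):

    proxy_cache_path /var/cache/nginx keys_zone=api_cache:10m;

    server {
      location / {
        proxy_cache              api_cache;
        proxy_cache_lock         on;   # only one request populates a missing cache entry
        proxy_cache_lock_timeout 5s;   # the others wait up to 5s, then fall through to the upstream
        proxy_pass               http://upstream_api;
      }
    }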



That right there is Cache Stampede[0] prevention.

[0]: https://en.wikipedia.org/wiki/Cache_stampede
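
The same idea shows up in application code as request coalescing (sometimes called "single flight"): concurrent lookups for a missing key share one in-flight fetch instead of each hitting the origin. A rough TypeScript sketch, with names made up for illustration:

    // Callers asking for the same key share one pending promise; the entry
    // is removed once the underlying fetch settles.
    const inFlight = new Map<string, Promise<string>>();

    async function getOnce(key: string, load: () => Promise<string>): Promise<string> {
      const pending = inFlight.get(key);
      if (pending) return pending;    // a fetch for this key is already running

      const p = load().finally(() => inFlight.delete(key));
      inFlight.set(key, p);
      return p;
    }

    // e.g. 100 concurrent getOnce("user:42", fetchUser) calls trigger a single fetchUser()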


Interesting! This is an issue I ran into with an external API whose responses I intended to cache on my serverless workers infra.

I hit the API's rate limiter when the workers invalidated their cache, even though I staggered the lifetime of the cache keys for each replicated instance. This is how I found out how many Cloudflare workers run on a single edge instance. Hint: It's many.
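
The staggering trick is usually just random jitter added to each key's TTL, something like this hypothetical helper, so the entries don't all expire in the same instant:

    // Spread expirations over a window instead of a single moment.
    function jitteredTtl(baseSeconds: number, spreadSeconds: number): number {
      return baseSeconds + Math.floor(Math.random() * spreadSeconds);
    }

    // e.g. a 300s base TTL lands somewhere between 300s and 359s per instance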

Didn't know it had a name. I'm delighted, thanks!


You’re welcome, happy to help. If you are in the .NET space, I suggest you take a look at FusionCache. It has cache stampede protection built in, plus some other nice features like a fail-safe mechanism and soft/hard timeouts: https://github.com/jodydonetti/ZiggyCreatures.FusionCache



