I have seen many caching examples that include locking. I am trying to cache a common list for the entire application. Do I really need to use a lock like the one below? As far as I can tell, the worst thing that can happen without locking is this: if multiple threads detect a cache miss at about the same time, they might all load the data simultaneously, but in the end the cached data will be the same...

Is that correct?
public class Worker
{
    // The lock object must be initialized; locking on a null
    // reference throws an ArgumentNullException.
    private static readonly object someLock = new object();

    public static object CacheMethod()
    {
        var results = HttpContext.Current.Cache["Common"];
        if (results == null)
        {
            lock (someLock)
            {
                // Check again: another thread may have filled the
                // cache while we were waiting for the lock.
                results = HttpContext.Current.Cache["Common"];
                if (results == null)
                {
                    results = GetResultsFromSomewhere();
                    HttpContext.Current.Cache.Insert("Common", results, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
                }
            }
        }
        return results;
    }
}
This lock is used to prevent a cache stampede. If the cache is empty and takes a while to fill, then every incoming request will start filling it at the same time. Depending on how the numbers work out, that can be a catastrophic loss of performance.
If GetResultsFromSomewhere were as slow as Thread.Sleep(10000), this could easily happen.
The drawback of that particular code is that it uses a single lock for all cache keys, so filling one key blocks threads that are loading unrelated keys. Usually you want one lock per key.
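A per-key lock can be sketched with a ConcurrentDictionary that maps each cache key to its own lock object. This is only an illustration, not the original poster's code: the key and loader parameters are assumptions added so the method works for any cache entry.

```csharp
using System;
using System.Collections.Concurrent;
using System.Web;
using System.Web.Caching;

public static class PerKeyCache
{
    // One lock object per cache key (hypothetical helper, for illustration).
    private static readonly ConcurrentDictionary<string, object> KeyLocks =
        new ConcurrentDictionary<string, object>();

    public static object GetOrLoad(string key, Func<object> loader)
    {
        var results = HttpContext.Current.Cache[key];
        if (results == null)
        {
            // GetOrAdd returns the same lock object for the same key,
            // so only threads loading this key contend with each other.
            var keyLock = KeyLocks.GetOrAdd(key, _ => new object());
            lock (keyLock)
            {
                // Double-check after acquiring the lock.
                results = HttpContext.Current.Cache[key];
                if (results == null)
                {
                    results = loader();
                    HttpContext.Current.Cache.Insert(key, results, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
                }
            }
        }
        return results;
    }
}
```

With this shape, a slow load of "Common" no longer blocks a thread that only needs, say, "Users". Note that the lock dictionary grows with the number of distinct keys; for a small, fixed set of keys that is usually fine.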