Hugo Wetterberg
Ok, this is something that has been bothering me for a while.
When I use the cache I usually check if a cache request returns null.
If that is the case I merrily proceed to loading the resource from
disk, or whatever place it resides in.
The problem is that when you start thinking about the effects of
multithreading, the following scenario is possible:
Suppose that a chunk of requests arrives more or less simultaneously.
The probability that several of these threads find the cache empty
before any of them has set the cache entry is pretty high. Each
thread then loads the file from its source and adds it to the
cache.
This causes a performance hit and also makes the threads work on
different object instances, which becomes a problem if the data is
changed.
So, my question is: is it overly paranoid to use mutexes to prevent
this kind of situation?
Code example:
protected void LoadCategories()
{
    categories = (CategoryCollection)Cache["cats"];
    if (categories == null)
    {
        // Block so that we don't load the
        // resource into the cache several times
        catMutex.WaitOne();
        try
        {
            // Check if the categories were loaded while we waited for the mutex
            categories = (CategoryCollection)Cache["cats"];
            if (categories != null)
                return;

            // Get serializer and file info
            XmlSerializer serializer = new XmlSerializer(typeof(CategoryCollection));
            FileInfo file = new FileInfo(Settings.Root + "/" +
                Settings.DataDirectory + "/categories.xml");

            // Deserialize
            FileStream fs = file.OpenRead();
            try
            {
                categories = (CategoryCollection)serializer.Deserialize(fs);
                Cache.Insert("cats", categories,
                    new CacheDependency(file.FullName),
                    DateTime.Now.AddMinutes(10),
                    Cache.NoSlidingExpiration);
            }
            finally
            {
                fs.Close();
            }
        }
        finally
        {
            // Release in a finally block so the mutex is freed even on the
            // early return above or if deserialization throws; otherwise a
            // single failure would leave every later request blocked forever.
            catMutex.ReleaseMutex();
        }
    }
}
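For what it's worth, since the cache here is only shared within one process, a plain `lock` on a private static object gives the same double-checked behaviour more cheaply than a `Mutex` (which is a kernel object intended for cross-process synchronization), and it releases automatically even if the body throws. A minimal sketch of the pattern, with the XML loading replaced by a hypothetical `LoadFromDisk()` placeholder:

```csharp
using System;
using System.Collections;

// Sketch: double-checked locking with C#'s lock statement instead of a
// Mutex. lock compiles to Monitor.Enter/Exit and is exception-safe.
public class CategoryCache
{
    private static readonly object syncRoot = new object();
    private static readonly Hashtable cache = new Hashtable();

    public static object GetCategories()
    {
        object categories = cache["cats"];
        if (categories == null)          // first check, no lock taken
        {
            lock (syncRoot)              // only the threads that raced pay for this
            {
                // Second check, under the lock: another thread may have
                // filled the cache while we were waiting.
                categories = cache["cats"];
                if (categories == null)
                {
                    categories = LoadFromDisk();
                    cache["cats"] = categories;
                }
            }
        }
        return categories;
    }

    private static object LoadFromDisk()
    {
        // Placeholder for the XmlSerializer work in the example above.
        return new ArrayList();
    }
}
```

The structure is the same as the mutex version: cheap unsynchronized check first, then re-check under the lock, so only the threads that actually lost the race ever block.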