I am currently using Windows Azure Cache.
To enable the local cache I set the following:
<localCache isEnabled="true" sync="TimeoutBased" objectCount="100000" ttlValue="10" />
According to MSDN: "When local cache is enabled, the cache client stores a reference to the object locally. This keeps the object active in the memory of the client application."
My project runs on 2 web role instances. Does this mean that once I set this line, the client uses the web role's RAM as a local cache and only goes to the Windows Azure cache when it does not find the object there? I do not want the web role's RAM being bogged down by this, since I do not see a way to specify a size limit for localCache. Any suggestions would be greatly appreciated.
My complete cache configuration looks like this:
<dataCacheClients>
  <dataCacheClient name="default">
    <autoDiscover isEnabled="true" identifier="windowsAzure.mycacheurl.com" />
    <localCache isEnabled="true" sync="TimeoutBased" objectCount="100000" ttlValue="10" />
    <securityProperties mode="Message" sslEnabled="false">
      <messageSecurity authorizationInfo="xxxjdkj" />
    </securityProperties>
  </dataCacheClient>
</dataCacheClients>
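From what I can tell, the same local cache settings can also be applied in code through DataCacheFactoryConfiguration.LocalCacheProperties, and the only limits exposed there are an object count and a timeout, not a size in bytes. A minimal sketch of what I mean, assuming the Microsoft.ApplicationServer.Caching client and the DataCacheFactoryConfiguration overload that takes the dataCacheClient section name:

using System;
using Microsoft.ApplicationServer.Caching;

public static class CacheClientSetup
{
    public static DataCache CreateCache()
    {
        // Load the "default" dataCacheClient section shown above, then set the
        // local cache settings programmatically.
        var config = new DataCacheFactoryConfiguration("default");

        // Equivalent of objectCount="100000" sync="TimeoutBased" ttlValue="10":
        // the local cache is bounded by a number of objects and a timeout only.
        config.LocalCacheProperties = new DataCacheLocalCacheProperties(
            100000,                                              // max objects held in the role's RAM
            TimeSpan.FromSeconds(10),                            // ttlValue, after which local copies expire
            DataCacheLocalCacheInvalidationPolicy.TimeoutBased);

        var factory = new DataCacheFactory(config);
        return factory.GetDefaultCache();
    }
}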
You are correct. If you have local cache enabled, the web role will use its own RAM to keep a "copy" of the cached object. Another thing to note: because it is a copy, it can become stale and fall out of sync with the main distributed cache.
Right now the only reason to use local cache is reduced latency, and that only makes a meaningful impact if you are handling hundreds of transactions per second. You can test the performance yourself and you will see a negligible difference.
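To your second question: the fallback is transparent to your code. The client checks its local cache first and only goes to the distributed cache on a miss or after the local copy times out. A minimal sketch of how that plays out, assuming the Microsoft.ApplicationServer.Caching client and your "default" configuration above:

using System;
using Microsoft.ApplicationServer.Caching;

class LocalCacheDemo
{
    static void Main()
    {
        // The parameterless factory picks up the dataCacheClient configuration above.
        DataCacheFactory factory = new DataCacheFactory();
        DataCache cache = factory.GetDefaultCache();

        cache.Put("greeting", "hello");              // written to the distributed cache

        // First Get: network round trip to the cache service; the object is also
        // copied into this web role instance's local cache (its own RAM).
        string first = (string)cache.Get("greeting");

        // Gets within the next 10 seconds (your ttlValue) are served from local RAM
        // with no network hop. If another instance updates the key in the meantime,
        // this copy stays stale until the timeout expires and the next Get goes back
        // to the distributed cache.
        string second = (string)cache.Get("greeting");

        Console.WriteLine(first == second);
    }
}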
I have deployed several large systems to Azure...two points: