My application needs to perform a number of tasks per tenant on a minute-to-minute basis. These are fire-and-forget operations, so I don't want to use Parallel.ForEach to handle this.
Instead I'm looping through the list of tenants and firing off a ThreadPool.QueueUserWorkItem to process each tenant's tasks:
foreach (Tenant tenant in tenants)
{
    ThreadPool.QueueUserWorkItem(new WaitCallback(ProcessTenant), tenant);
}
This code works perfectly in production, and can generally process over 100 tenants in under 5 seconds.
However, on application startup this causes 100% CPU utilization while things like EF are still warming up. To limit this I've implemented a semaphore as follows:
private static Semaphore _threadLimiter = new Semaphore(4, 4);
The idea is to limit this task processing to use only half of the machine's logical processors. Inside the ProcessTenant method I call:
// Acquire before the try so the finally only releases after a successful WaitOne
_threadLimiter.WaitOne();
try
{
    // Perform all minute-to-minute tasks
}
finally
{
    _threadLimiter.Release();
}
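The counts above are hard-coded for one particular box; a minimal sketch of deriving the same limit from Environment.ProcessorCount instead, assuming the same _threadLimiter field plus a hypothetical _maxConcurrency helper and keeping the half-the-processors goal:

// Half the logical processors, but never less than 1 (Semaphore requires a positive count)
private static readonly int _maxConcurrency = Math.Max(1, Environment.ProcessorCount / 2);
private static readonly Semaphore _threadLimiter = new Semaphore(_maxConcurrency, _maxConcurrency);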
In testing, this semaphore approach appears to work exactly as expected: CPU utilization on startup stays at around 50%, and it does not appear to slow down initial startup.
So my question is mainly about what is actually happening when WaitOne is called. Does it release the thread to work on other tasks, similar to asynchronous calls? The MSDN documentation states that WaitOne "Blocks the current thread until the current WaitHandle receives a signal."
So I'm wary that my web app won't actually be able to use the blocked thread while it's waiting, which would make the whole point of this exercise meaningless.
WaitOne does block the thread, and that thread will stop being scheduled on a CPU core until the semaphore is signaled. However, you're holding a large number of thread-pool threads for possibly a long time ("long" as in "longer than ~500 ms"). This can be an issue because the thread pool grows very slowly, so you may be preventing other parts of your application from properly using it.
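As a rough illustration of why that matters, you can read the pool's configured limits at runtime; once more worker threads than the minimum are blocked at the same time, the pool only injects roughly one extra thread every ~500 ms (a sketch using the ThreadPool.GetMinThreads/GetMaxThreads APIs; the actual numbers vary by machine and runtime):

int minWorkers, minIo, maxWorkers, maxIo;
ThreadPool.GetMinThreads(out minWorkers, out minIo);
ThreadPool.GetMaxThreads(out maxWorkers, out maxIo);

// Every work item parked on WaitOne occupies one of these worker threads,
// so other parts of the application queue up behind your waits.
Console.WriteLine("Worker threads: min={0}, max={1}", minWorkers, maxWorkers);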
If you plan on waiting for a significant amount of time, you could use your own threads instead:
foreach (Tenant tenant in tenants)
{
    new Thread(ProcessTenant).Start(tenant);
}
However, you're still keeping one thread per item in memory. While they won't eat CPU as they're sleeping on the semaphore, they're still using RAM for nothing (about 1MB per thread). Instead, have a single dedicated thread wait on the semaphore and enqueue new items as needed:
// Run this on a dedicated thread
foreach (Tenant tenant in tenants)
{
    _threadLimiter.WaitOne();

    Tenant current = tenant; // copy for the closure (needed on pre-C# 5 compilers)
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            ProcessTenant(current);
        }
        finally
        {
            _threadLimiter.Release();
        }
    });
}
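To make the "dedicated thread" part concrete, one way to kick this off is a single long-lived background thread running the loop above; EnqueueTenantWork here is a hypothetical method that wraps that loop:

// Hypothetical wrapper around the producer loop shown above.
var producer = new Thread(() => EnqueueTenantWork(tenants))
{
    IsBackground = true,        // don't keep the process alive just for this
    Name = "TenantWorkProducer" // easier to spot in a debugger
};
producer.Start();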