I'm trying to profile a .NET application in Visual Studio using the built-in profiler. While tracking CPU samples, I came across something weird. In one part of the application I have the following (simplified for clarity):
var requestObject = new RequestObject(parameters);
var result = GetResult(requestObject,"stringvalue");
I see that the second line accounts for about 10% of the samples. However, the method `GetResult()` itself only accounts for about 7%; the rest seems to be in [clr.dll]. I know clr.dll is responsible for garbage collection, JIT compilation, context switching and so on, and `GetResult()` is fairly complex (spanning multiple assemblies, possibly using multiple threads), so it's not implausible that some of those actions have to happen once the method returns. `RequestObject` is also a bit complex, so that might have something to do with it.
My question is: can I track down exactly what is happening here, and what can I do to make it faster? Note that 3% may not sound like much, but `GetResult()` will be called many times over the program's lifespan, even though in my test it only runs once. And it is very important that I reduce the application's response time.
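If it helps, this is roughly how I plan to separate the one-time JIT cost from the per-call cost and check allocation pressure myself (a rough sketch reusing `requestObject` and `GetResult()` from above; the iteration count is arbitrary):

using System;
using System.Diagnostics;

var sw = Stopwatch.StartNew();
GetResult(requestObject, "stringvalue");   // first call: includes JIT of everything it touches
sw.Stop();
Console.WriteLine($"first call: {sw.ElapsedMilliseconds} ms");

long bytesBefore = GC.GetTotalMemory(forceFullCollection: true);
sw.Restart();
for (int i = 0; i < 100; i++)              // arbitrary count, just to average out noise
    GetResult(requestObject, "stringvalue");
sw.Stop();
long bytesAfter = GC.GetTotalMemory(false);

Console.WriteLine($"steady state: {sw.Elapsed.TotalMilliseconds / 100.0} ms/call");
// Rough figure only: a collection during the loop makes this an underestimate.
Console.WriteLine($"allocations: ~{(bytesAfter - bytesBefore) / 100} bytes/call");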
Thanks a lot in advance for any answers!
You're not alone in trying to figure out what profiler output means; SO has many questions like that. I work on a big .NET app, and I've tried various profilers, and I know it's not what people are taught, but what actually works is manual sampling: run the app under the debugger and pause it at random moments, recording the full call stack each time. For one thing, you can take some samples during initialization and other samples during normal running; you don't have to pile the two together and try to guess what the load would be in each phase without the other.
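In practice it's nothing fancy: run the scenario in a loop under the debugger and break in at random moments with Debug > Break All, writing down the complete call stack each time. A minimal harness might look like this (a sketch; `RequestObject` and `GetResult()` are the methods from your question):

// Run under the Visual Studio debugger. Hit Debug > Break All at random
// moments and record the complete call stack of each thread of interest.
// Any line that appears on a large fraction of the stacks is responsible
// for roughly that fraction of the time, CPU-bound or not.
var requestObject = new RequestObject(parameters);
while (true)
{
    GetResult(requestObject, "stringvalue");
}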
Also, if you look only at CPU time, you will miss any speedup opportunities that involve I/O. Don't assume there aren't any, or that they're insignificant. If you do find and fix a CPU-only speedup opportunity, the part you didn't find becomes a larger fraction of the whole. You could get to a point where, unable to find anything else to fix, you assume there is nothing else, when in fact there is, and it could be large. If you take the samples yourself, you get a clear view of what's costing wall-clock time, whether it's computing or waiting.
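A quick way to check whether blocked time is in the picture at all is to compare wall-clock time against CPU time over one run of the scenario; the gap between them is time spent waiting rather than computing. A sketch, where `RunScenario()` is a hypothetical placeholder for the code under test:

using System;
using System.Diagnostics;

var process = Process.GetCurrentProcess();
var cpuBefore = process.TotalProcessorTime;
var wall = Stopwatch.StartNew();

RunScenario();          // hypothetical placeholder for the code being measured

wall.Stop();
process.Refresh();      // re-read the process counters
var cpuUsed = process.TotalProcessorTime - cpuBefore;
Console.WriteLine($"wall clock: {wall.Elapsed}, CPU: {cpuUsed}");
// If wall clock is much larger than CPU, threads were blocked on I/O,
// locks, or sleeps -- time a CPU-only profile will never show you.
// (On a heavily multithreaded run, CPU time can exceed wall clock.)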
You might want to say, "But that's not accurate!" Well, OK: suppose there's something you could fix that would save 90% of the time, but your inquiry is inaccurate and says it's taking 80%, or 95%. Does that prevent you from fixing it and getting the roughly 10-times speedup? The fact is, when your goal is to actually find problems, rather than just measure them, the bigger they are, the fewer samples it takes: something costing half the time will land under about half of your random pauses, so even ten pauses will expose it.