c# · visual-studio · unit-testing · load-testing · test-framework

Can I create a custom TestContext timer for UnitTest/LoadTest in Visual Studio?


A few of my unit tests run a Sleep inside a loop. I want to profile not only each iteration of the test but also the overall time for all iterations, in order to show any non-linear scaling. The problem is that if I profile the "Overall" timer, it includes the time spent sleeping. I can use a Stopwatch with Start/Stop so that only doAction() is measured, but I cannot write the Stopwatch results to the TestContext results.

[TestMethod]
public void TestMethod1()
{
    TestContext.BeginTimer("Overall");
    for (int i = 0; i < 5; i++)
    {
        TestContext.BeginTimer("Per");
        doAction();
        TestContext.EndTimer("Per");
        Thread.Sleep(1000);
    }
    TestContext.EndTimer("Overall");
}

It seems that TestContext can be inherited from and redefined, but I do not see any examples of how to write the results back to the transaction store.

Is there an implementation of this I can refer to, or another idea? I would like to see the results in the same report that Visual Studio presents for the LoadTest; otherwise I have to write my own reporting.

I have also tried sniffing the SQL that writes these results to the LoadTest database, but was not able to figure it out. There should be a stored procedure to call, but I suspect all of the data is written at the end of the test.


Solution

  • Well, I had a similar problem: I wanted to report some extra data/counters from my tests in the final test results, just as Visual Studio does, and I found a solution.

    First, this cannot be done the way you are trying. There is no direct link between the load test and the unit test in which the TestContext exists.

    Second, you have to understand how Visual Studio creates the reports: it collects data from the performance counters of the OS. You can edit these counters, removing those you don't want and adding others you do.
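
    For a sense of what these counters are, here is a minimal sketch that reads one of the standard Windows counters, the same mechanism the load test collects its data from:

    using System;
    using System.Diagnostics;
    using System.Threading;

    // Read the machine-wide "% Processor Time" counter
    var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
    cpu.NextValue();                        // the first call only establishes a baseline
    Thread.Sleep(1000);                     // sample over one second
    Console.WriteLine(cpu.NextValue());     // CPU usage (%) for the last second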

    How to edit the counters

    The load test configuration has two basic sections regarding the counters. These are:

    • The Counter Sets. These are sets of counters; for example, Agent, which is added by default. If you open this counter set you will see that it collects counters such as Memory, Processor, PhysicalDisk, etc. So, at the end of the test you can see all these data from all your agents. If you want to add more counters to this counter set, double-click it (from the load test editor, see the picture below) and select Add Counters. This opens a window with all the counters of your system, from which you can select the ones you want.

    • The Counter Set Mappings. Here you associate the counter sets with your machines. By default, [CONTROLLER MACHINE] and [AGENT MACHINES] are added with some default counter sets. This means that all the counters contained in the counter sets mapped to [CONTROLLER MACHINE] will be gathered from your controller machine; the same applies to all your agents.

    [Screenshot: the load test editor showing the Counter Sets and Counter Set Mappings]

    You can add more counter sets and more machines. By right-clicking Counter Set Mappings --> Manage Counter Sets..., a new window opens, as below:

    [Screenshot: the Manage Counter Sets window]

    As you can see, I have added an extra machine with the name db_1. This is the computer name of the machine, and it must be in the same domain as the controller so the controller can access it and collect counters. I have also tagged it as a database server and selected the SQL counter set (the default set for SQL counters, but you can edit it and add any counter you want). Now, every time this load test is executed, the controller will go to the machine with computer name db_1 and collect data, which will be reported in the final test results.


    Now the coding part

    Ok, after this (big) introduction it's time to see how to add your own data to the final test results. To do this, you must create your own custom performance counters. This means a new performance counter category must be created on every machine from which you need to collect this data; in your case, on all of your agents, because that is where the unit tests are executed.

    After you have created the counters on the agents, you can edit the Agents counter set as shown above and select your extra custom counters.

    Here is sample code showing how to do this.

    First, create the performance counters on all your agents. Run this code only once on every agent machine (or add it in a load test plugin; see the sketch after the code):

    using System.Diagnostics;

    void CreateCounter()
    {
        if (PerformanceCounterCategory.Exists("MyCounters"))
        {
            PerformanceCounterCategory.Delete("MyCounters");
        }

        // Create the counter collection and add your custom counters
        CounterCreationDataCollection counters = new CounterCreationDataCollection();
        // The name of the counter is Delay
        counters.Add(new CounterCreationData("Delay", "Keeps the actual delay", PerformanceCounterType.AverageCount64));
        // An average counter must be immediately followed by its base counter,
        // otherwise PerformanceCounterCategory.Create throws
        counters.Add(new CounterCreationData("DelayBase", "Base counter for Delay", PerformanceCounterType.AverageBase));
        // ... add the rest of your counters

        // Create the custom counter category
        PerformanceCounterCategory.Create("MyCounters", "Custom Performance Counters", PerformanceCounterCategoryType.MultiInstance, counters);
    }
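
    If you prefer the load test plugin route mentioned above, a minimal sketch could look like this (the class name is hypothetical, and it assumes the TestStarting event fires on the agent that executes the test, so the category gets created on the machines that actually need it; creating a category also requires administrative rights):

    using System.Diagnostics;
    using Microsoft.VisualStudio.TestTools.LoadTesting;

    public class CounterSetupPlugin : ILoadTestPlugin
    {
        public void Initialize(LoadTest loadTest)
        {
            // Create the category once per agent; the Exists check keeps
            // the repeated event calls cheap
            loadTest.TestStarting += (sender, e) =>
            {
                if (!PerformanceCounterCategory.Exists("MyCounters"))
                {
                    CreateCounter();
                }
            };
        }

        void CreateCounter()
        {
            // ... the CreateCounter body shown above
        }
    }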
    

    And here is the code of your test:

    using System.Diagnostics;
    using System.Threading;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class UnitTest1
    {
        // Static because [ClassInitialize] and [ClassCleanup] methods must be static
        static PerformanceCounter OverallDelay;
        static PerformanceCounter OverallDelayBase;
        static PerformanceCounter PerDelay;
        static PerformanceCounter PerDelayBase;

        [ClassInitialize]
        public static void ClassInitialize(TestContext testContext)
        {
            // Create the counter instances for the current test class.
            // Initialize them here so they are created only once per class;
            // false means the counters are writable (not read-only).
            OverallDelay = new PerformanceCounter("MyCounters", "Delay", "Overall", false);
            OverallDelayBase = new PerformanceCounter("MyCounters", "DelayBase", "Overall", false);
            PerDelay = new PerformanceCounter("MyCounters", "Delay", "Per", false);
            PerDelayBase = new PerformanceCounter("MyCounters", "DelayBase", "Per", false);
            // ... create the rest of the counter instances
        }

        [ClassCleanup]
        public static void CleanUp()
        {
            // Reset the counters and remove the counter instances
            foreach (var counter in new[] { OverallDelay, OverallDelayBase, PerDelay, PerDelayBase })
            {
                counter.RawValue = 0;
                counter.RemoveInstance();
                counter.Dispose();
            }
        }

        [TestMethod]
        public void TestMethod1()
        {
            // Use a Stopwatch to keep track of the delay
            Stopwatch overall = new Stopwatch();
            Stopwatch per = new Stopwatch();

            overall.Start();

            for (int i = 0; i < 5; i++)
            {
                per.Start();
                doAction();
                per.Stop();

                // Update the "Per" instance of the "Delay" counter for each doAction;
                // the base counter records one more measurement for the average
                PerDelay.IncrementBy(per.ElapsedMilliseconds);
                PerDelayBase.Increment();

                Thread.Sleep(1000);

                per.Reset();
            }

            overall.Stop();

            // Update the "Overall" instance of the "Delay" counter on every test run
            OverallDelay.IncrementBy(overall.ElapsedMilliseconds);
            OverallDelayBase.Increment();
        }
    }
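
    A note on the counter type: an AverageCount64 counter reports the ratio of its accumulated value to its AverageBase counter. That is why the test increments "Delay" by the elapsed milliseconds and "DelayBase" by one per measurement; the report then shows the average delay per measurement, and the base counter must be added immediately after its average counter when the category is created.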
    

    Now, when your tests are executed, they will report their data to the counters. At the end of the load test you will be able to see the counter from every agent machine and add it to the graphs. It will be reported with MIN, MAX and AVG values.

    Conclusion

    1. I think (after months of research) that this is the only way to add custom data from your tests to the final load test report.
    2. It may seem hard to do. Once you understand the idea, though, it is not difficult to streamline: I have wrapped this functionality in a class to make it easier to initialize, update and manage the counters.
    3. It is very useful. I can now see statistics from my tests that would not be possible with the default counters. For example, when a request to a web service fails, I can catch the error and update the appropriate counter (e.g. Timeout, ServiceUnavailable, RequestRejected...), as sketched below.
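
    A rough sketch of that idea (CallWebService and the counter fields are hypothetical, assuming "Timeout" and "ServiceUnavailable" counters were created in "MyCounters" just like "Delay"):

    using System.Net;

    try
    {
        CallWebService(); // the hypothetical call under test
    }
    catch (WebException ex) when (ex.Status == WebExceptionStatus.Timeout)
    {
        TimeoutCounter.Increment();   // PerformanceCounter instance for "Timeout"
        throw;                        // still fail the test
    }
    catch (WebException ex) when (ex.Response is HttpWebResponse r && r.StatusCode == HttpStatusCode.ServiceUnavailable)
    {
        ServiceUnavailableCounter.Increment();
        throw;
    }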

    I hope I helped. :)