
How to run conditional TestNG Tests with data from Excel


I have a class with multiple @Test methods, each with a different functionality.

class MyTests {

    @Test
    public void test1() {
        ...............
    }

    @Test
    public void test2() {
        ...............
    }
}

I am following a data-driven approach, with the data kept in Excel. Each row in the Excel sheet corresponds to a different test case that should execute one of the test methods in the class above.

Below is sample data from the Excel sheet. It contains the method name as one of the parameters (e.g. test1, test2) and also a flag which determines whether the case should be picked up for execution at runtime (e.g. y, n):

case1 data1 data2 data3........test1 y
case2 data4 data5 data6........test1 y
case3 data7 data8 data9........test2 y
case4 data10 data11 data12........test1 n  
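Modeled in plain Java, one such row could be parsed like this (a sketch only; the column order — case id first, then data cells, then method name and flag — is assumed from the sample above):

```java
import java.util.Arrays;
import java.util.List;

public class TestRow {

    // Hypothetical model of one Excel row: case id, data cells, target method, run flag.
    public final String caseId;
    public final List<String> data;
    public final String targetMethod;
    public final boolean enabled;

    TestRow(String caseId, List<String> data, String targetMethod, boolean enabled) {
        this.caseId = caseId;
        this.data = data;
        this.targetMethod = targetMethod;
        this.enabled = enabled;
    }

    // Parses one whitespace-separated row such as "case1 data1 data2 data3 test1 y".
    public static TestRow parse(String line) {
        String[] cells = line.trim().split("\\s+");
        String flag = cells[cells.length - 1];       // last cell: the y/n flag
        String method = cells[cells.length - 2];     // second-to-last cell: the method name
        List<String> data = Arrays.asList(cells).subList(1, cells.length - 2);
        return new TestRow(cells[0], data, method, "y".equalsIgnoreCase(flag));
    }

    public static void main(String[] args) {
        TestRow row = TestRow.parse("case4 data10 data11 data12 test1 n");
        System.out.println(row.targetMethod + " enabled=" + row.enabled);
    }
}
```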

Below are the questions I have:

  1. How do I map the cases to the corresponding test methods?
  2. How do I run only specific cases, based on the flag?

My understanding is that, using the @DataProvider annotation, a test method can run with different input data. But I am not sure how to map the test methods to their corresponding test data when there are multiple test methods in a single class.

I also tried looking at IAnnotationTransformer, which can be used to alter the runtime behaviour of a test method, but I could not find a way to pass the flag data from Excel to the transformer class.

Thanks in advance.


Solution

  • There are bits and pieces of how this can be done, but it's going to require a lot of customization on your side. There's nothing in TestNG that gives you this sort of capability out of the box.

    IAnnotationTransformer will not help you because this listener only considers methods and is not aware of test-class instances. It also runs before anything else does, and there's no way to have it iterate. So that can be ruled out.

    Your best bet would be to combine a @Factory with a @DataProvider, where the @Factory produces ONLY one instance of your test class, and all your test methods must be part of that same test class.

    Here's the idea:

    • Create an additional column in your data source that names the test class which should consume each row.
    • Build a data provider that reads the data source and keeps only the rows whose fully qualified class name matches the data provider's enclosing class.
    • The data provider should also build a map whose key is the test method name and whose value is the list of data rows to be used for the different iterations.
    • Once it has all the filtered rows, it emits just one row of test data, so that TestNG creates only one instance of the test class.
    • Have your test class implement the IHookable interface, so that you can decide which test methods run and which are skipped.
    • If a method passes that check, have it loop through all of its test data (so one test method always runs in a loop over all of its rows, unlike a regular data provider setup where TestNG drives the iterations for you) and use soft asserts for the validations. This way the test passes or fails only after all the iterations are done.
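    The filtering-and-grouping part of the data provider (the second, third and fourth bullets) can be sketched in plain Java. This is a sketch only: the column layout (case id, data cells, method name, flag) is assumed from the question's sample, and the Excel reading and TestNG wiring are omitted:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DataProviderCore {

    // Groups the rows flagged 'y' by their target test-method name.
    // A @DataProvider would wrap the resulting map in a single Object[][]
    // row, so that the @Factory builds exactly one test-class instance.
    static Map<String, List<List<String>>> groupEnabledRows(List<String[]> rows) {
        Map<String, List<List<String>>> byMethod = new LinkedHashMap<>();
        for (String[] cells : rows) {
            String flag = cells[cells.length - 1];
            if (!"y".equalsIgnoreCase(flag)) {
                continue;                               // row flagged 'n': skip it
            }
            String method = cells[cells.length - 2];    // e.g. "test1"
            List<String> data = new ArrayList<>(
                    Arrays.asList(cells).subList(1, cells.length - 2));
            byMethod.computeIfAbsent(method, k -> new ArrayList<>()).add(data);
        }
        return byMethod;
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(
                new String[]{"case1", "data1", "data2", "data3", "test1", "y"},
                new String[]{"case2", "data4", "data5", "data6", "test1", "y"},
                new String[]{"case3", "data7", "data8", "data9", "test2", "y"},
                new String[]{"case4", "data10", "data11", "data12", "test1", "n"});
        System.out.println(DataProviderCore.groupEnabledRows(rows));
    }
}
```

    With the question's sample data, this yields two map keys (test1 with two data rows, test2 with one) and drops case4 because of its 'n' flag.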

    Here's a sample that shows some of this in action.

    import org.testng.IHookCallBack;
    import org.testng.IHookable;
    import org.testng.ITestResult;
    import org.testng.annotations.DataProvider;
    import org.testng.annotations.Factory;
    import org.testng.annotations.Test;
    
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    
    public class TestclassSample implements IHookable {
    
        private List<String> methodsToRun = new ArrayList<>();
    
        @Factory(dataProvider = "dp")
        public TestclassSample(List<String> methodsToRun) {
            this.methodsToRun = methodsToRun;
        }
    
        @Override
        public void run(IHookCallBack callBack, ITestResult testResult) {
            String testMethodName = testResult.getMethod().getMethodName();
            if (methodsToRun.contains(testMethodName)) {
                System.err.println("About to run " + testResult.getMethod().getMethodName());
                callBack.runTestMethod(testResult);
            } else {
                testResult.setStatus(ITestResult.SKIP);
            }
        }
    
        @Test
        public void testMethod() {
            System.err.println("testMethod()");
        }
    
        @Test
        public void anotherTestMethod() {
            System.err.println("anotherTestMethod()");
        }
    
        @Test
        public void thirdTestMethod() {
            System.err.println("thirdTestMethod()");
        }
    
        @DataProvider(name = "dp")
        public static Object[][] getData() {
            return new Object[][]{
                    {Arrays.asList("testMethod", "thirdTestMethod")}
            };
        }
    }
    

    Here's the output:

    About to run testMethod
    testMethod()
    About to run thirdTestMethod
    thirdTestMethod()
    
    Test ignored.
    
    ===============================================
    Default Suite
    Total tests run: 3, Failures: 0, Skips: 1
    ===============================================
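    The last bullet — one test method looping over all of its data rows and deferring failures to the end — can be sketched without TestNG as well; the failure list below plays the role that SoftAssert.assertAll() would play in the real test (the per-row check is a placeholder):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SoftLoopSketch {

    // Minimal stand-in for TestNG's soft asserts: collect failures across
    // all iterations, then fail once at the end if anything went wrong.
    static void runForAllRows(List<List<String>> rowsForThisMethod) {
        List<String> failures = new ArrayList<>();
        for (List<String> row : rowsForThisMethod) {
            // ... exercise the feature under test with this row's data ...
            boolean passed = !row.isEmpty();            // placeholder check
            if (!passed) {
                failures.add("failed for row " + row);
            }
        }
        if (!failures.isEmpty()) {                      // SoftAssert.assertAll() equivalent
            throw new AssertionError(String.join("; ", failures));
        }
    }

    public static void main(String[] args) {
        runForAllRows(Arrays.asList(
                Arrays.asList("data1", "data2", "data3"),
                Arrays.asList("data4", "data5", "data6")));
        System.out.println("all iterations passed");
    }
}
```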