
Running JUnit tests inside a running OSGi application with IDE integration


I pray that you are well.

I am trying to find a more efficient way to run some tests for an OSGi application. I apologise if this question has already been asked, but I couldn't find exactly this question on SO or anywhere on the Web after an hour or so of searching.

Currently, I am using JUnit and Eclipse PDE's built-in JUnit Plug-in Test launcher. It works reasonably well - it fires up the full OSGi application with all the specified bundles (including the test bundle) and runs the JUnit tests within the OSGi framework. The results are reported in the IDE just as with the standard JUnit Test launcher, which is nice because not only can you see the failure message without having to trawl through logs, but you can also double-click on any test failure and it will take you straight to the line of code containing the assertion that failed. This makes for very efficient test-driven development.

The one sub-optimal thing about this approach is the overhead of starting and stopping the application - several seconds. This is especially painful if you are only working on one particular test method and the test itself takes less than a second to run.

I think an ideal solution to this would be a JUnit launcher within Eclipse that could connect to & execute tests within an OSGi application that is already running. It would be nice if the launcher could also automatically re-deploy the test bundle to the running application before running the test. Ideally, the procedure would be as follows:

  1. Start the application test instance if it has not already been started.
  2. If the test bundle has been changed/recompiled, re-install the updated test bundle into the running application.
  3. Run the tests within the application.
  4. Collect the test results and report them in the JUnit window (as per the JUnit Test and JUnit Plug-in Test launcher).

Ideally, at least steps 2-4 would be automated by the test launching framework. It would be icing on the cake if it could also take care of step 1 for you.
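The "run and collect" part of steps 3 and 4 can be sketched in plain Java. This is an illustrative sketch only - plain reflection stands in for a real JUnit runner, and every name in it is hypothetical:

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of steps 3-4: run the test methods of a class that is
// already loaded inside the running application, and collect the results so
// a launcher could report them back to the IDE.
public class InAppTestRunner {

    public static class Result {
        public final String method;
        public final Throwable failure; // null if the test passed
        Result(String method, Throwable failure) {
            this.method = method;
            this.failure = failure;
        }
    }

    // Runs every public no-arg method whose name starts with "test".
    public static List<Result> run(Class<?> testClass) throws Exception {
        List<Result> results = new ArrayList<>();
        Object instance = testClass.getDeclaredConstructor().newInstance();
        for (Method m : testClass.getMethods()) {
            if (m.getName().startsWith("test") && m.getParameterCount() == 0) {
                try {
                    m.invoke(instance);
                    results.add(new Result(m.getName(), null));
                } catch (InvocationTargetException e) {
                    // The test threw; record the underlying failure.
                    results.add(new Result(m.getName(), e.getCause()));
                }
            }
        }
        return results;
    }

    // Hypothetical example test class used to exercise the runner.
    public static class ExampleTests {
        public void testPasses() { /* succeeds */ }
        public void testFails() { throw new IllegalStateException("boom"); }
    }
}
```

In the real thing the launcher would also feed each `Result` back over a connection to the IDE, but the point is only that the in-application half of the workflow is small.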

I haven't managed to find any options that can meet all of these criteria. I have considered:

  • Manually deploying & executing the test plugin in a running application. This removes the overhead of waiting for the app to start, but it is a highly manual process and you also lose the IDE reporting of the test results that the standard JUnit launcher has.
  • JUnit Plug-in Test launcher - this does a good job of reporting the results in the IDE. However, it does not seem to be possible to configure it to connect to a running application instance - it wants to start a fresh application each time you run it. I would be extremely happy to find out that I was wrong about this and that JUnit Plug-in Test launcher can actually do what I am asking.
  • There are a few incarnations of JUnit remote runners (e.g. https://github.com/ruediste/remote-junit, https://github.com/datastax/remote-junit-runner, https://github.com/Tradeshift/junit-remote) that use remote class loading. These look like a promising option, but I'm not sure how nicely remote class loading and OSGi environments will play together.
  • I could possibly modify/extend the JUnit Plug-in Test launcher to add the capability of connecting to an existing application server.
  • Apache Sling appears to have this capability (https://sling.apache.org/documentation/bundles/org-apache-sling-junit-bundles.html); however, its implementation seems to be pretty tightly coupled to the whole Sling architecture. As I am working with a pre-existing application that already has its own container/webservice architecture, I'm not sure I could modify it to use Sling, and I'm not sure if Sling plays nicely with other web/servlet frameworks.
  • Stubbing/mocking/faking to avoid the need for the full-blown application is an option that I looked into extensively, and would have been my preferred option. However, the application in question makes extensive use of a custom RDBMS persistence architecture intermixed with the occasional raw SQL call. Mocks would not have been accurate enough to produce useful tests, and although I could conceivably have come up with a fake that was accurate enough it would have been as complicated as the application's persistence engine itself (and less accurate by definition), so it seemed to make more sense to simply use the inbuilt persistence architecture.
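To make the remote-runner option above more concrete, the core of that idea is a small endpoint inside the already-running application that a launcher can connect to, name a single test method, and get a result back. The following is a hypothetical sketch only - all names are invented, and the hard parts (OSGi classloading, re-installing the test bundle, real JUnit integration) are deliberately left out:

```java
import java.io.BufferedReader;
import java.io.Closeable;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical in-application endpoint: the IDE-side launcher connects,
// sends one line of the form "some.TestClass#testMethod", and reads back
// a single "PASS" or "FAIL <message>" line.
public class TestEndpoint implements Runnable, Closeable {

    private final ServerSocket server;

    public TestEndpoint() throws IOException {
        this.server = new ServerSocket(0); // any free port
    }

    public int port() { return server.getLocalPort(); }

    @Override
    public void run() {
        try (Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()));
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            String[] request = in.readLine().split("#");
            // In a real OSGi setting this would resolve the class through
            // the test bundle's classloader, not the system one.
            Class<?> testClass = Class.forName(request[0]);
            Method test = testClass.getMethod(request[1]);
            try {
                test.invoke(testClass.getDeclaredConstructor().newInstance());
                out.println("PASS");
            } catch (InvocationTargetException e) {
                out.println("FAIL " + e.getCause().getMessage());
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public void close() throws IOException { server.close(); }

    // Hypothetical test class used to exercise the endpoint.
    public static class SampleTests {
        public void testOk() { /* passes */ }
        public void testBad() { throw new IllegalStateException("boom"); }
    }
}
```

A single-method protocol like this would fit the "re-run one test method quickly" workflow; the open question remains how much of this the existing remote-runner projects already handle in an OSGi-friendly way.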

So I have a couple of options that I have already ruled out, and of the remainder I have a couple that I can probably get to work with enough time and effort. However, before committing that time and effort I thought I would pose the question here, in the hope of avoiding dead-ends or re-inventing the wheel.

Any help/suggestions/info appreciated.


Solution

  • As a bit of an update:

    The ideas behind this post were the catalyst for me looking at the Bnd/Bndtools combination (https://bnd.bndtools.org/). Bndtools doesn't support this out of the box yet, but there are a couple of open issues in its GitHub repository and there is a fork with a working prototype. It looks like this feature might make it into the next release (4.3), which will hopefully be soon.

    Update 1: this feature didn't make it into the 4.3 release of Bndtools, but it is now part of the 4.4 development snapshot as the bundle biz.aQute.tester.junit-platform. Documentation is available here: https://bnd.bndtools.org/chapters/310-testing.html

    Update 2: the 4.4 development snapshot of Bndtools became 5.0, and was released in January 2020.
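For reference, wiring up the tester mentioned above involves roughly the following bnd configuration. This is a sketch based on the bnd documentation linked above - the instruction names and macro details should be verified against the Bndtools version in use, and `my.test.bundle` is a placeholder:

```properties
# In the test bundle's bnd.bnd - have bnd generate the Test-Cases header.
# The ${classes} macro scans for concrete classes annotated with @Test;
# adjust the annotation name for JUnit 4 vs JUnit 5 as appropriate.
Test-Cases: ${classes;CONCRETE;ANNOTATED;org.junit.jupiter.api.Test}

# In the .bndrun - select the JUnit Platform tester and require the
# test bundle so the resolver pulls it into the run.
-tester: biz.aQute.tester.junit-platform
-runrequires: bnd.identity;id='my.test.bundle'
```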