Bouncycastle, OSGi and uber-jars

I have recently tried to use SSHj in an OSGi bundle.
I had decided to wrap this library and its dependencies in my own bundle (something equivalent to an uber-jar, but compliant with OSGi). In a classic context, you would use the Maven Shade plug-in. In an OSGi context, you can simply use the Maven Bundle plug-in, with the Embed-Dependency instruction.
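
Here is a minimal sketch of such a configuration (the instruction values are illustrative, adapt them to your dependencies):

<plugin>
	<groupId>org.apache.felix</groupId>
	<artifactId>maven-bundle-plugin</artifactId>
	<extensions>true</extensions>
	<configuration>
		<instructions>
			<!-- Embed the compile-time dependencies inside the bundle -->
			<Embed-Dependency>*;scope=compile;inline=false</Embed-Dependency>
			<Embed-Transitive>true</Embed-Transitive>
		</instructions>
	</configuration>
</plugin>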

Anyway, the difficulty here came from the fact that SSHj uses Bouncycastle as a security provider. And you cannot do whatever you want with a security provider. My first attempt to build an all-in-one bundle resulted in a signature error during the Maven build.

Invalid signature file digest for Manifest main attributes

Indeed, some files in the JAR were signed, and some others were not. I solved it with the Maven ANT run plug-in, removing the signature files from my JAR and repackaging it. The JAR could finally be built.
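
For reference, the idea looks like this inside a maven-antrun-plugin target (the paths and file names are illustrative):

<target>
	<!-- Unpack the uber-JAR, drop the inherited signature files, repackage it -->
	<unzip src="${project.build.directory}/my-uber-bundle.jar" dest="${project.build.directory}/unpacked" />
	<delete>
		<fileset dir="${project.build.directory}/unpacked/META-INF" includes="*.SF, *.DSA, *.RSA" />
	</delete>
	<zip destfile="${project.build.directory}/my-uber-bundle.jar" basedir="${project.build.directory}/unpacked" />
</target>

Unfortunately, another error came out later, at runtime.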

JCE cannot authenticate the provider BC

Looking at SSHj's logs, no provider was found, even though the right classes were in the bundle's class path. And everything was working outside OSGi. So, there was no issue with compilation levels or with my code. For some reason, Bouncycastle was not loaded by the JVM.

The explanation is that JCE (Java Cryptography Extension) providers are loaded in a special way by the JVM. First, they must be signed. And it seems that not just any certificate can be used (it must be approved by the JVM vendors). Bouncycastle is signed, but if you wrap it into another JAR, you lose the signature. Besides, these providers must be loaded at startup. In an OSGi context, it means you cannot deploy Bouncycastle as a bundle whenever you want.
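
For the record, registering a provider in plain Java looks like this. Under OSGi, the problem is that JCE refuses the provider once the JAR signature was lost (a minimal sketch):

import java.security.Security;
import org.bouncycastle.jce.provider.BouncyCastleProvider;

// Register Bouncycastle as a security provider.
// JCE verifies the signature of the JAR that contains the provider:
// if you repackaged it, this registration later fails with
// "JCE cannot authenticate the provider BC".
Security.addProvider( new BouncyCastleProvider());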

Finally, I solved my issue by…

  • … following Karaf’s documentation and making Bouncycastle a part of my Karaf distribution (copying it into lib/ext and updating some configuration files). See this POM to automate such a thing for your custom Karaf distribution.
  • … not importing org.bouncycastle.* in my packages. That’s because putting these libraries under lib/ext means their packages are handled like root packages (just like java.*). There is no need to import them then.
  • … making sure all the bundles that depend on Bouncycastle use the same version. I achieved it by updating the dependencyManagement section in my parent POM (see the sketch after this list).
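
Such a dependencyManagement section could look like this (the artifact IDs and the version are illustrative):

<dependencyManagement>
	<dependencies>
		<dependency>
			<groupId>org.bouncycastle</groupId>
			<artifactId>bcprov-jdk15on</artifactId>
			<version>1.54</version>
		</dependency>
		<dependency>
			<groupId>org.bouncycastle</groupId>
			<artifactId>bcpkix-jdk15on</artifactId>
			<version>1.54</version>
		</dependency>
	</dependencies>
</dependencyManagement>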

And although it was not a real issue, I decided to provide SSHj as a Karaf feature. This way, there is no need to make an uber-jar, and I can reuse it as much as I want. See this file for a description of this feature (ssh4j-for-roboconf). The dependencies that are not already OSGi bundles are deployed through the wrap protocol.

I spent about a day solving these issues. That’s why I thought an article might help.

Generating swagger.json files with Enunciate and custom objects mappers

That’s a long title.
To make it short, let’s just say I wanted to generate a Swagger file for Roboconf’s REST API. This API is implemented in Java with Jersey and uses Jackson to handle the mapping between Java and JSon.

After exploring many solutions, I chose Enunciate to generate this file from my API. One of the best aspects of Enunciate is that it uses Javadoc comments (and optionally annotations) to populate the swagger.json file. The generated file can directly be read by Swagger UI.
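
As an illustration, a Jersey operation documented like this hypothetical one gets its Javadoc injected into the generated file:

/**
 * Lists all the applications.
 * @return a non-null list of applications
 */
@GET
@Produces( MediaType.APPLICATION_JSON )
public List<Application> listApplications() {
	// ...
}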

Custom JSon de/serialization

One of the issues I met was related to our Java-JSon binding.
Indeed, our project uses custom object mappers. These mappers are a way for Jackson to tailor the JSon serialization and deserialization from and to Java objects. It is very convenient, as it gives a maximum of control over what Jersey returns. However, it resulted in trouble with the type definitions in the swagger.json file. Indeed, be it Enunciate or the Swagger Java tools, they all use Java type introspection to deduce the shape of the JSon objects. That’s a real problem if you use your own object mapper.

Fortunately, since version 2.6, Enunciate supports Jackson mix-ins. A Jackson mix-in is in fact a Java class that you annotate to customize the shape of the generated JSon structures. Let’s take an example and assume you have a model class that is returned by one of your REST operations.

public class MyModel {
     private String firstName, lastName;
     private int age;

     // All the setters and getters here...
}

Let’s assume this class is defined by someone else, in a different project. How can you customize the JSon generation when you cannot (or do not want to) modify the source code? Mix-ins to the rescue! You define another class and annotate it.

public abstract class MyModelMixin {

     // Do not serialize
     @JsonIgnore
     public abstract int getAge();

     // Change the property name
     @JsonProperty( "name" )
     public abstract String getLastName();
}

Jackson provides a way to associate these two classes in an object mapper. With a Jackson 2.x mapper, the association is a single call (a minimal sketch; addMixIn() was introduced in Jackson 2.5):
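
import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper();
// From now on, this mapper (de)serializes MyModel instances
// as if MyModel carried the mix-in's annotations.
mapper.addMixIn( MyModel.class, MyModelMixin.class );

And since version 2.6, Enunciate also provides a way to define mix-ins through the enunciate.xml file (the file that configures what Enunciate generates).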

<?xml version="1.0"?>
<enunciate 
		xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
		xsi:noNamespaceSchemaLocation="http://enunciate.webcohesion.com/schemas/enunciate-2.6.0.xsd">

	<title>Roboconf REST API</title>
	<description>The REST API for Roboconf's Administration</description>
	<contact name="the Roboconf team" url="http://roboconf.net" />

	<modules>
		<!-- Disabled modules: almost all -->
		<jackson1 disabled="true" />
		<jaxb disabled="true" />
		<jaxws disabled="true" />
		<spring-web disabled="true" />
		<idl disabled="true" />

		<c-xml-client disabled="true" />
		<csharp-xml-client disabled="true" />
		<java-xml-client disabled="true" />
		<java-json-client disabled="true" />
		<gwt-json-overlay disabled="true" />
		<obj-c-xml-client disabled="true" />
		<php-xml-client disabled="true" />
		<php-json-client disabled="true" />
		<ruby-json-client disabled="true" />

		<!-- Enabled modules -->
		<jackson disabled="false" collapse-type-hierarchy="true">
			<mixin source="net.roboconf....MyModelMixin" target="net.roboconf....MyModel" />
		</jackson>
		<jaxrs disabled="false" />
		<docs disabled="false" />
		<swagger disabled="false" basePath="/roboconf-dm" host="localhost:8181" />
	</modules>

</enunciate>

You can define as many mixin elements as you need.

Pros and Cons

I tested this solution yesterday and it works fine.
However, I found two drawbacks to it.

First, you have to write a second model just for the documentation. How can you verify that it remains coherent with what you defined in your custom object mappers? The best option would be to get rid of your mapper and rely on Jackson mix-ins for the Java-JSon conversions too.

The second issue is that Jackson mix-ins only work when you want to remove or update a JSon property. They do not allow you to add properties. This is true in Enunciate, but also in Jackson itself. A ticket was created about it in the Jackson project, but it was marked as “won’t be solved”, as it would raise many problems.

Let’s illustrate it with our example.
Imagine you defined in your object mapper a new JSon field called “fullName”, which is the concatenation of both name parts. Updating your mix-in with…

public abstract class MyModelMixin {

     // Do not serialize
     @JsonIgnore
     public abstract int getAge();

     // Change the property name
     @JsonProperty( "name" )
     public abstract String getLastName();

     // Add a new property (will not work)
     @JsonProperty( "fullName" )
     public abstract String getFullName();
}

… will not work.
You will not find a property called “fullName” in the swagger.json file. Mix-ins cannot address this use case. Maybe a future upgrade of Enunciate could do the trick for the documentation generation, but Jackson does not support it. You have to use custom object mappers.

So, to sum it up, Jackson mix-ins are only to be used when you want to remove or rename JSon properties, not to add new ones.

Workaround

There is no solution to directly mix custom object mappers with Enunciate (or other Swagger tools). They all rely on Java type hierarchies, while custom object mappers leave your creativity unbounded.

So, you have to use a two-step process.
First, let Enunciate do most of the work, and then fix what was generated. I implemented this with a Java class invoked during the project’s Maven build through ANT. Let’s take a look at it. Here are the relevant parts of the POM file.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

	<!-- ... -->
	
	<properties>
		<enunciate.version>2.6.0</enunciate.version>
	</properties>
	
	<dependencies>
		
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<scope>test</scope>
		</dependency>

		<dependency>
			<groupId>com.google.code.gson</groupId>
			<artifactId>gson</artifactId>
			<version>2.2.2</version>
			<scope>test</scope>
		</dependency>
	</dependencies>
	
	<build>
		<plugins>			
			<plugin>
				<groupId>com.webcohesion.enunciate</groupId>
				<artifactId>enunciate-maven-plugin</artifactId>
				<version>${enunciate.version}</version>
				<executions>
					<execution>
						<goals>
							<goal>docs</goal>
						</goals>
					</execution>
				</executions>
				<configuration>
					<docsDir>${project.build.directory}/docs</docsDir>
					<configFile>${project.basedir}/enunciate.xml</configFile>
				</configuration>
			</plugin>
			
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-antrun-plugin</artifactId>
				<executions>
					<execution>
						<phase>test-compile</phase>
						<configuration>
							<target>
								<property name="test_classpath" refid="maven.test.classpath"/>
								<java classname="net.roboconf.dm.rest.services.swagger.UpdateSwaggerJson" classpath="${test_classpath}" />
							</target>	
						</configuration>
						<goals>
							<goal>run</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
			
			<plugin>
				<groupId>org.codehaus.mojo</groupId>
				<artifactId>build-helper-maven-plugin</artifactId>
				<version>1.12</version>
				<executions>
					<execution>
						<goals>
							<goal>attach-artifact</goal>
						</goals>
						<configuration>
							<artifacts>
								<artifact>
									<file>${project.build.directory}/docs/apidocs/ui/swagger.json</file>
									<type>json</type>
									<classifier>swagger</classifier>
								</artifact>
							</artifacts>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>

</project>

As you can see, there are 3 plug-in invocations in this sample. The first one is the Enunciate Maven plug-in. It generates the documentation during the process-sources phase. The second one is the Maven ANT run plug-in, which executes a class located in the test sources. Indeed, since it is a build helper, there is no reason to put it in the main sources. This is why we perform this step during the test-compile phase. Notice we must pass the class path of the test sources. Finally, we use the Maven build helper plug-in to attach the generated swagger.json file to the Maven artifacts. This way, the file will be deployed on our Maven repository along with the other built artifacts.

Let’s now take a look at our updater class.
It uses GSon to parse and update the JSon file.

import java.io.File;
import java.io.IOException;
import java.io.StringWriter;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// ... plus the project's own imports (net.roboconf.*)

public class UpdateSwaggerJson {

	final Set<Class<?>> processedClasses = new HashSet<> ();


	/**
	 * @param args
	 */
	public static void main( String[] args ) {

		try {
			UpdateSwaggerJson updater = new UpdateSwaggerJson();
			JsonObject newDef = updater.prepareNewDefinitions();
			updater.updateSwaggerJson( newDef );

		} catch( Exception e ) {
			e.printStackTrace();
		}
	}


	/**
	 * Prepares the JSon object to inject as the new definitions in the swagger.json file.
	 * @return a non-null object
	 * @throws IOException if something failed
	 */
	public JsonObject prepareNewDefinitions() throws IOException {

		ObjectMapper mapper = JSonBindingUtils.createObjectMapper();
		StringWriter writer = new StringWriter();
		JsonObject newDef = new JsonObject();

		// Create a model, as complete as possible
		TestApplication app = new TestApplication();
		app.bindWithApplication( "externalExportPrefix1", "application 1" );
		app.bindWithApplication( "externalExportPrefix1", "application 2" );
		app.bindWithApplication( "externalExportPrefix2", "application 3" );

		app.setName( "My Application with special chàràcters" );
		app.getTemplate().externalExports.put( "internalGraphVariable", "variableAlias" );
		app.getTemplate().setExternalExportsPrefix( "externalExportPrefix" );
		app.getTemplate().setDescription( "some description" );

		// Serialize things and generate the examples
		// (*) Applications
		writer = new StringWriter();
		mapper.writeValue( writer, app );
		String s = writer.toString();
		convertToTypes( s, Application.class, newDef );

		// (*) Application Templates
		writer = new StringWriter();
		mapper.writeValue( writer, app.getTemplate());
		s = writer.toString();
		convertToTypes( s, ApplicationTemplate.class, newDef );

		// Etc...

		return newDef;
	}


	/**
	 * @param newDef the new "definitions" object
	 * @throws IOException if something went wrong
	 */
	private void updateSwaggerJson( JsonObject newDef ) throws IOException {

		File f = new File( "target/docs/apidocs/ui/swagger.json" );
		if( ! f.exists())
			throw new RuntimeException( "The swagger.json file was not found." );

		JsonParser jsonParser = new JsonParser();
		String content = Utils.readFileContent( f );

		// Hack the file content directly here.
		// Do whatever raw operations you want.

		JsonElement jsonTree = jsonParser.parse( content );

		Set<String> currentTypes = new HashSet<> ();
		for( Map.Entry<String,JsonElement> entry : jsonTree.getAsJsonObject().get( "definitions" ).getAsJsonObject().entrySet()) {
			currentTypes.add( entry.getKey());
		}

		Set<String> newTypes = new HashSet<> ();
		for( Map.Entry<String,JsonElement> entry : newDef.entrySet()) {
			newTypes.add( entry.getKey());
		}

		currentTypes.removeAll( newTypes );
		for( String s : currentTypes ) {
			System.out.println( "Type not appearing in the updated swagger definitions: " + s );
		}

		Gson gson = new GsonBuilder().setPrettyPrinting().create();
		jsonTree.getAsJsonObject().add( "definitions", jsonParser.parse( gson.toJson( newDef )));
		String json = gson.toJson( jsonTree );
		Utils.writeStringInto( json, f );
	}


	/**
	 * Creates a JSon object from a serialization result.
	 * @param serialization the serialization result
	 * @param clazz the class for which this serialization was made
	 * @param newDef the new definition object to update
	 */
	public void convertToTypes( String serialization, Class<?> clazz, JsonObject newDef ) {
		convertToTypes( serialization, clazz.getSimpleName(), newDef );
		this.processedClasses.add( clazz );
	}


	/**
	 * Creates a JSon object from a serialization result.
	 * @param serialization the serialization result
	 * @param className a class or type name
	 * @param newDef the new definition object to update
	 */
	public void convertToTypes( String serialization, String className, JsonObject newDef ) {

		JsonParser jsonParser = new JsonParser();
		JsonElement jsonTree = jsonParser.parse( serialization );

		// Creating the swagger definition
		JsonObject innerObject = new JsonObject();

		// Start adding basic properties
		innerObject.addProperty( "title", className );
		innerObject.addProperty( "definition", "" );
		innerObject.addProperty( "type", jsonTree.isJsonObject() ? "object" : jsonTree.isJsonArray() ? "array" : "string" );

		// Prevent errors with classic Swagger UI
		innerObject.addProperty( "properties", "" );

		// Inner properties.
		// Add the tree as it is, so that arrays are supported too.
		innerObject.add( "example", jsonTree );

		// Update our global definition
		newDef.add( "json_" + className, innerObject );
	}
}

And that’s it.
In our implementation, we dropped the real property definitions from the swagger.json file. Instead, we use the example key. Swagger UI displays it well and does not show any error. Instead of showing some kind of schema, it shows an example of a JSon structure.
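
To give you an idea, here is roughly what a definition produced by convertToTypes() looks like (the content of example is illustrative, since it depends on your custom serializers):

"json_Application": {
	"title": "Application",
	"definition": "",
	"type": "object",
	"properties": "",
	"example": {
		"name": "My Application with special chàràcters",
		"desc": "some description"
	}
}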

Testing the Coherence with our Mapper

How can we be sure this class generates examples for all the types we use in our custom object mapper? Well, let’s just write a unit test for that!

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import org.junit.Assert;
import org.junit.Test;

// ... plus the project's own imports (net.roboconf.*)

public class UpdateSwaggerJsonTest {

	@Test
	public void verifyProcessedClasses() throws Exception {

		UpdateSwaggerJson updater = new UpdateSwaggerJson();
		updater.prepareNewDefinitions();

		Set<Class<?>> classes = new HashSet<> ();

		// You need a registry somewhere that lists all the classes managed by your object mapper
		classes.addAll( JSonBindingUtils.SERIALIZERS.keySet());
		
		// Remove those you processed in the updater.
		classes.removeAll( updater.processedClasses );

		// They should all have been processed.
		Assert.assertEquals( Collections.emptySet(), classes );
	}
}

Just to show you how we dealt with this class registry, here is a short snippet taken from our custom object mapper.

import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.core.Version;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

// ... plus the project's own imports (net.roboconf.*)

public final class JSonBindingUtils {

	public static final Map<Class<?>,? super JsonSerializer<?>> SERIALIZERS = new HashMap<> ();

	static {
		SERIALIZERS.put( Instance.class, new InstanceSerializer());
		SERIALIZERS.put( ApplicationTemplate.class, new ApplicationTemplateSerializer());
		SERIALIZERS.put( Application.class, new ApplicationSerializer());
	}


	/**
	 * Creates a mapper with specific binding for Roboconf types.
	 * @return a non-null, configured mapper
	 */
	@SuppressWarnings( { "unchecked", "rawtypes" } )
	public static ObjectMapper createObjectMapper() {

		ObjectMapper mapper = new ObjectMapper();
		mapper.configure( DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false );
		SimpleModule module = new SimpleModule( "RoboconfModule", new Version( 1, 0, 0, null, null, null ));

		for( Map.Entry<Class<?>,? super JsonSerializer<?>> entry : SERIALIZERS.entrySet())
			module.addSerializer((Class) entry.getKey(), (JsonSerializer) entry.getValue());

		mapper.registerModule( module );
		return mapper;
	}
}

Conclusion

This solution (or workaround) may not seem ideal.
However, custom object mappers are somehow a workaround themselves. They require you to write code. Therefore, it is not surprising that we have to write a little more code to hook up with Swagger-based documentation.

IMO, the code shown here remains quite simple to maintain.
And it allows you to add unit tests to verify assertions on your swagger.json file.

I created a Gist for all the sources here. You can see them in action in the sources of Roboconf’s REST API.

Auto-repaired connections with RabbitMQ

Here is a short article about making RabbitMQ’s Java clients automatically repair their connections. Indeed, network issues may sometimes result in broken connections.

Since version 3.3.0, the official Java client for RabbitMQ has options for this situation. Almost everything is configured in the factory that creates the connections (and, from there, the channels and consumers).

ConnectionFactory factory = new ConnectionFactory();
factory.setUsername( messageServerUsername );
factory.setPassword( messageServerPassword );

// Timeout for connection establishment: 5s
factory.setConnectionTimeout( 5000 );

// Configure automatic reconnections
factory.setAutomaticRecoveryEnabled( true );

// Recovery interval: 10s
factory.setNetworkRecoveryInterval( 10000 );

// Exchanges and so on should be redeclared if necessary
factory.setTopologyRecoveryEnabled( true );

When an established connection gets broken, the client will automatically try to recover everything, including queues, exchanges and bindings. Please refer to the user guide for more details. From my experience, the only part that can be a problem concerns consumers.

Indeed, in my code, I used to rely on QueueingConsumer. This class works perfectly, except that it is deprecated and that it breaks connection recovery. So, if you are using it and you want connection recovery, you MUST replace it with your own consumer.

import java.io.IOException;
import java.util.logging.Logger;

// These ones serve the (elided) message processing below.
import net.roboconf.core.utils.Utils;
import net.roboconf.messaging.api.messages.Message;
import net.roboconf.messaging.api.utils.SerializationUtils;

import com.rabbitmq.client.AMQP.BasicProperties;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.DefaultConsumer;
import com.rabbitmq.client.Envelope;
import com.rabbitmq.client.ShutdownSignalException;

/**
 * Notice: QueueingConsumer is deprecated, hence this implementation that supports recovery.
 */
public class MyConsumer extends DefaultConsumer {

	private final Logger logger = Logger.getLogger( getClass().getName());


	/**
	 * Constructor.
	 * @param channel
	 */
	public MyConsumer( Channel channel ) {
		super( channel );
	}


	@Override
	public void handleDelivery( String consumerTag, Envelope envelope, BasicProperties properties, byte[] body )
	throws IOException {

		// Do what you have to do with your message.
		// Prefer a short processing...
	}


	@Override
	public void handleShutdownSignal( String consumerTag, ShutdownSignalException sig ) {

		if( sig.isInitiatedByApplication()) {
			this.logger.fine( "The connection to the messaging server was shut down." + id( consumerTag ));

		} else if( sig.getReference() instanceof Channel ) {
			int nb = ((Channel) sig.getReference()).getChannelNumber();
			this.logger.fine( "A RabbitMQ consumer was shut down. Channel #" + nb + ", " + id( consumerTag ));

		} else {
			this.logger.fine( "A RabbitMQ consumer was shut down." + id( consumerTag ));
		}
	}


	@Override
	public void handleCancelOk( String consumerTag ) {
		this.logger.fine( "A RabbitMQ consumer stops listening to new messages." + id( consumerTag ));
	}


	@Override
	public void handleCancel( String consumerTag ) throws IOException {
		this.logger.fine( "A RabbitMQ consumer UNEXPECTABLY stops listening to new messages." + id( consumerTag ));
	}


	/**
	 * @param consumerTag a consumer tag
	 * @return a readable ID of this consumer
	 */
	private String id( String consumerTag ) {

		StringBuilder sb = new StringBuilder();
		sb.append( " Consumer tag = " );
		sb.append( consumerTag );
		sb.append( ")" );

		return sb.toString();
	}
}
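
Given an existing channel, registering such a consumer is a one-liner (the queue name and the auto-acknowledgment flag are illustrative):

// With automatic recovery enabled, the Java client will re-register
// this consumer once a broken connection has been repaired.
channel.basicConsume( "my-queue", true, new MyConsumer( channel ));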

Before testing for real, and to be able to debug such code efficiently, you should add a recovery listener on your connections.

Channel channel = factory.newConnection().createChannel();

// Add a recoverable listener (when broken connections are recovered).
// Given the way the RabbitMQ factory is configured, the channel should be "recoverable".
((Recoverable) channel).addRecoveryListener( new MyRecoveryListener());

And a logging listener…

import java.util.logging.Logger;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Recoverable;
import com.rabbitmq.client.RecoveryListener;

public class MyRecoveryListener implements RecoveryListener {

	private final Logger logger = Logger.getLogger( getClass().getName());


	@Override
	public void handleRecovery( Recoverable recoverable ) {

		if( recoverable instanceof Channel ) {
			int channelNumber = ((Channel) recoverable).getChannelNumber();
			this.logger.fine( "Connection to channel #" + channelNumber + " was recovered." );
		}
	}
}

Now, you can test recovery for real.
Start your RabbitMQ server and then your client. Verify the connection is established. Then, turn off RabbitMQ and check your logs: the connection should appear as broken. Turn the server back on, wait a few seconds and verify the connection was recovered.

If you do not want to turn off RabbitMQ, you can also play with your firewall (e.g. iptables) and temporarily disable connections to the server. Both cases work with the code above.

Notice that connection recovery only works for connections that were successfully established at least once. If you start the RabbitMQ server AFTER your client, recovery will not work. You may then consider using the Lyra project.

Passing parameters to a JUnit runner

This article explains how to pass parameters to a JUnit runner (through annotated test classes). Although I used an extension of Pax-Exam’s JUnit runner, this can be applied to any JUnit runner.

In a few words…

I have a custom JUnit runner that checks assertions on the test environment before running the tests. By assertions, I mean “is Docker installed on the machine?”, “is RabbitMQ installed?”, “is RabbitMQ installed with credentials user/pwd?”, etc.

These tests are not unit tests, but integration tests.
And they rely on JUnit. Or more exactly, they rely on Pax-Exam, which itself relies on JUnit. Pax-Exam is a tool that allows you to run tests within an OSGi environment (Apache Karaf in my case). Tests that use a probe work as follows: first, the OSGi container is deployed and/or started. Then, a probe is injected into this container (as an OSGi bundle), and it is this probe that runs the tests.

Usually, with JUnit, one would use the Assume functions to verify that a test can run. If a condition verified by Assume is not satisfied, the test is skipped. With Pax-Exam, this would not be very efficient: before running the test, and thus detecting that an assumption is wrong, the OSGi container would already have been set up. And this takes time. What I wanted was to verify assertions on the environment BEFORE the container was set up. So, I sub-classed Pax-Exam’s runner, which itself extends JUnit’s runner.
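
For the record, the classic approach looks like the following sketch. The dockerIsInstalled() helper is hypothetical.

import org.junit.Assume;
import org.junit.Test;

public class SomeDockerTest {

	@Test
	public void testSomethingWithDocker() {
		// Skip the test if Docker is missing... but with Pax-Exam,
		// the OSGi container would already have been provisioned at this point.
		Assume.assumeTrue( dockerIsInstalled());

		// The actual test...
	}

	private boolean dockerIsInstalled() {
		// Hypothetical environment check
		return false;
	}
}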

And since not all my tests have the same requirements, I had to find a way to pass parameters to my JUnit runner. Notice these are not parameterized tests. Parameterized tests imply passing parameters to a test class. What I wanted was to pass parameters to the runner (the class that runs the tests) through the test class.

The solution

Well, nothing original here. I created a custom Java annotation.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface RoboconfITConfiguration {

	/**
	 * @return true if the test requires RabbitMQ running with default credentials
	 */
	boolean withRabbitMq() default true;

	/**
	 * @return true if the test requires RabbitMQ running with "advanced" credentials
	 */
	boolean withComplexRabbitMq() default false;

	/**
	 * @return true if the test requires Docker to be installed on the local machine
	 */
	boolean withDocker() default false;
}

It is used by my test classes to specify assertions on the environment.

@RunWith( RoboconfPaxRunner.class )
@RoboconfITConfiguration( withDocker = true, withComplexRabbitMq = true )
@ExamReactorStrategy( PerMethod.class )
public class LocalDockerWithAgentChecksTest {
	// ...
}

When my runner loads the test class, it checks whether it is annotated. If so, it verifies the assumptions about the environment. If one of them fails, the test is skipped. This has the advantage of being able to compile and run as many tests as possible, even when the build environment is not complete.

Here is the overall look of my runner.
It only adds a thin overlay on the default Pax-Exam runner.

import org.junit.runner.Description;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunNotifier;
import org.junit.runners.model.InitializationError;
import org.ops4j.pax.exam.junit.PaxExam;

// ... plus the project's own imports (net.roboconf.*)

public class RoboconfPaxRunner extends PaxExam {

	private final Class<?> testClass;


	/**
	 * Constructor.
	 * @param klass
	 * @throws InitializationError
	 */
	public RoboconfPaxRunner( Class<?> klass ) throws InitializationError {
		super( klass );
		this.testClass = klass;
	}


	@Override
	public void run( RunNotifier notifier ) {

		boolean runTheTest = true;
		if( this.testClass.isAnnotationPresent( RoboconfITConfiguration.class )) {
			RoboconfITConfiguration annotation = this.testClass.getAnnotation( RoboconfITConfiguration.class );

			// Default RMQ settings
			if( annotation.withRabbitMq()
					&& ! RabbitMqTestUtils.checkRabbitMqIsRunning()) {
				Description description = Description.createSuiteDescription( this.testClass );
				notifier.fireTestAssumptionFailed( new Failure( description, new Exception( "RabbitMQ is not running." )));
				runTheTest = false;
			}

			// Advanced RMQ settings
			else { /* etc. */ }
		}

		// No annotation? Consider RMQ must be installed by default.
		else if( ! RabbitMqTestUtils.checkRabbitMqIsRunning()) {
			Description description = Description.createSuiteDescription( this.testClass );
			notifier.fireTestAssumptionFailed( new Failure( description, new Exception( "RabbitMQ is not running." )));
			runTheTest = false;
		}

		// If everything is good, run the test
		if( runTheTest )
			super.run( notifier );
	}
}