Mocking Nexus API for a Local Maven Repository

For the Roboconf project, the build of our Docker images relies on the Sonatype OSS repository.
We use Nexus' Core API to dynamically retrieve Maven artifacts. That is really convenient. However, we quickly needed to be able to download local artifacts for test purposes, without going through a remote Maven repository. Let's call it a developer scope.

After exploring several solutions, we finally decided to mock Nexus' API locally and to tell our build process where to download artifacts from. We really wanted something light and simple. Besides, we were only interested in the redirect operation. Running a real Nexus instance was too heavy. And we really wanted to use a local Maven repository, the same one that developers usually populate. The idea of mounting it as a Docker volume was very appealing.

So, we made a partial implementation of Nexus’ Core API.
We used Node.js (efficient for I/O) and Restify. Restify makes it very easy to implement a REST API. And Node.js comes with a very small Docker image (we took the one based on Alpine).

The server class is quite simple.
We expect the local Maven repository to be mounted as a volume in the Docker container. We handle SHA1 requests specifically, as developers generally do not use the profile that generates hashes.

'use strict';
  
var restify = require('restify');
var fs = require('fs');


/**
 * Computes the hash (SHA1) of a file.
 * <p>
 * By default, local Maven repositories do not contain
 * hashes as we do not activate the profiles. So, we compute them on the fly.
 * </p>
 * 
 * @param filePath
 * @param res
 * @param next
 * @returns nothing
 */
function computeSha1(filePath, res, next) {

  var crypto = require('crypto'),
    hash = crypto.createHash('sha1'),
    stream = fs.createReadStream(filePath);

  stream.on('data', function (data) {
    hash.update(data, 'utf8')
  });

  stream.on('end', function () {
    var result = hash.digest('hex');
    res.end(result);
    next();
  });
}


/**
 * The function that handles the response for the "redirect" operation.
 * @param req
 * @param res
 * @param next
 * @returns nothing
 */
function respond(req, res, next) {

  var fileName = req.params.a +
  '-' + req.params.v +
  '.' + req.params.p;

  // Replace ALL the dots in the group ID
  // (a string pattern would only replace the first one).
  var filePath = '/home/maven/repository/' +
    req.params.g.replace(/\./g, '/') +
    '/' + req.params.a +
    '/' + req.params.v +
    '/' + fileName;

  fs.exists(filePath, function(exists){
    if (filePath.indexOf('.sha1', filePath.length - 5) !== -1) {
      filePath = filePath.slice(0,-5);
      computeSha1(filePath,res, next);
    }

    else if (! exists) {
      // 404 is the right status for a missing artifact
      res.writeHead(404, {'Content-Type': 'text/plain'});
      res.end('ERROR: file ' + filePath + ' does not exist');
      next();
    }

    else {
      res.writeHead(200, {
        'Content-Type': 'application/octet-stream',
        'Content-Disposition' : 'attachment; filename=' + fileName});
      fs.createReadStream(filePath).pipe(res);
      next();
    }
  });
}


// Server setup

const server = restify.createServer({
  name: 'mock-for-nexus-api',
  version: '1.0.0'
});

server.use(restify.plugins.queryParser());
server.get('/redirect', respond);

server.listen(9090, function() {
  console.log('%s listening at %s', server.name, server.url);
});
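
The Dockerfile below runs npm install and npm start, so the project also needs a package.json next to the server script. Here is a minimal sketch of what it could look like; the file name (server.js) and the Restify version are assumptions for illustration, not the exact Roboconf files.

```json
{
  "name": "mock-for-nexus-api",
  "version": "1.0.0",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "restify": "^7.0.0"
  }
}
```

Any Restify version from 5.x onwards exposes the restify.plugins namespace used in the server code.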

Finally, here is the Dockerfile, which ships Node.js and our web application.

FROM node:8-alpine

LABEL maintainer="The Roboconf Team" \
      github="https://github.com/roboconf"

EXPOSE 9090
COPY ./*.* /usr/src/app/
WORKDIR /usr/src/app/
RUN npm install
CMD [ "npm", "start" ]

We then run…

docker run -d --rm -p 9090:9090 -v /home/me/.m2:/home/maven:ro roboconf/mock-for-nexus-api

Our other build processes then download local Maven artifacts from http://localhost:9090/redirect.
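
As the respond function above shows, the redirect operation reads the Maven coordinates from the query string. Here is a sketch of how a client builds such a request; the coordinates below are hypothetical examples.

```shell
# g = groupId, a = artifactId, v = version, p = packaging (file extension)
G=net.roboconf; A=roboconf-core; V=0.9.4; P=jar
URL="http://localhost:9090/redirect?g=$G&a=$A&v=$V&p=$P"
echo "$URL"

# With the container running, download the artifact:
# curl "$URL" -o "$A-$V.$P"
```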
You can find the full project on GitHub. If anyone faces the same problem, I hope this article provides some hints.

Checking root resources with Checkstyle and Maven

The Maven plug-in for Checkstyle lets you run Checkstyle on your project's resources. However, its options are tailored to a Maven structure: it was designed to work on the source, resource and test directories. Nothing is built-in for root resources, such as your pom.xml files.

I had submitted an enhancement request on Apache's JIRA. At the time, I had even thought about submitting a patch for it. Other things happened and I somehow set this issue aside to focus on other priorities. Fortunately, Noémi Balassa posted a workaround yesterday. I tested it this evening and it works great. I thought it might be useful to share it here.

Here is the sample configuration I used for the Maven Checkstyle plug-in.

<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-checkstyle-plugin</artifactId>
	<version>2.13</version>
	<configuration>
		<consoleOutput>true</consoleOutput>
		<logViolationsToConsole>true</logViolationsToConsole>
		<includeTestSourceDirectory>true</includeTestSourceDirectory>
	</configuration>
			
	<executions>
		<!-- ... -->
					
		<!-- XML Files -->
		<execution>
			<id>check-xml</id>
			<phase>process-sources</phase>
			<goals>
				<goal>check</goal>
			</goals>
			<configuration>
				<configLocation>${build.resources.url}/checkstyle/checkstyle-xml-rules.xml</configLocation>
				<headerLocation>${build.resources.url}/checkstyle/header-xml.txt</headerLocation>

				<includeResources>false</includeResources>
				<includeTestResources>false</includeTestResources>

				<!-- Process the pom.xml file too -->
				<sourceDirectory>${project.basedir}</sourceDirectory>
				<testSourceDirectories />
				<includes>**/*.xml, **/*.xsd</includes>
				<excludes>**/target/**/*, **/*.properties</excludes>
			</configuration>
		</execution>
	</executions>
</plugin>

It goes in the build section of your POM.

As you can see, we treat the project's directory as a source directory. We exclude Maven's target directory from the search. Resource and test directories are already covered by the source processing, so there is no need to enable anything specific for them.
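
The configuration references a checkstyle-xml-rules.xml file. As a sketch (this is an assumption for illustration, not the exact Roboconf rule set), such a file can be as simple as a header check applied to XML files; the maven-checkstyle-plugin exposes the headerLocation parameter to the rules file as the checkstyle.header.file property.

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
	"-//Puppy Crawl//DTD Check Configuration 1.3//EN"
	"http://www.puppycrawl.com/dtds/configuration_1_3.dtd">
<module name="Checker">
	<!-- Verify that every XML/XSD file starts with the expected license header -->
	<module name="RegexpHeader">
		<property name="headerFile" value="${checkstyle.header.file}" />
		<property name="fileExtensions" value="xml, xsd" />
	</module>
</module>
```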

It works well and it is not that complicated.

Web-hosted CSS and the Maven Javadoc plugin

The Maven Javadoc plug-in can generate JAR files that contain the Javadoc of a given project/module. This is necessary, for example, when you want to publish your artifacts to Maven Central or to the Sonatype OSS repositories.

If you look at the options of this plug-in, you can see there are only two ways to use a custom CSS stylesheet: either the CSS file is located on your local disk, or it is located in a Maven dependency. For a project I am working on, we had this stylesheet hosted on a web server. We wanted Maven to download this file during the build process and add it to the generated JAR. I thus submitted a PR to allow the stylesheet to be retrieved from a URL.

After some discussions, Michael Osipov and I finally agreed that there were alternative solutions that did not require changing the code. So, here is the option I kept: during the generate-sources phase, we download the remote CSS stylesheet somewhere on the disk, and we then pass the file's location to the Javadoc plug-in.

<build>
	<plugins>
		<!-- Download the stylesheet for the javadoc -->
		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-antrun-plugin</artifactId>
			<version>1.8</version>
			<executions>
				<execution>
					<phase>generate-sources</phase>
					<configuration>
						<exportAntProperties>true</exportAntProperties>
						<target>
							<tempfile property="dir" destdir="${java.io.tmpdir}" />
							<mkdir dir="${dir}" />
							<property name="javadoc.stylesheet" value="${dir}/roboconf-javadoc.css" />
							<get src="${build.resources.url}/javadoc/stylesheet.css" dest="${javadoc.stylesheet}" />
						</target>	
					</configuration>
					<goals>
						<goal>run</goal>
					</goals>
				</execution>
			</executions>
		</plugin>

		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-javadoc-plugin</artifactId>
			<version>${javadoc.plugin.version}</version>
			<executions>
				<execution>
					<id>attach-javadoc</id>
					<goals>
						<goal>jar</goal>
					</goals>
					<configuration>
						<stylesheetfile>${javadoc.stylesheet}</stylesheetfile>
					</configuration>
				</execution>
			</executions>
		</plugin>
	</plugins>
</build>

Notice that I use the maven-antrun-plugin to create a directory under the OS' temporary directory. The plug-in must be at least version 1.7, as we export ANT properties as Maven properties. Notice also that I download the file only once and cache it in the same location for all the modules. This avoids downloading it N times in a multi-module project.

You might also use a dedicated download plug-in (such as the download-maven-plugin) instead of ANT. Eventually, Michael also suggested a pure CSS solution, using…

@import url("http://example.com/my-stylesheet.css");

But this is not exactly what we wanted to do.
With this last option, Maven artifacts assume the web server will ALWAYS be available, while the initial approach only requires the server to be online at build time.

The PR could have been useful, but I was not going to fight about it. This use case is somewhat limited, and the alternatives listed here are not that hard to set up.

Finding dependency artifacts in your Maven plug-in

This article explains how to retrieve the location of dependency artifacts in a Maven plug-in. This solution only applies to Maven 3 (more exactly, to versions >= 3.1). Indeed, since version 3.1, Maven's repository management library, Aether, has been hosted by Eclipse and not by Sonatype anymore.

What this article explains is not new.
I just gathered various pieces from here and there. However, they were not that easy to find and this is why I wrote this post.

Here is the beginning of the story.
I have a Maven plug-in with its own project type. The packaging is a ZIP file, and dependencies on the same kind of project can be specified in the POM. So, basically, my plug-in needed to be able to locate a dependency's built package, be it in a remote Maven repository, in the local repository, or even in the reactor. Looking into the local repository was not hard.

Injecting the local repository component in my mojo was working fine.

@Parameter( defaultValue = "${localRepository}", readonly = true, required = true )
private ArtifactRepository local;

@Override
public void execute() throws MojoExecutionException, MojoFailureException {

	for( Artifact unresolvedArtifact : this.project.getDependencyArtifacts()) {

		// Find the artifact in the local repository.
		Artifact art = this.local.find( unresolvedArtifact );
		File file = art.getFile();
		// ...
	}
}

What was more complicated was dealing with reactor packaging.
What does that mean? If you run mvn clean install, Maven packages your projects and copies the artifacts into the local repository. But if you run mvn clean package, the built artifacts are not copied to the local repository; they are only available in the target directories.

Consequently, if you have a multi-module project with dependencies across these modules (one of them depends on another one), and you run mvn clean package, the build will fail because one of the dependencies was not resolved. In fact, Maven, or at least Maven 3, is perfectly capable of resolving such an artifact. Its location will not be in the local repository but under the module's target directory. You only need to know which code to use.

The solution to this problem lies in a single thing: we address remote, local and reactor dependency resolution in the same way.

package whatever;

import java.io.File;
import java.util.List;

import org.apache.maven.artifact.Artifact;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugin.MojoFailureException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;
import org.eclipse.aether.RepositorySystem;
import org.eclipse.aether.RepositorySystemSession;
import org.eclipse.aether.artifact.DefaultArtifact;
import org.eclipse.aether.repository.RemoteRepository;
import org.eclipse.aether.resolution.ArtifactRequest;
import org.eclipse.aether.resolution.ArtifactResolutionException;
import org.eclipse.aether.resolution.ArtifactResult;

@Mojo( name="mojoName" )
public class ReactorDependenciesResolverMojo extends AbstractMojo {

	@Parameter( defaultValue = "${project}", readonly = true )
	private MavenProject project;

	@Component
	private RepositorySystem repoSystem;

	@Parameter( defaultValue = "${repositorySystemSession}", readonly = true, required = true )
	private RepositorySystemSession repoSession;

	@Parameter( defaultValue = "${project.remoteProjectRepositories}", readonly = true, required = true )
	private List<RemoteRepository> repositories;

	@Override
	public void execute() throws MojoExecutionException, MojoFailureException {

		for( Artifact unresolvedArtifact : this.project.getDependencyArtifacts()) {

			// Here, it becomes messy. We ask Maven to resolve the artifact's location.
			// It may imply downloading it from a remote repository,
			// searching the local repository or looking into the reactor's cache.

			// To achieve this, we must use Aether
			// (the dependency mechanism behind Maven).
			String artifactId = unresolvedArtifact.getArtifactId();
			org.eclipse.aether.artifact.Artifact aetherArtifact = new DefaultArtifact(
					unresolvedArtifact.getGroupId(),
					unresolvedArtifact.getArtifactId(),
					unresolvedArtifact.getClassifier(),
					unresolvedArtifact.getType(),
					unresolvedArtifact.getVersion());

			ArtifactRequest req = new ArtifactRequest().setRepositories( this.repositories ).setArtifact( aetherArtifact );
			ArtifactResult resolutionResult;
			try {
				resolutionResult = this.repoSystem.resolveArtifact( this.repoSession, req );

			} catch( ArtifactResolutionException e ) {
				throw new MojoExecutionException( "Artifact " + artifactId + " could not be resolved.", e );
			}

			// The file should exist, but we never know.
			File file = resolutionResult.getArtifact().getFile();
			if( file == null || ! file.exists()) {
				getLog().warn( "Artifact " + artifactId + " has no attached file. Its content will not be copied in the target model directory." );
				continue;
			}

			// Do whatever you want with the file...
		}
	}
}

I also created a Gist about it.
If you want an example of a unit test that configures such a mojo, please refer to the Gist. And if you want tests under real conditions, consider writing integration tests with the Maven Invoker plug-in, rather than (only) relying on unit tests.
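
As a sketch, here is a minimal configuration for the Maven Invoker plug-in; the version and the directory layout are assumptions to adapt to your project. It installs the freshly built plug-in into a staging repository, then builds the sample projects found under src/it against it.

```xml
<plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-invoker-plugin</artifactId>
	<version>3.0.0</version>
	<configuration>
		<!-- Sample projects that exercise the plug-in under real conditions -->
		<projectsDirectory>src/it</projectsDirectory>
		<cloneProjectsTo>${project.build.directory}/it</cloneProjectsTo>
		<!-- An optional Groovy or BeanShell script that verifies the build output -->
		<postBuildHookScript>verify</postBuildHookScript>
	</configuration>
	<executions>
		<execution>
			<goals>
				<goal>install</goal>
				<goal>run</goal>
			</goals>
		</execution>
	</executions>
</plugin>
```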