Let's assume that we have to execute a bunch of acceptance tests with a BDD framework like Cucumber as part of a Maven build.
Using the Maven Failsafe Plugin is not complex, but it has an implicit requirement:
The container that hosts the implementation we are about to test needs to be already running.
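Just to give an idea of the kind of setup I am talking about, this is more or less the minimal Failsafe binding; the version and the test naming conventions obviously depend on your project:

<plugin>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.12.2</version>
    <executions>
        <execution>
            <goals>
                <!-- run the integration tests during integration-test, fail the build only at verify -->
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>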
Many containers like Jetty or JBoss provide their own Maven plugins that allow starting the server as part of a Maven job. And there is also the good, generic Maven Cargo plugin, which offers the same behaviour for many different containers.
These plugins allow you, for instance, to start the server at the beginning of a Maven job, deploy the implementation that you want to test, fire your tests and stop the server at the end.
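As an example, with Cargo that start/deploy/test/stop cycle is usually wired to the integration test phases more or less like this; the container id and the version here are only an illustration, not something taken from my build:

<plugin>
    <groupId>org.codehaus.cargo</groupId>
    <artifactId>cargo-maven2-plugin</artifactId>
    <version>1.4.3</version>
    <configuration>
        <container>
            <!-- an embedded Jetty, just as an example of a supported container -->
            <containerId>jetty7x</containerId>
            <type>embedded</type>
        </container>
    </configuration>
    <executions>
        <execution>
            <id>start-container</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>start</goal>
            </goals>
        </execution>
        <execution>
            <id>stop-container</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
</plugin>

For a war project, Cargo picks up the module's own artifact as the default deployable, so the implementation under test is already in place when Failsafe runs.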
All the mechanisms that I have described work and they are usually very useful for the various testing approaches.
Unluckily, I cannot apply this solution if my container is not a supported one. Unless, obviously, I decide to write a custom plugin or to add support for my specific container to Maven Cargo.
In my specific case I had to find a way to use Red Hat's JBoss Fuse, a Karaf based container.
I decided to try to keep it easy: not to write a full featured Maven plugin, but to rely instead on the GMaven plugin, or, as I have recently read on the internet, the "Poor Man's Gradle".
GMaven is basically a plugin that adds Groovy support to your Maven job, allowing you to execute snippets of Groovy as part of it. I like it because it allows me to inline scripts directly in the pom.xml. It also permits you to define your script in a separate file and execute it, but that is exactly the same behaviour you could achieve with plain Java and the Maven Exec Plugin; a solution that I do not like much, because it hides the implementation and makes it harder to see what the full build is trying to achieve.
Obviously this approach makes sense only if the scripts you are about to write are simple enough to be self-describing.
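Just to show what I mean by inlining, this is the kind of snippet you can hook to any Maven phase; the phase and the message here are only an example of mine:

<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <id>hello-groovy</id>
            <phase>validate</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source>
                    // any Groovy code, readable straight from the pom.xml
                    println("Hello from Groovy, inlined in the pom.xml")
                </source>
            </configuration>
        </execution>
    </executions>
</plugin>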
I will describe my solution, starting by sharing my trials and errors and the references to the various articles and posts I have found along the way:
At first I considered using the Maven Exec Plugin to launch my container directly, something like what was suggested here:
http://stackoverflow.com/questions/3491937/i-want-to-execute-shell-commands-from-mavens-pom-xml
That plugin invocation, as part of a Maven job, actually allows me to start the container, but it has a huge drawback: the Maven lifecycle stops until the external process terminates or is manually stopped.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.1.1</version>
    <executions>
        <execution>
            <id>some-execution</id>
            <phase>compile</phase>
            <goals>
                <goal>exec</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <executable>hostname</executable>
    </configuration>
</plugin>
This is because the external process execution is "synchronous": Maven doesn't consider the command execution finished, so it never goes on with the rest of the build instructions.
This is not what I needed, so I have looked for something different.
At first I found this suggestion to start a background process, so that Maven does not block:
http://mojo.10943.n7.nabble.com/exec-maven-plugin-How-to-start-a-background-process-with-exec-exec-td36097.html
The idea here is to execute a shell script that starts a background process and then immediately returns:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.2.1</version>
    <executions>
        <execution>
            <id>start-server</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <executable>src/test/scripts/run.sh</executable>
                <arguments>
                    <argument>${server.home}/bin/server</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>

and the script is:

#! /bin/sh
$* > /dev/null 2>&1 &
exit 0
This approach actually works. My Maven build doesn't stop and the next lifecycle steps are executed.
But I have a new problem now.
My next steps are immediately executed.
I have no way to trigger the continuation only after my container is up and running.
Browsing a little more I have found this nice article:
http://avianey.blogspot.co.uk/2012/12/maven-it-case-background-process.html
The article, very well written, seems to describe exactly my scenario. It's also applied to my exact context, trying to start a flavour of Karaf.
It uses a different approach to start the process in the background: the Antrun Maven plugin. I gave it a try and, unluckily, I ended up in the exact same situation as before: the integration tests are executed immediately after the request to start the container, but before the container is ready.
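For reference, the Antrun variant boils down to spawning the process from an Ant exec task, roughly like this; the executable path is just a placeholder of mine, and spawn="true" is what detaches the process from the build:

<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.6</version>
    <executions>
        <execution>
            <id>start-container</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- spawn="true" detaches the process, so Maven moves on immediately -->
                    <exec executable="${server.home}/bin/start" spawn="true"/>
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>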
Convinced that I couldn't find any ready solution, I decided to hack the current one with the help of some imperative code.
I thought that I could insert a "wait script", after the start request but before the integration tests are fired, that checks for a condition assuring me that the container is available.
So, if the container is started during the pre-integration-test phase, and my acceptance tests are started during the very next integration-test phase, I can insert some logic in pre-integration-test that keeps polling my container and returns only after the container is "considered" available:

import static com.jayway.restassured.RestAssured.*;

println("Wait for FUSE to be available")
// declared before the loop so it is visible outside the try block
def status = ""
for (int i = 0; i < 30; i++) {
    try {
        def response = with().get("http://localhost:8383/hawtio")
        status = response.getStatusLine()
        println(status)
    } catch (Exception e) {
        // the container is not answering yet: wait and retry
        Thread.sleep(1000)
        continue
    } finally {
        print(".")
    }
    if (!(status ==~ /.*OK.*/)) {
        Thread.sleep(1000)
    } else {
        // the resource answered with 200 OK: stop waiting
        break
    }
}

My (ugly) script uses Rest-assured and an exception based logic to check, for up to 30 seconds, whether a web resource that I know my container deploys has become available. And it is executed by this GMaven instance:

<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <id>########### wait for FUSE to be available ############</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source><![CDATA[
                    import static com.jayway.restassured.RestAssured.*;
                    ...
                ]]></source>
            </configuration>
        </execution>
    </executions>
</plugin>
This check is not as robust as I'd like, since it looks for a specific resource, and that is not necessarily a confirmation that the whole deploy process has finished. A better solution would be to use some management API able to report the state of the container, but honestly I do not know if Karaf exposes one, and my simple check was enough for my limited use case.
With the GMaven invocation in place, my Maven build now behaves like I wanted.
This post showed a way to enrich your Maven build with some programmatic logic without the need of writing a full featured Maven plugin. Since you have full access to the Groovy context, you can perform any kind of task that you could find helpful. For instance, you could also start background threads that let the Maven lifecycle progress while your logic keeps executing.
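As a sketch of that last idea (the phase and the fake monitoring task are made up just for illustration), a daemon thread started from the inline script does not block the rest of the build:

<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <id>background-monitor</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source><![CDATA[
                    // start a background task; the Maven lifecycle continues immediately
                    Thread.startDaemon {
                        10.times {
                            println("still monitoring the container...")
                            Thread.sleep(1000)
                        }
                    }
                    println("background monitor started, moving on with the build")
                ]]></source>
            </configuration>
        </execution>
    </executions>
</plugin>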
My last suggestion is to try to keep the logic in your scripts simple and not to turn them into long and complex programs. Readability is the reason why I decided to use Rest-assured instead of direct access to Apache HttpClient.
This is a sample full pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <name>${groupId}.${artifactId}</name>

    <parent>
        <groupId>xxxxxxx</groupId>
        <artifactId>esb</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </parent>

    <artifactId>acceptance</artifactId>

    <properties>
        <esb.home>/data/software/RedHat/FUSE/fuse_full/jboss-fuse-6.0.0.redhat-024/bin/</esb.home>
    </properties>

    <build>
        <plugins>

            <plugin>
                <artifactId>maven-failsafe-plugin</artifactId>
                <version>2.12.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>integration-test</goal>
                            <goal>verify</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <configuration>
                    <excludes>
                        <exclude>**/*Test*.java</exclude>
                    </excludes>
                </configuration>
                <executions>
                    <execution>
                        <id>integration-test</id>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <phase>integration-test</phase>
                        <configuration>
                            <excludes>
                                <exclude>none</exclude>
                            </excludes>
                            <includes>
                                <include>**/RunCucumberTests.java</include>
                            </includes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <artifactId>maven-antrun-plugin</artifactId>
                <version>1.6</version>
                <executions>
                    <execution>
                        <id>############## start-fuse ################</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>run</goal>
                        </goals>
                        <configuration>
                            <target>
                                ...
                            </target>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <artifactId>maven-antrun-plugin</artifactId>
                <version>1.6</version>
                <executions>
                    <execution>
                        <id>############## stop-fuse ################</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>run</goal>
                        </goals>
                        <configuration>
                            <target>
                                ...
                            </target>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.codehaus.gmaven</groupId>
                <artifactId>gmaven-plugin</artifactId>
                <version>1.8</version>
                <executions>
                    <execution>
                        <id>########### wait for FUSE to be available ############</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>execute</goal>
                        </goals>
                        <configuration>
                            <source><![CDATA[
                                import static com.jayway.restassured.RestAssured.*;

                                println("Wait for FUSE to be available")
                                def status = ""
                                for (int i = 0; i < 30; i++) {
                                    try {
                                        def response = with().get("http://localhost:8383/hawtio")
                                        status = response.getStatusLine()
                                        println(status)
                                    } catch (Exception e) {
                                        Thread.sleep(1000)
                                        continue
                                    } finally {
                                        print(".")
                                    }
                                    if (!(status ==~ /.*OK.*/)) {
                                        Thread.sleep(1000)
                                    } else {
                                        break
                                    }
                                }
                            ]]></source>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>info.cukes</groupId>
            <artifactId>cucumber-java</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>info.cukes</groupId>
            <artifactId>cucumber-picocontainer</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>info.cukes</groupId>
            <artifactId>cucumber-junit</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.2.5</version>
        </dependency>
        <dependency>
            <groupId>com.jayway.restassured</groupId>
            <artifactId>rest-assured</artifactId>
            <version>1.8.1</version>
        </dependency>
    </dependencies>
</project>