Thursday, July 26, 2012

Answering with Mockito

Overwhelmed by industry, Searching for a modern day savior from another place, Inclined toward charity (The Answer - Bad Religion)

When you write unit tests, you must keep in mind that they should not depend on external components. To avoid such dependencies we use mocking frameworks, and for me the easiest one to use is Mockito.

In this post we are going to see an "advanced" Mockito technique: returning the same argument instance from a mocked method by using the Answer interface.

Suppose we are writing unit tests for a class that manages Person and Job entities. As part of its work it uses a DAO class for inserting instances of PersonJob, the relationship class (M:N) between Person and Job.

For example, the class under test will look something like this:
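The original snippet is missing, so here is a hedged reconstruction of the class under test based on the post's description. All class and method names (PersonJobService, createPersonJob, and the entity fields) are hypothetical; the key detail is that the PersonJob instance is built inside the method and handed to the DAO.

```java
// Hypothetical reconstruction of the class under test. The important point
// is that createPersonJob builds the PersonJob instance internally and
// passes it to the DAO, whose create(...) method returns the persisted entity.

class Person {
    private final String name;
    Person(String name) { this.name = name; }
    String getName() { return name; }
}

class Job {
    private final String title;
    Job(String title) { this.title = title; }
    String getTitle() { return title; }
}

// The M:N relationship entity between Person and Job.
class PersonJob {
    private final Person person;
    private final Job job;
    PersonJob(Person person, Job job) { this.person = person; this.job = job; }
    Person getPerson() { return person; }
    Job getJob() { return job; }
}

interface PersonJobDao {
    PersonJob create(PersonJob personJob);
}

public class PersonJobService {
    private final PersonJobDao personJobDao;

    public PersonJobService(PersonJobDao personJobDao) {
        this.personJobDao = personJobDao;
    }

    // A test cannot know this instance in advance: it is created right here.
    public PersonJob createPersonJob(Person person, Job job) {
        PersonJob personJob = new PersonJob(person, job);
        return personJobDao.create(personJob);
    }

    public static void main(String[] args) {
        // A pass-through DAO stands in for the real persistence layer.
        PersonJobService service = new PersonJobService(personJob -> personJob);
        PersonJob result = service.createPersonJob(new Person("Ada"), new Job("Engineer"));
        System.out.println(result.getPerson().getName() + " -> " + result.getJob().getTitle());
    }
}
```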

So in this case it seems obvious that you need to mock personJobDao.

Let's create the mock and record the interaction:
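The original snippet is gone; a sketch of the naive recording might look like this (expectedPersonJob is a hypothetical instance built in the test, and the static imports come from org.mockito.Mockito):

```java
PersonJobDao personJobDao = mock(PersonJobDao.class);

// But which instance should we return? The real one is created
// inside createPersonJob, out of the test's reach.
when(personJobDao.create(any(PersonJob.class))).thenReturn(expectedPersonJob);
```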

As you can see, you don't know what to return: the instance is created by the class under test, so in the test method you cannot know which instance the createPersonJob method will create. To solve this problem, you need to use the thenAnswer method instead of thenReturn:

Note that the Answer interface requires you to implement the answer method, which in our case simply returns the first argument (the PersonJob instance) passed to the personJobDao.create method.
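Since the original listing was lost, here is a hedged sketch of the whole stubbing; the interface and class names are reconstructed from the post's description, and the Mockito types used (Answer, InvocationOnMock) are from org.mockito.stubbing and org.mockito.invocation:

```java
PersonJobDao personJobDao = mock(PersonJobDao.class);

when(personJobDao.create(any(PersonJob.class))).thenAnswer(new Answer<PersonJob>() {
    public PersonJob answer(InvocationOnMock invocation) throws Throwable {
        // Return the first argument of create(...): the very PersonJob
        // instance that was built inside the class under test.
        return (PersonJob) invocation.getArguments()[0];
    }
});
```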

Now we can write assertions in peace, without worrying about the returned instance.
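For intuition: Mockito builds its mocks as dynamic proxies, and the "return the first argument" behaviour can be sketched with a plain JDK proxy. This is a toy illustration of the idea, not Mockito's real internals; the Dao interface and helper method are invented for the example.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// A toy DAO interface; any interface with a one-argument method would do.
interface Dao {
    Object create(Object entity);
}

public class AnswerSketch {

    // Builds a "mock" whose every method call answers with its first argument,
    // mimicking what the thenAnswer stub above does.
    @SuppressWarnings("unchecked")
    static <T> T returnFirstArgumentMock(Class<T> type) {
        InvocationHandler handler = (proxy, method, args) -> args[0];
        return (T) Proxy.newProxyInstance(
                type.getClassLoader(), new Class<?>[] { type }, handler);
    }

    public static void main(String[] args) {
        Dao dao = returnFirstArgumentMock(Dao.class);
        Object entity = new Object();
        // The proxy returns the very same instance it received.
        System.out.println(dao.create(entity) == entity); // prints "true"
    }
}
```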

Hope you have found this post useful.

We keep learning

Download Code
Music: http://www.youtube.com/watch?v=S2a3q0nIsoM

Tuesday, July 17, 2012

JaCoCo in Maven Multi-Module Projects

Can you blow my whistle baby, whistle baby, Let me know Girl I'm gonna show you how to do it And we start real slow You just put your lips together. (Whistle - Flo Rida)

Code coverage is an important metric used during development that describes the degree to which source code is tested.

In this post I am going to explain how to run code coverage using Maven and JaCoCo plugin in multi-module projects.

JaCoCo is a code coverage library for Java, which has been created by the EclEmma team. It has a plugin for Eclipse, and can be run with Ant and Maven too.

Here we will focus only on the Maven approach.

In a project with only one module, it is as easy as registering a build plugin:
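The original pom fragment is missing; a minimal registration could look like this (the version shown is illustrative, use a current JaCoCo release):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>0.7.9</version>
      <executions>
        <!-- Attaches the JaCoCo agent so tests record execution data. -->
        <execution>
          <goals>
            <goal>prepare-agent</goal>
          </goals>
        </execution>
        <!-- Generates the report before packaging. -->
        <execution>
          <id>report</id>
          <phase>prepare-package</phase>
          <goals>
            <goal>report</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```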

Now, after running mvn package, a coverage report will be generated in several formats under the target/site/jacoco directory.

But with multi-module projects a new problem arises: how do we merge the metrics of all submodules into a single file, so we can get a quick overview of the whole project? For now, the Maven JaCoCo plugin does not support this.

There are several alternatives, and I am going to cite the most common ones:

  • Sonar. The disadvantage is that you need to install Sonar (maybe you are already using it, maybe not).
  • Jenkins. The JaCoCo plugin for Jenkins is still under development. Moreover, you need to run a build job to inspect your coverage. This is good in terms of continuous integration, but it can be a problem when you are trying to "catch" some piece of code that is not covered by the already implemented tests.
  • Arquillian JaCoCo Extension. Arquillian is a container test framework with an extension that can capture coverage during test execution. A good option if you are already using Arquillian; the disadvantage is that your project may not require a container.
  • Ant. You can use Ant tasks from Maven. The JaCoCo Ant task can merge the results from multiple JaCoCo output files. This is the most generic solution, and it is the approach we are going to use.
The first thing to do is add the JaCoCo plugin to the parent pom so that all modules can generate a coverage report. Of course, if there are modules that do not require coverage, move the plugin definition from the parent pom into the specific projects.

The next step is creating a specific submodule that appends all the results of the JaCoCo plugin by using the Ant task. I suggest naming it something like project-name-coverage.

Then let's open the generated pom.xml and insert the plugins required to join all the coverage information. As already mentioned, to append the results we are going to use a JaCoCo Ant task, which can open all JaCoCo output files and append their content into a single one. So the first thing to do is download the jar that contains the JaCoCo Ant tasks. To automate the download, we use the Maven dependency plugin:
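The original configuration is missing; a sketch of the dependency-plugin execution might look like this (version illustrative; stripVersion gives the copied jar a predictable name):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-jacoco-ant-tasks</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <!-- The artifact that packages the JaCoCo Ant tasks. -->
          <artifactItem>
            <groupId>org.jacoco</groupId>
            <artifactId>org.jacoco.ant</artifactId>
            <version>0.7.9</version>
          </artifactItem>
        </artifactItems>
        <stripVersion>true</stripVersion>
        <outputDirectory>${project.build.directory}/jacoco-jars</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```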

During the process-test-resources phase the JaCoCo Ant artifact will be downloaded and copied to the target directory, so it can be referenced from the pom without worrying about the jar location.

We also need a way to run Ant tasks from Maven. This is as simple as using the maven-antrun-plugin, in whose configuration section you can specify any Ant task. See the following simple example:
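The original example is gone; a minimal sketch of the antrun plugin, here just echoing a message, could be:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>post-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <!-- Any Ant task can go inside the target tag. -->
        <target>
          <echo message="Hello from an Ant task run by Maven"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```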

Notice that inside the target tag we can specify any Ant task. Now we are ready to start configuring the JaCoCo Ant task. The JaCoCo report task requires you to set the locations of the build directory, classes directory, source directory and generated-sources directory. For this purpose we are going to define them as properties.
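A sketch of such properties, assuming a submodule named module-a (the module names and paths are hypothetical; adjust them to your own layout and repeat per module):

```xml
<properties>
  <module-a.build.directory>${project.basedir}/../module-a/target</module-a.build.directory>
  <module-a.classes.directory>${module-a.build.directory}/classes</module-a.classes.directory>
  <module-a.sources.directory>${project.basedir}/../module-a/src/main/java</module-a.sources.directory>
</properties>
```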

And now the Ant task part, which goes into the target tag of the antrun plugin.

First we need to define the report task.
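A sketch of the taskdef (the classpath assumes the Ant tasks jar was copied into target/jacoco-jars, as described above; adjust the path to wherever your dependency plugin places it):

```xml
<taskdef name="report" classname="org.jacoco.ant.ReportTask">
  <classpath path="${project.build.directory}/jacoco-jars/org.jacoco.ant.jar"/>
</taskdef>
```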

Note that the org.jacoco.ant jar file is downloaded by the dependency plugin, so you don't need to copy it manually.

Then we call the report task defined in the taskdef section.

Within the executiondata element we specify the locations where the JaCoCo execution data files are stored. By default this is the target directory, and we need to add one entry for each submodule.

The next element is structure. It defines the report structure and can be organised as a hierarchy of group elements. Each group should contain the class files and source files of all the projects that belong to that group. In our example only one group is used.

And finally we set the output formats using the html, xml and csv tags.

Complete Code:
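The complete listing did not survive; this sketch puts the pieces above together (module names, the group name and paths are hypothetical; add one executiondata fileset and one classfiles/sourcefiles pair per submodule):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>post-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <taskdef name="report" classname="org.jacoco.ant.ReportTask">
            <classpath path="${project.build.directory}/jacoco-jars/org.jacoco.ant.jar"/>
          </taskdef>
          <report>
            <executiondata>
              <!-- One fileset per submodule. -->
              <fileset dir="${module-a.build.directory}">
                <include name="jacoco.exec"/>
              </fileset>
            </executiondata>
            <structure name="my-project coverage">
              <group name="my-project">
                <classfiles>
                  <fileset dir="${module-a.classes.directory}"/>
                </classfiles>
                <sourcefiles>
                  <fileset dir="${module-a.sources.directory}"/>
                </sourcefiles>
              </group>
            </structure>
            <html destdir="${project.build.directory}/coverage-report/html"/>
            <xml destfile="${project.build.directory}/coverage-report/coverage-report.xml"/>
            <csv destfile="${project.build.directory}/coverage-report/coverage-report.csv"/>
          </report>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
```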

And now simply run mvn clean verify, and a report with the code coverage of all modules will be generated in my-project-coverage/target/coverage-report.

Hope you find this post useful.

We Keep Learning,

Download Code
Music: http://www.youtube.com/watch?v=cSnkWzZ7ZAA

Monday, July 02, 2012

NoSQLUnit 0.3.1 Released

And the last known survivor, Stalks his prey in the night, And he's watching us all with the, Eye of the tiger (Eye Of The Tiger - Survivor)

NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems with a NoSQL backend easier. Visit the official page for more information.

In the 0.3.1 release, two new features have been implemented, along with some bug fixes. These two new features are the ability to create simultaneous connections to different backends in the same test, and partial support for the JSR-330 specification.

Simultaneous engines

Sometimes an application will use more than one NoSQL engine: for example, some parts of your model may be expressed better as a graph (Neo4j, for example), while other parts fit a column-oriented store more naturally (Cassandra, for example). NoSQLUnit supports this kind of scenario by providing, in integration tests, a way to choose which datasets are stored in each backend instead of loading all datasets into one system.

To declare more than one engine, you must give a name to each database Rule using the connectionIdentifier() method on the configuration instance.
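The original snippet is missing; a sketch of such a declaration for the MongoDB module might look like this (the rule and builder names follow the NoSQLUnit documentation of that era, so double-check them against the release you use):

```java
// mongoDb() is the configuration builder; connectionIdentifier() names
// this connection so datasets can be routed to it later.
@Rule
public MongoDbRule mongoDbRule = new MongoDbRule(mongoDb()
        .databaseName("test")
        .connectionIdentifier("one")
        .build());
```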

You also need to provide an identified dataset for each engine, by using the withSelectiveLocations attribute of the @UsingDataSet annotation. You must set up the "named connection"/datasets pairs.
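The example referred to below is missing; a hedged sketch, using the test3 file and the "one" identifier from the post (annotation attribute names per the NoSQLUnit 0.3.x docs, verify against your version):

```java
@Test
@UsingDataSet(withSelectiveLocations =
        { @Selective(identifier = "one", locations = "test3") },
        loadStrategy = LoadStrategyEnum.REFRESH_INSERT)
public void data_should_be_refreshed_in_named_backend() {
    // exercise code that reads from the "one" connection
}
```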

In the example we are refreshing the database declared in the previous example with the data located in the test3 file.

It also works in the expectations annotation:
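A sketch of the expectation side (the @SelectiveMatcher element and its attribute names are my best reconstruction from the NoSQLUnit docs of that era; verify them against your version):

```java
@Test
@ShouldMatchDataSet(withSelectiveMatcher =
        { @SelectiveMatcher(identifier = "one", location = "test3") })
public void state_of_named_backend_should_match_dataset() {
    // after the test body runs, NoSQLUnit compares the "one"
    // backend against the contents of the test3 file
}
```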

When you use more than one engine at a time, you should take the following rules into consideration:
  • If the location attribute is set, it is used and the data in the withSelectiveMatcher attribute is ignored. The location data is populated across all registered systems.
  • If location is not set, the system tries to insert the data defined in the withSelectiveMatcher attribute into each backend.
  • If the withSelectiveMatcher attribute is not set either, the default strategy is used. Note that the default strategy replicates all datasets to all defined engines.
You can also use the same approach to insert data into the same engine but into different databases. If you have one MongoDB instance with two databases, you can write tests for both databases at the same time. For example:
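The example is missing; a sketch with two rules against the same MongoDB instance, one per database (names hypothetical, API as in the earlier hedged examples):

```java
@Rule
public MongoDbRule firstDatabaseRule = new MongoDbRule(mongoDb()
        .databaseName("first-database")
        .connectionIdentifier("one")
        .build());

@Rule
public MongoDbRule secondDatabaseRule = new MongoDbRule(mongoDb()
        .databaseName("second-database")
        .connectionIdentifier("two")
        .build());
```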

Support for JSR-330

NoSQLUnit supports two annotations from JSR-330, aka Dependency Injection for Java: the @Inject and @Named annotations.

During test execution you may need to access the underlying class used to load and assert data, in order to run extra operations against the backend. NoSQLUnit inspects the @Inject annotations of test fields and tries to set its own driver into the attribute. For example, in the case of MongoDB, a com.mongodb.Mongo instance will be injected.
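The example is missing; a sketch of the injection, passing the test instance to the rule so it can find the annotated field (rule and builder names as in the earlier hedged examples):

```java
// Passing 'this' lets the rule inspect this test instance for @Inject fields.
@Rule
public MongoDbRule mongoDbRule = new MongoDbRule(mongoDb()
        .databaseName("test")
        .build(), this);

@Inject
private Mongo mongo; // set by NoSQLUnit with its own driver instance
```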

Note that in the example we are passing this as the second parameter to the Rule.

But if you are using more than one engine at the same time, you need a way to distinguish each connection. To fix this, you must use the @Named annotation, providing the identifier given in the configuration instance. For example:
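A sketch of qualified injection, reusing the "one" and "two" identifiers from the earlier configuration examples (field names hypothetical):

```java
@Inject
@Named("one")
private Mongo firstConnection;

@Inject
@Named("two")
private Mongo secondConnection;
```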

The next release will support Neo4j and Cassandra. Stay in touch with the project, and of course I am open to any ideas that you think could make NoSQLUnit better.