Wednesday, March 06, 2013

NoSQLUnit 0.7.5 Released



NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.5 release, the following changes have been added:
We keep learning,
Alex.
Cuando la pena cae sobre mi, el mundo deja de existir, miro hacia atrás, y busco entre mis recuerdos (Entre Mis Recuerdos - Luz Casal)

Music: http://www.youtube.com/watch?v=zUnOWCURe2M

Monday, February 25, 2013

Code Quality stage using Jenkins


In Continuous Delivery every build is potentially shippable. This implies, among many other things, assigning a non-snapshot version to your components as early as possible so you can refer to them throughout the whole process.

An automated software delivery process usually consists of several stages, such as Commit, Code Quality, Acceptance Tests, Manual Test, Deployment, and so on. But let's focus on the second stage, the one related to code quality. Note that my previous post (http://www.lordofthejars.com/2013/02/conditional-buildstep-jenkins-plugin.html) introduces some concepts that are used here.

The second stage in Continuous Delivery is code quality. This step is very important because it is where we run static code analysis to detect possible defects (mostly potential NPEs), violations of code conventions, or unnecessary object creation. Tools typically used for this are Checkstyle, PMD, or FindBugs, among others. In this case we are going to see how to use Checkstyle, but of course the process is very similar with any other tool.

So the first thing to do is configure Checkstyle in our build tool (in this case Maven). Because we only want to run the static analysis in the second stage of our pipeline, we register the Checkstyle Maven plugin in a metrics profile. Keep in mind that every plugin used for code analysis should be added to that profile.
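The pom snippet embedded in the original post is missing here; a minimal sketch of such a metrics profile might look like this (the plugin version and the Checkstyle rules file location are assumptions):

```xml
<profiles>
  <profile>
    <id>metrics</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-checkstyle-plugin</artifactId>
          <version>2.10</version>
          <configuration>
            <!-- rules file at the root of the project; adjust to your conventions -->
            <configLocation>checkstyle.xml</configLocation>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```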


Now that our pom is configured with Checkstyle, we can configure Jenkins to run the Code Quality stage after the first stage (explained in my previous post).

In this case we are going to use the Parameterized Trigger plugin to execute the code quality job from the commit stage.

Because the code of the current build version was pushed to a release branch (see my previous post) during the commit stage, we need to pass the branch name as a parameter to the code quality Jenkins job, so the code can be checked out before running the static analysis.

In the build job of our first stage, we add a Post-build Action of type "Trigger parameterized build on other projects". We open the Configure menu of the first build job of the pipeline and configure it so that the next job in the pipeline (helloworld-code-quality) is executed only if the current job is stable. We also define the RELEASE_BRANCH_NAME parameter with the branch name.



Then we create a new build job that will be in charge of running the static code analysis; we name it helloworld-code-quality.

Next we configure the new build job. First of all, check the option "This build is parameterized" and add a String parameter named RELEASE_BRANCH_NAME. After that we can use the RELEASE_BRANCH_NAME parameter in this job, so in the Source Code Management section we add the repository URL, and in "Branches to build" we set origin/${RELEASE_BRANCH_NAME}.

Then in the Build section we add a Maven build step which executes the Checkstyle goal: checkstyle:checkstyle -P metrics.

And finally, to have better visibility of the results, we can install the Checkstyle Jenkins plugin and publish the report. After the plugin is installed, we add a new Post-build Action named "Publish Checkstyle analysis results". In our case the report is located at **/target/checkstyle-result.xml.



And that's all for this stage. The next stage is responsible for executing the acceptance tests, but that will be covered in another post.

In summary, we have seen how, after the code is compiled and some tests are executed (in the first stage of the pipeline), the Code Quality stage is run in Jenkins using the Checkstyle Maven plugin.

We keep learning,
Alex
En algun lugar de un gran pais, Olvidaron construir, Un hogar donde no queme el sol, Y al nacer no haya que morir… (En Algún Lugar - Duncan Dhu)
Music: http://www.youtube.com/watch?v=Myn7ghLQltI

Wednesday, February 13, 2013

The Reality of Developer's Life


In this post I am going to try to illustrate, in a funny way, the reality of a developer's life. This post is a translation of another post written in Spanish.

When you upload something to production environment:


When you find the solution to a problem without searching Google:



When you close your IDE without saving the code:


When you try to fix a bug at 3AM:


When your regular expression returns what you expected:



When my boss told me that the module I have been working on will never be used:


When I show my boss that I have fixed a bug:


When I upload code without tests and it works as expected:


When marketing folks show developers what they have sold:


The first time you apply CSS to a web page:


When the sysadmin gives you root access:


When you run your script for the first time after several hours working on it:


When you leave for the weekend and everyone else is at the office trying to fix all the issues:



When your boss finds someone to fix a critical bug:


When you receive an extra payment because the project ended before the deadline:


When something that worked on Friday does not work on Monday:


When you develop without specifications:


When the boss tells me that "tests are for those who don't know how to code":


When you update a database script and notice that you have deleted the whole database:


Have you ever lived one of these experiences? I hope so :D; if not, maybe you are not a real developer ;)

PS: Original Source In Spanish












Tuesday, February 12, 2013

Conditional BuildStep Jenkins Plugin for Improving Continuous Delivery Decisions




In Continuous Delivery every build is potentially shippable. This implies, among many other things, assigning a non-snapshot version to your components as early as possible so you can refer to them throughout the whole process.

An automated software delivery process usually consists of several stages, such as Commit, Code Quality, Acceptance Tests, Manual Test, Deployment, and so on. But let's focus on the first stage.

The first stage can contain the following steps:
  • create a release branch
  • assign a version to the project
  • compile + testing
  • packaging (create a war, jar, ...)
But during the execution of one of these steps a failure may occur (for example, the code does not compile or some tests do not pass), and in this case we should delete the created release branch and stop the pipeline execution. On the other hand, if this stage ends successfully, Jenkins should run the next stage defined in the pipeline.

To make this decision we are going to use the Conditional BuildStep plugin which, as its name suggests, allows us to choose which actions to fire depending on the result of the current job.

So after the plugin is installed, we can create a job for the first stage:

First of all, let's add a new build step (Execute Windows batch command or Execute shell) and launch a git command to create a branch:

git checkout -b helloworld-release-%VERSION_NUMBER%.%BUILD_NUMBER%

In this case we are creating a branch with the name of the project, the version number, and finally the build number (which is provided by Jenkins).

The next step is to change the version in the pom files (keep in mind that we are changing the poms of the branched project, not the "master") to the current version. So let's create a new build step of type "Invoke top-level Maven targets" to invoke the Versions Maven plugin.

versions:set -DnewVersion=%VERSION_NUMBER%.%BUILD_NUMBER%

Then we can invoke the clean install goals in a new build step.

clean install

Now is where the Conditional BuildStep plugin comes into play:

If the current build status is success, which means that the code compiled and all tests passed, then we must commit the changes and push them to the remote SCM server.

So let's create a new build step of type Conditional step (single), and set it so that, if the build status is successful, the git commands are executed.
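The screenshot showing the exact git commands is not included here; the conditional step would run something along these lines (the commit message is an assumption, and the branch name matches the one created earlier):

```
git commit -a -m "Release %VERSION_NUMBER%.%BUILD_NUMBER%"
git push origin helloworld-release-%VERSION_NUMBER%.%BUILD_NUMBER%
```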




But if the build fails, the branch should be removed, so let's create a new conditional build step in the same build job which will remove the created branch from the local repository:
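Again, the screenshot with the exact commands is missing; a sketch of what this failure branch-cleanup step might run is:

```
git checkout master
git branch -D helloworld-release-%VERSION_NUMBER%.%BUILD_NUMBER%
```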



We have learned how to use the Conditional BuildStep plugin to make decisions depending on the status of the current build.

We keep learning,
Alex.

PS: Although I am using this approach successfully in my projects, it is based on a presentation by John Smart.
The bugle sounds, the charge begins, But on this battlefield, no one wins, The smell of acrid smoke and horses' breath, As I plunge on into certain death, oh, oh (The Trooper - Iron Maiden)
Music: http://www.youtube.com/watch?v=dTaD9cd8hvw


Monday, February 11, 2013

NoSQLUnit 0.7.4 Released


NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.4 release, the following changes have been added:
We keep learning,
Alex.

I laughed at love cause I thought it was funny, but you came along and moooooved me honey, I've changed my mind, This love is fine (Great Balls Of Fire - Jerry Lee Lewis)

Music: http://www.youtube.com/watch?v=7IjgZGhHrYY

Wednesday, February 06, 2013

NoSQLUnit Forge Plugin is Forged




NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier.

Forge is a core framework and next-generation shell for tooling and automation at a command line level.

With NoSQLUnit Forge Plugin we can use Forge to create tests for NoSQL databases using NoSQLUnit.

With this plugin we can create three kinds of tests, depending on the lifecycle that is required:
  • Embedded: typically used in unit testing; starts an embedded instance of the required database (not supported by all engines).
  • Managed: usually used during integration or high-level tests; starts a server instance on the same computer where the tests are run.
  • Remote: uses already-running database instances, usually on remote computers.
The current version of the plugin supports the following databases:
  • MongoDB
  • Neo4j
  • Redis
  • Cassandra
  • HBase
  • Infinispan
  • CouchDB
When we execute the main command of this plugin, a JUnit test configured with NoSQLUnit features and a dataset file are created.

Moreover, the created test will contain one method for each public method of the class under test.

The main command is nosqlunit, followed by the lifecycle, which can be embedded, managed, or remote, and finally, depending on the chosen lifecycle, some arguments.

The common arguments are:
  • engine: the database engine we want to use.
  • databaseName: the name of the database under test.
  • classname: the name of the test class created by the plugin.
  • classUnderTest: the fully qualified name of the class we want to test.

Embedded

There are no special arguments.

Managed
  • path: the home directory where the NoSQL database is installed.
Remote
  • host: the server address.
  • port: the server port.
So, for example, a valid command would be:

nosqlunit managed --engine MONGODB --path /opt/mongo --databaseName test --classname MyTest --classUnderTest com.example.MyClass.java

And it creates the MyTest test class under /src/test/java/com/example and a dataset file in /src/test/resources/com/example.

As with almost all Forge plugins, you can install the NoSQLUnit Forge Plugin by calling forge find-plugin nosqlunit and forge install-plugin nosqlunit.

And finally, remember that you can use TAB completion to make your life easier.

We keep learning,
Alex.

Poder jugar al cel, això és un amor, I sentir el porc grunyir fort. Això ja no, quin horror!  (Dr Slump)
Music: http://www.youtube.com/watch?v=ts1J4lGexcg


Thursday, January 24, 2013

Jenkins Description Setter Plugin for Improving Continuous Delivery Visibility


In Continuous Delivery every build is potentially shippable. This implies, among many other things, assigning a non-snapshot version to your components as early as possible so you can refer to them throughout the whole process. I suggest creating a release branch, assigning the version to the project, and then running the typical pipeline steps (compile, tests, code quality, ...) on the release branch.

If you are using Jenkins, your build job screen will look something like:


Note that we have released the project many times, but there is no quick way to know exactly which version was built in build number 40. To avoid this problem, and to have a quick overview of which version was produced by each build job instance, we can use the Jenkins Description Setter plugin. This plugin sets the description for each build based on a regular expression applied to the build log file.

So your build job screen will look something like:


Much better: now we know exactly the result of a build job and which product version was generated.

So the first step is installing the plugin by simply going to:

Jenkins -> Manage Jenkins -> Manage Plugins -> Available

After installation, open the Build Job configuration screen and add a post-build action called "Set build description". Then add a regular expression for extracting the version number. In this case the regular expression is:

\[INFO\]         from version 0\.0\.1-SNAPSHOT to (.*)

Take a look at the following fragment of the build log file:

[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building hello 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- versions-maven-plugin:2.0:set (default-cli) @ hello ---
[INFO] Searching for local aggregator root...
[INFO] Local aggregation root: /jobs/helloworld-inital-build/workspace
[INFO] Processing com.lordofthejars.helloworld:hello
[INFO]     Updating project com.lordofthejars.helloworld:hello
[INFO]         from version 0.0.1-SNAPSHOT to 1.0.43
Props: {project.version=1.0.43, project.artifactId=hello, project.groupId=com.lordofthejars.helloworld}


At line 12 we log the final version of our product for the current pipeline execution, so we create a regular expression which parses that line; the part captured between parentheses is used as the build description.
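To make the regular expression concrete, here is a small self-contained Java sketch (not from the original post; I have relaxed the literal run of spaces to \s+) that applies it to the log line above:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VersionExtractor {

    // Same expression configured in the Description Setter plugin,
    // with the whitespace between [INFO] and the message relaxed to \s+
    private static final Pattern VERSION_LINE =
            Pattern.compile("\\[INFO\\]\\s+from version 0\\.0\\.1-SNAPSHOT to (.*)");

    public static String extract(String logLine) {
        Matcher matcher = VERSION_LINE.matcher(logLine);
        return matcher.find() ? matcher.group(1) : null;
    }

    public static void main(String[] args) {
        // prints the captured version for the log line shown above
        System.out.println(extract("[INFO]         from version 0.0.1-SNAPSHOT to 1.0.43"));
    }
}
```

The capture group `(.*)` is exactly what the plugin uses as the build description.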

Depending on your log traces, the regular expression will differ from this one. In this case, we always use the same SNAPSHOT version during development, and only when the product is going to be released (which could be 3 times per day or every night) is the final version generated and set.

I hope this plugin helps you make your builds clearer.

We keep learning,
Alex.
Yellow diamonds in the light, And we're standing side by side, As your shadow crosses mine (We Found Love - Lindsey Stirling)
Music: http://www.youtube.com/watch?v=0g9poWKKpbU


Monday, January 07, 2013

NoSQLUnit 0.7.3 Released



NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.3 release, the following changes have been added:

  • Support for Infinispan.
  • Added the possibility to define custom insertion and comparison methods for each engine. Thanks to Bob Tiernay for the idea. https://github.com/lordofthejars/nosql-unit/issues/45
  • Added the possibility to prevent NoSQLUnit from injecting fields annotated with @Inject, by using the @ByContainer annotation. Very useful for Spring Framework tests, Arquillian tests, or Needle tests.
  • Replaced JMockMongo with the Fongo project as the embedded Mongo implementation. Users should not notice any difference from the point of view of NoSQLUnit. Thanks to Bob Tiernay for providing this valuable information about Fongo.
  • Updated mongo-java-driver to 2.10.1.
  • Updated neo4j to 1.8.
  • Fixed bug #46; thanks to MrKeyholder for discovering it and attaching the solution code.
We keep learning,
Alex.
Fiery mountain beneath the moon, The words unspoken, we'll be there soon, For home a song that echoes on, And all who find us will know the tune. (The Lonely Mountain - Neil Finn)

Wednesday, January 02, 2013

Testing Spring Data MongoDB Applications with NoSQLUnit

Spring Data MongoDB


Spring Data MongoDB is the module of the Spring Data project which provides an extension to the Spring programming model for writing applications that use MongoDB as database.

To write tests with NoSQLUnit for Spring Data MongoDB applications, you need nothing special apart from considering that Spring Data MongoDB uses a special property called _class for storing type information alongside the document.

The _class property stores the fully qualified class name inside the document, for the top-level document as well as for every value that is a complex type.

Note Type mapping
MappingMongoConverter is used as the default type-mapping implementation, but you can customize it further using @TypeAlias or by implementing the TypeInformationMapper interface.

Application


Starfleet has asked us to develop an application for storing all logs of starship crew members in their systems.

To implement this requirement we are going to use a MongoDB database as the backend and Spring Data MongoDB in the persistence layer.

Log documents have the following JSON format:

Example of Log Document

{
        "_class" : "com.lordofthejars.nosqlunit.springdata.mongodb.log.Log" ,
        "_id" : 1 ,
        "owner" : "Captain" ,
        "stardate" : {
                "century" : 4 ,
                "season" : 3 ,
                "sequence" : 125 ,
                "day" : 8
        } ,
        "messages" : [
                        "We have entered a spectacular binary star system in the Kavis Alpha sector on a most critical mission of astrophysical research. Our eminent guest, Dr. Paul Stubbs, will attempt to study the decay of neutronium expelled at relativistic speeds from a massive stellar explosion which will occur here in a matter of hours." ,
                        "Our computer core has clearly been tampered with and yet there is no sign of a breach of security on board. We have engines back and will attempt to complete our mission. But without a reliable computer, Dr. Stubbs' experiment is in serious jeopardy."
        ]
}

This document is modeled as two Java classes, one for the whole document and another for the stardate part.

Stardate class

@Document
public class Stardate {

        private int century;
        private int season;
        private int sequence;
        private int day;

        public static final Stardate createStardate(int century, int season, int sequence, int day) {

                Stardate stardate = new Stardate();

                stardate.setCentury(century);
                stardate.setSeason(season);
                stardate.setSequence(sequence);
                stardate.setDay(day);

                return stardate;

        }

        //Getters and Setters
}
Log class

@Document
public class Log {

        @Id
        private int logId;

        private String owner;
        private Stardate stardate;

        private List<String> messages = new ArrayList<String>();

        //Getters and Setters
}

Apart from the model classes, we also need a DAO class implementing CRUD operations, and a Spring application context file.

MongoLogManager class

@Repository
public class MongoLogManager implements LogManager {

        private MongoTemplate mongoTemplate;

        public void create(Log log) {
                this.mongoTemplate.insert(log);
        }

        public List<Log> findAll() {
                return this.mongoTemplate.findAll(Log.class);
        }

        @Autowired
        public void setMongoTemplate(MongoTemplate mongoTemplate) {
                this.mongoTemplate = mongoTemplate;
        }

}
application-context file

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-3.1.xsd">

     <context:component-scan base-package="com.lordofthejars.nosqlunit.springdata.mongodb"/>
     <context:annotation-config/>

</beans>
Tip
For this example we have used the MongoTemplate class for accessing MongoDB, to avoid overcomplicating the example, but in a bigger project I recommend using the Spring Data repository approach, implementing the CrudRepository interface in manager classes.

Testing


As mentioned previously, you don't have to do anything special beyond using the _class property correctly. Let's see the dataset used to test the findAll method by seeding the log collection of the logs database.

all-logs file

{
        "log":[
                {
                        "_class" : "com.lordofthejars.nosqlunit.springdata.mongodb.log.Log" ,
                        "_id" : 1 ,
                        "owner" : "Captain" ,
                        "stardate" : {
                                "century" : 4 ,
                                "season" : 3 ,
                                "sequence" : 125 ,
                                "day" : 8
                        } ,
                        "messages" : [
                                "We have entered a spectacular binary star system in the Kavis Alpha sector on a most critical mission of astrophysical research. Our eminent guest, Dr. Paul Stubbs, will attempt to study the decay of neutronium expelled at relativistic speeds from a massive stellar explosion which will occur here in a matter of hours." ,
                                "Our computer core has clearly been tampered with and yet there is no sign of a breach of security on board. We have engines back and will attempt to complete our mission. But without a reliable computer, Dr. Stubbs' experiment is in serious jeopardy."
                        ]
                }
                ,
                {
                        "_class" : "com.lordofthejars.nosqlunit.springdata.mongodb.log.Log" ,
                        "_id" : 2 ,
                        "owner" : "Captain" ,
                        "stardate" : {
                                "century" : 4 ,
                                "season" : 3 ,
                                "sequence" : 152 ,
                                "day" : 4
                        } ,
                        "messages" : [
                                "We are cautiously entering the Delta Rana star system three days after receiving a distress call from the Federation colony on its fourth planet. The garbled transmission reported the colony under attack from an unidentified spacecraft. Our mission is one of rescue and, if necessary, confrontation with a hostile force."
                        ]
                }
                ...
}

See that the _class property is set to the fully qualified name of the Log class.
The next step is configuring MongoTemplate for test execution.

LocalhostMongoAppConfig

@Configuration
@Profile("test")
public class LocalhostMongoAppConfig {

        private static final String DATABASE_NAME = "logs";

        public @Bean Mongo mongo() throws UnknownHostException, MongoException {
                Mongo mongo = new Mongo("localhost");
                return mongo;
        }

        public @Bean MongoTemplate mongoTemplate() throws UnknownHostException, MongoException {
                MongoTemplate mongoTemplate = new MongoTemplate(mongo(), DATABASE_NAME);
                return mongoTemplate;
        }

}

Notice that this MongoTemplate object will be instantiated only when the test profile is active.
And now we can write the JUnit test case:

WhenAlmiralWantsToReadLogs

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:com/lordofthejars/nosqlunit/springdata/mongodb/log/application-context-test.xml")
@ActiveProfiles("test")
@UsingDataSet(locations = "all-logs.json", loadStrategy = LoadStrategyEnum.CLEAN_INSERT)
public class WhenAlmiralWantsToReadLogs {

        @ClassRule
        public static ManagedMongoDb managedMongoDb = newManagedMongoDbRule()
                        .mongodPath(
                                        "/Users/alexsotobueno/Applications/mongodb-osx-x86_64-2.0.5")
                        .build();

        @Rule
        public MongoDbRule mongoDbRule = newMongoDbRule().defaultManagedMongoDb("logs");

        @Autowired
        private LogManager logManager;

        @Test
        public void all_entries_should_be_loaded() {

                List<Log> allLogs = logManager.findAll();
                assertThat(allLogs, hasSize(3));

        }

}

There are some important points to note in the previous class:
  1. Because NoSQLUnit uses JUnit Rules, you can use @RunWith(SpringJUnit4ClassRunner.class) freely.
  2. Using @ActiveProfiles we load the test configuration instead of the production one.
  3. You can use Spring annotations like @Autowired without any problem.

Conclusions


There is not much difference between writing tests for applications that use Spring Data MongoDB and those that do not. Just keep in mind to define the _class property correctly.

We keep learning,
Alex.
Els astronautes volen baix, Els núvols passen com qui no diu res. Amb les butxaques a les mans, Caminarem els passos d’altres peus. (Pa amb Oli i Sal - Blaumut)
Music: http://www.youtube.com/watch?v=Hkc5piclElg

Thursday, December 20, 2012

NoSQLUnit 0.7.1 Released


NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.1 release:
  • A new NoSQL system is supported: CouchDB.
  • The JUnit version has been upgraded to 4.11. Now using @Inject does not require passing the this reference.
  • Business objects no longer contain any dependency on JUnit classes. This is the first step towards integration with the Arquillian Framework.
  • Now we can test sharding and master/slave replication of Redis servers.
  • Bug fixing.
The next version of NoSQLUnit will provide support for Infinispan, class capabilities for testing sharding and master/slave replication in MongoDB, and bug fixes.

We keep learning,
Alex.

Como una sonrisa, eres tú, eres tú.
Así, así, eres tú. (Eres Tú - Mocedades)



Thursday, December 13, 2012

Metrics, A New Way to Monitor Your Application


When you run long-term applications like web applications, it is good to know some statistics about them, like the number of requests served, request durations, or the number of active requests. But also some more generic information, like the state of your internal collections, how many times some portion of code is executed, or health checks like database availability or any kind of connection to an external system.

All this kind of instrumentation can be achieved by using native JMX or by using a modular project like Metrics. Metrics provides a powerful way to measure the behaviour of your critical components and report it to a variety of systems like JConsole, the system console, Ganglia, Graphite, CSV, or by making it available through a web server.

To install Metrics, we only have to add the metrics dependency. In this example we are going to use Maven.

<dependencies>
    <dependency>
        <groupId>com.yammer.metrics</groupId>
        <artifactId>metrics-core</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>

Now it is time to add some metrics to our code. Metrics offers 6 types of metrics:
  • Gauges: an instantaneous measurement of a discrete value.
  • Counters: a value that can be incremented and decremented. Can be used in queues to monitor the remaining number of pending jobs.
  • Meters: measure the rate of events over time. You can specify the rate unit, the scope of events, or the event type.
  • Histograms: measure the statistical distribution of values in a stream of data.
  • Timers: measure the amount of time it takes to execute a piece of code and the distribution of its duration.
  • Health checks: as the name suggests, they centralize our service's health checks of external systems.
So let's write a really simple application (in fact a console application) which sends queries to the Google Search system. We will measure the number of requests, the number of characters sent to Google, the last word searched, and a timer measuring the time between sending a request and receiving a response.

The main class, where the measures are applied, is called MetricsApplication and is responsible for connecting to Google and sending the entered word.

The first thing we can see is the counter instance. This counter counts the number of characters sent to Google over the whole life of the application (as long as you don't stop it).

The next property is a meter that measures the rate of sending queries over time.

Then we have a timer that measures the rate of sendQueryToGoogle method calls and the distribution of their durations.

And finally a LinkedList for storing all queries sent. This instance is used to return the last query executed, through a gauge that returns the last inserted element.

Notice that for each measure we set a class which will be used as a folder in JConsole. Moreover, a label is provided to be used as a name inside the folder.
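The MetricsApplication source embedded in the original post is missing here; a sketch of how those measures could be declared with the Metrics 2.x API follows (field names, metric names, and the empty sendQueryToGoogle body are assumptions):

```java
import java.util.LinkedList;
import java.util.concurrent.TimeUnit;

import com.yammer.metrics.Metrics;
import com.yammer.metrics.core.Counter;
import com.yammer.metrics.core.Gauge;
import com.yammer.metrics.core.Meter;
import com.yammer.metrics.core.Timer;
import com.yammer.metrics.core.TimerContext;

public class MetricsApplication {

    // counts characters sent to Google over the application's lifetime
    private final Counter sentCharacters =
            Metrics.newCounter(MetricsApplication.class, "sent-characters");

    // rate of queries sent over time
    private final Meter searches =
            Metrics.newMeter(MetricsApplication.class, "searches", "requests", TimeUnit.SECONDS);

    // call rate and duration distribution of sendQueryToGoogle
    private final Timer responseTime =
            Metrics.newTimer(MetricsApplication.class, "response-time",
                    TimeUnit.MILLISECONDS, TimeUnit.SECONDS);

    private final LinkedList<String> queries = new LinkedList<String>();

    {
        // gauge returning the last executed query
        Metrics.newGauge(MetricsApplication.class, "last-query", new Gauge<String>() {
            @Override
            public String value() {
                return queries.isEmpty() ? "" : queries.getLast();
            }
        });
    }

    public void search(String word) throws Exception {
        sentCharacters.inc(word.length());
        searches.mark();
        queries.addLast(word);

        TimerContext context = responseTime.time();
        try {
            sendQueryToGoogle(word);
        } finally {
            context.stop();
        }
    }

    private void sendQueryToGoogle(String word) throws Exception {
        // open an HTTP connection to Google Search with the given word...
    }
}
```

Note how the class passed as the first argument of each factory method becomes the folder shown in JConsole, and the string label the name inside it.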

Let's see a screenshot of JConsole with the previous configuration after executing three searches:


By default all metrics are visible via JMX. But of course we can report measurements to the console, an HTTP server, Ganglia, or Graphite.

Also note that in this example we are mixing business code and metrics code. If you are planning to use Metrics in your production code, I suggest you move the metrics logic into AOP whenever possible.

We have learned an easy way to monitor our applications without using JMX directly. Also keep in mind that Metrics comes with some built-in metrics for instrumenting HttpClient, JDBI, Jetty, Jersey, Log4j, Logback, or web applications.

We Keep Learning,
Alex.

But I know that one and one is two, And if this one could be with you, What a wonderful world this would be. (Wonderful World - Sam Cooke)

Thursday, December 06, 2012

Ada Is Here


Yesterday my daughter was born in Barcelona. We named her Ada in honor of Augusta Ada King, who is considered the first person to write a computer program. Both are Sagittarius: Ada was born on December 10 and my daughter on December 5, almost the same day.

I hope to see her in an Octocat onesie soon, pushing code :D.

We Keep Learning,
Alex.

Im pickin up good vibrations, Shes giving me excitations, Im pickin up good vibrations (Good Vibrations - The Beach Boys)
Music: http://www.youtube.com/watch?v=B0yoiBYbT2I

Tuesday, December 04, 2012

Writing Acceptance Tests for OpenShift + MongoDB Applications



Acceptance tests are used to determine whether the requirements of a specification are met. They should be run in an environment as similar as possible to the production one, so if your application is deployed on OpenShift you will need an account parallel to the one used in production for running the tests. In this post we are going to write an acceptance test for an application deployed on OpenShift which uses MongoDB as its database backend.

The deployed application is a very, very simple library which returns all the books available for lending. It uses MongoDB for storing all the information related to books.

So let's start by describing the goal, feature, user story, and acceptance criteria for this application.

Goal: Bring reading to as many people as possible.
Feature: Display available books.
User Story: Browse Catalog -> In order to find books I would like to borrow, As a User, I want to be able to browse through all books.
Acceptance Criteria: Should see all available books.

Scenario:
Given I want to borrow a book
When I am at catalog page
Then I should see available books information: The Lord Of The Jars - 1299 - LOTRCoverUrl , The Hobbit - 293 - HobbitCoverUrl

Notice that this is a very simple application, so the acceptance criteria are simple too.

For this example we need two test frameworks: one for writing and running acceptance tests, and another for managing the NoSQL backend. In this post we are going to use Thucydides for ATDD and NoSQLUnit for dealing with MongoDb.

The application is already deployed on Openshift; you can take a look at https://books-lordofthejars.rhcloud.com/GetAllBooks

Thucydides is a tool designed to make writing automated acceptance and regression tests easier. 

Thucydides uses the WebDriver API to access HTML page elements, but it also helps you organise your tests and user stories through a concrete programming model, generates reports of executed tests, and measures functional coverage.

To write acceptance tests with Thucydides, the next steps should be followed:
  • First of all, choose a user story from one of your features.
  • Then implement the PageObject class. PageObject is a pattern which models a web application's user interface elements as objects, so tests can interact with them programmatically. Note that in this case we are coding "how" we access the HTML page.
  • Next, implement the steps library. This class contains all the steps required to execute an action. For example, creating a new book requires opening the addnewbook page, inserting the new data, and clicking the submit button. In this case we are coding "what" we need to implement the acceptance criteria.
  • And finally, code the chosen user story following the defined acceptance criteria and using the previous step classes.
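The "how"/"what" split above can be sketched without any framework at all. The following toy stands in for the WebDriver/Thucydides machinery (the page content is faked with a string, and all names are illustrative), just to show how the page object and the steps library divide responsibilities:

```java
import java.util.List;

public class CatalogAcceptanceSketch {

    // PageObject: knows HOW to reach the page and read its elements.
    static class CatalogPage {
        private String tableContent = "";

        void open() {
            // In the real test this would drive a browser to /GetAllBooks;
            // here we fake the rendered listBooks table content.
            tableContent = "The Lord Of The Jars - 1299 - LOTRCoverUrl";
        }

        String getBooksTable() { return tableContent; }
    }

    // Steps library: knows WHAT actions make up the acceptance criteria,
    // expressed only in terms of the page object's API.
    static class CatalogSteps {
        private final CatalogPage page = new CatalogPage();

        void opensCatalogPage() { page.open(); }

        boolean seesBooks(List<String> expected) {
            return expected.stream().allMatch(page.getBooksTable()::contains);
        }
    }

    public static void main(String[] args) {
        // The user story test would read like this: open the catalog,
        // then check the acceptance criteria through the steps.
        CatalogSteps steps = new CatalogSteps();
        steps.opensCatalogPage();
        System.out.println(steps.seesBooks(List.of("The Lord Of The Jars")));
    }
}
```

In the real project the page object uses WebDriver locators and the steps are annotated so Thucydides can report them, but the layering is exactly this.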


NoSQLUnit is a JUnit extension that helps us manage the lifecycle of the required NoSQL engine, maintain the database in a known state and standardize the way we write tests for NoSQL applications. NoSQLUnit is composed of two groups of JUnit rules and two annotations. In the current case we don't need to manage the lifecycle of the NoSQL engine, because it is managed by an external entity (Openshift).
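The "known state" idea behind those rules can be sketched without any database at all: before every test, wipe whatever the previous test left and load a known dataset. A hedged, map-backed illustration (not NoSQLUnit code; the names are made up):

```java
import java.util.HashMap;
import java.util.Map;

public class KnownStateSketch {

    // A plain map standing in for a MongoDb collection.
    static final Map<String, Integer> books = new HashMap<>();

    // What a NoSQLUnit-style rule does before each test method:
    // clean the store, then seed it with the declared dataset.
    static void cleanAndSeed(Map<String, Integer> dataset) {
        books.clear();
        books.putAll(dataset);
    }

    public static void main(String[] args) {
        books.put("leftover", 1);                 // dirt left by a previous test
        cleanAndSeed(Map.of("The Hobbit", 293));  // rule restores the known state
        System.out.println(books.size());         // only the seeded data remains
        System.out.println(books.containsKey("The Hobbit"));
    }
}
```

This is why tests stay independent: no test ever depends on what an earlier one wrote.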

So let's get down to work:

The first thing we are going to do is create a feature class which contains no test code; it is used as a way of representing the structure of requirements.

Note that each implemented feature should be contained within a class annotated with the @Feature annotation. Every method of the feature class represents a user story.

The next step is creating the PageObject class. Remember that the PageObject pattern models the web application's user interface as objects. So let's inspect the HTML file to see which elements must be mapped.

The most important thing here is that the table tag has an id named listBooks, which will be used in the PageObject class to get a reference to its parameters and data. Let's write the page object:

With @DefaultUrl we set which URL is being mapped, with @FindBy we map the web element with id listBooks, and finally the getBooksTable() method returns the content of the generated HTML table.

The next thing to do is implement the steps class. In this simple case we only need two steps: one that opens the GetAllBooks page, and another that asserts that the table contains the expected elements.

And finally the class for validating the acceptance criteria:

There are some things that should be considered in previous class:
  • @Story should receive a class annotated with @Feature, so Thucydides can generate the report correctly.
  • We use MongoDbRule to establish a connection to the remote MongoDb instance. Note that we can use the localhost address thanks to Openshift's port-forwarding capability; although localhost is used, we are really managing the remote MongoDb instance.
  • Using @Steps, Thucydides will create an instance of the previous step library.
  • And finally, the @UsingDataSet annotation populates data into the MongoDb database before running the test.


Note that NoSQLUnit maintains the database in a known state by cleaning it before each test execution and populating it with known data defined in a JSON file.
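For the catalog above, such a dataset file maps each collection name to an array of documents; it might look something like this (collection and field names are illustrative, not taken from the real project):

```json
{
  "Book": [
    { "title": "The Lord Of The Jars", "numberOfPages": 1299, "coverUrl": "LOTRCoverUrl" },
    { "title": "The Hobbit", "numberOfPages": 293, "coverUrl": "HobbitCoverUrl" }
  ]
}
```

The @UsingDataSet annotation points at a file like this, and NoSQLUnit loads it before the test runs.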

Also keep in mind that this example is very simple, so only a small subset of the capabilities of Thucydides and NoSQLUnit has been shown. Keep watching both sites: http://thucydides.info and https://github.com/lordofthejars/nosql-unit

We keep learning,
Alex.
Love Is A Burning Thing, And It Makes A Fiery Ring, Bound By Wild Desire, I Fell Into A Ring Of Fire (Ring of Fire - Johnny Cash)