Wednesday, September 28, 2011

Ratti Ratti Sachi Maine Jaan Gavayi Hai, Nach Nach Koylo Pe Raat Bitayi Hai, Akhiyon Ki Neend Maine Phoonko Se Uda Di (Jai Ho - A.R. Rahman)




Last week Spring Integration 2.1 M1 was released. One of the new features is an implementation of the MessageStore interface that relies upon MongoDB for persistence.

In this post I will show you an example of Spring Integration using the Claim Check pattern with MongoDB as the storage system.

First of all, a small introduction to each component we will use.

Spring Integration provides an extension of the Spring programming model to support the well-known Enterprise Integration Patterns. It enables lightweight messaging within Spring-based applications and supports integration with external systems via declarative adapters.

MongoDB is a high-performance, schema-free, document-oriented database, which stores JSON-like documents.

The Claim Check pattern is an EIP pattern which allows you to replace message content with a claim check (a unique key), which can be used to retrieve the message content at a later time. This pattern can be used when the message content is very large and should only be sent on request, or when you cannot trust an outside party with the information, so you exchange a claim check with it instead.

As an example I have used a simplified version of the Cafe Sample Application that ships with the Spring Integration bundle, adding the MongoDB MessageStore.


In this diagram you can see a schema of the sample application. First of all, the cafe component is the gateway, the point of entry to the system, and it is connected to the orders channel. Orders are sent to a splitter, and each order is sent through the pack channel. The last step is the check-in component. This component is responsible for storing the order into MongoDB and generating a UUID. Finally the UUID is sent to the output (in this case console output).

Now I will show you the code of the first part of the application. I assume that you have already installed MongoDB on your system and that it is running.

The application context file contains the Spring Integration beans that define the structure described above. The most important parts are the definition of the MongoDbFactory and of the database name (the test database is created by MongoDB by default). The rest of the file is self-explanatory; note that the claim-check-in endpoint is defined there and references the MongoDB message store.
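Since the original listing is not reproduced here, the following is a minimal sketch of what such a context could look like. Channel names, the splitter expression and the Cafe package are illustrative, and the factory/store class names are the ones of the GA releases, so they may differ slightly in the 2.1 M1 milestone:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:int-stream="http://www.springframework.org/schema/integration/stream"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
           http://www.springframework.org/schema/integration/stream http://www.springframework.org/schema/integration/stream/spring-integration-stream.xsd">

    <!-- MongoDB connection and database name ("test" is created by default). -->
    <bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
        <constructor-arg>
            <bean class="com.mongodb.Mongo"/>
        </constructor-arg>
        <constructor-arg value="test"/>
    </bean>

    <!-- MessageStore backed by MongoDB. -->
    <bean id="messageStore" class="org.springframework.integration.mongodb.store.MongoDbMessageStore">
        <constructor-arg ref="mongoDbFactory"/>
    </bean>

    <!-- Gateway: entry point of the system, connected to the orders channel. -->
    <int:gateway id="cafe" service-interface="org.alexsotob.cafe.Cafe"
                 default-request-channel="orders"/>

    <int:channel id="orders"/>
    <int:channel id="pack"/>

    <!-- Each order is split and its parts are sent through the pack channel. -->
    <int:splitter input-channel="orders" output-channel="pack" expression="payload.items"/>

    <!-- Check-in: the payload is stored in MongoDB and replaced by a UUID. -->
    <int:claim-check-in input-channel="pack" output-channel="output"
                        message-store="messageStore"/>

    <!-- The UUID ends up on the console. -->
    <int-stream:stdout-channel-adapter id="output"/>

</beans>
```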

The next important piece of code is the Cafe interface. This interface acts as a facade for placing orders and, as will be explained later, for retrieving them. Let's see this class:
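A sketch of such a gateway interface is shown below; the package and the exact method signatures are assumptions, not necessarily those of the original sample (Order is the domain class of the simplified cafe sample):

```java
package org.alexsotob.cafe;

import org.springframework.integration.annotation.Gateway;

public interface Cafe {

    // Called to place orders into the orders channel (check-in side).
    @Gateway(requestChannel = "orders")
    void placeOrder(Order order);

    // Sends a claim-check UUID to the input channel (check-out side,
    // explained later in the post).
    @Gateway(requestChannel = "input")
    void recoverOrderAndSent(String uuid);
}
```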

A typical Spring Integration gateway interface, where the entry points into the system are defined. For now the most important method is the one used for placing orders into the orders channel.
And now you are ready to send orders using the Cafe interface and to receive at the console output not the order itself but a UUID.
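A sketch of the kind of snippet used for this first part follows; the test class name, context file location and order construction are illustrative:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/cafe-context.xml") // assumed context file name
public class CheckInTest {

    @Autowired
    private Cafe cafe;

    @Test
    public void shouldPrintClaimCheckUuids() {
        // Two orders are placed (order building omitted); for each checked-in
        // payload a UUID is printed on the console by the stdout adapter.
        cafe.placeOrder(new Order());
        cafe.placeOrder(new Order());
    }
}
```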

This is a simple snippet that can be executed as a unit test (I know it is not really a unit test, but for teaching purposes it is enough). If you execute it you will see two long identifiers in the console, something like:


Remember that number because it is required in the next step. Meanwhile, orders are stored into the database. If the mongo console is opened, we can query the test database and all stored orders will be returned.
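From the mongo console, a query along these lines lists the stored orders (the collection name assumes the message store's default):

```
use test
db.messages.find()
```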


Now that the check-in has been implemented and tested, it is time to implement the check-out part. In this example the user (you) will enter a UUID using the console. The entered reference will be sent to the input channel, the order stored under that key will be retrieved from the database, and it will be sent to the stdout channel. Let's see the code:
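A sketch of the additional configuration for the check-out flow, under the same assumptions as the previous context sketch (the gateway method that receives the UUID is mapped to the input channel):

```xml
<int:channel id="input"/>

<!-- Check-out: the UUID is replaced by the order stored under that key. -->
<int:claim-check-out input-channel="input" output-channel="stdout"
                     message-store="messageStore"/>

<int-stream:stdout-channel-adapter id="stdout"/>
```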


Not much difference, but instead of claim-check-in we are using claim-check-out. A gateway method is defined to enter the UUID (the recoverOrderAndSent method of the Cafe interface). The entered identifier is sent to the input channel and the check-out is executed; consequently, the order with that identifier is sent to the output channel (console output).

And our "test" now looks like:
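A sketch of how the previous "test" could be extended; it additionally needs java.util.Scanner, and the order construction is again simplified:

```java
@Test
public void shouldCheckOutOrderByUuid() {
    cafe.placeOrder(new Order());
    cafe.placeOrder(new Order());

    // One of the UUIDs printed at check-in is typed back in at the console...
    Scanner scanner = new Scanner(System.in);
    String uuid = scanner.nextLine();

    // ...and the corresponding order is checked out and sent to stdout.
    cafe.recoverOrderAndSent(uuid);
}
```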


Take a look: first of all, two orders are sent to the check-in component. The orders are saved to MongoDB and the returned UUIDs are printed to the console. The next step in that method is asking the user to enter one of the previous UUIDs, and the corresponding order is checked out.

And the output:


Note how simple it is to configure your application to use the claim-check pattern with Spring Integration: only XML configuration, without writing a single line of code. This is a simple example; of course you can use all the power of Spring Integration (aggregators, splitters, adapters, ...) together with claim-check.

I hope you have found this post useful.

Download code.

Music: http://www.youtube.com/watch?v=NtDzUwVWQL0

Thursday, September 22, 2011

Luna quieres ser madre Y no encuentras querer Que te haga mujer Dime luna de plata (Hijo de la Luna - Mecano)



In a previous post I talked about aspects, concretely ITD, and how they can be used to design classes with their own responsibilities while keeping the source code clear and concise. In that post I used AspectJ and spring-aspects as the aspect-oriented implementations. An important concept of aspect programming is the weaving process. An aspect weaver takes information from raw classes and aspects and creates new classes with the aspect code appropriately woven into them.

If you are using Eclipse with the AJDT plugin, the weaver is executed automatically, but if you are using a build tool like Maven, you should take care of configuring it correctly so that the generated classes contain the aspect code too.

In this post I will explain how I have modified a pom file so that the compilation process also weaves the aspect code into the classes.

The first thing I always do when I generate a pom file is adding the component versions in the properties section, in this case Spring and AspectJ:
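Something along these lines; the version numbers are merely illustrative of what was current at the time:

```xml
<properties>
    <spring.version>3.0.5.RELEASE</spring.version>
    <aspectj.version>1.6.11</aspectj.version>
</properties>
```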

The next step is adding dependencies. To work with AspectJ and spring-aspects, four dependencies must be added.
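The original listing is not reproduced here; a typical set for this combination would be the following, so treat the exact artifacts as an assumption:

```xml
<dependencies>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-aspects</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.aspectj</groupId>
        <artifactId>aspectjrt</artifactId>
        <version>${aspectj.version}</version>
    </dependency>
    <dependency>
        <groupId>org.aspectj</groupId>
        <artifactId>aspectjweaver</artifactId>
        <version>${aspectj.version}</version>
    </dependency>
</dependencies>
```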

But one more important dependency is required, and let me surprise you:
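A dependency along these lines; the coordinates are one common option, not necessarily the ones used in the original pom:

```xml
<dependency>
    <groupId>javax.persistence</groupId>
    <artifactId>persistence-api</artifactId>
    <version>1.0</version>
    <scope>provided</scope>
</dependency>
```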


You may think I am joking, but no, the JPA API is required. Here you can read why: https://jira.springsource.org/browse/SPR-6819. The AnnotationDrivenStaticEntityMockingControl class requires javax.persistence.Entity to be on the classpath. Fortunately, nowadays most projects use JPA, so this dependency may already be required by your code; if not, you should add it as a dependency.

And finally the aspectj-maven-plugin is registered:
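A sketch of that registration; the plugin version is illustrative:

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.4</version>
    <dependencies>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjtools</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
    </dependencies>
    <configuration>
        <aspectLibraries>
            <!-- spring-aspects ships the @Configurable aspect to be woven in. -->
            <aspectLibrary>
                <groupId>org.springframework</groupId>
                <artifactId>spring-aspects</artifactId>
            </aspectLibrary>
        </aspectLibraries>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>test-compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```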


This plugin requires that you define which dependencies will be used, using the <dependencies> tag. In order to apply already compiled aspects to your own sources, you need to set up all the JAR files you would like to weave in the plugin configuration using the <aspectLibraries> section; in this case the spring-aspects artifact is required so the application can use the capabilities offered by the @Configurable annotation. Finally, the executions section should contain the compile and test-compile goals so that both main and test classes are woven.

Hope you find this post useful.

Alex.

Monday, September 19, 2011

The Bell That Rings Inside Your Mind, Is Challenging The Doors Of Time, It’s A Kind Of Magic (A Kind Of Magic - Queen)



During class design we should make decisions about the assignment of responsibilities that every class will have. If we choose well, systems tend to be easier to understand, maintain and extend.

Almost all of our projects have a persistence layer, be it a relational database, a document store, or simply XML files. And typically you will use the DAO pattern to implement an abstract interface between your business objects and your data store.

In this post, however, I am going to explain another pattern that can be used instead of the DAO pattern. Active Record is an architectural pattern that makes you implement the CRUD operations on your model class; hence the model class itself is responsible for saving, deleting and loading itself from the database.

There are many strategies for implementing this pattern, but for me the best one is using Aspect-Oriented Programming, because we still maintain separation of concerns, favoring isolated unit testing, and we do not break encapsulation.

Aspect-oriented programming entails breaking down program logic into distinct parts. These parts are known as crosscutting concerns because they "cut across" multiple abstractions in a program. Examples of crosscutting concerns are logging, transaction management, error handling or splitting large datasets. For people that have worked with aspects there is not much secret here: to use them you simply create an aspect defining the advice and the pointcut, and your aspect is ready to be executed.

I guess most of us use aspect-oriented programming as described in the previous paragraph, but fewer of us use the ITD (Inter-Type Declaration) feature.

Inter-Type Declarations provide a way to express crosscutting concerns affecting the structure of modules, enabling programmers to declare members of another class.

As we say in my country, "badly said but well understood": ITD is a way to declare new members (attributes, methods, annotations) of a class from an aspect.

AspectJ is an aspect-oriented extension for Java. AspectJ supports ITD, and for this reason it will be used in this post. Moreover, I recommend installing the AJDT plugin because it will help you develop aspects and get a quick overview of which Java classes are affected by aspects.



If you have not understood what ITD is, don't worry; it is a typical case of a concept that is best understood with an example.

Let's start with a simple example:

Imagine having to model a car. You would have a Car class with some attributes; for this example, three attributes (VIN number, miles driven and model) are enough.
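A sketch of that POJO (package declaration and remaining getters/setters omitted for brevity):

```java
public class Car {

    private String vin;
    private int miles;
    private String model;

    public String getVin() {
        return vin;
    }

    public void setVin(String vin) {
        this.vin = vin;
    }

    // ... getters and setters for miles and model ...
}
```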


It is a POJO with three attributes and their getters and setters.

Now we want to add a persistence layer, but in this case we are going to persist our POJOs in an XML file instead of a database, so Car objects should be transformed into an XML stream. For this purpose JAXB annotations will be used. For those who don't know it, JAXB allows developers to map Java classes to XML representations and vice versa.

I am sure the first idea that comes to your mind is annotating the Car class with @XmlRootElement (the annotation that maps the root element in JAXB). Don't do that; use aspects. Your first mission is to keep the Car file as simple as possible. Adding an annotation using ITD is as simple as:
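A sketch of such an aspect; the packages and the aspect name are illustrative:

```java
package org.alexsotob.aspects;

import javax.xml.bind.annotation.XmlRootElement;

import org.alexsotob.Car;

public aspect CarJaxb {

    // ITD: add @XmlRootElement to the Car class from the aspect.
    declare @type : Car : @XmlRootElement;
}
```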


With @type you are indicating which kind of member is annotated, in this case the class itself; other possibilities are @method, @constructor and @field. Then comes the pattern of the elements that should be annotated, in this case the Car class, but you could use a type pattern like org.alexsotob..*. Finally, the annotation itself.

The next step is using JAXB classes for marshalling/unmarshalling objects. In this example I am using the spring-oxm package, and you will soon understand why. Spring-oxm is a module of the core Spring Framework that contains classes for dealing with O/X mapping.

This Spring module contains one marshaller class for each supported XML binding. In our case Jaxb2Marshaller is used as both marshaller and unmarshaller.

You may be thinking of creating a service class into which you inject the Jaxb2Marshaller instance. This service would include two methods (save and load) with the Car class as argument or return value. Sorry, but doing this you are implementing the DAO pattern. Let's implement the Active Record approach instead. And as you may suppose, AspectJ comes to the rescue, to avoid mixing concerns in the same source file.

Let's update the previous aspect file so that all the logic required by JAXB is in the same file.
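A sketch of the updated aspect; the method signatures, the @Autowired annotation on the attribute and the package names are assumptions based on the description in the post:

```java
package org.alexsotob.aspects;

import java.io.IOException;
import java.io.Reader;
import java.io.Writer;

import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;

import org.alexsotob.Car;

public aspect CarJaxb {

    declare @type : Car : @XmlRootElement;

    // ITD attribute: injected by Spring (see the @Configurable step below);
    // transient so it is not bound into the XML file.
    @Autowired
    transient Jaxb2Marshaller Car.marshaller;

    // ITD method: the Car marshals itself to XML.
    public void Car.save(Writer writer) throws IOException {
        marshaller.marshal(this, new StreamResult(writer));
    }

    // ITD method: a Car reads its state back from XML.
    public Car Car.load(Reader reader) throws IOException {
        return (Car) marshaller.unmarshal(new StreamSource(reader));
    }
}
```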


See that apart from annotating the Car class, we are creating two methods and an annotated attribute. Attributes must follow the same rule as methods: <class name>, dot (.) and <attribute name>. Note that in this case the attribute is transient because it should not be bound into the XML file.

The last step is configuring the marshaller in the Spring context file.
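A minimal sketch of that bean definition; the class to be bound assumes the hypothetical package used above:

```xml
<bean id="marshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
    <property name="classesToBeBound">
        <list>
            <value>org.alexsotob.Car</value>
        </list>
    </property>
</bean>
```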

Not much secret. Now let's code a unit test.
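A sketch of the kind of test described; the context file name, the values and the assertion are illustrative:

```java
import static org.junit.Assert.assertTrue;

import java.io.StringWriter;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/application-context.xml") // assumed context file name
public class CarTest {

    @Test
    public void shouldSerializeCarToXml() throws Exception {
        Car car = new Car();
        car.setVin("1M8GDM9AXKP042788");
        car.setMiles(10000);
        car.setModel("Phantom");

        StringWriter writer = new StringWriter();
        car.save(writer);

        assertTrue(writer.toString().contains("<car"));
    }
}
```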

Run the JUnit class and BOOM, all red, with an amazing NullPointerException. The marshaller is created in the Spring context but not injected into the Car class (Car is not managed by the Spring container, so it cannot be injected). And now I suppose you are telling yourself: "I told you a service layer would be better, because it would be managed by Spring and autowiring would work perfectly." But wait and see. How about using the spring-aspects module? Spring Aspects contains an annotation-driven aspect (@Configurable) allowing dependency injection of any object, whether or not it is controlled by the container. So let's apply the last two changes and the application will run.

The first change is creating a new AspectJ file to annotate the Car class as @Configurable.
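A sketch of that aspect (package and aspect name are illustrative):

```java
package org.alexsotob.aspects;

import org.springframework.beans.factory.annotation.Configurable;

import org.alexsotob.Car;

public aspect CarConfigurable {

    // ITD: mark Car as a candidate for Spring-driven dependency injection.
    declare @type : Car : @Configurable;
}
```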

And finally, modify the Spring context file to enable the @Configurable annotation.
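Which boils down to something like this (the context namespace must be declared on the beans element):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">

    <context:spring-configured/>

    <!-- the Jaxb2Marshaller bean from the previous step lives here as well -->

</beans>
```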

Adding the <context:spring-configured/> element is enough. As a result, any time you instantiate an object of a @Configurable class (via the "new" keyword), Spring will attempt to perform dependency injection on that object.

Now run the unit test again and green will invade your computer :D.

ITD is a really nice solution for designing classes with their own responsibilities. It gives you the opportunity to write maintainable and understandable code without losing encapsulation. Of course, you should take care not to create high coupling in the aspected classes and turn them into "God Classes".

Note that implementing the same approach with a relational database is as simple as changing the Jaxb2Marshaller to an EntityManager.

I hope you have found this post useful.

Download Full code

Music:  http://www.youtube.com/watch?v=KLFZzInXAWI

Monday, September 12, 2011

Questa di Marinella è la storia vera Che scivolò nel fiume a Primavera Ma il vento che la vide così bella Dal fiume la portò sopra una stella (Canzone Di Marinella - Mina)




Most of us use Maven as a build automation tool. One of the most important sections is the one related to dependencies. Typically projects have dependencies on external libraries like Spring, Hibernate, Slf4j, ..., which are downloaded from external Maven repository servers like mvnrepository or ibiblio, or from your own internal Maven repository server such as JFrog Artifactory or Nexus. This scenario, however, does not cover all possible cases.

It is uncommon for your project to require an external library that is not present in any repository, but it can occur, and these are the cases where you upload those artifacts to the company's internal Maven repository. But this is not always possible; a typical example is when you are writing a blog post about a beta library that is not present in any repository, and you want to add this library to the pom file. It would be perfect if Maven could resolve these dependencies too, without packaging them into the project. So what can we do?

There are some alternatives:

- packaging .class files of external library into your project, so they are packaged as project files.
- asking library creators to upload Milestones to public repositories.
- create your own public Maven Repository.
- using GitHub?

Using GitHub, YES!! You can configure your GitHub account as a Maven repository. In my case I always upload sample code to GitHub, so why not create a GitHub project that acts as a Maven repository and have poms reference it?

Now I will summarize the steps that shall be followed (assuming that you have already created a project):
  1. Create a GitHub repository.
  2. Initialize local directory as Git repo.
  3. Modify project pom so distribution management tag points to local repository.
  4. Perform deploy goal so artifact is deployed to local repository.
  5. Push changes to GitHub.
  6. Nothing more. Now other project poms can use your repository.

Let's get down to work:

The first thing you should know is that the project I want to upload to the Maven repository is located at /media/share/workspace/github-test, and the project that will have a dependency on it is located at /media/share/workspace/bar.

The first step is creating a new GitHub repository. So log in to your GitHub account and create a new repository; in my case I have named it maven-repository.


The second step is initializing a local directory as a Git repository.
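Roughly the following commands; the local path and the remote URL are assumptions based on the directories and account mentioned in the post:

```
cd /media/share/workspace
mkdir maven-repository
cd maven-repository
git init
git remote add origin git@github.com:maggandalf/maven-repository.git
```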

The third step is adding the distributionManagement tag pointing to the repository, so that when the project is deployed the artifact is written into the Git directory.
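A sketch of that section of the github-test pom; the id and the local path are assumptions:

```xml
<distributionManagement>
    <repository>
        <id>local.github.repo</id>
        <url>file:///media/share/workspace/maven-repository</url>
    </repository>
</distributionManagement>
```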

See that the url tag points to the previously created Git directory (on local disk).

The fourth step is running mvn -DperformRelease=true clean deploy. The project is compiled, tested and packaged. Now go to the local repository and see what has appeared.

The fifth step implies two operations: committing the changes and pushing them to GitHub.
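Roughly the following, assuming the origin remote configured in the second step:

```
git add .
git commit -m "Add github-test artifact"
git push origin master
```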

See that the first command adds the root directory. Then all files are committed to the local repository and pushed to the GitHub repository.


The sixth step, if you want to call it a step, is adding a repository tag to any project that requires the published component. In our case the bar project requires it.
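A sketch of the relevant part of the bar pom; the repository URL is the one discussed below, while the github-test coordinates are illustrative:

```xml
<repositories>
    <repository>
        <id>maggandalf-github-repo</id>
        <url>https://github.com/maggandalf/maven-repository/raw/master</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>org.alexsotob</groupId>
        <artifactId>github-test</artifactId>
        <version>1.0.0</version>
    </dependency>
</dependencies>
```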

This pom represents a project that wants to use the github-test module. It is important to note that the repository url is special (it is not shown when you browse GitHub projects with the navigator): it is https://github.com/maggandalf/maven-repository/raw/master. After the repository name (in this case maven-repository) you should concatenate /raw/master: raw because you want to access the files without any decoration (no HTML), and master because it is the branch name.

When the bar project is compiled, the github-test artifact is downloaded.


I think it is a good approach when you need to publish a Maven artifact temporarily, or simply when your project requires a milestone version of an artifact that is not uploaded to any repository.

I hope you like this post, and ... MOVE TO GIT.

Music: http://www.youtube.com/watch?v=TL69oTe6HHY

Tuesday, September 06, 2011

Soy El Capitán De La Nave Tengo El Control, Llamando A La Tierra Esperando Contestación, Soy Un Cowboy Del Espacio Azul Eléctrico (Llamando A La Tierra - M.Clan)



Groovy is an object-oriented programming language for the Java platform that can also be used as a scripting language. However, most of us, instead of using Groovy alone, use Grails (a web framework based on Groovy) for developing web applications.

But Groovy can also be used standalone for developing your internal tools. Let me explain how Groovy scripts have simplified our development of tools that generate data for integration tests.

In my company we create clinical instruments. These instruments are very expensive, and big enough to say that they are not portable. For these reasons each instrument has an emulator, so integration tests can be run without physically having any instrument.

Our emulator has an XML file where all the resources required for running tests are configured. In summary, each file contains a list of blood barcodes and reagent barcodes. In our case barcodes have a meaning (kind of resource, expiry date, checksum, ...).

So one could think about creating a standard configuration file to be used in the integration tests. It is a good idea only until you learn that the expiry date is expressed in months: one day, without knowing exactly why, your tests begin to fail. The answer is obvious, the resources have expired. We need a script that updates this XML file so that when a resource is about to expire, its month field is changed.

Moreover, not all resources should be updated. Some tests imply working with expired resources. For this reason some barcodes have a special format indicating that the update should not be applied.

Because the project is developed in Java, one could think about creating a small Java class that does all this work. But wait and think about whether this class would really be small: we need a parser (DocumentBuilder), a method that walks all nodes and checks whether each resource is expired or not (NodeList, Element.getAttribute(), ...), and finally code that writes the modifications back to the file. Furthermore, some barcodes contain special characters that should be detected using a regular expression (Pattern, Matcher) so they are not updated. Although this class is not difficult, it is far from a small class with a few lines.

But how about polyglot programming? Groovy can be a choice. Let's see how Groovy deals with XML and Regular Expressions.

Imagine the next XML file:
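The original file is not reproduced here; a hypothetical example based on the description above could be the following (element and attribute names are assumptions):

```xml
<emulator>
    <SampleHolder11>
        <sample barcode="0123450608001"/>
        <sample barcode="0123450608002"/>
        <!-- special format (starts with A-D): must not be updated -->
        <sample barcode="A123450608003"/>
    </SampleHolder11>
    <reagents>
        <reagent barcode="7654320611002"/>
    </reagents>
</emulator>
```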


Groovy and XML

While in Java you create a DocumentBuilderFactory that returns a DocumentBuilder and then call the parse method, in Groovy it is as simple as:
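A minimal sketch (the file name is illustrative):

```groovy
def root = new XmlParser().parse(new File('emulator-config.xml'))
```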

and the root variable points to the root element of the XML file.

And now imagine that you want to update all the blood barcodes of a given holder. In Java you would iterate over a NodeList or use an XPath expression. See the simplicity of Groovy.
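A sketch against the hypothetical XML above; shiftExpiryMonth is a hypothetical helper that rewrites the month part of a barcode:

```groovy
def samples = root.SampleHolder11.sample.findAll { it.'@barcode' }

samples.each { sample ->
    sample.attributes()['barcode'] = shiftExpiryMonth(sample.'@barcode')
}
```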


See that with Groovy, nodes are navigated as object properties, and finally we call the findAll() method, which returns a list of sample nodes belonging to the SampleHolder11 tag. Returning an attribute value is as easy as adding the @ character before the attribute name, in our case barcode.

And writing the XML back is as easy as:
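For example (here the result goes to the console, as noted below):

```groovy
new XmlNodePrinter(new PrintWriter(System.out)).print(root)
```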


In the previous example the output is written to the console.

Groovy and Regular Expressions

Remember that some barcodes have a special format. In our case, if a barcode starts with A, B, C or D, it should not be modified. A clean, reusable and maintainable solution is using regular expressions to check whether a barcode matches the special format or not. In Java the process is not trivial: you have to create a Pattern object with a regular expression, obtain a Matcher and use the find() method to see whether the pattern is found in the input or not.

But how can we determine whether a string matches a regular expression in Groovy? The ==~ operator does the work for us. Tell me what you think about this expression:
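A sketch of the check; the pattern encodes the "starts with A, B, C or D" rule described above:

```groovy
def barcode = 'A123450608003'
assert barcode ==~ /[A-D].*/
```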


Elementary. You will agree with me that the Groovy approach is simpler than the Java one. To specify a pattern you only have to use the ~ character plus a slash (/), then a regular expression, and finally a slash to delimit it. Groovy also supports =~ (create a Matcher) and ==~ (return a boolean indicating whether the String matches the pattern).

I think that creating a script that reads/parses/writes an XML file this way is fast and easy; not many classes are involved compared to Java. And the regular expression comparison is not even worth making: the Groovy way is the simplest one can think of.



After the success of the previous Groovy script, we decided to create another tool (a Groovy script) for manipulating the database.

The system registers into the database every incident that has occurred during an execution. Some incidents are easy to reproduce with the emulator, but others are not. In integration tests there is no problem, because a predefined, already-populated database is used, but the acceptance test database contains no data and is populated by the execution of the tests. And because some incidents are hard to reproduce in the emulator, a Groovy script to insert incidents was created.

Groovy and SQL

Like Java, Groovy can also access databases and, as you can imagine, in a simple way. No Connection object, no PreparedStatement, no ResultSet, ...

Let's see an example of searching for data and using it to create a new record. Imagine that we have a table called Execution and another one called Incidence, with a one-to-many relationship between them.
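A sketch of that kind of script; the JDBC connection parameters, column names, query and inserted values are all illustrative:

```groovy
import groovy.sql.Sql

def sql = Sql.newInstance('jdbc:hsqldb:hsql://localhost/emulator', 'sa', '',
                          'org.hsqldb.jdbcDriver')

// For every matching execution, insert one incidence related to it.
sql.eachRow('select dboid from Execution where instrument = ?', ['INST-01']) { execution ->
    sql.execute('insert into Incidence (execution_dboid, code) values (?, ?)',
                [execution.dboid, 'E-105'])
}

sql.close()
```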


Simple, isn't it? With one line a connection to the database is established.

Executing a SELECT query and iterating over the results is also easy. Using the eachRow method, all results are iterated; no ResultSet object is required anymore. Parameter values are passed between brackets ([]), and, as with XML, each row is accessed using a closure. In the previous example each row is mapped to the execution variable. Moreover, see how easy it is to read a value of each tuple: as with XML you access the value as an object property, no more ResultSet getters; in the example, execution.dboid is used to refer to the dboid field.

Finally, the execute method is used to update the database.



Now that I have shown you some of the nice features that Groovy offers, I will explain how we execute these tools.

GMaven

We use Jenkins with Maven as our continuous integration system. Before Jenkins starts to execute the integration tests, the Groovy scripts are executed so the emulator is configured properly. The interesting part of this step is how the pom is configured for executing Groovy scripts.

There is a plugin called gmaven-plugin. This plugin runs Groovy scripts at the phase and goal you configure.
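A sketch of the plugin registration; the version, phase and script location are illustrative:

```xml
<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <version>1.3</version>
    <executions>
        <execution>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <source>${project.basedir}/src/main/script/UpdateExpiryDates.groovy</source>
            </configuration>
        </execution>
    </executions>
</plugin>
```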


There is no problem configuring the gmaven plugin; the most important part is the source tag, where you specify which script should be executed.

As a final note, I want to say that I am a huge fan of Java and my intention is not to criticize it, but I think there are some problems that are better solved using other languages than Java. My advice is to learn as many kinds of languages as you can (Scala, Groovy, ...) so that, as a programmer, you can choose the best solution to a given problem.

I hope you find this post useful.

Music: http://www.youtube.com/watch?v=9u54Xs22_SI