Thursday, November 10, 2011

Fear of the dark, fear of the dark I have a phobia that someone's always there (Fear of the Dark - Iron Maiden)



Some time ago I wrote about how to implement your RESTful web API using Spring MVC. Read my previous post to know more about it.

In that post a simple REST example was developed. To test the application, the war file was copied into a web server (Tomcat, for example), and then by accessing http://localhost:8080/RestServer/characters/1 the information of character 1 was returned.

In this post I am going to explain how to transform that application into a Google App Engine application and deploy it to Google's infrastructure using Maven. Of course in this case we are going to deploy a REST Spring MVC application, but the same approach can be used to migrate a Spring MVC web application (or an application developed with any other web framework) to GAE.

First of all, you should obviously create a Google account and register a new application (remember the name because it will be used in the next step). After that you can start the migration.

Three changes are required: create appengine-web.xml defining the application name; add a server tag to settings.xml with your Google account information; and modify pom.xml to add the GAE plugin and its dependencies.

Let's start with appengine-web.xml. This file is used by GAE to configure the application and is created in the WEB-INF directory (at the same level as web.xml).

The most important field is the application tag. This tag contains the name of our application (defined when you registered the new Google application).

Other tags are version, system properties and environment variables, and miscellaneous configuration such as whether you want precompilation to enhance performance or whether your application requires sessions.
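A minimal appengine-web.xml could look like this (the application name alexsotoblog is taken from the example URL later in the post; the other values are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
    <application>alexsotoblog</application>
    <version>1</version>
    <precompilation-enabled>true</precompilation-enabled>
    <sessions-enabled>false</sessions-enabled>
</appengine-web-app>
```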

Your project does not need to be modified any further; from now on only the Maven files will be touched.

In settings.xml, the account information should be added:
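For example (the id is the server name you will reference later from the pom; the credentials are of course your own):

```xml
<servers>
    <server>
        <id>appengine.google.com</id>
        <username>your.account@gmail.com</username>
        <password>your_password</password>
    </server>
</servers>
```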

See that it is as easy as registering any other server in Maven.

And finally the most tedious part: modifying pom.xml.

The first thing is adding new properties:

In the first line we define the Appengine Java SDK location. If you have already installed it, insert its location in this tag; if not, copy the same location used in this pom and simply change the Maven repository directory, in my case /media/share/maven_repo, to yours. Typically your Maven repository location will be /home/user/.m2/repository. Maven will download the SDK for you at deploy time.
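A sketch of those properties (the version number is illustrative; adjust it and the path to your environment):

```xml
<properties>
    <gae.version>1.5.5</gae.version>
    <gae.home>/media/share/maven_repo/com/google/appengine/appengine-java-sdk/${gae.version}/appengine-java-sdk-${gae.version}</gae.home>
</properties>
```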

The next step is adding the Maven GAE repository.
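At the time of the post the maven-gae-plugin artifacts lived in the plugin's own repository; a sketch of the registration:

```xml
<repositories>
    <repository>
        <id>maven-gae-plugin-repo</id>
        <url>http://maven-gae-plugin.googlecode.com/svn/repository</url>
    </repository>
</repositories>
<pluginRepositories>
    <pluginRepository>
        <id>maven-gae-plugin-repo</id>
        <url>http://maven-gae-plugin.googlecode.com/svn/repository</url>
    </pluginRepository>
</pluginRepositories>
```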

Because our project is a dummy project, Datanucleus is not used. For more complex projects where database access is required using, for example, JDO, the next dependencies should be added:
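A sketch of those dependencies (the coordinates and versions are the ones that were common in that era; verify them against your setup):

```xml
<dependency>
    <groupId>javax.jdo</groupId>
    <artifactId>jdo2-api</artifactId>
    <version>2.3-eb</version>
</dependency>
<dependency>
    <groupId>com.google.appengine.orm</groupId>
    <artifactId>datanucleus-appengine</artifactId>
    <version>1.0.10</version>
</dependency>
```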

And in case you are using Datanucleus, maven-datanucleus-plugin should be registered too. Take care to configure it properly depending on your project.

Now the Google App Engine dependencies are added.
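The runtime API dependency, reusing the gae.version property defined earlier:

```xml
<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>${gae.version}</version>
</dependency>
```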


Then, if you want to test GAE functionalities (not used in our dummy project), the next GAE libraries are added:
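These are the testing and stub artifacts Google publishes alongside the SDK:

```xml
<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-testing</artifactId>
    <version>${gae.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-stubs</artifactId>
    <version>${gae.version}</version>
    <scope>test</scope>
</dependency>
```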

The next change is a modification of maven-war-plugin so that appengine-web.xml is included in the generated package:
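One way to do it, as a sketch (the directory assumes appengine-web.xml sits next to web.xml in the source tree):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <webResources>
            <resource>
                <directory>src/main/webapp/WEB-INF</directory>
                <targetPath>WEB-INF</targetPath>
                <includes>
                    <include>appengine-web.xml</include>
                </includes>
            </resource>
        </webResources>
    </configuration>
</plugin>
```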

And finally, add maven-gae-plugin and configure it to upload the application to appspot.
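A sketch of the plugin registration (the plugin version is illustrative):

```xml
<plugin>
    <groupId>net.kindleit</groupId>
    <artifactId>maven-gae-plugin</artifactId>
    <version>0.8.1</version>
    <configuration>
        <serverId>appengine.google.com</serverId>
    </configuration>
</plugin>
```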

See that the <serverId> tag contains the server name defined previously in the settings.xml file.

Also, if you are using maven-release-plugin you can upload the application to appspot automatically during the release:perform goal:
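A sketch of that configuration, binding the deploy goal to the release:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <configuration>
        <goals>gae:deploy</goals>
    </configuration>
</plugin>
```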

Now run the gae:deploy goal. If you have already installed the Appengine Java SDK, your application will be uploaded to your GAE site. But if it is the first time you run the plugin, you will receive an error. Do not panic: this error occurs because the Maven plugin does not find the Appengine SDK in the directory you specified in the <gae.home> tag. If you have configured gae.home to point into your local Maven repository, simply run the gae:unpack goal and the SDK will be installed correctly, so when you rerun gae:deploy your application will be uploaded to Google's infrastructure.

In the post example you can go to http://alexsotoblog.appspot.com/characters/1 and the character information is displayed in your browser in JSON format.

As I noted at the beginning of the post, the same process can be used for any web application, not only Spring REST MVC.

For teaching purposes all modifications have been made in the application pom. My advice is to create a parent pom with the GAE-related tags, so that each project that must be uploaded to Google App Engine extends from the same pom file.

I hope you have found this post useful.

This week I am at Devoxx, meet me there ;) I will be speaking on Thursday 17 at 13:00 about Speeding Up Javascript & CSS Download Times With Aggregation and Minification.

Full pom file:


Download Code.

Monday, November 07, 2011

I en una paret al fons imprès en blanc i negre hi havia un pòster d'en Godard. Potser ell em podria dir-me perquè em ballava el cap. (Jean Luc - Els Amics de les Arts)


Although you may think that Comma-Separated Values (CSV) files are simple files where each value is separated by commas, this is far from reality. The most important part of a CSV file is its delimiter, and one could think that the delimiter, as the type name suggests, is always a comma. But this assumption is not always true, and CSV is often applied to files using other delimiters. A typical example is found in countries where the comma (,) is used as the decimal separator: because you would have no way to discern between a decimal number and a separator, the semicolon (;) is used instead.

If your application is distributed across different countries and should support Comma-Separated Values files, then you should take care of this important fact.

Let's see how we can resolve this problem so we can read CSV files depending on the country.

For reading CSV files we are going to use the openCSV library. This API contains one class for reading CSV files (CSVReader) and another for writing them (CSVWriter). By default it uses the comma as column delimiter, but you can also specify which delimiter character must be used.

So the problem is how to know which delimiter we should use. To resolve it we can use the DecimalFormatSymbols class, which represents the set of symbols needed to format numbers.

DecimalFormatSymbols has a method called getDecimalSeparator() that, as its name suggests, returns the decimal symbol: comma (,) in countries like Germany, and dot (.) in the US, for example. So with a simple ternary you can know which delimiter is used in CSV files depending on the locale.
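A minimal sketch of that ternary (the class and method names are mine, not from the post):

```java
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class DelimiterResolver {

    // If the locale uses the comma as decimal separator, CSV columns are
    // usually separated by semicolons; otherwise the comma is used.
    public static char columnDelimiter(Locale locale) {
        char decimalSeparator = DecimalFormatSymbols.getInstance(locale).getDecimalSeparator();
        return decimalSeparator == ',' ? ';' : ',';
    }

    public static void main(String[] args) {
        System.out.println(columnDelimiter(Locale.GERMANY)); // ;
        System.out.println(columnDelimiter(Locale.US));      // ,
    }
}
```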


So the final class will look like:
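A sketch of such a factory, assuming the class name; the openCSV CSVReader construction is shown as a comment so the snippet stays dependency-free:

```java
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class CsvReaderFactory {

    private final char delimiter;

    public CsvReaderFactory() {
        this(Locale.getDefault());
    }

    public CsvReaderFactory(Locale locale) {
        char decimalSeparator = DecimalFormatSymbols.getInstance(locale).getDecimalSeparator();
        this.delimiter = decimalSeparator == ',' ? ';' : ',';
    }

    public char getDelimiter() {
        return delimiter;
    }

    // With openCSV on the classpath the factory method would be:
    // public CSVReader createReader(java.io.Reader reader) {
    //     return new CSVReader(reader, delimiter);
    // }
}
```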


And the previous factory can be instantiated with Spring, injecting the delimiter using the Spring Expression module.
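A sketch of the bean definition, assuming the factory exposes a constructor that takes the decimal separator character (the bean class name is hypothetical):

```xml
<bean id="csvReaderFactory" class="org.alexsotob.csv.CsvReaderFactory">
    <constructor-arg
        value="#{ T(java.text.DecimalFormatSymbols).getInstance().decimalSeparator }"/>
</bean>
```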


See that, thanks to Spring Expression, the delimiter is injected directly into the class.

I think it is not difficult to implement an application that can read localized Comma-Separated Values files, and doing so avoids problems when the application is distributed around the world.

Full Code: https://github.com/downloads/maggandalf/Blog-Examples/csv.zip

Music: http://www.youtube.com/watch?v=A3Yl3Fsj7tw&feature=related

Wednesday, November 02, 2011

En ti puedo ver la libertad, Tu me haces sentir que puedo volar, Y se que aquí es mi lugar, Y se que a ti yo quiero amar (Cuando Me Miras Asi - Cristian Castro)




From ServiceLoader javadoc: A service is a well-known set of interfaces and classes. A service provider is a specific implementation of a service. The classes in a provider typically implement the interfaces and subclass the classes defined in the service itself.

Since JDK 6, a simple service-provider loading facility has been implemented. As the word suggests it is simple: you should not expect a complex system for implementing plugins for your application, but it can help in many situations where you want implementations of a service to be discovered by another module automatically.

What I really like about JDK services is that your services do not depend on any framework class. Moreover, to register a new implementation of a service you just have to put the jar file on the classpath, nothing more.

Now I will explain the service we are going to implement, and then I will show you how to code it.

We want to implement a system that, depending on the kind of structured input (comma-separated values, tab-separated values, ...), returns a String[] with each value; so for example you can receive the input a,b,c,d or 1<tab>2<tab>3<tab>4 and the system should return an array with [a, b, c, d] or [1, 2, 3, 4].

So our system will have three Java projects.

One defines the service contract (an interface) and, for teaching purposes, a main class where an internet media type, for example text/csv, is received together with the input data. Then, using a factory class that I have created, it asks which registered service can transform the input to String[].

And two projects, each one implementing a service following the defined contract: one for comma-separated values and another for tab-separated values.

Let's see the code:

The main project (reader) is composed of an interface, a main class and a factory class.

The most important part is the Decode interface, which defines the service contract.


Two operations are defined: one that returns whether the service supports the given input, and another that transforms the data to String[].
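A sketch of the contract (the method names are assumptions):

```java
public interface Decode {

    /** Returns true if this service can decode the given internet media type. */
    boolean isSupported(String contentType);

    /** Splits the input data into its individual values. */
    String[] decode(String data);
}
```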

The DecodeFactory class is responsible for finding an implementation service that supports the required encoding. In fact, this class encapsulates the java.util.ServiceLoader calls; the ServiceLoader class is in charge of loading registered services.
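A sketch of the factory (the Decode interface is repeated in minimal form so the snippet is self-contained):

```java
import java.io.UnsupportedEncodingException;
import java.util.ServiceLoader;

interface Decode {
    boolean isSupported(String contentType);
    String[] decode(String data);
}

public class DecodeFactory {

    // Loads every Decode implementation registered on the classpath and
    // returns the first one that supports the given content type.
    public static Decode getDecoder(String contentType) throws UnsupportedEncodingException {
        ServiceLoader<Decode> services = ServiceLoader.load(Decode.class);
        for (Decode decode : services) {
            if (decode.isSupported(contentType)) {
                return decode;
            }
        }
        throw new UnsupportedEncodingException(contentType);
    }
}
```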


First we load all services that are registered on the classpath; then we simply iterate over them, asking each one whether the given encoding name is supported.

And finally the main class.


And now if you run this class with java -jar reader.jar "text/csv" "a, b, c, d", an UnsupportedEncodingException will be thrown. Now we are going to implement our first service. Note that the reader project will not be modified nor recompiled.

The first service we are going to implement is one that supports the comma-separated values encoding. Only one class and one file are important.

The CSV class is an implementation of the Decode interface and transforms comma-separated values.
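A sketch of the implementation; in the real project it declares "implements Decode":

```java
import java.util.StringTokenizer;

public class CSV {

    private static final String CSV_CONTENT_TYPE = "text/csv";

    public boolean isSupported(String contentType) {
        return CSV_CONTENT_TYPE.equalsIgnoreCase(contentType);
    }

    // Splits the input on commas and trims the surrounding whitespace.
    public String[] decode(String data) {
        StringTokenizer tokenizer = new StringTokenizer(data, ",");
        String[] values = new String[tokenizer.countTokens()];
        for (int i = 0; tokenizer.hasMoreTokens(); i++) {
            values[i] = tokenizer.nextToken().trim();
        }
        return values;
    }
}
```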


As you can see, a simple StringTokenizer does the job. Only take care that this class is locale sensitive: in countries where the comma (,) is used as decimal delimiter, the separation character is the semicolon (;).

The next important file is placed into META-INF and contains a pointer to the service implementation class.

This file should be in META-INF/services and should be named after the fully qualified name of the interface, in this case org.alexsotob.reader.Decode. And its content should be the fully qualified name of the service implementation.
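So, assuming the implementation lives in a package called org.alexsotob.csv, the file META-INF/services/org.alexsotob.reader.Decode contains a single line:

```
org.alexsotob.csv.CSV
```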


Now you can package this project and re-execute the reader project with the generated jar (csv.jar) on the classpath. The output will now be an array with the characters a, b, c and d instead of the unsupported encoding exception.

See that the reader project has not been modified, yet its behaviour has changed. Now you can develop new implementations for decoding new inputs, and you only have to take care of copying them onto the classpath.

Only take care that all services must have a default constructor with no arguments.

And for those who use Spring Framework, services are also supported through three different FactoryBeans: ServiceFactoryBean, ServiceListFactoryBean and ServiceLoaderFactoryBean.

As I noted at the start of this post, JDK services are a simple (yet powerful) solution if you need to create a simple plugin system. In my case JDK services have been enough, and I have never required a more complex structure; but if you are thinking about a complete plugin solution, you can use JPF, which offers a solution like Eclipse plugins, or even OSGi.

I hope you have found this post useful and that now you know (if you didn't already) an easy way to develop modules that are plug and play.


Tuesday, October 25, 2011

Soy el rey de la mar tiburón Que te come a besos Pero yo soy el rey del mar tiburón El que te come mi amor (El Rey Tiburón - Maná))



Today I was googling about mocking when I found this question:

"Injecting mock beans into spring context for testing. What I want to be able to do is via the HotswappableTargetSource override the bean definitions of select beans in my application context with my test versions and then run the test.
Then for each test case I'd like to specify which beans I want to be hot swappable and then each test must be able to create its own mock versions and swap those in, and be able to swap back again."

And there you can find a solution using ProxyFactoryBean and HotSwappableTargetSource. Well, for me it is a bit complicated; if I had to do the same I would use the StaticApplicationContext class because, from my point of view, the environment is more controlled and easier to understand. Of course the easiest solution is using the Spring 3.1 profiles feature, but while it is still a milestone/RC, or simply because you cannot change your Spring version, I will show you how to use StaticApplicationContext and how to inject mocked beans.

StaticApplicationContext is an implementation of the ApplicationContext interface which supports programmatic registration of beans and messages, rather than reading bean definitions from external configuration sources. StaticApplicationContext has been created mainly for testing purposes. To solve the problem at hand, some of the registered beans will be the "real" beans, but others will be mocked beans with all their interactions.

For this example Mockito has been used. Let's see some code.

Imagine you have an application that creates users in a database. I suppose we would have a UserDao class for communicating with the database and a UserService class to aggregate user operations. Moreover, the UserService class would not be alone; it would be used in several modules, in fact in all modules that require user information.

Now it is time for testing. The unit test is simple: when you want to test UserService you set a mock of UserDao. Here there is no problem with Spring because it has not started to play yet.

But when you want to test the whole system, maybe you want low-level classes like UserDao to be mocked while still running the tests with Spring capabilities (for example your BeanPostProcessors, messaging, Spring AOP, ...). To solve this case you can create a test Spring context file, create the required mocks and set them manually. But, as you can suppose, another approach is using StaticApplicationContext.
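A minimal sketch of this approach, assuming hypothetical UserDao/UserService classes (Spring and Mockito on the classpath):

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor;
import org.springframework.context.support.StaticApplicationContext;

// Hypothetical collaborators, reduced to the minimum for illustration.
interface UserDao {
    String findUserName(int id);
}

class UserService {
    @Autowired
    UserDao userDao;

    String getUserName(int id) {
        return userDao.findUserName(id);
    }
}

public class UserServiceIntegrationTest {

    public static StaticApplicationContext buildContext() {
        StaticApplicationContext context = new StaticApplicationContext();

        // Real beans and @Autowired support are registered programmatically.
        context.registerSingleton("userService", UserService.class);
        context.registerSingleton("autowiredPostProcessor",
                AutowiredAnnotationBeanPostProcessor.class);

        // The DAO is replaced by a Mockito mock with stubbed interactions.
        UserDao userDao = mock(UserDao.class);
        when(userDao.findUserName(1)).thenReturn("alex");
        context.getBeanFactory().registerSingleton("userDao", userDao);

        context.refresh();
        return context;
    }
}
```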



The key lines are the one where the UserDao mock is registered in the Spring context, and the one where the autowired annotation post processor is registered. If the post processor were not registered, classes annotated with @Autowired would not work.

In this post I have explained how to inject mock beans into a Spring context for testing. Personally I don't like mixing mock beans with real beans in integration tests; I prefer only real beans in this kind of test. But I can understand that in big systems with multiple subsystems it can be useful to isolate some modules from the test.

For example, in my department we are developing clinical instruments which, as you can imagine, contain complex modules all related to each other. When we run some integration tests we cannot connect to the device, so the communication module could be mocked. In our case we run integration tests against an emulator, but mocking some parts of our system could be another solution.

The example in this case is very simple, I know, but in more complex scenarios you can create an application context referencing the real beans and pass it to the StaticApplicationContext constructor, so that in the static application context you only register the mock beans.

This is how it would look:
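A sketch of that variant; in the real project the parent would typically be loaded from your XML configuration (for example with ClassPathXmlApplicationContext):

```java
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.StaticApplicationContext;

public class MixedContextFactory {

    // The parent holds the "real" beans; the child registers only mocks,
    // which shadow parent beans with the same name.
    public static StaticApplicationContext onTopOf(ApplicationContext realBeans) {
        StaticApplicationContext context = new StaticApplicationContext(realBeans);
        context.refresh();
        return context;
    }
}
```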



Well, now I have shown you how to register mock beans in a Spring application context using StaticApplicationContext instead of HotSwappableTargetSource.

Thank you very much for reading my blog.

Download Code.

Music: http://www.youtube.com/watch?v=Njbm_MABQJE&ob=av2n

Monday, October 17, 2011

Una terra promessa Un mondo diverso Dove crescere i nostri pensieri Noi non ci fermeremo Non ci stancheremo di cercare Il nostro camino (Terra Promessa - Eros Ramazzotti)


INTRODUCTION

The essence of the Observer pattern is to "Define a one-to-many dependency between objects so that when one object changes state, all its dependents are notified and updated automatically" (GoF). The Observer pattern is a subset of the publish/subscribe pattern, which allows a number of observer objects to see an event.

This pattern can be used in different situations, but in summary we can say that the Observer pattern can be applied when an object should be able to notify messages to other objects without these objects being tightly coupled. In my case I have used this pattern when an asynchronous event should be notified to one or more graphical components.

This pattern can be implemented using an ad hoc solution or using the java.util.Observer/Observable classes. But my projects are always developed with Spring, whether they are web or desktop applications, so in this post I will explain how I implement the Observer pattern with Spring.

HANDS ON

Event handling in the Spring ApplicationContext is provided through the ApplicationEvent class and the ApplicationListener interface. If a bean that implements the ApplicationListener interface is deployed into the context, every time an ApplicationEvent is published to the container, the ApplicationListener receives it.

Spring comes with built-in events, like ContextStartedEvent and ContextStoppedEvent, but you can also create your own custom events.

To develop your own events, three classes are required: the observer role, the observable role and the event. Observers are those who receive events and must implement the ApplicationListener interface. Observable classes are responsible for publishing events and must implement ApplicationEventPublisherAware. Finally, the event class has to extend ApplicationEvent.

CODING

What I am going to implement is the Wikipedia example of the Observer pattern (http://en.wikipedia.org/wiki/Observer_pattern#Example), but using Spring events instead of the Observer/Observable Java classes. It is a basic publish/subscribe example where a String message is sent from one module to another.

Let's create MessageEvent. This event contains a String that represents the message we want to send. It is a simple class that extends ApplicationEvent.
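A sketch of the event (the toString format matches the console output shown later):

```java
import org.springframework.context.ApplicationEvent;

public class MessageEvent extends ApplicationEvent {

    private final String message;

    public MessageEvent(Object source, String message) {
        super(source);
        this.message = message;
    }

    public String getMessage() {
        return message;
    }

    @Override
    public String toString() {
        return "MessageEvent [message=" + message + "]";
    }
}
```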


The next class is the Observable class. It must implement ApplicationEventPublisherAware; this interface defines a setter method with an ApplicationEventPublisher parameter, which is used for publishing events.

See that the current implementation also implements the Runnable interface so the user can create asynchronous messages from console input. The most important line is the one where an event is created and published.
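A sketch of the observable (class names are assumptions; a minimal MessageEvent is repeated so the snippet compiles on its own):

```java
import java.util.Scanner;

import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.ApplicationEventPublisherAware;

class MessageEvent extends ApplicationEvent {
    private final String message;
    MessageEvent(Object source, String message) { super(source); this.message = message; }
    String getMessage() { return message; }
}

public class MessagePublisher implements ApplicationEventPublisherAware, Runnable {

    private ApplicationEventPublisher publisher;

    @Override
    public void setApplicationEventPublisher(ApplicationEventPublisher publisher) {
        this.publisher = publisher;
    }

    @Override
    public void run() {
        // Each console line is published as an event; the publishing
        // thread's name is printed to compare it with the listener's.
        Scanner scanner = new Scanner(System.in);
        while (scanner.hasNextLine()) {
            String message = scanner.nextLine();
            System.out.println(Thread.currentThread().getName());
            publisher.publishEvent(new MessageEvent(this, message));
        }
    }
}
```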


The Observer class is even simpler. It implements the ApplicationListener interface; the onApplicationEvent method is called when an event is published. See that it is a generic interface, so no cast is required. This differs from the java.util.Observer class.
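A sketch of the observer (again with a minimal MessageEvent so it compiles on its own):

```java
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;

class MessageEvent extends ApplicationEvent {
    private final String message;
    MessageEvent(Object source, String message) { super(source); this.message = message; }
    @Override
    public String toString() { return "MessageEvent [message=" + message + "]"; }
}

public class MessageListener implements ApplicationListener<MessageEvent> {

    @Override
    public void onApplicationEvent(MessageEvent event) {
        // No cast needed: the generic parameter already types the event.
        System.out.println(Thread.currentThread().getName());
        System.out.println(event);
    }
}
```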


In the application context file, both the ApplicationListener and the ApplicationEventPublisherAware beans are registered.
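For example (the package names are illustrative):

```xml
<bean id="messageListener" class="org.alexsotob.events.MessageListener"/>
<bean id="messagePublisher" class="org.alexsotob.events.MessagePublisher"/>
```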

And finally a main class to test the system. A thread is created so multiple asynchronous events can be executed.


So start the program and write something to console. You will see something like:

hello
Thread-0
Thread-0
MessageEvent [message=hello]

I have entered the "hello" message and the thread name of the event publisher is printed. Then the event is sent and the handler thread name is printed too. Finally, the received event is shown. There is one thing that should call your attention: both the sender (observable) and the receiver (observer) are executed in the same thread; by default, event listeners receive events synchronously. This means that the publishEvent() method blocks until all listeners have finished processing the event. This approach has many advantages (for example reusing transaction contexts, ...), but in some cases you will prefer that each event is executed in a new thread; Spring also supports this strategy.

In Spring, the class responsible for managing events is SimpleApplicationEventMulticaster. This class multicasts all events to all registered listeners, leaving it up to the listeners to ignore events they are not interested in. The default behaviour is that all listeners are invoked in the calling thread.

Now I am going to explain how the Spring event architecture is initialized and how you can modify it. When the ApplicationContext starts up, it calls the initApplicationEventMulticaster method. This method verifies whether a bean with id applicationEventMulticaster of type ApplicationEventMulticaster exists. If so, the defined ApplicationEventMulticaster is used; if not, a new SimpleApplicationEventMulticaster with default configuration is created.

SimpleApplicationEventMulticaster has a setTaskExecutor method which can be used to specify which java.util.concurrent.Executor will execute events. So if you want each event to be executed in a different thread, a good approach is using a ThreadPoolExecutor. As explained in the last paragraph, we must now explicitly define SimpleApplicationEventMulticaster instead of relying on the default one. Let's implement it:
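One possible configuration, using Spring's ThreadPoolTaskExecutor as the pool:

```xml
<!-- The bean id must be exactly "applicationEventMulticaster". -->
<bean id="applicationEventMulticaster"
      class="org.springframework.context.event.SimpleApplicationEventMulticaster">
    <property name="taskExecutor">
        <bean class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
            <property name="corePoolSize" value="2"/>
        </bean>
    </property>
</bean>
```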


First of all, SimpleApplicationEventMulticaster must be defined as a bean with id applicationEventMulticaster. Then the task pool is set, and we rerun our main class. The output will be:

hello
Thread-1
pool-1
MessageEvent [message=hello]

Note that now the sender and receiver threads are different.

And of course you can create your own ApplicationEventMulticaster for more complex strategies. You just have to implement ApplicationEventMulticaster and define it with the applicationEventMulticaster bean name, and events will be executed depending on your own strategy.

I hope that now your Spring desktop applications can take full advantage of Spring events for decoupling modules.


Music: http://www.youtube.com/watch?v=GfnkcKiocRw

Tuesday, October 11, 2011

You're so sexy sex sex sexy. Feel me now and stop the conversation. No, no, no don't stop the desire no, No, no, no, no! (Sexy - French Affair)





INTRODUCTION

In this post I am going to explain how to implement and register a custom Spring MVC HttpMessageConverter object: specifically, a converter that binds objects to the YAML format. As a starting point I am going to use the REST application I implemented in a previous post. That application is a simple RESTful application where XML and JSON (which Spring MVC already supports) are used. Because Spring MVC does not provide a YAML message converter, I am going to explain how to transform the previous application from supporting XML and JSON to supporting YAML.

YAML is a human-readable data serialization format that takes concepts from programming languages such as C, Perl, and Python, and ideas from XML and the data format of electronic mail (RFC 2822).

SnakeYAML is a YAML parser and emitter for the Java programming language, and it will be used to implement our message converter.

DESIGN

Let's start with a UML class diagram of the HttpMessageConverter classes that are going to be implemented.



HttpMessageConverter is the base interface that must be implemented. It is a strategy interface that specifies methods to convert objects from and to HTTP requests and responses. AbstractHttpMessageConverter is the abstract base class for most HttpMessageConverter implementations (both are provided by springframework), and it is our base class.

The first developed class is an abstract class called AbstractYamlHttpMessageConverter. This class is responsible for the generic operations that "should" be required by all YAML parsers/emitters. In my case it deals with charset options and transforms HttpInputMessage and HttpOutputMessage into java.io.InputStreamReader and java.io.OutputStreamWriter. In fact it acts as a Template Method for the read and write operations (the readInternal and writeInternal methods).

The next abstract class is AbstractSnakeYamlHttpMessageConverter. This class is the base class for HttpMessageConverters that use SnakeYAML as YAML binder, and it provides an instance of the Yaml class (the central class of the SnakeYAML project).

And finally JavaBeanSnakeYamlHttpMessageConverter. This class uses SnakeYAML's JavaBeans features to convert from object to YAML and vice versa. SnakeYAML does not support annotations like Jackson (JSON) or Jaxb (XML), but if some day this feature is implemented, we should only have to create a new class extending AbstractSnakeYamlHttpMessageConverter with the required change.

CODE

The first step is adding a new dependency to the pom, in this case SnakeYAML.
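The dependency (1.9 was the current release at the time of the post):

```xml
<dependency>
    <groupId>org.yaml</groupId>
    <artifactId>snakeyaml</artifactId>
    <version>1.9</version>
</dependency>
```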


Then the three classes previously described should be developed.

The first class is the generic YAML converter where we set the accepted media type, in this case application/yaml, and create a Reader and a Writer with the required Charset. We leave to child classes the responsibility of implementing the read and write code.
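A sketch of that class (method names for the abstract hooks are mine, and charset negotiation is reduced to a fixed UTF-8 to keep the sketch short):

```java
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Reader;
import java.io.Writer;
import java.nio.charset.Charset;

import org.springframework.http.HttpInputMessage;
import org.springframework.http.HttpOutputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.AbstractHttpMessageConverter;

// Children only deal with Readers/Writers, not with raw HTTP messages.
public abstract class AbstractYamlHttpMessageConverter<T> extends AbstractHttpMessageConverter<T> {

    public static final Charset DEFAULT_CHARSET = Charset.forName("UTF-8");

    protected AbstractYamlHttpMessageConverter() {
        super(new MediaType("application", "yaml", DEFAULT_CHARSET));
    }

    @Override
    protected T readInternal(Class<? extends T> clazz, HttpInputMessage inputMessage) throws IOException {
        Reader reader = new InputStreamReader(inputMessage.getBody(), DEFAULT_CHARSET);
        return readFrom(clazz, reader);
    }

    @Override
    protected void writeInternal(T t, HttpOutputMessage outputMessage) throws IOException {
        Writer writer = new OutputStreamWriter(outputMessage.getBody(), DEFAULT_CHARSET);
        writeTo(t, writer);
        writer.flush();
    }

    protected abstract T readFrom(Class<? extends T> clazz, Reader reader) throws IOException;

    protected abstract void writeTo(T t, Writer writer) throws IOException;
}
```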


The next class is specific to the API that will be used to bind classes to messages, in this case SnakeYAML. This class is responsible for creating an instance of the Yaml class (from SnakeYAML). As warned in http://snakeyamlrepo.appspot.com/releases/1.9/site/apidocs/index.html, each thread must have its own instance; for this reason a ThreadLocal is used to enforce this restriction.
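The essential idiom, extracted into a hypothetical holder so the snippet stands alone:

```java
import org.yaml.snakeyaml.Yaml;

// Yaml instances are not thread-safe, so each thread gets its own one.
public final class YamlFactory {

    private static final ThreadLocal<Yaml> YAML = new ThreadLocal<Yaml>() {
        @Override
        protected Yaml initialValue() {
            return new Yaml();
        }
    };

    public static Yaml getYaml() {
        return YAML.get();
    }
}
```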


The final class is an implementation of the read/write methods using SnakeYAML. As I explained previously, this class exists so that we can change the SnakeYAML binding strategy (for example to an annotation-based approach) by only rewriting the read/write operations.
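The JavaBean binding those read/write methods delegate to looks like this (the bean is hypothetical; in the converter, dumpAsMap and loadAs would be fed the HTTP Writer/Reader instead of Strings):

```java
import org.yaml.snakeyaml.Yaml;

public class MangaCharacter {

    private int id;
    private String name;

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public static void main(String[] args) {
        MangaCharacter character = new MangaCharacter();
        character.setId(1);
        character.setName("Kenshin");

        Yaml yaml = new Yaml();
        String document = yaml.dumpAsMap(character);
        MangaCharacter copy = yaml.loadAs(document, MangaCharacter.class);
        System.out.println(copy.getName()); // Kenshin
    }
}
```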


Now it is time to register the created message converter in the AnnotationMethodHandlerAdapter. The first thing you should do is stop using <mvc:annotation-driven>; this tag registers the default message converters and you are not able to modify them, so comment it out or remove it. The next step is declaring the DefaultAnnotationHandlerMapping bean and an AnnotationMethodHandlerAdapter that registers the HTTP message converters. In our case only the YAML HTTP message converter is added.
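A sketch of that registration (the converter's package name is illustrative):

```xml
<bean class="org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping"/>

<bean class="org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter">
    <property name="messageConverters">
        <list>
            <bean class="org.alexsotob.converter.JavaBeanSnakeYamlHttpMessageConverter"/>
        </list>
    </property>
</bean>
```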


RUNNING

And now you can try the application. Deploy it on a server and, using for example a REST client, request http://localhost:8080/RestServer/characters/1 and you will receive a response like:



If you want, you can use POST instead of GET to insert a new Character into our Map.


As you can see, developing a Spring MVC HTTP message converter is easy: in fact you must implement two basic operations, deciding whether a resource can be read or written, and the resource conversion itself.

Hope you found this post useful.

Tuesday, October 04, 2011

Can't you see, It all makes perfect sense, Express in dollars and cents, Pounds shillings and pents, Can't you see, It all makes perfect sense (Perfect Sense Part II - Roger Waters)



REST INTRODUCTION

From Wikipedia: REST-style architectures consist of clients and servers. Clients initiate requests to servers; servers process requests and return appropriate responses. Requests and responses are built around the transfer of representations of resources. A resource can be essentially any coherent and meaningful concept that may be addressed.

As you have read, the most important thing in a REST architecture is the existence of a resource. This resource can be anything (typically the information requested by the client) that can be identified with a global identifier (a URI in the case of HTTP). In order to manipulate these resources, clients communicate using standard interfaces (like HTTP) and exchange representations of these resources (using HTML, XML, ...).

Note that REST does not force you to use any specific network protocol, nor any specific way of identifying resources.

For those who have never read about REST, this description of the REST architecture may seem strange and a bit complicated.

A RESTful web service is a simple web service implemented using HTTP and the principles of REST. The URI is the global identifier, the communication interface is HTTP, and the resource representation can be any valid internet media type like JSON, XML or YAML. The set of operations that can be executed on resources depends on the HTTP methods: GET (retrieving/listing), PUT (replacing/updating), POST (creating) and DELETE (deleting).

HANDS ON WORK

Let's create our first REST application with the help of Spring MVC. Imagine an application that has a database of manga characters, and you want to provide a REST interface so clients can retrieve characters following a RESTful strategy.

The first thing to do is identify the resource. In this case it is easy: "a character". The next step is finding a URI that determines a character unequivocally. That is easy too, since a de facto rule can be applied here. This rule suggests that a unique URI can be <host>/<applicationname>/<resourceName>s/<id>; in our case, to return (GET) the character with id 1 the URI would be "http://localhost:8080/RestServer/characters/1". If no identifier is present, all characters should be retrieved. If POST is used instead of GET, a character with id "1" would be inserted. And finally we decide which internet media type is required; in this case it doesn't matter because we are implementing both client and server, so initially XML will be used.

CODING

Let's start with a simple Spring MVC application created with the Spring MVC template. Not much secret here: you will have a servlet-context.xml where component-scan, annotation-driven and an InternalResourceViewResolver are registered.

The next step is defining the Character class, a simple POJO with four attributes. The class is converted to its XML representation using Jaxb annotations. Jaxb allows developers to map Java classes to XML representations and vice versa.
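A hypothetical shape of that POJO (the post only says the real class has four attributes, so these field names are illustrative):

```java
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class Character {

    private int id;
    private String name;
    private String manga;
    private int age;

    // Jaxb needs a no-arg constructor and getters/setters.
    public Character() {
    }

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getManga() { return manga; }
    public void setManga(String manga) { this.manga = manga; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```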

And finally the most important class in Spring MVC, "The Controller". The controller is responsible for implementing the required operations on the Character resource. In the current case only GET is implemented; the other operations would be similar. Let's see the code:
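A sketch of the controller; the Character class referenced here is the Jaxb-annotated POJO from before, repeated in minimal form so the snippet compiles on its own, and the sample data is invented:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class CharacterController {

    // In-memory storage so the example does not focus on data access.
    private static final Map<Integer, Character> characters = new HashMap<Integer, Character>();

    static {
        characters.put(1, new Character(1, "Kenshin"));
    }

    @RequestMapping(value = "/characters/{characterId}", method = RequestMethod.GET)
    @ResponseBody
    public Character findCharacter(@PathVariable int characterId) {
        return characters.get(characterId);
    }
}

// Stands in for the Jaxb-annotated Character POJO of the post.
class Character {
    final int id;
    final String name;
    Character(int id, String name) { this.id = id; this.name = name; }
}
```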


The first part is a map where all characters are stored; I have used this approach so as not to focus on data access. Then the findCharacter method is called when the URI is /characters/{characterId}. This is a URI template, a URI-like string containing one or more variable names which can be accessed using the @PathVariable annotation. So when you access /characters/1, the characterId parameter is bound to 1.

The last important part is the @ResponseBody annotation. This annotation can be put on a method and indicates that the return type should be written straight to the HTTP response body, and not placed in a Model or interpreted as a view name, as is the standard behaviour of Spring MVC. So the findCharacter method simply returns a Character object.

And that's all. If you execute this code and, for example, enter the URI http://localhost:8080/RestServer/characters/1, the output (using the RestClient UI) will be:


And now you may be wondering: if I am returning a Character object and the output is XML, where is the conversion between object and XML? Easy; let me introduce a new concept: HttpMessageConverters. An HttpMessageConverter is responsible for converting an HTTP request message to an object and for converting an object to an HTTP response body. The next HttpMessageConverters are registered by default:

- ByteArrayHttpMessageConverter
- StringHttpMessageConverter
- ResourceHttpMessageConverter
- SourceHttpMessageConverter
- XmlAwareHttpMessageConverter
- Jaxb2RootElementHttpMessageConverter
- MappingJacksonHttpMessageConverter

So now you understand why works perfectly. When you are returning Character instance, Jaxb2RootElementHttpMessageConverter using canWrite method checks if class contains XmlRootElement annotation. If class is annotated, write method is called. In this case Jaxb marshaller is called, and XML is returned. Same from XML to object but using Jaxb unmarshaller class.

So easy: no complicated configuration, no complicated mappings, no unclear code; you only need to worry about your model objects, not about conversion. But let me introduce one change. Instead of returning XML, we now want to return JSON.

The change could not be easier: add the Jackson library to pom.xml and change @XmlRootElement to @JsonAutoDetect. Now MappingJacksonHttpMessageConverter will handle this object and transform the Character instance to JSON using the Jackson library. Only one line of code changed!

And now output will be:



CONCLUSIONS

Of course this is a very simple application with only one operation, but it gives you an idea of how to develop RESTful web services using Spring MVC. It is only a matter of time to write all your required operations using the same approach I have used with GET.

Arriving at this point, I think all of us have reached the same conclusion: annotations are really powerful, and Spring MVC fits perfectly for developing RESTful web services.

See you next time.

miércoles, septiembre 28, 2011

Ratti Ratti Sachi Maine Jaan Gavayi Hai, Nach Nach Koylo Pe Raat Bitayi Hai, Akhiyon Ki Neend Maine Phoonko Se Uda Di (Jai Ho - A.R. Rahman)




Last week Spring Integration 2.1 M1 was released. One of its new features is an implementation of the MessageStore interface that relies upon MongoDB for persistence.

In this post I will show you an example of Spring Integration using the Claim Check pattern with MongoDB as the storage system.

First of all, a small introduction to each component we will use.

Spring Integration provides an extension of the Spring programming model to support the well-known Enterprise Integration Patterns. It enables lightweight messaging within Spring-based applications and supports integration with external systems via declarative adapters.

MongoDB is a high-performance, schema-free, document-oriented database, which stores JSON-like documents.

The Claim Check pattern is an EIP pattern that allows you to replace message content with a claim check (a unique key), which can be used to retrieve the message content at a later time. This pattern is useful when the message content is very large and should only be sent on request, or when you cannot trust an outside party with the full information and instead use a claim check to talk to it.
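Stripped of all messaging infrastructure, the pattern itself is tiny. A minimal in-memory sketch (illustrative names, not Spring Integration's API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Minimal Claim Check sketch: the payload is parked in a store and replaced
// by a generated key; the key is later exchanged for the original payload.
class ClaimCheckStore {

    private final Map<UUID, Object> store = new HashMap<>();

    // Check-in: persist the payload and hand back a claim check (UUID).
    UUID checkIn(Object payload) {
        UUID claimCheck = UUID.randomUUID();
        store.put(claimCheck, payload);
        return claimCheck;
    }

    // Check-out: exchange the claim check for the original payload.
    Object checkOut(UUID claimCheck) {
        return store.remove(claimCheck);
    }
}
```

In the real example below, Spring Integration plays the role of this class, with MongoDB instead of an in-memory map.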

As an example I have used a simplified version of the Cafe Sample Application that ships with the Spring Integration bundle, but adding a MongoDB MessageStore.


In this diagram you can see a schema of the sample application. First of all, the cafe component is the gateway, the entry point to the system, and is connected to the orders channel. Orders are sent to the splitter, and each order is sent through the pack channel. The last step is the check-in component, which is responsible for storing the order into MongoDB and generating a UUID. Finally the UUID is sent to the output (in this case, the console).

Now I will show you the code of the first part of the application. I assume that you have already installed MongoDB on your system and that it is running.

The application context file contains the Spring Integration beans that define the structure described above. The most important parts are the definition of MongoDbFactory at line 18, and line 22 where the database name is defined (the test database is created by MongoDB by default). The rest of the file is self-explanatory. Note that at line 37 the claim check is defined and the MongoDB message store is referenced.

The next important piece of code is the Cafe interface. This class acts as a facade for placing orders and, not explained yet, for retrieving them. Let's see this class:

A typical Spring Integration gateway class, where the entry points are defined. For now line 7 is the most important, because this method will be called to place orders into the orders channel.
And now you are ready to send orders using the Cafe interface, receiving at the console output not the order but a UUID.

This is a simple snippet that can be executed as a unit test (in fact this is not a real unit test, I know, but for teaching purposes it is enough). If you execute it you will see two long numbers in the console, something like:


Remember those numbers because they are required in the next step. Meanwhile, the orders are stored into the database. If the mongo console is opened, we can query the test database and all stored orders will be returned.


Now that check-in has been implemented and tested, it is time to implement the check-out method. In this example the user (you) will enter a UUID using the console. The entered reference is sent to the input channel, the order object with that code is retrieved from the database, and it is sent to the stdout channel. Let's see the code:


Not much difference, but instead of claim-check-in we are using claim-check-out. At line 37 we define a gateway to enter the UUID (the recoverOrderAndSent method of the Cafe interface). The introduced identifier is sent to the input channel and check-out is executed; consequently the order with that identifier is sent to the output channel (console output).

And our "test" now looks like:


Take a look: first of all, two orders are sent to the check-in component. The orders are saved to MongoDB and the returned UUIDs are printed to the console. The next step in that method asks the user to introduce one of the previous UUIDs, and that order is checked out.

And the output:


Note how simple it is to configure your application to use the claim-check pattern with Spring Integration: only XML configuration, without writing a single line of code. This is a simple example; of course you can use all the power of Spring Integration (aggregators, splitters, adapters, ...) with claim-check.

I hope you have found this post useful.

Download code.

Music: http://www.youtube.com/watch?v=NtDzUwVWQL0

jueves, septiembre 22, 2011

Luna quieres ser madre Y no encuentras querer Que te haga mujer Dime luna de plata (Hijo de la Luna - Mecano)



In a previous post I talked about aspects, concretely ITDs, and how they can be used to design classes with their own responsibilities while keeping source code clear and concise. In that post I used aspectj and spring-aspects as aspect-oriented implementations. An important concept of aspect programming is the weaving process: an aspect weaver takes the raw classes and the aspects and creates new classes with the aspect code appropriately woven in.

If you are using Eclipse with the AJDT plugin, the weaver is executed automatically, but if you are using a build tool like Maven, you should take care of configuring it correctly so the generated classes contain the aspect code too.

In this post I will explain how I have modified a pom file so the compilation process also weaves aspect code into the classes.

The first thing I always do when I generate a pom file is add component versions in the properties section. In this case, spring and aspectj:

The next step is adding dependencies. To work with aspectj and spring-aspects, four dependencies must be added.

But one more important dependency is required, and let me surprise you:


You may think I am joking, but no: the JPA API is required. Here you can read why: https://jira.springsource.org/browse/SPR-6819. The AnnotationDrivenStaticEntityMockingControl class requires javax.persistence.Entity to be on the classpath. Fortunately, nowadays most projects use JPA, so this dependency may already be pulled in by your code; if not, you should add it as a dependency.

And finally the aspectj-maven-plugin is registered:


This plugin requires you to define which dependencies will be used, via the <dependencies> tag. To apply already-compiled aspects to your own sources, you need to list all the JAR files you want to weave in the plugin configuration, using the <aspectLibraries> section; in this case the spring-aspects artifact is required so the application can use the capabilities offered by the @Configurable annotation. Finally, the execution section should contain the compile and test-compile goals so that both main and test classes are woven.
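Putting the three pieces together, the plugin registration could look along these lines (versions are illustrative, take the ones matching your aspectj property):

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.4</version>
    <dependencies>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjtools</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
    </dependencies>
    <configuration>
        <!-- already-compiled aspects to weave into our sources -->
        <aspectLibraries>
            <aspectLibrary>
                <groupId>org.springframework</groupId>
                <artifactId>spring-aspects</artifactId>
            </aspectLibrary>
        </aspectLibraries>
    </configuration>
    <executions>
        <execution>
            <goals>
                <!-- weave both main and test classes -->
                <goal>compile</goal>
                <goal>test-compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```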

Hope you find this post useful.

Alex.

lunes, septiembre 19, 2011

The Bell That Rings Inside Your Mind, Is Challenging The Doors Of Time, It’s A Kind Of Magic (A Kind Of Magic - Queen)



During class design we must make decisions about the assignment of responsibilities each class will have. If we choose well, systems tend to be easier to understand, maintain and extend.

Almost all of our projects have a persistence layer: a relational database, a document store, or simply XML files. Typically you will use the DAO pattern to implement an abstract interface between your business objects and your data store.

In this post, however, I am going to explain another pattern that can be used instead of the DAO pattern. The Active Record pattern is an architectural pattern that puts the CRUD operations on the model class itself, hence the model class is responsible for saving, deleting and loading itself from the database.

There are many strategies for implementing this pattern, but for me the best one is using Aspect-Oriented Programming, because we still maintain separation of concerns, favoring isolated unit testing, without breaking encapsulation.

Aspect-oriented programming entails breaking down program logic into distinct parts. These parts are known as crosscutting concerns because they "cut across" multiple abstractions in a program. Examples of crosscutting concerns are logging, transaction management, error handling or splitting large datasets. For people who have worked with aspects there is not much secret here: to use them you simply create an aspect defining the advice and the pointcut, and your aspect is ready to be executed.

I guess most of us use aspect-oriented programming as I have described in the previous paragraph, but fewer use the ITD (Inter-type Declarations) feature.

Inter-type declarations provide a way to express crosscutting concerns affecting the structure of modules, enabling programmers to declare members of another class.

As we say in my country, "badly said but well understood": ITD is a way to declare new members (attributes, methods, annotations) of a class from an aspect.

AspectJ is an aspect-oriented extension for Java. AspectJ supports ITD, and for this reason it will be used in this post. Moreover, I recommend you install the AJDT plugin, because it will help you develop aspects and gives a quick overview of which Java classes are affected by aspects.



If you have not understood what ITD is, don't worry; it is the typical concept that is best understood with an example.

Let's start with simple example:

Imagine having to model a car. You would have a Car class with some attributes; for this example three attributes (VIN number, miles driven and model) are enough.


It is a POJO with three attributes and their getters and setters.

Now we want to add a persistence layer, but in this case we are going to persist our POJOs in an XML file instead of a database. So Car objects should be transformed to an XML stream. For this purpose JAXB annotations will be used. For those who don't know it, JAXB allows developers to map Java classes to XML representations and vice versa.

I am sure the first idea that comes to mind is annotating the Car class with @XmlRootElement (the annotation that maps the root element in JAXB). Don't do that; use aspects. Your first mission is to keep the Car file as simple as possible. Adding an annotation using ITD is as simple as:


With @type you expose which kind of member is annotated, in this case the class. Other possibilities are @method, @constructor and @field. Then comes the pattern of elements that should be annotated, in this case the Car class, but you could use a wildcard expression like org.alexsotob..*. Finally, the annotation itself.
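Since the original listing was an image, here is a sketch of what such a declaration looks like (assuming Car lives in the org.alexsotob package; your file may differ):

```aspectj
public aspect CarXmlAspect {
    // ITD: attach @XmlRootElement to Car without touching Car.java
    declare @type : org.alexsotob.Car : @XmlRootElement;
}
```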

The next step is using JAXB classes for marshalling/unmarshalling objects. In this example I am using the spring-oxm package, and shortly you will understand why. Spring-oxm is the part of Spring that contains classes for dealing with O/X mapping.

This Spring module contains one class for each supported XML binding. In our case Jaxb2Marshaller is used as both marshaller and unmarshaller.

You may be thinking of creating a service class into which you inject a Jaxb2Marshaller instance. This service would include two methods (save and load) with the Car class as argument or return value. Sorry, but doing this you are implementing the DAO pattern. Let's implement the Active Record approach instead. And as you may suppose, aspectj comes to the rescue to avoid mixing concerns in the same source file.

Let's update the previous aspect file so all the logic required by JAXB lives in the same file.


See that apart from annotating the Car class we are creating two methods and an annotated attribute. Attributes must follow the same rule as methods: <class name>, a dot (.) and the <attribute name>. Note that in this case the attribute is transient, because it should not be bound in the XML file.

The last step is configuring the marshaller in the Spring context file.

Not much secret. Now let's code a unit test.

Run the JUnit class and BOOM: all red, with an amazing NullPointerException. The marshaller is created in the Spring context but not injected into the Car class (Car is not managed by the Spring container, so it is impossible to inject). And now I suppose you are telling yourself: "I told you a service layer would be better, because it would be managed by Spring and autowiring would work perfectly." But wait and see. How about using the spring-aspects module? Spring Aspects contains an annotation-driven aspect (@Configurable) allowing dependency injection of any object, whether or not it is controlled by the container. So let's apply the last two changes and the application will run.

The first change is creating a new aspectj file to annotate the Car class as @Configurable.

And finally, modify the Spring context file to enable the @Configurable annotation.

Adding <context:spring-configured></context:spring-configured> is enough. As a result, any time you instantiate an object (via the "new" keyword), Spring will attempt to perform dependency injection on that object.
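For reference, a minimal context file along these lines would do the job (a sketch; the Car package name is an assumption carried over from the earlier wildcard example):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context.xsd">

    <!-- enables @Configurable: Spring injects objects created with "new" -->
    <context:spring-configured/>

    <bean id="marshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
        <property name="classesToBeBound">
            <list>
                <value>org.alexsotob.Car</value>
            </list>
        </property>
    </bean>
</beans>
```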

Now run unit test again and green will invade your computer :D.

ITD is a really nice solution for designing classes with their own responsibilities. It gives you the opportunity of writing maintainable and understandable code without losing encapsulation. Of course, you should take care not to create high coupling in aspected classes, converting them into "God classes".

Note that implementing the same approach with a relational database is as simple as changing the Jaxb2Marshaller to an EntityManager.

I hope you have found this post useful.

Download Full code

Music:  http://www.youtube.com/watch?v=KLFZzInXAWI

lunes, septiembre 12, 2011

Questa di Marinella è la storia vera Che scivolò nel fiume a Primavera Ma il vento che la vide così bella Dal fiume la portò sopra una stella (Canzone Di Marinella - Mina)




Most of us use Maven as a build automation tool. One of the most important sections is the one related to dependencies. Typically, projects depend on external libraries like Spring, Hibernate, Slf4j, ... which are downloaded from external Maven repository servers like mvnrepository or ibiblio, or from your own internal Maven repository server running Artifactory or Nexus. But this scenario does not cover all possible cases.

It is uncommon for your project to require an external library that is not present in any repository, but it can occur, and in those cases you would use the company's internal Maven repository to upload these artifacts. But this is not always possible; a typical example is when you are blogging about a beta library that is not present in any repository and you want to add it to the pom file. It would be perfect if Maven could resolve these dependencies too, without packaging them into the project. So what can we do?

There are some alternatives:

- packaging the .class files of the external library into your project, so they are shipped as project files.
- asking the library creators to upload milestones to public repositories.
- creating your own public Maven repository.
- using GitHub?

Using GitHub, YES!! You can configure your GitHub account as a Maven repository. In my case I always upload sample code to GitHub, so why not create a GitHub project that acts as a Maven repository, with poms referencing it?

Now I will summarize the steps to follow (assuming you have already created a project):
  1. Create a GitHub repository.
  2. Initialize the local directory as a Git repo.
  3. Modify the project pom so the distribution management tag points to the local repository.
  4. Perform the deploy goal so the artifact is deployed to the local repository.
  5. Push the changes to GitHub.
  6. Nothing more. Now other projects' poms can use your repository.

Let's get down to work:

The first thing you should know is that the project I want to upload to the Maven repository is located at /media/share/workspace/github-test, and the project that will depend on it is located at /media/share/workspace/bar.

The first step is creating a new GitHub repository. So log in to your GitHub account and create a new repository; in my case I have named it maven-repository.


The second step is initializing the local directory as a Git repository.

The third step is adding the distributionManagement tag pointing to the repository, so that when the project is deployed the artifact is created in the Git directory.

See that the url tag points to the previously created Git directory (local disk).
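The section looks roughly like this (the file url is an assumption: it must point to wherever you cloned the maven-repository directory on local disk):

```xml
<distributionManagement>
    <repository>
        <id>maven-repository</id>
        <name>GitHub Maven Repository</name>
        <!-- local clone of the maven-repository Git project -->
        <url>file:///media/share/workspace/maven-repository</url>
    </repository>
</distributionManagement>
```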

The fourth step is running mvn -DperformRelease=true clean deploy. The project is compiled, tested and packaged. Now go to the local repository and see what has appeared.

The fifth step implies two operations: committing the changes and pushing them to GitHub.

See that the first command adds the root directory. Then all files are committed to the local repository and pushed to the GitHub repository.


The sixth step, if you want to call it a step, is adding a repository tag to the project that requires the published component. In our case, the bar project requires it.

This pom represents a project that wants to use the github-test module. It is important to note that the repository url is special (it is not shown when you browse GitHub projects): it is https://github.com/maggandalf/maven-repository/raw/master. After the repository name (in this case maven-repository) you should concatenate /raw/master: raw because you want to access the files without any decoration (no HTML), and master because it is the branch name.
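A minimal repositories section for the bar pom, using the raw URL just described, would be:

```xml
<repositories>
    <repository>
        <id>maven-repository</id>
        <!-- repository name + /raw/master: raw files from the master branch -->
        <url>https://github.com/maggandalf/maven-repository/raw/master</url>
    </repository>
</repositories>
```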

When the bar project is compiled, the github-test artifact is downloaded.


I think it is a good approach when you need to upload a Maven artifact temporarily, or simply when your project requires a milestone version of an artifact that is not uploaded to any repository.

I hope you like this post, and ... MOVE TO GIT.

Music: http://www.youtube.com/watch?v=TL69oTe6HHY

martes, septiembre 06, 2011

Soy El Capitán De La Nave Tengo El Control, Llamando A La Tierra Esperando Contestación, Soy Un Cowboy Del Espacio Azul Eléctrico (Llamando A La Tierra - M.Clan)



Groovy is an object-oriented programming language for the Java platform that can also be used as a scripting language. Most of us, instead of using Groovy alone, use Grails (a web framework based on Groovy) to develop web applications.

But Groovy can also be used standalone for developing your internal tools. Let me explain how Groovy scripts have simplified our development of tools for generating data for integration tests.

In my company we create clinical instruments. These instruments are expensive, and big enough to say they are not portable. For these reasons each instrument has an emulator, so integration tests can run without physically having any instrument.

Our emulator has an XML file where all the resources required for running tests are configured. In summary, each file contains a list of blood barcodes and reagent barcodes. In our case barcodes have a meaning (kind of resource, expiry date, checksum, ...).

So one might think about creating a standard configuration file to be used in integration tests. It seems a good idea, until you learn that the expiry date is expressed in months: one day, without knowing exactly why, your tests begin to fail. The answer is obvious: the resources have expired. We need a script that updates this XML file so that when a resource is about to expire, its month field is changed.

Moreover, not all resources should be updated. Some tests imply working with expired resources. For this reason some barcodes have a special format indicating that the update should not be applied.

Because the project is developed in Java, one might think about writing a small Java class that does all this work. But wait and think whether this class would really be small: we need a parser (DocumentBuilder), a method that walks all nodes and checks whether a resource is expired (NodeList, Element.getAttribute(), ...), and finally code for writing the modifications back to the file. Furthermore, some barcodes contain special characters that should be detected using a regular expression (Pattern, Matcher) to avoid updating them. Although this class is not difficult, it is far from a small class with few lines.

But how about polyglot programming? Groovy can be a choice. Let's see how Groovy deals with XML and regular expressions.

Imagine next XML file:


Groovy and XML

While in Java you create a DocumentBuilderFactory that returns a DocumentBuilder and then call the parse method, in Groovy it is as simple as:

and the root variable points to the root element of the XML file.

And now imagine that you want to update all blood barcodes of a given holder. In Java you would iterate over a NodeList or use an XPath expression. See the simplicity with Groovy:


See that with Groovy, nodes are explored as object attributes, and finally we call the findAll() method, which returns the list of sample nodes belonging to the SampleHolder11 tag. Returning an attribute value is as easy as adding the @ character before the attribute name, in our case barcode.

And writing XML is as easy as:


In the previous example, the output is written to the console.

Groovy and Regular Expressions

Remember that some barcodes have a special format: in our case, if a barcode starts with A, B, C or D it should not be modified. A clean, reusable and maintainable solution is using regular expressions to check whether a barcode matches the special format. In Java the process is not trivial: you have to create a Pattern object with a regular expression, obtain a Matcher, and use the find() method to see whether the pattern occurs in the input.
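For comparison, here is what the Java side of that check looks like (class and method names are mine):

```java
import java.util.regex.Pattern;

// Java version of the "special format" check: barcodes starting with
// A, B, C or D must not be updated by the script.
class BarcodeFormat {

    private static final Pattern SPECIAL = Pattern.compile("^[ABCD].*");

    static boolean isSpecial(String barcode) {
        return SPECIAL.matcher(barcode).matches();
    }
}
```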

But how can we determine whether a string matches a regular expression in Groovy? The ==~ operator does the work for us. Tell me what you think about this expression:


Elementary. You will agree with me that the Groovy approach is simpler than the Java one. To specify a pattern you only use the ~ character plus a slash (/), then the regular expression, and finally a closing slash. Groovy also supports =~ (create a Matcher) and ==~ (return a boolean: whether the String matches the pattern).

I think that writing a script that reads/parses/writes an XML file this way is fast and easy; far fewer classes are involved than in Java. And the comparison for regular expressions is not even worth making: the Groovy way is the simplest one could think of.



After the success of the previous Groovy script, we decided to create another tool (a Groovy script) for manipulating the database.

The system registers into the database every incidence that occurs during an execution. Some incidences are easy to reproduce with the emulator, but others are not. In integration tests there is no problem, because a predefined, filled database is used; but acceptance tests start with no data, which is generated by the execution of the tests themselves. Because some incidences are hard to reproduce in the emulator, a Groovy script that inserts incidences was created.

Groovy and SQL

Like Java, Groovy can also access databases and, as you can suppose, in a simple way. No Connection object, no PreparedStatement, no ResultSet, ...

Let's see an example of searching for data and using it to create a new record. Imagine that we have a table called Execution and another called Incidence, with a one-to-many relationship.


Simple, yes? With one line, a connection to the database is established.

Executing a SELECT query and iterating over the results is also easy. Using the eachRow method, all results are iterated; no ResultSet object is required anymore. Parameter values are passed between brackets ([]), and as with XML, each row is accessed using a closure. In the previous example each row is mapped to the execution variable. Moreover, see how easy it is to read a value from each tuple: as with XML, you access the value as a class attribute, no more ResultSet getters; in the example, execution.dboid refers to the dboid field.

Finally, the execute method is used to update the database.



Now that I have shown you some of the nice features Groovy offers, I will explain how we execute these tools.

GMaven

We use Jenkins with Maven as our continuous integration system. Before Jenkins starts executing the integration tests, the Groovy scripts are executed so the emulator is configured properly. The interesting part of this step is how the pom is configured for executing Groovy scripts.

There is a plugin called gmaven-plugin. This plugin runs Groovy scripts bound to a given phase and goal.


There is no problem configuring the gmaven plugin; the most important part is where you specify which script should be executed.
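A plugin section along these lines does the trick (version, phase and script path are illustrative, not the exact values from our pom):

```xml
<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <version>1.3</version>
    <executions>
        <execution>
            <!-- run the script before the integration tests start -->
            <phase>pre-integration-test</phase>
            <goals>
                <goal>execute</goal>
            </goals>
            <configuration>
                <!-- hypothetical script name: point this at your own Groovy file -->
                <source>${pom.basedir}/src/main/script/UpdateExpiryDates.groovy</source>
            </configuration>
        </execution>
    </executions>
</plugin>
```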

As a final note, I want to say that I am a huge fan of Java and my intention is not to criticize it, but I think there are problems better suited to other languages than Java. My advice is to learn as many kinds of languages as you can (Scala, Groovy, ...) so that, as a programmer, you can choose the best solution for a given problem.

I hope you find this post useful.

Music: http://www.youtube.com/watch?v=9u54Xs22_SI