
Tuesday, May 02, 2017

Testing Dockerized SQL Databases


One of the big advantages of using Docker for testing is that you don't need to install the required dependencies of the code under test on every machine where the tests are going to run. This is really helpful for external services such as database servers, mail services, JMS queues, and so on. Another big advantage of this approach is that the tests run against the same version that is used in production.

So for persistence tests, using Docker is a really good approach to follow. But, as usual, this approach comes with some drawbacks.

The first one is that obviously you need to have Docker installed on every machine that needs to run the tests. Not a big problem, but something to take into consideration, as well as the Docker-inside-Docker (DinD) problem.

The second one is that you need to automate the starting and stopping of the container somehow.

The third one is that Docker containers are ephemeral. This means that when you start the container, in this case a container with a SQL server, you need to migrate the database schema there.

The fourth one, and this is not only related to Docker, is that you need to keep each test method execution isolated from the others, by providing known data before the execution and cleaning the data afterwards so the next test finds a clean environment.

The first and second problems are fixed by Arquillian Cube (http://arquillian.org/arquillian-cube/). It manages the lifecycle of containers by starting and stopping them automatically before and after test class execution. It also detects when you are running in a DinD situation and configures the started containers accordingly.

Arquillian Cube offers three different ways to define container(s).

  • Defining a docker-compose file.
  • Defining a Container Object.
  • Using Container Object DSL.

For this post, the Container Object DSL approach is the one used. To define a container that is started before the tests and stopped afterwards, you only need to write the next piece of code.
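A minimal sketch of that piece of code (ContainerDslRule comes from Arquillian Cube's Container Object DSL; double-check the exact package and artifact against the Cube documentation for your version):

import org.arquillian.cube.docker.junit.rule.ContainerDslRule;
import org.junit.ClassRule;

public class RedisTest {

    // Starts redis:3.2.6 before the test class and stops and removes it afterwards.
    // Use @Rule instead of @ClassRule if you prefer a fresh container per test method.
    @ClassRule
    public static ContainerDslRule redis = new ContainerDslRule("redis:3.2.6")
            .withPortBinding(6379);

    // ... test methods talk to Redis on localhost:6379
}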


In this case a JUnit rule is used to define which image should be used in the test (redis:3.2.6) and to bind the Redis port (6379).

The third one can be fixed using Flyway. It is an open-source database migration tool for SQL databases that allows you to automate the creation of database schemas.

Flyway is useful here since you can start the Docker container and then apply all the migrations to the empty database.
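For example, something along these lines once the container is up (the JDBC URL and credentials are placeholders for whatever your dockerized database exposes):

import org.flywaydb.core.Flyway;

// e.g. inside a @Before or @BeforeClass method of the test
Flyway flyway = new Flyway();
flyway.setDataSource("jdbc:postgresql://localhost:5432/postgres", "postgres", "postgres");
flyway.migrate(); // applies the SQL migrations found in classpath:db/migration to the empty database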

The fourth problem can be fixed by using tools like DBUnit. It puts your database into a known state between test runs by populating the database with known data and cleaning it after the test execution.

Arquillian integrates with both of these tools (Flyway and DBUnit), among others, through its extension called Arquillian Persistence Extension (aka APE).

An example of how to use APE with DBUnit is shown in the next snippet:
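Roughly, it looks like this; the class and method names of the standalone APE DSL (ArquillianPersistenceRule, RdbmsPopulator and its fluent API) are an approximation here, so check them against the dbunit-ftest-example mentioned below:

import org.junit.After;
import org.junit.Before;
import org.junit.Rule;
// APE imports (org.arquillian.ape.*) omitted here; take the exact packages from the linked example.

public class BooksDbUnitTest {

    // Enables APE injection when running with a plain JUnit runner
    @Rule
    public ArquillianPersistenceRule apeRule = new ArquillianPersistenceRule();

    // DBUnit-backed populator injected by APE
    @DbUnit
    @ArquillianResource
    RdbmsPopulator populator;

    @Before
    public void populateData() {
        // books.yml is a hypothetical DBUnit dataset placed in src/test/resources
        populator.forUri("jdbc:postgresql://localhost:5432/postgres")
                 .withUsername("postgres").withPassword("postgres")
                 .usingDataSet("books.yml")
                 .execute();
    }

    @After
    public void cleanData() {
        populator.forUri("jdbc:postgresql://localhost:5432/postgres")
                 .withUsername("postgres").withPassword("postgres")
                 .usingDataSet("books.yml")
                 .clean();
    }
}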

You can use the Arquillian runner, as shown in dbunit-ftest-example, or a JUnit rule, as shown in the previous snippet. Choosing one or the other depends on your test requirements.

So how does everything fit together in Arquillian so that you can boot up a Docker container with a SQL database, such as PostgreSQL, before test class execution, then migrate the SQL schema and populate it with data, execute the test method, then clean the whole database so the next test method finds a clean database, and finally destroy the Docker container after test class execution?

Let's see it in the next example:
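Sketched out, it is something like the following; the APE pieces are again approximated and plain Flyway is used for the schema migration, so take the authoritative version from the dbunit-flyway-ftest example linked at the end of this post:

import org.arquillian.cube.docker.junit.rule.ContainerDslRule;
import org.flywaydb.core.Flyway;
import org.junit.After;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
// APE imports (org.arquillian.ape.*) omitted here; take the exact packages from the linked example.

public class BooksPersistenceTest {

    private static final String JDBC_URI = "jdbc:postgresql://localhost:5432/postgres";

    // PostgreSQL container started before the test class and destroyed afterwards
    @ClassRule
    public static ContainerDslRule postgres = new ContainerDslRule("postgres:9.6")
            .withEnvironment("POSTGRES_PASSWORD", "postgres")
            .withPortBinding(5432);

    // Enables APE injection under the plain JUnit runner
    @Rule
    public ArquillianPersistenceRule apeRule = new ArquillianPersistenceRule();

    // DBUnit-backed populator injected by APE
    @DbUnit
    @ArquillianResource
    RdbmsPopulator populator;

    @Before
    public void prepareDatabase() {
        // Migrate the empty database (already-applied migrations are skipped on later runs)...
        Flyway flyway = new Flyway();
        flyway.setDataSource(JDBC_URI, "postgres", "postgres");
        flyway.migrate();

        // ...and populate it with known data (books.yml is a hypothetical dataset)
        populator.forUri(JDBC_URI).withUsername("postgres").withPassword("postgres")
                 .usingDataSet("books.yml").execute();
    }

    @After
    public void cleanDatabase() {
        // Leave a clean database for the next test method
        populator.forUri(JDBC_URI).withUsername("postgres").withPassword("postgres")
                 .usingDataSet("books.yml").clean();
    }

    @Test
    public void should_find_all_books() {
        // Exercise the repository/DAO under test against JDBC_URI and
        // assert on the rows inserted by the dataset
    }
}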

The test is not that complicated and it is pretty much self-explanatory about what it is doing in each step. You are creating the Docker container using the Arquillian Cube DSL, and you are also configuring the populators just by using the Arquillian APE DSL.

So thanks to Arquillian Cube and Arquillian APE you can make your test totally isolated from your runtime: it will always be executed against the same PostgreSQL database version, and each test method execution will be isolated.

You can see full code at https://github.com/arquillian/arquillian-extension-persistence/tree/2.0.0/arquillian-ape-sql/standalone/dbunit-flyway-ftest

We keep learning,
Alex
Ya no me importa nada, Ni el día ni la hora, Si lo he perdido todo, Me has dejado en las sombras (Súbeme la Radio - Enrique Iglésias)
Music: https://www.youtube.com/watch?v=9sg-A-eS6Ig

Monday, April 10, 2017

Arquillian Persistence with MongoDB and Docker


In this screencast you are going to see how you can use Arquillian Persistence Extension (https://github.com/arquillian/arquillian-extension-persistence/tree/2.0.0) and Docker to write persistence tests for MongoDB.

To manage the Docker lifecycle I have used Arquillian Cube (http://arquillian.org/arquillian-cube/), and for populating data into MongoDB, the fairly new integration between Arquillian Persistence Extension (aka APE) and NoSQLUnit (https://github.com/lordofthejars/nosql-unit).



We keep learning,
Alex.

Ridi, Pagliaccio, Sul tuo amore infranto! Ridi del duol, che t'avvelena il cor! (Vesti la giubba (Pagliacci) - Leoncavallo)
Music: https://www.youtube.com/watch?v=Z0PMq4XGtZ4


Monday, May 19, 2014

Testing Polyglot Persistence Done Right [slides]


This week I was in Krakow at the amazing conference called GeeCon. My talk was "Testing Polyglot Persistence Done Right" and it was co-presented with my friend Bartosz Majsak.

The abstract was:

"Data storage is one of the most crucial parts of any applications, and we use many different tools and tricks to keep it in a good shape. We frequently use both old school relational systems with new approaches commonly known as NoSQL. We write sophisticated queries and use optimization techniques to give our end users the greatest possible experience.
So why is persistence very often skipped in the testing efforts? Is it really that complex and painful to setup? During this talk we will have a closer look at Arquillian Persistence Extension together with NoSQLUnit. These tools remove that burden and boilerplate to make you a happy and productive programmer again! Join this session and see for yourself that writing tests for your data storage logic is as easy as writing normal unit tests!"

And here you can see the slides (http://www.slideshare.net/asotobu/testing-polyglot-persistence-done-right). If you have any questions, don't hesitate to approach us.



We keep learning,
Alex.
En los mapas me pierdo. Por sus hojas navego. Ahora sopla el viento, cuando el mar quedó lejos hace tiempo. (Pájaros de Barro - Manolo García)

Music: https://www.youtube.com/watch?v=9zdEXRKJSNY


Monday, February 11, 2013

NoSQLUnit 0.7.4 Released


NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.4 release, the next changes have been added:
We keep learning,
Alex.

I laughed at love cause I thought it was funny, but you came along and moooooved me honey, I've changed my mind, This love is fine (Great Balls Of Fire - Jerry Lee Lewis)

Music: http://www.youtube.com/watch?v=7IjgZGhHrYY

Wednesday, February 06, 2013

NoSQLUnit Forge Plugin is Forged




NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier.

Forge is a core framework and next-generation shell for tooling and automation at a command line level.

With the NoSQLUnit Forge Plugin we can use Forge to create tests for NoSQL databases using NoSQLUnit.

With this plugin we can create three kinds of tests, depending on the lifecycle that is required:
  • Embedded: typically used in unit testing; starts an embedded instance of the required database (not supported by all engines).
  • Managed: usually used during integration or high-level tests; starts a database instance on the same computer where the tests are run.
  • Remote: uses already running database instances, usually on remote computers.
The current version of the plugin supports the next databases:
  • MongoDB
  • Neo4j
  • Redis
  • Cassandra
  • HBase
  • Infinispan
  • CouchDB
When we execute the main command of this plugin, one JUnit test configured with NoSQLUnit features and one dataset file will be created.

Moreover, the created test will contain one method for each public method of the class under test.

The main command is nosqlunit. Then comes the lifecycle, which can be embedded, managed or remote. And finally, depending on the chosen lifecycle, some arguments.

The common arguments are:
  • engine: we choose which database engine we want to use.
  • databaseName: we set the name of the database under test.
  • classname: we set the name of the test class created by the plugin.
  • classUnderTest: the full class name of the class we want to write a test for.

Embedded

There are no special arguments.

Managed
  • path: home directory where NoSQL database is installed.
Remote
  • host: server address.
  • port: server port.
So, for example, a valid command would be:

nosqlunit managed --engine MONGODB --path /opt/mongo --databaseName test --classname MyTest --classUnderTest com.example.MyClass.java

It creates the MyTest test class under /src/test/java/com/example and a dataset file in /src/test/resources/com/example.
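Similarly, a remote test could be generated with something like the next command (the host and port values here are only illustrative):

nosqlunit remote --engine MONGODB --host 192.168.1.100 --port 27017 --databaseName test --classname MyRemoteTest --classUnderTest com.example.MyClass.java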

As with almost all Forge plugins, you can install the NoSQLUnit Forge Plugin by calling forge find-plugin nosqlunit and forge install-plugin nosqlunit.

And finally, remember that you can use TAB completion to make your life easier.

We keep learning,
Alex.

Poder jugar al cel, això és un amor, I sentir el porc grunyir fort. Això ja no, quin horror!  (Dr Slump)
Music: http://www.youtube.com/watch?v=ts1J4lGexcg


Monday, January 07, 2013

NoSQLUnit 0.7.3 Released



NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.3 release, the next changes have been added:

  • Support for Infinispan.
  • Added the possibility to add custom insertion and comparison methods for each engine. Thanks to Bob Tiernay for the idea. https://github.com/lordofthejars/nosql-unit/issues/45
  • Added the possibility to prevent NoSQLUnit from injecting fields annotated with @Inject, by using the @ByContainer annotation. Very useful for Spring Framework tests, Arquillian tests or Needle tests.
  • Replaced JMockMongo with the Fongo project as the embedded Mongo implementation. Users should not notice any difference from the point of view of NoSQLUnit. Thanks to Bob Tiernay for providing this valuable information about Fongo.
  • Updated mongo-java-driver to 2.10.1.
  • Updated neo4j to 1.8.
  • Fixed bug #46; thanks to MrKeyholder for discovering it and attaching the solution code.
We keep learning,
Alex.
Fiery mountain beneath the moon, The words unspoken, we'll be there soon, For home a song that echoes on, And all who find us will know the tune. (The Lonely Mountain - Neil Finn)

Thursday, December 20, 2012

NoSQLUnit 0.7.1 Released


NoSQLUnit is a JUnit extension that makes writing unit and integration tests of systems that use a NoSQL backend easier. Visit the official page for more information.

In the 0.7.1 release:
  • One new NoSQL system is supported: CouchDB.
  • The JUnit version has been upgraded to 4.11. Now using @Inject does not require passing the this reference.
  • Business objects no longer contain any dependency on JUnit classes. This is the first step towards integration with the Arquillian Framework.
  • Now we can test sharding and master/slave replication of Redis servers.
  • Bug fixing.
The next version of NoSQLUnit will provide support for Infinispan, class capabilities for testing sharding and master/slave replication in MongoDB, and bug fixes.

We keep learning,
Alex.

Como una sonrisa, eres tú, eres tú.
Así, así, eres tú. (Eres Tú - Mocedades)



Tuesday, December 04, 2012

Writing Acceptance Tests for Openshift + MongoDb Applications



Acceptance tests are used to determine if the requirements of a specification are met. They should be run in an environment as similar as possible to the production one. So if your application is deployed to Openshift, you will require an account, parallel to the one used in production, for running the tests. In this post we are going to write an acceptance test for an application deployed to Openshift which uses MongoDb as the database backend.

The deployed application is a very, very simple library which returns all the books available for lending. It uses MongoDb for storing all the information related to books.

So let's start by describing the goal, feature, user story and acceptance criteria for the previous application.

Goal: Bring reading to as many people as possible.
Feature: Display available books.
User Story: Browse Catalog -> In order to find books I would like to borrow, As a User, I want to be able to browse through all books.
Acceptance Criteria: Should see all available books.

Scenario:
Given I want to borrow a book
When I am at catalog page
Then I should see available books information: The Lord Of The Jars - 1299 - LOTRCoverUrl, The Hobbit - 293 - HobbitCoverUrl

Notice that this is a very simple application, so the acceptance criteria are simple too.

For this example, we need two test frameworks, the first one for writing and running acceptance tests, and the other one for managing the NoSQL backend. In this post we are going to use Thucydides for ATDD and NoSQLUnit for dealing with MongoDb.

The application is already deployed in Openshift, and you can take a look at https://books-lordofthejars.rhcloud.com/GetAllBooks



Thucydides is a tool designed to make writing automated acceptance and regression tests easier. 

Thucydides uses the WebDriver API to access HTML page elements. But it also helps you organise your tests and user stories with a concrete programming model, creates reports of the executed tests, and finally it also measures functional coverage.

To write acceptance tests with Thucydides, the next steps should be followed.
  • First of all, choose a user story of one of your features.
  • Then implement the PageObject class. PageObject is a pattern which models a web application's user interface elements as objects, so tests can interact with them programmatically. Note that in this case we are coding "how" we access the HTML page.
  • The next step is implementing the steps library. This class will contain all the steps that are required to execute an action. For example, creating a new book requires opening the addnewbook page, inserting the new data, and clicking the submit button. In this case we are coding "what" we need to implement the acceptance criteria.
  • And finally, code the chosen user story following the defined acceptance criteria and using the previous step classes.








NoSQLUnit is a JUnit extension that helps us manage the lifecycle of the required NoSQL engine, maintain the database in a known state, and standardize the way we write tests for NoSQL applications. NoSQLUnit is composed of two groups of JUnit rules and two annotations. In the current case we don't need to manage the lifecycle of the NoSQL engine, because it is managed by an external entity (Openshift).

So let's get down to work:

The first thing we are going to do is create a feature class which contains no test code; it is used as a way of representing the structure of the requirements.

Note that each implemented feature should be contained within a class annotated with the @Feature annotation. Every method of the feature class represents a user story.

The next step is creating the PageObject class. Remember that the PageObject pattern models the web application's user interface as objects. So let's see the HTML file to inspect which elements must be mapped.

The most important thing here is that the table tag has an id named listBooks, which will be used in the PageObject class to get a reference to its contents. Let's write the page object:
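A sketch of that page object (the class name and the way the table content is read are assumptions; the relevant parts are the Thucydides annotations):

import java.util.ArrayList;
import java.util.List;

import net.thucydides.core.annotations.DefaultUrl;
import net.thucydides.core.pages.PageObject;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;

@DefaultUrl("https://books-lordofthejars.rhcloud.com/GetAllBooks")
public class BooksPage extends PageObject {

    // The books table rendered by the application, located by its id
    @FindBy(id = "listBooks")
    private WebElement listBooks;

    public BooksPage(WebDriver driver) {
        super(driver);
    }

    // Returns the text of every cell of the books table
    public List<String> getBooksTable() {
        List<String> cells = new ArrayList<String>();
        for (WebElement cell : listBooks.findElements(By.tagName("td"))) {
            cells.add(cell.getText());
        }
        return cells;
    }
}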

Using @DefaultUrl we set which URL is being mapped, with @FindBy we map the web element with id listBooks, and finally the getBooksTable() method returns the content of the generated HTML table.

The next thing to do is implement the steps class; in this simple case we only need two steps, the first one opens the GetAllBooks page, and the other one asserts that the table contains the expected elements.
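A sketch of such a steps library (the method names are illustrative; the page object field is instantiated by Thucydides):

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.hasItems;

import net.thucydides.core.annotations.Step;
import net.thucydides.core.pages.Pages;
import net.thucydides.core.steps.ScenarioSteps;

public class BookSteps extends ScenarioSteps {

    // Thucydides instantiates PageObject fields declared in a steps library
    BooksPage booksPage;

    public BookSteps(Pages pages) {
        super(pages);
    }

    @Step
    public void opens_catalog_page() {
        booksPage.open(); // opens the @DefaultUrl of the page object
    }

    @Step
    public void should_see_books_information(String... expectedValues) {
        assertThat(booksPage.getBooksTable(), hasItems(expectedValues));
    }
}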

And finally, the class for validating the acceptance criteria:
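It looks roughly like this (the story class referenced by @Story, the database name and the dataset file name are hypothetical; the four elements discussed below are the relevant parts):

import static com.lordofthejars.nosqlunit.mongodb.MongoDbConfigurationBuilder.mongoDb;

import net.thucydides.core.annotations.Managed;
import net.thucydides.core.annotations.ManagedPages;
import net.thucydides.core.annotations.Steps;
import net.thucydides.core.annotations.Story;
import net.thucydides.core.pages.Pages;
import net.thucydides.junit.runners.ThucydidesRunner;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.openqa.selenium.WebDriver;

import com.lordofthejars.nosqlunit.annotation.UsingDataSet;
import com.lordofthejars.nosqlunit.core.LoadStrategyEnum;
import com.lordofthejars.nosqlunit.mongodb.MongoDbRule;

@RunWith(ThucydidesRunner.class)
@Story(BrowseCatalog.class) // the user story type declared in the feature class described earlier (name is hypothetical)
public class WhenUserBrowsesTheCatalog {

    @Managed
    public WebDriver driver;

    @ManagedPages(defaultUrl = "https://books-lordofthejars.rhcloud.com")
    public Pages pages;

    @Steps
    public BookSteps bookSteps;

    // localhost works here because of the Openshift port forwarding capability
    @Rule
    public MongoDbRule mongoDbRule = new MongoDbRule(
            mongoDb().databaseName("books").host("localhost").port(27017).build());

    @Test
    @UsingDataSet(locations = "books.json", loadStrategy = LoadStrategyEnum.CLEAN_INSERT)
    public void shouldSeeAvailableBooks() {
        bookSteps.opens_catalog_page();
        // values taken from the (hypothetical) books.json dataset
        bookSteps.should_see_books_information("The Lord Of The Jars", "The Hobbit");
    }
}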

There are some things that should be considered in the previous class:
  • @Story should receive a class defined with the @Feature annotation, so Thucydides can create the report correctly.
  • We use MongoDbRule to establish a connection to the remote MongoDb instance. Note that we can use the localhost address because of the Openshift port forwarding capability, so although localhost is used we are really managing the remote MongoDb instance.
  • Using @Steps, Thucydides will create an instance of the previous steps library.
  • And finally, the @UsingDataSet annotation populates data into the MongoDb database before running the test.


Note that NoSQLUnit keeps the database in a known state by cleaning the database before each test execution and populating it with known data defined in a JSON file.

Also keep in mind that this example is very simple, so only a small subset of the capabilities of Thucydides and NoSQLUnit has been shown. Keep watching both sites: http://thucydides.info and https://github.com/lordofthejars/nosql-unit

We keep learning,
Alex.
Love Is A Burning Thing, And It Makes A Fiery Ring, Bound By Wild Desire, I Fell Into A Ring Of Fire (Ring of Fire - Johnny Cash)