
Wednesday, August 19, 2020

Docker Database Integration Testing

 

Background


One of the most important things in the Software Engineering world is the ability to automate your tests properly against your database, and by this I mean your actual database. Something that can start up your database for you in case it's not running, then run the tests, and finally shut down your database when the tests are done. 

In my experience I have seen developers introduce a bit of manual work to get the following going : 

  • Starting up the database before running tests.
  • Clearing out the database manually when done, or writing something to do that.
  • Sometimes, and I mean sometimes, shutting down the database when done with the tests.

One of the great solutions to this problem is to use one of the nice "In-Memory" databases. This is great and has helped with automating the key points mentioned above: no need to start the database or clear it out manually after running tests, etc.

Now, in most cases the in-memory databases don't necessarily match up to the actual database you will be running in the Development, QA and Production environments, i.e. when using something like the H2 In-Memory Database, we need to keep in mind that it's not really the MSSQL Server or Postgres DB that you are running on the actual environments. This means that you may be limited when testing more intense, complex and database-specific processes and operations. 

Would it not be nicer to be able to test against the actual database that we are running on the various environments? I think it would be awesome. 

So then one day I was with my Chief Architect & CEO, discussing putting a small system together, all the way from tech stack and framework selection to doing some R&D. I was excited that I would finally get to put this tool into practice and see how it works. I wanted something that behaves much like the In-Memory databases, except it should be the real database we will be running on. 


Things To Keep In Mind

Before we get started: if you will be checking out the article's code repository for reference, then maybe set up these tools first: 
  • Java 11
  • Maven 3.6.2
  • Docker

Otherwise, this article depends on some assumptions : 

  • You are familiar with Docker.
  • You already have a Java project with a datasource connection component. 
  • Some understanding of Maven.
  • You have worked with JUnit; in this case we are talking about JUnit 5.


Context


Now that you are ready with the tools needed, we are going to go through a library named Test Containers. This is a nice, lightweight Docker API of sorts. Through the power of JUnit 5 we will also be able to start up and shut down the database automatically. The database for this article is Postgres. Test Containers supports a lot of database services, so you are not tied to the Postgres database. Remember, this also uses Docker, meaning you can pretty much use Test Containers for anything else besides a database service; furthermore, you can even play with some Docker Compose files. So this really opens a whole new avenue of possibilities in the Software Engineering world.


Getting Started


Adding Dependencies


If you check the Test Containers home page there's a Gradle equivalent that you can try if you are using it. In our case you will need : 

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>postgresql</artifactId>
    <version>1.14.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers</artifactId>
    <version>1.14.3</version>
    <scope>test</scope>
</dependency>

Docker


For this to work you need Docker to be running; then you don't have to worry about : 

  • The actual database service ...
  • Nor the database container ...
  • Or even, the container image. 

Test Containers will sort out the rest for you, even if you don't have the image pulled into your Docker yet. 

Datasource Configuration


It's as simple as using the "default" Test Containers datasource configuration : 

URL : jdbc:tc:postgresql:13:///test
Username : test
Password : test
Driver Class Name : org.testcontainers.jdbc.ContainerDatabaseDriver

Notice the "tc" inside the URL literal. That's how you know that it's a Test Containers url. The default database name is "test" by default, if you check the end of the url literal. This is the same with username and password. 

So then we are almost there. Believe it or not, that's all you need. Now the final piece: some Java JUnit 5 code to test out the magic. 

Java & JUnit


We are going to use the JUnit @ClassRule annotation to do some work before the rest of the actual test loads. This is similar to the @BeforeClass annotation. The reason we will be using this is to kick-start our Docker container before the tests even get to run, so it's preparation for the actual test cases. So create a new test case for your existing Data Access code and add this class field or variable in your code.

@ClassRule
public static PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer("postgres:13");

Write some sample code to test this out. In my case I have something along the lines of ...

// Some code here
...

@Test
public void assert_That_We_Can_Save_A_Book() {
    Book saved_lordOfTheRings_001 = bookDataAccess.saveAndFlush( lordOfTheRings_001);
    assertNotNull( saved_lordOfTheRings_001 );
    assertNotNull( saved_lordOfTheRings_001.getId() );
    assertEquals( saved_lordOfTheRings_001.getTitle(), lordOfTheRings_001.getTitle() );
}

So that's pretty much it. You can run your test cases and it should integrate with your database and store some information. Let's try it out. 


Validating The Data Graphically


So we can look into it by debugging our test case, with a break point immediately after the line that saves a database record. 

Start up any Docker client of your choice. At this point you will not have any Postgres Docker container running, unless you were already using Postgres in Docker. The container that will be run by your test cases will have a random name; there's no way you will miss it, plus you can check out the image version on your Docker client, i.e. postgres:13.

So this is my Docker client before running the tests. By the way, the Docker client I am using is Portainer.




So now we are going to run a test case and pause it just after saving to the database. 



As you can see, my debug break point is in place and has paused just after saving to the database. Now let's go back to our docker client interface, Portainer in my case... 




Notice the two new containers being created : 

  • testcontainers-ryuk-40abb5dc-... : this is the Test Containers housekeeping service running in there. 
  • eloquent_robinson : the more important one, this is your test database running. As you can see it has a random name; notice the image, which is postgres:13, that's our guy right there. The port for this run at this moment is 32769.

The port number is important because it's also random, like the container name. This is by design, according to the Test Containers engineers. For this test case run we are going to try to connect to the database while the debug is still paused, so that we can see our saved data. So get on to your Postgres database client and connect using the following properties : 

  • Username : test
  • Password : test
  • Database Name : test
  • Port : 32769

The URL will be something like : jdbc:postgresql://localhost:32769/test
Connect and run a simple query statement to see your data, saved through that test case. You should get something like the one in the image below : 
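
Since the mapped port changes on every run, another option is to print the connection details straight from the container object declared with @ClassRule earlier, instead of reading them off the Docker client. A small sketch using the standard Test Containers getters, while the container is running (for example at that break point):

// These getters are part of the Test Containers JDBC container API.
System.out.println("JDBC URL  : " + postgreSQLContainer.getJdbcUrl());
System.out.println("Username  : " + postgreSQLContainer.getUsername());
System.out.println("Password  : " + postgreSQLContainer.getPassword());
System.out.println("Host port : " + postgreSQLContainer.getMappedPort(5432));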




Ka-BOOM! There you go. You have not only managed to run a successful integration test, but also validated that it does indeed save your data to the docker hosted database, as you expected. Done, done and done! 


In Closing


I hope this was fun and you have cases where this can help ease your software development processes.

Leave some comments below, and you may check out the GitHub Docker Database Test - Source Code for reference.




Saturday, June 20, 2020

Testing Your Java EE Thorntail Microservice

Context


Oops! We almost left out one important thing: we had a view on how to "dockerize" your "Hollow JAR", but we did not focus on implementing some sort of automated integration test for your web service. There are a lot of ways one can perform this exercise. One of the recurring practices is through the use of Postman, followed by Postwoman.

For as long as your service is running, you can develop some really nice JavaScript processes that automate this for you through the tools mentioned above. Another way, for traditional "Java" developers, would be to go the "JUnit" route. At some point you will also need your service to be running somewhere, somehow, waiting for client requests.

The JUnit route later got improved for Java EE Integration Testing through the configuration and use of the frameworks Arquillian & ShrinkWrap as top-ups to JUnit.


Arquillian & ShrinkWrap

The role that Arquillian plays is that of a "middle-man" between an artifact (.jar, .war, .sar, .ear) that you want to deploy and the container or application server you want to deploy to. There are two ways you can deploy your artifact: either you deploy the actual physical artifact that you have just built on your machine, or you programmatically build one for testing. To build one programmatically you use ShrinkWrap.


Getting Started

For those that already know, you will agree that generally, when you are working with a standalone application server, in this case WildFly, setting up Arquillian is quite a bit of work, but at least you set it up once and the rewards are out of this world. On the other hand, I have noticed over the past 6+ years (since 2013) that it has improved with regard to all the dependencies one must configure and the configuration files that one needs to create. I am really happy with what they have done with it in Thorntail, and that's what we are going to look at for our Java EE Microservice.

Once again we will continue working on the Thorntail repo we have been working on since our first article, Java EE Micro Services Using Thorntail, through to the one that followed, Dockerizing Your Java EE Thorntail Microservice.

Start by opening your "pom.xml" file and adding the following dependency. In the case of Thorntail, since it's already packaged nicely for you the developer, you are basically installing or including a "Fraction".

<dependency>
    <groupId>io.thorntail</groupId>
    <artifactId>arquillian</artifactId>
    <scope>test</scope>
</dependency>

Part of the normal process with Arquillian is to also include the JUnit dependencies / libraries, right? Ha ha ha ha, well, Thorntail already includes them. Currently it includes JUnit 4.12, so you may exclude it using Maven and rather include JUnit 5 if you want. For the purpose of getting you up and running with Thorntail I am just going to keep it as is. This fraction also includes the Arquillian & ShrinkWrap libraries that you would otherwise have to configure separately, but not today! The next thing is to configure our Maven testing plugin, the Maven Failsafe Plugin.

This plugin was designed for integration testing, which is exactly what we want, since we want to integrate with our REST Service when testing and get real results.

<plugin>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.22.2</version>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>


Test Case Implementation

Now let's get to the fun stuff, our test case implementation. So create a new Test Suite, aka Test Class. Make sure that the name of the class ends with "IT" (which stands for Integration Test), because by default Maven will look for classes that end with "IT" in order to run them as Integration Tests. For example, I named mine "SampleResourceIT". So let's move on to some action ...

Add an annotation at class level as follows :

...
@RunWith( Arquillian.class)
public class SampleResourceIT { }
...

You are instructing JUnit to run with Arquillian, literally what's there. So this will not be a normal JUnit test case. You will notice that immediately after adding this annotation you will have some errors already. This is because Arquillian now wants to know how you would like to package the artifact that it should deploy.

So we should now add a new method with the @Deployment annotation. This annotation is from the Arquillian Framework, and it is where we will build our artifact for Arquillian to deploy.

...
@Deployment
public static Archive createDeployment() { }
...

Now let's build our test ".war" file inside that method using the ShrinkWrap API.

...
@Deployment
public static Archive createDeployment() {
    WebArchive webArtifact = ShrinkWrap.create( WebArchive.class, "thorntail-test-api.war");
    webArtifact.addPackages( Boolean.TRUE, "za.co.anylytical.showcase");
    webArtifact.addAsWebResource("project-defaults.yml");

    // Print all files and included packages
    System.out.println( webArtifact.toString( true));

    return webArtifact;
}
...

So now we have built our small, simple test web archive file. We are giving our war file a name, "thorntail-test-api.war"; I believe you know that you can name it anything you want, so you are not tied to naming it similar to the original file name. The next thing we are doing is including our package, which contains pretty much our REST Service and its business logic; in a nutshell it contains our application Java classes, in "za.co.anylytical.showcase", and this follows a flag that tells ShrinkWrap to search recursively. The next part is about including any resource files that we may want. I know that some of our actual REST configurations are in the file "project-defaults.yml", and to be as close as possible to our actual application we should include it in our test web archive file. The last part is printing everything that's in our archive, just to see what this test war file contains and to be sure that we have everything that we want in there.


Great stuff. So now we want to write a test that just calls our service and affirms that we managed to reach the Web Resource just fine.

...
@Test
public void test_That_We_Reach_Our_WebResource_Just_Fine_Yea() throws Exception {
    Client client = ClientBuilder.newBuilder().build();
    WebTarget target = client.target("http://localhost:8881/that-service/text");
    Response response = target.request().get();
    int statusCode = response.getStatusInfo().getStatusCode();
    String responseBody = response.readEntity(String.class);
    assertEquals( 200, statusCode);

    System.out.println("RESPONSE CODE : " + statusCode);
    System.out.println("RESPONSE BODY : " + responseBody);
}

...

We have a simple test case that uses the standard JAX-RS 2.x Client API, so no magic there. We build our client code, call the REST Service we want to test, and then validate that we get an HTTP Status Code 200, which means that things went well. No issues, ZILCH!
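
If you would also like to assert on the payload and close the client when you are done, a small optional variation of the same test could look like the sketch below. This is my own addition rather than something from the original post, and it assumes a static import of org.junit.Assert.assertNotNull alongside assertEquals:

@Test
public void test_That_The_Response_Has_A_Body() throws Exception {
    Client client = ClientBuilder.newBuilder().build();
    try {
        WebTarget target = client.target("http://localhost:8881/that-service/text");
        Response response = target.request().get();

        // Same endpoint as above; this time we also make sure a body actually came back.
        assertEquals( 200, response.getStatusInfo().getStatusCode());
        assertNotNull( response.readEntity(String.class));
    } finally {
        // Release the client's underlying HTTP resources.
        client.close();
    }
}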




These are my IntelliJ IDEA test run results. You can also run this through Maven as follows :


mvn clean install



This will perform a build while running the integration tests for us, which should give you results like the ones in the image below :



If you pay close attention to the image above, you will see that Arquillian started up WildFly for us, used ShrinkWrap to build a ".war" file, and also deployed it for us, all done automatically. Together with this, JUnit kicked in our test case when Arquillian was done with the deploy. Now that's magic! KA-BOOM!

The full test case looks like this :

@RunWith( Arquillian.class)
public class SampleResourceIT {

    @Deployment
    public static Archive createDeployment() {
        WebArchive webArtifact = ShrinkWrap.create( WebArchive.class, "thorntail-test-api.war");
        webArtifact.addPackages( Boolean.TRUE, "za.co.anylytical.showcase");
        webArtifact.addAsWebResource("project-defaults.yml");

        // Print all files and included packages
        System.out.println( webArtifact.toString( true));

        return webArtifact;
    }

    @Test
    public void test_That_We_Reach_Our_WebResource_Just_Fine_Yea() throws Exception {
        Client client = ClientBuilder.newBuilder().build();
        WebTarget target = client.target("http://localhost:8881/that-service/text");
        Response response = target.request().get();
        int statusCode = response.getStatusInfo().getStatusCode();
        String responseBody = response.readEntity(String.class);
        assertEquals( 200, statusCode);

        System.out.println("RESPONSE CODE : " + statusCode);
        System.out.println("RESPONSE BODY : " + responseBody);
    }
}


As usual you may ...

Leave some comments below, and you may check out the GitHub Source Code to validate the steps.