Friday, November 15, 2013

Beginner's guide to using Maven

I have been trying to convince a fellow engineer for a long time that Maven is a great build tool that he needs to start using ASAP, and that it is much more than just a replacement for Ant. He finally gave it a try and came back SCREAMING!!!

Why was he Screaming?

He was using Maven in the wrong way

Why is Maven this great tool?

In my opinion it is a great tool because it handles library dependencies on your behalf. Whenever I build a new application, I do not need to scan my hard drive for libraries and then add them to my IDE project classpath.

I worked in an organisation that had a cool way of handling this situation using Ant. Everyone worked on a specified "workspace" structure. There were scripts that created the structure for you. The workspace contained libraries, scripts and slots. You developed in a slot and referenced libraries in your workspace/libraries folder. The libraries and scripts were all stored in a source control repository. This meant that if you had to use a new library, you would need to download it from the internet, commit it to the code repository, run an update on your workspace and reference it in your Ant scripts.

It worked like a charm if you understood the process behind the madness. In a nutshell, I found that Maven does ALL of this for you out-of-the-box. But let me not undersell the tool: this is just one of its cool features (dependency management).

Why do people think that MAVEN is rubbish?

I have promoted Maven and managed to convince developers to start using it. They did, and the part that was meant to help them turned out to be the monster. The dependency management was a mess.

This made me wonder: how could a tool be great for some people and, at the same time, a disaster for others?

I soon realised that there are actually 2 ways of using Maven:
  1. Self Development using Maven
  2. Group Development using Maven

Self Development using Maven

If you are a developer who flies solo, then just download Maven and you are good to go.

What is happening behind the scenes?

When you run Maven for the first time, it creates a local cache folder for you. This is located in your user directory and is called ".m2".



If you look into this folder you will find:

  • A settings.xml file which should not concern you at this time
  • A repository folder which contains dependencies that you used to build your projects

How do these dependencies end up here?

When you compile or build your code using Maven, Maven first checks whether the dependency exists in your local cache. If not, it connects to the internet and searches for the dependency on the Maven Central Repository. If it finds the dependency, it downloads it to your local cache. If not, it searches the other repositories that the Maven Central Repository is connected to or references. You should be able to download most libraries in this way.

Let us consider 2 other possible scenarios:

1. The dependency required is another project of yours. It is a completely separate project and you have decided not to include it as a module within your current project. If this second project, which is a dependency of your first project, is a Maven project, then by running:
 mvn install  
you will not only build and compile the project, but also install the output artefacts of each of its modules to your local cache.
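As an aside, you can predict where `mvn install` will place an artefact in the local cache, because the coordinates map directly to a directory path. A minimal sketch, using made-up coordinates (com.example:my-lib:1.0.0 is a placeholder, not from any real project):

```shell
# Sketch: map hypothetical Maven coordinates to their local-cache location.
groupId=com.example
artifactId=my-lib
version=1.0.0

# Dots in the group id become directory separators under ~/.m2/repository.
path="$HOME/.m2/repository/$(echo "$groupId" | tr '.' '/')/$artifactId/$version/$artifactId-$version.jar"
echo "$path"
```

This is handy when you want to verify by hand that an install actually landed in the cache.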

2. The dependency may or may not be another project of yours. If it is your project and it is not in a Maven structure, then building or installing the project will have no value. The only thing you do have is the built library (jar). You can install this library manually into your local cache by running:
 mvn install:install-file -Dfile=<path-to-file> -DgroupId=<group-id> -DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>  

You can then add the dependency details to your Maven POM (Project Object Model) file.
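The POM entry must use the same coordinates that you passed to install:install-file. A sketch with made-up placeholder coordinates:

```xml
<!-- Placeholder coordinates: match these to the -DgroupId, -DartifactId and
     -Dversion values you passed to install:install-file -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-legacy-lib</artifactId>
  <version>1.0.0</version>
</dependency>
```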

What is the Maven Central Repository? 

The Maven Central Repository is a Maven artefact repository hosted on the Internet that contains most of the common or often-used Java libraries. In some cases a library is hosted on the Maven Central Repository itself, and in other cases Maven Central references other Maven artefact repositories that host the libraries themselves.

The above point is very interesting. I am saying that not only does a Maven artefact repository host libraries, but it can also reference other Maven artefact repositories which have the same capability. This is huge!!! I guess you can call it chaining.

To summarise, this is what happens step by step:

Step 1 - Look for the dependency in the local cache. If found, complete the build process; if not, go to step 2.
Step 2 - Look for the dependency in the Maven Central Repository. If found, download it to the local cache; if not, go to step 3.
Step 3 - Look for the dependency in the remote repository or repositories. If found, download it to the local cache; otherwise Maven, as expected, stops processing and throws an error (Unable to find dependency).

Group Development using Maven

If you have a clear understanding of the above, you will by now have gathered why this setup will not work for group development. Okay, it can work to some degree, but it will be a mission!

The problem with the above setup is that every developer in the group would need to manually add libraries into their local caches when required. This can become tedious, error-prone and amateurish.

What can we do to solve this?

The answer is to install a Maven artefact repository for the organisation. I recommend using Sonatype Nexus. It is a dead simple installation.

How to Install Nexus Artefact Repository

  • Download the latest war file
  • Deploy it on a Tomcat instance
  • Make sure that it can connect to the internet. You can check this by selecting Repositories and then checking that the Repository status is "In Service".

  • You might have to configure proxy settings if you connect to the internet via a proxy server. The default username is admin and the password is admin123. You can configure the proxy under the Administration --> Server settings.

  • You will also need to enable a "Deployment" user. This user should have the ability to deploy or install artefacts to the Artefact Repository. Take notice of the roles applied here.

  • The next step is to get Maven to reference your newly configured Artefact Repository.
You can achieve this by configuring the "settings.xml" file. You can place this file in the MAVEN_HOME/conf directory or in the .m2 directory. Maven reads both locations if present; the user-level "settings.xml" in .m2 takes precedence over the global one in MAVEN_HOME/conf.

Getting Maven to call your own Artefact Repository

Use this as a template and customise where necessary (you only need to change the deployment username & password and the host & port for this template to work):
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns=""
          xmlns:xsi=""
          xsi:schemaLocation="">
  <pluginGroups>
  </pluginGroups>
  <proxies>
  </proxies>
  <!-- This is used when deploying or publishing to the Artefact Repository.
       The user must exist and must have deploy rights -->
  <servers>
    <server>
      <id>snapshots</id>
      <username>deployment</username>
      <password>password</password>
    </server>
    <server>
      <id>releases</id>
      <username>deployment</username>
      <password>password</password>
    </server>
    <server>
      <id>milestones</id>
      <username>deployment</username>
      <password>password</password>
    </server>
    <server>
      <id>thirdparty</id>
      <username>deployment</username>
      <password>password</password>
    </server>
  </servers>
  <mirrors>
    <mirror>
      <id>public</id>
      <mirrorOf>*</mirrorOf>
      <name>Public Repositories</name>
      <url>http://host:port/nexus/content/groups/public</url>
    </mirror>
  </mirrors>
  <profiles>
    <profile>
      <id>custom-repository</id>
      <repositories>
        <repository>
          <id>custom-repository-group</id>
          <name>Custom Maven Repository Group</name>
          <url>http://host:port/nexus/content/groups/public</url>
          <layout>default</layout>
          <releases>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </releases>
          <snapshots>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>custom-repository-repository-group</id>
          <name>Custom Maven Repository Group</name>
          <url>http://host:port/nexus/content/groups/public</url>
          <layout>default</layout>
          <releases>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </releases>
          <snapshots>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </snapshots>
        </pluginRepository>
        <pluginRepository>
          <id>public</id>
          <url>http://host:port/nexus/content/groups/public</url>
          <snapshots>
            <enabled>true</enabled>
          </snapshots>
          <releases>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
          </releases>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>custom-repository</activeProfile>
  </activeProfiles>
</settings>

If you require further clarity, the official Maven site has more information.

Once you have all this in place, you have modified the process mentioned above to the following:
Step 1 - Look for the dependency in the local cache. If found, complete the build process; if not, go to step 2.
Step 2 - Look for the dependency in the custom Maven repository. If found, download it to the local cache; if not, go to step 3.
Step 3 - Look for the dependency in the remote repository or repositories. If found, download it to the local cache; otherwise Maven, as expected, stops processing and throws an error (Unable to find dependency).
This looks almost the same, barring the fact that we are now referencing our own artefact repository instead of the Maven Central repository.

What if you wish to download an artefact that is not referenced by the Maven Central Repository or your own repository?

Consider these scenarios:
  • Your organisation is huge and you require a library from another section or department that hosts their own artefact repository.
  • You require a library from a Maven Artefact Repository hosted online.
Just add the external repository as another proxy repository on Nexus.

What are the advantages of hosting your own Artefact Repository?

  • Saving bandwidth (an artefact is downloaded from the internet only once; when it is requested again, the call does not go over the internet)
  • Control over libraries used
  • Building a standard for all Developers
  • Storing Organisation specific Milestones and Releases

Why do I need to configure a Deployment user on Nexus?

This is to store your own custom, organisation-specific Milestones and Releases (libraries). You can upload these custom libraries in the following ways:
  1. Using the Maven Release plugin when creating a Release or Milestone. Check out the Maven Release plugin tutorial below.
  2. Uploading the artefact manually via the Nexus front end.


I strongly believe that Maven and an artefact repository go together like toothpaste and a toothbrush. You can use them separately, but to get the most benefit you have to combine them, even if you are flying solo.

If you are adding dependency libraries to your project in any way other than specifying them in your POM file, you are using Maven in the wrong way!

As always, I would love to hear some of your comments or questions. I can also provide my services if you require.


Tuesday, November 5, 2013

Creating a Release using the Maven Release plugin

What is a Release

I spent some time tutoring a junior on configuring the Maven Release plugin to create Milestone and Release artefacts. We were not making much headway until I took a few steps back and discussed what exactly a release process is.

Here is a simple Release process. I am pretty sure that if you understand this basic process, you will be able to extend and modify this process for your organisation's specific requirements:
  1. The Latest set of code exists in a Source Code repository, has been tested successfully and is ready to be promoted.
  2. We need to Tag the repository so that we have a way of knowing that something special happened at this point in time. A tag is simply a Snapshot of the code at a particular point in time. This tag gives us a reference point so that we can compare code before and after the release.
  3. We build or create our deployment artefact(s).
  4. We store our artefact(s) in a secure location so that it can be retrieved at a later date for rollback or auditing purposes.

What do I want the Maven Release Plugin to do for me

The Maven Release Plugin is by no means a one-trick pony. It has the ability to build code, tag code, change version numbers, branch code, roll back what it has done and stage artefacts. If you really want to know what it does, please visit the plugin's official documentation.

The cool part of this plugin is that you do not have to use all of these features. You can choose what suits your release process and use only that.

I want the Maven Release to automate this process:
  • Build the Code
  • Tag the code
  • Change the code Version from a Snapshot to a Milestone or Release version
  • Build an Artefact
  • Change the code Version from a Milestone or Release version back to a Snapshot version
  • Deploy the Artefact to a Maven Repository

The 2 Parts to configure the Maven Release Plugin

There are two parts to focus on in order to get the Maven Release Plugin to achieve the goals mentioned above:
  1. Configuration in the POM file
  2. How to Run the Plugin

Configuration in the POM file

Make sure that every POM file in the Project has a Version, Artefact ID & Group ID Tag.

The Parent or Main POM file needs to have these 3 settings configured:

1.) The SCM tag must be configured.

The syntax changes slightly between source code repositories. Fill it out with caution and consult the Maven user guide, as this can throw a very cryptic error. This section is used by the plugin to connect to, and tag the code in, the repository.
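For a Git repository, a hedged example of the SCM tag might look like this (the host and project names are placeholders; the scm:git: prefix selects the SCM provider):

```xml
<!-- Placeholder URLs: replace with your own repository location -->
<scm>
  <connection>scm:git:git://yourhost/yourproject.git</connection>
  <developerConnection>scm:git:ssh://git@yourhost/yourproject.git</developerConnection>
  <url>http://yourhost/yourproject</url>
</scm>
```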

2.) The Distribution Management Tag Must be configured.

The Plugin deploys the generated artefact to the Artefact Repository configured in this Tag. 
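As a sketch, a distributionManagement section pointing at a Nexus instance could look like the following. The repository ids must match the <server> ids in your settings.xml (releases and snapshots in the template earlier) so that the deployment credentials are picked up; host and port are placeholders:

```xml
<distributionManagement>
  <repository>
    <id>releases</id> <!-- must match a <server> id in settings.xml -->
    <url>http://host:port/nexus/content/repositories/releases</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id> <!-- must match a <server> id in settings.xml -->
    <url>http://host:port/nexus/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>
```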

3.) Add the Maven Release Plugin to the POM.
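A minimal plugin entry might look like this (the version number is only an example from that era; use the latest available to you):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-release-plugin</artifactId>
      <version>2.4.1</version> <!-- example version; use the latest available -->
    </plugin>
  </plugins>
</build>
```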

How to Run the Plugin

 mvn -Dtag=<Release_Tag>  
 -Dproject.rel.<POM_1_Group_id>:<POM_1_Artifact_id>=<Release_version>  
 -<POM_1_Group_id>:<POM_1_Artifact_id>=<NextDevelopmentVersion>  
 -Dproject.rel.<POM_2_Group_id>:<POM_2_Artifact_id>=<Release_version>  
 -<POM_2_Group_id>:<POM_2_Artifact_id>=<NextDevelopmentVersion>  
  .  
  .  
  .  
 -Dproject.rel.<POM_n_Group_id>:<POM_n_Artifact_id>=<Release_version>  
 -<POM_n_Group_id>:<POM_n_Artifact_id>=<NextDevelopmentVersion>  
 release:clean  
 release:prepare  
 release:perform  
(The Script to Execute this Plugin should be run on a single line. I am only adding line breaks for illustrative purposes.)

Let us try and understand each part:

 mvn -Dtag=<Release_Tag>  
The general way to execute Maven, together with the message with which we want to tag the code repository.

 -Dproject.rel.<Group_id>:<Artifact_id>=<Release_version> -<Group_id>:<Artifact_id>=<NextDevelopmentVersion>  
This needs to be configured for each POM file in the project. In essence, each module is built with the relevant Release version, and the next Development version is set and committed to the repository. The output artefact of each module is deployed to the artefact repository.

 release:clean  
This plugin creates temporary files during its execution. The temporary files keep the previous state or version of the POM files, and are used if or when a rollback is executed. The clean deletes these files.

 release:prepare  
This is the heart of the plugin. It is responsible for changing the versions, tagging the code and committing to the code repository.

 release:perform  
This deploys the generated artefact to the artefact repository.

Suggestions and Hurdles to Look out for

  • The errors or exceptions thrown by the plugin are very cryptic and often misleading.
  • Use the Maven -X option at the end of the command to run Maven in verbose (debug) mode.
  • First try to get the script working locally in interactive mode before automating or deploying the job to Jenkins or some other build server.
  • Omit the release:perform step until you get the script to run up to and including the release:prepare step.
  • When working with a Git repository, the plugin will commit and push the code.
  • Use the --batch-mode option to run in non-interactive mode.
  • Create a profile in the POM and use variables rather than hard-coding the distribution management tag. In this way you can use the plugin to create Milestones and Releases by merely changing the profile, which references the different locations in the artefact repository.
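The last suggestion above can be sketched like this: define the repository URL as a property in each profile and reference it from distributionManagement (ids, URLs and the property name are placeholders of my own choosing):

```xml
<!-- In the POM: one profile per target location in the artefact repository -->
<profiles>
  <profile>
    <id>milestone</id>
    <properties>
      <deploy.repo.url>http://host:port/nexus/content/repositories/milestones</deploy.repo.url>
    </properties>
  </profile>
  <profile>
    <id>release</id>
    <properties>
      <deploy.repo.url>http://host:port/nexus/content/repositories/releases</deploy.repo.url>
    </properties>
  </profile>
</profiles>

<distributionManagement>
  <repository>
    <id>releases</id> <!-- must match a <server> id in settings.xml -->
    <url>${deploy.repo.url}</url>
  </repository>
</distributionManagement>
```

Activating the plugin with -Pmilestone or -Prelease then steers the artefact to the matching location without editing the POM.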



The Maven Release plugin is often seen as a pain and very difficult to use, but if you do manage to get it working, I guarantee that you will save yourself hours. Follow my instructions and tips, and please contact me if you feel that I have missed something important. I will gladly add it in.

As always; please send through your comments, suggestions or questions and I will try to address them. I can also assist you if you require technical assistance from my side in developing scripts for your organisation.

Tuesday, October 29, 2013

Creating Multiple Server Instances on JBoss 7

Problem Space

If you are a typical Java developer, chances are that you are working on more than one project at a time. Even if you are lucky enough to be working on only one project, you may find situations where you need to work on 2 different versions of the code at the same time (fixing current production issues while developing new functionality).

If you are in a support role, you should have testing environments, each hosting multiple versions of the code (DEV, SIT, QA or Production) or even multiple projects independent of each other.

The point that I am trying to make is that in almost every situation there is a need for more than one application server instance on a machine. You can get away with a single instance, but then you would spend an enormous amount of time reconfiguring your server instance when developing or testing code.

You could also get away with "cloning" the complete product into multiple directories, but this is very messy, amateurish and tough to maintain.

Application Servers Today

Application servers are built to address the issues mentioned above. There are basically 2 mechanisms which are used:
  1. The application server is contained as a package. You can use a wizard or some script that will assist you in creating a new server instance. The newly created server instance can exist anywhere on the machine but it will reference libraries and scripts in the base package. 
  2. Other application servers are shipped as a package which contains a few sample server instances. You can create a new server instance by simply copying one of the sample server instances, renaming it and pasting it in the same directory or location as the other server instances. 
JBoss 7 falls in the second category mentioned above. The sample server instances in previous versions of JBoss are "Default", "Minimal" & "All". In JBoss 7 the sample instance is called "standalone".

In previous versions of JBoss, the sample server instances("Default", "Minimal" & "All") represented the JBoss services that you required. In JBoss 7 this is addressed by multiple configuration files(standalone-full-ha.xml, standalone-full.xml, standalone-ha.xml & standalone.xml) all located in the configuration folder of the server instance. This is a separate topic on its own but for now, this is enough information to get you going.

How to create multiple server instances on JBoss 7

Cloning (copy & paste) the standalone directory is the quickest way to create a separate or new server instance. You would need to rename the newly created instance, as the operating system will not allow 2 files or directories with the same name in the same location. The directory name will be the server instance name.
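The cloning step itself is a one-liner from the JBoss home directory. A sketch, demonstrated here against a throwaway directory standing in for the JBoss home (the paths are placeholders):

```shell
# Throwaway directory standing in for the real JBoss 7 home directory.
JBOSS_HOME=$(mktemp -d)
mkdir -p "$JBOSS_HOME/standalone/configuration"

# Clone the sample instance; the new directory name becomes the instance name.
cp -r "$JBOSS_HOME/standalone" "$JBOSS_HOME/standalone_1"
ls "$JBOSS_HOME"
```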

You can then use this command in the bin directory of JBoss to run your newly created server instance:

 ./ -Djboss.server.base.dir=<Instance_Name>  

If you want JBoss to bind to your IP Address instead of localhost you would need to run the following command:

 ./ -Djboss.server.base.dir=<Instance_Name> -Djboss.bind.address=<your_ip_address>  

And one step further: if you need to run a specific configuration, let's say for example you want to use the JMS queues, you would run the following command, with Configuration_File being standalone-full.xml:

 ./ -Djboss.server.base.dir=<Instance_Name> -Djboss.bind.address=<your_ip_address> -c <Configuration_File>  

You would now theoretically have 2 JBoss 7 server instances, but they would not be able to run concurrently because of port conflicts (both instances use the same ports).

To address the above issue we can change the port offset.

Locate this line in the configuration file:

 <socket-binding-group name="standard-sockets" default-interface="public" port-offset="${jboss.socket.binding.port-offset:0}">  

Change the value of the port-offset to a positive number. All ports will be offset by this value. As an example, if we were to set this port-offset value to 10:

  • The http port would be 8090 instead of 8080
  • The ajp port would be 8019 instead of 8009
  • The https port would be 8453 instead of 8443

I trust that you understand this point. I think that the engineers at JBoss have done a great job with this mechanism. I remember having to update ports all over the show in a previous version, and it was a nightmare to maintain. Or maybe I was just not well informed.

My Solution to a Robust Distribution

I suggest that you start off with a clean or new installation.

1.) Copy the standalone directory and clone it 8 times. This means that you should have your original standalone directory and 8 other standalone directories. Name them as follows: standalone_1, standalone_2, ..., standalone_8.

2.) Set the port offset in:
standalone_1 to 100
standalone_2 to 200
standalone_3 to 300
...
standalone_8 to 800

3.) Create a scripts directory inside the bin directory. Create start and stop scripts for each of the standalone instances. You would want to dynamically get the IP address instead of hard-coding it.

A Windows batch script to start standalone instance 1 should look something like this:

 @echo off  
 for /f "tokens=1-2 delims=:" %%a in ('ipconfig^|find "IPv4"') do set ip=%%b  
 set address=%ip:~1%  
 set server=standalone_1  
 echo ****************************************  
 echo The server ip address is: %address%  
 echo The server instance is: %server%  
 echo ****************************************  
 cd ..  
 CALL standalone.bat -Djboss.server.base.dir=%server% -Djboss.bind.address=%address% -c standalone-full.xml  
 cd scripts  

A Unix bash script to start standalone instance 1 should look something like this:

 #!/bin/sh  
 address=`ifconfig | grep "inet " | grep -v | cut -f2 -d':' | cut -f1 -d' '`  
 server=standalone_1  
 echo "****************************************"  
 echo "The server ip address is: "$address  
 echo "The server instance is: "$server  
 echo "****************************************"  
 cd ..  
 ./ -Djboss.server.base.dir=$server -Djboss.bind.address=$address -c standalone-full.xml &  
 cd scripts  

The biggest issue that I had creating these scripts was that almost every operating system has a different way of retrieving the IP address dynamically. Here are some snippets to retrieve the IP address on various operating systems:

Linux ifconfig Example
 ifconfig | grep 'inet addr:' | grep -v '' | cut -d: -f2 | awk '{ print $1}'  

FreeBSD/OpenBSD ifconfig Example
 ifconfig | grep -E 'inet.[0-9]' | grep -v '' | awk '{ print $2}'  

Sun / Oracle Solaris Unix Example
 ifconfig -a | grep inet | grep -v '' | awk '{ print $2}'  

Mac OS X Mountain Lion Example
 ifconfig | grep -E 'inet.[0-9]' | grep -v '' | grep -i 'broadcast' | awk '{ print $2}'  
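If you want one snippet that behaves reasonably across these systems, a hedged alternative is to normalise the ifconfig output first. The sample below is canned output standing in for a real `ifconfig` call, so the pipeline can be demonstrated without a network interface:

```shell
# Sketch: extract the first non-loopback IPv4 address from ifconfig-style output.
# $sample is canned text standing in for `ifconfig`; swap in the real command.
sample='lo0: inet netmask 0xff000000
en0: inet netmask 0xffffff00 broadcast'

address=$(printf '%s\n' "$sample" \
  | grep -Eo 'inet [0-9.]+' \
  | awk '{ print $2 }' \
  | grep -v '^127\.' \
  | head -n 1)

echo "$address"
```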

4.) Create stop Scripts for each of the standalone instances. 

A Windows batch script to stop standalone instance 1 should look something like this:

 @echo off  
 for /f "tokens=1-2 delims=:" %%a in ('ipconfig^|find "IPv4"') do set ip=%%b  
 set address=%ip:~1%  
 set port=10099  
 echo ****************************************  
 echo The server ip address is: %address%  
 echo The server port number is: %port%  
 echo ****************************************  
 cd ..  
 CALL jboss-cli.bat --connect --controller=%address%:%port% --command=:shutdown  
 cd scripts  

A Unix bash script to stop standalone instance 1 should look something like this:

 #!/bin/sh  
 address=`ifconfig | grep "inet " | grep -v | cut -f2 -d':' | cut -f1 -d' '`  
 port=10099  
 echo "****************************************"  
 echo "The server ip address is: "$address  
 echo "The server port number is: "$port  
 echo "****************************************"  
 cd ..  
 ./ --connect --controller=$address:$port --command=:shutdown  
 cd scripts  

5.) JBoss-as-7.1.1.Final is packaged with a bug in user creation. I addressed this by creating an administrative user on the standard standalone instance using the bin/ script, and then copying the file to the configuration folder of each of the newly created server instances. When you try to connect to the admin console and are prompted for the username and password, you can use the same credentials on every server instance.

6.) Do not modify or use the original standalone server instance. This way you can easily extend your installation. I use the original standalone server instance for troubleshooting purposes.


I had this problem myself and worked in a team of at least 10 developers who would eventually run into the same issues. Instead of wasting time and money, I created a company-specific distribution.

Contact me and I can assist you with a custom distribution for your organisation.

Again, I would love to hear your questions or comments!

Wednesday, October 23, 2013

Jenkins Jobs to Deploy Artefact to Amazon Cloud Server hosting JBoss Instance

Problem background:

We develop IT solutions for clients. Mainly Java applications running on a JEE application server and connecting to a database. Development and support is done at Head office. The clients are based throughout the world. We have a Continuous Integration process hosted and running at Head office.

Our Continuous Integration Process contains the following:

  • GIT Repository
  • Jenkins Build Server
  • Maven Build Tool (Maven projects)
  • Nexus Artefact Repository

Our clients host the production solutions on their own infrastructure. We configure the environments at project initiation time, and a typical deployment then consists only of deploying a new Java artefact (ear, war or jar).

We traditionally connect to our Client's server environments via a VPN connection but this time our client went for a Cloud Hosting Solution(Amazon).


My usual advice to colleagues attempting to build any Jenkins job is to first get the process working manually. If you are able to execute the process manually, then building a Jenkins job is very focused and you are able to break down your specific problem area. It is also a form of decoupling the problem.

When I tried to execute my process manually for this particular scenario, I had a few issues. I was provided with a security key file, a hostname and a username. I tried to use the ssh -i <path_to_key> <user>@<Cloud_server> command, but it kept prompting me for a password. I initially thought that it was looking for a passphrase for the security key, but I was assured that there was no passphrase. The problem turned out to be that the security keys were in .ppk format. Someone had converted the .pem file to .ppk format so that it could be loaded into PuTTY (a Telnet, ssh, ..., keygen utility). My problem was that I use a Mac with the native ssh client, which requires a .pem file and not a .ppk file.

I tried to convert the .ppk file back into .pem with no success. It was easier to ask the server administrator for the .pem file.

The Solutions available:

Open Up Ports to the Cloud server 

The first option available is getting ports opened on the Cloud server to the internet. The advantage of this approach is that we can create our deployment job on Jenkins using the JBoss CLI utility to deploy the artefact onto the application server. I usually just create a "free-style software project", use the "wget" command to download the artefact from Nexus, and run the JBoss command line utility to deploy the application. The other option is to use the Maven jboss-as plugin to do the deployment. This works well if you are deploying a Snapshot version built from a fresh Maven install. The only other snag you might encounter is configuring proxy settings if you connect to the internet via a proxy server.

The disadvantages are far too many in my opinion. Our internet connection is not that great, which means that the JBoss CLI utility can time out. The next problem is that we need to secure the JBoss server. Remember, everyone on the internet now has access to the administration ports on JBoss, which is a huge security risk. One way of countering this issue is to only open up access to these ports for specific public IP addresses. Not to forget, someone needs to keep account of which ports have been opened up and to which IP addresses they are accessible. Migrating your Jenkins server to an alternative IP address can easily turn into a nightmare.

The simplicity to this mechanism is that, you can build your Job in the same way as you would connect to any other server hosted in your own network.

SSH into the Server

The other option is to use the Publish over SSH Jenkins plugin.
Create a "free-style software project". Use the "wget" command to download the deployment artefact from the artefact repository (Nexus) into the job's workspace. Use the ssh plugin to copy the artefact over to the Cloud server. Once the artefact is on the Cloud server, use the same plugin to execute a script hosted on the Cloud server. This script needs to either copy the artefact into the JBoss server's deployment directory or execute the JBoss CLI utility to do the deployment. The Jenkins Publish over SSH plugin is capable of both connecting via ssh and securely copying files via scp.
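The script on the Cloud server can be as simple as dropping the artefact into JBoss's hot-deployment directory, which the deployment scanner watches. A minimal sketch, written as a function so it can be exercised against throwaway directories; the artefact path and the default deployment directory are my own assumptions, not from the actual setup:

```shell
#!/bin/sh
# Sketch of a deploy "script" hosted on the Cloud server.
# $1 = path to the uploaded artefact, $2 = JBoss deployments directory.
deploy() {
    artifact="$1"
    deploy_dir="${2:-/opt/jboss/standalone/deployments}"   # assumed default path
    if [ ! -f "$artifact" ]; then
        echo "Artefact not found: $artifact" >&2
        return 1        # negative result, so the Jenkins job can fail loudly
    fi
    cp "$artifact" "$deploy_dir/" || return 1
    echo "Deployed $(basename "$artifact") to $deploy_dir"
}

# Demonstrate against throwaway directories instead of a real JBoss install.
tmp=$(mktemp -d)
mkdir "$tmp/deployments"
echo 'fake war' > "$tmp/app.war"
deploy "$tmp/app.war" "$tmp/deployments"
```

Returning a non-zero exit code on failure is what lets the Jenkins job report an unsuccessful deployment, which addresses the disadvantage mentioned below.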

One disadvantage of this mechanism is that the script executed on the Cloud server must be able to return both positive and negative results, so that the job can report successful or unsuccessful execution. Another disadvantage is loading and maintaining the SSH keys.


I am pretty sure that if we had a perfect internet connection I would have had a much simpler life. Unfortunately this is IT, always have a plan and a backup plan!

This is my first Blog Posting and I would love to hear your opinions or questions.