Testing Automation with Jenkins as Docker Container on Amazon ECS

Reading Time: 5 minutes

This article was authored by Supriya Sadhwani, a PGP – Cloud Computing alumnus from Great Learning.

Over the last few years, Continuous Integration and Continuous Delivery (CI/CD) have gained a lot of traction. Automating testing within CI/CD ensures that as soon as developers check in code, automated tests run on it and any defects can be fixed proactively by the development team. Jenkins is the leading open-source automation server and the one most widely used by organizations to automate their software development processes. It provides hundreds of plugins to support building, deploying, and automating any project, including many plugins that integrate with AWS. As Jenkins has been around for quite some time and is still used by almost 70% of organizations practicing CI/CD automation, this project leverages AWS capabilities to simplify Jenkins configuration and management.

To set up Jenkins, an organization generally has to provision machines on premises and set up Jenkins on each of them. A typical Jenkins setup contains a master instance and one or more worker agents that run the whole time. In that case, there are either too many agents (driving up costs) or too few (leading to wait times), and overloaded agents sometimes crash. These agents also accumulate many different tools over time and become unreliable. Whenever the configuration changes, each machine has to be checked and updated. Even if Jenkins is set up on the cloud, organizations still need to provision various EC2 instances with Jenkins deployed on them, then configure and manage each one.

Our Capstone Project, done as part of the Post Graduate Program in Cloud Computing at Great Learning, aimed to resolve these issues by automating testing in the CI/CD pipeline using Jenkins on AWS ECS. As the solution is built around Jenkins, the most widely used automation tool, any organization can easily transition its current Jenkins setup to this one. AWS CloudFormation provisions the entire AWS infrastructure, so an organization can migrate to this solution in a matter of a few clicks. Amazon Elastic File System (EFS) provides parallel shared access to all the Jenkins master instances in the ECS cluster. Amazon Simple Storage Service (S3, which is designed for 99.99% availability) stores the test results rather than keeping them on premises or on a drive inside any Jenkins instance. Amazon Simple Notification Service (SNS) sends email notifications containing test results to the testing teams immediately after a code check-in. Amazon QuickSight, AWS's business intelligence service, generates analytical dashboards where managers can gain insight into test executions and code quality. Thus, various AWS services have been effectively leveraged as part of this solution.


Our project leverages the power of containers – containers are portable, and provisioning a container is much faster than provisioning a virtual machine. It uses Amazon ECS – a highly scalable, high-performance container management service – to run a cluster of EC2 instances as Jenkins masters in an auto-scalable fashion (using the Amazon Elastic Container Service plugin for Jenkins). Each Jenkins build is executed on a dedicated slave Docker container that is automatically wiped out at the end of the build. We configured the slave containers to use our custom Docker image, cloudbees_jnlpslave_with_chromedriver, built on top of cloudbees/jnlp-slave-with-java-build-tools, the image generally used for ECS Jenkins slave agents. Since the CloudBees slave image does not ship with Chrome and ChromeDriver, our custom image adds support for the Chrome browser so that Selenium tests can run inside it.
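A minimal sketch of such a custom image could look like the following. The base tag, package sources, and ChromeDriver version here are illustrative assumptions, not the project's actual Dockerfile:

```dockerfile
# Extend the standard CloudBees JNLP slave image with Chrome support
FROM cloudbees/jnlp-slave-with-java-build-tools:latest

USER root

# Install Google Chrome (assuming a Debian-based base image)
RUN wget -q -O - https://dl.google.com/linux/linux_signing_key.pub | apt-key add - \
    && echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" \
       > /etc/apt/sources.list.d/google-chrome.list \
    && apt-get update \
    && apt-get install -y google-chrome-stable unzip \
    && rm -rf /var/lib/apt/lists/*

# Install a ChromeDriver build that matches the installed Chrome version
RUN wget -q https://chromedriver.storage.googleapis.com/2.46/chromedriver_linux64.zip \
    && unzip chromedriver_linux64.zip -d /usr/local/bin \
    && rm chromedriver_linux64.zip

# Drop back to the unprivileged user the JNLP agent runs as
USER jenkins
```

The image is then pushed to a registry (for example, Amazon ECR) and referenced in the ECS plugin's agent template, so every build gets a fresh container with Chrome already available.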

Jenkins stores master node configuration in the $JENKINS_HOME directory rather than in a database. $JENKINS_HOME has therefore been mapped to EFS, so it is accessible to multiple Jenkins servers and need not be replicated on each instance. If the Auto Scaling Group scales out and a new master comes up, it already has all the Jenkins configuration, required plugins, job definitions, and so on, because it mounts the same EFS drive.

A CloudFormation template builds the entire infrastructure – VPC, public and private subnets, NAT and Internet gateways, ECS cluster, ECS service and tasks, RDS, EFS, EC2 instances, Elastic Load Balancer, Auto Scaling Group and Launch Configuration, IAM roles, and security groups. It accepts the following parameters:

Stack Name – used to name the resources that are created
AZ1 and AZ2 – Availability Zones where the Jenkins masters will be created
DB Username – an RDS MySQL DB instance will be created with this username
DB Password – password of the RDS MySQL DB
KeyPair – the .pem key pair used to SSH into the Jenkins masters
PublicAccessCIDR – the CIDR block granted SSH (port 22) access to the Jenkins masters; this can be set to your own IP address to restrict access to the EC2 instances
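As an illustration, the whole stack can then be launched with a single AWS CLI call. The template filename and parameter key spellings below are assumptions for this sketch; the actual template defines its own keys:

```shell
# Launch the Jenkins-on-ECS stack (names and values are illustrative)
aws cloudformation create-stack \
  --stack-name jenkins-ecs \
  --template-body file://jenkins-ecs.yaml \
  --capabilities CAPABILITY_NAMED_IAM \
  --parameters \
    ParameterKey=AZ1,ParameterValue=us-east-1a \
    ParameterKey=AZ2,ParameterValue=us-east-1b \
    ParameterKey=DBUsername,ParameterValue=jenkinsadmin \
    ParameterKey=DBPassword,ParameterValue=ChangeMe123 \
    ParameterKey=KeyPair,ParameterValue=my-keypair \
    ParameterKey=PublicAccessCIDR,ParameterValue=203.0.113.0/32
```

Tearing the environment down is the reverse single call (aws cloudformation delete-stack), which is what makes the migration and cleanup "a matter of a few clicks".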


The GitHub repository contains the Maven test automation framework code, written in Selenium with Java, holding the test cases for functional testing of the application. Any change in the GitHub test code repository triggers the Jenkins build process, which is handled by the ECS Jenkins cluster. GitHub webhooks trigger a build in Jenkins whenever a developer commits to the repository (using the GitHub Integration Plugin for Jenkins). Different masters on the Jenkins ECS cluster can be configured to handle commits on different branches of the repository. The webhook URL on the GitHub repository is the URL of our Application Load Balancer, which sits in front of the ECS cluster (a Route 53 domain name can be used here). The load balancer then directs the build to one of the Jenkins masters, where a dedicated slave container comes up and runs the build.

Once all the test scenarios have executed as part of the build, Jenkins creates test reports in two formats. Cucumber JSON test results are stored in the S3 bucket json-report (using the S3 publisher plugin for Jenkins), and HTML report artefacts (HTML, CSS, images, etc.) are stored in the S3 bucket html-cucumber-reports (using the Post Build Task plugin for Jenkins and AWS CLI commands to copy the artefacts to S3). Jenkins then uses SNS email notifications to update the testing and development teams on the build status (using the Amazon SNS Build Notifier plugin for Jenkins). The email to the testing team contains the build result along with a link to the Cucumber HTML report, served from the html-cucumber-reports bucket, showing the passed and failed test results.

A Lambda function is configured to be triggered by Put events on the json-report bucket. As soon as a build finishes and the JSON file lands in json-report, the Lambda function kicks in: it parses the JSON file and stores the required fields in a Multi-AZ RDS MySQL database. A QuickSight dashboard is integrated with this RDS database, where custom dashboards have been configured for management. These analytical dashboards can be reviewed on a daily or weekly basis to gain insight into overall build success/failure rates, and thus code quality.
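The core of such a Lambda is the report-parsing step. The sketch below is a simplified, dependency-free version: the step fields follow the standard Cucumber JSON layout, but the exact columns stored in RDS are assumptions, and the boto3 fetch and MySQL insert are indicated only in comments to keep the example self-contained:

```python
def parse_cucumber_report(report):
    """Tally step results from a parsed Cucumber JSON report.

    `report` is a list of features; each feature has scenario
    "elements", and each scenario has "steps" carrying a result
    status. Returns a summary dict suitable for an RDS insert.
    """
    passed = failed = skipped = 0
    for feature in report:
        for scenario in feature.get("elements", []):
            for step in scenario.get("steps", []):
                status = step.get("result", {}).get("status")
                if status == "passed":
                    passed += 1
                elif status == "failed":
                    failed += 1
                else:
                    skipped += 1
    return {
        "total_steps": passed + failed + skipped,
        "passed": passed,
        "failed": failed,
        "skipped": skipped,
        "build_status": "SUCCESS" if failed == 0 else "FAILURE",
    }

# In the real handler, the S3 Put event names the bucket and key;
# the object would be fetched with boto3, json.loads()-ed, passed
# through parse_cucumber_report(), and the summary row inserted
# into the MySQL table that QuickSight reads from.
```

Because the function is pure, it can be unit-tested locally with a small sample report before being wired into the Lambda deployment package.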

Thus, in our Capstone Project, we have demonstrated automation of testing in the CI/CD pipeline using Jenkins on AWS ECS and several AWS managed services. With this robust CI/CD automation architecture, organizations can focus more on their business processes, ensuring faster development cycles with greater success rates.

Weekly AI News Roundup: 17th April

Reading Time: 1 minute

Here is our weekly round-up of essential Artificial Intelligence news from around the globe.

China battles the US in the artificial intelligence arms race: China seems to have an advantage in the global AI race due to better implementation, while the US seems to be depending on innovation to maintain its hegemony.

Artificial Intelligence Powering Boom in Israel’s Digital Health Sector: Israel’s troves of electronic medical records are being leveraged by digital health startups for early detection of diseases and producing accurate medical diagnoses.

CBSE plans artificial intelligence course in Class IX syllabus: The Central Board of Secondary Education is looking to include Artificial Intelligence as an optional subject for Class 9 students.

Artificial Intelligence to help Indian Navy to deal with different threats:  The Navy’s warships are being used to bolster Air Defence, and AI is being used to increase the accuracy of its offensive capabilities.

On-Device AI – The next wave of Artificial Intelligence: On-device AI signals a great proliferation of AI touchpoints into everyday consumer actions and is poised to drive the next stage of AI growth.

If you’re interested in learning AI to advance your professional career, take a look at our comprehensive AI program.


Transition to Data Analyst made easy- Utkarsh, PGP-DSE

Reading Time: 2 minutes

Utkarsh Nipane is a PGP-DSE alum and a recent engineering graduate. He talks about why he preferred Great Learning’s program over the others, and how he made the transition to a Data Science career. 

1. What is your work experience?

I am a 2018 engineering graduate in Information Technology from Mumbai University. I had worked as a freelance website developer and subject matter expert for NGOs and educational institutions, gaining one year of experience before joining the Great Learning data science course.

2. Why did you take up the Data Science Program from Great Learning?

In my last year of engineering, I came across the data science field, which led me to search for institutions offering Data Science programs. I was not in a financial position to invest a huge amount of money, so Great Learning was the obvious choice. Great Lakes' Data Science program stands out when it comes to financial investment and course duration. As a fresher, I wanted a jump-start into the world of analytics, and Great Learning's Data Science and Engineering program was an excellent platform for that.

3. What did you like the most about the program?

When it comes to GL, I liked the flexibility in teaching standards and the fact that the actual opinions of the students were considered. It's a student-oriented program that helped me build the mindset required in this field. The impeccable faculty, management, and program office contributed to building my personality and knowledge.

4. Did the program curriculum meet your needs?

The program curriculum was designed with the help of industry experts, so we were able to jump directly into industry applications. Career support is among the best you can find: they stick with you, motivate you, and scold you when needed. It was more like a family that took care of every possible problem I had regarding the course.

When it comes to data science, there are loads of commentators who talk a good game, but Great Learning helps you make it happen. They enabled me, coached me, and gave me the confidence to pursue what I believed in. My idea was to jump straight into the data science field rather than take the indirect route of spending two years at a random company in an irrelevant field – gathering the industry exposure in six months instead. I learned a lot from the diverse crowd in my classroom program; it is the best as long as you believe in it.

5. How did this program help you make the transition to Data Science?

The transition from my fresher life into the role of Data Analyst was quite smooth, thanks to the experienced faculty of Great Learning who shared their life experiences with the students. At Bewakoof Brands, I mostly analyze data, whether it's sales figures, market research, logistics, or transportation costs. It is my job to take that data and use it to help my company make better business decisions.

6. What is your advice to aspirants who want to take up this program?

While the data science field may seem attractive, you can't jump into it without dedication. It's quite a unique field where learning never stops, so prepare yourself well before making the decision. I would definitely recommend this course to anyone who is interested and dedicated, but you have to be thorough with all the concepts.

The Program Covered all the Topics asked in the Interview – Pugazhendhi

Reading Time: 2 minutes

Even modest support can accomplish a big dream. Pugazhendi, our PGP-DSE alumnus’ story starts with his parents supporting him to pursue data science. Read on to know more about his data science journey with Great Learning.

Why did you choose the PGP-DSE program by Great Learning?

I did my engineering in physics and nanotechnology. In my final year project, I had to work on data. It took me a lot of time to get the data, clean it, and work on it. At that time I had no clue about data engineering, but my seniors told me that there is something called data science and asked me to look into it. That is how I came to know about data science and I developed an interest in it. I was planning for higher studies but I felt that there are unexplored opportunities in the field of data science, so I took up the PGP-DSE program.

When I told my family that I wanted to take up this program, they had no idea about it. They knew about the IT courses like Java, so I explained to them the field and opportunities in it and they supported me after that to pursue the course.

Did you consider any other program before enrolling?

I had considered Imarticus, but I felt that the program lacked credibility in what it was promising, and it is more expensive than Great Learning's program.

Did the program curriculum prepare you for interviews?

I feel that the program had covered almost all the topics that were asked during the interviews. I knew about the things that were asked. I worked on a lot of different projects during the program. The mini project which I had worked on helped me a lot in the Mahindra and Mahindra Hackathon.

What did you like the most in the program?

The peer group is really good, we used to sit and discuss things. As we had people from different backgrounds, it helped us to exchange what we knew and learn together.

The faculty were very good, and they made it very simple for us to understand the concepts.

Did the placement support help you prepare for the interviews?

We had many companies coming in and we were groomed well before the placement process began. The mock interviews and CV reviews gave me an idea of what I was good at.

How was the Interview process?

There was a hackathon in Mahindra and Mahindra, which I had won. After that, there was a code review followed by multiple telephonic interviews. The course made it easy for us to explain the concepts and algorithms better. That is what impressed the interviewers and they offered the role. I have already started working on some interesting projects here.

The program didn’t require me to leave my job: Divya, PGP-BA Alumna

Reading Time: 1 minute

Divya Tandon, PGP – Business Analytics alum gives us a quick look at her Great Learning experience, in her own words.

I had worked in a market research company for 1.5 years as a research analyst before joining Great Lakes PGP-BA Program. My decision for choosing Great Lakes was influenced by 2 important reasons:

1. The Institute was ranked amongst the top programs for business analytics.

2. It did not require me to leave my current job.

The professors at Great Learning are extremely helpful and provided regular guidance when required. Even during the off-residency time, we could communicate with them about any doubts through the knowledge portal. The assignments were also designed based on real problems which further helped us strengthen our foundation in analytics.

This program helped me add the needed skills and tools to my resume for transitioning to business analytics, and it earned me interview calls from various companies. My enhanced knowledge and skills after completing the course also helped me clear the interview at my current company (Nepa AB) and allowed me to shift to core analytics. I would definitely recommend this course to anyone looking to make a shift to analytics without leaving their current job.


Weekly Data Science Round-up: March 27th, 2019

Reading Time: 1 minute

5 Ways to Check if Data Science is the Right Career Option for you: Do you love problem-solving? Do numbers excite you? Do you possess the patience of a saint? And lastly, do you love data?

Can Outsourcing Reduce the Jobs Shortage in Data Science? Outsourcing data work can help companies save 50% of their budget, as they do not have to bear office, equipment, and maintenance costs.

Why Python is More Relevant in 2019 Than Ever Before: Owing to its versatility, simplicity, flexibility, and a plethora of available libraries Python has the largest community of programmers.

Watch this IPL with a dash of Data Science: ESPNcricinfo will use data science, big data, and cricket intelligence, to produce a set of numbers that will help fans understand and appreciate the game better.

5 Crucial Data Science Skills to Learn in 2019:  Machine learning, statistics, quantitative analysis, mathematics, and programming languages are broad areas in which you’ll need to build expertise. If we were to delve a little deeper in what exactly these skills entail, you’ll come across these tools.

Weekly Artificial Intelligence Roundup: March 27th, 2019

Reading Time: 1 minute

The ethics of artificial intelligence: Teaching computers to be good people: As algorithms become more ‘intelligent’, researchers are working on new ways to regulate the extent to which AI is able to grasp and mimic human intelligence.

Spot and fix supply chain problems with artificial intelligence: Due to the colossal volume of data at the disposal of organisations, deriving insights from data can reduce the error percentage as close to zero as possible.

How Will AI Create More Jobs by 2025? Reports suggest about 9% of the workforce will be employed in roles which do not exist today. These new jobs will be powered by artificial intelligence and machine learning.

AI in Healthcare to Become a Billion Dollar Industry: The integration of AI in healthcare will automate many processes. In Europe alone, approximately 21% of healthcare facilities have a specific purchase plan for AI tools that is predicted to be executed in the upcoming years.

How Amazon is Dazzling the World with Artificial Intelligence: Amazon has been investing in AI for more than two decades now to predict and mould the customer experience. The current state of AI and machine learning promises an exciting future, and Amazon is undoubtedly at the forefront.

My impossible Transition to Analytics – Ashwaraj, PGP-BABI, Alumnus

Reading Time: 3 minutes

Analytics provides opportunities for people from a diverse set of professional backgrounds, and Ashwaraj is a shining example. As a mechanical engineer who was working in the Automotive and Education industries, he decided to make his mark with an Analytics career. Here’s how PGP-BABI helped him in his journey. 

Why did you choose the PGP-BABI program?

I come from a mechanical engineering background, and after graduation, I worked in the automotive industry for a while. After that, I worked in the education industry and then decided to shift into analytics. I was looking at the programs available in the market, and based on my peers' and friends' suggestions, I realised that the PGP-BABI program by Great Learning was one of the best programs I could have gone for.

Why data science?

This happened when I was in the education industry. I used to manage a territory where I had to deliver good business outcomes. Basically, the target was to get more business in less time, and practically, I couldn't find any feasible solution in front of me. The only thing I had access to was loads and loads of data. When I started crunching the data using Excel, I was unable to get faster outcomes, and I realised that I would need other tools to turn things around faster. Moreover, I also lacked the required knowledge. As I started digging deeper, I realised that I should go for formal training that could help me turn the data into a goldmine.

What did you like the most about the program?

PGP-BABI program is a perfect blend of both knowledge and skills. Every concept was followed by a practical assignment, case study, or a quiz. The program enabled me to think beyond the simple learning tools of the trade and inculcated the ability to think from a business perspective. Being taught and mentored by some of the best individuals in the industry and academics ensured that my knowledge and skill grew simultaneously. I can humbly say that without the guidance and help from Great Learning, my transition from Education to Analytics would have been impossible.

How did you land your current job?

When I was joining this course, I found a relevant contact in the automotive industry. I also spoke to a few professors who came to Hyderabad. There was a marketing professor who mentioned that Ford was looking for analysts and that I should try for it. When I was about to complete the course, one of the alumni with whom I had worked at Mahindra offered me a job at Ford, and the rest, as they say, is history.

How did you manage to learn with a full-time job?

It was a little challenging in the beginning because my job required me to be in the field most of the time. But after proper planning and talking to my company stakeholders, they were cooperative, as I was able to show the immediate benefits I was getting out of learning. I also became more organised as time passed. In a way, I learned time management as well with Great Learning.

What advice would you give to the aspirants?

I would say one should keep three things in mind. First, take the course very seriously, because the more serious you are about the program, the better the outcomes.

Second, keep an open mind while classes are happening, even online classes. Go through the recordings again to get a solid understanding of the content.

The third and the most important thing is that when you are given assignments and quizzes, do them deliberately and only for yourself, not for others. These three things will definitely help in churning out the best outcomes from the program.

Weekly Data Science Round-up: March 22, 2019

Reading Time: 1 minute

Top Data Science use cases in Gaming: Data science is making its benefits apparent in game design, monetisation, visual effect and much more.

The difference between Data Science and Machine Learning: Here’s how you spot the biggest differences between machine learning and data science.

How to negotiate your data scientist salary: Building the right skillset, having the right tone and demeanour can go a long way in getting the salary that you want.

Data Science generalists vs. specialists: A "T-shaped" data science professional, with broad general knowledge and specialisation in certain skills, is the most valuable.

5 Qualities to Look for When Hiring a Data Scientist: Companies look for candidates with statistical thinking chops and data intuition when hiring, among these other skills.