Docker Tutorial Part 3 -> Setup and installation on Ubuntu

Installing Docker on Linux is as simple as installing any other Linux package; we don't need the whole Docker Toolbox to work with Docker on Linux.

In this blog post, I will walk through installing the Community Edition (CE) of Docker.

To install Docker, you need the 64-bit version of one of the following Ubuntu releases:

  • Xenial 16.04 (LTS)
  • Trusty 14.04 (LTS)
  • Yakkety 16.10

Uninstall older versions of Docker – Older versions of Docker were called docker or docker-engine. If these are installed, uninstall them first; otherwise skip this step:

sudo apt-get remove docker docker-engine
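Before and after removing, you can quickly check which Docker packages are present (this works on any Debian-based system):

dpkg -l | grep -i docker

Note that images, containers, and volumes under /var/lib/docker are not removed by the command above, so any existing data is preserved.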

Install Docker – You can install Docker in different ways, depending on your needs:

  1. Set up Docker's repositories and install from them – this makes installation and upgrades easy (recommended approach)
  2. Download the DEB package, install it manually, and manage upgrades manually as well (preferred when the host has no internet access)

Install using the repository – If you are doing the setup for the first time on a new host machine, you need to set up the Docker repository. After that, you can use the same repository for installs and updates.

  • Install packages to allow apt to use a repository over HTTPS:
sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
  • Add Docker’s official GPG key:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  • Verify that the key fingerprint is 9DC8 5822 9FC7 DD38 854A E2D8 8D81 803C 0EBF CD88
sudo apt-key fingerprint 0EBFCD88

pub   4096R/0EBFCD88 2017-02-22
      Key fingerprint = 9DC8 5822 9FC7 DD38 854A  E2D8 8D81 803C 0EBF CD88
uid                  Docker Release (CE deb) <docker@docker.com>
sub   4096R/F273FCD8 2017-02-22
  • Use the command below to set up the stable repository:
sudo add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
   $(lsb_release -cs) \
   stable"
  • Update the apt package index:
sudo apt-get update
  • Install the latest or a specific version of Docker with the commands below:
sudo apt-get install docker-ce             # latest version
sudo apt-get install docker-ce=<VERSION>   # specific version
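If you are unsure which version string goes in place of the <VERSION> placeholder above, you can list the versions available in the repository first; the strings look something like 17.03.0~ce-0~ubuntu-xenial, varying by release:

apt-cache madison docker-ce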
  • Verify that Docker CE is installed correctly by running the sample hello-world Docker image:
sudo docker run hello-world
  • You should see output like the below if your installation is successful and complete:

[Image: output of the hello-world container on Linux]
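An optional post-installation step from Docker's Linux post-install instructions: to run docker commands without typing sudo every time, add your user to the docker group and then log out and back in. Keep in mind this effectively grants root-level privileges to that user.

sudo groupadd docker            # usually exists already; the command just complains if so
sudo usermod -aG docker $USER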

If you face any issues during installation, please mention them in the comments section.

In the next blog post, we will learn Docker terminology and the different Docker components.

To know more in detail, for hands-on practice, or for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com


Docker Tutorial Part 2 -> Getting started with Docker: Setup and Installation on Windows

Now that we have a basic understanding of Docker technology, let's go ahead and do the installation. If you are not yet familiar with it, please go back and read my post Docker technology overview: How is it different from virtual machines, then come back here.

Please make sure virtualization is enabled on your Windows system, then follow the steps below to install Docker Toolbox on Windows:

  • Download Docker Toolbox from the link – Get Docker Toolbox for Windows
  • Docker Toolbox includes the following Docker tools (don't worry, we will cover each of them in upcoming blog posts):
  1. Docker CLI client for running Docker Engine to create images and containers
  2. Docker Machine for provisioning Docker hosts and running docker-machine commands from the Windows terminal
  3. Docker Compose for running docker-compose command
  4. Kitematic (the Docker GUI, for interactive Docker operations)
  5. Oracle VM VirtualBox
  6. Git (MSYS-git UNIX tools)
  • Because Docker Engine uses Linux-specific kernel features, it can't run natively on Windows. (So indirectly you will be creating containers inside a small Linux VM running in Oracle VirtualBox.) The newer Docker for Windows uses native virtualization and does not need VirtualBox to run Docker, but let's stick with Toolbox for now for learning purposes.
  • Install the executable you downloaded in the first step: double-click it and follow the installation instructions. Once the installation is done, you will see the icons below on your desktop:

[Image: Docker Toolbox icons installed on the desktop]

  • Click on the Docker Quickstart Terminal icon to launch the Toolbox terminal. If it asks for any permissions, press Yes. When it has started, you will see a terminal displaying a $ prompt.
  • Now type the command docker and you will see all the help options for Docker, as below:

[Image: docker command help output]

Now you are good to go and can play around with Docker images and containers. Try the hello-world Docker image: it checks your installation and prints a success message if everything is correct. Type "docker run hello-world" in the terminal and hit Enter.
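Since Toolbox runs the Docker engine inside a VirtualBox VM (named "default" by default), a couple of Docker Machine commands are handy for checking that the VM is up:

docker-machine ls           # list the VMs and their state
docker-machine ip default   # IP of the default VM, useful for reaching exposed ports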

In the next blog post, we will learn how to do the setup in a Linux environment 🙂

To know more in detail, for hands-on practice, or for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com

Docker Tutorial Part 1 -> Docker technology overview: How is it different from virtual machines

Before we blindly follow the Docker training program and start learning it, let's understand why we should learn it, why we need it, what this technology is, and how it works.

What is Docker – All applications have their own dependencies, which include both software and hardware resources. Docker is an open-source platform for developers, QA, and others. It is a mechanism that isolates the dependencies of each application by packaging them into a single unit called a container. Containers are safe to use and easy to deploy compared to previous approaches.

How containers differ as a concept – Let's understand the difference with an analogy: consider a virtual machine as a house and a container as an apartment.

Houses (virtual machines) are fully self-contained, with their own infrastructure: plumbing, electricity, water supply, and so on. The majority of houses have at least a bedroom, living area, bathroom, and kitchen. Even if I only need a single room, I would end up buying more house than I need.

Apartments (containers) are built around shared infrastructure. The apartment building (the Docker host) shares plumbing, electricity, water supply, etc. Apartments are also offered in different sizes to suit your needs, and you pay only for the services you actually use.

The maintenance cost of a house will also always be higher than that of an apartment.

So with containers, you share the underlying resources of the Docker host and use only the software you need to run your application.

With virtual machines it is just the opposite: each one comes with a full operating system and the default programs that ship with it.

Now that we understand the concept, let's get a little technical. In the explanation below, think of the building as the Docker host and the builder as the Docker engine.

Docker containers versus virtual machines – A virtual machine runs a full OS with its own memory management and the overhead of virtual device drivers. In a virtual machine, hardware resources are emulated for the guest OS by the hypervisor, which makes it possible to run many instances of one or more operating systems in parallel on a single machine.

Docker containers, on the other hand, are executed by the Docker engine rather than a hypervisor. Containers are therefore smaller than virtual machines, start faster, perform better, and offer great compatibility because they share the host's kernel. The architecture-level difference looks like this:

[Image: containers vs. virtual machines architecture diagram]
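You can see the kernel sharing for yourself once Docker is installed: print the kernel version on the host and inside a minimal Alpine container, and both will report the same version, because the container has no kernel of its own.

uname -r                          # kernel version on the host
docker run --rm alpine uname -r   # the same version, reported from inside a container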

So to optimize our SDLC and reduce both the time spent in test-script execution and the overhead of maintaining execution/deployment environments, we should seriously consider container technology.

Now we know what Docker is and why we should use it. To know more in detail and for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com


Docker Training Program – [Build, Ship, and Run Any App, Anywhere]

In the automation-driven industry, we are quite advanced in automating our test cases, deployments, and so on, but automating your infrastructure and environment setup is still a pain. We have all seen situations where something works on one machine but not on another. Sometimes a QA files an OS-specific defect, but the developer is unable to reproduce it. The solution to all these problems is one single thing – DOCKER.

I have recently started working on Docker and have used it very effectively in automation testing and DevOps, especially for setting up execution environments. The major benefits of Docker are:

  • Build, ship, and run any app / automation script, anywhere
  • Setting up an execution environment for dev/testing is a matter of seconds
  • Docker Hub – a cloud registry of Docker images that provides an image for almost every piece of software you might need; you can also push your own images and use them from anywhere (see the short sketch after this list)
  • Continuous integration and fast deployment
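As a taste of the Docker Hub workflow mentioned above, here is a minimal sketch (the image name your-username/my-image is just a placeholder):

docker search nginx                  # find images on Docker Hub
docker pull nginx                    # download an image to your machine
docker push your-username/my-image   # publish your own image (requires docker login first)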

Going forward, I will be covering the following topics:

  1. Introduction to Docker container technology, how is it different from virtual machines
  2. Installing and Setting up Docker on Windows
  3. Installing and Setting up Docker on Linux (Ubuntu)
  4. Understanding Docker components: docker-machine, Dockerfile, Images and Containers
  5. Hooking your local source code in to container
  6. Understanding major docker commands and shortcuts
  7. Executing your local selenium test inside the container
  8. How to use Selenium-Grid with docker
  9. Building custom images from dockerfile
  10. How to minimize the size of your docker images
  11. Managing your containers with docker compose
  12. How to scale your execution environment with docker and multi-threading
  13. Using docker containers as Jenkins slaves
  14. Docker on AWS

I will be writing about each topic mentioned above, and I will also keep adding new topics to the list. Stay tuned to this post for all Docker-related stuff.

You can reach out to – gauravtiwari91@yahoo.com for more details and personal training with live projects.

Shifting left with DevOps and Continuous Integration

Adopting continuous delivery helps achieve rapid application development throughout the software application life cycle. It is a methodology, a mindset change, a shift-left approach, and a leadership practice to streamline manual processes and enforce consistency and repeatability in the software delivery pipeline. It is about enhancing collaboration and sharing metrics and processes across the Developer, QA, and Ops teams.

Read time – 10 minutes

In order to establish a continuous delivery environment, the most important requirement is an automated Continuous Integration (CI) system. The CI process covers all stages, right from a code commit to the version control system, which kicks off a build on the CI server to compile the code, run the tests, and finally package it. DevOps plays a major role in moving towards a defect-preventive approach.
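To make that concrete: for a Maven-based project (the stack used later in this post), a commit would typically kick off steps like the following on the CI server. This is only a sketch of the usual phases, not a prescribed setup:

mvn clean compile   # compile the code
mvn test            # run the tests
mvn package         # package the build artifact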

This blog post will demonstrate the steps and advantages of implementing a Continuous Integration system using Jenkins and a group of virtual machines. You can use your CI system to set up infrastructure automatically and execute the appropriate automated scripts whenever a new build or commit happens in the automation-script code.

Highlights

  1. Setting up a Jenkins Master machine
  2. Setting up Jenkins Slaves for distributed execution
  3. Creating new Jenkins jobs for new scripts
  4. Creating an execution pipeline to automate the build steps
  5. Scheduling script execution


  • Setting up a Jenkins Master machine – Installing Jenkins is very easy; it is just a matter of executing a jar file or an executable. Once it is up, it can be accessed from any machine on the network through a web browser. The Jenkins Master is responsible for redirecting all commands and executions to the Slave machines. For setup instructions, go through Setting up Jenkins in 5 minutes.
  • Setting up Jenkins Slaves for distributed execution – A Jenkins Slave can be a real machine, a virtual machine, or a Dockerized container that provides the capabilities of an operating system. A Jenkins Slave can also be set up by just executing a jar file and registering the machine as a slave against the Jenkins Master. When the Jenkins Master receives instructions, it processes them and decides which script will be executed on which Slave machine. So if multiple Slaves are connected, execution can be distributed, resulting in multi-processing and multi-threading of execution.
  • Creating new jobs for script execution – A job in Jenkins defines a series of actions for the successful execution of a script. First it fetches the latest script code from version control (e.g. Subversion), and then it builds that code on a selected Slave or on the Master (decided by the Jenkins Master). Once the code is built on the Slave machine, the scripts are started and executed. Usually, we use Selenium, Java, and Maven for the automated build process. Once a job is created, it has a web URL, which is helpful for sharing execution results, build info, etc.
  • Creating an execution pipeline – An execution pipeline is a visual representation of different job executions in sequence. This helps automate the whole process of script execution. Once all the Jenkins jobs are set up, you can decide the order of their execution based on the type of script, e.g. smoke, regression, release validation, web-services tests, etc. This creates a pipeline of execution with a single trigger point. That trigger is initiated whenever a new build / release happens on the platform, or whenever there is a need to execute it. Once a single job in the pipeline finishes, it automatically triggers the next job in the pipeline. The QA/developer keeps getting continuous feedback on the automated results, which helps us identify defects. The diagram below represents a pipeline of headless script executions, built using the Build Pipeline plugin.

[Image: Jenkins build pipeline view]

  • Scheduling script execution – A useful feature of Jenkins jobs is that they can be scheduled for future executions and can provide the results of nightly test-build script executions. This saves a lot of the time spent manually triggering and monitoring executions and enhances the automated script-execution process. There are two ways to do this:
  1. Jenkins plugin for scheduling – Build schedule plugin
  2. Select the 'Build periodically' option in the build triggers and define a cron expression: five space-separated fields that define the execution timing and cycle. For example, the image below shows a job that will execute at 22:00 (10:00 PM) daily.

[Image: Jenkins build schedule configuration]
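For reference, the daily 10:00 PM run shown in the screenshot corresponds to the following cron expression (minute hour day-of-month month day-of-week):

0 22 * * *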

I have set up this kind of infrastructure for automated test-script execution; the next step is to do the same at the build-deployment level. I will keep posting more stuff related to DevOps and continuous integration. Happy testing 🙂