Docker Tutorial Part 3 -> Setup and installation on Ubuntu

Installing Docker on Linux is as simple as installing any other Linux package. We don't need the whole Docker Toolbox to work with Docker on Linux.

In this blog post, I will be talking about installing the Community Edition (CE) of Docker.

To install Docker, you need the 64-bit version of one of these Ubuntu releases:

  • Xenial 16.04 (LTS)
  • Trusty 14.04 (LTS)
  • Yakkety 16.10

Uninstall older versions of Docker – Older versions of Docker were called docker or docker-engine. If you have these installed, uninstall them first; otherwise skip this part.

sudo apt-get remove docker docker-engine
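If you're not sure whether these older packages are present, a quick check (a minimal sketch using dpkg, which is part of every Ubuntu system):

```shell
# List any old Docker packages that are currently installed;
# prints nothing if neither package is present.
dpkg -l docker docker-engine 2>/dev/null | grep '^ii' || true
```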

Install Docker – You can install Docker in different ways, depending on your needs:

  1. Set up Docker's repositories and install from them – this makes installation and upgrades easy (recommended approach)
  2. Download the DEB package and install it manually, managing upgrades manually as well (preferred when there is no internet access)

Install using the repository – If you are setting up Docker for the first time on a new host machine, you need to set up the Docker repository. Afterwards, you can use the same repository for installation and updates.

  • Install packages to allow apt to use a repository over HTTPS:
sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
  • Add Docker’s official GPG key:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  • Verify that the key fingerprint is 9DC8 5822 9FC7 DD38 854A E2D8 8D81 803C 0EBF CD88:
sudo apt-key fingerprint 0EBFCD88

pub   4096R/0EBFCD88 2017-02-22
      Key fingerprint = 9DC8 5822 9FC7 DD38 854A  E2D8 8D81 803C 0EBF CD88
uid                  Docker Release (CE deb) <docker@docker.com>
sub   4096R/F273FCD8 2017-02-22
  • Use the command below to set up the stable repository:
sudo add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
   $(lsb_release -cs) \
   stable"
  • Update the apt package index:
sudo apt-get update
  • Install the latest version or a specific version of Docker with the commands below:
sudo apt-get install docker-ce             # latest version
sudo apt-get install docker-ce=<VERSION>   # specific version
  • Verify that Docker CE is installed correctly by running the sample hello-world Docker image:
sudo docker run hello-world
  • You should see something like the output below if your installation is successful and complete:

[Screenshot: hello-world output]
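For the specific-version install mentioned above, apt can list the versions available in the Docker repository; a sketch (the version string shown is only an example of the format, pick one from the madison output on your own system):

```shell
# Show every docker-ce version the repository offers for this release
apt-cache madison docker-ce

# Then pin one of the listed versions, e.g. (placeholder version shown):
sudo apt-get install docker-ce=17.03.1~ce-0~ubuntu-xenial
```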

If you face any issues during installation, please mention them in the comments section.

In the next blog post, we will learn Docker terminology and the different Docker components.

To know more in detail, for hands-on practice, and for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com

Docker Tutorial Part 2 -> Getting started with Docker: Setup and Installation on Windows

Now that we have a basic understanding of Docker technology, let's go ahead and do the installation. If you are still not aware of it, please go back and read my post Docker technology overview: How is it different from virtual machines, and then come back here.

Please make sure virtualization is enabled on your Windows system and follow the steps below to install Docker Toolbox on Windows:

  • Download Docker Toolbox from this link – Get Docker Toolbox for Windows
  • Docker Toolbox includes the following Docker tools (don't worry, we will cover each of them in upcoming blog posts):
  1. Docker CLI client for running Docker Engine to create images and containers
  2. Docker Machine for running Docker Engine commands from Windows terminal
  3. Docker Compose for running docker-compose command
  4. Kitematic – the Docker GUI, for interactive Docker operations
  5. Oracle VM VirtualBox
  6. Git MSYS-git UNIX tools
  • Because Docker Engine uses Linux-specific kernel features, we can't run Docker Engine natively on Windows. (So indirectly you will be creating containers inside a small Linux VM running in Oracle VirtualBox.) The newer Docker for Windows uses native virtualization and does not need VirtualBox to run Docker, but let's stick with Toolbox for now for learning purposes.
  • Install the executable you downloaded in the first step: double-click it and follow the installation instructions. Once you are done with the installation, you will see the icons below on your desktop:

[Screenshot: desktop icons after installation]

  • Click on the Docker Quickstart Terminal icon to launch the Toolbox terminal. If it asks for any permissions, press Yes. When it has started, you will see a terminal displaying a $ prompt.
  • Now type the command docker and you will see all the help options for Docker, as below:

Now you are good to go and can play around with Docker images and containers. Try the hello-world Docker image: it checks your installation and prints a success message if everything is correct. Type "docker run hello-world" in the terminal and hit Enter.
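Since Toolbox runs Docker inside a VirtualBox VM, the bundled docker-machine tool can be used to inspect that VM. A quick sketch of a few commands worth trying from the Toolbox terminal ("default" is the VM name Toolbox creates):

```shell
# "default" is the name of the Linux VM that Docker Toolbox creates
docker-machine ls           # list the machines docker-machine knows about
docker-machine ip default   # print the VM's IP address
docker-machine ssh default  # open a shell inside the VM itself
```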

In the next blog post, we will learn how to do the setup in a Linux environment 🙂

To know more in detail, for hands-on practice, and for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com

Docker Tutorial Part 1 -> Docker technology overview: How is it different from virtual machines

Before we blindly follow the Docker training program and start learning it, let's understand why we should learn it and why we need it – what this technology is and how it works.

What is Docker – All applications have their own dependencies, which include both software and hardware resources. Docker is an open-source platform for developers, QA, and others. It is a mechanism that helps isolate the dependencies of each application by packaging them into a single unit called a container. Containers are safe to use and easier to deploy compared to previous approaches.

How containers differ as a concept – Let's understand the difference with an analogy. Consider a virtual machine as a house and a container as an apartment.

Houses (virtual machines) are fully self-contained, with their own infrastructure: plumbing, electricity, water supply, and so on. The majority of houses have at least a bedroom, living area, bathroom, and kitchen. Even if I only wanted a house with a single room, I would end up buying more than I need.

Apartments (containers) are built around shared infrastructure. The apartment building (the Docker host) shares plumbing, electricity, water supply, etc. Apartments are also offered in different sizes to suit your needs, and you only pay for the services you actually use.

Also, the maintenance cost of a house will always be higher than that of an apartment.

So with containers, you share the underlying resources of the Docker host and use only the software you need to run your application.

With virtual machines it's just the opposite: you get a full operating system and the default programs that come with it.

Now that we understand the concept, let's get a little technical. Consider the building as the Docker host and the builder as the Docker engine in the explanation below.

Docker containers versus virtual machines – Virtual machines run a full OS with its own memory management and the overhead of virtual device drivers. In a virtual machine, valuable resources are emulated for the guest OS by the hypervisor, which makes it possible to run many instances of one or more operating systems in parallel on a single machine.

Docker containers, on the other hand, are executed by the Docker engine rather than a hypervisor. Containers are therefore smaller than virtual machines, start faster, perform better, and have great compatibility because they share the host's kernel. The architecture-level difference is shown below:

[Diagram: containers vs. virtual machines]
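One way to see this difference in practice, assuming Docker is already installed: a throwaway container starts in about a second, while a VM has to boot a whole OS. A sketch using the small Alpine image:

```shell
# Start a container, run one command, and remove it again;
# note how quickly this completes compared to booting a VM.
time docker run --rm alpine echo "hello from a container"

# The whole Alpine image is only a few megabytes:
docker images alpine
```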

So to optimize our SDLC and reduce the time spent on test-script execution and the overhead of maintaining execution/deployment environments, we should really go for container technology.

Now we know what Docker is and why we should use it. To know more in detail and for personal / corporate training, please reach out to – gauravtiwari91@yahoo.com


Triggering Remote Jenkins jobs from another Jenkins

Continuous integration and delivery is a crucial part of the software life cycle for automatic execution or deployment of code. Usually we have a single Jenkins instance for deployments, automation scripts, and so on, but often different teams like Ops, Dev, and QA create their own Jenkins instances for their own purposes.

Now, if we want to integrate everything in one place, managing and re-creating all the jobs is not feasible. The solution is to make the two Jenkins servers communicate with each other and trigger builds accordingly.

In this blog post, I will talk about how you can trigger JOB-A (on remote-jenkins) from JOB-B (on local-jenkins). A real scenario would be triggering an automation script on Jenkins1 after a successful code deployment on Jenkins2.

To understand this, let's assume a few things:

  • We have a local job-  Job-B (local-jenkins) on server local-jenkins:8080
  • We have a remote job – Job-A (remote-jenkins) on server – remote-jenkins:8080

Now we want to trigger Job-A from Job-B. To achieve this, we need to install the Parameterized Remote Trigger plugin on our local Jenkins (the instance from which we want to trigger the job – local-jenkins in this case).

Go to Manage Jenkins -> Configure System -> Parameterized Remote Trigger Configuration, and configure it as shown below:

[Screenshot: Parameterized Remote Trigger Configuration]

You can add many remote servers. Now you have to make the following changes in your local Jenkins job, i.e. Job-B:

[Screenshot: build steps]

[Screenshot: build info]

Now save the configuration of your job and build your local job, i.e. Job-B. The console of the local job looks like below:

[Screenshot: local job console]

The console of the remote job looks like below:

[Screenshot: remote job console]

You can see it says "started by local-Jenkins", so the job on the remote Jenkins was indeed triggered from the local Jenkins.
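As a side note, if you only need a simple trigger, the same effect can be achieved without the plugin by calling Jenkins' remote build API from a build step; a minimal curl sketch (server URL, job name, user, token, and parameter below are placeholders):

```shell
# Trigger Job-A on the remote Jenkins via its remote build API
curl -X POST "http://remote-jenkins:8080/job/Job-A/build" \
     --user "jenkins-user:api-token"

# For a parameterized job, use buildWithParameters instead
curl -X POST "http://remote-jenkins:8080/job/Job-A/buildWithParameters" \
     --user "jenkins-user:api-token" \
     --data "BRANCH=master"
```

The plugin remains the better choice when you need build-result feedback in the local job, since a plain curl call is fire-and-forget.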

Similarly, you can link multiple jobs to be built across different Jenkins servers. I hope you can now easily integrate multiple Jenkins instances. Please add comments in case of any issues. 🙂


Docker Training Program – [Build, Ship, and Run Any App, Anywhere]

In today's automation-driven industry, we are quite advanced at automating our test cases, deployments, etc., but automating infrastructure and environment setup is still a pain. We have all seen situations where something works on one machine but not on another. Sometimes a QA engineer files an OS-specific defect that the developer can no longer reproduce. The solution to all these problems is one single thing – DOCKER.

I have recently started working on Docker and have used it very effectively in automation testing and DevOps, especially for setting up execution environments. The following are the major benefits of Docker:

  • Build, ship, and run any app / automation script, anywhere
  • Setup of execution environment for dev/testing is a matter of seconds
  • Docker Hub – a cloud registry of Docker images that provides an image for almost every piece of software you might need; you can also push your own images and use them from anywhere
  • Continuous integration and fast deployment
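The Docker Hub point above in practice: pulling a public image and pushing your own looks roughly like this (the repository name myuser/myimage is a placeholder for your own Docker Hub account):

```shell
# Pull a public image from Docker Hub
docker pull ubuntu:16.04

# Tag it under your own repository name and push it
# (requires `docker login` first)
docker tag ubuntu:16.04 myuser/myimage:latest
docker push myuser/myimage:latest
```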

Going forward, I will be going through the following topics:

  1. Introduction to Docker container technology, how is it different from virtual machines
  2. Installing and Setting up Docker on Windows
  3. Installing and Setting up Docker on Linux (Ubuntu)
  4. Understanding Docker components: docker-machine, Dockerfile, Images and Containers
  5. Hooking your local source code in to container
  6. Understanding major docker commands and shortcuts
  7. Executing your local selenium test inside the container
  8. How to use Selenium-Grid with docker
  9. Building custom images from dockerfile
  10. How to minimize the size of your docker images
  11. Managing your containers with docker compose
  12. How to scale your execution environment with docker and multi-threading
  13. Using docker containers as Jenkins slaves
  14. Docker on AWS

I will be writing about each topic mentioned above and will keep adding new topics to the list. Stay tuned to this post for all Docker-related stuff.

You can reach out to gauravtiwari91@yahoo.com for more details and for personal training with live projects.