Docker in Practice


Container technology, and Docker in particular, has become familiar to most of us in recent years. Docker is everywhere: tech blogs, tutorials, guidelines, and more. Adoption began with DevOps folks, who used containers and container orchestration to make systems scalable and maintainable while keeping management overhead minimal. However, Docker can be useful in every aspect of software design and development, and today we want to give an overview of how we use container technology across all of our teams and technical departments.


Overview

Basic introductions to Docker are easy to find, as it is very popular at the moment. A comparison between Docker and traditional virtualization technologies like VirtualBox or VMware is also worth reading to understand why fast-growing companies prefer to have everything containerized. In short, though:

Docker (and containers in general) speeds up environment setup and delivery pipelines by packaging dependencies and application binaries into a single, universal executable image that can run (mostly) anywhere.

Basic Usage

At InvestIdea Tech, as usual, we use Docker with very basic operations:

  • docker images, docker volume ls, docker ps, and so on, just to check the availability of local images, volumes, and running containers.
  • We rely on Docker Desktop in our local environments: a friendly user interface, quick installation, and a ready-to-run setup make it the fastest path to getting a project ready for development, especially when every project ships with a docker-compose file.
  • docker-compose.yaml is used to declare all the dependencies our engineers need before running an application, such as a Spring-based Java app. It can bring redis, postgres, mysql, or kafka online in just a few minutes without the hassle of complex configuration.
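As a rough illustration, a minimal docker-compose.yaml for local development of a Spring-based service might look like the sketch below. The service names, image tags, and credentials here are hypothetical placeholders, not our actual configuration:

```yaml
# Hypothetical example: local dependencies for a Spring development setup.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev
      POSTGRES_DB: app
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```

With a file like this in the project root, `docker compose up -d` brings everything online, and `docker compose down` tears it all down again.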

So our engineers do not actually need a deep understanding of containerization internals, just enough to help them do their work faster and much more efficiently.

Continuous Integration and Continuous Delivery

The more advanced use cases are in our CI/CD workflow. InvestIdea Tech has invested heavily in cloud-native infrastructure. In fact, all of our workloads run on AWS and are 100% based on the managed Kubernetes offering, EKS. Databases, storage, microservices, and so on are deployed to Kubernetes via customized Helm charts. That lets us adapt, upgrade, monitor, and replicate applications across environments as quickly as possible. And again, thanks to the resiliency of Kubernetes scheduling and the cluster-autoscaler, we can run a cost-effective cluster while maintaining the highest availability of the system, not just for our internal workloads but for our customers' as well.

The CI/CD workflow plays a significant role in our system. We run a highly available, auto-scalable GitLab Runner in our Kubernetes cluster. This reduces cost dramatically, because we don't need redundant resources when no CI/CD jobs are running and can allocate resources dynamically based on the needs of each kind of project. The runners and workflows are integrated with our monitoring and alerting toolchains as well as JIRA, our project management software. As the runners are Kubernetes-based, all jobs run in Docker containers with the required tools bundled into the corresponding images.
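To illustrate what "bundled tools inside the corresponding images" means in practice, a GitLab CI job on a Kubernetes runner simply names the image its commands should run in. The following fragment is a hedged sketch, with a hypothetical image tag and script, rather than one of our real pipelines:

```yaml
# Hypothetical .gitlab-ci.yml fragment: each job runs inside a Docker image
# pulled by the Kubernetes-based GitLab Runner.
stages:
  - test

unit-tests:
  stage: test
  image: maven:3.9-eclipse-temurin-21   # toolchain bundled in the image
  script:
    - mvn -B test
```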


Backend

Spring and Java apps, Node.js, and even PHP: each stack can tailor the Docker images it needs, which are used for:

  • Compiling and testing: images like maven, php-fpm, composer (the dependency manager for PHP and Laravel), and node with npm or yarn.
  • Bundling: we prefer not to use docker-dind (the Docker-in-Docker solution); our choice is kaniko, which builds OCI-compatible Docker images from a regular Dockerfile. Kaniko reduces the resources needed to run docker-build jobs, and in our experience it is faster than the regular docker build, docker tag, and docker push flow. The resulting image is pushed to our internal Docker registry with automatic credentials, already pre-configured via our Infrastructure as Code in Kubernetes.
  • Running: finally, the jdk image with the application's .jar file, already configured, runs in the Kubernetes cluster through an automated deployment pipeline, just like the node and php services.
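The runtime image for the Java case can be very small. The sketch below shows one plausible shape; the base image tag and jar path are assumptions for illustration, not our exact setup:

```dockerfile
# Hypothetical runtime Dockerfile for a Spring Boot application.
FROM eclipse-temurin:21-jre
WORKDIR /app
# The jar is produced by the compile-and-test stage of the pipeline.
COPY target/app.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```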

Frontend

Usage of CI/CD for the frontend is quite similar. React and Angular apps already support production builds, which produce plain HTML/CSS and other static assets such as JavaScript files and images. So far:

  • Compiling and testing: just use a node image with proper caching for the node_modules directory.
  • Bundling: kaniko builds Docker images from nginx plus the static site. For any site that requires SSR (server-side rendering), node is used instead.
  • Running: the rest is very much the same, thank you, Kubernetes!
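For the static-site case, the compile and bundle steps can be combined into one multi-stage Dockerfile. This is a hedged sketch under common assumptions (base image tags, an npm-based build, and a dist output directory), not our exact pipeline:

```dockerfile
# Hypothetical multi-stage build: compile with node, serve with nginx.
FROM node:20 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM nginx:1.27-alpine
# Copy the production build output into nginx's web root.
COPY --from=build /app/dist /usr/share/nginx/html
```

kaniko can build this Dockerfile inside a CI job exactly as it would any other, no Docker daemon required.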

Mobile

Mobile team can use Docker too! Don't be surprised. We have several ways to do it:

  • Native apps: we use a customized Docker image with the android-sdk bundled, along with gradle and the jdk. We can easily build apk and aab artifacts and test them in our CI/CD runners. However, we currently have no internal support for iOS; we still rely on third-party services like Travis CI or CircleCI, which offer paid solutions for iOS CI/CD.
  • Flutter apps: flutter-based Docker images are easy to find on the net. We use those very same images, customized a little to include extra tools like sonar-scanner so we can analyze code quality and metrics.
  • Even more, mobile teams and mobile development in general, can utilize Docker to run helper tools like Push Notification, or even the backend service by themselves to avoid depending on output from other teams. That will help a lot, I promise!
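As a hedged sketch of that Flutter-image customization, one might layer sonar-scanner on top of a public Flutter CI image roughly like this. The base image name, scanner version, and download URL are assumptions to verify against current releases, not our internal Dockerfile:

```dockerfile
# Hypothetical: add sonar-scanner on top of a public Flutter CI image.
FROM ghcr.io/cirruslabs/flutter:stable
ARG SCANNER_VERSION=5.0.1.3006
RUN curl -fsSL -o /tmp/scanner.zip \
      "https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-${SCANNER_VERSION}-linux.zip" \
 && unzip -q /tmp/scanner.zip -d /opt \
 && ln -s /opt/sonar-scanner-${SCANNER_VERSION}-linux/bin/sonar-scanner /usr/local/bin/sonar-scanner \
 && rm /tmp/scanner.zip
```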

Conclusion

In general, we think Docker is now must-have knowledge and a must-have skill for every developer, not just the DevOps folks anymore. You don't need to understand it in depth or be able to tweak it at an advanced level; just being able to use it puts a very powerful tool in your hands. Docker is popular and will only become more so, especially in this cloud-native era and with the rise of cloud providers like AWS, GCP, and Azure. So embrace it, and if you find our approaches interesting, join us today!