Setting the Stage

The Role of Artifacts, Storage, and Registries in Software Deployment


Hello Muser!

In our previous newsletter, we looked at the differences between VMs and Containers, using The Matrix as our comparison. Now that we have a better understanding of those concepts, let's explore some of the infrastructure that supports Docker images and their use. Namely: software artifacts, artifact storage, and container registries.

As usual, I like to find analogies to connect the ideas, and streaming video seemed appropriate here. Imagine you're a content creator producing videos. Your process involves creating the video, storing it, and then making it available to your viewers on a streaming platform. Now, let's relate this to the world of software development and delivery.

If you’ve got a different perspective on how these fit together, add a comment and let me know!

----------

Software Artifacts: The Content


Just as you, the content creator, produce finalized versions of your videos, ready for distribution or streaming, software artifacts are the polished, final versions of software components. They're the end product, ready for deployment. Think of them as the director's cut of a movie, the version that's ready for the big screen.

Examples of software artifacts include

  • JAR files for Java applications

  • WAR files for web applications

  • compiled binaries from Go or Rust

  • Python script files

  • Bash script files

  • JavaScript, CSS, and HTML files served from a web server (in the previous newsletter, our index.html file would be considered an artifact!)

The possibilities are really endless and aren't limited to what you might consider a "software engineering" project!
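
To make this concrete, here's a minimal sketch of producing a versioned artifact from static site files, using the index.html example from last time. The site/ and dist/ directory names, the version number, and the file contents are placeholders I've chosen for illustration:

```shell
set -e

# Stand-in for your project's build output: index.html as the
# artifact, like in the previous newsletter.
mkdir -p site dist
echo '<h1>Hello Muser!</h1>' > site/index.html

# Bake the version into the filename so each release is unambiguous.
VERSION="1.0.0"
tar -czf "dist/site-$VERSION.tar.gz" -C site .

# A checksum lets anyone verify the artifact wasn't corrupted in transit.
sha256sum "dist/site-$VERSION.tar.gz" > "dist/site-$VERSION.tar.gz.sha256"

ls dist
```

Everything in dist/ is now an artifact: a frozen, verifiable file, ready to be stored or packaged into an image.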

----------

Artifact Storage: Centralized Content Storage


Before you release your work to the world, you store your final video versions in a centralized storage system. This ensures you have a backup and can access them whenever needed. Depending on how sophisticated the production is, this could be anything from an external hard drive all the way up to corporate IT storage solutions like NAS/SAN arrays! Similarly, artifact storage is the centralized location where software artifacts are kept, ready to be packaged into Docker images. There are several products I've run across with features that simplify organizing whatever you're storing; JFrog Artifactory and Sonatype Nexus are two well-known examples.
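
The core idea these tools share is giving every artifact a predictable home, keyed by name and version. Here's a sketch using nothing but a local directory standing in for a real storage server (the artifact-store/ path and version number are made up for illustration):

```shell
set -e

# A local directory standing in for a real artifact storage server.
STORE="artifact-store"
ARTIFACT="site"
VERSION="1.0.1"

# One predictable path per artifact name and version, so older
# releases stay retrievable when you need to roll back.
mkdir -p "$STORE/$ARTIFACT/$VERSION"
echo '<h1>v1.0.1</h1>' > index.html
tar -czf "$STORE/$ARTIFACT/$VERSION/$ARTIFACT-$VERSION.tar.gz" index.html

find "$STORE" -name '*.tar.gz'
```

Real products add access control, search, retention policies, and more on top, but the name-plus-version layout is the heart of it.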

----------

Container Registries: Streaming Video Platform

Now, the grand premiere! Once your videos are ready for the public, you upload them to a streaming platform where your eager viewers can access and watch them. In the same vein, once Docker images, which contain our software artifacts, are ready, they're pushed to a container registry. This makes them accessible for deployment. Just as viewers pull videos from the streaming platform, developers pull Docker images from the registry. And just as a video can be rewatched any number of times, a registry allows an image to be downloaded as many times as needed.

It's useful here to make a slight distinction between public and private registries. Using YouTube as an example platform, you can choose to set your videos to Public or Private access. In many situations, you won't want your container images available for public consumption, so you can create private registries to house those container images. Some examples here include:

Public

  • Docker Hub

  • Quay.io

Private

  • Amazon ECR (Elastic Container Registry)

  • Google Artifact Registry

  • Azure Container Registry

  • a self-hosted registry such as Harbor
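
Whichever registry you choose, the mechanics are the same: an image is addressed by registry host, repository, and tag, and docker push and docker pull move it back and forth. A sketch, using a made-up registry.example.com host and my-nginx image name:

```shell
# An image reference is registry host + repository + tag.
# registry.example.com is a hypothetical private registry.
REGISTRY="registry.example.com"
REPO="team/my-nginx"
TAG="1.0"
REF="$REGISTRY/$REPO:$TAG"
echo "$REF"

# With a Docker daemon available, publishing and fetching look like:
#   docker tag my-nginx:1.0 "$REF"   # point the local image at the registry
#   docker push "$REF"               # upload it once
#   docker pull "$REF"               # anyone with access downloads a copy
```

When no registry host is given, Docker assumes Docker Hub, which is why public images can be pulled by short names like nginx.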

Why Understanding This Matters

Let's leave the streaming video analogy for now and remember the objective of a DevOps engineer: make things easier and smooth the path for software delivery. Docker standardizes the configuration management of your environments, but you still need to create the Docker images with your running software inside them!
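
For instance, packaging the static-site artifact into the nginx image might look something like this Dockerfile sketch (the site/index.html source path is an assumption on my part; /usr/share/nginx/html is nginx's default web root):

```dockerfile
# Start from the official nginx base image.
FROM nginx:1.25

# Copy the software artifact into the image, so every container
# started from this image serves the same frozen release.
COPY site/index.html /usr/share/nginx/html/index.html
```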

Without an artifact storage system, where would you keep your binaries? On your local computer? On a shared filesystem? Do you designate a folder? How do you keep track of versions?

Many of the same questions apply to your container images! It is possible to build a Docker image locally, save it to a tar.gz file, and transfer it over the network to coworkers, but it certainly isn't efficient!
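
That manual route looks something like this (shown for illustration only; it assumes a Docker daemon, a built my-nginx:1.0 image, and a reachable coworker@host, and the registry address is hypothetical):

```shell
# The manual route: export, copy, and import, once per coworker.
docker save my-nginx:1.0 | gzip > my-nginx-1.0.tar.gz
scp my-nginx-1.0.tar.gz coworker@host:~/
gunzip -c my-nginx-1.0.tar.gz | docker load   # on the coworker's machine

# The registry route: one push, then any number of pulls.
docker push registry.example.com/team/my-nginx:1.0
```

Every new coworker (or server!) repeats the copy step, whereas a registry turns the whole thing into one push and as many pulls as you need.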

Setting up this infrastructure is a bit advanced for now, but I wanted to explain where these pieces fit in the puzzle of deploying and easing software delivery with modern DevOps tools.

----------

Next time, we'll talk about the nginx image we built, some details about versioning images, and how to upload ours to Docker Hub! Subscribe now and share with your colleagues who might find this useful!

Have any questions or experiences to share about container infrastructure? Leave a comment below. Also, your comments help me understand what you're curious about, and what topics you'd like to see covered next! I'd love to hear from you!

Keep learning and keep growing,

Darrell
