Wednesday, April 29, 2026

Build and Deploy a Dockerized Python Application to Azure Container Instances (ACI) using Azure DevOps

 

As a software engineer, I would like to deploy my Dockerized Python application to Azure Container Instances (ACI) using Azure DevOps so that I can automate my workflow and eliminate manual intervention during deployment.

In this blog post, I will walk you through how to build and deploy a Dockerized Python application to Azure Container Instances (ACI) using Azure DevOps. The workflow will build the Docker image, push it to Azure Container Registry (ACR), and finally deploy the application to Azure Container Instances.

What is Azure? Azure is a cloud computing platform and set of services provided by Microsoft. It offers many services, including databases, storage, virtual machines, and more. Azure allows developers and organizations to build, deploy, and manage applications and services through Microsoft-managed data centers.

Azure Container Instances (ACI) is a managed Azure service that allows you to run containers in the cloud without managing the underlying infrastructure. ACI lets developers quickly deploy and manage containerized applications without the overhead of managing Kubernetes clusters or virtual machines. ACI is a serverless container service, which simply means that you only pay for the resources (CPU and memory) your container uses and you do not need to worry about how to provision or manage servers.

Azure Container Registry (ACR) is a managed Azure registry service that allows you to store and manage your Docker container images in Azure. It integrates seamlessly with Azure services and is designed for building and storing container images that can later be used in Azure Container Instances (ACI), Azure Kubernetes Service (AKS), or other containerized services.

Azure DevOps is a set of development tools and services from Microsoft that allows you to build, test, and deploy your software more efficiently and effectively. It provides a comprehensive DevOps lifecycle solution, which includes source control, continuous integration and delivery (CI/CD), collaboration tools, and project management.

Prerequisites

  • Azure Subscription: You must have an active Azure account and subscription. You can create a free account here
  • Azure CLI: Install Azure CLI (Command Line Interface) if you do not have it already. Install Azure CLI here
  • Docker: Docker must be installed on your local machine. If you do not have it, install Docker here
  • Azure DevOps Account: Set up an Azure DevOps organization and project here

Below is a step-by-step guide on building a Dockerized Python application with Azure Pipelines and later deploying it to an Azure Container Instance.

Step 1: Create the Python App with Classes, Getters, and Setters

The Python app will be structured with classes, and you will use Flask, a lightweight Python web framework, to develop the application. The app will demonstrate how to manage a simple greeting message and expose getter and setter methods.

The code can be found here


Step 2: Explanation of the Python Code

2a: Class Greeting:

  • It manages the _message attribute, which stores the default greeting message "Hello World".
  • It has the getter method get_message() to fetch the current greeting.
  • It has the setter method set_message(message) to modify the greeting message.

2b: Class App:

  • It initializes the Flask application and the Greeting class instance.
  • The index route (/) returns the current greeting message.
  • The /update/<message> route allows the greeting message to be updated using either the GET or POST method. (A sketch of the full app follows below.)
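
If the embedded gist above does not render, here is a minimal sketch of what the application could look like based on the explanation above. The class layout follows the description, but the file name (app.py) and the route handler names are assumptions rather than the author's exact code.

from flask import Flask


class Greeting:
    """Manages the _message attribute with getter and setter methods."""

    def __init__(self):
        self._message = "Hello World"

    def get_message(self):
        return self._message

    def set_message(self, message):
        self._message = message


class App:
    """Initializes the Flask application and wires in a Greeting instance."""

    def __init__(self):
        self.flask_app = Flask(__name__)
        self.greeting = Greeting()
        self.register_routes()

    def register_routes(self):
        @self.flask_app.route("/")
        def index():
            # Return the current greeting message
            return self.greeting.get_message()

        @self.flask_app.route("/update/<message>", methods=["GET", "POST"])
        def update(message):
            # Update the greeting message via the setter
            self.greeting.set_message(message)
            return self.greeting.get_message()


# Module-level WSGI app so gunicorn can serve it as "app:app" (assumed layout)
app = App().flask_app

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)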

Step 3: Create the requirements.txt file

Navigate to the project root and create a new file named requirements.txt, which will contain the necessary Python dependencies. Copy and paste the lines below into the requirements.txt file.

Flask==3.1.0
gunicorn==23.0.0
setuptools>=70.0.0

Step 4: Create the Dockerfile

Navigate to the project root and create a new file named Dockerfile with the code below.

The code can be found here

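If the embedded gist does not render, here is a minimal sketch of what the Dockerfile could look like, assuming the Flask module is named app.py and gunicorn (from requirements.txt) serves it on port 5000; the base image tag is also an assumption.

# syntax=docker/dockerfile:1

# Base Python image (the exact tag is an assumption)
FROM python:3.12-slim

WORKDIR /app

# Install the Python dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image
COPY . .

EXPOSE 5000

# Serve the Flask app with gunicorn on port 5000 (assumes a module app.py exposing "app")
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "app:app"]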

Step 5: Build the Dockerfile and Test locally

Before deploying with Azure DevOps, let's test this setup locally.

a. Build the Docker image using the command below

docker build -t python-app .

b. Run the Docker container

docker run -p 5000:5000 python-app

c. In your browser, navigate to http://localhost:5000/ to see the greeting message.

d. To update the greeting message from your browser, visit http://localhost:5000/update/<new-message>. For example, http://localhost:5000/update/Hello%20from%20Dockerized%20App sets the greeting to "Hello from Dockerized App".

Step 6: Create the Azure DevOps Pipeline

Navigate to the project root and create a new file named azure-pipelines.yml.


You also need to configure service connections:

  • connectedServiceNameARM: an Azure Resource Manager service connection that links your Azure subscription and grants the pipeline permission to create Azure resources (ACR and ACI).
  • sampleapp: a Docker registry service connection for ACR (Azure Container Registry).
  • ExitoLab: the service connection for the Git repository.

Here is a summary of the pipeline YAML file (a condensed sketch of the pipeline follows the list):

  • The pipeline runs automatically when changes are pushed to the main branch.
  • Variables define the key configuration values, such as the Azure subscription, resource group, ACR name, and image name/tag.
  • A step verifies whether the specified resource group exists and creates it if it does not.
  • A step checks whether the Azure Container Registry (ACR) exists with admin access enabled and creates it if it does not.
  • A caching mechanism speeds up Docker builds so the pipeline runs faster.
  • A step installs Trivy and scans the image for High and Critical vulnerabilities; the scan is configured not to fail the pipeline when a vulnerability is found, although it can be changed to do so. The Docker image is then pushed to ACR with the latest and $(imageTag) tags.
  • A step deploys the container image to ACI (Azure Container Instances) with the specified resources (CPU, memory, ports, and DNS label).
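
Below is a condensed sketch of what such a pipeline could look like. It is not the full pipeline from the repository: the Docker layer caching and Trivy scan steps are omitted, and the variable values, service connection names, and resource names are assumptions for illustration.

trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

variables:
  azureSubscription: 'connectedServiceNameARM'   # ARM service connection (name assumed)
  resourceGroup: 'python-app-rg'                 # assumed resource group name
  acrName: 'sampleappacr'                        # assumed ACR name
  imageName: 'python-app'
  imageTag: '$(Build.BuildId)'

steps:
  - task: AzureCLI@2
    displayName: 'Create resource group and ACR if they do not exist'
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az group create --name $(resourceGroup) --location eastus
        az acr show --name $(acrName) --resource-group $(resourceGroup) || \
          az acr create --name $(acrName) --resource-group $(resourceGroup) \
            --sku Basic --admin-enabled true

  - task: Docker@2
    displayName: 'Build and push the image to ACR'
    inputs:
      containerRegistry: 'sampleapp'             # Docker registry service connection (name assumed)
      repository: $(imageName)
      command: buildAndPush
      Dockerfile: 'Dockerfile'
      tags: |
        latest
        $(imageTag)

  - task: AzureCLI@2
    displayName: 'Deploy the image to Azure Container Instances'
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        # Registry credentials (the ACR admin user) would also be passed via
        # --registry-username / --registry-password for a private registry.
        az container create \
          --resource-group $(resourceGroup) \
          --name python-app-demo \
          --image $(acrName).azurecr.io/$(imageName):$(imageTag) \
          --cpu 1 --memory 1 \
          --ports 5000 \
          --dns-name-label python-app-demo \
          --location eastus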

Once the deployment is successful, you can access the application at http://python-app-demo.eastus.azurecontainer.io:5000/

Conclusion

I hope you can now deploy a Dockerized Python application by leveraging Azure DevOps and Azure Container Instances (ACI), automating the entire process of building, scanning, and deploying with minimal manual effort. This pipeline demonstrates how infrastructure components such as the resource group and ACR are provisioned if missing, how the Docker image is scanned for vulnerabilities with Trivy, and how the image is finally deployed to ACI (Azure Container Instances).

Check out the completed code on GitHub

Creating a Custom Python Application Image and Deploying to Docker

This ties in well with the previous infrastructure studies I’ve been completing, and I’d like to be able to combine both skills to build out a cloud-based deployment pipeline for my Python projects.


To begin with, I’ve created a local development Docker host, with Jenkins installed for CI/CD, and Prometheus and Grafana installed for monitoring. My next step is following the Docker documentation for Python, which details deploying a Flask application to a container. I’ve listed the steps I’ve taken below.

Steps Taken

Prepare Repository and Minimal Flask Server

I’ll be using the GitLab repo that I’ve previously stored files relating to this Docker host, creating a new branch for this project.

  • mkdir hello-py && cd hello-py to create a folder within the file structure to hold project files
  • mkdir server && cd server && touch main.py to create the code file structure and Flask server entry point, entering the following code:
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    return "hello, py!"


if __name__ == "__main__":
    app.run(debug=True)
  • Within the hello-py/server folder, open a terminal instance and create a Python virtual environment, selecting relevant OS specific commands:
# Linux & macOS
python3 -m venv .venv
source .venv/bin/activate

# Windows
py -3 -m venv .venv
.venv\scripts\activate
  • Following this, in VSCode, press ⇧⌘P and select the Python interpreter
  • python -m pip install --upgrade pip within VSCode’s .venv terminal to update pip
  • python -m pip install flask to install Flask
  • pip freeze > requirements.txt to output the Python package requirements
  • I added .venv to the .gitignore file to prevent the folder from being committed to the repo
  • python -m flask run to start the server at URL http://localhost:5000

Create a Dockerfile

  • touch Dockerfile within the hello-py directory to create a file to build the custom Docker image
  • # syntax=docker/dockerfile:1 specify the Dockerfile syntax
  • FROM python:3.9.13-slim-buster specify the base Python image to use
  • WORKDIR /hello-py create a working directory for the subsequent commands
  • COPY requirements.txt requirements.txt copy the pip requirements file
  • RUN pip3 install -r requirements.txt use the copied requirements file to install pip dependencies
  • COPY . . will copy the Python project files into the /hello-py image directory
  • CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"] supply the command to run that will launch the server, and make it accessible outside of the container

The final Dockerfile code should look like this:

# syntax=docker/dockerfile:1

# base python image for custom image
FROM python:3.9.13-slim-buster

# create working directory and install pip dependencies
WORKDIR /hello-py
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt

# copy python project files from local to /hello-py image working directory
COPY . .

# run the flask server
CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]

Build the Image

  • docker build --tag hello-py . in a terminal in the project directory to build the image using the Dockerfile
Custom Python server image built with Dockerfile

Testing the Image in a Local Container

  • docker run -d -p 5001:5000 hello-py on my local machine to test that the image was built successfully

I’m using an external port of 5001 on the container on my local machine, as I have been using port 5000 for running the Flask server through VSCode’s terminal and want to make sure there are no conflicts. When we move the image to the remote Docker host, I’ll use an external port of 5000.

Container deployed from custom image on local computer

Push to Docker Hub and Deploy to Remote Container

  • docker login --username=mjrod run this if required to connect the CLI to your Docker Hub account
  • docker tag hello-py mjrod/hello-py tag the image with your Docker Hub username: <username>/<image-name>
  • docker push mjrod/hello-py to push a new repository to Docker Hub to store the image
Image pushed from CLI to Docker Hub successfully

After this, log into the remote Docker host using SSH and run the following

  • docker pull mjrod/hello-py:latest to pull the previously uploaded image from Docker Hub
  • docker run -d -p 5000:5000 --name=hello-py mjrod/hello-py to launch the container
Containers deployed on remote Docker host

Navigate to http://<remote-docker-ip>:5000 to see the Python application running in the remote container

Next Steps

  1. Begin coding my Python application — now that I have the ability to package up my Python code and deploy it to a container, I’d like to begin building high-quality web applications to cement both my programming and infrastructure learnings.
  2. Build a Jenkins pipeline to automate the rebuild phase for each Docker image. Each time I make code changes, the steps outlined in this document will need to be repeated so the images reflect the code base; I’d like to link my GitLab repo and my Docker host using Jenkins to automate some of this work.

Thursday, November 27, 2025

Cache Design and Patterns

In this article, we will look at application design and how cache design helps fetch data from the back end quickly. The scope of this article is to look into general use cases and the best practices available.

There can be many specific scenarios and good solutions, but customization is usually based on some standard practices. Caching at different levels is the starting point for this topic: data can be cached on the client side, in the browser, on the server side, or even at the database level, and each scenario suits different applications and functionality.

  • Database Query Caching: Caching the results of frequently executed database queries to avoid repetitive database hits.
  • API Response Caching: Caching the responses from external APIs to reduce network round trips and improve response times.
  • Web Server Caching: Web servers like Nginx and Apache can cache static and dynamic content to serve requests faster.
  • Browser Caching: Your web browser caches resources (images, CSS, JS) to speed up subsequent visits to websites.
  • DNS Caching: DNS resolvers cache domain name to IP address mappings to accelerate website lookups.


In this article, I will also go through the details of API caching with GraphQL and a few other available solutions.

Technologies used in cache design: 

Client-side caching is a technique used to store data locally on the client’s device to improve the performance and efficiency of web applications.

Server-side caching is a technique used to store copies of data on the server to improve response times and reduce the load on back-end systems. By keeping frequently accessed data in a cache, servers can quickly serve requests without needing to repeatedly query a database or perform complex computations. 
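
As a minimal illustration of the server-side pattern, the Python sketch below uses an in-memory dictionary to stand in for a real cache such as Redis or Memcached; the db.fetch_product() call and the 5-minute TTL are hypothetical placeholders for an expensive back-end query.

import time

# In-memory stand-in for a real cache such as Redis or Memcached
_cache = {}
TTL_SECONDS = 300  # hypothetical time-to-live of 5 minutes


def get_product(product_id, db):
    """Cache-aside read: serve from the cache if fresh, otherwise query the database."""
    entry = _cache.get(product_id)
    if entry and time.time() - entry["stored_at"] < TTL_SECONDS:
        return entry["value"]  # cache hit: no database round trip

    value = db.fetch_product(product_id)  # cache miss: run the expensive query
    _cache[product_id] = {"value": value, "stored_at": time.time()}
    return value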

A lot of legacy systems used Akamai to cache content, and it was a very good solution for large-scale applications, since Akamai has an edge network around the world and its cache servers are quite powerful. That is how much of this started: images and other key web data were "Akamaized" and delivered to web sites from the edge.

Many companies have made similar products; some are based on edge network infrastructure and some are cloud based. Here are a few examples:


Content Delivery Network (CDN) Caching Techniques

To implement CDN caching, website owners can use the following solutions:

  • Cloud-based CDNs: Services like Amazon CloudFront, Google Cloud CDN, and Cloudflare offer CDN solutions that can be easily integrated with the website.
  • Self-hosted CDNs: Website owners can also set up their own CDN infrastructure using open-source solutions like Varnish Cache or Nginx.

Dynamic Caching Techniques

To implement dynamic caching, website owners can use the following solutions:

  • Varnish Cache: Varnish Cache can be used to cache the output of dynamic scripts and serve them directly from the cache.
  • WordPress Caching Plugins: WordPress has a range of caching plugins, such as WP Rocket, W3 Total Cache, and WP Super Cache, that can help with dynamic caching.
  • Custom Caching Mechanisms: Website owners can also build their own custom caching mechanisms using in-memory data stores like Memcached or Redis.
The techniques above depend more on third-party infrastructure to scale up the application than on fine-tuning the application using built-in cache techniques. Built-in caching can help to some extent; beyond that, the options are hardware scaling, which is expensive, or the third-party software solutions mentioned above.

In some use cases, simply plugging in a third party solves the performance and scalability problems. For example, on my web site I read content, i.e. images, from Cloudinary. I could still deliver them from Azure Blob storage and render them on the page, but Cloudinary already makes sure these images load fast, and it is easy to adopt. No additional coding is required to achieve this.

Technical challenge:
The initial design fetches data from Azure Blob storage, which is quite slow, since www.talash.azurewebsites.net displays 100 images on the landing page and these images change with each request. The API is served from a GraphQL endpoint.
The GraphQL request is cached, but once the cache expires it takes 19 seconds to get the image URLs and the metadata for the images (likes, downloads, etc.). On top of that, the design should show likes for images in real time, i.e. update this data when other users who are browsing do something like downloading or commenting. So if we request the images every time, the loading time is 19 seconds; if we cache the request data but still want to show user actions in real time, the design becomes a bit tricky.

Solution

1. SignalR solved the problem of real-time updates. It works perfectly: if two users browse the same content, any user action by one is reflected on the other user's pages.

2. The GraphQL API is cached at the server level, so fetching time is reduced. A fresh fetch takes 19 seconds; otherwise it is 0.5 ms. Apollo Client is used on the client side to get more control over the cache.

3. When the toolbars that show image information (such as click and download counts) are expanded, the latest data is fetched. This section was made collapsible so that only interested users fetch the latest data; otherwise the data stays as it was when the page loaded. This avoids a request per image for every concurrent user.

4. To reduce the 19-second delay after cache expiry, a workaround was made. Since some Azure resources have a "cold start" overhead of around 5 to 8 seconds, a function app that runs every 5 minutes was implemented, essentially eliminating any cold-start latency in Azure.

Enhancements:

1. The load-images query should fetch new data every 5 minutes and the fetch should be cached; it should also avoid the "stampede" situation, where cache expiration causes a bad user experience for multiple concurrent users (see the sketch after this list).

2. "User Query" like data fetching and refetching mechanism to get image information and refetch modified information . It should handle mutation situation carefully.

3. Scalability aspects should be considered, such as application restarts, "cold start" latency in cloud infrastructure, and distributed caching features using Redis or other external cache systems.
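
Enhancement 1 mentions the cache "stampede" problem: when a popular entry expires, many concurrent requests all miss the cache at once and each triggers the expensive 19-second fetch. The site itself uses GraphQL with Apollo Client, but the mitigation pattern is language-agnostic; the Python sketch below is one common approach, in which a per-key lock lets a single caller refresh the entry while the others keep serving the stale value. The function and key names are assumptions.

import threading
import time

_cache = {}          # key -> {"value": ..., "expires_at": ...}
_locks = {}          # key -> threading.Lock()
_locks_guard = threading.Lock()
TTL_SECONDS = 300    # refresh the landing-page query roughly every 5 minutes


def _lock_for(key):
    with _locks_guard:
        return _locks.setdefault(key, threading.Lock())


def get_cached(key, slow_fetch):
    """Return cached data; on expiry let only one caller refresh it,
    while concurrent callers keep serving the stale value (no stampede)."""
    entry = _cache.get(key)
    now = time.time()
    if entry and now < entry["expires_at"]:
        return entry["value"]  # fresh hit

    lock = _lock_for(key)
    if lock.acquire(blocking=False):  # only one refresher at a time
        try:
            value = slow_fetch()  # the expensive (e.g. 19-second) fetch
            _cache[key] = {"value": value, "expires_at": time.time() + TTL_SECONDS}
            return value
        finally:
            lock.release()

    if entry:
        return entry["value"]  # someone else is refreshing: serve the stale value

    with lock:  # cold cache: wait for the refresher, then read or fetch ourselves
        entry = _cache.get(key)
        if entry:
            return entry["value"]
        value = slow_fetch()
        _cache[key] = {"value": value, "expires_at": time.time() + TTL_SECONDS}
        return value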

So finally, all images can be served with their information updated in real time, while still caching content and achieving a 20 ms loading time for my landing page. The solution can be further fine-tuned: any mutation can expire the metadata and then be sent to concurrent users as notifications. But the above design addresses a lot of performance issues using caching while avoiding data integrity issues.

We can look at more use cases, designs, and solutions; the above example is meant to illustrate the different dimensions of caching. Caching support differs across languages and technologies. Here are a few good examples for Node.js, React, and a few other technologies.

Node.js:
React:
.NET:

Angular:



1. Twitter

  • Caching Strategy: Cache-Aside and In-Memory Caching
  • Problem: Twitter deals with massive amounts of data, with millions of tweets being read and written every second. The need to quickly serve user timelines and handle the high read/write throughput is critical.
  • Solution: Twitter uses Memcached, an in-memory caching system, to store timelines and user sessions. By caching the results of expensive database queries, Twitter can serve user requests more quickly.
  • Benefits: This reduces the load on the primary database, speeds up data retrieval, and enhances the overall user experience.

2. Netflix

  • Caching Strategy: Distributed Caching and Write-Through Caching
  • Problem: Netflix needs to deliver video content to millions of users worldwide with minimal latency and high reliability.
  • Solution: Netflix uses an open-source tool called EVCache, which is based on Memcached, to cache metadata and frequently accessed data. This distributed caching system spans multiple data centers to ensure data availability and quick access.
  • Benefits: This strategy allows Netflix to serve content recommendations, user data, and other API responses quickly, ensuring a seamless viewing experience even during peak times.

3. Amazon

  • Caching Strategy: Content Delivery Network (CDN) Caching and Cache-Aside
  • Problem: Amazon's e-commerce platform handles an immense volume of product queries, user sessions, and transactional data.
  • Solution: Amazon uses Amazon CloudFront, a CDN, to cache static assets like images, videos, and CSS files at edge locations closer to users. Additionally, they employ DynamoDB with DAX (DynamoDB Accelerator) to provide fast in-memory acceleration for read-heavy workloads.
  • Benefits: This reduces latency, speeds up data access, and decreases the load on backend systems, ensuring a fast and reliable shopping experience.

