Building AI-Powered Microservices with .NET 8 and Azure Container Apps

Summary: This post explores how to design, develop, and deploy AI-powered microservices using .NET 8 and Azure Container Apps. Learn how to create scalable, resilient AI services that leverage the latest features of .NET 8 and the managed container environment of Azure Container Apps.

Introduction

Microservices architecture has become a popular approach for building complex applications, offering benefits like independent scalability, technology diversity, and resilience. When combined with AI capabilities, microservices can provide powerful, scalable intelligent services that can be deployed and managed efficiently.

In this post, we’ll explore how to build AI-powered microservices using .NET 8 and Azure Container Apps. We’ll cover everything from designing the microservices architecture to implementing AI capabilities, containerizing the services, and deploying them to Azure Container Apps. By the end of this article, you’ll have the knowledge to create scalable, resilient AI services that leverage the latest features of .NET 8 and the managed container environment of Azure Container Apps.

Understanding Microservices and AI Integration

Before diving into implementation, let’s understand the key concepts of microservices architecture and how AI capabilities can be integrated.

Microservices Architecture

Microservices architecture is an approach to building applications as a collection of small, independently deployable services. Key characteristics include:

  • Service Independence: Each service can be developed, deployed, and scaled independently
  • Domain-Driven Design: Services are organized around business capabilities
  • Decentralized Data Management: Each service manages its own data
  • Smart Endpoints, Dumb Pipes: Services communicate through simple protocols
  • Infrastructure Automation: CI/CD pipelines for automated deployment
  • Design for Failure: Services are designed to be resilient

AI Integration Patterns

There are several patterns for integrating AI capabilities into microservices:

  1. AI as a Service: Dedicated microservices that provide AI capabilities to other services
  2. Embedded AI: AI capabilities embedded directly within domain microservices
  3. AI Orchestration: Microservices that coordinate multiple AI services
  4. Event-Driven AI: AI services that respond to events in the system
  5. Hybrid Approaches: Combining multiple patterns based on requirements

Benefits of AI-Powered Microservices

Combining AI with microservices offers several advantages:

  • Scalability: Scale AI services independently based on demand
  • Flexibility: Use different AI technologies for different services
  • Isolation: Failures in AI services don’t affect the entire system
  • Iterative Improvement: Update AI models without affecting other services
  • Resource Optimization: Allocate resources based on the needs of each AI service

Setting Up the Development Environment

Let’s start by setting up our development environment for building AI-powered microservices.

Prerequisites

To follow along with this tutorial, you’ll need:

  • Visual Studio 2022 or Visual Studio Code
  • .NET 8 SDK
  • Docker Desktop
  • Azure CLI
  • An Azure subscription
  • Azure Container Registry (ACR) access
  • Azure OpenAI Service access

Creating a Solution Structure

Let’s create a solution structure for our AI-powered microservices:

bash

# Create a new solution
dotnet new sln -n AIMicroservices

# Create projects for each microservice
dotnet new webapi -n AIMicroservices.TextAnalysis
dotnet new webapi -n AIMicroservices.ImageAnalysis
dotnet new webapi -n AIMicroservices.Orchestrator
dotnet new classlib -n AIMicroservices.Shared

# Add projects to the solution
dotnet sln add AIMicroservices.TextAnalysis
dotnet sln add AIMicroservices.ImageAnalysis
dotnet sln add AIMicroservices.Orchestrator
dotnet sln add AIMicroservices.Shared
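
One step the templates don't do for us: each service needs a project reference to the shared library, since the service code and the Dockerfiles in the next section assume it exists:

```shell
# Reference the shared library from each service
dotnet add AIMicroservices.TextAnalysis/AIMicroservices.TextAnalysis.csproj \
  reference AIMicroservices.Shared/AIMicroservices.Shared.csproj
dotnet add AIMicroservices.ImageAnalysis/AIMicroservices.ImageAnalysis.csproj \
  reference AIMicroservices.Shared/AIMicroservices.Shared.csproj
dotnet add AIMicroservices.Orchestrator/AIMicroservices.Orchestrator.csproj \
  reference AIMicroservices.Shared/AIMicroservices.Shared.csproj
```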

Setting Up Docker Support

Let’s add Docker support to our microservices. (The .NET 8 SDK can also publish container images directly with `dotnet publish /t:PublishContainer`, no Dockerfile required, but an explicit Dockerfile gives us full control over the build.)

bash

# Navigate to the TextAnalysis project
cd AIMicroservices.TextAnalysis

# Create a Dockerfile
touch Dockerfile

Edit the Dockerfile for the TextAnalysis service:

dockerfile

# Dockerfile for TextAnalysis service
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS base
WORKDIR /app
# .NET 8 images listen on port 8080 by default and run as a non-root user
EXPOSE 8080

FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY ["AIMicroservices.TextAnalysis/AIMicroservices.TextAnalysis.csproj", "AIMicroservices.TextAnalysis/"]
COPY ["AIMicroservices.Shared/AIMicroservices.Shared.csproj", "AIMicroservices.Shared/"]
RUN dotnet restore "AIMicroservices.TextAnalysis/AIMicroservices.TextAnalysis.csproj"
COPY . .
WORKDIR "/src/AIMicroservices.TextAnalysis"
RUN dotnet build "AIMicroservices.TextAnalysis.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "AIMicroservices.TextAnalysis.csproj" -c Release -o /app/publish /p:UseAppHost=false

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "AIMicroservices.TextAnalysis.dll"]

Repeat the same process for the ImageAnalysis and Orchestrator services, creating similar Dockerfiles for each.
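
Because each Dockerfile copies both the service project and the Shared project, the build context must be the solution root, not the project folder. A local build and run looks like this (note the container listens on 8080, the .NET 8 default):

```shell
# Build from the solution root so the Shared project is inside the build context
docker build -f AIMicroservices.TextAnalysis/Dockerfile -t aimicroservices/textanalysis .

# Map a host port to the container's default port 8080
docker run -p 5001:8080 aimicroservices/textanalysis
```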

Setting Up Docker Compose

Create a docker-compose.yml file at the solution level:

yaml

version: '3.8'

services:
  textanalysis:
    image: aimicroservices/textanalysis
    build:
      context: .
      dockerfile: AIMicroservices.TextAnalysis/Dockerfile
    ports:
      - "5001:8080" # .NET 8 containers listen on 8080 by default
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - AzureOpenAI__Endpoint=${AZURE_OPENAI_ENDPOINT}
      - AzureOpenAI__Key=${AZURE_OPENAI_KEY}
      - AzureOpenAI__DeploymentName=${AZURE_OPENAI_DEPLOYMENT}

  imageanalysis:
    image: aimicroservices/imageanalysis
    build:
      context: .
      dockerfile: AIMicroservices.ImageAnalysis/Dockerfile
    ports:
      - "5002:8080"
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - AzureVision__Endpoint=${AZURE_VISION_ENDPOINT}
      - AzureVision__Key=${AZURE_VISION_KEY}

  orchestrator:
    image: aimicroservices/orchestrator
    build:
      context: .
      dockerfile: AIMicroservices.Orchestrator/Dockerfile
    ports:
      - "5000:8080"
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - Services__TextAnalysis=http://textanalysis:8080
      - Services__ImageAnalysis=http://imageanalysis:8080
    depends_on:
      - textanalysis
      - imageanalysis

Implementing the Shared Library

Let’s implement the shared library that will be used by all microservices.

Creating Common Models

csharp

// AIMicroservices.Shared/Models/TextAnalysisRequest.cs
namespace AIMicroservices.Shared.Models
{
    public class TextAnalysisRequest
    {
        public string Text { get; set; }
        public string[] AnalysisTypes { get; set; } = { "sentiment", "entities", "keyphrases" };
    }
}

// AIMicroservices.Shared/Models/TextAnalysisResponse.cs
namespace AIMicroservices.Shared.Models
{
    public class TextAnalysisResponse
    {
        public string Text { get; set; }
        public SentimentAnalysis Sentiment { get; set; }
        public List<Entity> Entities { get; set; }
        public List<string> KeyPhrases { get; set; }
        public string RequestId { get; set; }
        public DateTime Timestamp { get; set; }
    }

    public class SentimentAnalysis
    {
        public string Sentiment { get; set; }
        public double PositiveScore { get; set; }
        public double NeutralScore { get; set; }
        public double NegativeScore { get; set; }
    }

    public class Entity
    {
        public string Text { get; set; }
        public string Category { get; set; }
        public double ConfidenceScore { get; set; }
    }
}

// AIMicroservices.Shared/Models/ImageAnalysisRequest.cs
namespace AIMicroservices.Shared.Models
{
    public class ImageAnalysisRequest
    {
        public byte[] ImageData { get; set; }
        public string[] AnalysisTypes { get; set; } = { "tags", "objects", "description" };
    }
}

// AIMicroservices.Shared/Models/ImageAnalysisResponse.cs
namespace AIMicroservices.Shared.Models
{
    public class ImageAnalysisResponse
    {
        public List<ImageTag> Tags { get; set; }
        public List<DetectedObject> Objects { get; set; }
        public string Description { get; set; }
        public string RequestId { get; set; }
        public DateTime Timestamp { get; set; }
    }

    public class ImageTag
    {
        public string Name { get; set; }
        public double Confidence { get; set; }
    }

    public class DetectedObject
    {
        public string Name { get; set; }
        public double Confidence { get; set; }
        public Rectangle Rectangle { get; set; }
    }

    public class Rectangle
    {
        public int X { get; set; }
        public int Y { get; set; }
        public int Width { get; set; }
        public int Height { get; set; }
    }
}

// AIMicroservices.Shared/Models/OrchestratorRequest.cs
namespace AIMicroservices.Shared.Models
{
    public class OrchestratorRequest
    {
        public string Text { get; set; }
        public byte[] ImageData { get; set; }
        public string[] TextAnalysisTypes { get; set; }
        public string[] ImageAnalysisTypes { get; set; }
    }
}

// AIMicroservices.Shared/Models/OrchestratorResponse.cs
namespace AIMicroservices.Shared.Models
{
    public class OrchestratorResponse
    {
        public TextAnalysisResponse TextAnalysis { get; set; }
        public ImageAnalysisResponse ImageAnalysis { get; set; }
        public string RequestId { get; set; }
        public DateTime Timestamp { get; set; }
    }
}

Implementing Common Utilities

csharp

// AIMicroservices.Shared/Utilities/HttpClientExtensions.cs
using System.Net.Http.Json;
using System.Text.Json;

namespace AIMicroservices.Shared.Utilities
{
    public static class HttpClientExtensions
    {
        private static readonly JsonSerializerOptions JsonOptions = new()
        {
            PropertyNameCaseInsensitive = true
        };

        public static async Task<T?> PostAsJsonAsync<T, TRequest>(
            this HttpClient client,
            string requestUri,
            TRequest request,
            CancellationToken cancellationToken = default)
        {
            var response = await client.PostAsJsonAsync(requestUri, request, cancellationToken);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadFromJsonAsync<T>(JsonOptions, cancellationToken);
        }
    }
}

// AIMicroservices.Shared/Utilities/Resilience.cs
using Polly;
using Polly.Extensions.Http;
using System.Net;

namespace AIMicroservices.Shared.Utilities
{
    public static class Resilience
    {
        public static IAsyncPolicy<HttpResponseMessage> GetRetryPolicy()
        {
            return HttpPolicyExtensions
                .HandleTransientHttpError()
                .OrResult(msg => msg.StatusCode == HttpStatusCode.TooManyRequests)
                .WaitAndRetryAsync(3, retryAttempt => TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)));
        }

        public static IAsyncPolicy<HttpResponseMessage> GetCircuitBreakerPolicy()
        {
            return HttpPolicyExtensions
                .HandleTransientHttpError()
                .CircuitBreakerAsync(5, TimeSpan.FromSeconds(30));
        }
    }
}
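
To put these policies to work, register an `HttpClient` for each downstream service and attach them with `AddPolicyHandler`. A minimal sketch for the Orchestrator's `Program.cs`, assuming the `Microsoft.Extensions.Http.Polly` package has been added:

```csharp
// Program.cs (Orchestrator) — sketch of attaching the resilience policies.
// Requires the Microsoft.Extensions.Http.Polly NuGet package.
using AIMicroservices.Shared.Utilities;

var builder = WebApplication.CreateBuilder(args);

// Named client for the TextAnalysis service; the base address is read from
// configuration (the Services__TextAnalysis environment variable set in compose).
builder.Services.AddHttpClient("TextAnalysis", client =>
    {
        client.BaseAddress = new Uri(builder.Configuration["Services:TextAnalysis"]!);
    })
    .AddPolicyHandler(Resilience.GetRetryPolicy())           // retry transient failures with backoff
    .AddPolicyHandler(Resilience.GetCircuitBreakerPolicy()); // fail fast when the service is down

var app = builder.Build();
app.Run();
```

The same registration pattern applies to the ImageAnalysis client; because the policies wrap the `HttpMessageHandler` pipeline, every call made through these clients gets retries and circuit breaking without any changes to the calling code.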