Building Intelligent Chat Applications with GPT-3.5 and .NET

Summary: This post explores how to create sophisticated chat applications using OpenAI’s GPT-3.5 models with .NET, covering conversation management, context handling, and building a responsive chat interface in a Blazor application.

Introduction

Conversational AI has evolved dramatically in recent months, with OpenAI’s GPT-3.5 models setting new standards for natural language understanding and generation. For .NET developers, these advancements open exciting possibilities for building intelligent chat applications that can understand context, maintain coherent conversations, and provide valuable assistance to users.

In this post, we’ll explore how to leverage GPT-3.5 models through Azure OpenAI Service to create sophisticated chat applications in .NET. We’ll cover the fundamentals of conversation management and context handling, and build a responsive chat interface using Blazor.

Understanding GPT-3.5 Chat Models

GPT-3.5 models, particularly gpt-35-turbo (available through Azure OpenAI Service), are specifically designed for conversational applications. Unlike completion models that simply continue text, chat models understand the roles in a conversation (system, user, assistant) and maintain context across multiple turns.
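
To make the role structure concrete, here is a minimal sketch of a single request using the same Azure.AI.OpenAI types we’ll use throughout this post. It assumes a client variable holding an OpenAIClient configured with your endpoint and key, which we’ll set up in the next sections:

csharp

// One turn of a conversation: the system message sets the assistant's behavior,
// the user message asks a question, and the reply comes back in the Assistant role.
var options = new ChatCompletionsOptions
{
    DeploymentName = "gpt-35-turbo",
    Messages =
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant."),
        new ChatMessage(ChatRole.User, "Which roles can appear in a chat conversation?")
    }
};

Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);
Console.WriteLine(response.Value.Choices[0].Message.Content);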

Key advantages of GPT-3.5 for chat applications include:

  1. Contextual understanding: The model can reference information from earlier in the conversation
  2. Role-based interactions: Different message types help structure the conversation
  3. Efficiency: Chat models are more cost-effective than completion models for dialogue
  4. Instruction following: Better adherence to guidelines specified in system messages

Setting Up the Project

Let’s start by creating a new Blazor Server application that will serve as our chat interface:

bash

dotnet new blazorserver -n IntelligentChatApp
cd IntelligentChatApp
dotnet add package Azure.AI.OpenAI --version 1.0.0-beta.7

Next, we’ll need to configure our application to connect to Azure OpenAI Service. Add the following to your appsettings.json (for anything beyond local experimentation, keep the key in user secrets or Azure Key Vault rather than in this file):

json

{
  "AzureOpenAI": {
    "Endpoint": "https://your-resource-name.openai.azure.com/",
    "Key": "your-api-key",
    "DeploymentName": "gpt-35-turbo"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}

Creating a Chat Service

Now, let’s create a service to handle interactions with the Azure OpenAI API. Create a new file called ChatService.cs in a Services folder:

csharp

using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.Configuration;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace IntelligentChatApp.Services
{
    public class ChatService
    {
        private readonly OpenAIClient _client;
        private readonly string _deploymentName;

        public ChatService(IConfiguration configuration)
        {
            string endpoint = configuration["AzureOpenAI:Endpoint"];
            string key = configuration["AzureOpenAI:Key"];
            _deploymentName = configuration["AzureOpenAI:DeploymentName"];

            _client = new OpenAIClient(
                new Uri(endpoint),
                new AzureKeyCredential(key));
        }

        public async Task<string> GetChatResponseAsync(List<ChatMessage> messages)
        {
            try
            {
                ChatCompletionsOptions options = new ChatCompletionsOptions
                {
                    DeploymentName = _deploymentName,
                    Temperature = 0.7f,
                    MaxTokens = 800
                };

                // Add all messages to the request
                foreach (var message in messages)
                {
                    options.Messages.Add(message);
                }

                Response<ChatCompletions> response = await _client.GetChatCompletionsAsync(options);
                ChatCompletions completions = response.Value;

                return completions.Choices[0].Message.Content;
            }
            catch (Exception ex)
            {
                return $"Error: {ex.Message}";
            }
        }
    }
}

Register this service in Program.cs (you’ll also need a using IntelligentChatApp.Services; directive at the top of that file):

csharp

builder.Services.AddSingleton<ChatService>();

Implementing a Conversation Manager

To maintain conversation history and manage context effectively, we’ll create a ConversationManager class:

csharp

using Azure.AI.OpenAI;
using System.Collections.Generic;

namespace IntelligentChatApp.Services
{
    public class ConversationManager
    {
        private List<ChatMessage> _messages;
        private int _maxMessages;

        public ConversationManager(string systemPrompt = "You are a helpful assistant.", int maxMessages = 20)
        {
            _maxMessages = maxMessages;
            _messages = new List<ChatMessage>
            {
                new ChatMessage(ChatRole.System, systemPrompt)
            };
        }

        public List<ChatMessage> GetConversationHistory()
        {
            return new List<ChatMessage>(_messages);
        }

        public void AddUserMessage(string content)
        {
            _messages.Add(new ChatMessage(ChatRole.User, content));
            TrimConversationIfNeeded();
        }

        public void AddAssistantMessage(string content)
        {
            _messages.Add(new ChatMessage(ChatRole.Assistant, content));
            TrimConversationIfNeeded();
        }

        public void ClearConversation(string systemPrompt = "You are a helpful assistant.")
        {
            _messages.Clear();
            _messages.Add(new ChatMessage(ChatRole.System, systemPrompt));
        }

        private void TrimConversationIfNeeded()
        {
            // Keep the system message but trim oldest messages if we exceed max length
            // This helps manage token limits
            if (_messages.Count > _maxMessages + 1)
            {
                // Always keep the system message (at index 0)
                _messages.RemoveRange(1, _messages.Count - _maxMessages - 1);
            }
        }
    }
}

Register this service as well:

csharp

builder.Services.AddScoped<ConversationManager>();

Building the Chat Interface

Now, let’s create a chat interface in Blazor. Replace the content of Pages/Index.razor with the following:

razor

@page "/"
@using IntelligentChatApp.Services
@using Azure.AI.OpenAI
@inject ChatService ChatService
@inject ConversationManager ConversationManager
@inject IJSRuntime JS

<PageTitle>Intelligent Chat</PageTitle>

<div class="chat-container">
    <div class="chat-header">
        <h1>Intelligent Chat Assistant</h1>
        <button class="btn btn-sm btn-outline-secondary" @onclick="ClearChat">New Chat</button>
    </div>

    <div class="chat-messages" id="chatMessages">
        @foreach (var message in displayMessages)
        {
            <div class="message @(message.Role == "user" ? "user-message" : "assistant-message")">
                <div class="message-content">
                    @((MarkupString)FormatMessage(message.Content))
                </div>
            </div>
        }
        @if (isLoading)
        {
            <div class="message assistant-message">
                <div class="message-content">
                    <div class="typing-indicator">
                        <span></span>
                        <span></span>
                        <span></span>
                    </div>
                </div>
            </div>
        }
    </div>

    <div class="chat-input">
        <textarea 
            @bind="userInput" 
            @bind:event="oninput" 
            @onkeydown="HandleKeyDown"
            placeholder="Type your message here..."
            rows="2"
            disabled="@isLoading"></textarea>
        <button class="btn btn-primary" @onclick="SendMessage" disabled="@(isLoading || string.IsNullOrWhiteSpace(userInput))">
            <i class="oi oi-arrow-right"></i>
        </button>
    </div>
</div>

@code {
    private string userInput = "";
    private List<(string Role, string Content)> displayMessages = new();
    private bool isLoading = false;

    protected override void OnInitialized()
    {
        // Initialize with a system message that won't be displayed
        ConversationManager.ClearConversation("You are a helpful, friendly assistant. Provide concise and accurate responses. Format your responses using markdown for code blocks and lists.");
    }

    private async Task SendMessage()
    {
        if (string.IsNullOrWhiteSpace(userInput) || isLoading)
            return;

        string userMessage = userInput.Trim();
        userInput = "";

        // Add user message to display
        displayMessages.Add(("user", userMessage));
        
        // Add to conversation manager
        ConversationManager.AddUserMessage(userMessage);
        
        // Show loading indicator
        isLoading = true;
        StateHasChanged();
        
        // Scroll to bottom
        await JS.InvokeVoidAsync("scrollToBottom", "chatMessages");

        try
        {
            // Get response from AI
            var history = ConversationManager.GetConversationHistory();
            string response = await ChatService.GetChatResponseAsync(history);
            
            // Add assistant response
            ConversationManager.AddAssistantMessage(response);
            displayMessages.Add(("assistant", response));
        }
        catch (Exception ex)
        {
            displayMessages.Add(("assistant", $"Error: {ex.Message}"));
        }
        finally
        {
            isLoading = false;
            StateHasChanged();
            
            // Scroll to bottom again after response
            await JS.InvokeVoidAsync("scrollToBottom", "chatMessages");
        }
    }

    private void ClearChat()
    {
        displayMessages.Clear();
        ConversationManager.ClearConversation("You are a helpful, friendly assistant. Provide concise and accurate responses. Format your responses using markdown for code blocks and lists.");
    }

    private async Task HandleKeyDown(KeyboardEventArgs e)
    {
        if (e.Key == "Enter" && !e.ShiftKey)
        {
            // Send message on Enter (but allow Shift+Enter for new lines)
            await SendMessage();
        }
    }

    private string FormatMessage(string message)
    {
        // Simple markdown-style formatting for code blocks and inline code
        // In a real app, use a proper markdown parser

        // Encode the raw model output first so it can't inject HTML into the page
        message = System.Net.WebUtility.HtmlEncode(message);

        // Format code blocks (this must run before newlines are turned into <br>,
        // otherwise the \n in the pattern would never match)
        message = System.Text.RegularExpressions.Regex.Replace(
            message,
            @"```(\w+)?\n(.*?)\n```",
            m => $"<pre><code class=\"language-{m.Groups[1].Value}\">{m.Groups[2].Value}</code></pre>",
            System.Text.RegularExpressions.RegexOptions.Singleline
        );

        // Format inline code
        message = System.Text.RegularExpressions.Regex.Replace(
            message,
            @"`([^`]+)`",
            "<code>$1</code>"
        );

        // Convert the remaining newlines to line breaks
        message = message.Replace("\n", "<br>");

        return message;
    }
}

Add the following JavaScript function to wwwroot/js/site.js (create the file if it doesn’t already exist) to handle scrolling:

javascript

window.scrollToBottom = function (elementId) {
    const element = document.getElementById(elementId);
    if (element) {
        element.scrollTop = element.scrollHeight;
    }
}

And make sure to reference it in Pages/_Layout.cshtml, just before the closing </body> tag:

html

<script src="~/js/site.js"></script>

Finally, add some CSS to wwwroot/css/site.css to style our chat interface:

css

.chat-container {
    display: flex;
    flex-direction: column;
    height: 90vh;
    max-width: 800px;
    margin: 0 auto;
    border: 1px solid #ddd;
    border-radius: 8px;
    overflow: hidden;
}

.chat-header {
    padding: 15px;
    background-color: #f5f5f5;
    border-bottom: 1px solid #ddd;
    display: flex;
    justify-content: space-between;
    align-items: center;
}

.chat-header h1 {
    margin: 0;
    font-size: 1.5rem;
}

.chat-messages {
    flex: 1;
    overflow-y: auto;
    padding: 15px;
    display: flex;
    flex-direction: column;
    gap: 15px;
}

.message {
    display: flex;
    margin-bottom: 10px;
}

.user-message {
    justify-content: flex-end;
}

.assistant-message {
    justify-content: flex-start;
}

.message-content {
    padding: 10px 15px;
    border-radius: 18px;
    max-width: 80%;
    word-break: break-word;
}

.user-message .message-content {
    background-color: #0078d4;
    color: white;
}

.assistant-message .message-content {
    background-color: #f0f0f0;
    color: #333;
}

.chat-input {
    display: flex;
    padding: 15px;
    background-color: #f5f5f5;
    border-top: 1px solid #ddd;
}

.chat-input textarea {
    flex: 1;
    padding: 10px 15px;
    border: 1px solid #ddd;
    border-radius: 20px;
    resize: none;
    outline: none;
    margin-right: 10px;
}

.chat-input button {
    border-radius: 50%;
    width: 40px;
    height: 40px;
    display: flex;
    align-items: center;
    justify-content: center;
    padding: 0;
}

/* Code formatting */
pre {
    background-color: #f4f4f4;
    padding: 10px;
    border-radius: 5px;
    overflow-x: auto;
    margin: 10px 0;
}

code {
    font-family: Consolas, Monaco, 'Andale Mono', monospace;
    font-size: 0.9em;
}

/* Typing indicator */
.typing-indicator {
    display: flex;
    align-items: center;
}

.typing-indicator span {
    height: 8px;
    width: 8px;
    background-color: #888;
    border-radius: 50%;
    display: inline-block;
    margin: 0 2px;
    animation: bounce 1.5s infinite ease-in-out;
}

.typing-indicator span:nth-child(2) {
    animation-delay: 0.2s;
}

.typing-indicator span:nth-child(3) {
    animation-delay: 0.4s;
}

@keyframes bounce {
    0%, 60%, 100% {
        transform: translateY(0);
    }
    30% {
        transform: translateY(-5px);
    }
}

Advanced Conversation Management Techniques

Managing Token Limits

GPT models have a token limit that covers the combined prompt and response. For the standard gpt-35-turbo model this is 4,096 tokens, with a larger 16k-context variant also available. To handle longer conversations, we need strategies to manage this limit (a rough way to estimate usage is sketched after the list):

  1. Summarization: Periodically summarize the conversation to reduce token count
  2. Windowing: Keep only the most recent N messages
  3. Selective Context: Include only relevant parts of the conversation
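
Whichever strategy you choose, you need a rough idea of how large the current conversation is. The SDK doesn’t include a tokenizer, so here is a minimal, deliberately approximate helper you could add to ConversationManager. The four-characters-per-token heuristic and the fixed per-message overhead are rough assumptions; use a real tokenizer package (for example, SharpToken) when accuracy matters:

csharp

// Rough estimate of the tokens used by the current conversation.
// English text averages roughly four characters per token, and each message
// adds a few tokens of role/formatting overhead; treat the result as approximate.
public int EstimateTokenCount()
{
    int characters = 0;
    foreach (var message in _messages)
    {
        characters += (message.Content?.Length ?? 0) + 16;
    }
    return characters / 4;
}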

Let’s enhance our ConversationManager with a summarization feature (the method below also relies on using System.Linq; and using System.Threading.Tasks;, so add those directives to the file):

csharp

public async Task SummarizeConversation(ChatService chatService)
{
    // Skip if we don't have enough messages to summarize
    if (_messages.Count < 10)
        return;
    
    // Create a system message asking for summarization
    var summaryRequest = new List<ChatMessage>
    {
        new ChatMessage(ChatRole.System, "Summarize the following conversation concisely while preserving all important information:"),
    };
    
    // Add all messages except the system message
    summaryRequest.AddRange(_messages.Skip(1));
    
    // Get summary from the model
    string summary = await chatService.GetChatResponseAsync(summaryRequest);
    
    // Replace conversation with summary
    string originalSystemMessage = _messages[0].Content;
    _messages.Clear();
    _messages.Add(new ChatMessage(ChatRole.System, originalSystemMessage));
    _messages.Add(new ChatMessage(ChatRole.System, $"Previous conversation summary: {summary}"));
}
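
To tie this together, here is a sketch of how SendMessage in the Blazor page could trigger summarization before the context grows too large. It uses the rough EstimateTokenCount helper sketched earlier, and the 3,000-token threshold is an arbitrary example chosen to leave headroom for the model’s reply:

csharp

// Inside SendMessage, before requesting a response from the model:
if (ConversationManager.EstimateTokenCount() > 3000)
{
    // Condense the history into a summary to free up context space
    await ConversationManager.SummarizeConversation(ChatService);
}

var history = ConversationManager.GetConversationHistory();
string response = await ChatService.GetChatResponseAsync(history);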