Ep.07 Building a Professional React Chat Interface with TypeScript for FastAPI AI Chat

To build a React frontend for FastAPI AI chat, create a TypeScript React app using Vite for fast development. Implement custom hooks for managing chat state and API calls, use the Fetch API with Server-Sent Events for streaming responses, and structure your components with proper TypeScript interfaces. Key components include a ChatMessage component for individual messages, a ChatInput for user input, and a ChatContainer for state management. Use TailwindCSS for styling and implement error boundaries for production reliability. This architecture ensures type safety, maintainability, and excellent user experience.

🎓 What You’ll Learn

By the end of this tutorial, you’ll be able to:

  • Set up a modern React project with TypeScript and Vite
  • Create reusable React components with proper TypeScript typing
  • Implement streaming chat with Server-Sent Events (SSE)
  • Manage complex state with React hooks (useState, useEffect, useRef)
  • Build a responsive, professional chat UI with TailwindCSS
  • Handle errors gracefully with error boundaries
  • Connect seamlessly to your FastAPI backend
  • Deploy a production-ready React application

📖 Understanding the Frontend Stack

Why React + TypeScript?

  • React: Most popular UI library, component-based, huge ecosystem
  • TypeScript: Type safety, better IDE support, catches errors early
  • Vite: Lightning-fast dev server, modern build tool
  • TailwindCSS: Utility-first CSS, rapid styling, consistent design

Architecture Overview

React App (Frontend)
    ↓
Fetch API / EventSource (HTTP/SSE)
    ↓
FastAPI Backend (Port 8000)
    ↓
Ollama Service (AI Models)

🛠️ Step-by-Step Implementation

Step 1: Create React Project with Vite

# Navigate to your project root
cd ~/Documents/fastapi-ai-backend

# Create React app with TypeScript template
npm create vite@latest frontend -- --template react-ts

# Navigate to frontend directory
cd frontend

# Install dependencies
npm install

# Install additional packages
npm install axios lucide-react
npm install -D tailwindcss
npm install -D @tailwindcss/vite
npm install -D @types/node

What we installed:

  • axios: HTTP client for API calls
  • lucide-react: Beautiful icon library
  • tailwindcss: Utility-first CSS framework
  • @tailwindcss/vite: The official Vite plugin for Tailwind CSS v4. It replaces the old PostCSS + `npx tailwindcss init` workflow, so there is no tailwind.config.js to generate
  • @types/node: TypeScript types for Node.js

Step 2: Configure Styles

Update frontend/src/index.css:

@import "tailwindcss";

/* 1. Configuration & Theme */
@theme {
  /* Defining your custom colors so @apply can find them */
  --color-primary-500: #3b82f6;
  /* Replace with your actual primary color hex */
  --color-secondary-500: #8b5cf6;
  /* Replace with your actual secondary color hex */
  --color-border: #e5e7eb;
  /* Defining 'border-border' */
}

/* 2. Base Styles */
@layer base {
  * {
    @apply border-border;
  }

  body {
    @apply bg-gradient-to-br from-primary-500 to-secondary-500 text-gray-900 antialiased;
  }
}

/* 3. Component Classes */
@layer components {
  .btn {
    @apply px-4 py-2 rounded-lg font-medium transition-all duration-200;
  }

  .btn-primary {
    @apply bg-gradient-to-r from-primary-500 to-secondary-500 text-white hover:opacity-90 hover:scale-105;
  }

  .btn-secondary {
    @apply bg-white text-gray-700 hover:bg-gray-50 border border-gray-200;
  }

  .input {
    @apply w-full px-4 py-2 border border-gray-200 rounded-lg focus:outline-none focus:ring-2 focus:ring-primary-500 focus:border-transparent transition-all;
  }

  .card {
    @apply bg-white rounded-xl shadow-lg;
  }
}

Step 3: Create TypeScript Types

Create frontend/src/types/chat.ts:

/**
 * TypeScript interfaces for chat application
 */

export interface Message {
  id: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  timestamp: Date;
  isStreaming?: boolean;
}

export interface ChatRequest {
  message: string;
  model: string;
  conversation_id?: string;
  temperature: number;
  stream: boolean;
}

export interface ChatResponse {
  message: string;
  model: string;
  conversation_id: string;
  created_at: string;
  finish_reason?: string;
  prompt_tokens?: number;
  completion_tokens?: number;
  total_tokens?: number;
}

export interface Conversation {
  conversation_id: string;
  model: string;
  messages: Message[];
  created_at: string;
  updated_at: string;
  metadata?: Record<string, any>;
}

export interface ModelInfo {
  name: string;
  size?: string;
  family?: string;
  parameter_size?: string;
  quantization?: string;
  modified_at?: string;
}

export interface StreamChunk {
  content?: string;
  done?: boolean;
  conversation_id?: string;
  error?: string;
}

export interface ChatSettings {
  model: string;
  temperature: number;
  stream: boolean;
}
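Since every field of StreamChunk is optional, it can be worth validating parsed SSE payloads before trusting them. Here is a small type guard you could add; it is a hypothetical helper, not one of the tutorial files (the interface is repeated so the snippet stands alone in a scratch file):

```typescript
// Mirrors the StreamChunk interface defined above (repeated here so this
// snippet runs standalone).
interface StreamChunk {
  content?: string;
  done?: boolean;
  conversation_id?: string;
  error?: string;
}

// Narrow an unknown parsed payload to StreamChunk. Every field is
// optional, so we only type-check the fields that are present.
function isStreamChunk(value: unknown): value is StreamChunk {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    (v.content === undefined || typeof v.content === 'string') &&
    (v.done === undefined || typeof v.done === 'boolean') &&
    (v.conversation_id === undefined || typeof v.conversation_id === 'string') &&
    (v.error === undefined || typeof v.error === 'string')
  );
}
```

You would call it right after JSON.parse in the streaming code: parse into an `unknown`, and only yield the value once `isStreamChunk(data)` confirms its shape.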

Step 4: Create API Service

Create frontend/src/services/api.ts:

/**
 * API service for communicating with FastAPI backend
 */

import axios, { type AxiosInstance } from 'axios';
import type {
  ChatRequest,
  ChatResponse,
  ModelInfo,
  Conversation,
  StreamChunk,
} from '../types/chat';

const API_BASE_URL = import.meta.env.VITE_API_URL || 'http://localhost:8000/api/v1';

class ApiService {
  private client: AxiosInstance;

  constructor() {
    this.client = axios.create({
      baseURL: API_BASE_URL,
      headers: {
        'Content-Type': 'application/json',
      },
      timeout: 30000, // 30 seconds
    });
  }

  /**
   * Get available AI models
   */
  async getModels(): Promise<ModelInfo[]> {
    const response = await this.client.get('/ai/models');
    return response.data.models;
  }

  /**
   * Send chat message (non-streaming)
   */
  async sendMessage(request: ChatRequest): Promise<ChatResponse> {
    const response = await this.client.post('/ai/chat', request);
    return response.data;
  }

  /**
   * Send chat message (streaming)
   * Returns async generator for streaming chunks
   */
  async *streamMessage(request: ChatRequest): AsyncGenerator<StreamChunk> {
    const response = await fetch(`${API_BASE_URL}/ai/chat/stream`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(request),
    });

    if (!response.ok) {
      const error = await response.json();
      throw new Error(error.detail || 'Stream request failed');
    }

    if (!response.body) {
      throw new Error('Response body is null');
    }

    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';

    try {
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        // Stream-decode and buffer: a network chunk can end mid-line,
        // so keep any trailing partial line for the next read
        buffer += decoder.decode(value, { stream: true });
        const lines = buffer.split('\n');
        buffer = lines.pop() ?? '';

        for (const line of lines) {
          if (line.startsWith('data: ')) {
            yield JSON.parse(line.slice(6)) as StreamChunk;
          }
        }
      }
    } finally {
      reader.releaseLock();
    }
  }

  /**
   * Get conversation by ID
   */
  async getConversation(conversationId: string): Promise<Conversation> {
    const response = await this.client.get(`/ai/conversations/${conversationId}`);
    return response.data;
  }

  /**
   * Create new conversation
   */
  async createConversation(model: string = 'llama2'): Promise<Conversation> {
    const response = await this.client.post('/ai/conversations', null, {
      params: { model },
    });
    return response.data;
  }

  /**
   * Check API health
   */
  async checkHealth(): Promise<{ status: string; ollama_url: string }> {
    const response = await this.client.get('/ai/health');
    return response.data;
  }
}

export const apiService = new ApiService();
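streamMessage returns an async generator, which you consume with for await...of. If async generators are new to you, here is a minimal, self-contained sketch (fakeStream is a stand-in for the real API call, not part of the app files):

```typescript
interface StreamChunk {
  content?: string;
  done?: boolean;
}

// Stand-in for apiService.streamMessage(...): yields a few fake chunks
async function* fakeStream(): AsyncGenerator<StreamChunk> {
  yield { content: 'Hel' };
  yield { content: 'lo' };
  yield { done: true };
}

// Accumulate streamed content, the same pattern the chat hook uses
async function collect(stream: AsyncGenerator<StreamChunk>): Promise<string> {
  let full = '';
  for await (const chunk of stream) {
    if (chunk.content) full += chunk.content;
  }
  return full;
}

collect(fakeStream()).then((text) => console.log(text)); // logs "Hello"
```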

Step 5: Create Custom Hooks

Create frontend/src/hooks/useChat.ts:

/**
 * Custom hook for managing chat state and API interactions
 */

import { useState, useCallback, useRef, useEffect } from 'react';
import type { Message, ChatSettings } from '../types/chat';
import { apiService } from '../services/api';

export interface UseChatReturn {
  messages: Message[];
  isLoading: boolean;
  error: string | null;
  conversationId: string | null;
  sendMessage: (content: string) => Promise<void>;
  clearMessages: () => void;
  settings: ChatSettings;
  updateSettings: (settings: Partial<ChatSettings>) => void;
}

export const useChat = (initialSettings?: Partial<ChatSettings>): UseChatReturn => {
  const [messages, setMessages] = useState<Message[]>([]);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const [conversationId, setConversationId] = useState<string | null>(null);
  const [settings, setSettings] = useState<ChatSettings>({
    model: 'llama2',
    temperature: 0.7,
    stream: true,
    ...initialSettings,
  });

  // Track mount status so we never update state after unmount
  const isMounted = useRef(true);

  useEffect(() => {
    isMounted.current = true;
    return () => {
      isMounted.current = false;
    };
  }, []);

  const updateSettings = useCallback((newSettings: Partial<ChatSettings>) => {
    setSettings((prev) => ({ ...prev, ...newSettings }));
  }, []);

  const clearMessages = useCallback(() => {
    setMessages([]);
    setConversationId(null);
    setError(null);
  }, []);

  const addMessage = useCallback((message: Omit<Message, 'id' | 'timestamp'>) => {
    const newMessage: Message = {
      ...message,
      id: crypto.randomUUID(), // collision-safe IDs, built into modern browsers
      timestamp: new Date(),
    };
    setMessages((prev) => [...prev, newMessage]);
    return newMessage;
  }, []);

  const updateMessage = useCallback((id: string, updates: Partial<Message>) => {
    setMessages((prev) =>
      prev.map((msg) => (msg.id === id ? { ...msg, ...updates } : msg))
    );
  }, []);

  const sendMessageNonStream = useCallback(
    async (content: string) => {
      // No try/catch needed here: sendMessage (below) handles errors
      const response = await apiService.sendMessage({
        message: content,
        model: settings.model,
        conversation_id: conversationId || undefined,
        temperature: settings.temperature,
        stream: false,
      });

      if (isMounted.current) {
        setConversationId(response.conversation_id);
        addMessage({
          role: 'assistant',
          content: response.message,
        });
      }
    },
    [settings, conversationId, addMessage]
  );

  const sendMessageStream = useCallback(
    async (content: string) => {
      // Create placeholder message for streaming
      const assistantMessage = addMessage({
        role: 'assistant',
        content: '',
        isStreaming: true,
      });

      try {
        let fullContent = '';

        for await (const chunk of apiService.streamMessage({
          message: content,
          model: settings.model,
          conversation_id: conversationId || undefined,
          temperature: settings.temperature,
          stream: true,
        })) {
          if (!isMounted.current) break;

          if (chunk.content) {
            fullContent += chunk.content;
            updateMessage(assistantMessage.id, {
              content: fullContent,
            });
          }

          if (chunk.done && chunk.conversation_id) {
            setConversationId(chunk.conversation_id);
            updateMessage(assistantMessage.id, {
              isStreaming: false,
            });
          }

          if (chunk.error) {
            throw new Error(chunk.error);
          }
        }
      } catch (err) {
        if (isMounted.current) {
          updateMessage(assistantMessage.id, {
            content: 'Error: Failed to get response',
            isStreaming: false,
          });
        }
        throw err;
      }
    },
    [settings, conversationId, addMessage, updateMessage]
  );

  const sendMessage = useCallback(
    async (content: string) => {
      if (!content.trim()) return;

      setIsLoading(true);
      setError(null);

      // Add user message
      addMessage({
        role: 'user',
        content: content.trim(),
      });

      try {
        if (settings.stream) {
          await sendMessageStream(content);
        } else {
          await sendMessageNonStream(content);
        }
      } catch (err) {
        const errorMessage = err instanceof Error ? err.message : 'An error occurred';
        if (isMounted.current) {
          setError(errorMessage);
        }
      } finally {
        if (isMounted.current) {
          setIsLoading(false);
        }
      }
    },
    [settings.stream, sendMessageStream, sendMessageNonStream, addMessage]
  );

  return {
    messages,
    isLoading,
    error,
    conversationId,
    sendMessage,
    clearMessages,
    settings,
    updateSettings,
  };
};

Create frontend/src/hooks/useModels.ts:

/**
 * Custom hook for managing AI models
 */

import { useState, useEffect } from 'react';
import type { ModelInfo } from '../types/chat';
import { apiService } from '../services/api';

export interface UseModelsReturn {
  models: ModelInfo[];
  isLoading: boolean;
  error: string | null;
  refetch: () => Promise<void>;
}

export const useModels = (): UseModelsReturn => {
  const [models, setModels] = useState<ModelInfo[]>([]);
  const [isLoading, setIsLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  const fetchModels = async () => {
    setIsLoading(true);
    setError(null);

    try {
      const data = await apiService.getModels();
      setModels(data);
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'Failed to load models';
      setError(errorMessage);
    } finally {
      setIsLoading(false);
    }
  };

  useEffect(() => {
    fetchModels();
  }, []);

  return {
    models,
    isLoading,
    error,
    refetch: fetchModels,
  };
};

Step 6: Create Chat Components

Create frontend/src/components/ChatMessage.tsx:

/**
 * Individual chat message component
 */

import React from 'react';
import type { Message } from '../types/chat';
import { User, Bot, Loader2 } from 'lucide-react';

interface ChatMessageProps {
  message: Message;
}

export const ChatMessage: React.FC<ChatMessageProps> = ({ message }) => {
  const isUser = message.role === 'user';

  return (
    <div
      className={`flex gap-3 animate-slide-up ${
        isUser ? 'justify-end' : 'justify-start'
      }`}
    >
      {!isUser && (
        <div className="flex-shrink-0 w-8 h-8 rounded-full bg-gradient-to-br from-primary-500 to-secondary-500 flex items-center justify-center">
          <Bot className="w-5 h-5 text-white" />
        </div>
      )}

      <div
        className={`max-w-[70%] rounded-2xl px-4 py-3 ${
          isUser
            ? 'bg-gradient-to-br from-primary-500 to-secondary-500 text-white'
            : 'bg-white text-gray-800 shadow-md'
        }`}
      >
        <div className="whitespace-pre-wrap break-words">
          {message.content || (
            <div className="flex items-center gap-2">
              <Loader2 className="w-4 h-4 animate-spin" />
              <span className="text-sm">Thinking...</span>
            </div>
          )}
        </div>

        <div
          className={`text-xs mt-2 ${
            isUser ? 'text-white/70' : 'text-gray-500'
          }`}
        >
          {message.timestamp.toLocaleTimeString([], {
            hour: '2-digit',
            minute: '2-digit',
          })}
          {message.isStreaming && (
            <span className="ml-2 inline-flex items-center">
              <Loader2 className="w-3 h-3 animate-spin" />
            </span>
          )}
        </div>
      </div>

      {isUser && (
        <div className="flex-shrink-0 w-8 h-8 rounded-full bg-gray-200 flex items-center justify-center">
          <User className="w-5 h-5 text-gray-600" />
        </div>
      )}
    </div>
  );
};

Create frontend/src/components/ChatInput.tsx:

/**
 * Chat input component with send button
 */

import React, { useState, useRef, useEffect, type KeyboardEvent } from 'react';
import { Send } from 'lucide-react';

interface ChatInputProps {
  onSend: (message: string) => void;
  disabled?: boolean;
  placeholder?: string;
}

export const ChatInput: React.FC<ChatInputProps> = ({
  onSend,
  disabled = false,
  placeholder = 'Type your message...',
}) => {
  const [input, setInput] = useState('');
  const textareaRef = useRef<HTMLTextAreaElement>(null);

  // Auto-resize textarea
  useEffect(() => {
    if (textareaRef.current) {
      textareaRef.current.style.height = 'auto';
      textareaRef.current.style.height = `${textareaRef.current.scrollHeight}px`;
    }
  }, [input]);

  const handleSend = () => {
    if (input.trim() && !disabled) {
      onSend(input);
      setInput('');
    }
  };

  const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>) => {
    if (e.key === 'Enter' && !e.shiftKey) {
      e.preventDefault();
      handleSend();
    }
  };

  return (
    <div className="border-t border-gray-200 bg-white px-4 py-4">
      <div className="flex gap-2 items-end">
        <textarea
          ref={textareaRef}
          value={input}
          onChange={(e) => setInput(e.target.value)}
          onKeyDown={handleKeyDown}
          placeholder={placeholder}
          disabled={disabled}
          rows={1}
          className="flex-1 resize-none input max-h-32 disabled:opacity-50 disabled:cursor-not-allowed"
        />
        <button
          onClick={handleSend}
          disabled={disabled || !input.trim()}
          className="btn btn-primary disabled:opacity-50 disabled:cursor-not-allowed flex items-center gap-2"
        >
          <Send className="w-4 h-4" />
          <span className="hidden sm:inline">Send</span>
        </button>
      </div>
      <div className="text-xs text-gray-500 mt-2">
        Press Enter to send, Shift + Enter for new line
      </div>
    </div>
  );
};

Create frontend/src/components/ChatHeader.tsx:

/**
 * Chat header with model selector and settings
 */

import React from 'react';
import { Settings, Trash2 } from 'lucide-react';
import type { ModelInfo, ChatSettings } from '../types/chat';

interface ChatHeaderProps {
  models: ModelInfo[];
  settings: ChatSettings;
  onSettingsChange: (settings: Partial<ChatSettings>) => void;
  onClear: () => void;
  isLoading?: boolean;
}

export const ChatHeader: React.FC<ChatHeaderProps> = ({
  models,
  settings,
  onSettingsChange,
  onClear,
  isLoading = false,
}) => {
  return (
    <div className="bg-gradient-to-r from-primary-500 to-secondary-500 text-white px-6 py-4">
      <div className="flex items-center justify-between mb-4">
        <h1 className="text-2xl font-bold flex items-center gap-2">
          🤖 AI Chat
        </h1>
        <button
          onClick={onClear}
          disabled={isLoading}
          className="p-2 hover:bg-white/20 rounded-lg transition-colors disabled:opacity-50"
          title="Clear conversation"
        >
          <Trash2 className="w-5 h-5" />
        </button>
      </div>

      <div className="flex flex-wrap gap-4 items-center">
        {/* Model Selector */}
        <div className="flex items-center gap-2">
          <Settings className="w-4 h-4" />
          <select
            value={settings.model}
            onChange={(e) => onSettingsChange({ model: e.target.value })}
            disabled={isLoading}
            className="bg-white/20 border border-white/30 rounded-lg px-3 py-1.5 text-sm focus:outline-none focus:ring-2 focus:ring-white/50 disabled:opacity-50"
          >
            {models.length === 0 ? (
              <option>Loading models...</option>
            ) : (
              models.map((model) => (
                <option key={model.name} value={model.name.split(':')[0]}>
                  {model.name}
                </option>
              ))
            )}
          </select>
        </div>

        {/* Temperature Slider */}
        <div className="flex items-center gap-2">
          <span className="text-sm">Temperature:</span>
          <input
            type="range"
            min="0"
            max="2"
            step="0.1"
            value={settings.temperature}
            onChange={(e) =>
              onSettingsChange({ temperature: parseFloat(e.target.value) })
            }
            disabled={isLoading}
            className="w-24 disabled:opacity-50"
          />
          <span className="text-sm font-mono w-8">
            {settings.temperature.toFixed(1)}
          </span>
        </div>

        {/* Stream Toggle */}
        <label className="flex items-center gap-2 cursor-pointer">
          <input
            type="checkbox"
            checked={settings.stream}
            onChange={(e) => onSettingsChange({ stream: e.target.checked })}
            disabled={isLoading}
            className="w-4 h-4 disabled:opacity-50"
          />
          <span className="text-sm">Stream</span>
        </label>
      </div>
    </div>
  );
};

Create frontend/src/components/ChatMessages.tsx:

/**
 * Container for chat messages with auto-scroll
 */

import React, { useEffect, useRef } from 'react';
import type { Message } from '../types/chat';
import { ChatMessage } from './ChatMessage';

interface ChatMessagesProps {
  messages: Message[];
}

export const ChatMessages: React.FC<ChatMessagesProps> = ({ messages }) => {
  const messagesEndRef = useRef<HTMLDivElement>(null);
  const containerRef = useRef<HTMLDivElement>(null);

  // Auto-scroll to bottom when new messages arrive
  useEffect(() => {
    messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
  }, [messages]);

  return (
    <div
      ref={containerRef}
      className="flex-1 overflow-y-auto px-6 py-4 space-y-4 bg-gray-50"
    >
      {messages.length === 0 ? (
        <div className="h-full flex items-center justify-center">
          <div className="text-center text-gray-500">
            <p className="text-lg mb-2">👋 Welcome to AI Chat!</p>
            <p className="text-sm">Send a message to get started</p>
          </div>
        </div>
      ) : (
        <>
          {messages.map((message) => (
            <ChatMessage key={message.id} message={message} />
          ))}
          <div ref={messagesEndRef} />
        </>
      )}
    </div>
  );
};

Create frontend/src/components/ErrorBoundary.tsx:

/**
 * Error boundary for graceful error handling
 */

import React, { Component, type ErrorInfo, type ReactNode } from 'react';
import { AlertTriangle } from 'lucide-react';

interface Props {
  children: ReactNode;
}

interface State {
  hasError: boolean;
  error: Error | null;
}

export class ErrorBoundary extends Component<Props, State> {
  constructor(props: Props) {
    super(props);
    this.state = {
      hasError: false,
      error: null,
    };
  }

  static getDerivedStateFromError(error: Error): State {
    return {
      hasError: true,
      error,
    };
  }

  componentDidCatch(error: Error, errorInfo: ErrorInfo) {
    console.error('Error caught by boundary:', error, errorInfo);
  }

  render() {
    if (this.state.hasError) {
      return (
        <div className="min-h-screen flex items-center justify-center bg-gray-50 p-4">
          <div className="card p-8 max-w-md w-full text-center">
            <AlertTriangle className="w-16 h-16 text-red-500 mx-auto mb-4" />
            <h1 className="text-2xl font-bold text-gray-900 mb-2">
              Oops! Something went wrong
            </h1>
            <p className="text-gray-600 mb-4">
              {this.state.error?.message || 'An unexpected error occurred'}
            </p>
            <button
              onClick={() => window.location.reload()}
              className="btn btn-primary"
            >
              Reload Page
            </button>
          </div>
        </div>
      );
    }

    return this.props.children;
  }
}

Step 7: Create Main Chat Container

Create frontend/src/components/ChatContainer.tsx:

/**
 * Main chat container component
 */

import React, { useEffect, useState } from 'react';
import { ChatHeader } from './ChatHeader';
import { ChatMessages } from './ChatMessages';
import { ChatInput } from './ChatInput';
import { useChat } from '../hooks/useChat';
import { useModels } from '../hooks/useModels';
import { AlertCircle, Wifi, WifiOff } from 'lucide-react';

export const ChatContainer: React.FC = () => {
  const { messages, isLoading, error, sendMessage, clearMessages, settings, updateSettings } =
    useChat();
  const { models, isLoading: modelsLoading, error: modelsError } = useModels();
  const [showError, setShowError] = useState(true);

  // Re-show the banner whenever a new error arrives after being dismissed
  useEffect(() => {
    if (error || modelsError) setShowError(true);
  }, [error, modelsError]);

  return (
    <div className="h-screen flex items-center justify-center p-4">
      <div className="card w-full max-w-4xl h-[90vh] flex flex-col overflow-hidden">
        {/* Header */}
        <ChatHeader
          models={models}
          settings={settings}
          onSettingsChange={updateSettings}
          onClear={clearMessages}
          isLoading={isLoading}
        />

        {/* Error Display */}
        {(error || modelsError) && showError && (
          <div className="bg-red-50 border-l-4 border-red-500 p-4 m-4">
            <div className="flex items-start">
              <AlertCircle className="w-5 h-5 text-red-500 mt-0.5 mr-3 flex-shrink-0" />
              <div className="flex-1">
                <h3 className="text-sm font-medium text-red-800">Error</h3>
                <p className="text-sm text-red-700 mt-1">
                  {error || modelsError}
                </p>
                {modelsError && (
                  <p className="text-xs text-red-600 mt-2">
                    Make sure your FastAPI server is running on http://localhost:8000
                  </p>
                )}
              </div>
              <button
                onClick={() => setShowError(false)}
                className="text-red-500 hover:text-red-700"
              >
                ×
              </button>
            </div>
          </div>
        )}

        {/* Connection Status */}
        <div className="px-4 py-2 bg-gray-50 border-b border-gray-200 flex items-center gap-2 text-sm">
          {modelsLoading ? (
            <>
              <WifiOff className="w-4 h-4 text-gray-400" />
              <span className="text-gray-600">Connecting...</span>
            </>
          ) : modelsError ? (
            <>
              <WifiOff className="w-4 h-4 text-red-500" />
              <span className="text-red-600">Disconnected</span>
            </>
          ) : (
            <>
              <Wifi className="w-4 h-4 text-green-500" />
              <span className="text-green-600">Connected</span>
              <span className="text-gray-400">•</span>
              <span className="text-gray-600">{models.length} models available</span>
            </>
          )}
        </div>

        {/* Messages */}
        <ChatMessages messages={messages} />

        {/* Input */}
        <ChatInput
          onSend={sendMessage}
          disabled={isLoading || modelsLoading || !!modelsError}
          placeholder={
            modelsError
              ? 'Cannot connect to server...'
              : isLoading
              ? 'Waiting for response...'
              : 'Type your message...'
          }
        />
      </div>
    </div>
  );
};

Step 8: Update Main App Component

Update frontend/src/App.tsx:

import { ErrorBoundary } from './components/ErrorBoundary';
import { ChatContainer } from './components/ChatContainer';

function App() {
  return (
    <ErrorBoundary>
      <ChatContainer />
    </ErrorBoundary>
  );
}

export default App;

Step 9: Create Environment Configuration

Create frontend/.env:

# API Configuration
VITE_API_URL=http://localhost:8000/api/v1

Create frontend/.env.example:

# API Configuration
# Update this to match your FastAPI backend URL
VITE_API_URL=http://localhost:8000/api/v1
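Optionally, you can teach TypeScript about this variable so import.meta.env.VITE_API_URL autocompletes. The Vite template already ships frontend/src/vite-env.d.ts; you can extend it like this (a type-only declaration fragment, nothing to run):

```typescript
/// <reference types="vite/client" />

interface ImportMetaEnv {
  readonly VITE_API_URL?: string;
}

interface ImportMeta {
  readonly env: ImportMetaEnv;
}
```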

Step 10: Update Package Scripts

Update frontend/package.json scripts section:

{
  "scripts": {
    "dev": "vite",
    "build": "tsc -b && vite build",
    "lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0",
    "preview": "vite preview"
  }
}

Step 11: Configure Vite

Update frontend/vite.config.ts:

import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'
import tailwindcss from '@tailwindcss/vite'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react(), tailwindcss()],
  server: {
    port: 3000,
    // The proxy lets you set VITE_API_URL=/api/v1 and avoid CORS entirely
    proxy: {
      '/api': {
        target: 'http://localhost:8000',
        changeOrigin: true,
      },
    },
  },
})

🧪 Running Your Complete Application

Complete Setup & Testing

Step 1: Start All Services

# Terminal 1: Start Ollama
ollama serve

# Terminal 2: Start FastAPI Backend
cd ~/Documents/fastapi-ai-backend
source venv/bin/activate
python main.py

# Terminal 3: Start React Frontend
cd ~/Documents/fastapi-ai-backend/frontend
npm run dev

Step 2: Verify Everything Works

Open each service in your browser. You should see:

  • Ollama (http://localhost:11434): "Ollama is running"
  • FastAPI (http://localhost:8000/docs): Swagger UI with all endpoints
  • React (http://localhost:3000): the chat interface

📊 Complete Project Structure

fastapi-ai-backend/
├── app/                          # FastAPI Backend
│   ├── api/v1/endpoints/
│   │   ├── ai.py                # AI endpoints
│   │   ├── users.py             # User endpoints
│   │   └── ...
│   ├── models/
│   │   ├── ai.py                # AI Pydantic models
│   │   └── ...
│   ├── services/
│   │   ├── ai_service.py        # Ollama service
│   │   └── ...
│   ├── middleware/
│   ├── core/
│   └── main.py
│
├── frontend/                     # React Frontend ✨ NEW!
│   ├── src/
│   │   ├── components/
│   │   │   ├── ChatContainer.tsx
│   │   │   ├── ChatHeader.tsx
│   │   │   ├── ChatMessages.tsx
│   │   │   ├── ChatMessage.tsx
│   │   │   ├── ChatInput.tsx
│   │   │   └── ErrorBoundary.tsx
│   │   ├── hooks/
│   │   │   ├── useChat.ts
│   │   │   └── useModels.ts
│   │   ├── services/
│   │   │   └── api.ts
│   │   ├── types/
│   │   │   └── chat.ts
│   │   ├── App.tsx
│   │   ├── main.tsx
│   │   └── index.css
│   ├── .env
│   ├── vite.config.ts
│   └── package.json
│
├── venv/
├── .env
└── requirements.txt

🎯 What You’ve Accomplished!

Technical Achievements:

Full-Stack Application

  • FastAPI backend (Python)
  • React frontend (TypeScript)
  • Real-time streaming
  • Modern build tools

Production-Ready Architecture

  • Type-safe frontend
  • Component-based UI
  • Custom hooks for logic
  • Error boundaries
  • Responsive design

Advanced Features

  • Server-Sent Events streaming
  • Multi-model support
  • Conversation management
  • Settings configuration
  • Beautiful, modern UI

Skills Mastered:

Frontend Development

  • React with TypeScript
  • Custom hooks
  • State management
  • API integration
  • Component design

Modern Tooling

  • Vite for fast development
  • TailwindCSS for styling
  • ESLint for code quality
  • TypeScript for type safety

Best Practices

  • Component composition
  • Separation of concerns
  • Error handling
  • Performance optimization
  • Accessibility

Why use Vite instead of Create React App (CRA) for an AI project?

Vite offers significantly faster development startup times and hot module replacement (HMR) by utilizing native ES modules. For AI projects that require frequent UI updates and real-time streaming testing, Vite’s speed and modern build pipeline provide a much smoother developer experience compared to the now-deprecated CRA.
How do you display streaming AI responses in React?

To display streaming text, use a custom hook that processes the response from your FastAPI /chat/stream endpoint. Using fetch and a ReadableStreamDefaultReader, you read incoming "chunks" of text and append them to the existing message state in real time. This creates the "typewriter" effect common in modern AI interfaces like ChatGPT.

Why use TypeScript for a chat interface?

TypeScript ensures that your message objects (with fields like role, content, and id) follow a strict schema, preventing runtime errors when rendering message lists. It also provides excellent IDE autocomplete for the props passed between ChatInput, ChatContainer, and ChatMessages, making the codebase easier to maintain as it grows.

How do you auto-scroll to the latest message?

Use a useRef hook to target an empty div at the bottom of your message container, and a useEffect hook that calls scrollIntoView() whenever the messages array or the streaming content updates. This ensures the user always sees the most recent part of the conversation without manual scrolling.

How does TailwindCSS keep the UI fast?

By using utility-first classes directly in your JSX/TSX files, Tailwind compiles only the CSS you actually use into a single, small file. With Tailwind v4 and the @tailwindcss/vite plugin, it is enough for index.css to start with @import "tailwindcss"; your source files are scanned automatically, with no tailwind.config.js required. The result is an extremely fast-loading UI with no layout shifts.
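One subtlety in the streaming chunk handling described above: a network read can end in the middle of a data: line. Isolating the buffering in a pure function (a sketch, not one of the app files) makes the logic easy to reason about and test:

```typescript
// Split buffered SSE text into complete `data: ...` payloads, returning the
// trailing partial line so it can be prepended to the next network chunk.
function parseSseBuffer(buffer: string): { payloads: string[]; rest: string } {
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? ''; // last element may be an incomplete line
  const payloads = lines
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice(6));
  return { payloads, rest };
}
```

Feed rest plus the next network chunk back into the function, and only JSON.parse payloads once they are complete.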

🚀 Next Steps in Your Learning Journey

Immediate Enhancements:

  1. Add Features:
    • File upload support
    • Code syntax highlighting
    • Markdown rendering
    • Conversation export
  2. Improve UX:
    • Loading skeletons
    • Toast notifications
    • Keyboard shortcuts
    • Mobile optimization
  3. Performance:
    • React.memo for heavy components
    • Virtual scrolling for long chats
    • Lazy loading for routes
    • Service worker for offline support
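To make the virtual-scrolling idea concrete: the core is just arithmetic, rendering only the rows that intersect the viewport. This sketch assumes a fixed row height (chat messages vary in height, so in practice a library such as react-window handles the measuring):

```typescript
// Given scroll position, compute which message indices to render.
// Assumes a fixed row height; one row of overscan on each side.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(scrollTop / rowHeight) - 1);
  const end = Math.min(
    totalRows,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + 1
  );
  return { start, end }; // render rows in [start, end)
}
```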

Coming in Future Posts:

  • Ep.08: Database Integration (PostgreSQL + SQLAlchemy)
  • Ep.09: Authentication & Authorization (JWT)
  • Ep.10: Deployment (Docker, CI/CD, Production)

💡 Pro Tips

Development Workflow

# Use concurrently to run all services
npm install -g concurrently

# Add to package.json:
"scripts": {
  "dev:all": "concurrently \"npm:dev:backend\" \"npm:dev:frontend\"",
  "dev:backend": "cd .. && python main.py",
  "dev:frontend": "vite"
}

# Now just run:
npm run dev:all

Environment Management

# Development
cp .env.example .env

# Production
cp .env.example .env.production

# Switch easily
npm run dev           # Uses .env
npm run build         # Uses .env.production

Debugging Tips

// 1. Add console groups
console.group('API Call');
console.log('Request:', request);
console.log('Response:', response);
console.groupEnd();

// 2. Use debugger
const sendMessage = async (content: string) => {
  debugger; // Execution pauses here
  // ...
};

// 3. React DevTools Profiler
// Measure component render performance

🎉 Summary

You’ve successfully built a complete, production-ready, full-stack AI application!

Your Stack:

  • ⚡ FastAPI (Python) – Lightning-fast backend
  • ⚛️ React (TypeScript) – Modern, type-safe frontend
  • 🤖 Ollama – Local AI models
  • 🎨 TailwindCSS – Beautiful, responsive UI
  • 🔥 Vite – Instant development feedback

You can now:

  • 💬 Build chat applications
  • 🌊 Implement streaming
  • 🎨 Create beautiful UIs
  • 🔧 Manage complex state
  • 🚀 Deploy to production

This is a massive achievement! You’ve gone from zero to a full-stack AI application with modern best practices.

Congratulations! 🎊



Ready to continue with Ep.08: Database Integration?
