Stock Earnings Analyzer

Python · FastAPI · Svelte 5 · PostgreSQL · Redis · Anthropic API · Brave Search API · Clerk Auth · SQLAlchemy · Tailwind CSS 4 · Vite · Docker

Project Overview

Stock Earnings Analyzer is a financial intelligence platform that displays weekly earnings calendars for publicly traded companies and delivers AI-powered analysis of individual earnings reports. The backend fetches earnings data from Alpha Vantage, enriches it with market cap data, and stores it in PostgreSQL with upsert logic to maintain a clean, deduplicated dataset.

When a user requests analysis for a specific stock, the system searches for the actual earnings press release and market reaction using the Brave Search API, scrapes and extracts the full article text while prioritizing primary sources (BusinessWire, PR Newswire, SEC filings, investor relations pages), then sends the extracted content to the Anthropic Claude API with a structured tool-use schema to produce precise financial metrics: EPS, revenue, guidance summary, sentiment scoring, and price reaction data. Results stream to the frontend via Server-Sent Events for real-time status updates.

The frontend is built with Svelte 5 runes and features interactive SVG price charts with hover tooltips, batched sparkline loading for performance, a favorites/watchlist system backed by Clerk JWT authentication, and a curated news feed aggregated from NewsAPI and Brave Search.

Key Features

Weekly earnings calendar with navigation between weeks and top-company highlights sorted by market cap
AI-powered earnings analysis using Claude with structured tool-use for precise financial metric extraction
Real-time analysis progress streamed via Server-Sent Events (searching, reading articles, analyzing, saving)
Interactive SVG price charts with multiple timeframes (1D to 5Y) and hover crosshair with price/date tooltips
Batched sparkline loading system with module-level caching and 50ms debounced batch requests for performance
User watchlist/favorites system with Clerk JWT authentication and persistent PostgreSQL storage
Stock news feed aggregated from NewsAPI and Brave Search with relative time formatting
Primary source prioritization for analysis — scrapes BusinessWire, PR Newswire, SEC filings, and IR pages before secondary sources
Multi-layer caching with Redis (calendar 4h TTL, market cap 24h, charts 5min–1h, analysis results) with graceful fallback when Redis is unavailable
Homepage with curated "Last Week's Top Earnings" and "This Week to Watch" sections with most-anticipated highlights
Full-text stock search with ticker lookup and earnings event discovery
Responsive glassmorphism UI with custom CSS theme system using CSS custom properties

Challenges Solved

Designing a streaming analysis pipeline that orchestrates web search, HTML scraping, and AI inference while providing real-time progress to the client via SSE
Implementing reliable web scraping that prioritizes primary earnings sources (press releases, SEC filings) and falls back to secondary coverage, with concurrent page fetching and character limits to stay within AI context windows
Building an efficient sparkline batching system at the Svelte module level that debounces individual component requests into chunked API calls with client-side caching
Creating interactive SVG charts from scratch with computed derived state for chart paths, area fills, Y-axis labels, hover tracking, and responsive scaling without a charting library
Handling Brave Search API rate limiting with exponential backoff retry logic and sequential query spacing to avoid 429 responses
Implementing Clerk JWT authentication with JWKS caching, RS256 verification, and optional auth middleware that gracefully degrades for unauthenticated users
Designing PostgreSQL upsert logic with unique constraints to handle duplicate earnings events from repeated Alpha Vantage imports without data corruption

Technologies Used

FastAPI & Async Python

Built a fully async backend with FastAPI using async generators for SSE streaming, asyncpg for non-blocking PostgreSQL queries, and httpx for concurrent HTTP requests. Implements lifespan management for database table creation on startup and Redis cleanup on shutdown.
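The streaming portion of this pattern can be sketched as an async generator that yields SSE-framed progress events. This is a minimal illustration, not the app's actual code: the step names mirror the pipeline described above, and in the real app the generator would be wrapped in FastAPI's `StreamingResponse` with `media_type="text/event-stream"`.

```python
import asyncio
import json

async def analysis_events(ticker):
    """Async generator yielding SSE-framed progress events.
    Each yield is one complete SSE event: event line, data line, blank line."""
    for step in ("searching", "reading_articles", "analyzing", "saving"):
        payload = json.dumps({"ticker": ticker, "step": step})
        yield f"event: status\ndata: {payload}\n\n"
        await asyncio.sleep(0)  # real steps would await search/scrape/AI calls

async def collect():
    # Drain the generator the way StreamingResponse would consume it.
    return [event async for event in analysis_events("AAPL")]

frames = asyncio.run(collect())
```

In FastAPI, the endpoint would simply `return StreamingResponse(analysis_events(ticker), media_type="text/event-stream")`, letting each yielded string flush to the client as a discrete event.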

Anthropic Claude API (Tool Use)

Integrated Claude with a forced tool-use pattern where the model must call an earnings_analysis_result tool with a strict JSON schema, ensuring structured output with financial metrics (EPS, revenue, guidance, sentiment) rather than free-text responses.
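The forced tool-use pattern looks roughly like the sketch below. The field names and model id are illustrative placeholders based on the metrics listed above, not the app's exact schema; what matters is that `tool_choice` pins the model to the named tool, so the response is guaranteed to be a structured tool call rather than free text.

```python
# Tool schema forcing structured output; fields follow the metrics described
# above, but the exact schema in the real app may differ.
earnings_tool = {
    "name": "earnings_analysis_result",
    "description": "Record structured metrics extracted from an earnings report.",
    "input_schema": {
        "type": "object",
        "properties": {
            "eps_actual": {"type": "number"},
            "eps_estimate": {"type": "number"},
            "revenue": {"type": "string"},
            "guidance_summary": {"type": "string"},
            "sentiment": {"type": "string", "enum": ["bullish", "neutral", "bearish"]},
            "price_reaction": {"type": "string"},
        },
        "required": ["eps_actual", "revenue", "sentiment"],
    },
}

# The request then forces the model to call that specific tool:
request = {
    "model": "claude-sonnet-4-5",  # placeholder model id
    "max_tokens": 1024,
    "tools": [earnings_tool],
    "tool_choice": {"type": "tool", "name": "earnings_analysis_result"},
    "messages": [{"role": "user", "content": "<extracted article text here>"}],
}
# With the official SDK this would be: anthropic.Anthropic().messages.create(**request)
```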

Brave Search & Web Scraping

Built a two-query search strategy (press release + market reaction) with exponential backoff retry logic for rate limiting. Implements concurrent page fetching with BeautifulSoup extraction, prioritizing primary sources over secondary coverage, with per-page and total character limits.
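The backoff logic can be sketched generically as below. The stub stands in for the real httpx call to the Brave Search API, and the tiny delays are for illustration only; the production delays and retry count are assumptions.

```python
import asyncio
import random

async def fetch_with_backoff(fetch, max_retries=4, base_delay=0.01):
    """Retry `fetch` on HTTP 429, doubling the wait each attempt (with jitter).
    In the real pipeline `fetch` would be an httpx request to Brave Search."""
    for attempt in range(max_retries):
        resp = await fetch()
        if resp["status"] != 429:
            return resp
        await asyncio.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    raise RuntimeError("still rate limited after retries")

# Stub that returns 429 twice, then succeeds — stands in for the real API.
calls = {"n": 0}
async def stub():
    calls["n"] += 1
    return {"status": 429} if calls["n"] < 3 else {"status": 200, "results": ["..."]}

result = asyncio.run(fetch_with_backoff(stub))
```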

Svelte 5 (Runes)

Leveraged Svelte 5 runes syntax ($state, $derived, $derived.by, $effect, $props) for reactive state management. Built complex derived computations for SVG chart geometry, hover interactions, and price change calculations using the new fine-grained reactivity system.

PostgreSQL & SQLAlchemy

Designed a normalized schema with EarningsEvent, EarningsAnalysis (linked by FK), and UserFavorite tables. Uses JSONB columns for raw analysis storage, custom enums for report timing and sentiment, and unique constraints for upsert-safe data ingestion.
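In raw DDL, the schema described above would look roughly like this sketch. Column names beyond those mentioned in the text (and the enum values) are illustrative assumptions.

```sql
CREATE TYPE report_time AS ENUM ('pre_market', 'post_market', 'unspecified');
CREATE TYPE sentiment AS ENUM ('bullish', 'neutral', 'bearish');

CREATE TABLE earnings_events (
    id          SERIAL PRIMARY KEY,
    ticker      TEXT NOT NULL,
    report_date DATE NOT NULL,
    report_time report_time,
    UNIQUE (ticker, report_date)        -- makes repeated imports upsert-safe
);

CREATE TABLE earnings_analyses (
    id           SERIAL PRIMARY KEY,
    event_id     INTEGER NOT NULL REFERENCES earnings_events (id),
    sentiment    sentiment,
    raw_analysis JSONB                  -- full structured tool output
);

CREATE TABLE user_favorites (
    user_id TEXT NOT NULL,              -- Clerk user id
    ticker  TEXT NOT NULL,
    PRIMARY KEY (user_id, ticker)
);
```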

Redis Caching Layer

Implemented domain-aware caching with different TTLs per data type. The cache layer gracefully degrades when Redis is unavailable, allowing the app to function without caching rather than failing.
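The graceful-degradation behavior can be sketched as a thin wrapper that treats any Redis failure as a cache miss. This is a simplified illustration; the real layer is async and domain-aware, but the no-op-on-failure idea is the same.

```python
import json

class Cache:
    """Thin cache wrapper that no-ops when Redis is unreachable."""

    def __init__(self, client=None):
        # client would be e.g. redis.Redis(...); None disables caching entirely
        self.client = client

    def get(self, key):
        if self.client is None:
            return None
        try:
            raw = self.client.get(key)
            return json.loads(raw) if raw else None
        except Exception:
            return None  # treat any cache error as a miss

    def set(self, key, value, ttl):
        if self.client is None:
            return
        try:
            self.client.set(key, json.dumps(value), ex=ttl)
        except Exception:
            pass  # never let caching failures break the request

cache = Cache()          # no Redis configured
miss = cache.get("calendar:2024-W18")
cache.set("calendar:2024-W18", {"events": []}, ttl=4 * 3600)  # silently no-ops
```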

Clerk Authentication

JWT-based auth with JWKS endpoint caching for RS256 token verification. Supports both required and optional auth middleware, enabling mixed authenticated/public endpoints. Frontend uses the Clerk JS SDK for sign-in/out flows and session token management.
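The optional-auth behavior can be sketched as below. `verify_token` is a stand-in for the real RS256 verification against Clerk's cached JWKS (e.g. via PyJWT's `PyJWKClient`); the key point is that missing or invalid credentials yield `None` instead of an error, so public endpoints keep working.

```python
def optional_user(authorization, verify_token):
    """Return the user id if a valid Bearer token is present, else None.
    `verify_token` would wrap RS256 verification against Clerk's cached JWKS
    and raise on a bad signature or expired token."""
    if not authorization or not authorization.startswith("Bearer "):
        return None
    token = authorization.removeprefix("Bearer ")
    try:
        claims = verify_token(token)
        return claims.get("sub")  # Clerk puts the user id in the `sub` claim
    except Exception:
        return None  # invalid token degrades to anonymous, not a 401

anonymous = optional_user(None, lambda t: {})
user = optional_user("Bearer some.jwt.token", lambda t: {"sub": "user_123"})
rejected = optional_user("Bearer bad", lambda t: (_ for _ in ()).throw(ValueError()))
```

A required-auth dependency would be the same function but raising a 401 instead of returning `None`.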

Docker Multi-Stage Build

Production deployment uses a two-stage Dockerfile: Node.js for the Svelte/Vite build with build-time environment variables, and Python for the FastAPI server serving the SPA from static files with a catch-all route for client-side routing.
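A two-stage Dockerfile of this shape might look like the sketch below; directory names, the env var, and the Python/Node versions are assumptions, not the project's actual file.

```dockerfile
# Stage 1: build the Svelte SPA (paths and env var names are illustrative)
FROM node:20 AS frontend
WORKDIR /app/frontend
COPY frontend/package*.json ./
RUN npm ci
COPY frontend/ ./
ARG VITE_CLERK_PUBLISHABLE_KEY
RUN npm run build

# Stage 2: serve the API plus the built static files
FROM python:3.12-slim
WORKDIR /app
COPY backend/requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY backend/ ./
COPY --from=frontend /app/frontend/dist ./static
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

The FastAPI app then mounts `./static` and adds a catch-all route returning `index.html`, so client-side routes resolve on hard refresh.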

Development Process

1. Data Pipeline & Earnings Calendar

Built the earnings data pipeline fetching from Alpha Vantage EARNINGS_CALENDAR endpoint, implementing PostgreSQL upserts with ON CONFLICT handling on a unique ticker/report_date constraint. Added batch market cap enrichment with Redis caching (24h TTL) to sort and display companies by size.
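With SQLAlchemy, the ON CONFLICT upsert described above can be sketched like this (table columns and constraint name are illustrative; the statement is compiled against the PostgreSQL dialect to show the generated SQL without a database connection):

```python
import datetime

from sqlalchemy import Column, Date, MetaData, String, Table, UniqueConstraint
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import insert

metadata = MetaData()
# Minimal stand-in for the EarningsEvent table; the real model has more columns.
earnings_events = Table(
    "earnings_events", metadata,
    Column("ticker", String, nullable=False),
    Column("report_date", Date, nullable=False),
    Column("company_name", String),
    UniqueConstraint("ticker", "report_date", name="uq_ticker_report_date"),
)

stmt = insert(earnings_events).values(
    ticker="AAPL",
    report_date=datetime.date(2024, 5, 2),
    company_name="Apple Inc.",
)
# On a duplicate (ticker, report_date), refresh mutable fields instead of erroring.
stmt = stmt.on_conflict_do_update(
    constraint="uq_ticker_report_date",
    set_={"company_name": stmt.excluded.company_name},
)
sql = str(stmt.compile(dialect=postgresql.dialect()))
```

Repeated Alpha Vantage imports then converge on one row per (ticker, report_date) instead of accumulating duplicates.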

2. AI Analysis Engine with Structured Output

Developed the core analysis pipeline: Brave Search queries for earnings press releases and market reaction articles, concurrent HTML scraping with BeautifulSoup prioritizing primary sources (BusinessWire, PR Newswire, SEC.gov, investor relations pages), and Anthropic Claude API integration using forced tool-use to extract structured financial metrics (EPS, revenue, guidance, sentiment, price reaction).

3. Real-time Streaming Architecture

Implemented Server-Sent Events streaming from FastAPI async generators, delivering step-by-step progress updates (cache check → web search → article reading → AI analysis → database save) to the frontend with a custom SSE parser that handles buffered reads and event type routing.
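The frontend parser is JavaScript, but its buffering logic can be illustrated in Python: accumulate partial reads, split complete events on blank lines, and route by event type. This mirrors the described behavior; it is not the actual client code.

```python
def parse_sse(chunks):
    """Minimal SSE parser: buffer partial reads, emit events on blank lines."""
    buffer = ""
    events = []
    for chunk in chunks:
        buffer += chunk
        # A double newline terminates a complete SSE event.
        while "\n\n" in buffer:
            raw, buffer = buffer.split("\n\n", 1)
            event = {"event": "message", "data": ""}
            for line in raw.split("\n"):
                if line.startswith("event:"):
                    event["event"] = line[len("event:"):].strip()
                elif line.startswith("data:"):
                    event["data"] += line[len("data:"):].strip()
            events.append(event)
    return events

# An event split across two network reads is reassembled correctly:
events = parse_sse(["event: status\nda", "ta: searching\n\n"])
```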

4. Interactive Frontend with Svelte 5

Built the entire frontend using Svelte 5 runes ($state, $derived, $effect, $props) with hand-crafted SVG price charts featuring hover crosshairs, multiple timeframes, and computed chart geometry. Implemented a module-level sparkline batching system with debounced requests and client-side caching for efficient data loading across dozens of concurrent components.

5. Authentication & Personalization

Integrated Clerk authentication with JWT verification on the backend (JWKS fetching, RS256 decode, optional auth middleware) and the Clerk JS SDK on the frontend. Built a watchlist/favorites system with PostgreSQL-backed storage, optimistic UI updates, and bulk favorite checking for efficient page loads.

6. Caching & Performance Optimization

Implemented a multi-tier Redis caching layer with domain-specific TTLs (calendar 4h, market cap 24h, charts 5min–1h, analysis results) that gracefully no-ops when Redis is unavailable. Added Yahoo Finance chart data caching and response optimization throughout the API layer.

7. Containerized Deployment

Created a multi-stage Dockerfile with a Node.js stage for Vite/Svelte production builds and a Python stage for the FastAPI server, serving the SPA from static files with a catch-all route. Configured for Railway deployment with environment-based settings via Pydantic Settings.