# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

US stock trading platform built as a Python microservices architecture. Uses the Alpaca Markets API for market data and order execution. Services communicate via Redis Streams and persist to PostgreSQL.

## Common Commands

```bash
make infra    # Start Redis + Postgres (required before running services/tests)
make up       # Start all services via Docker Compose
make down     # Stop all services
make test     # Run all tests (pytest -v)
make lint     # Lint check (ruff check + format check)
make format   # Auto-fix lint + format
make migrate  # Run DB migrations (alembic upgrade head, from shared/)
make migrate-new msg="description"  # Create a new migration
make ci       # Full CI: install deps, lint, test, Docker build
make e2e      # End-to-end tests
```

Run a single test file: `pytest services/strategy-engine/tests/test_rsi_strategy.py -v`

## Architecture

### Services (each in its own directory under `services/`, each with its own Dockerfile)

- **data-collector** (port 8080): Fetches stock bars from Alpaca, publishes `CandleEvent` to Redis stream `candles`
- **news-collector** (port 8084): Continuously collects news from 7 sources (Finnhub, RSS, SEC EDGAR, Truth Social, Reddit, Fear & Greed, Fed), runs sentiment aggregation every 15 min
- **strategy-engine** (port 8081): Consumes candle events, runs strategies, publishes `SignalEvent` to stream `signals`. Also runs the stock selector at 15:30 ET daily
- **order-executor** (port 8082): Consumes signals, runs risk checks, places orders via Alpaca, publishes `OrderEvent` to stream `orders`
- **portfolio-manager** (port 8083): Tracks positions, PnL, portfolio snapshots
- **api** (port 8000): FastAPI REST endpoint layer
- **backtester**: Offline backtesting engine with walk-forward analysis

### Event Flow

```
Alpaca → data-collector → [candles stream] → strategy-engine → [signals stream] → order-executor → [orders stream] → portfolio-manager

News sources → news-collector → [news stream] → sentiment aggregator → symbol_scores DB
                                                        ↓
                          stock selector (15:30 ET) → [selected_stocks stream] → MOC strategy → signals
```

All inter-service events use `shared/src/shared/events.py` (CandleEvent, SignalEvent, OrderEvent, NewsEvent), serialized as JSON over Redis Streams via `shared/src/shared/broker.py` (RedisBroker).

### Shared Library (`shared/`)

Installed as an editable package (`pip install -e shared/`). Contains:

- `models.py` — Pydantic domain models: Candle, Signal, Order, Position, NewsItem, NewsCategory
- `sentiment_models.py` — SymbolScore, MarketSentiment, SelectedStock, Candidate
- `sa_models.py` — SQLAlchemy ORM models (CandleRow, SignalRow, OrderRow, PortfolioSnapshotRow, NewsItemRow, SymbolScoreRow, MarketSentimentRow, StockSelectionRow)
- `broker.py` — RedisBroker (async Redis Streams pub/sub with consumer groups)
- `db.py` — Database class (async SQLAlchemy 2.0), includes news/sentiment/selection CRUD methods
- `alpaca.py` — AlpacaClient (async aiohttp client for the Alpaca Trading + Market Data APIs)
- `events.py` — Event types and serialization (CandleEvent, SignalEvent, OrderEvent, NewsEvent)
- `sentiment.py` — SentimentData (legacy gating) + SentimentAggregator (freshness-weighted composite scoring)
- `config.py`, `logging.py`, `metrics.py`, `notifier.py` (Telegram), `resilience.py`, `healthcheck.py`

DB migrations live in `shared/alembic/`.
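The JSON-over-Streams pattern described above can be sketched as follows. The field names and helper methods here are illustrative, not the project's actual schema (which lives in `shared/src/shared/events.py` and is published through `RedisBroker` in `broker.py`):

```python
# Sketch of a JSON-serialized event for Redis Streams.
# Field names are assumptions; the real schema is in shared/src/shared/events.py.
import json
from dataclasses import dataclass, asdict


@dataclass
class CandleEvent:
    symbol: str
    open: float
    high: float
    low: float
    close: float
    volume: int
    timestamp: str  # ISO-8601

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, raw: str) -> "CandleEvent":
        return cls(**json.loads(raw))


# With redis-py's asyncio client, publishing and group consumption would
# look roughly like:
#   await redis.xadd("candles", {"data": event.to_json()})
#   await redis.xreadgroup("strategy-engine", consumer_name, {"candles": ">"})

event = CandleEvent("AAPL", 189.2, 190.1, 188.7, 189.9, 1_200_000,
                    "2024-01-05T15:30:00-05:00")
assert CandleEvent.from_json(event.to_json()) == event  # lossless round-trip
```

Consumer groups give each service its own cursor into the stream, so `strategy-engine` and any future consumers can read `candles` independently.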
### Strategy System (`services/strategy-engine/strategies/`)

Strategies extend `BaseStrategy` (in `strategies/base.py`) and implement `on_candle()`, `configure()`, and `warmup_period`. The plugin loader (`strategy_engine/plugin_loader.py`) auto-discovers `*.py` files in the strategies directory and loads per-strategy YAML config from `strategies/config/`. BaseStrategy provides optional filters (ADX regime, volume, ATR-based stops) via `_init_filters()` and `_apply_filters()`.

### News-Driven Stock Selector (`services/strategy-engine/src/strategy_engine/stock_selector.py`)

Dynamic stock selection for MOC (Market on Close) trading. Runs daily at 15:30 ET via `strategy-engine`:

1. **Candidate Pool**: Top 20 by sentiment score + LLM-recommended stocks from today's news
2. **Technical Filter**: RSI 30-70, price > 20 EMA, volume > 50% of average
3. **LLM Final Selection**: Claude picks 2-3 stocks with rationale

Market gating: blocks all trades when Fear & Greed ≤ 20 or VIX > 30 (`risk_off` regime).

### News Collector (`services/news-collector/`)

7 collectors extending `BaseCollector` in `collectors/`:

- `finnhub.py` (5 min), `rss.py` (10 min), `reddit.py` (15 min), `truth_social.py` (15 min), `sec_edgar.py` (30 min), `fear_greed.py` (1 hr), `fed.py` (1 hr)
- All use VADER (nltk) for sentiment scoring
- Provider abstraction via `BaseCollector` allows swapping in a paid API later (config change only)

Sentiment aggregation (every 15 min) computes per-symbol composite scores with freshness decay and category weights (policy 0.3, news 0.3, social 0.2, filing 0.2).

### CLI (`cli/`)

Click-based CLI installed as the `trading` command. Depends on the shared library.
## Tech Stack

- Python 3.12+, async throughout (asyncio, aiohttp)
- Pydantic for models, SQLAlchemy 2.0 async ORM, Alembic for migrations
- Redis Streams for inter-service messaging
- PostgreSQL 16 for persistence
- Ruff for linting/formatting (line-length=100)
- pytest + pytest-asyncio (asyncio_mode="auto")
- Docker Compose for deployment; monitoring stack (Grafana/Prometheus/Loki) available via `--profile monitoring`

## Environment

Copy `.env.example` to `.env`. Key vars: `ALPACA_API_KEY`, `ALPACA_API_SECRET`, `ALPACA_PAPER=true`, `DRY_RUN=true`, `DATABASE_URL`, `REDIS_URL`, `FINNHUB_API_KEY`, `ANTHROPIC_API_KEY`.

`DRY_RUN=true` simulates order fills without hitting Alpaca. The stock selector requires `ANTHROPIC_API_KEY` to be set.
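A minimal `.env` for local paper trading might look like the fragment below. Every value is a placeholder, and the connection strings are typical local-dev defaults, not the project's actual settings; `.env.example` remains the authoritative variable list:

```bash
# All values below are placeholders — copy .env.example and fill in real keys.
ALPACA_API_KEY=your-alpaca-key
ALPACA_API_SECRET=your-alpaca-secret
ALPACA_PAPER=true
DRY_RUN=true   # simulate order fills, never hit Alpaca
DATABASE_URL=postgresql+asyncpg://user:pass@localhost:5432/trading  # example local URL
REDIS_URL=redis://localhost:6379/0                                  # example local URL
FINNHUB_API_KEY=your-finnhub-key        # used by news-collector
ANTHROPIC_API_KEY=your-anthropic-key    # required by the stock selector
```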