Architecture: Technology Stack (Implementation Guide)
This document mandates the specific technology choices for implementing FintraOS. These choices are designed to support the High-Throughput, Event-Sourced, and Bi-Modal Intelligence requirements of the platform.
1. Core Runtime & Language
- Language: Go (Golang)
- Rationale: Superior concurrency model (goroutines) for high-throughput event processing. Strong typing ensures reliability for financial data. Single binary deployment simplifies operations.
- Usage: All microservices (Connect, Core, Guard, Pulse, Vault, Views).
- Frontend: Next.js (React)
- Rationale: Server-Side Rendering (SSR) for SEO and performance. React ecosystem for component reusability.
- Usage: Developer Dashboard, Admin Console, Demo Apps.
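The goroutine-based concurrency rationale above can be sketched as a minimal fan-out worker pool. This is an illustrative sketch only; the `Event` type and `process` function are hypothetical, not part of the FintraOS codebase:

```go
package main

import (
	"fmt"
	"sync"
)

// Event is a hypothetical minimal event envelope; the fields are
// illustrative, not the FintraOS wire format.
type Event struct {
	ID   int
	Kind string
}

// process fans events out to a fixed pool of goroutines and returns
// the number of events handled.
func process(events []Event, workers int) int {
	in := make(chan Event)
	var wg sync.WaitGroup
	var mu sync.Mutex
	handled := 0

	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for ev := range in {
				_ = ev // a real handler would validate and persist here
				mu.Lock()
				handled++
				mu.Unlock()
			}
		}()
	}
	for _, ev := range events {
		in <- ev
	}
	close(in)
	wg.Wait()
	return handled
}

func main() {
	evs := make([]Event, 100)
	fmt.Println(process(evs, 8)) // prints 100
}
```

Because goroutines are cheap, the same pattern scales the worker count per topic partition without changing the processing code.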
2. Data Persistence (Polyglot Persistence)
Following CQRS, we separate write and read workloads and use the database best suited to each.
A. The "Hot" Store (Transactional)
- Technology: PostgreSQL 16
- Role: The "Source of Truth" for current state.
- Usage: User profiles, account metadata, current balances, active permissions.
- Configuration: Strict ACID compliance.
B. The "Warm" Store (Time-Series & Events)
- Technology: TimescaleDB (Postgres Extension)
- Role: High-volume storage for the Immutable Event Log and Transaction History.
- Usage: Storing billions of `TransactionCreated` events, historical balances, and metric points.
- Rationale: Automatic partitioning, compression, and rapid time-range queries.
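As an illustration, a warm-store event table might be declared as a hypertable like this. The table and column names are hypothetical; `create_hypertable`, `timescaledb.compress`, and `add_compression_policy` are standard TimescaleDB features:

```sql
-- Hypothetical schema; names are illustrative only.
CREATE TABLE transaction_events (
    event_time  TIMESTAMPTZ NOT NULL,
    account_id  UUID        NOT NULL,
    payload     JSONB       NOT NULL
);

-- Convert to a hypertable, automatically partitioned by time.
SELECT create_hypertable('transaction_events', 'event_time');

-- Compress chunks once they are older than 7 days.
ALTER TABLE transaction_events SET (timescaledb.compress);
SELECT add_compression_policy('transaction_events', INTERVAL '7 days');
```

Time-range queries (e.g., "all events for an account in March") then touch only the relevant chunks.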
C. The "Cold" Store (Archival & Analytics)
- Technology: AWS S3 (Parquet Format)
- Role: Long-term data lake for "Slow Brain" batch processing.
- Usage: Training data for ML models, regulatory archives (7+ years).
D. The "Read" Store (Projections)
- Technology: Redis / CDN
- Role: Serving pre-computed JSON views to the frontend with millisecond latency.
- Usage: The `Views` module pushes ready-to-consume JSON here.
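The projection flow can be sketched as follows. An in-memory map stands in for Redis here so the sketch is self-contained; the `BalanceView` fields and cache key format are illustrative assumptions, and a real deployment would use a Redis client (`SET view:balance:<account_id> <json>`):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// BalanceView is a hypothetical pre-computed projection served by the
// Views module; the field set is illustrative.
type BalanceView struct {
	AccountID string `json:"account_id"`
	Balance   int64  `json:"balance_cents"`
	Currency  string `json:"currency"`
}

// cache stands in for Redis in this sketch.
var cache = map[string][]byte{}

// publishView serializes a projection once, so reads are a plain
// key lookup with no per-request computation.
func publishView(v BalanceView) error {
	b, err := json.Marshal(v)
	if err != nil {
		return err
	}
	cache["view:balance:"+v.AccountID] = b
	return nil
}

func main() {
	publishView(BalanceView{AccountID: "acc_1", Balance: 12500, Currency: "EUR"})
	fmt.Println(string(cache["view:balance:acc_1"]))
	// prints {"account_id":"acc_1","balance_cents":12500,"currency":"EUR"}
}
```

The frontend never queries the hot store directly; it only reads these pre-serialized views.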
3. Messaging & Event Bus
- Technology: Kafka (or Redpanda for simpler ops)
- Role: The central nervous system.
- Usage: All inter-service communication is asynchronous via topics (e.g., `core.events.v1`, `brain.insights.v1`).
- Pattern: "Smart Endpoints, Dumb Pipes."
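The `<module>.<stream>.<version>` topic convention above can be captured in a small helper. This is a sketch; `topicFor` and the `Envelope` type are hypothetical, and actual publishing would go through a Kafka client library:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// Envelope is a hypothetical wrapper for events on the bus; the
// field set is illustrative, not the FintraOS wire format.
type Envelope struct {
	Type       string          `json:"type"`
	OccurredAt time.Time       `json:"occurred_at"`
	Payload    json.RawMessage `json:"payload"`
}

// topicFor maps a producing module and stream onto the
// <module>.<stream>.<version> naming convention.
func topicFor(module, stream string, version int) string {
	return fmt.Sprintf("%s.%s.v%d", module, stream, version)
}

func main() {
	fmt.Println(topicFor("core", "events", 1))    // prints core.events.v1
	fmt.Println(topicFor("brain", "insights", 1)) // prints brain.insights.v1
}
```

Versioning the topic name (`.v1`, `.v2`) lets producers evolve schemas without breaking existing consumers, which matches the "dumb pipes" stance: the broker routes bytes, the endpoints own the contract.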
4. Intelligence & ML
- Inference: Python (FastAPI) + ONNX Runtime
- Rationale: The Python ML ecosystem is by far the most mature. ONNX provides a standardized, high-performance runtime for production inference.
- Vector Database: Qdrant
- Usage: Storing embeddings for Merchant Entity Resolution (e.g., matching "Starbucks" text to the Starbucks entity).
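At its core, entity resolution is nearest-neighbor search over embeddings by cosine similarity, which can be sketched in Go. The toy 3-dimensional vectors below are stand-ins; real embeddings come from an encoder model and the search runs server-side in Qdrant:

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// bestMatch returns the candidate entity whose embedding is closest
// to the query embedding q.
func bestMatch(q []float64, candidates map[string][]float64) string {
	best, bestScore := "", -2.0
	for name, v := range candidates {
		if s := cosine(q, v); s > bestScore {
			best, bestScore = name, s
		}
	}
	return best
}

func main() {
	// Toy embeddings; in production these live in Qdrant and are
	// queried via its search API rather than scanned in memory.
	merchants := map[string][]float64{
		"Starbucks": {0.9, 0.1, 0.0},
		"Walmart":   {0.0, 0.8, 0.2},
	}
	// A noisy transaction string ("STARBUCKS #1234") would embed
	// near the Starbucks entity vector.
	fmt.Println(bestMatch([]float64{0.85, 0.15, 0.0}, merchants)) // prints Starbucks
}
```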
5. Infrastructure
- Containerization: Docker
- Orchestration: Kubernetes (K8s)
- CI/CD: GitHub Actions
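The single-binary rationale from section 1 maps naturally onto a multi-stage Docker build. This is a generic sketch, not the project's actual build file; the `./cmd/service` path and base images are assumptions:

```dockerfile
# Build stage: compile a static Go binary (CGO disabled).
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /out/service ./cmd/service

# Runtime stage: minimal image containing only the binary.
FROM gcr.io/distroless/static
COPY --from=build /out/service /service
ENTRYPOINT ["/service"]
```

The resulting image has no shell or package manager, which shrinks both image size and attack surface for the K8s-deployed services.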