Building AI Home Budget System: Day 5 Docker & Database Implementation

Today marks a significant milestone in my AI home budget optimisation project: Day 5 brought the core infrastructure to life! After weeks of planning, I finally have a working financial data processing system running on my Proxmox homelab.

What I Accomplished Today

In just 2-3 hours, I transformed architectural plans into a fully functional system that can:

  • Import and categorise financial transactions from CSV files
  • Intelligently suggest categories using keyword matching algorithms
  • Process thousands of transactions with real-time progress tracking
  • Store everything securely in a PostgreSQL database
  • Provide a clean web interface for data management
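To make the import step concrete, here is a minimal sketch of CSV parsing in Python. The column names (`date`, `description`, `amount`) are assumptions for illustration; real bank exports vary, and the actual importer's format isn't shown in this post.

```python
import csv
import io
from datetime import date
from decimal import Decimal

def parse_transactions(csv_text: str) -> list[dict]:
    """Parse bank-export CSV rows into transaction dicts.

    Assumes a header row with date, description and amount columns;
    a production importer would map bank-specific headers first.
    """
    transactions = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        transactions.append({
            "date": date.fromisoformat(row["date"]),
            "description": row["description"].strip().upper(),
            # Decimal avoids float rounding errors on money values
            "amount": Decimal(row["amount"]),
        })
    return transactions

sample = "date,description,amount\n2025-01-03,WOOLWORTHS METRO,-54.20\n"
print(parse_transactions(sample))
```

Using `Decimal` rather than `float` for amounts matters once totals are aggregated across thousands of rows.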

The Technical Architecture

My AI home budget optimisation system runs entirely on my homelab infrastructure:

VM: 192.168.xxx.xxx (Ubuntu on Proxmox)
├── PostgreSQL 15 (Financial data storage)
├── Redis 7 (Caching & session management)  
├── FastAPI (Web application framework)
├── InfluxDB 2.7 (Time-series data for future IoT integration)
└── Management UIs (Adminer & Redis Commander)

Real-World Implementation Challenges

Challenge 1: Database Schema Evolution

The biggest hurdle was handling schema mismatches between the application code and database structure. The categorization system expected a type column that didn’t exist, causing crashes during transaction import.

Solution: I implemented dynamic schema detection that gracefully handles missing columns, making the system more robust for future changes.
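The idea can be sketched as follows: look up which columns actually exist (in PostgreSQL, via `information_schema.columns`), then only write the fields the table knows about. Table and column names here are assumptions based on the post, not the actual code.

```python
# Sketch of dynamic schema detection: drop fields the table lacks
# instead of crashing, e.g. the missing 'type' column described above.

COLUMNS_QUERY = """
    SELECT column_name
    FROM information_schema.columns
    WHERE table_name = %s
"""

def build_insert(table: str, row: dict, existing_columns: set[str]) -> tuple[str, list]:
    """Build an INSERT that silently skips columns the table doesn't have."""
    cols = [c for c in row if c in existing_columns]
    placeholders = ", ".join(["%s"] * len(cols))
    sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    return sql, [row[c] for c in cols]

# existing_columns would come from running COLUMNS_QUERY against Postgres
sql, params = build_insert(
    "transactions",
    {"description": "SHELL", "amount": -80.0, "type": "debit"},
    {"description", "amount"},  # 'type' not yet in the schema
)
print(sql)  # the 'type' field is simply omitted from the statement
```

The same detection step can also log which fields were dropped, which makes schema drift visible instead of silent.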

Challenge 2: User Experience Issues

Initial testing revealed several UX problems:

  • No progress indication during large imports
  • Limited category options
  • Difficult navigation on smaller screens

Solution: Enhanced the interface with sticky navigation, comprehensive progress tracking, and expanded the category system from 8 to over 20 options including Insurance, Education, Travel, and custom categories.

Challenge 3: Data Processing Performance

Processing 1000+ transactions was taking too long and blocking the UI.

Solution: Implemented asynchronous processing with real-time progress updates, reducing apparent wait times and improving user experience.
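The batching-plus-progress pattern can be sketched with plain `asyncio`. The function names and batch size are illustrative, not the real importer's API.

```python
import asyncio

async def import_with_progress(rows, insert, report, batch_size=100):
    """Process rows in batches, reporting progress after each batch
    so the UI can update instead of blocking until the end."""
    done = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        for row in batch:
            await insert(row)
        done += len(batch)
        await report(done, len(rows))  # e.g. pushed over server-sent events
        await asyncio.sleep(0)         # yield control to the event loop

async def demo():
    progress = []
    async def insert(row): pass
    async def report(done, total): progress.append((done, total))
    await import_with_progress(list(range(250)), insert, report)
    return progress

print(asyncio.run(demo()))  # [(100, 250), (200, 250), (250, 250)]
```

Even when total processing time stays the same, visible progress after every batch is what reduces the *apparent* wait.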

The Smart Categorisation Engine

One of the coolest features is the intelligent categorisation system. Instead of manually assigning categories to hundreds of transactions, the system:

  1. Analyses transaction descriptions using keyword matching
  2. Considers transaction amounts for context
  3. Suggests multiple categories with confidence scores
  4. Learns from user choices to improve future suggestions

For example, it automatically recognises:

  • “WOOLWORTHS” → Food & Dining
  • “SHELL” → Transportation
  • “NETFLIX” → Entertainment/Subscriptions
  • “$2,500 RENT” → Housing & Utilities
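A minimal sketch of this kind of keyword matcher is below. The keyword map and scoring formula are assumptions for illustration; the post doesn't publish the real rules or weights.

```python
# Illustrative keyword map and confidence scoring, not the real engine.
KEYWORDS = {
    "Food & Dining": ["WOOLWORTHS", "COLES", "CAFE"],
    "Transportation": ["SHELL", "BP", "UBER"],
    "Entertainment/Subscriptions": ["NETFLIX", "SPOTIFY"],
    "Housing & Utilities": ["RENT", "ELECTRICITY"],
}

def suggest_categories(description: str, amount: float) -> list[tuple[str, float]]:
    """Return (category, confidence) pairs, best match first."""
    desc = description.upper()
    scores = {}
    for category, words in KEYWORDS.items():
        hits = sum(1 for w in words if w in desc)
        if hits:
            scores[category] = min(1.0, 0.6 + 0.2 * hits)
    # Amount as context (assuming debits are negative): large regular
    # debits strengthen a Housing & Utilities match.
    if amount <= -1500 and "Housing & Utilities" in scores:
        scores["Housing & Utilities"] = 0.95
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(suggest_categories("$2,500 RENT", -2500))  # [('Housing & Utilities', 0.95)]
```

The "learns from user choices" step would then feed confirmed assignments back into the keyword map or per-merchant overrides.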

Key Technical Wins

Docker Containerisation

Everything runs in Docker containers with host networking for simplicity. This approach provides:

  • Easy deployment across different environments
  • Consistent configuration using environment variables
  • Simple backup and restore procedures
  • Scalable architecture for future enhancements
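A Docker Compose file for this stack could look roughly like the sketch below. The versions match those listed earlier (PostgreSQL 15, Redis 7, InfluxDB 2.7), but the service names, environment variables, and volume layout are assumptions, not the exact file from this build.

```yaml
# Sketch only: names and environment variables are illustrative.
services:
  postgres:
    image: postgres:15
    network_mode: host            # host networking, as described above
    environment:
      POSTGRES_DB: budget
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data
  redis:
    image: redis:7
    network_mode: host
  influxdb:
    image: influxdb:2.7
    network_mode: host
  adminer:
    image: adminer
    network_mode: host
  app:
    build: .
    network_mode: host
    environment:
      DATABASE_URL: postgresql://postgres:${POSTGRES_PASSWORD}@localhost:5432/budget
      REDIS_URL: redis://localhost:6379/0
    depends_on: [postgres, redis]

volumes:
  pgdata:
```

With host networking, every service is reachable on `localhost`, which keeps configuration simple at the cost of losing per-container network isolation.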

Database Design

The PostgreSQL schema balances simplicity with flexibility:

  • Transactions table with JSONB for flexible metadata storage
  • Categories table supporting hierarchical organisation
  • Proper indexing for fast queries on date ranges and amounts
  • Foreign key constraints ensuring data integrity
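In DDL terms, the design described above could look roughly like this. Table, column, and index names are assumptions for illustration, not the actual schema.

```sql
-- Sketch of the schema described above; names and types are assumptions.
CREATE TABLE categories (
    id          SERIAL PRIMARY KEY,
    name        TEXT NOT NULL UNIQUE,
    parent_id   INTEGER REFERENCES categories(id)   -- hierarchical organisation
);

CREATE TABLE transactions (
    id          BIGSERIAL PRIMARY KEY,
    tx_date     DATE NOT NULL,
    description TEXT NOT NULL,
    amount      NUMERIC(12, 2) NOT NULL,
    category_id INTEGER REFERENCES categories(id),  -- FK for integrity
    metadata    JSONB DEFAULT '{}'::jsonb           -- flexible extra fields
);

-- Indexes backing the common date-range and amount queries
CREATE INDEX idx_transactions_date   ON transactions (tx_date);
CREATE INDEX idx_transactions_amount ON transactions (amount);
```

The JSONB column is what absorbs bank-specific fields without schema changes, while the typed columns keep the hot query paths indexable.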

Web Interface

Built with FastAPI and modern HTML/CSS:

  • Responsive design working on desktop and mobile
  • Real-time updates using server-sent events
  • Intuitive workflow from upload to categorisation
  • Error handling with user-friendly messages
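Server-sent events are just a line-based text format, so the framing can be shown with the standard library alone. The function name is mine; in the FastAPI app this string would be yielded from an async generator wrapped in a `StreamingResponse` with `media_type="text/event-stream"`, and read in the browser with `EventSource`.

```python
import json

def format_sse(event: str, data: dict) -> str:
    """Frame a payload as one server-sent event (text/event-stream).

    Each event is 'event:' and 'data:' lines terminated by a blank line.
    """
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

msg = format_sse("progress", {"done": 200, "total": 250})
print(msg)
```

SSE fits this workload better than WebSockets: progress updates flow one way, and plain HTTP streaming needs no extra protocol handling.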

Performance Results

The final system delivers impressive performance:

  • Import speed: ~1000 transactions per minute
  • Response time: <200ms for categorization suggestions
  • Memory footprint: ~512MB total for all containers
  • Storage efficiency: ~2GB database size with sample data

What’s Next: Day 6 Preview

Tomorrow I’m diving into the machine learning pipeline that will transform this from a simple categorisation tool into a true AI budget optimisation system:

  • Spending prediction models for next week/month forecasting
  • Anomaly detection to flag unusual transactions
  • Recommendation engine for budget optimisation
  • Integration with Home Assistant for energy usage correlation

Lessons Learned

  1. Start simple, then enhance: The basic CSV import worked immediately, allowing me to focus on user experience improvements
  2. Error handling is crucial: Real-world data is messy – robust error handling prevented system crashes
  3. User feedback drives features: Testing with actual bank data revealed UX issues I hadn’t anticipated
  4. Documentation matters: Detailed logging helped debug schema issues quickly

Technical Deep Dive Resources

For fellow developers interested in the implementation details:

  • Complete Docker Compose configuration with all service definitions
  • PostgreSQL schema with proper indexing strategies
  • FastAPI application structure with async request handling
  • Feature engineering pipeline preparation for ML models
  • Error handling patterns for robust data processing

Homelab Infrastructure Benefits

Running this AI home budget optimisation system on my Proxmox homelab provides several advantages:

  • Complete data privacy – no financial data leaves my network
  • No subscription costs – all software is open source
  • Full customisation – can modify algorithms for my specific needs
  • Learning opportunity – hands-on experience with production-like infrastructure
  • Scalability – easy to add more VMs or containers as needed

Community and Feedback

I’m documenting this entire journey to help others build similar systems. The combination of AI, homelab infrastructure, and financial optimisation creates a powerful toolkit for personal finance management.

Have you built similar systems? I’d love to hear about your approaches to:

  • Financial data processing architectures
  • Machine learning for personal finance
  • Homelab deployment strategies
  • Privacy-focused financial tools

Conclusion

Day 5 transformed my AI home budget optimisation system from concept to reality. With a solid foundation of database storage, web interface, and intelligent categorisation, I’m ready to add the machine learning capabilities that will make this truly intelligent.

The system already saves hours of manual categorisation work, and tomorrow’s ML pipeline will add predictive capabilities for proactive budget management.

Next up: Day 6 focuses on building the machine learning models that will predict spending patterns, detect anomalies, and provide optimisation recommendations.


Tags: AI, home budget, machine learning, homelab, Proxmox, Docker, PostgreSQL, FastAPI, financial technology, personal finance automation, budget optimisation, data processing, financial analytics

Categories: Technology, Personal Finance, AI/Machine Learning, Homelab

Internal Links:

  • Previous post: “Day 4: Database Design for AI Budget System”
  • Related: “Proxmox Homelab Setup Guide”
  • Related: “Docker Compose Best Practices”

