The Problem: Memory Is Unreliable When It Matters Most
When trying to recall symptoms, feelings, or significant events, our memories fail us precisely when accuracy matters most. Did that headache start Tuesday or Wednesday? Was the pain sharp or dull? How severe was the anxiety before it subsided? These details fade within hours, yet they're often critical when speaking with healthcare providers, tracking treatment effectiveness, or identifying patterns in our own health.
This problem compounds over time. By the time we're sitting in a doctor's office trying to describe symptoms from the past month, we're left with vague impressions rather than concrete data. "I've been feeling tired lately" lacks the specificity of "I've logged 8 instances of fatigue over the past 3 weeks, concentrated in afternoons, severity ranging 5-7/10."
The traditional solution, keeping a written journal, introduces friction that often leads to abandonment. Opening a notes app, typing detailed entries, and maintaining the habit all require sustained effort, and that effort is hardest to muster at exactly the moments when logging matters most: while you're experiencing the symptom itself.
Why This Matters Even More in the Age of AI
We're entering an era where AI can analyze vast amounts of health data to identify patterns, predict outcomes, and recommend interventions. Wearables track our heart rate variability, sleep stages, and step counts. Medical records capture lab results, diagnoses, and prescriptions. Environmental sensors log air quality, weather patterns, and pollen counts.
AI models can tell us our resting heart rate increased by 8 BPM, but they can't know we felt anxious. They can detect we slept poorly, but not that we had vivid dreams or woke with a headache. They can see we took medication, but not whether we experienced side effects or symptom relief.
This subjective data—symptoms, feelings, perceptions—is precisely what's needed to make objective health data meaningful. Without it, we're building increasingly sophisticated analytical tools while starving them of the most important signal: how we actually feel.
Qupi exists to capture this missing data layer in a way that's effortless enough to maintain and structured enough to analyze.
Building Qupi: From Concept to Deployment
Technical Foundation
I knew from the outset that for Qupi to be useful, it needed to be reliably accessible and fully operational from day one. This informed three core architectural decisions:
I built Qupi with a Node.js/Express/TypeScript backend that exposes all functionality through a RESTful API. This approach provided flexibility to iterate on the frontend while maintaining a stable backend, and crucially, positioned the app for future expansion to native mobile platforms without requiring backend changes.
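As a sketch of that API-first shape (the route path, field names, and response shape are my assumptions, not Qupi's actual API), an event-logging handler can be written against minimal request/response interfaces so the core logic stays testable without a running server:

```typescript
// Structural stand-ins for express.Request/Response so the handler logic
// can be unit-tested in isolation; in the real app these would be the
// Express types. All field names here are illustrative assumptions.
interface Req { body?: Record<string, unknown>; }
interface Res { status(code: number): Res; json(payload: unknown): Res; }

// POST /api/events — validate input and return the stored event shape.
export function createEvent(req: Req, res: Res): Res {
  const { note, category, severity } = req.body ?? {};
  if (typeof note !== "string" || note.trim() === "") {
    return res.status(400).json({ error: "note is required" });
  }
  // Persistence would go to the database layer (Supabase in Qupi's case);
  // here we simply echo back the normalized record.
  return res.status(201).json({
    note,
    category: typeof category === "string" ? category : "uncategorized",
    severity: typeof severity === "number" ? severity : null,
    loggedAt: new Date().toISOString(),
  });
}
```

In Express this would be wired up with something like `app.post("/api/events", createEvent)`; the same handler can later serve a native mobile client unchanged, which is the point of the API-first decision.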
Rather than starting with a minimal hosting solution and migrating later, I deployed on AWS from the beginning. This provided production-grade reliability, scalability, and security without requiring infrastructure changes as the app evolved.
I chose Supabase for its combination of powerful PostgreSQL functionality, built-in authentication, and row-level security features. The flexible schema design accommodates diverse event types while maintaining user privacy through selective encryption.
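To illustrate what "flexible schema with selective encryption" can look like (all field names here are assumptions for illustration, not Qupi's actual schema), structured fields stay queryable for aggregate analytics while only the free-text note is stored as ciphertext:

```typescript
// Hypothetical event record: structured fields (category, severity, bodyArea)
// remain plaintext and queryable; the free-text note is encrypted at rest.
export interface HealthEvent {
  id: string;
  userId: string;             // row-level security ties each row to its owner
  category: string;           // e.g. "headache", "fatigue"
  severity?: number;          // optional 1–10 scale
  bodyArea?: string;          // e.g. "head", "lower back"
  encryptedNote?: string;     // ciphertext of the user's free-text note
  occurredAt: string;         // ISO 8601 timestamp
}

// Guard applied before inserting: keeps the analytics-facing field well-formed.
export function isValidSeverity(n: unknown): n is number {
  return typeof n === "number" && Number.isInteger(n) && n >= 1 && n <= 10;
}
```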
Development Approach
I used Claude Code as my primary development assistant, which allowed me to focus on product decisions and architecture while accelerating implementation. The codebase lives in GitHub, providing version control and deployment automation through CI/CD pipelines.
MVP: Progressive Web App with Voice Entry
For the initial release, I identified the core value proposition: making event logging effortless enough to maintain daily. This crystallized into a Progressive Web App focused on speed and accessibility:
Using the Web Speech API, users can speak their symptoms and events naturally. The app employs keyword-based detection to automatically categorize entries (e.g., "bad headache" triggers headache categorization with high severity).
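The keyword-based detection can be sketched as a small lookup over the transcribed text; the keyword table and severity values below are illustrative assumptions, not Qupi's actual rules:

```typescript
interface ParsedEntry { category: string; severity: number | null; raw: string; }

// Illustrative keyword tables; a real deployment would be far larger.
const CATEGORY_KEYWORDS: Record<string, string[]> = {
  headache: ["headache", "migraine"],
  fatigue: ["tired", "fatigue", "exhausted"],
  anxiety: ["anxious", "anxiety", "nervous"],
};

const SEVERITY_HINTS: Record<string, number> = {
  mild: 3, slight: 3, moderate: 5, bad: 7, severe: 9, terrible: 9,
};

// Categorize a spoken/typed entry, e.g. "bad headache" → headache, severity 7.
export function parseEntry(text: string): ParsedEntry {
  const lower = text.toLowerCase();
  let category = "uncategorized";
  for (const [cat, words] of Object.entries(CATEGORY_KEYWORDS)) {
    if (words.some((w) => lower.includes(w))) { category = cat; break; }
  }
  let severity: number | null = null;
  for (const [hint, value] of Object.entries(SEVERITY_HINTS)) {
    if (lower.includes(hint)) { severity = value; break; }
  }
  return { category, severity, raw: text };
}
```

Because this is pure string matching, it runs instantly on-device, which is what keeps the logging flow fast.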
Notes are encrypted at rest, ensuring sensitive health information remains private while still allowing for aggregate analytics on structured fields like severity and body area.
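A minimal sketch of that selective encryption, assuming AES-256-GCM via Node's crypto module (the actual cipher and key management in Qupi may differ):

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Encrypt only the free-text note; iv and auth tag are stored with the
// ciphertext so each record can be decrypted independently.
export function encryptNote(plain: string, key: Buffer): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  return Buffer.concat([iv, tag, ct]).toString("base64");
}

export function decryptNote(payload: string, key: Buffer): string {
  const buf = Buffer.from(payload, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ct = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // GCM authenticates: tampered ciphertext throws
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```

Structured fields like severity and body area are never passed through this path, which is what keeps them available for aggregate queries.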
The entire logging flow takes seconds—open app, speak or type, done. No complex forms, required fields, or multi-step processes.
As a PWA, Qupi needs no app store approval process, works across platforms, and can be installed to the home screen for an app-like experience without native development costs.
Refinement Phase
Following the initial launch, I focused on three areas:
I addressed edge cases in voice recognition, handled network issues gracefully, and ensured consistent behavior across browsers and devices.
I also streamlined the interface based on actual usage patterns, improved accessibility, and refined the visual design for clarity and focus.
I added an optional "Enhance with AI" feature that maintains the fast keyword-based parsing as the default while offering users LLM-powered analysis for more nuanced entry interpretation. This hybrid approach preserves speed and cost-efficiency while unlocking more sophisticated analysis for users who want it.
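The hybrid flow can be sketched as a dispatcher: keyword parsing is the synchronous default, and an injected LLM call (stubbed here; the real integration is an assumption on my part) is used only when the user opts in, degrading back to the fast path on failure:

```typescript
type Parsed = { category: string; severity: number | null };
type LlmParser = (text: string) => Promise<Parsed>;

// Default: fast, free, offline keyword parse. Opt-in: LLM interpretation,
// falling back to the keyword result if the LLM call fails.
export function parseWithFallback(
  text: string,
  keywordParse: (t: string) => Parsed,
  opts: { enhance?: boolean; llm?: LlmParser } = {}
): Parsed | Promise<Parsed> {
  const fast = keywordParse(text);
  if (!opts.enhance || !opts.llm) return fast;
  return opts.llm(text).catch(() => fast);
}
```

Keeping the LLM behind an explicit flag is what preserves both the speed of the default flow and the per-entry cost profile.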
What's Next: Expanding Capabilities
With a solid foundation in place, I'm focused on five key enhancements:
Native iOS App
Why: The PWA approach has limitations, particularly around voice logging and background functionality. Users currently need to manually create Siri shortcuts, which introduces friction. A native app enables voice logging without opening the app, plus deeper system integration.
Approach: Building with Swift/SwiftUI while maintaining the existing Express API backend. Starting with a read-only version before adding complexity allows for incremental validation.
AI-Powered Insights
Why: The real value of logged data emerges through pattern recognition that humans can't easily spot.
Capabilities:
- Event Correlation: Identifying relationships between events (e.g., poor sleep consistently preceding migraines)
- Pattern Detection: Surfacing temporal patterns, triggers, and trends across weeks or months
- Predictive Insights: Suggesting likely future events based on current patterns
- Anomaly Detection: Flagging unusual combinations or changes in baseline patterns
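As a toy illustration of the event-correlation idea (field names, categories, and the 24-hour window are my assumptions), one simple signal is the fraction of B events preceded by an A event within a time window, e.g. how often a migraine follows poor sleep:

```typescript
interface LoggedEvent { category: string; occurredAt: string; } // ISO timestamp

// Fraction of `b` events that had an `a` event in the preceding window.
export function precedenceRate(
  events: LoggedEvent[],
  a: string,
  b: string,
  windowHours = 24
): number {
  const bs = events.filter((e) => e.category === b);
  if (bs.length === 0) return 0;
  const aTimes = events
    .filter((e) => e.category === a)
    .map((e) => Date.parse(e.occurredAt));
  const windowMs = windowHours * 3600_000;
  const preceded = bs.filter((e) => {
    const t = Date.parse(e.occurredAt);
    return aTimes.some((ta) => ta < t && t - ta <= windowMs);
  });
  return preceded.length / bs.length;
}
```

A production version would need baselines and significance checks so coincidences aren't reported as patterns; this shows only the core counting step.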
Apple Watch Integration
Why: The ideal logging moment is when the symptom occurs, not minutes or hours later. Watch integration removes even the friction of pulling out a phone.
Experience: Quick tap to log predefined events or use voice directly from the watch. Severity selection via Digital Crown. Seamless sync to phone and cloud.
Scheduled Reminders
Why: Certain events benefit from regular logging prompts (e.g., medication side effects, or pain levels at consistent intervals).
Approach: User-configurable push notifications for event types they want to track regularly, with smart scheduling that learns optimal reminder times.
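One plausible heuristic for "learning" a reminder time (an assumption on my part, not Qupi's actual algorithm) is to take the circular mean of the times of day at which the user has historically logged that event type, so times spanning midnight average correctly:

```typescript
// Suggest a reminder time (minutes after midnight UTC) from past log times.
export function suggestedMinuteOfDay(isoTimestamps: string[]): number | null {
  if (isoTimestamps.length === 0) return null;
  // Circular mean: map each time of day to an angle so that 23:50 and
  // 00:10 average to midnight rather than to midday.
  let x = 0;
  let y = 0;
  for (const ts of isoTimestamps) {
    const d = new Date(ts);
    const minutes = d.getUTCHours() * 60 + d.getUTCMinutes();
    const theta = (2 * Math.PI * minutes) / 1440;
    x += Math.cos(theta);
    y += Math.sin(theta);
  }
  const meanTheta = Math.atan2(y, x); // range (-π, π]
  const minutes = (meanTheta / (2 * Math.PI)) * 1440;
  return Math.round((minutes + 1440) % 1440) % 1440;
}
```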
Event Templates
Why: Common tracking scenarios (migraine episodes, medication trials, symptom flare-ups) follow predictable patterns that shouldn't require manual entry each time.
Functionality: Pre-configured event bundles that users can quickly log with a single tap, customized to their specific tracking needs.
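Mechanically, a template is just a named bundle of pre-filled event drafts that gets stamped with the current time when tapped; the shape and example bundle below are illustrative assumptions:

```typescript
interface EventDraft { category: string; severity?: number; bodyArea?: string; }
interface Template { name: string; events: EventDraft[]; }

// Expand a template into timestamped events, ready to send to the API.
export function logTemplate(
  tpl: Template,
  now: Date = new Date()
): Array<EventDraft & { occurredAt: string }> {
  return tpl.events.map((e) => ({ ...e, occurredAt: now.toISOString() }));
}

// Example bundle: a migraine episode the user tracks repeatedly.
export const migraineEpisode: Template = {
  name: "Migraine episode",
  events: [
    { category: "migraine", severity: 7, bodyArea: "head" },
    { category: "light_sensitivity", severity: 5 },
  ],
};
```

One tap thus produces several structured events at once, while the user remains free to edit any draft before saving.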