Start With the Riskiest Assumption
Every MVP should test one primary hypothesis. Not "will people use our product" but something more specific: "Will small business owners pay ₹2,000/month for automated invoice reconciliation?" or "Will parents in Tier 2 cities engage with a live-class tutoring app for their children?" The MVP's job is to validate or invalidate that hypothesis with real users as cheaply as possible. Everything else is a bonus.
The Feature Prioritisation Matrix
We put every proposed feature through a simple 2×2 matrix: Customer Value (low/high) vs Build Effort (low/high). Features that are high-value and low-effort go in the MVP. High-value, high-effort features need to be broken down further: can you test the value with a simpler version? Low-value features, regardless of effort, go on the post-MVP roadmap. This exercise typically removes 60–70% of the initial feature list.
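The triage rule above is mechanical enough to write down. Here is a minimal sketch of it; the `Feature` class, the example feature names, and the bucket labels are our own illustration, not part of any fixed methodology.

```python
from dataclasses import dataclass


@dataclass
class Feature:
    name: str
    high_value: bool   # Customer Value: True = high, False = low
    high_effort: bool  # Build Effort: True = high, False = low


def triage(feature: Feature) -> str:
    """Place a feature into one of the three buckets from the 2x2 matrix."""
    if feature.high_value and not feature.high_effort:
        return "mvp"               # high value, low effort: build it now
    if feature.high_value and feature.high_effort:
        return "break down"        # test the value with a simpler version first
    return "post-mvp roadmap"      # low value, regardless of effort


# Hypothetical feature list for illustration only
features = [
    Feature("invoice upload", high_value=True, high_effort=False),
    Feature("auto-reconciliation", high_value=True, high_effort=True),
    Feature("dark mode", high_value=False, high_effort=False),
]
for f in features:
    print(f"{f.name}: {triage(f)}")
```

Running the rule over a real feature list makes the 60–70% cut visible to the client: most items land outside the "mvp" bucket on their first pass.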
The 8-Week Sprint Plan
An 8-week MVP assumes: Week 1–2: Discovery, architecture, Figma wireframes. Week 2–3: High-fidelity design, user testing of prototype. Week 3–8: Two-week development sprints (three sprints). The first sprint builds core infrastructure and the single most critical user journey. The second sprint adds the next most important user journeys. The third sprint handles edge cases, polish, and launch preparation. Week 8 is buffer; always expect it to be needed.
The Scope Creep Contract
Before development starts, we document the exact feature list and have the client sign off on it. This isn't adversarial; it's protective. Scope creep during development is the single biggest cause of delayed MVPs. When a new feature request comes in (and it will), we have a documented process: the feature is added to the post-MVP backlog, scoped, and scheduled for the next sprint cycle. Nothing enters the current sprint without removing something else of equal scope.
What to Leave Out
Common features clients want in an MVP that rarely belong there: Admin panel (use a spreadsheet or Airtable initially). Email templates (plain-text emails work fine). Analytics dashboard (Google Analytics covers early-stage needs). Multi-language support (validate in one language first). Complex permission systems (two roles maximum). Advanced search and filters. Social sharing. Each of these adds weeks and rarely affects whether the MVP validates the core hypothesis.
The Definition of Done
An MVP is done when: It can be used by real users without a developer holding their hand. The core user journey works end-to-end without critical bugs. Basic analytics are instrumented so you can measure what matters. It can handle 10× your expected launch traffic. The code is documented well enough that a new developer could onboard in a day. Nothing else.
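The value of a definition of done is that it is binary, so it can even live as an explicit checklist in the repo. A minimal sketch, where the criterion wording is our own paraphrase of the list above:

```python
# The five "definition of done" criteria as an explicit, all-or-nothing checklist.
DONE_CRITERIA = {
    "usable by real users without developer hand-holding": False,
    "core journey works end-to-end with no critical bugs": False,
    "basic analytics instrumented": False,
    "handles 10x expected launch traffic": False,
    "new developer can onboard from the docs in a day": False,
}


def is_done(criteria: dict[str, bool]) -> bool:
    # Done means every box is ticked; there is no partial credit.
    return all(criteria.values())


def remaining(criteria: dict[str, bool]) -> list[str]:
    """List the criteria still blocking launch."""
    return [item for item, ok in criteria.items() if not ok]
```

Reviewing `remaining()` at the end of each sprint keeps "are we done?" a factual question rather than a negotiation.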
Ready to build your MVP?
Tell us about your project and we'll respond within 24 hours with a clear, honest plan.