
First Work, Then Right, Finally Fast Software

Created at: September 26, 2025

"Make it work, make it right, make it fast." - Kent Beck
"Make it work, make it right, make it fast." - Kent Beck

"Make it work, make it right, make it fast." - Kent Beck

Origins in Extreme Programming

Kent Beck crystallized this sequence in Extreme Programming Explained (1999), later reinforcing it through Test-Driven Development: By Example (2002). The cadence—make it work, make it right, make it fast—embodies XP’s feedback-first ethos. By separating concerns—behavior, design, and performance—the mantra tells teams what to do now and what to defer. Moreover, it creates a social contract in code reviews: debate design after correctness, and debate speed after design. This ordering lowers stress, sharpens focus, and turns learning into an intentional, repeatable practice.

Make It Work: Prove Behavior First

To start, ensure the feature behaves as intended. In TDD this is the red-to-green step: write a failing test, then add the simplest code that makes it pass. XP calls small research implementations spikes; the Lean Startup’s build-measure-learn loop (Eric Ries, 2011) echoes this by shipping a vertical slice to gather real feedback. Early integration and working software expose misunderstandings faster than documents, and they give stakeholders something concrete to evaluate. By focusing on behavior first, teams reduce risk and create a stable baseline for later improvements.
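As a toy illustration of the red-to-green step (the order_total function and its test are invented for this post, not taken from Beck's book), the test is written first and fails until the simplest passing code exists:

```python
# Red: the test is written first, so it fails until order_total exists.
# Run with pytest.
def test_order_total_applies_percentage_discount():
    assert order_total(price=100.0, quantity=2, discount=0.1) == 180.0


# Green: the simplest code that makes the test pass.
def order_total(price: float, quantity: int, discount: float) -> float:
    return price * quantity * (1 - discount)
```

The point is only that behavior gets pinned down before any design or performance work begins.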

Make It Right: Improve the Design

Once behavior is confirmed, improve the code’s structure without changing its observable behavior. Beck’s red-green-refactor cycle and Martin Fowler’s Refactoring (1999/2018) catalog safe, mechanical transformations—extract function, rename for clarity, eliminate duplication—that lift readability and testability. Good design principles (e.g., SOLID per Robert C. Martin) keep modules cohesive and loosely coupled so change remains cheap. This step also clarifies domain concepts, aligning code with the language of users. As quality rises and defects fall, future work accelerates, setting the stage for performance efforts that won’t collapse under fragile design.
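Continuing the same invented example, two mechanical refactorings (extract function and rename) surface the domain concepts without changing observable behavior; the earlier test still passes:

```python
# Before: behavior is correct, but the intent is buried in one expression.
def order_total(price, quantity, discount):
    return price * quantity * (1 - discount)


# After: subtotal and discount become explicit, named concepts.
def subtotal(price: float, quantity: int) -> float:
    return price * quantity


def apply_discount(amount: float, discount: float) -> float:
    return amount * (1 - discount)


def order_total(price: float, quantity: int, discount: float) -> float:
    return apply_discount(subtotal(price, quantity), discount)
```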

Make It Fast: Optimize With Evidence

Only then should teams optimize, and only with measurements. Profilers, tracing, and realistic benchmarks reveal hot paths; Amdahl’s Law (1967) warns that speeding a small fraction yields limited gains. Donald Knuth’s caution about premature optimization (1974) underscores why evidence matters. Often, algorithmic choices, better data access patterns, or judicious caching provide outsized wins, while micro-optimizations merely add complexity. Because correctness and design are already solid, performance tweaks remain localized and safer to revert. Moreover, the test suite built earlier quickly catches regressions introduced by optimization.
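A sketch of what "with evidence" can look like in Python, using the standard-library cProfile; the workload and the Amdahl's Law numbers are purely illustrative:

```python
import cProfile
import pstats


def amdahl_speedup(fraction: float, factor: float) -> float:
    """Overall speedup when only `fraction` of the runtime is sped up by `factor`."""
    return 1.0 / ((1.0 - fraction) + fraction / factor)


# Speeding up 20% of the runtime tenfold yields only about 1.22x overall.
print(round(amdahl_speedup(0.20, 10.0), 2))  # 1.22


def workload():
    # Stand-in for the real code path under investigation.
    return sum(i * i for i in range(1_000_000))


# Measure before optimizing: profile the workload and inspect the hot path.
cProfile.run("workload()", "profile.out")
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)
```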

Risk Management Through Flow

In parallel, adopt practices that reinforce the sequence. Continuous Integration and small pull requests keep the “make it work” slices flowing; feature flags let incomplete work ship safely. Continuous Delivery principles (Jez Humble and David Farley, 2010) shorten feedback loops so “right” and “fast” improvements reach users quickly. Pair programming and code reviews create shared ownership, turning refactoring into a routine habit rather than a heroic event. By controlling batch size and feedback latency, the team’s cadence mirrors the mantra, transforming advice into daily operational rhythm.
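For instance, a bare-bones feature flag guard might look like the sketch below; the flag name and the in-memory FLAGS dict are stand-ins for whatever flag service a team actually uses:

```python
FLAGS = {"refunds_v2": False}  # stand-in for a real flag service


def is_enabled(flag: str) -> bool:
    return FLAGS.get(flag, False)


def legacy_refund_flow(order_id: str) -> str:
    return f"refund {order_id} via legacy flow"


def new_refund_flow(order_id: str) -> str:
    return f"refund {order_id} via new flow"  # incomplete work ships dark


def handle_refund(order_id: str) -> str:
    # The flag keeps the unfinished path off in production while the code
    # still merges and deploys continuously.
    if is_enabled("refunds_v2"):
        return new_refund_flow(order_id)
    return legacy_refund_flow(order_id)


print(handle_refund("A-1001"))  # refund A-1001 via legacy flow
```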

The Economics of Technical Debt

Economically, the ordering manages technical debt deliberately. Ward Cunningham’s debt metaphor (OOPSLA, 1992) suggests that shipping early code is like taking a loan to learn; refactoring is the interest payment that prevents compounding. Barry Boehm’s Software Engineering Economics (1981) shows how rework costs grow when defects linger; making it right soon after it works limits that growth. Deferring optimization avoids speculative bets until usage data clarifies where speed truly matters. Consequently, the project gains optionality: invest performance effort where it moves business metrics, not where guesswork points.

A Practical Vignette in Three Steps

Consider a small team adding refunds to an online shop. First, they implement a narrow path—submit request, validate, persist—with tests mirroring real cases, and a stubbed payment gateway to make it work. Next, they extract domain objects, name states explicitly, and simplify error handling to make it right, shrinking cognitive load. Finally, load tests reveal a bottleneck in inventory lookups; a read-through cache and batched queries make it fast. Because each step built on the last, the feature ships with confidence, clarity, and speed—exactly as Beck’s sequence intends.
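As a rough sketch of that final step (the function names and the cache choice are illustrative, not the team's actual code), a read-through cache in front of the inventory lookup keeps the optimization local while the earlier tests keep guarding behavior:

```python
from functools import lru_cache


def fetch_inventory_from_db(sku: str) -> int:
    # Stand-in for the slow per-SKU query the load test flagged.
    return 42


@lru_cache(maxsize=10_000)
def inventory_level(sku: str) -> int:
    # Read-through cache: the first call hits the database, repeats are served
    # from memory. A real system would also bound staleness, e.g. with a TTL.
    return fetch_inventory_from_db(sku)


def fetch_inventory_batch(skus: list[str]) -> dict[str, int]:
    # Batched variant: one round trip (e.g. WHERE sku IN (...)) instead of
    # one query per SKU; here a stub stands in for the real query.
    return {sku: 42 for sku in skus}
```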