Solve the Right Problem is a discipline I apply before any design work starts: a set of habits for making sure the team is pointed at the actual problem, not the one written in the brief three months ago. It comes from watching teams ship polished, well-crafted features that nobody used, because nobody stopped to ask whether the brief was right in the first place.
I started doing this after noticing a pattern. The projects that failed weren't the ones with bad design. They were the ones where the team executed well against unclear requirements, undefined success metrics, or assumptions nobody had tested. The design was fine. The problem was wrong.
I've worked on teams that had everything: good designers, strong engineers, clear timelines, executive buy-in. And the feature still flopped, because the problem statement was wrong from the start. Someone wrote a brief based on assumptions, the team executed against it, and nobody paused to check whether those assumptions held up.
This is a framing failure. The brief said "build X." Everyone built X. But X was what someone upstream thought the user needed, filtered through three layers of interpretation, not what the user actually needed.
This isn't a workshop or a framework you can download. It's a set of questions I bring into every kickoff, every brief review, every time someone says "we need to build this." The point is to make sure the team's speed is aimed in the right direction.
Early in my career, I treated briefs as instructions. Someone wrote it, I executed it. If the result didn't land, I assumed I'd designed it wrong. It took a few failures to realise that sometimes the brief is wrong. The most valuable thing a designer can do is say so, with evidence.
A good brief survives scrutiny. A bad one falls apart when you ask "how do we know this is what the user needs?" You want that to happen before the sprint, not after launch.
Designers sit at the intersection of user needs, business goals, and technical constraints. That position gives you a perspective nobody else on the team has. When something in the brief doesn't add up (the user need feels assumed, the success metric is missing, the scope has quietly doubled) you're often the first person to notice. Staying quiet about it lets the team walk into a wall.
Every project has a moment when the stated problem turns out to be different from the actual problem. The discipline's job is to surface that in a kickoff, not in QA. Here are three times pushing back on the brief changed the outcome.
The brief asked us to optimise a journey that wasn't happening. Instead of reorganising navigation, we redesigned the homepage around a single question: what does a seller need to see the moment they log in? The output was a task-oriented dashboard, surfacing the actions that needed attention rather than a menu of every tool that existed. A better navigation structure would have solved nothing.
One reporting feature became two distinct UI treatments — a performance tracker showing live progress against the bonus threshold, and a forward-planning view mapping upcoming sourcing decisions to deal targets. Treating them as a single problem would have produced a view that half-solved both. Separating them meant each one could actually do its job.
The design brief flipped from "add more fields" to "make the critical fields impossible to miss." We reduced the amount of visible information, introduced a clear visual hierarchy for urgent flags, and structured the layout around the handover workflow. Clinicians got what they actually needed: not more data, but the right data in the right order at the right moment.
The rework, the pivots, the features that get quietly sunsetted three months after launch: that's the cost of not asking whether the brief was right.
The companion methodology: how I align cross-functional teams on the user story before any design work begins.
Read the methodology →