The Critique Problem
We've all been there. A designer presents work. The room offers opinions. "I think the button should be blue." "What if we tried a different font?" "Can we make the logo bigger?"
Ninety minutes later, everyone leaves having said things without advancing the design. The presenter is confused about what feedback actually matters. The reviewers feel they've contributed without providing real value.
Design reviews often fail because we treat them as informal discussions rather than structured processes. Better process leads to better outcomes.
Why Most Reviews Fail
No Clear Goal
What is the review trying to achieve? Are we evaluating if the design solves the problem? Checking visual polish? Identifying usability issues? Without clarity, feedback scatters in all directions.
Wrong Participants
Reviews often include people who shouldn't be there (executives with opinions but no context) and exclude people who should (engineers who'll build it, users who'll use it).
Presenter Defensiveness
Designers often present in defensive mode, explaining every decision before anyone can question it. This signals "please don't critique this" and shuts down valuable feedback.
Feedback Without Accountability
Anyone can offer opinions. Few are held accountable for feedback quality. The loudest voices dominate regardless of their insight.
No Follow-Through
Feedback is offered, acknowledged, and then... what? Without clear next steps, good feedback disappears into the void.
Structuring Reviews for Success
Define the Review Type
Different stages need different reviews:
Concept Review: Is this the right direction? Focus on strategy and approach, not visual details.
Design Review: Does this solution work? Focus on usability, consistency, and implementation feasibility.
Polish Review: Is this execution excellent? Focus on visual refinement, edge cases, and accessibility.
Be explicit about which type you're doing. Don't let a concept review devolve into pixel debates.
Set the Context
Before showing anything, presenters should share:
- The problem: What user need or business goal does this address?
- The constraints: What limitations shaped the design?
- The type of feedback sought: What questions should reviewers focus on?
- What's not ready: What aspects are deliberately rough?
This framing focuses feedback where it's useful.
Use Structured Critique Methods
Several methods improve critique quality:
"I like / I wish / What if"
- I like [specific positive observation]
- I wish [specific improvement suggestion]
- What if [exploratory alternative idea]
This structure ensures balanced feedback and encourages specificity.
Problem-focused feedback
Instead of "I don't like the layout," say "Users might struggle to find the primary action because [specific reason]."
Tie feedback to user outcomes, not personal preferences.
Question-based feedback
Frame feedback as questions the design raises rather than statements:
- "How will users discover this feature?"
- "What happens if the data doesn't load?"
- "Why did you choose this approach over X?"
Questions invite dialogue; statements provoke defense.
Time-Boxing and Prioritization
Reviews expand to fill available time. Constrain them:
- Set hard time limits: "We have 45 minutes for this review."
- Prioritize feedback: "What are the top three concerns that would most improve this design?"
- Distinguish blocking vs. non-blocking: Some feedback must be addressed; some is optional.
Document Everything
Assign a note-taker (not the presenter). Document:
- All feedback given
- Who gave it
- Whether it's blocking or nice-to-have
- Agreed next steps
Share notes immediately after. This creates accountability and prevents "I thought we decided..." disputes.
The Presenter's Mindset
How you present affects the feedback you get:
Present, Don't Defend
Show the work. Explain the context. Then stop. Don't pre-emptively justify every choice. Let reviewers react to what they see.
If you catch yourself saying "I did this because..." before anyone has questioned it, you're defending. Stop.
Welcome Criticism
The goal of review is improvement, not validation. If everyone loves your design and has no feedback, the review failed. Something can always be better.
Saying "that's great feedback" when someone critiques your work isn't weakness; it's maturity.
Distinguish Ego from Evaluation
Your design being critiqued isn't the same as you being critiqued. Separating your identity from your work enables you to hear feedback clearly.
This is hard. It gets easier with practice.
Ask for Specific Feedback
"What do you think?" invites vague responses. Instead:
- "Does the hierarchy effectively direct attention to the primary action?"
- "Can you identify any scenarios where this flow would fail?"
- "What concerns would you have about implementing this?"
Specific questions get specific answers.
The Reviewer's Mindset
Giving good feedback is a skill:
Be Specific
"I don't like it" is useless. "The call-to-action gets lost among the other elements because they're all similarly weighted" is useful.
Specificity enables action.
Explain the Why
Don't just identify problems; explain why they're problems. "Users might miss this" is better than "this is bad," but "users might miss this because their eyes are drawn to the competing element above" is better still.
Offer Alternatives (Carefully)
"What if we tried X?" can be helpful when X represents a meaningfully different approach. It's less helpful when X is just your personal preference restated as a suggestion.
Alternatives should open exploration, not close discussion.
Know Your Authority
Some feedback carries more weight than others. User research findings trump personal preferences. Accessibility requirements trump aesthetic opinions.
Know when your feedback is authoritative and when it's merely perspective.
Don't Pile On
If someone has already made a point, you don't need to repeat it. "I agree with Sarah's concern about the navigation" is sufficient.
Piling on identical feedback wastes time and demoralizes presenters.
Remote Review Best Practices
Remote reviews have unique challenges:
Asynchronous First
Before synchronous meetings, share designs asynchronously. Give reviewers time to form thoughts without group influence.
Tools like Figma comments enable asynchronous feedback that's attached to specific elements.
Camera and Voice On
Remote critique requires seeing and hearing each other. Non-verbal cues matter. Tone matters. Full participation requires full presence.
Screen Control Etiquette
Usually, the presenter controls the screen. If reviewers need to point at something, use annotation tools rather than "scroll down a bit, no the other way, there."
Prevent Crosstalk
Remote conversations break down when people talk over each other. Designate a facilitator to manage turn-taking.
Record for Absent Members
Record reviews (with consent) so absent team members can catch up. This is more effective than summarized notes alone.
Common Dysfunctions and Fixes
"The Endless Review"
Reviews that drag on without resolution.
Fix: Time-box ruthlessly. Agree on next steps before time expires. Schedule follow-up if needed.
"The HiPPO" (Highest Paid Person's Opinion)
Senior stakeholders dominating feedback, regardless of expertise.
Fix: Establish clear roles. Feedback from leadership is one input, not the final word. Hold everyone to the same standards.
"The Bikeshed"
Extensive debate about trivial details while significant issues go undiscussed.
Fix: Prioritize feedback explicitly. Address foundational concerns before details.
"The Validation Theater"
Reviews where everyone agrees everything is great, despite obvious issues.
Fix: Explicitly ask "what's wrong with this?" Make critique expected and valued.
"The Feature Creep"
Reviews that become brainstorms for new features rather than evaluations of the current design.
Fix: Separate critique from ideation. New ideas go to a parking lot for later discussion.
After the Review
The review ends. Now what?
Synthesize Feedback
Not all feedback deserves action. Synthesize what you heard into:
- Must address (blocking issues)
- Should consider (significant but not blocking)
- Nice to have (minor improvements)
- Disagree with (feedback you've considered and rejected)
Communicate Decisions
Tell reviewers what you're doing with their feedback. This closes the loop and shows their input mattered.
Iterate and Re-Review
Significant changes may warrant another review. Don't assume one pass is sufficient for complex designs.
Building Review Culture
Better reviews come from better culture:
Model Good Behavior
Leaders should demonstrate how to give and receive feedback well. Culture follows example.
Train the Skills
Feedback is a skill. Invest in training. Share frameworks. Practice in low-stakes settings.
Celebrate Good Feedback
Recognize people who give excellent feedback. This signals what you value.
Retrospect on Reviews
Occasionally ask: "How effective are our reviews? What should we change?" Improve the process continuously.
What makes design reviews work (or fail) on your team? I'm always looking to improve our process.