Pages and links tagged with Findings.
Jeff Sauro
Jeff Sauro examines five common NPS criticisms, including scale changes, weak predictive validity, and the loss of benchmarks, with a balanced take on what to keep. Useful when teams debate dropping or changing NPS and you want a research-grounded view.
Dock David Treece
Business News Daily piece weighing whether NPS is still useful, agreeing it tracks customer sentiment trends but does not tell you what to change to grow. Useful when leaders ask if NPS is worth keeping and you want a balanced view.
Ryan Stuart
Ryan Stuart of Kapiche lays out NPS's limits, such as missing context, lagging signals, and the risk of becoming a vanity metric. Useful when teams over-rely on NPS without digging into why it moves.
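Since several of the entries above debate the merits of NPS, a minimal sketch of how the score itself is computed may help ground the discussion. This uses the standard NPS definition (promoters score 9-10, detractors 0-6, passives 7-8); the example responses are illustrative, not drawn from any linked article.

```python
def nps(scores):
    """Net Promoter Score (-100..100) from a list of 0-10 ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) count only
    toward the denominator. NPS = %promoters - %detractors.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Example: 5 promoters, 2 passives, 3 detractors out of 10 responses.
print(nps([10, 9, 9, 8, 7, 6, 3, 10, 9, 5]))  # prints 20
```

Note that two very different response distributions can yield the same score, which is one reason the critiques above call NPS a lagging, context-free signal.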
Jeff Sauro
Jeff Sauro lists five ways to interpret a SUS score: percentiles, grades, adjectives, acceptability, and NPS correlation, with 68 as the average. Useful when explaining or grading SUS results without confusing stakeholders.
Kate Betteridge
Bentley UXC piece explaining what clients should know about SUS scores, including how the 0-100 number is not a percentage and the average sits around 68. Useful when sharing SUS results with stakeholders who might misread the number.
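The two SUS entries above note that the 0-100 score is not a percentage and that the average sits around 68. A short sketch of the standard SUS scoring formula (odd items contribute response minus 1, even items 5 minus response, the sum scaled by 2.5) makes clear why the number cannot be read as a percentage:

```python
def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses, item 1 first.

    Standard SUS scoring: odd-numbered items contribute (r - 1),
    even-numbered items contribute (5 - r); the sum is scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best case: 100.0
print(sus_score([3] * 10))                         # all-neutral: 50.0
```

An all-neutral respondent lands at 50, yet the cross-study average is about 68, which is exactly the stakeholder confusion both articles address.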
Geoffrey Bourne
Lists eight signals that you might be building something no one wants and what to do about it, drawing on Lean Startup ideas. Useful when a team senses something is off and wants a quick checklist of warning signs.
Adam Dorrell
Adam Dorrell pushes back on the wave of NPS-bashing articles, defending the score as a useful metric when applied with care. Useful when leaders want a counter-argument to anti-NPS critiques before deciding what to track.
Lucy Luo
Argues validation lives on a spectrum, so leaders should ask what evidence has been collected and how strong it is, not just whether something is validated. Useful when teams overclaim that an idea is proven from light evidence.
Martha Brooke
Pushes back on NPS and other outcome-style metrics, explaining what they cannot tell you about customers. Useful when teams want to know the limits of NPS before basing big decisions on it.
Ben Le Ralph
Shares step-by-step techniques for turning raw user research into clear insights that shape an MVP. Useful when a team has lots of research notes and needs a way to pull out themes and design directions.
Lucas Maretti
Lucas Maretti analyzes Starbucks marketing data to surface customer preferences and trade-offs. Useful when a marketer wants to see real-world examples of preference data in action.
Formclick
Formclick covers why design validation (building the right thing) and verification (building it right) both matter for UX quality. Useful when a team confuses the two and skips one step.
They Make Design
They Make Design explains how to turn UX insights into shipped actions, not just reports. Useful when a team gathers data but rarely changes the product.
Nielsen Norman Group
Nielsen Norman Group explains long-tail data in UX and why low-frequency events still matter for product strategy. Useful when a team only looks at top-funnel metrics and is missing valuable niche behaviors.
Martin Sandström
Argues that designers should pick the smallest impactful insight and run a quick experiment instead of producing big reports stakeholders ignore. Useful when a team is stuck in long discovery cycles that don't translate to decisions.
Henry Lantham
Argues that most product discovery is fluffy because PMs and designers do not package insights for stakeholders. Useful when discovery work keeps getting ignored by leaders despite real research effort.
Jeremy Strickland
Personal story of a Google product launch that flopped because the team followed waterfall instead of agile and missed user signals. Useful when a product team is debating waterfall vs agile and wants a real cautionary tale.
Ilma Andrade
Reframes negative design feedback as a chance to learn more about clients and grow. Useful when designers feel hurt by criticism and need a healthier mindset.
Muralidhara Anandamurthy
Case study using exploratory factor analysis on trust in online sellers. Useful when you want a worked example of EFA on a UX-style survey dataset.
Jorge Maya Castaño
Academic study using factor analysis to build an empirical UX model. Useful when you want a scholarly base to build on for survey-driven research.
Bartosz Mozyrko
Argues that mixing quantitative and qualitative data avoids both narrow conclusions and shallow numbers, with a bounce-rate example. Useful when a UX team has data but cannot explain user behavior and needs the why behind the numbers.
Alastair Simpson
Atlassian case study showing how blending data with qualitative insight beats pure data-driven design, including a -12% to +22% turnaround. Useful when a team is debating whether to ship based on a bad first experiment result.
Nagoya Studio
Practical guide for graphic designers on sketching, testing assumptions, getting feedback, and polishing visuals to tell a clearer data story. Useful when a designer is about to build a chart or report and wants a simple checklist for making it land.
Mudita Singhal
Mudita Singhal shares a lens for reading product metrics in context. Useful when a team gets numbers but cannot tell good from bad signals.
Abhishek Chakravarty
Abhishek Chakravarty walks through running rigorous product experiments without guesswork. Useful when a team is shipping tests but unsure if results are trustworthy.
Martin Sandström
Martin Sandström warns that discovery research can produce insights no one asked for if you ignore why you're on the project. Useful when researchers feel ignored after presenting findings and need to refocus their work.
Jon MacDonald
The Good shares case studies of iterative testing that drove conversion lifts for clients, covering hypothesis design, sample size, and lessons learned. Useful when a CRO team wants concrete proof that iterative testing pays off, not just opinion.
Bryan Zmijewski
Seventeen tactics to run design critiques that influence others and win meetings. Useful when designers want stakeholders to leave reviews aligned and bought in.
Renato Galindo
Eight rules for handling product design feedback, like setting expectations and asking why. Useful before a critique so designers know how to run it well.
Fábio Belchior
Shows how to gather user feedback from many channels and turn it into clear design changes. Useful when teams collect lots of input but struggle to act on it.
Jo Crowley
Personal take on how to receive design criticism with grace and not take it to heart. Useful for junior designers learning to handle critique sessions.
Kaan Uluer
PM's guide to using behavioral analytics to understand and improve product use. Useful when you have analytics data but aren't sure how to use it for behavior insight.
James R. Lewis
Excerpt on factor analysis from Sauro and Lewis's Practical Statistics for UX. Useful when you need a credible reference on stats for UX surveys.
Travis Kassab
Walks through using factor analysis to find patterns in user-needs survey data. Useful when you have lots of survey items and need to group them into themes.
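The factor-analysis entries above all concern grouping survey items into themes. A hedged sketch of that workflow using scikit-learn's `FactorAnalysis` follows; the simulated data, latent-trait names, and item counts are illustrative assumptions, not taken from any of the linked studies.

```python
# Illustrative exploratory factor analysis: six simulated survey items
# driven by two hypothetical latent traits ("trust", "ease of use").
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 200
trust = rng.normal(size=n)   # hypothetical latent trait 1
ease = rng.normal(size=n)    # hypothetical latent trait 2

# Items 1-3 load on trust, items 4-6 on ease of use, plus item noise.
items = np.column_stack(
    [trust + rng.normal(scale=0.3, size=n) for _ in range(3)]
    + [ease + rng.normal(scale=0.3, size=n) for _ in range(3)]
)

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
loadings = fa.components_.T  # shape: (6 items, 2 factors)

# Items whose largest absolute loading falls on the same factor
# belong to the same theme.
for i, row in enumerate(loadings, start=1):
    print(f"item {i}: loadings {row.round(2)}")
```

On real survey data you would inspect the loading matrix (often after rotation, which scikit-learn also supports via the `rotation` parameter) and name each factor from the items that cluster on it.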
Shane Gryzko
Summary of quant UX research advice from Jeff Sauro and James Lewis. Useful when you want a quick crib sheet of quant research wisdom from the best.
Diane Bowen, MS
Demystifies statistical analysis for UX researchers running surveys. Useful when stats feel intimidating but you need to do them anyway.
Alexandra Nemeth
Real-world HCD case studies from movingworlds.org showing what action looks like. Useful when you need motivating examples of HCD doing real work.
Nick Moore
Survey results on what developers actually want from designers at handoff time. Useful when you want to align handoff to what devs ask for, not what designers assume.
Audrey Alejandro
Reflective post from a researcher questioning whether she's coding qualitative data the right way. Useful when you are coding interview transcripts and want to sanity-check your method.
Cambrian Berry
Healthcare-focused service design case studies showing how journey mapping drives social impact. Useful when you are designing for healthcare or social services and need real examples.
Arun Joseph Martin
Walkthrough of a real IBM service design project for z/OS Cloud Broker. Useful when you want a concrete enterprise example of service design end-to-end.
storytelling with data
Cole Nussbaumer Knaflic's chart guide covering when to use which chart type and how to design clear visuals. Useful when you need to pick the right chart for a specific business message.
Ridhi Singh
Case study showing how cutting screens, removing extra form fields, and explaining wait times lifted onboarding conversion by 200%. Useful when you suspect your onboarding flow has friction and want a real before-and-after to learn from.
tmdesign
Walks through how data-driven design turns UX insights into actions, with a sample workflow. Useful when a design team wants to move from research debriefs to actual design changes.
Rick Swette
Rick Swette shares anecdotes of design insights from years of research, like why visitors prefer one-page navigation and why first-song quality drives Spotify playlist traction. Useful when teaching insight craft and you want real, surprising examples instead of textbook lists.
Krutik Bhavsar
Argues AI experiences need quiet, effective design and shares stats on why latency and weak language understanding push users away. Useful when an AI feature is built but adoption is poor and you need to find the friction points.