Pages and links tagged with UX metrics.
Jared Spool
Jared Spool argues NPS is harmful because reducing user experience to one number does not help teams grow or know how loyal customers are. Useful when UX professionals need to push back on NPS being treated as the main UX metric.
Ricardo Saltz Gulko
Ricardo Saltz Gulko argues NPS is too narrow for B2B and pushes for a 360-degree mix of metrics that capture today's customer reality. Useful when leaders rely on a single number and you want to argue for a broader measurement set.
Raluca Budiu
NN/g video with Raluca Budiu on the System Usability Scale, how to administer the 10-item survey, and how to interpret the score. Useful when training a team to use SUS in usability tests.
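For reference, the standard SUS scoring the video walks through can be sketched in a few lines, assuming the usual 10 items answered on a 1-5 scale (this is the conventional scoring rule, not code from the video):

```python
def sus_score(responses):
    """Score one completed SUS questionnaire.

    responses: list of 10 answers on a 1-5 scale, in item order.
    Odd-numbered items (positive statements) contribute (answer - 1);
    even-numbered items (negative statements) contribute (5 - answer).
    The sum is scaled by 2.5, giving a score from 0 to 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, answer in enumerate(responses, start=1):
        total += (answer - 1) if i % 2 == 1 else (5 - answer)
    return total * 2.5

# All-neutral answers (3 on every item) land exactly at the midpoint:
print(sus_score([3] * 10))  # 50.0
```

Note the result is not a percentage: a SUS of 50 is well below the commonly cited average of around 68.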
Ward van Gasteren
Walks through eight steps to find a North Star Metric that captures the value users get from your product. Useful when picking the single number a company should rally around.
Jeremy Watkin
Sets the record straight on CSAT, NPS, and CES, explaining what each measures and when to use them. Useful when teams confuse the three metrics and need to know which fits which question.
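The three metrics above reduce to simple arithmetic over survey responses. A minimal sketch using common conventions (CSAT as the share of 4-5 answers on a 1-5 scale, NPS on a 0-10 scale, CES as a mean on a 1-7 "easy" scale; exact scales vary by vendor):

```python
def csat(scores):
    """CSAT: percent of satisfied responses (4 or 5 on a 1-5 scale)."""
    return 100 * sum(s >= 4 for s in scores) / len(scores)

def nps(scores):
    """NPS: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

def ces(scores):
    """CES: mean effort rating, here on a 1-7 scale where higher = easier."""
    return sum(scores) / len(scores)

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors of 6 -> 0.0
```

Note that NPS ranges from -100 to 100, while CSAT is a 0-100 percentage, which is one reason the two are so often confused in dashboards.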
Dhananjay Garg
Argues startups should swap NPS for Customer Effort Score because CES better predicts loyalty by tracking how easy a transaction or support call was. Useful when picking the right CX metric for an early-stage product.
Klaas Hermans
Explains Customer Effort Score (CES) as a metric for how easy it is for users to complete a specific interaction, captured right after that moment. Useful when teams want a focused signal on friction in support or product flows.
Elizabeth Coffman
Lists ten customer engagement metrics like CSAT, conversion rate, session duration, and churn that signal how users are connecting with a product. Useful when picking the right engagement signals to track on a dashboard.
Joshua Seiden
Argues NPS only tells you how you did, so the magic question is which user behaviors lead to a high score. Useful when leaders chase NPS targets without knowing what to actually change.
Stacey Barr
Provides a decision tree to figure out whether a result actually needs a KPI before adding one, to avoid metric overwhelm. Useful when leaders keep asking for new KPIs and you want to push back with a process.
Claudiu Murariu
Argues that retention numbers misread churn when sign-ups are counted instead of onboarded users, and shows a B2B case where onboarded retention sat above 25%. Useful when leadership panics about churn before checking onboarding completion.
Jane Portman
Walks through building a SaaS activation funnel by mapping the minimum path to "awesome" and defining success at each step. Useful when activation feels messy and you need a clean path to measure stage by stage.
Dan Martell
Five SaaS founder moves to drive activation and net-negative churn, including tent-pole moments in the first 100 days and cancellation insights. Useful when retention is shaky and you want activation tactics tested across many SaaS companies.
Natalie Marcotullio
Five techniques for moving SaaS users from signup to activation, with benchmark numbers (avg 34%, median 25%) from Lenny Rachitsky data. Useful when activation lags and you want field-tested fixes plus a number to aim for.
Kara Pernice
NN/g video walking through five signs of low UX maturity, like UX as a service department or no shared metrics. Useful when you suspect the org is held back by low UX maturity but cannot put words to it.
Emily Taylor
Defines brand positioning and walks through measurement methods like surveys, brand tracking, and competitive analysis. Useful when leadership asks how we know our positioning is working in market.
Daniel McAuley
Daniel McAuley shares principles for designing metrics that change behavior, like keeping them simple and testable. Useful when a team is creating a new metric and worried it will be misused.
Udit Maitra
Short video on UX metrics covering common measures and how to use them. Useful when you want a quick visual primer on UX metrics for your team.
Elita Freiberga
Elita Freiberga at Sharewell explains six user testing metrics and how each one tracks UX over time. Useful when a small team is building its first user-testing dashboard.
Nikki Anderson-Stanier
Nikki Anderson-Stanier walks through eleven usability testing metrics like task success, time on task, confidence, and SUS. Useful when a researcher wants quantitative numbers to pair with qualitative findings.
UXness
UXness lists seven UX metrics including task success, time on task, error rate, and NPS. Useful when a team is starting fresh and needs a small, well-known set of signals to track.
Florian Biesinger
Florian Biesinger of Spotify explains data-driven approaches to building new features, including key metrics and decision making. Useful when a product team is unsure how to set up data thinking for a new feature.
Shana Lynn Bresnahan
Shana Lynn Bresnahan walks through a simple formula for customer retention rate so any team can compute it. Useful when a community or subscription team needs a clean baseline number to track over time.
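The standard retention-rate formula entries like this one describe — customers at the end of the period, minus those acquired during it, divided by customers at the start — is a one-liner (a generic sketch, not code from the post):

```python
def retention_rate(start, end, new):
    """Customer retention rate over a period, as a percent.

    start: customers at the start of the period
    end:   customers at the end of the period
    new:   customers acquired during the period
    """
    return 100 * (end - new) / start

# 200 at start, 190 at end, 30 of those acquired mid-period:
print(retention_rate(start=200, end=190, new=30))  # 80.0
```

Subtracting new customers is the step teams most often skip, which inflates the number: counting the 30 new sign-ups above would report 95% instead of 80%.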
Emerging India Analytics
Lists five reasons data analytics changes customer retention, from real-time adaptation to personalization based on behavior. Useful when a team needs quick talking points to justify investing in retention analytics.
Lloyd Tabb
Lloyd Tabb of Looker explains why download counts and other surface numbers fool teams, and offers clarity metrics that predict real growth. Useful when a team needs to swap pretty dashboards for signals that drive real decisions.
Kohki Yamaguchi
Walks through how to choose UX metrics using Google's Goals-Signals-Metrics process and the HEART framework. Useful when a UX research team is setting up its first measurement plan and wants a tested method.
Chethan KVS
Comprehensive video on product metrics for designers covering setup, frameworks, and how to communicate them. Useful when a product designer is learning measurement basics and wants a guided overview.
Connor Joyce
Cautions that proxy metrics like click-through rate are weakly tied to ROI and explains where they break down as KPIs. Useful when a marketing or analytics leader is rethinking which metrics to elevate to KPI status.
Maria Panagiotidi
Comprehensive video on product metrics for designers covering setup, frameworks, and how to communicate them. Useful when a product designer is learning measurement basics and wants a guided overview.
Nicole Gallardo
Covers Connor Joyce's four-step process for picking UX metrics that link to business impact using a User Outcome Connection lens. Useful when a UX team is moving from usage stats to outcome-based metrics tied to retention and growth.
Softvery Solutions
Comprehensive guide to product launch metrics covering revenue, customers, marketing, and product usage. Useful when a team is building a launch dashboard and needs a wide menu of metrics to choose from.
Chloe Hoang
Lays out four metrics any feature launch should track, aimed at product designers and PMs. Useful when a team is launching a feature and wants a tight set of metrics to judge success.
Megan Wells
Megan Wells walks through ways to measure customer value across the lifecycle. Useful when a team needs concrete metrics to prove value to customers and to leadership.
Reforge
Reforge explains how to choose North Star Metrics across acquisition, retention, and monetization using unit of value, quality, and frequency. Useful when a leadership team is selecting one or more North Stars and wants a robust framework.
Tim Neusesser
NN/g video that draws a sharp line: use completion rate for linear flows, success rate when there are many ways to finish. Useful when you can never remember which metric to report and need a clean rule.
Rebecca Riserbato
Defines CX through eight metrics service teams use, with stats like 49% of customers leaving after one bad experience. Useful when a service or CX team is choosing which metrics to put on their dashboard.
Jorge García-Luengo
Critiques NPS and CSAT as too vague and proposes a four-family KPI structure across financial, sales, experience, and operations. Useful when a CX team wants better metrics that tie experience to operations and revenue.
Userpilot
Walks through key UX metrics like task completion rate, time on task, and error rate, plus tools to track them. Useful when a product team wants concrete metric definitions and a way to plug them into analytics.
Incharaprasad
Lists ten startup UI/UX metrics across quantitative, qualitative, behavioral, and attitudinal types with example tools like Google Analytics. Useful when a startup team needs a starter list of which UX metrics to set up first.
Javier Andrés Bargas-Avila
Talk by Google's UX research director on how to define and measure user-centric metrics for product development. Useful when a UX research lead is setting up measurement at scale and wants to learn from Google's playbook.
Bansi Mehta
Breaks UX metrics into usability and engagement, then introduces Google's HEART framework as a way to organize what to track. Useful when a team is setting up a UX measurement plan and needs a starter framework.
Adam Fard
Walks through behavioral and attitudinal UX metrics like task time and CSAT, and how to tie them to business outcomes. Useful when a UX team needs to set up a measurement plan that connects design changes to ROI.
Agathe Huez
Lists seven techniques for building dashboards that read like stories, including filtering, KPI focus, and beginning-middle-end structure. Useful when a team is designing a customer dashboard and needs concrete moves to make it more engaging.
Userpilot
Userpilot explains the difference between a counter metric and a north star metric. Useful when a team is picking guardrail metrics to stop unintended consequences.
Ricky Johnston
Ricky Johnston names metrics that every design team should track to show value. Useful when a design lead wants a starter set of UX metrics for the team.
Neil Patel
Neil Patel explains how to use a single metric to run a startup. Useful when a founder wants a simple way to focus the team's efforts.
Ash Maurya
Ash Maurya shares lean analytics ideas around the one metric that matters. Useful when a startup needs a focused metric to align the team around.
Mike Zawitkowski
Mike Zawitkowski warns that picking one metric can become a trap and cause bad decisions. Useful when a team is rallying around a single metric and ignoring tradeoffs.
George Guimarães
George Guimarães argues lead time is a better signal than cycle time for many teams. Useful when leaders want to pick one number to watch over time.
Nick Hodges
Nick Hodges shares tactics for shrinking the coding portion of cycle time. Useful when an engineering manager wants concrete moves to speed up delivery.
Barry Overeem
Johannes Schartau and Barry Overeem explain how to experiment with measuring lead and cycle time. Useful when teams want to start tracking delivery flow without big tooling.
Nav Dhunay
Nav Dhunay walks through what cycle time is and how to master it. Useful when a team is hearing the term and wants a clear primer.
Aaron Gitlin
Aaron Gitlin walks designers through becoming data-aware in their work. Useful when designers want to read data without becoming analysts.
Sandeep Chadda
Sandeep Chadda offers a simple structure for thinking about metrics for any product. Useful for new PMs who want a clean mental model before picking metrics.
Ant Murphy
Ant Murphy compares four frameworks for defining product metrics. Useful when a PM team wants to standardize how they pick metrics across squads.
Scott Sehlhorst
Scott Sehlhorst shares five lenses for looking at any product. Useful when a team needs a fresh angle to find a metric or opportunity.
Userpilot
Userpilot guide for picking a product metrics framework that fits your stage. Useful when a PM is choosing between AARRR, HEART, and others.
Cody Arsenault
Cody Arsenault compares NPS, CES, and CSAT for product managers. Useful when a PM is choosing one customer satisfaction metric to track over time.
Jan Bosch
Jan Bosch argues that customer value is a fuzzy term and pushes for measurable outcomes instead. Useful when a team keeps using customer value as a vague excuse for choices.
Nikolas Vogt
Nikolas Vogt's North Star Metric checklist flags common pitfalls like using revenue as a North Star or ignoring usage and retention. Useful when a leadership team wants a final review before locking in a metric.
Jared M. Spool
Jared Spool argues that real UX success metrics are tied to user outcomes (e.g., paying tickets sooner) and not vanity numbers. Useful when teams set generic KPIs and need a way to find truly meaningful ones.
Debbie Levitt
Debbie Levitt warns that CX and UX metrics often miss the long arc of an experience, focusing on a transaction instead of repeat behavior. Useful when a team chases conversion gains that hurt loyalty later.
Alex A. Szczurek
Alex Szczurek's longer list of UX metrics groups them into behavioral, attitudinal, descriptive, diagnostic, and engagement buckets. Useful when a team wants a long inventory before narrowing down to the few that matter.
Bryan Zmijewski
Bryan Zmijewski's LinkedIn post argues UX metrics should give structure to ideas and clearly tie design to business value. Useful when leaders need a quick prompt to upgrade how their team picks metrics.
Kerry Rodden
Kerry Rodden's GV Library piece introduces the HEART framework and Goals-Signals-Metrics process for picking UX metrics that fit a product. Useful when a team wants a proven, lightweight method to choose metrics that match goals.
Reddit, Inc.
Reddit r/UXDesign thread debates what metrics a product or UX designer should actually own. Useful when a designer wants real-talk perspectives before agreeing to KPIs in a review.
Max Stepanov
Max Stepanov writes a long guide on KPIs and UX metrics, including SUS, SUPR-Q, UMUX-Lite, and CES, to ground design decisions in business goals. Useful when a team builds its first measurement system and needs reference instruments.
Emily May
Emily May at ICAgile shares five Agile metrics for continuous improvement, like cycle time, that help teams spot bottlenecks. Useful when an Agile team wants metrics tuned to flow, not just outcomes.
Build UX
Build UX lists eight UX metrics that fit Agile teams, like task success, time on task, CSAT, and NPS. Useful when an Agile product team needs a starter metric set that fits short cycles.
Jasmine C.
Jasmine Christensen extends Nielsen's heuristics into a custom set spanning product thinking, UX, and UI design so teams can rate their own quality. Useful when a design team wants a shared scorecard for self-critique.
Alex Szczurek
Alex Szczurek explains common UX KPIs like satisfaction, task completion, and bounce rate, and how to collect them via surveys, analytics, and tests. Useful when a team needs an entry-level guide to UX KPIs and how to gather them.
Janine Kim
Janine Kim explains why measuring UX over time gives leadership a clear story of what got better or worse. Useful when teams have lots of qualitative feedback but no longitudinal numbers.
Emily Stevens
Emily Stevens lists the seven UX KPIs every team should track, mixing behavioral measures like task success with attitudinal scores like NPS. Useful when a team needs a starter set of UX KPIs to align on.
Maximilian Speicher, PhD
Maximilian Speicher argues that conversion rate and AOV are not real UX metrics and that proper measurement needs instruments like the UEQ. Useful when leaders mix up business KPIs with UX metrics and miss real friction.
Reddit, Inc.
Reddit thread collects opinions on how to measure UX designers and researchers, with debate over outcomes vs activities. Useful when a leader is shaping reviews or KPIs and wants to see how peers handle it.
Tomer Sharon
Tomer Sharon shows how to define what counts as use, then track the share of users who keep using the product over time. Useful when a team has loose retention numbers and wants a clean, agreed way to measure.
Fullstory
Fullstory explains user retention rate, the formula to calculate it, and why it predicts long-term product health. Useful when teams need a basic, shared definition of retention before debating goals.
Alex L.
Alex Lew explains how to measure B2B UX with the User Experiences System and PTECH, which combine ease of use, consistency, and performance metrics. Useful when you cannot get full behavioral data from B2B customers and still need a defensible way to score the experience.
Boaz Gurdin
VMware's USER framework measures enterprise UX through Usage, Satisfaction, Ease of use, and Ramp-up so teams have continuous signals from real customers. Useful when an enterprise team wants a steady, shared metric set to guide priorities and prove design's impact.
Gabriela Lucía L.
A case study walks through picking and using UX metrics for a real B2B product, broken into descriptive, behavioral, and attitudinal categories. Useful when a small team needs a worked example before setting up its own metric program.
Mingtao Wu
Mingtao Wu offers a three-minute method for defining product metrics — pick the goal, choose the metric, and check it against bias. Useful when a PM has to defend a metric in a review tomorrow and needs a fast way to sanity-check it.
Unipaws
Unipaws shows how analytics can power personalized experiences that feel delightful — covering segmentation, behavioral data, and recommendation patterns. Useful when a product team wants to use existing data to create moments of personalization without rebuilding the stack.
Helen Parf
Shows how to layer metrics onto a user journey map so each stage has a number that signals if it is working. Useful when a team has a pretty journey map that nobody updates and needs to make it operational.
Tanya Levdikova
UXPressia walks through best practices for measuring journey performance — picking stage KPIs, setting baselines, and using examples from real journey maps. Useful when a CX team has a journey map but no way to know if it is getting better or worse.
Jen Clinehens
Jen Clinehens explains how to measure customer experience by combining surveys, behavioral data, and journey scores so teams can connect feeling to action. Useful when a CX team is asked to prove their work moves business numbers, not just survey scores.
Natalie LeRoy
Case study on selecting UX metrics for onboarding and engagement inside a daily study program — covering what to measure on day one versus day thirty. Useful when a team is building a habit-forming product and needs to separate first-use metrics from long-term engagement signals.
Ivan Peralta
Short, opinionated take on which onboarding metrics actually matter — completion rate, time-to-value, and feature adoption — and which vanity metrics to ignore. Useful when an onboarding redesign is being measured and the team needs a tight set of numbers to defend.
Parth V
Part three of a series goes deep on acquisition and activation metrics — what counts as a real signup, how to define an activation event, and why time-to-value matters. Useful when a PM is setting up funnel tracking and needs a clear definition of what a 'good' activation looks like.
Archana Madhavan
Amplitude lays out a four-step process for measuring onboarding: define the activation event, segment new users, watch behavior between signup and activation, and run experiments on drop-off points. Useful when a growth team can see signups but cannot tell why new users do not stick.
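The first two steps of that process — define an activation event, then segment new users by signup cohort — can be sketched from a raw event log. Everything here (the event names, the log shape, ISO-week cohorts) is a hypothetical illustration, not Amplitude's implementation:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log: (user_id, event_name, date). "activated" stands in
# for whatever activation event the team has defined.
events = [
    ("u1", "signup", date(2024, 1, 1)), ("u1", "activated", date(2024, 1, 3)),
    ("u2", "signup", date(2024, 1, 2)),
    ("u3", "signup", date(2024, 1, 9)), ("u3", "activated", date(2024, 1, 10)),
]

def activation_by_cohort(events):
    """Group signups into ISO-week cohorts and report each cohort's activation rate."""
    signups, activated = {}, set()
    for user, name, day in events:
        if name == "signup":
            iso = day.isocalendar()
            signups[user] = (iso[0], iso[1])  # (ISO year, ISO week) cohort key
        elif name == "activated":
            activated.add(user)
    cohorts = defaultdict(lambda: [0, 0])  # cohort -> [signups, activations]
    for user, cohort in signups.items():
        cohorts[cohort][0] += 1
        cohorts[cohort][1] += user in activated
    return {c: acts / total for c, (total, acts) in cohorts.items()}

print(activation_by_cohort(events))  # {(2024, 1): 0.5, (2024, 2): 1.0}
```

Comparing rates across cohorts like this is what makes the drop-off experiments in steps three and four measurable.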
Suzan Calcali
Catalog of 16 customer success metrics — NRR, churn, NPS, time-to-value, expansion, and more — with formulas and when to use each. Useful when a CS leader is rebuilding their reporting and wants a starting menu of metrics with definitions.
Sue Nabeth Moore
Sue Nabeth Moore explains how to pick customer success metrics that actually predict renewals — separating health, adoption, and outcome metrics so teams know which lever they are pulling. Useful when a CS team's dashboard has too many numbers and leaders cannot tell which ones matter for retention.
Janhavi Nagarhalli
Walks through how to map a B2B journey, attach metrics to each stage, and run small experiments to improve weak spots. Useful when a B2B team has a rough journey doc but no way to measure or improve it stage by stage.
Navdeep Yadav
Long catalog of 100+ product metrics grouped by funnel stage, business model, and team — from acquisition and activation through retention, revenue, and referral. Useful when a PM is setting up a metrics tree and wants a menu to pick from instead of reinventing each KPI.
Tanner Christensen
Pairs Top Tasks with PURE (Pragmatic Usability Ratings by Experts) to score how well a design helps users finish key jobs and where it slips. Useful when a design team needs a repeatable way to show leaders that their work is moving real user outcomes.
Accredian
Accredian's overview of key metrics and strategies for product success. Useful when PMs need a refresher on big metric categories.
Sandeep Chadda
Sandeep Chadda's simple structure for thinking about any product metric. Useful when teams want a flexible way to map any metric to a goal.
Matej Latin
Matej Latin on measuring and quantifying user experience. Useful when teams want to put real numbers behind UX work.
Vlad Orlov
Vlad Orlov on avoiding the vanity KPI trap in sales. Useful when sales teams report numbers that look great but mean little.
Tashina Alavi
Tashina Alavi on telling vanity metrics from actionable metrics. Useful when teams want a simple test for whether a metric should stay.
Avi Siegel
Avi Siegel pokes at vanity metrics that boost ego but not the business. Useful when teams flex big numbers that hide real problems.
John Utz
John Utz's PM guide to moving from vanity metrics to actionable insights. Useful when PMs want to shift their dashboards to drive decisions.
Jefferies Jiang
Jefferies Jiang on moving past vanity metrics in 2025. Useful when teams want a refresh on which numbers really matter.
Ben Yoskovitz
Ben Yoskovitz argues even vanity metrics can have a real role if used well. Useful when teams blanket-ban vanity metrics and miss useful context.
Luke Korthals
Explains why UX KPIs matter and lists task success rate, time on task, and other key indicators. Useful when a team wants to move from gut feel to evidence-based UX decisions.
Maximilian Speicher, PhD
Argues that common analytics like conversion rate do not really measure UX, and points to proper tools like the UEQ. Useful when a team is unsure if they are measuring the right thing.
Neil Patel
Lists seven simple ways to measure a website's user experience, from page speed to task success. Useful when a small team wants quick checks before doing deeper UX work.
Udit Maitra
Walkthrough of the main types of UX metrics and when to use each. Useful when you need a tour of metric types before picking your set.
Reddit, Inc.
Reddit thread where designers share ways to measure the work of UX designers and researchers. Useful when a manager needs honest, peer ideas about how to judge UX team output.
Jeff Sauro
Lays out a step-by-step plan for measuring UX by tying user data to company KPIs, top tasks, and benchmarks. Useful when a team needs a starting playbook to set up a real UX measurement program.
Nick Babich
Quick guide to UX metrics including HEART and PULSE frameworks. Useful when you want a one-page primer to share with your team.
Jeff Humble
Plain-English intro to UX metrics and how to pick the right ones. Useful when you are new to UX measurement and want a clear starting point.
Shivani Dubey
Lists 5 UX metrics and 8 KPIs you can use to measure user experience. Useful when you need a starter list of UX metrics to pick from.
Raluca Budiu
Classic NN/g piece arguing success rate is the cheapest, clearest UX metric and represents the bottom line of usability. Useful when you only have time for one metric and need to defend the choice.
Kayode Osinusi
Toptal guide that covers qualitative and quantitative ways to measure UX with concrete metrics. Useful when you need a starter map for picking which UX metrics to track in your product.
Jeff Sauro
Ten quick facts about completion rates: how to score, why they matter, and how sample size affects confidence. Useful when you are running a usability study and want to use completion rate the right way.
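Because usability samples are small, a completion rate needs a confidence interval to be meaningful. A sketch using the adjusted-Wald method (the approach commonly recommended for small-sample completion rates, which adds z²/2 successes and z² trials before the normal approximation; this is a generic sketch, not code from the article):

```python
import math

def completion_rate_ci(successes, n, z=1.96):
    """Completion rate plus an adjusted-Wald confidence interval.

    successes: number of participants who completed the task
    n:         total participants; z=1.96 gives a ~95% interval
    """
    p_adj = (successes + z**2 / 2) / (n + z**2)
    margin = z * math.sqrt(p_adj * (1 - p_adj) / (n + z**2))
    low, high = max(0.0, p_adj - margin), min(1.0, p_adj + margin)
    return successes / n, (low, high)

rate, (low, high) = completion_rate_ci(successes=4, n=5)
print(f"observed {rate:.0%}, 95% CI roughly {low:.0%} to {high:.0%}")
```

With 4 of 5 participants succeeding, the interval spans from roughly a third to nearly all users, which is exactly the sample-size caution the article raises.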
Sophie
Walks through how to measure binary task success and levels of success in usability tests. Useful when you are setting up a usability test and need to define what success looks like.
Rurik Mahlberg
Explains how task success rate measures usability and where it pinpoints user friction in real services. Useful when you want one go-to UX KPI to compare designs or user groups against.
Aparna Subhash
Wowmakers' KPIs for measuring the success of UX design, with examples and rationales. Useful when a UX lead is preparing a KPI deck for stakeholders and wants quick examples to borrow.
Kateryna Mayka
Eleken's UX design KPI examples and how to measure user experience with them. Useful when a designer or team is making their first KPI doc and wants a starter set of examples to choose from.
Jack O'Donoghue
Three steps to better UX metrics with the Google HEART framework, focused on practical use. Useful when a team has heard of HEART and wants a quick way to apply it without a heavy rollout.
Ryan Ford
Pokes at the alphabet soup of design KPIs and helps teams figure out which ones actually mean something. Useful when a design lead is trying to clean up a noisy metrics deck before a stakeholder review.
Fullstory
Walks through six core CX metrics like NPS, CSAT, CES, churn, and retention, with examples of when each one is the right pick. Useful when a team is setting up a CX dashboard and wants a quick reference to choose the right metric per question.
Sarah Gomillion, PhD
Breaks down what user trust is and how to measure it through behavior, surveys, and qualitative cues. Useful when a team builds something sensitive like AI, finance, or health and needs a way to track if users actually feel safe.
Brian Fleming
Lays out a practical framework for measuring customer experience by tying signals from surveys, behavior, and operations to clear outcomes. Useful when a team wants a simple shared view of how to track CX without drowning in dashboards.
Greg Kihlstrom
Walks through traditional CX measures like CSAT and NPS and explains where each one falls short on its own. Useful when a CX or product team is picking which metrics to keep, drop, or pair with new signals.
Jorge García-Luengo
Jorge García-Luengo argues CSAT and NPS often fall short and proposes a measurement model tied to strategy, operations, and delivery. Useful when CX scores look fine but real experience problems persist and you need a deeper measurement approach.
Gabriel Queiroz
LinkedIn article applying Google's HEART framework to product discovery metrics for a clearer view of user signals. Useful when mentoring an associate PM or when picking metrics for a new discovery effort.
Ethan Chan
Walks through using a clear desired outcome and frameworks like AARRR or JTBD to drive data-backed product discovery. Useful when starting discovery on a new feature and you want to lock in a measurable outcome before research.
Tim Herbig
Tim Herbig argues that quantity of experiments alone can't show whether discovery is succeeding and pushes signals of quality and process. Useful when experiment volume looks high but team decisions are not improving and you need better metrics.
Jeff Gothelf
Jeff Gothelf's three-tier approach to measuring product discovery: is it happening, is it effective, and is it consistent over time. Useful when starting to measure discovery in a team and you want a simple maturity ladder.
Jana DiSanti
Twenty Ideas blog on metrics for product discovery, including validated ideas, user engagement, and conversion rates as signals of innovation. Useful when leadership asks how to measure discovery work and you need a starter metric set.
Rohit V.
Groups product success metrics into five buckets — journey, utility, satisfaction, connection, and customer lifetime value — so PMs can see health from multiple angles. Useful when building a metrics dashboard and you want a balanced view rather than one number.
Olesia Melnichenko
Walks through seven satisfaction metrics including NPS, CSAT, and customer effort score, with notes on when each one fires the right signal. Useful when picking which satisfaction metric to attach to a feature or service touchpoint.
John Rampton
Outlines five ways to read customer happiness: easy feedback channels, product usage, qualitative signals, NPS, and support tickets. Useful when you want a starter set of signals to track satisfaction without spinning up a heavy program.
Nir Eyal
Shares NPS data showing that products users open more often score higher on customer satisfaction, linking habit formation to loyalty. Useful when you need a data-backed argument for investing in habit loops or measuring product stickiness.
Arkadiusz Radek
Arkadiusz Radek argues UX and product teams must be strategic about which metrics they pick and offers a guide to numbers worth measuring. Useful when teams default to vanity metrics and miss what actually moves the product.
Max Stepanov
Max Stepanov's Dual Metrics Model pairs metrics so each guards against the other's failure mode, avoiding tunnel vision from a single number. Useful when teams over-optimize one metric (like bounce rate) at the cost of UX or brand.
Sean Taylor
Sean Taylor argues metric design is never done - metrics are evolving artifacts to test, tweak, and replace, with five key tradeoff properties. Useful when teams treat metrics as fixed and stop reviewing whether they still work.
Maria M.
Maria Myre's Designlab guide groups product metrics into engagement, retention/satisfaction, and feature usage, paired with user research for richer insight. Useful when teams want a quick reference of which metrics fit which question.
Clint Fontanella
Clint Fontanella's HubSpot guide names 15 customer-success metrics across health, revenue, and operations - including health score, NPS, and churn. Useful as a starting menu of CS metrics for SaaS teams.
Sarah Gomillion, PhD
Sarah Gomillion shares how Expedia Group built a measure of user trust at scale and proved it predicts conversion and rebooking. Useful when teams want to make trust a real product KPI, not a vague concept.
Ryan Law
Ryan Law's guide to SaaS customer-success metrics covers ~50 metrics with formulas, including viral coefficient, referral ROI, and referral revenue. Useful when SaaS teams want to shift growth focus from acquisition to retention and referrals.
Austin Caldwell
Austin Caldwell lists e-commerce metrics with formulas - repeat-customer rate, churn rate, and many more - as a reference table. Useful when ops or merch teams need consistent definitions across reports.
Shailesh Sharma
Shailesh Sharma lists 15 e-commerce metrics for PMs - unique visitors, traffic sources, impressions, reach, CPA, and more. Useful as a quick reference when a PM needs to map e-commerce work to specific numbers.
Maximilian Speicher, PhD
Maximilian Speicher argues conversion rate and AOV aren't UX metrics - real measurement needs an instrument like UEQ. Useful when teams default to revenue numbers as proxies and miss what UX is actually doing.
Udit Maitra
Udit Maitra outlines three types of UX metrics - performance, self-reported, and behavioral/physiological - and recommends combining them for richer data. Useful when teams want to triangulate signals across methods.
Nick Babich
Nick Babich gives a quick guide to UX metrics, splitting them into behavioral (what users do) and attitudinal (what users say). Useful when teams need a one-pager intro before picking metrics for a project.
Jeff Humble
Jeff Humble breaks UX metrics down by qualitative vs quantitative and attitudinal vs behavioral, with a goal of centering metrics on the user's point of view. Useful when teams want a foundational framing of what UX metrics actually are.
Hailey Friedman
Hailey Friedman lays out information-hierarchy principles for marketing dashboards and argues custom dashboards beat generic ones. Useful when marketing teams want a quick reference on layout and content choices.
Adam Fard
Adam Fard pulls together dashboard lessons across three pillars: research, decluttering, and data viz. Useful when teams need a single overview of what makes a dashboard go from good to great.
Userpilot
Userpilot defines product value, shows how to measure it (PMF survey, NPS, behavior analytics), and lists ways to grow it. Useful when teams need to quantify whether users actually find the product valuable.