Opinion · 7 min read

When customers ask for features they already have: UX discoverability as the real problem

TL;DR

When customers request a feature you already built, the problem is almost never missing functionality — it's that your navigation, onboarding, or empty states failed to surface it, and building a duplicate will compound the original confusion.

Key takeaways

  • Before scoping any feature request, run a 10-minute audit: search your own product for the requested capability as a first-time user would. If you can't find it in three clicks, neither can they.
  • Track 'feature awareness rate' alongside feature usage — if 40% of active users have never triggered a core feature, you have a discoverability problem masquerading as a usage problem.
  • Empty states are the highest-leverage discoverability surface. A blank dashboard that says 'No data yet' has failed. One that says 'Import your first CSV to see trends — here's how' has done the job.
  • Duplicate features created in response to 'missing' requests become permanent maintenance debt. Audit first, build second.
  • Contextual tooltips on first use outperform onboarding tours by roughly 3x for feature retention — users ignore tours but engage with help that appears exactly when they need it.

The request that should not exist

A common scenario: an export-to-CSV feature ships in week three. By week eight, three separate customers have asked for an export-to-CSV feature to be added.

Same product. Same feature. Three people who had been using the tool for weeks with no idea the button existed.

The instinct is to blame the customers. The better instinct is to open the product and navigate to the export as a first-time user would: four clicks deep, through a menu labelled "Account settings". The data export ended up inside account settings because that's where it happened to get built, not because that's where anyone would look.

The feature wasn't missing. It was invisible.

This pattern is so common in product development that it has a name: the feature cemetery. You build things, they work, customers don't find them, and then you build them again in response to requests — burying the original version even deeper under a second layer of indirection.

Why builders are systematically blind to it

The core problem is that you have the curse of knowledge. You know where everything lives because you put it there. When you navigate your product, you are not discovering — you are remembering.

New users have no memory. They approach your product with a mental model built from other products they've used, from your landing page copy, and from whatever state their context is in at that moment. If those three things don't align with where you hid the feature, they won't find it.

There's also a support bias at work. Customers who can't find a feature will either email you, post in your community, or churn silently. The ones who email tend to frame their request as "I'd love if you added X" rather than "I can't find X". This isn't dishonesty — they genuinely believe it's missing. Your support inbox therefore reads as feature demand when it's actually navigation failure.

Intercom published data a few years ago showing that a significant portion of their "feature requests" in customer conversations corresponded to functionality already in the product. That figure varies by product complexity, but in my own experience across several tools, roughly one in five requests falls into this category.

The Feature Cemetery Loop

Feature not found → emailed as a feature request → added to backlog → duplicate feature built → original even harder to find → back to "feature not found".

Measure discoverability before you open the code editor

Before treating any feature request as a build task, run this audit:

Step 1: Search your own product cold. Open an incognito window. Log in as a new user (or use a demo account). Try to find the requested feature in under 90 seconds with no prior knowledge. If you fail, the discoverability problem is confirmed.

Step 2: Check session recordings. Tools like PostHog, FullStory, or Hotjar let you filter to sessions where users navigated to a feature's parent page but never clicked the feature itself. A funnel that shows 60% of users reaching a settings page but only 8% clicking the export button is telling you something specific.

Step 3: Run a five-second test. Show your navigation screenshot to three people who haven't used your product. Ask them: "Where would you go to export your data?" If fewer than two out of three point to the right place, your label or information architecture is broken.

Step 4: Audit your empty states. The moment a user first lands on a feature page — before they've created any data — what do they see? A blank screen with a generic "Get started" message is a discoverability failure waiting to happen. That screen should explicitly name what's available and how to trigger it.
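The session-recording funnel in step 2 can also be approximated directly from a raw event log, without a dedicated analytics tool. A minimal sketch, assuming a flat list of (user_id, event_name) pairs; the event names here ("viewed_settings", "clicked_export") are hypothetical stand-ins for whatever your instrumentation emits:

```python
def funnel_dropoff(events, reached_event, target_event):
    """Share of users who reached the feature's parent page but never used the feature."""
    reached = {user for user, name in events if name == reached_event}
    used = {user for user, name in events if name == target_event}
    if not reached:
        return 0.0
    # Users who got to the page but never clicked the feature itself.
    return len(reached - used) / len(reached)

# Illustrative event log: three users reach settings, only one finds the export.
events = [
    ("u1", "viewed_settings"), ("u1", "clicked_export"),
    ("u2", "viewed_settings"),
    ("u3", "viewed_settings"),
    ("u4", "signed_in"),
]
print(funnel_dropoff(events, "viewed_settings", "clicked_export"))  # 0.666... — 2 of 3 never clicked
```

A dropoff above, say, 50% on a page users demonstrably reach is the specific signal the audit is looking for: they got close, and the feature still didn't register.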

None of this takes more than half a day. It either confirms a genuine gap (in which case you build), or it surfaces a navigation fix that costs two hours of work and resolves five open feature requests at once.

Discoverability Audit Steps


1. Cold product walkthrough: find the feature as a new user would
2. Empty state audit: check the first-use surface
3. Five-second test: check navigation labels
4. Session recording funnel: find the dropoff before the feature

What actually moves the needle

There are three interventions that consistently surface buried features.

Contextual activation prompts. When a user does something that implies they need a feature — say, manually copying a table cell ten times — surface a tooltip: "Did you know you can export this whole table as CSV? Here's how." This is not an onboarding tour. It's triggered by behaviour, not by first login. Behaviour-triggered prompts have dramatically higher engagement than the generic "take a tour" modal that every user dismisses immediately.
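The trigger logic behind such a prompt is small. A minimal sketch: fire the hint only after the user repeats the manual workaround several times within a short window, and at most once per user. The threshold, window, and event shape are illustrative assumptions, not a prescription:

```python
from collections import deque

class BehaviourTrigger:
    """Fires once when a workaround action repeats `threshold` times within `window_seconds`."""

    def __init__(self, threshold=10, window_seconds=60):
        self.threshold = threshold
        self.window = window_seconds
        self.timestamps = deque()
        self.fired = False  # show the prompt at most once per user

    def record(self, timestamp):
        """Record one occurrence; return True when the prompt should appear."""
        self.timestamps.append(timestamp)
        # Evict occurrences that fell outside the sliding window.
        while self.timestamps and timestamp - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        if not self.fired and len(self.timestamps) >= self.threshold:
            self.fired = True
            return True
        return False

trigger = BehaviourTrigger(threshold=3, window_seconds=60)
print([trigger.record(t) for t in (0, 5, 10, 15)])  # [False, False, True, False]
```

The one-shot flag matters as much as the threshold: a prompt that reappears on every tenth copy stops being contextual help and becomes the tour modal it was meant to replace.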

Rename the thing. Often the feature exists but the label is wrong. "Data portability" does not mean the same thing to a user as "Export to CSV". If your search analytics (Algolia, Intercom's search, even Cmd+K logs) show users typing "export" and landing on zero results — or worse, landing on an unrelated page — you have a naming problem. The fix is a label change or a search alias, not a new feature.
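Spotting the naming problem in search logs amounts to grouping queries and flagging the frequent ones that returned nothing. A rough sketch, assuming an exported log of (query, result_count) pairs; adapt the format to whatever your search tool actually emits:

```python
from collections import Counter

def zero_result_queries(search_log, min_count=2):
    """Return frequently repeated queries that never matched anything, most common first."""
    misses = Counter(
        query.strip().lower()                    # normalise so "Export" and "export" merge
        for query, result_count in search_log
        if result_count == 0
    )
    return [(q, n) for q, n in misses.most_common() if n >= min_count]

# Illustrative log: users keep searching "export", the feature is filed under another name.
log = [
    ("export", 0), ("Export", 0), ("export csv", 0),
    ("billing", 4), ("export", 0),
]
print(zero_result_queries(log))  # [('export', 3)]
```

A query that shows up repeatedly with zero results is the user telling you, verbatim, what the label should have been.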

Progressive disclosure in navigation. If your product has 40 features and you show all 40 in the sidebar, you're hiding everything by showing everything. Group features into contexts that match the user's current task. A user in a "Reports" flow should see export options surfaced in that context, not buried under "Account". Figma does this well — the properties panel shows only what's relevant to the currently selected element.

What doesn't work: adding a new entry point to the same broken feature without fixing the mental model mismatch. You'll have two invisible paths instead of one.

How to tell when it really is a missing feature

The audit above will sometimes confirm that the feature genuinely doesn't exist, or exists in a severely limited form. The signals that distinguish real gaps from discoverability failures:

  • Users describe a workflow, not a button. "I want to export to CSV so I can run pivot tables in Excel" is different from "I want an export button". The former describes a job to be done that your feature may not actually serve, even if the button is there.
  • Power users are the ones asking. If someone who has used your product for 200 hours and knows every corner of it is requesting something, it's probably not a discoverability problem.
  • The request appears in NPS verbatims from churned users, not just active users. Churned users left because something was blocking them; they're unlikely to be confused about navigation.
  • Session recordings show users reaching the feature and then abandoning it — which means they found it, but it didn't do what they expected. That's a capability gap, not a discoverability gap.

Build the feature when you see these signals. But treat the default assumption as discoverability failure until the evidence says otherwise. The cost of shipping a navigation fix when you needed a feature is low. The cost of shipping a duplicate feature when you needed better labels is compounding technical and UX debt that will outlive your current roadmap.

The metric you're probably not tracking

Most products track feature adoption as a binary: used / not used. Almost none track feature awareness — whether users know the feature exists before they need it.

The gap between awareness and adoption tells you whether you have a discoverability problem or a value problem. If awareness is high and adoption is low, the feature isn't useful enough (or it's broken). If awareness is low and adoption is low, you have no idea what you actually have — you may have a great feature that nobody has ever had the chance to evaluate.

You can approximate awareness through periodic in-product surveys ("Which of these features have you tried?"), by tracking first-use rates relative to account age, or by watching how users respond to contextual prompts. None of this is hard to instrument.
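The awareness/adoption split above can be made concrete with a small classifier over per-feature counts. A sketch under loose assumptions: awareness comes from survey answers or prompt responses, adoption from usage events, and the 50% threshold is made up for illustration:

```python
def diagnose(aware_users, adopted_users, total_users, threshold=0.5):
    """Classify a feature by its awareness and adoption rates (hypothetical thresholds)."""
    awareness = aware_users / total_users
    adoption = adopted_users / total_users
    if awareness >= threshold and adoption < threshold:
        label = "value problem: users know it exists but don't use it"
    elif awareness < threshold and adoption < threshold:
        label = "discoverability problem: users never got the chance to evaluate it"
    else:
        label = "healthy: awareness and adoption are both high"
    return awareness, adoption, label

print(diagnose(aware_users=80, adopted_users=12, total_users=100))
print(diagnose(aware_users=20, adopted_users=10, total_users=100))
```

The first feature needs a better feature; the second needs better navigation. Without the awareness number, both look identical in a plain adoption dashboard.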

The reason this matters beyond individual features: a product full of invisible capabilities is not a complete product in any meaningful sense. Your documentation can list every feature correctly, your changelog can be exhaustive, and your onboarding can mention everything once — and users will still only use what they stumble into or are pointed to at exactly the right moment.

Build for discoverability first. Then build for functionality. The customers asking for features they already have aren't confused — they're telling you that the features you shipped haven't landed yet.
