Every company I've ever worked with says they listen to their customers.

They have the surveys. They have the NPS scores. They have dashboards full of sentiment data and customer verbatims and trend lines going back years. They can tell you exactly what their satisfaction score was last quarter, broken out by segment.

And most of them are still making decisions based on an incomplete picture of what their customers actually experience.

Because listening and hearing aren't the same thing. Listening is collecting the signal. Hearing is doing the harder, slower, more uncomfortable work of understanding what the signal actually means — and then being willing to act on it, even when the answer isn't what you were hoping for.

I've learned this distinction through several projects over the past few years, and each one taught me something different about what "hearing" the customer really requires.


The Customers You Should Be Talking To Are the Ones Who Left

Early in one project, our team was focused on understanding why certain guests weren't coming back after a first visit. The standard approach would have been to look at the data we already had — satisfaction surveys from current customers, feedback from the people who were showing up regularly.

But here's the problem with that: if you only listen to the people who stayed, you're hearing a self-selected sample. The customers who left — the ones who tried you once and decided not to return — are carrying the most important information. And they're the hardest to reach.

So we made a deliberate choice to go find them. Working with our customer insights team, we designed outreach specifically for guests who had visited once and not come back. We wanted to understand what brought them in, and more importantly, what made them decide not to return.

What we heard changed the direction of the project.

The gap these lapsed guests described wasn't what we expected. It wasn't about the core service quality — the technical skill was generally fine. The gap was about the overall experience. The consultation. The communication. The feeling of being cared for, not just served. They were telling us that the experience around the service mattered as much as the service itself.

That's a distinction that doesn't show up in a satisfaction survey. Surveys ask, "Were you satisfied with your service?" The answer can be yes — and the guest can still never come back. Because satisfaction with the service and satisfaction with the experience are two different things. And most organizations measure the first and assume it covers the second.


Feedback Without Context Is Just Noise

Around the same time, I was involved in building an enterprise-level view of guest feedback, pulling NPS scores and feedback themes across all guest-facing programs so that the full picture was visible in one place.

The data collection itself was straightforward. What became clear very quickly was that guest feedback alone was only half the story.

Guests can describe their symptoms. They can tell you what felt good and what felt wrong. But they can't diagnose the cause. Only the teams running each program know what operational changes, staffing decisions, or process shifts might be driving what guests are experiencing. A guest can tell you "the experience felt rushed." They can't tell you that's because a scheduling change compressed appointment windows, or that a staffing gap meant fewer associates on the floor that week.

So I built a bridge. I identified a point of contact in each program area and set up a regular rhythm of conversations — not formal meetings, just a reliable cadence of "what's happening on your end that might explain what we're seeing in the feedback?" Then I layered that business context onto the guest feedback, so the report going to decision-makers told a complete story: here's what guests are saying, and here's what the business is doing that connects to it.

The difference was immediate. Before, a drop in satisfaction was a status update — "NPS is down." After, it became a strategic conversation — "NPS dropped because this specific change affected this part of the experience, and here's what the team is doing about it."

That's the difference between listening and hearing. Listening collects the guest's voice. Hearing pairs it with the operational reality that explains it. One tells you what. The other tells you why — and why is what you actually need to make good decisions.


Go Be the Customer

One of the most valuable things I did during that first project, the one focused on why guests weren't returning, wasn't analytical at all. It was experiential.

I booked appointments at competitor locations. Not as a strategist doing competitive research — as a customer. I walked in, experienced the consultation, sat through the service, paid, and left. I noticed things I never would have caught from data: how the greeting felt, whether the consultation made me feel heard, how the physical space shaped the experience, what the checkout felt like.

Then I did the same at our own locations. And the gap between what I experienced as a guest and what I understood as a strategist was humbling.

Numbers had told me what was happening. Being the customer told me why. The in-person experience changed what I noticed when I went back to the data. Patterns that had been flat on a spreadsheet suddenly had texture and meaning.

I think every operations and strategy professional should do this regularly — not as a mystery-shop exercise with a clipboard, but as a genuine attempt to feel what your customer feels. It's uncomfortable, because you notice things you wish you hadn't. But that discomfort is the point. That's hearing.


When Hearing Means Accepting What You Don't Want to Hear

There's one more dimension to this that I think is the hardest. Sometimes you do the listening, you do the work of hearing, and what you hear isn't what you were hoping for.

In one project, we ran a portfolio of tests — structured experiments to understand which service offerings worked and which didn't. We built a scorecard that measured each test across three lenses: the financial impact, the guest impact, and the associate impact. Not just "did it make money?" but "did guests respond to it?" and "was it operationally sustainable for the people delivering it?"

When we reviewed the results, the honest answer was that most of the tests needed to stop. Not pivot. Stop.

That's a hard outcome to accept. There's always pressure to keep going — to run another quarter, to expand the sample, to give it more time. But the scorecard was clear. The signals were honest. And the team had the discipline to act on what the data was actually saying rather than what they hoped it would eventually say.

That discipline — the willingness to hear the answer and act on it — is what separates organizations that learn from organizations that just collect information. It's easy to listen when the data confirms your strategy. Hearing means staying with it when the data says you were wrong.


What Hearing Actually Requires

If I had to distill what I've learned into a few principles, it would be this:

Talk to the people who left, not just the people who stayed. Your happiest customers will tell you what you're doing right. Your lapsed customers will tell you what actually needs to change. Both are valuable. Most organizations only do the first because it's easier.

Pair the customer's voice with the business context. Feedback in isolation is a symptom report. Feedback combined with operational reality is a diagnosis. Build the bridge between what customers are saying and what the business is doing — that's where actionable insight lives.

Experience your own product as a customer. Not as a reviewer. Not as an auditor. As a person trying to get their needs met. The gap between what you design and what you experience is where your biggest opportunities hide.

Be willing to act on what you hear, even when it's uncomfortable. Hearing sometimes means stopping something you invested in. It sometimes means admitting a hypothesis was wrong. The organizations that grow are the ones that let honest signals drive decisions, not momentum or hope.


Listening Is the Start. Hearing Is the Work.

Every company has listening infrastructure — surveys, feedback tools, dashboards. That infrastructure matters. But it's just the start.

Hearing is what happens next. It's the lapsed-customer interview. It's the bridge between guest data and business context. It's the competitor visit where you let yourself feel what your customer feels. It's the scorecard review where you act on what the data says instead of what you wish it said.

Hearing is slower. It's harder. It requires more cross-functional collaboration, more humility, and more willingness to sit with uncomfortable findings.

But it's the only way to actually understand your customer. And in my experience, it's the only way to build something that lasts.