Brand Experience Equals Customer Retention

Customers tend to abandon a product following just one bad experience. Of course, you’re working to avoid this. You want customers to delight in your service, embrace your brand, tell every friend that THIS is the business they champion. But how do you achieve that level of loyalty? How do you know what constitutes a "good" and "bad" experience?

If the answers were obvious, every company would be flying high. The challenge is to see your brand through your customers’ perspective, understand their experiences inside and out, and gather accurate information that points to your business’s weaknesses and strengths. From there, it’s much clearer what to alter, hone, and improve.

Why Your Brand Experience Surveys Continue to Fail You

The challenge for marketers is extracting genuinely useful insights from typical customer data. Take a close look at the design of an average customer experience survey and the reason becomes clear. How useful is it to know that 60% of respondents are "somewhat likely" to recommend your product or services while only 30% are "highly likely"? You have no idea how "highly likely" relates to actual experiences, and the problem becomes even more pressing when you try to understand what generates a "highly unlikely" response.

With typical customer experience survey questions, these details are nearly impossible to pinpoint. Humans are wired to "game" survey questions when they can tell what kind of answer you’re looking for. They answer out of an impulse to reward, to punish, or to express their indifference. A survey does not invite them to share an experience, but rather an abstracted opinion about the sum of their experiences.

Active Sensemaking Instruments Provide Actionable Insight

The faulty design of brand experience surveys is why so many professionals report that they lack the information they need about their customers. So what exactly is active sensemaking, and what makes it different?

Unlike surveys, an active sensemaking instrument prompts respondents to supply personal stories or anecdotes related to their experience of a brand or their journey as a customer. The prompting questions that invite these stories are carefully neutral and ambiguous. After sharing their story, a respondent answers a small set of interpretive questions through which they indicate the meaning of their experience. Taken together, the collected stories and their interpretations reveal patterns that provide invaluable insight into what customers are actually experiencing. Research, brand design, and customer experience professionals can then meaningfully explore those patterns, discerning concrete opportunities for improvement, because the data is anchored in specific contexts rather than numerically abstracted generalities.

For example, prompting questions might look like these:

  • Share a specific situation or moment at work that gives you hope or concern for the future of...
  • Describe a specific event or activity with (the present adoption) that inspired or bothered you.
  • Tell a story about work that you would share with a close friend.

Revamp Your Methods for Gathering Brand Experience Data

Active sensemaking is a game-changer that can radically deepen your understanding of how your customers interact and resonate with your brand. To learn more about this fresh approach and the opportunities it presents, subscribe to receive a free customizable template that will help you apply active sensemaking to the brands you currently oversee. Get ready for some exciting and unexpected lightbulb moments!

A Case Study in Active Sensemaking

A public service agency was responsible for delivering a multi-faceted set of essential services to a broad constituency, including a variety of vulnerable populations who are likewise served by multiple stakeholder groups, both within and outside the agency. After a government audit revealed a range of shortcomings and service delivery deficiencies, the agency initiated a deep and thorough review of every aspect of its work. Given the diversity of stakeholders and stakeholder interests, and given the essential nature of the services delivered, the agency wanted to ensure that every stakeholder had a meaningful opportunity to add their voice to the review, contributing their perspectives and experiences. This led them to use active sensemaking as a core element of their research methodology, with Spryng as the software platform for the initiative.

Through a series of workshops, a broadly representative stakeholder group of about 25 individuals helped develop a Spryng sensor instrument relevant to the issues at hand. The sensor was then ‘released’ into the broader population, inviting people to share personal stories about their experiences with the agency and its services, and to interpret each story via answers to a carefully crafted set of questions. At the end of the collection period, the stories, each with its contributor’s interpretation, were analyzed for patterns of interpretation and shared themes to be further explored. Then, through another series of workshops, a second group of representative stakeholders dove into that data (the stories and the patterns of interpretation), making sense of the patterns and teasing out what they concluded were important insights and priorities for action, based on the qualitative and quantitative data from respondents.

These findings were captured in a formal review document that provided the agency with insights grounded in the contexts and experiences of its clients and stakeholders. Moreover, the agency’s stakeholder communities felt valued and respected as their voices informed the recommendations for moving forward.