From a simple survey to €40m new revenue in 8 weeks

4 min read

The Power of Data-driven Strategic Design.

We’ve been working with Canyon for two years, focused on customer experience design, digital products & services, and strategic support. Earlier this year Canyon set us a broad strategic design challenge: help them identify latent value in the business. In this post I’ll explore the journey that challenge took the team on, and how it gave us a much deeper and more nuanced understanding of data’s impact on the strategic design process.

“This approach is going to fundamentally change how we do business forever” — Canyon CMO

1. Move Fast & Find Things

To gain insight quickly, we launched a Hotjar survey asking customers in the UK and Germany a single question:

‘How can we improve your web experience?’

For reasons that I hope are obvious we can’t show all of the customer feedback; however, you can clearly see the second highest area of improvement was ‘LiveChat Issues’ with 14% of the vote. It’s interesting to note that customers added ‘issues’ of their own accord, which heightened our interest.

This was enough to indicate a potential opportunity and need to dig deeper.

2. 4x + 2x = ?

To quickly gauge the potential value, Keir, our data analyst, delved into the analytics to determine the impact LiveChat has on the customer experience and global sales.

“To say the least the findings are stunning.”

It turned out that users who engaged with LiveChat converted 4x better and spent on average 2x as much as non-chat users. Combined with a sizeable revenue contribution, the tool’s strategic significance for users, and thus for Canyon’s Customer Experience and Service propositions, was evident.
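A back-of-the-envelope way to derive multipliers like these from a session-level analytics export might look like this. Everything here is illustrative: the field names, the toy data, and the figures are made up to reproduce the 4x/2x pattern, not Canyon’s real schema or numbers.

```python
# Illustrative sketch: comparing chat vs non-chat users from a
# session-level export. All data below is invented toy data.
sessions = (
    [{"used_chat": True,  "order_value": 3000}]      # chat user who bought
    + [{"used_chat": True,  "order_value": 0}]       # chat user who didn't
    + [{"used_chat": False, "order_value": 1500}]    # non-chat buyer
    + [{"used_chat": False, "order_value": 0}] * 7   # non-chat browsers
)

def segment_stats(sessions, used_chat):
    """Conversion rate and average order value for one segment."""
    seg = [s for s in sessions if s["used_chat"] == used_chat]
    orders = [s["order_value"] for s in seg if s["order_value"] > 0]
    conversion_rate = len(orders) / len(seg)
    avg_order_value = sum(orders) / len(orders) if orders else 0
    return conversion_rate, avg_order_value

chat_cr, chat_aov = segment_stats(sessions, used_chat=True)
other_cr, other_aov = segment_stats(sessions, used_chat=False)
print(f"conversion lift: {chat_cr / other_cr:.1f}x")   # → 4.0x on this toy data
print(f"spend lift:      {chat_aov / other_aov:.1f}x") # → 2.0x on this toy data
```

The real analysis obviously involved far more cleaning and segmentation, but the core comparison is just this: split sessions by chat engagement, then compare conversion rate and average order value.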

3. Bridging the void

If LiveChat was so effective, why were customers reporting issues? What was going on here? What and where were the gaps, and, more importantly, what was the opportunity?

To try to answer some of these questions we looked at a huge number of datasets; one highlight centred on chat requests vs. conversations had. We discovered that only 16% of people who requested a chat were successful. From this point we were able to create a very early business case, but we needed a series of hypotheses and tests to validate our assumptions (or guesses, which is all they really are at this stage) in the real world.
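The fulfilment gap itself is a single ratio. With illustrative volumes (the real figures aren’t shareable), the calculation is simply:

```python
# Illustrative numbers only; the real volumes can't be shared.
chat_requests = 50_000       # visitors who asked for a chat
conversations_had = 8_000    # requests that became an actual conversation

fulfilment_rate = conversations_had / chat_requests
missed = chat_requests - conversations_had
print(f"{fulfilment_rate:.0%} of requests became conversations; "
      f"{missed:,} potential conversations were missed")
# → 16% of requests became conversations; 42,000 potential conversations were missed
```

Framing the gap as missed conversations, rather than a percentage, is what made the early business case tangible: each missed conversation carries the 4x/2x value identified earlier.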

4. Deep Dive and Hypothesis Statements

To understand how we might increase conversations, we first needed to dive deep into every aspect of the LiveChat data to develop a picture of what tests we might run. We looked at everything from page engagement, customer feedback, conversation keywords, and open rates, and cross-matched it with data from Google Analytics, sales data, trends, and conversation data.

An additional highlight was sessions vs. conversion rate mapped against hours of the day across all markets. We noticed (and you can see in the drawing below) three key things:

  1. A slight drop in conversion vs. sessions between 12:00 and 13:00
  2. While sessions increased significantly from 16:00 to 22:00, conversion rate actually decreased
  3. Finally, we saw in the U.K. that 25% of visits happened on the weekend, when chat wasn’t available

These three insights formed the basis of the hypotheses for a series of tests, executed in collaboration with the U.K. service team alone, over a period of one week.
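Patterns like these surface once you bucket sessions and orders by hour before comparing. A minimal sketch, assuming the raw export yields an hour-of-day and a converted flag per session (both names are hypothetical):

```python
from collections import defaultdict

def hourly_conversion(events):
    """events: iterable of (hour, converted) pairs from an analytics export.
    Returns {hour: (session_count, conversion_rate)}."""
    sessions = defaultdict(int)
    orders = defaultdict(int)
    for hour, converted in events:
        sessions[hour] += 1
        if converted:
            orders[hour] += 1
    return {h: (sessions[h], orders[h] / sessions[h]) for h in sessions}

# Toy data: evening traffic is heavier but converts worse, the same
# shape as the second insight above.
events = [(10, True), (10, False), (19, False), (19, False),
          (19, True), (19, False), (19, False), (19, False)]
for hour, (n, cr) in sorted(hourly_conversion(events).items()):
    print(f"{hour:02d}:00  sessions={n}  conversion={cr:.0%}")
```

The weekend gap falls out of the same approach with a weekday/weekend key instead of an hour key.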

5. Tests & Results

At this stage, we’d considered a range of tests. We could have improved aspects of the customer experience (the chat opens slowly, isn’t positioned well on the page, vanishes after scrolling, and has too many steps), or even implemented new technology like Intercom, Zendesk, or another tool, but that would have taken time in design, development, testing, legal, procurement, etc. We were looking for something we could do quickly to validate our hypotheses before making any investment.

The one aspect we could change quickly was people, so we created three simple tests:

  • More support 10:00–14:00
  • Evening hours 18:00–21:00
  • Weekend 10:00–14:00

We can’t share the specific numbers behind the results but in summary, we had a:

  • 160% increase in opens
  • 57% increase in conversations
  • 56% decrease in abandoned carts
  • 15% decrease in emails sent to the service team
  • 47:1 ROI
  • Improved customer feedback (3% reduction in issues reported)

Based on data from a control week (prior to the test) and the week following the test, we were able to determine the ‘actual loss’ of not supporting chat during the suggested periods and the ‘total loss’ (actual loss + opportunity cost). When these were added together and modelled across all markets (with similar parameters & patterns), we found €20–40m in ‘validated’ additional revenue opportunity — and it all started with a Hotjar survey!

And this was based on a single channel, tested in one country, and from just one of three possible revenue-generating tests.
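The projection arithmetic behind a model like this is simple: estimate the loss per unsupported period in the test market, then scale by comparable traffic in the other markets. A heavily simplified sketch with entirely invented inputs (the real model used control-week and test-week data that can’t be shared):

```python
# All inputs are invented for illustration; none are Canyon's figures.
actual_loss_uk = 2_000_000       # € revenue lost in unsupported periods, annualised
opportunity_cost_uk = 1_500_000  # € value of abandoned carts, churned customers, etc.
total_loss_uk = actual_loss_uk + opportunity_cost_uk

# Scale to other markets with similar traffic patterns, weighted by
# relative market size (weights are hypothetical).
market_weights = {"UK": 1.0, "DE": 2.5, "US": 3.0, "rest": 2.0}
projected = total_loss_uk * sum(market_weights.values())
print(f"projected opportunity: €{projected / 1e6:.0f}m")
# → projected opportunity: €30m
```

The wide €20–40m range in the result reflects exactly this kind of modelling: the per-market weights and opportunity-cost assumptions move the total, so it’s quoted as a band rather than a point estimate.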

6. Strategy

It’s easy at this point to think that the long-term value here is in the revenue, but it isn’t. We’re only sharing a small part of what we learned. The real strategic value was in helping Canyon understand deeper customer needs, how to organise teams globally, what type of technology they needed, how to train staff, allocate them to customers, optimise workflows with chatbots, automate service vs. sales requests, and the value of service in a direct-to-consumer business with limited physical locations… I’ll write more about this in another article soon.

If you’d like to know more about our strategic design process and what it could do for your business drop me an email: john@common-good.co

Thanks for reading.

John, Co-founder & Strategy Director