How to fail well with CRO (conversion rate optimization)

  • Belle
  • August 7, 2018

Picture this. You’ve built a website. It’s a pretty good site. It’s easy to navigate, explains your product offering well and is nice to look at. It’s performing alright. You’re getting good traffic and a fair percentage of your users convert. You’ve implemented things that are considered best practices in the conversion rate optimization world: you have prominent calls to action, your button colors contrast against the background, and the copy is clear.

But you sense there’s still untapped potential. There’s room for improvement. You want to see your conversions lift.

This is where learning to fail well becomes important. To see the conversion lifts you’re looking for, you have to embrace taking risks. If you only test small changes you’re pretty sure will improve your site, you miss out on the opportunity to test bigger changes that could be really impactful. Bigger risks often lead to bigger lifts.


The first step in failing well: take risks

You’ve probably heard the quote “In order to succeed spectacularly, you have to be willing to fail spectacularly,” by Biz Stone, a co-founder of Twitter. At Brain Bytes Creative, we live by this mantra: fail spectacularly. We’ve built a culture that celebrates failing well — because we know it’s the only way to big success.

Conversion rate optimization is experimental by nature, and experimentation always involves an element of risk and uncertainty. Successful CRO programs are built around the concept of testing. As CRO specialists, we gather insights from UX research and marketing data, then test website changes, monitoring the results until they reach statistical significance. We obsess over the data and get excited when we see lots of green arrows. It’s exciting to help our clients succeed.

As much as we love seeing green numbers, we also celebrate losing tests. Why? Because they teach us stuff about our target audience. They help us redirect our efforts. Failed tests matter because of the marketing insights we learn from them.

This is the next key aspect of failing well: recognizing the value in failure. Because failing is learning.

Understand that failing is learning

Whether we’re A/B testing a blue call-to-action button vs. a green one, or testing shorter vs. longer copy, the most important part of a test is what we learn from it. This is why it’s vital to run tests that are designed to teach us something. How? By setting up tests properly so that a lift or dip in conversion can be attributed to a specific change, by running tests until they reach statistical significance and — this part is important — by accepting that having some failed hypotheses is part of the process. In fact, having a few fails means you’re probably testing the right hypotheses: those that are backed by data and UX / marketing best practices, but still teach you something you don’t know.
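The article doesn’t describe the tooling behind that significance check, but the idea can be sketched in a few lines. The snippet below is a minimal illustration (not Brain Bytes’ actual method) of a standard two-proportion z-test: given visitor and conversion counts for a control and a variation, it tells you whether the observed difference is statistically significant at a chosen threshold. The function name and the example numbers are hypothetical.

```python
from math import sqrt
from statistics import NormalDist


def ab_test_significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test for an A/B test.

    conv_a / n_a: conversions and visitors for the control.
    conv_b / n_b: conversions and visitors for the variation.
    Returns (p_value, significant) where significant is True when
    p_value < alpha (two-tailed test).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of "no difference"
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference between the two proportions
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value, p_value < alpha


# Hypothetical example: 20% vs. 26% conversion on 1,000 visitors each
p, significant = ab_test_significant(200, 1000, 260, 1000)
```

In practice you would also decide the sample size up front, rather than stopping the moment the p-value dips below the threshold — peeking at running tests inflates false positives, which is exactly the kind of misleading “win” that failing well is meant to avoid.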

So how do you find valuable insights from a failed hypothesis? Here’s an example of a failed test we ran here at Brain Bytes and how we learned from it.

For context, our goal was to increase conversions (appointments scheduled) for an orthopaedic practice in the Southeast. We noticed that the CTA was below the fold and users had to scroll for a while before reaching it. Our hypothesis was that moving the “Schedule an Appointment” button higher on the page would increase appointments scheduled. We A/B tested it. And it failed.

For the variation with the CTA above the fold, conversions actually decreased. Fewer users were scheduling appointments. By the time our test reached statistical significance, scheduled-appointment conversions were down 18% from the month before.

But we learned something valuable. We learned that for this product offering, customers needed to gather more information before they were ready to convert. It wasn’t only a matter of helping them quickly navigate to the “schedule appointment” button, but identifying and giving them all of the information they needed to feel comfortable before deciding to convert.

We moved the CTA back down on the page and watched conversions go back up. What was really valuable about this test was that it gave us deeper insight into our customer and their conversion journey. It shifted our focus to identifying what pieces of information they were looking for and presenting that info in a clear way, making the page easier to use and resulting in increased conversions.

So our test failed. We learned from it. What next?

Openly share CRO failures – internally and with clients

At Brain Bytes, we talk a lot about how our clients are partners. In any healthy partnership, both parties are working together toward the same goals. Nothing makes us happier than seeing our clients succeed. Because we want to support our clients and their success, we openly share failed tests with them. We share the red numbers. We share what we learned. And we share how we plan to take the insights we learned and turn them into new tests and actionable changes that will give our conversions a lift — and ultimately help our clients win.

What would you say if I told you I could help you increase clicks on a button on your website 71.2% simply by editing the copy? Or increase clicks on a link 24.93% by changing the formatting of the text? Better yet, what if I told you I could increase conversions by 46.95%, making the page 19.99% more valuable?

Looking to give your conversions a lift? Brain Bytes Creative offers specialized conversion rate optimization services using funnel analysis, user flow optimization, heatmapping, session replays, form analytics and optimization, and of course, A/B testing. Click here to read more about our services or contact us!
