GROWTH HACKER’S GUIDE TO CONVERSION RATE OPTIMIZATION: TESTING STRATEGY, BEST PRACTICES, COMMON PITFALLS

There are some important things to keep in mind when planning and running tests for Conversion Rate Optimization. As you run your tests and gather results, it’s easy to get lost in the complexity of running multiple tests at once, translating data into actionable insights, and coordinating the various parties within your company involved in planning and running the tests so that everything stays on track and nothing gets missed.

This post will address some of the best practices for running CRO tests, as well as some significant mistakes to avoid.

1.) Start Small — Once you’ve created a list of pain points and ideas for how to improve them, it’s tempting to dive in headfirst and test everything at once, possibly making large and immediate changes to your site. This is a mistake for two reasons:

  • You should keep tests separate – This isn’t to say that you can’t run more than one test at a time, but you certainly should not run two tests on the same site element at once! For example, running two simultaneous tests on the same page of your checkout funnel will leave you scratching your head as to which test caused the change in performance. (A sketch of how to keep concurrent tests independent follows this list.)

  • You don’t want to lose what’s working — Let’s say you want to test five elements on a page. You can run five different tests, one for each element in isolation, or you can run one big test with a whole new page. Even if the new page performs better than the original, you run into the same problem as with running multiple tests at once: you won’t know what made the difference. What’s more, you won’t know whether some of the elements you left behind were actually working well, and you may be throwing the baby out with the bathwater. By testing elements separately, you’ll end up with a better and more detailed understanding of what’s working and what isn’t.
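
To make “keep tests separate” concrete, here is a minimal sketch, assuming a hypothetical visitor ID scheme and hypothetical experiment names, of deterministic per-experiment bucketing: each test salts the hash with its own name, so assignments are independent across tests and each element-level result can be read on its own.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor for a single experiment.

    Salting the hash with the experiment name gives every experiment its own
    independent split, so several element-level tests can run at once without
    entangling their assignments.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical element-level tests, each isolating a single change.
visitor = "visitor-12345"
print(assign_variant(visitor, "homepage-headline-test"))
print(assign_variant(visitor, "checkout-cta-copy-test"))
```

The salt makes the splits independent, but the caveat above still applies: the two experiments should not touch the same page element.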

2.) Stay The Course

Don’t make changes to tests midstream! It can be tempting to jump into the middle of an experiment and make changes, but doing so will void the test data and waste time. If something needs to change, start a whole new test. There is a difference between tweaking a running test’s content and starting a new one; only the latter guarantees that all of the test data was collected against the same set of elements.
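
One lightweight way to enforce this in your tooling, sketched below with hypothetical fields, is to treat an experiment definition as immutable: any mid-stream change produces a brand-new experiment with its own ID, so every recorded data point belongs to exactly one fixed configuration.

```python
from dataclasses import dataclass, replace
from uuid import uuid4

@dataclass(frozen=True)
class Experiment:
    """A test definition that cannot be edited in place."""
    experiment_id: str
    page: str
    element: str
    variant_copy: str

def revise(old: Experiment, **changes) -> Experiment:
    """A mid-stream change becomes a fresh test with a new ID instead of a
    mutation of the running one, so each test's data stays internally consistent."""
    return replace(old, experiment_id=str(uuid4()), **changes)

original = Experiment(str(uuid4()), "/checkout", "cta-button", "Buy now")
# Editing the copy mid-test would void the data, so spin up a new test instead:
revised = revise(original, variant_copy="Complete your order")
assert original.experiment_id != revised.experiment_id
```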

3.) Keep To Your Testing Schedule

Don’t stop tests too soon. It’s a good idea to run all tests for the same amount of time, and for long enough to accrue the data you need to draw reliable insights. This mitigates the risk of acting on statistically insignificant results and, consequently, drawing the wrong conclusions.
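
To decide up front how long “long enough” is, one standard, tool-agnostic approach is a two-proportion sample-size calculation; the baseline conversion rate and minimum detectable lift below are hypothetical numbers.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p_baseline: float, min_relative_lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed per variant to detect a given relative lift over the
    baseline conversion rate, using the normal approximation for a
    two-proportion test."""
    p1 = p_baseline
    p2 = p_baseline * (1 + min_relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # statistical power
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical numbers: 3% baseline conversion, detect a 10% relative lift.
n = sample_size_per_variant(0.03, 0.10)
print(f"{n} visitors per variant")
```

Dividing the result by the daily traffic to the tested page gives a run length you can commit to before the test starts, which is the schedule this section asks you to keep.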

4.) Keep Track of External Variables

This means anything outside of your website that might change over the course of a test and skew your results (a simple logging sketch follows this list):

  • Advertising budgets
  • Competitive landscape
  • Seasonal performance changes
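
A simple discipline that helps here, sketched below with hypothetical events and dates, is to log external changes on the same timeline as your tests, so any surprising swing in the data can be checked against what was happening outside the site.

```python
from datetime import date

# Hypothetical log of external changes kept alongside the experiment data.
external_events = [
    {"date": date(2024, 11, 20), "note": "Paid search budget doubled"},
    {"date": date(2024, 11, 29), "note": "Black Friday promotion (seasonal spike)"},
    {"date": date(2024, 12, 2), "note": "Competitor launched free-shipping offer"},
]

def events_during(start: date, end: date) -> list:
    """External changes that overlapped a test window, for use when
    interpreting an unexpected change in conversion rate."""
    return [e for e in external_events if start <= e["date"] <= end]

for event in events_during(date(2024, 11, 25), date(2024, 12, 5)):
    print(event["date"], event["note"])
```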

5.) Keep Your Expectations Reasonable

While optimizing your site can have a huge effect on your online success, it’s important to keep your expectations within the realm of what’s reasonable for what you’re testing. For example, testing the color of the call-to-action on your homepage isn’t likely to increase conversion rates by 100%. However, removing a sign-up or sign-in requirement from your checkout funnel may have a significant impact. Calibrate expectations to what you’re actually testing.
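
When the results come in, a quick significance check on the observed lift also keeps expectations honest; the sketch below uses a standard two-proportion z-test with made-up counts rather than any particular analytics tool’s API.

```python
from math import sqrt
from statistics import NormalDist

def lift_and_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Relative lift of variant B over variant A, plus a two-sided p-value
    from a two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return (p_b - p_a) / p_a, p_value

# Hypothetical counts: removing forced sign-in from the checkout funnel.
lift, p = lift_and_p_value(conv_a=310, n_a=10_000, conv_b=362, n_b=10_000)
print(f"relative lift: {lift:.1%}, p-value: {p:.3f}")
```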

6.) Keep Testing

Once you’ve gone through the process of running a series of tests and implementing changes based on the results, it can be tempting to quit while you’re ahead. Resist the urge! There’s always an optimization waiting to be discovered; always a way to be better. You should adopt Conversion Rate Optimization as a philosophy for successful online marketing, not merely a one-time tactic to improve performance.

Conclusion

When performing Conversion Rate Optimization tests:

  • Start small by not testing too many things at once or making tests too big
  • Stay the course. Don’t make changes mid-stream
  • Stick to your testing schedule
  • Keep track of external variables
  • Have reasonable expectations
  • Always keep testing!
