Your website is an incredibly powerful conversion channel – unless no one is converting.
Many marketers have no idea if their website is an effective lead generation tool (which probably means it’s not as effective as it could be). Running marketing experiments isn’t just for data scientists. We marketers now have tools at our disposal to find out how our digital properties are performing, and what we could do to improve them.
I’ve been doing some tinkering on the SnapApp website in an effort to establish baselines for our web conversion and improve performance over time.
The SnapApp website was designed with a single, blanket call-to-action banner on almost every page:
Sure, that call to action could be effective, but we’ll never know until we test.
Lucky for me, there are great tools out there to quickly and easily run marketing experiments on our website. I use Optimizely, which offers a great entry-level A/B testing tool that is free to use and integrates easily with our website. Once we installed a short snippet of code, we could start testing elements all over our site.
Today, I want to walk you through how I ran some of my first conversion optimization tests – and got results well beyond my expectations.
The first step in setting up a conversion optimization test is kind of obvious: what am I going to test?
Think Traffic recommends designing your hypothesis around your key performance indicators.
“When a visitor arrives at your site, what do you want them to do?” - Think Traffic
For us, we want to urge website visitors to find out everything the SnapApp platform is capable of, and how interactive content will help them meet their marketing goals. That means signing up for a live demo. To that end, we have demo request CTAs all over our website – but we still weren’t receiving very many inbound demo requests.
Since this was our first foray into conversion rate optimization on our site, there was a lot of low-hanging fruit.
I decided to take that low-hanging fruit – the blanket CTA all over our site – and see if I could make it more compelling for our visitors. The first page I tackled was our interactive infographics information page.
To refresh your memory, here’s the blanket CTA on that page:
I hypothesized that including language more directly relevant to the content on the page, and (ideally) therefore to the interests and goals of my reader, would encourage more conversions. But we’ll never know unless we try it out!
In order to reach statistical significance, you’ll need to pass a certain number of people through your test page. Depending on your web traffic, more variations could mean your test will take forever to pick a winner.
Let’s say you have 1,000 visitors to your page per month. If you set up 5 variations, you’ll only send 200 people per month to each variation. Unless one variation dramatically out-performs the rest, your experimental timeline may need to be rather long to show a significant difference between the outcomes.
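To make that math concrete, here’s a quick back-of-envelope sketch in Python. The 1,000-visitor traffic figure mirrors the example above; the per-variation sample-size target is a hypothetical rule of thumb I’ve chosen for illustration, not a number from any specific tool.

```python
# Rough estimate of how long a test takes, given monthly traffic,
# the number of variations, and a target sample size per variation.
monthly_visitors = 1000
variations = 5                # original plus four alternatives
target_per_variation = 1000   # assumed sample-size goal (hypothetical)

visitors_per_variation = monthly_visitors / variations
months_to_finish = target_per_variation / visitors_per_variation

print(f"{visitors_per_variation:.0f} visitors per variation per month")
print(f"~{months_to_finish:.0f} months to reach the target")
```

With these numbers, each variation sees only 200 visitors a month, so the test needs about five months – which is why fewer variations usually means a faster verdict.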
For this experiment, I knew this page received only a little bit of traffic per month, so I started small: two variations on the original.
Here’s variation one:
And here’s variation two:
First things first, a confession: In this experiment, I broke a cardinal rule of multivariate testing (Shame! Shame!). I changed more than one variable at a time.
While I’m tempted to say that error voids this test’s validity, I found the results compelling – especially given that I kept the CTA language consistent for both variations. So bear with me.
For the first variation, I changed the CTA language, making it more action-oriented, but kept the button copy consistent. In the second variation, I used the active CTA language paired with even more active button copy.
CRO. Exciting stuff.
As I mentioned earlier, I hypothesized that a more effective call to action would be tailored to the content on that specific page and encourage action.
I did some research into effective conversion copy and found this great piece from Neil Patel on what makes CTA buttons really clickable. One of the tactics Neil recommends is “making the button the next obvious action,” an approach I adopted here.
The CTA Language
To get visitors excited enough to convert on this page, I wanted to make the CTA language an obvious next step. “Ready to see what you could do with interactive content?” is fine, but not quite specific enough in terms of the value the visitor will see or the action they should take.
Given that this page is all about what interactive infographics are and how they work, I hypothesized that an engaged visitor might be interested in actually making such a piece of content.
“Ready to build an interactive infographic?” suggests the next step will enable the visitor to build this piece of content themselves, supporting the SnapApp value prop that interactive content is quick and easy to create in our platform.
The Button Copy
I’d also read that the most compelling CTAs offer value, rather than making demands. Our blanket CTA of “Request a Demo” wasn’t necessarily demanding something, but it wasn’t expressing a clear value or step forward for our prospect, either.
Here’s what Wishpond had to say:
I thought “Get Started” or “Let’s Go” sounded more active and value-oriented than “Request a Demo.” So I tested it to see if it would work.
If you’re using the right tool, you can turn your test on and let it run until the software finds a winner. There’s some complicated math you could do if you don’t use software, but I won’t get into that here. Optimizely conveniently calculates statistical significance for you in the Results dashboard and declares a winner.
It took only 435 visitors for this test to yield results, at a greater than 99% confidence level. That’s relatively few visitors, but Variation #2 outperformed the original and Variation #1 so resoundingly that statistical significance was reached quickly. According to this test, Variation #2’s CTA converted 22.7% of visitors, compared to the original’s conversion rate of just 2.07%. Safe to say we’ve seen an improvement. *Pops champagne*
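If you’re curious what the software is doing under the hood, the significance check boils down to a two-proportion z-test. Here’s a minimal Python sketch; note that the per-group visitor counts are my assumption (the post reports only the 435 total visitors and the two conversion rates), so treat the split as illustrative.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Test whether variation B's conversion rate differs from A's."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    # Pooled rate under the null hypothesis of "no real difference"
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split of the 435 visitors: 3/145 conversions (~2.07%)
# for the original vs. 33/145 (~22.8%) for Variation #2.
z, p_value = two_proportion_z_test(3, 145, 33, 145)
print(f"z = {z:.2f}, p = {p_value:.2g}")
```

A p-value below 0.01 corresponds to the “greater than 99% confidence” the dashboard reported; with numbers this lopsided, the test clears that bar easily.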
Want to find out if your own test was statistically significant? Use this free significance calculator!
One pitfall in multivariate testing is the risk that results achieved in one time period won’t hold over time. I kept this test running, displaying the winning variation to all visitors, and am happy to report the conversion rate has improved even more since February.
The above screenshot was taken on February 5; the below is from June 13.
Hooray! Still working its magic.
The SnapApp website has a page dedicated to each of our content types; the “interactive infographics” page is just one of ten such pages. Based on the outcome here, I hypothesized that using action-oriented language on the other content type pages would improve performance there, as well.
But I didn’t just dive in and change all of them at once: I ran tests on each page individually to determine what would work in each case.
For calculators, the winning variation was “Ready to build a calculator? [[Let’s Go]]”, which beat the original by 1,441%. For interactive video, “Ready to see what interactive video could do for you? [[Let’s Go]]” won out, beating the original by 229%.
What did I learn? Active calls to action are incredibly powerful. These CTAs, urging visitors to “get started” or inviting them with “let’s go,” seem to really resonate with our audience and motivate them to click.
Tackling the low-hanging fruit is a great way to get your feet wet with CRO. You’ll likely see some big wins quickly, which can motivate you to take on more granular changes throughout the rest of your site.
Make your CTA specific to your user’s goal
Test only one element at a time
Use active language
Be patient and wait for statistical significance
Analyze your results and take action
Find out if your tests reach statistical significance with this handy CRO calculator, by SnapApp and Leadin by HubSpot!