Monday, 18 June 2018

See How Dynamic Text on a Landing Page Helped Increase Conversions by 31.4% [A/B Test Reveal]


Pictured above: Rolf Inge Holden (Finge), founder of ConversionLab.

Whether your best ideas come to you in the shower, at the gym, or have you bolting awake in the middle of the night, sometimes you want to quickly A/B test to see if a given idea will help you hit your marketing targets.

This urge to split test is real for many Unbounce customers, including Norway-based digital agency ConversionLab, which works with client Campaign Monitor.

Typically this agency’s founder, Rolf Inge Holden (Finge), delivers awesome results with high-performing landing pages and popups for major brands. But recently his agency tried an experiment we wanted to share because of the potential it could have for your paid search campaigns, too.

The Test Hypothesis

If you haven’t already heard of San Francisco-based Campaign Monitor, they make it easy to create, send, and optimize email marketing campaigns. Tasked with running especially effective PPC landing pages for the brand, Finge had a hypothesis:

If we match copy on a landing page dynamically with the exact verb used as a keyword in someone’s original search query, we imagine we’ll achieve higher perceived relevance for a visitor and (thereby) a greater chance of conversion.

In other words, the agency wondered whether the precise verb someone uses in their Google search affects how they perceive doing something with a product, and whether seeing that exact same verb on the landing page would increase conversions.

In the case of email marketing, for example, if a prospect typed “design on-brand emails” into Google, ‘design’ is the exact verb they’d see in the headline and CTAs on the resulting landing page (vs. ‘build’, ‘create’, or another alternative). For relevance, the agency wanted to carry the exact verb through no matter what the prospect typed into the search bar; apart from the verb, the rest of the headline would stay the same.

The question is, would a dynamic copy swap actually increase conversions?

Setting up a valid test

To run this test properly, ConversionLab had to consider a few table-stakes factors: namely, the required sample size and test duration (to know whether the results they’d achieve were statistically significant).

In terms of sample size, the agency confirmed the brand could drive the traffic needed to the landing page variations to ensure a meaningful test: combined traffic to variants A and B was 1,274 visitors. In terms of duration, they would run the variants for a full 77 days to let the data properly cook.

To determine the amount of traffic and duration you need for your own tests to be statistically significant, check out this A/B test duration calculator.
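
If you want to gut-check the math behind a calculator like that, the standard approach is a two-proportion sample-size formula. Below is a simplified sketch (fixed at 95% confidence and 80% power), not the linked calculator's exact method; the example inputs are placeholders:

```typescript
// Rough per-variant sample size for a two-variant A/B test, using the
// standard two-proportion z-test formula. Simplified sketch only.
function requiredSampleSizePerVariant(
  baselineRate: number, // e.g. 0.10 for a 10% conversion rate
  relativeLift: number, // e.g. 0.30 to reliably detect a 30% relative lift
  zAlpha = 1.96, // two-sided 95% confidence
  zBeta = 0.84 // 80% power
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator / (p2 - p1)) ** 2);
}

// Example: detecting a 30% relative lift on a 10% baseline rate
// takes roughly 1,770 visitors per variant.
console.log(requiredSampleSizePerVariant(0.1, 0.3));
```

One takeaway from the formula: the higher your baseline conversion rate, the fewer visitors you need, which is how a test can reach significance on modest traffic if each variant racks up plenty of conversions.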

Next, it was time to determine how the experiment would play out on the landing page. To accomplish the dynamic aspect of the idea, the agency used Unbounce’s Dynamic Text Replacement feature on Campaign Monitor’s landing page. DTR helps you swap out the text on your landing page with whatever keyword a prospect actually used in their search.
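
In Unbounce, DTR is configured in the page builder rather than hand-coded, but the underlying mechanic is straightforward: the ad's destination URL carries the matched keyword as a query parameter, and the page swaps it into the copy on load. Here's a minimal, hypothetical sketch of that mechanic; the `verb` parameter and `data-dtr` attribute are illustrative, not Unbounce's actual markup:

```typescript
// Hypothetical sketch of a DTR-style verb swap (not Unbounce's code).
// Assumes the ad's destination URL looks like:
//   https://example.com/lp?verb=design
const ALLOWED_VERBS = new Set(["design", "build", "create", "send"]);
const DEFAULT_VERB = "create"; // fallback when no parameter comes through

function resolveVerb(href: string): string {
  const param = new URL(href).searchParams.get("verb")?.toLowerCase();
  // Accept only whitelisted verbs so arbitrary query strings
  // can't inject unexpected copy into the headline.
  return param !== undefined && ALLOWED_VERBS.has(param) ? param : DEFAULT_VERB;
}

// Swap the verb into every element marked for replacement, e.g.
// <span data-dtr="verb">create</span> in the headline and CTA button.
document.querySelectorAll<HTMLElement>('[data-dtr="verb"]').forEach((el) => {
  el.textContent = resolveVerb(window.location.href);
});
```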

Below you can see a few samples of what the variants could have looked like once the keywords from search were pulled in (“create” was the default verb if a parameter couldn’t be pulled in):

[Images: sample landing page variations with the dynamically inserted verb]

What were the results?

When the test concluded after 77 days (Oct 31, 2017 to Jan 16, 2018), Campaign Monitor saw a 31.4% lift in conversions using the variant in which the verb changed dynamically. In this case, a conversion was a signup for a trial of their software, and the test achieved 100% statistical significance with more than 100 conversions per variant.

The variant that used DTR to send prospects through to signup lifted conversions to trial by 31.4%.
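
For the statistically curious, a result like this can be sanity-checked with a two-proportion z-test. The per-variant counts below are illustrative guesses consistent with the published totals (1,274 visitors, 100+ conversions per variant, a 31.4% lift); they are not Campaign Monitor's actual numbers:

```typescript
// Two-proportion z-test for a finished A/B test.
function zTestLift(
  visitorsA: number, conversionsA: number,
  visitorsB: number, conversionsB: number
): { lift: number; z: number } {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  const pPool = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  return { lift: pB / pA - 1, z: (pB - pA) / se };
}

// Hypothetical counts in the ballpark of the published totals:
const result = zTestLift(637, 105, 637, 138);
console.log(result); // lift ≈ 0.314, z ≈ 2.35; |z| > 1.96 means p < 0.05
```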

What these A/B test results mean

In the case of this campaign, the landing page variations (samples shown above) prompt visitors to click through to a second page where they start their trial of Campaign Monitor. The tracked conversion goal (measured outside of Unbounce reporting) was signups on this second page after clicking through from the landing page.

This experiment ultimately helped Campaign Monitor understand that the verb someone uses in search can indeed help increase signups.

The results of this test tell us that when a brand mirrors the initial search query as precisely as possible from ad to landing page, visitors recognize the page as relevant to their needs and are thereby more primed to click through to the next phase of the journey and, ultimately, convert.

Message match for the win!

Here’s Finge on the impact the test had on the future of their agency’s approach:

“Our hypothesis was that a verb defines HOW you solve a challenge; i.e. do you design an email campaign or do you create it? And if we could meet the visitor’s definition of solving their problem we would have a greater chance of converting a visit to a signup. The uplift was higher than we had anticipated! When you consider that this relevance also improves Quality Score in AdWords due to closer message match, it’s fair to say that we will be using DTR in every possible way forwards.”

Interested in A/B testing your own campaigns?

Whether you work at a SaaS company like Campaign Monitor or have a product people could query with any of several verbs, swapping out the copy in your headlines could be an A/B test worth trying for yourself.

Using the same type of hypothesis format we shared above, along with the A/B testing calculator (for determining your duration and sample size), you can set up variants of your landing pages to pair with your ads and see whether you can convert more.

ConversionLab’s test isn’t a catch-all or a best practice to be applied blindly to your campaigns across the board, but it could inspire you to try out Dynamic Text Replacement on your landing pages to see if carrying through search terms and intent could make a difference for you.

Unbounce

Sunday, 17 June 2018

Google Updates Ad Settings: You Can Turn Off Personalized Ads

Google announced they have updated the Ad Settings page to let you turn off personalized ads and/or see and remove the information Google has on you for personalizing those ads...

Search Engine Roundtable

When Bounce Rate, Browse Rate (PPV), and Time-on-Site Are Useful Metrics... and When They Aren't - Whiteboard Friday

Posted by randfish

When is it right to use metrics like bounce rate, pages per visit, and time on site? When are you better off ignoring them? There are endless opinions on whether these kinds of metrics are valuable or not, and as you might suspect, the answer is found in the shades of grey. Learn what Rand has to say about the great metrics debate in today's episode of Whiteboard Friday.

[Whiteboard image: when bounce rate, browse rate (PPV), and time on site are useful metrics, and when they suck]

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about times at which bounce rate, browse rate, which is pages per visit, and time on site are terrible metrics and when they're actually quite useful metrics.

This happens quite a bit. I see in the digital marketing world people talking about these metrics as though they are either dirty-scum, bottom-of-the-barrel metrics that no one should pay any attention to, or that they are these lofty, perfect metrics that are what we should be optimizing for. Neither of those is really accurate. As is often the case, the truth usually lies somewhere in between.

So, first off, some credit to Wil Reynolds, who brought this up during a discussion that I had with him at Siege Media's offices, an interview that Ross Hudgens put together with us, and Sayf Sharif from Seer Interactive, their Director of Analytics, who left an awesome comment about this discussion on the LinkedIn post of that video. We'll link to those in this Whiteboard Friday.

So Sayf and Wil were both basically arguing that these are kind of crap metrics. We don't trust them. We don't use them a lot. I think, a lot of the time, that makes sense.

Instances when these metrics aren't useful

Here"s when these metrics, that bounce rate, pages per visit, and time on site kind of suck.

1. When they"re used instead of conversion actions to represent "success"

So they suck when you use them instead of conversion actions. A conversion is when someone took an action that I wanted on my website: they filled in a form, they purchased a product, they put in their credit card. Whatever it is, they got to a page that I wanted them to get to.

Bounce rate is basically the average percent of people who landed on a page and then left your website without continuing to any other page on that site.

Pages per visit is essentially what it sounds like: the average number of pages per visit for people who landed on that particular page. So of the people who came in through one of these pages, how many pages did they visit on my site?

Then time on site is essentially a very raw and rough metric. If I leave my computer to use the restroom, or I switch to another tab or close my browser, it's not necessarily the case that time on site ends right then. So this metric has a lot of imperfections. Now, averaged over time, it can still be directionally interesting.
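
To make those definitions concrete, here's a simplified sketch of how all three metrics fall out of raw session records. Real analytics tools add nuance (Google Analytics, for instance, measures time between hits, so a single-page session records zero seconds), and the field names here are illustrative:

```typescript
// Simplified sketch: deriving bounce rate, pages per visit, and
// average time on site for one landing page from session records.
interface Session {
  landingPage: string;
  pagesViewed: number;
  durationSeconds: number; // last hit minus first hit, hence the caveat above
}

function summarizeLandingPage(sessions: Session[], landingPage: string) {
  const subset = sessions.filter((s) => s.landingPage === landingPage);
  if (subset.length === 0) return null; // no traffic, nothing to report
  const sum = (xs: number[]) => xs.reduce((a, b) => a + b, 0);
  return {
    // share of sessions that never went past the landing page
    bounceRate:
      subset.filter((s) => s.pagesViewed === 1).length / subset.length,
    pagesPerVisit: sum(subset.map((s) => s.pagesViewed)) / subset.length,
    avgTimeOnSiteSeconds:
      sum(subset.map((s) => s.durationSeconds)) / subset.length,
  };
}
```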

But when you use these instead of conversion actions, which is what we all should be optimizing for ultimately, you can definitely get into some suckage with these metrics.

2. When they"re compared against non-relevant "competitors" and other sites

When you compare them against non-relevant competitors, so when you compare, for example, a product-focused, purchase-focused site against a media-focused site, you"re going to get big differences. First off, if your pages per visit look like a media site"s pages per visit and you"re product-focused, that is crazy. Either the media site is terrible or you"re doing something absolutely amazing in terms of keeping people"s attention and energy.

Time on site is a little bit misleading in this case too, because if you look at the time on site, again, of a media property or a news-focused, content-focused site versus one that"s very e-commerce focused, you"re going to get vastly different things. Amazon probably wants your time on site to be pretty small. Dell wants your time on site to be pretty small. Get through the purchase process, find the computer you want, buy it, get out of here. If you"re taking 10 minutes to do that or 20 minutes to do that instead of 5, we"ve failed. We haven"t provided a good enough experience to get you quickly through the purchase funnel. That can certainly be the case. So there can be warring priorities inside even one of these metrics.

3. When they"re not considered over time or with traffic sources factored in

Third, you get some suckage when they are not considered over time or against the traffic sources that brought them in. For example, if someone visits a web page via a Twitter link, chances are really good, really, really good, especially on mobile, that they"re going to have a high bounce rate, a low number of pages per visit, and a low time on site. That"s just how Twitter behavior is. Facebook is quite similar.

Now, if they"ve come via a Google search, an informational Google search and they"ve clicked on an organic listing, you should see just the reverse. You should see a relatively good bounce rate. You should see a relatively good pages per visit, well, a relatively higher pages per visit, a relatively higher time on site.

Instances when these metrics are useful

1. When they"re used as diagnostics for the conversion funnel

So there"s complexity inside these metrics for sure. What we should be using them for, when these metrics are truly useful is when they are used as a diagnostic. So when you look at a conversion funnel and you see, okay, our conversion funnel looks like this, people come in through the homepage or through our blog or news sections, they eventually, we hope, make it to our product page, our pricing page, and our conversion page.

We have these metrics for all of these. When we make changes to some of these, significant changes, minor changes, we don't just look at how conversion performs. We also look at whether things like time on site shrank or whether people had fewer pages per visit or whether they had a higher bounce rate from some of these sections.

So perhaps, for example, we changed our pricing and we actually saw that people spent less time on the pricing page and had about the same number of pages per visit and about the same bounce rate from the pricing page. At the same time, we saw conversions dip a little bit.

Should we intuit that pricing negatively affected our conversion rate? Well, perhaps not. Perhaps we should look and see if there were other changes made or if our traffic sources shifted, because it looks like, given that bounce rate didn't increase, given that pages per visit didn't really change, given that time on site actually went down a little bit, it seems like people are making it just fine through the pricing page. They're making it just fine from this pricing page to the conversion page, so let's look at something else.

This is the type of diagnostics that you can do when you have metrics at these levels. If you've seen a dip in conversions or a rise, this is exactly the kind of dig into the data that smart, savvy digital marketers should and can be doing, and I think it's a powerful, useful tool to be able to form hypotheses based on what happens.

So again, another example: did we change this product page? We saw pages per visit shrink and time on site shrink. Did it affect conversion rate? If it didn't, but then we see that we're getting fewer engaged visitors, and so now we can't do as much retargeting and we're losing email signups, maybe this did have a negative effect and we should go back to the other one, even if conversion rate itself didn't seem to take a particular hit in this case.
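
As a rough sketch of that diagnostic workflow: snapshot the metrics for each funnel page before and after a change, then flag any metric that moved by more than some threshold so you know where to dig first. The threshold and the shape of the data here are illustrative choices:

```typescript
// Sketch: flag which funnel-page metrics shifted after a change.
interface PageMetrics {
  bounceRate: number;
  pagesPerVisit: number;
  avgTimeOnSiteSeconds: number;
  conversionRate: number;
}

function flagMetricShifts(
  before: Record<string, PageMetrics>,
  after: Record<string, PageMetrics>,
  threshold = 0.1 // flag relative moves larger than 10%
): string[] {
  const flags: string[] = [];
  for (const [page, b] of Object.entries(before)) {
    const a = after[page];
    if (!a) continue; // page removed or renamed; nothing to compare
    for (const metric of Object.keys(b) as (keyof PageMetrics)[]) {
      if (b[metric] === 0) continue; // avoid dividing by zero
      const delta = (a[metric] - b[metric]) / b[metric];
      if (Math.abs(delta) > threshold) {
        flags.push(`${page}: ${metric} moved ${(delta * 100).toFixed(1)}%`);
      }
    }
  }
  return flags;
}
```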

2. When they"re compared over time to see if internal changes or external forces shifted behavior

The second useful way to apply these metrics is to compare them over time to see if your internal changes or some external forces shifted behavior. For example, we can look at the engagement rate on the blog. The blog is tough to tie to a conversion event. We could maybe look at subscriptions, but in general, pages per visit is a nice one for the blog. It tells us whether people make it past the page they landed on and into deeper sections, stick around our site, check out what we do.

So if we see that it had a dramatic fall down here in April, and that was when we installed a new author, and now they're sort of recovering, we can say, "Oh, yeah, you know what? That takes a little while for a new blog author to kind of come up to speed. We're going to give them time," or, "Hey, we should interject here. We need to jump in and try and fix whatever is going on."

3. When they're benchmarked versus relevant industry competitors

Third and final useful case is when you benchmark versus truly relevant industry competitors. So if you have a direct competitor with a very similar focus to you, product-focused in this case with a homepage and then some content sections and then a very focused product checkout, you could compare your homepage against theirs.

If you could get the data from a source like SimilarWeb or Jumpshot, if there's enough clickstream-level data, or some savvy industry surveys that collect this information, and you see that you're significantly higher, you might then take a look at what are they doing that we're not doing. Maybe we should use them when we do our user research and say, "Hey, what's compelling to you about this that maybe is missing here?"

Otherwise, a lot of the time people will take direct competitors and say, "Hey, let's look at what our competition is doing and we'll consider that best practice." But if you haven't looked at how they're performing, how people are getting through, whether they're engaging, whether they're spending time on that site, whether they're making it through their different pages, you don't know if they actually are best practices or whether you're about to follow a laggard's example and potentially hurt yourself.

So definitely a complex topic, definitely many, many different things that go into the uses of these metrics, and there are some bad and good ways to use them. I agree with Sayf and with Wil, but I think there are also some great ways to apply them. I would love to hear from you if you've got examples of those down in the comments. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

The Moz Blog

Google Clarifies Seven Points On Mobile-First Indexing After Much Confusion

Image credit to Shutterstock. Let me start by saying all of these points, in my opinion, are things we've covered here before, but the interesting thing is that Google felt these seven items are things SEOs who give presentations have confused over the past several months...

Search Engine Roundtable

Search Buzz Video Recap: Google Algorithm Changes, Bing AMP & JSON-LD, Google & YouTube Spam & Matt Cutts

This week we have a lot to cover. First, an algorithmic change in the Google search results over last weekend and throughout this whole week. Bing announced a new AMP viewer coming this summer; they also announced JSON-LD support in Bing Webmaster Tools...

Search Engine Roundtable

Daily Search Forum Recap: June 15, 2018

Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web...

Search Engine Roundtable