Columnist Brian Patterson shares the results of a click-through rate test performed on one of the test websites he maintains.
Reading articles, experiments and case studies is great, but we’re always eager to test things out for ourselves. We maintain a bunch of test websites for just that purpose. And so we decided to run a CTR test on one of the websites we hadn’t touched in a long while. The hope was to gather some evidence of our own.
If you’re looking for a randomized, double-blind, placebo-controlled test, this ain’t it. But if you’re looking for more anecdotal evidence suggesting how the algorithm might react to clicks based on an experiment of n=1, this is it.
We started by selecting a site that hadn’t had a lot of work done to it. In fact, it was one of our more neglected test sites, which made it ideal for a click-through rate test, as we could attribute most movement (or lack thereof) to the test itself.
We chose nine keywords based on the following parameters:
- Our site currently ranked between positions 5 and 25 for each keyword.
- Those rankings hadn’t fluctuated much recently.
- The keywords received more than 50 searches per month, according to Google Keyword Planner.
- The organic competition was low or medium.
These ended up all being e-commerce product category keyword phrases between two and four words.
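To make the selection criteria concrete, here is a minimal sketch of that filter in Python. The field names (`position`, `volatility`, `monthly_searches`, `competition`) and the sample data are illustrative assumptions; the article doesn’t describe how the keyword data was actually stored or screened.

```python
# Hypothetical keyword-selection filter matching the four criteria above.
# All field names and threshold details beyond those stated are assumptions.

candidates = [
    {"keyword": "example category a", "position": 7,  "volatility": 1.2,
     "monthly_searches": 90,  "competition": "low"},
    {"keyword": "example category b", "position": 3,  "volatility": 0.5,
     "monthly_searches": 300, "competition": "medium"},
    {"keyword": "example category c", "position": 18, "volatility": 0.8,
     "monthly_searches": 40,  "competition": "high"},
]

def qualifies(kw):
    return (
        5 <= kw["position"] <= 25            # ranked between positions 5 and 25
        and kw["volatility"] < 2.0           # no recent ranking fluctuation
        and kw["monthly_searches"] > 50      # >50 searches/month (Keyword Planner)
        and kw["competition"] in ("low", "medium")
    )

selected = [kw["keyword"] for kw in candidates if qualifies(kw)]
```

Of the three sample candidates, only the first passes every check.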
We used a micro-task-type website to find people to do the clicking and searching. Each day, we had a random, reasonable number of them (based on actual daily keyword search volume for each word) search the keyword and click on our targeted link.
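A "random, reasonable number based on daily search volume" could be modeled along these lines. This is a sketch under assumptions: the 10 percent share and the ±50 percent jitter are my own illustrative parameters, not the author's actual method.

```python
# Hypothetical daily click quota: scale monthly volume to a daily figure,
# take a small share of it, and jitter the result so the click pattern
# doesn't look mechanical. The share and jitter values are assumptions.
import random

def daily_clicks(monthly_searches, share=0.1, rng=random):
    """Pick a randomized, volume-proportional number of clicks for one day."""
    per_day = monthly_searches / 30          # rough searches per day
    target = per_day * share                 # fraction of real daily volume
    jittered = target * rng.uniform(0.5, 1.5)
    return max(1, round(jittered))           # always send at least one click
```

For a keyword with 300 monthly searches, this yields one or two clicks per day; lower-volume keywords bottom out at one.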
Each searcher was instructed to spend two minutes browsing the website and reading the content. Our goal here was to have them resemble real users rather than visitors who hit the site and immediately pogo-stick back out. We didn’t give specific instructions on where to go or what to look at; we just asked them to act like regular users.
I’ll let the chart below showing the ranking changes over the two-month project speak for itself. Each blue line represents a keyword we were tracking.
All of the keywords improved in ranking position at some point over the course of the experiment, and seven of the nine ended the test with a better ranking than they started with.
When taken in conjunction with other articles and research on the topic, our findings were enough to convince our team that CTR does impact rankings. But what does that matter if, like us, you aren’t going to start manipulating click-through rate for your clients’ websites with micro-taskers — something that would clearly violate Google’s guidelines?
What it means is that we now have evidence that helps us prioritize the optimization of title tags and meta descriptions to attract clicks. It gives us an example to point to when enterprise clients express concern over how large or difficult an initiative it will be to update the title and meta tags on each page. It helps us keep in mind that keywords in title tags are just one part of optimizing a title tag.
Additionally, our experiment had users act “real.” They stayed on the page for at least two minutes. This dwell time is important, both from a user experience and very possibly from a ranking perspective. The test results further reinforce that our headlines need to be engaging, our on-page copy needs to be captivating and our images and videos need to be enthralling.
The bottom line
Do I think click-through rate matters? Yes.
Will I start manipulating click-through rate for clients? No.
How will this impact our business process? It tells us that optimizing title tags, meta descriptions and site content won’t just capture more visitors and improve the user experience, but will very possibly increase our rankings as well.
Some opinions expressed in this article may be those of a guest author and not necessarily Marketing Land.