Fixing Digital Publishing’s Dirty Little Secret
by Pierre-Marc Diennet, Op-Ed Contributor, May 12, 2017
Digital publishing has a dirty little secret – one that all publishers know, accept and refuse to discuss openly. But the days of being able to sweep this ugly truth under the rug, out of view from clients, are fast coming to an end.
Yes. We’re talking about bot traffic. Security firm Imperva estimates that bots, both helpful and harmful, accounted for 52% of Web traffic in 2016. Of concern, the company attributed 29% of overall Web traffic specifically to harmful bots—the impersonators, scrapers, scammers and hacker tools of the Internet.
Bot traffic varies greatly by publisher, but none are unaffected.
It is a reality of online publishing that most industry executives recognize, but few want to openly discuss. When asked what portion of their traffic is driven by bots, most feign ignorance and redirect attention to more favorable audience measures. But in doing so, they’re not doing themselves—or their companies—any favors.
The truth is that countless options exist for detecting and eliminating bot traffic to a site. A quick Google search will yield a plethora of bot detection tools and software, ranging from simple free tools to advanced software solutions. Despite the prevalence of bot traffic online, it is entirely feasible for publishers to strip bots from their audiences and target human-only traffic with advertising.
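To make the feasibility point concrete, here is a minimal sketch of the kind of first-pass filtering such tools perform. This is illustrative only, not any particular vendor's product: real detection software combines far larger signature databases with behavioral analysis, and the signature list below is an assumption for demonstration.

```python
import re

# A handful of well-known crawler/tool signatures (illustrative, not
# exhaustive; commercial tools use much richer signals than this).
BOT_PATTERNS = [
    re.compile(sig, re.IGNORECASE)
    for sig in (r"googlebot", r"bingbot", r"crawler", r"spider",
                r"scraper", r"python-requests", r"curl/")
]

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known bot signature."""
    if not user_agent:  # many bots send no user-agent at all
        return True
    return any(p.search(user_agent) for p in BOT_PATTERNS)

requests = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "python-requests/2.28.1",
    "",
]
human = [ua for ua in requests if not looks_like_bot(ua)]
print(len(human))  # prints 1: only the first user-agent survives the filter
```

Even a check this crude strips the most obvious non-human traffic; the point is that the barrier to entry is low, not that this is all there is to it.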
And yet, very few are doing so—and it’s no great secret why.
These days, staying afloat in online publishing is hard enough. Implementing bot detection and removal technologies can lead to a drastic reduction in inventory. For some publishers, admitting that 20% of site traffic is generated by bots means accepting a 20% hit on revenue, which is not a blow that many can withstand in one fell swoop.
And so, the dirty little secret persists.
Clients don’t ask. So publishers don’t tell. Ignorance is bliss, right?
Wrong. The days of clients not asking are coming to an end, and publishers that aren’t getting out in front of the question to address the sometimes-massive implications of fraudulent traffic on their inventory will soon find their very survival in jeopardy. It’s time to alter course—and fast.
Taking a proactive approach to identifying and eliminating bot traffic from inventory is a necessary long-term play for publishers. Executives who are struggling with this reality should keep these three vital points in mind:
If they haven’t already, clients will begin to ask about your levels of bot traffic. Awareness and industrywide education on this topic are spreading. Soon there will come a time when every brand asks this question as a routine part of a media buy.
They will question the quality of your traffic, and if the answer is unsatisfactory, they will further question why you have not already taken steps to remedy the problem.
Legacy clients will flee, and new ones will not come on board until the problem has been addressed. Few publishers are prepared to weather that sort of revenue gap.
You’re not going to make more money by implementing bot detection technology. If you think you are, it’s best to dismiss that idea right now. This isn’t a line item that can be passed along to the client. There’s no way to monetize it, and yet it’s still worth doing.
Although you won’t make more money because of it, you will protect the money that you are already making.
You’re not going to lose money immediately by implementing bot detection and removal on your site.
When you initially implement bot detection and removal solutions on your site, you don’t need to immediately turn off every single bot-driven impression. You can choose what you want to do with the information and when.
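A "measure first, act later" mode might look like the sketch below, where a hypothetical impression log carries a boolean flag supplied by whatever detection layer is in place; the publisher tallies the bot share without blocking anything yet. The log format and field names are assumptions for illustration.

```python
from collections import Counter

# Hypothetical impression log: (impression_id, is_bot) pairs, where the
# is_bot flag comes from the bot-detection layer.
impressions = [
    ("imp-001", False),
    ("imp-002", True),
    ("imp-003", False),
    ("imp-004", True),
    ("imp-005", False),
]

tally = Counter(is_bot for _, is_bot in impressions)
bot_share = tally[True] / len(impressions)

# Monitor-only mode: report the number, serve everything as before.
print(f"bot share: {bot_share:.0%}")  # prints: bot share: 40%
```

Running in this reporting-only mode first lets a publisher size the problem before deciding when, and how aggressively, to start filtering impressions out.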
But by proactively putting these systems in place, you’ll be positioned to deliver the needed traffic cleanup as clients grow more attuned to the need, and the ability, to block bot-driven traffic from campaigns. That means shifting right alongside the market, rather than scrambling to catch up.