What’s The Lost Revenue From Bots Scraping Original Content?




  • by Laurie Sullivan, Staff Writer @lauriesullivan, August 31, 2016



    Bots — which some estimate make up about 46% of Web traffic — scrape the Web for whatever bits of content they are programmed to find, often mimicking human behavior. But what does that cost the company that develops the content, analyzes the numbers, and creates the information for consumers and customers in hopes that they will visit its Web site?


    Original content and data carry even more value when it comes to search engines like Google, Bing and Yahoo.


    The real estate and travel industries see the majority of bot activity, mostly driven by price comparisons, but in the past year real estate had the highest percentage of bad bots at 32%. From 2014 to 2015, the real estate industry saw a 300% increase in bad-bot activity as operators copied the price-comparison model of the travel industry. During the same period, the travel industry found that 48% of its 2015 traffic came from bad bots, according to Distil Networks, which published a study this week titled “The 2016 Economics of Web Scraping.”


    Aside from real estate and finance, industries such as digital publishing, e-commerce, directories and classifieds, and airlines and travel took a big hit. I’m betting that as the need for content rises, so will Web scraping. If a Web site contains content that drives revenue for a business, that business is at risk.


    Part of the loss comes from a decline in Web site traffic, which eventually translates into lost advertising revenue.


    The risk and economic impact of bad bots on a business ranges from 1.6% to 7.9% of annual Web site revenue, with a median impact of about 4.2%, according to Distil Networks, citing Derek Brink, VP of research and fellow at Aberdeen Research.


    Breaking Web scraping out of Brink’s analysis, it alone accounts for about 2% of the Web site’s contribution to revenue. That figure covers content theft, price scraping and competitive data mining; Web site performance degradation from both downtime and slowdowns; negative search engine optimization; and skewed Web site analytics.


    No surprise, really, although the numbers seem a bit low. The study illustrates Web scraping’s prevalence, sophistication, and industry use cases. Through analysis of the top Web-scraping platforms and services, the report also outlines how the democratization of Web scraping allows users to effortlessly steal sensitive information on the Web.


    Web scraping is also used for weather-data monitoring and Web site change detection, a service that e-mails users notifications about changes made to specific Web sites, according to the study.
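    Change-detection services of the kind the study mentions typically work by fingerprinting a page on each visit and alerting the user when the fingerprint differs from the last one stored. A minimal sketch of that idea in Python (the function name and the choice of SHA-256 hashing are illustrative assumptions, not details from the study; a real monitor would fetch the page on a schedule and send the e-mail itself):

    ```python
    import hashlib
    from typing import Optional, Tuple

    def page_changed(page_html: str, last_hash: Optional[str]) -> Tuple[bool, str]:
        """Compare a freshly fetched page against the last stored fingerprint.

        Returns (changed, new_hash). On the first visit there is no stored
        fingerprint, so we only record a baseline and report no change.
        """
        new_hash = hashlib.sha256(page_html.encode("utf-8")).hexdigest()
        if last_hash is None:
            return False, new_hash  # first run: just record the baseline
        return new_hash != last_hash, new_hash

    # First visit records a baseline; a later visit flags any difference.
    _, baseline = page_changed("<html>price: $100</html>", None)
    changed, _ = page_changed("<html>price: $90</html>", baseline)
    ```

    A production monitor would hash only the extracted content (not raw HTML, which can vary with ads or timestamps) and trigger the notification e-mail when `changed` is true.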




    MediaPost.com: Search Marketing Daily
