Incapsula, the CDN and DDoS mitigation provider, has released the latest of its Bot Traffic Reports, which shows that 61.5% of all website traffic is generated by software bots, an increase of 21% over last year.
While a company that offers bot protection services is motivated to emphasize the effect bots have on the web, the report is nevertheless an interesting insight into the web most users are completely unaware of.
Of course, anyone who runs a website will have seen bot traffic in their logs, particularly that of search engine crawlers, so it will come as no surprise that crawler traffic makes up a significant proportion of all bot traffic. What may surprise them is that good bots account for more of the web’s bot traffic than bad ones.
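Spotting crawler traffic in a server log usually comes down to inspecting the user-agent field. The sketch below is illustrative only: the sample log lines and the list of bot tokens are assumptions for the example, not an exhaustive or authoritative classifier, and real bots can spoof their user agent entirely.

```python
import re
from collections import Counter

# Illustrative sample lines in Apache combined log format; a real site
# would read these from its access log instead.
SAMPLE_LOG = """\
1.2.3.4 - - [01/Dec/2013:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
5.6.7.8 - - [01/Dec/2013:10:00:01 +0000] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"
9.8.7.6 - - [01/Dec/2013:10:00:02 +0000] "GET /feed HTTP/1.1" 200 256 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
"""

# A small, assumed set of substrings that commonly appear in crawler
# user agents -- far from complete, and easily spoofed.
BOT_TOKENS = ("googlebot", "bingbot", "yandexbot", "baiduspider", "crawler", "spider")

def classify(log_text):
    counts = Counter()
    for line in log_text.splitlines():
        # In combined log format, the user agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        ua = quoted[-1].lower() if quoted else ""
        counts["bot" if any(t in ua for t in BOT_TOKENS) else "human"] += 1
    return counts

print(classify(SAMPLE_LOG))  # Counter({'bot': 2, 'human': 1})
```

This token-matching approach catches only self-identifying crawlers; services like Incapsula combine it with behavioral and reputation signals to catch the "impersonators" the report describes.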
The data was gathered from 1.45 billion bot visits to 20,000 sites over 90 days.
Search engine crawlers and other “good bots” make up 31% of web traffic, with scrapers at 5%, hacking tools at 4.5%, and spammers at only 0.5%. The remaining 20% consists of “other impersonators”, which includes some DDoS attacks, market intelligence gathering, and other hostile bots without an identified purpose.
Interestingly, as a proportion of all traffic, good bots have increased by 31%. That may be a result of the increased resources and shorter crawling cycles of search engine bots and other SEO-related services. Spammers, by contrast, have seen a massive decrease of 75%. Incapsula suggests a large part of the decline in spammer activity can be attributed to Google’s spam-fighting efforts.
Malicious bots are a significant danger for web hosting companies, site owners, and online service providers. The days of the lone hacker searching for targets of opportunity are largely past. In the modern online landscape, bots, often running on compromised machines, scan large chunks of the Internet in search of vulnerable machines that can be exploited, either as further nodes in a botnet or as distributors of malware.
Security by obscurity was never a smart option, but it’s no longer an option at all. If a server has a vulnerability, the chances are high that it will be attacked, often within a short time of being exposed.
In addition to the security problems caused by malicious bots, the other major concern is their use of network infrastructure, which imposes significant costs on the web hosting industry, both for mitigation of attacks on their clients and for use of limited bandwidth resources.
As the Incapsula study and the many recent headlines about sites being hacked show, although we’re making progress against spammers and identifiable hacking tools, the industry still has a way to go in dealing with the online underground.
Photo Credits: langfordw