Produced as part of the annual Imperva Incapsula Bot Traffic Report, the data on how much of the world's internet traffic is generated by bots paints an interesting picture. In 2015, helped by the growth of online video sharing, humans took back the top spot in internet traffic, but the balance began to swing the other way in 2016.
48.2 percent of all traffic was sent by humans that year. The other 51.8 percent came from bots, but only 23 percent of all traffic was down to what the report terms “good bots” – traffic monitors, commercial crawlers, search engine robots and feed fetchers. The rest was made up of nasty ones.
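For a sense of scale, here's a quick, unofficial back-of-the-envelope calculation using the shares quoted above (the numbers are hard-coded from the report as cited here; the bad-bot figure is simply whatever remains of the bot total):

    # Rough breakdown of 2016 traffic shares, per the figures quoted in this article.
    human_share = 48.2     # percent of all traffic generated by humans
    bot_share = 51.8       # percent of all traffic generated by bots
    good_bot_share = 23.0  # monitors, commercial crawlers, search engines, feed fetchers

    # The "nasty" bots are whatever is left of the bot total.
    bad_bot_share = bot_share - good_bot_share

    print(f"Humans:    {human_share:.1f}%")
    print(f"Good bots: {good_bot_share:.1f}%")
    print(f"Bad bots:  {bad_bot_share:.1f}%")  # roughly 28.8% of everything

In other words, close to three in every ten requests crossing the web in 2016 came from a bot with bad intentions.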
Hacker tools made up a few percent, as did the scrapers used to extract data from websites, while spammers – which you might expect to take up a big chunk – accounted for only 0.3 percent of all internet traffic. The biggest group of nasties today is the impersonators.
These bots use fake identities to take over accounts, or breach security to give their controllers access to protected systems. As the report explains, though, their most common use is in DDoS attacks. With millions upon millions of internet of things devices having been leveraged for exactly that at the end of last year, perhaps it's no surprise, but it does point to a worrying trend.
As more and more of the world’s bandwidth is gobbled up by bots, there’s less left for humans to use – and that doesn’t even address the fact that malicious bot traffic is being used to disrupt the very services we’re trying to reach.
The only heartening news is that the bots can be beaten. The more popular a site is, the higher the chance that the traffic visiting it is legitimate and human. The biggest websites in the world may receive the most attention from bots, but the share of bot traffic they see is far smaller than on smaller sites.
To collate all this information, Imperva Incapsula used data from 16.7 billion visits to 100,000 randomly selected domain names.