As search engines such as Google tune their algorithms to push websites toward content written for humans rather than machines, a new report reveals that bots now rule the Internet. Incapsula has released the results of its latest study, which found that bots generate 61.5 percent of website traffic today, compared with 38.5 percent for humans.
According to the security-focused content delivery network, the non-human traffic breaks down into impersonators (20.5 percent), spy bots used for DDoS attacks or marketing intelligence gathering; spammers (0.5 percent), bots that post malware links or irrelevant content; hacking tools (4.5 percent), aimed primarily at stealing data or hijacking websites; scrapers (5 percent), which steal or duplicate website content; and good bots (31 percent), such as search engine crawlers.
"Compared to the previous report from 2012, we see a 21% growth in total bot traffic, which now represents 61.5% of website visitors. The bulk of that growth is attributed to increased visits by good bots," the latest Incapsula report read.
Incapsula attributed the growth in bot traffic to evolving web-based services and increased activity by existing bots. In 2012, humans accounted for 49 percent of web traffic while 51 percent came from bots.
The company put the spotlight on the decline in spammer activity and credited Google for the drop.
"SEO link building was always a major motivation for automated link spamming. With its latest Penguin updates Google managed to increase the perceivable risk for comment spamming SEO techniques, while also driving down their actual effectiveness.Based on our figures, it looks like Google was able to discourage link spamming practices, causing a 75% decrease in automated link spamming activity," the study noted.
Incapsula also highlighted an 8 percent increase in the activity of impersonator bots: hacker tools and automated spy bots that masquerade as human visitors in order to slip past website firewalls and other defenses.
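To see why such impersonators are hard to stop, consider a minimal sketch, not taken from the Incapsula report, of a naive filter that trusts the self-reported User-Agent header (the signature list and function names here are illustrative assumptions, written in Python). An honest crawler that identifies itself is caught, while an impersonator sending a stock browser string passes straight through, which is why real defenses rely on behavioral signals rather than self-reported identity.

```python
# Hypothetical illustration: a naive User-Agent filter and why
# impersonator bots defeat it. Not from the Incapsula report.

KNOWN_BOT_SIGNATURES = ("bot", "crawler", "spider", "curl", "python-requests")

def naive_is_bot(user_agent: str) -> bool:
    """Flag a request as a bot if its User-Agent contains a known signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

# An honest crawler identifies itself and is caught:
print(naive_is_bot("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True

# An impersonator bot spoofs a stock browser string and sails through:
print(naive_is_bot("Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 "
                   "(KHTML, like Gecko) Chrome/31.0 Safari/537.36"))    # False
```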
The company based its conclusions on 1.45 billion bot visits observed over roughly three months, collecting the data from its network of client websites.