Good Bots And Bad Bots Online: They Outnumber Us

Bot Traffic Report 2013

Bots account for 61.5% of all Internet traffic, according to a new infographic published by online data security company Incapsula. This represents a 21% increase over the past year, and it signifies not only growth in automated web traffic but also a significant rise in activity by the bad bots – those out to skim information and infiltrate databases and computers everywhere.

Of particular interest is the revelation that half of all bot traffic is “good,” meaning that it comprises search engines and other automated programs that supposedly collect data about us for our benefit. The other half, however, consists of “bad” bots, which Incapsula subdivides by type.

These consist of the following:

  • Scrapers: some of these steal email addresses for spamming purposes, while others reverse-engineer pricing and business models – essentially scraping data from existing websites for re-use elsewhere;
  • Hackers: tools that break into other sites to steal credit card data or inject malicious code;
  • Spammers: these harvest email addresses and send out billions of useless and annoying email messages, and can also get sites “blacklisted” by search engines;
  • Impersonators: these specialize in intelligence gathering, DDoS attacks and bandwidth consumption, often by posing as legitimate crawlers (a simple verification sketch follows this list).
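
None of these bots announces itself honestly, which is what makes impersonators in particular hard to filter. As a rough illustration of how a site operator might begin separating genuine crawlers from impersonators, consider the following sketch. It uses reverse DNS to check whether a self-declared crawler’s IP address actually belongs to the claimed operator; the user-agent names and hostname suffixes here are illustrative assumptions, and this is not the detection method Incapsula uses (the report does not disclose one).

```python
import socket

# Crawlers that identify themselves and whose IPs can be checked via
# reverse DNS. These hostname suffixes are illustrative, not an official list.
VERIFIED_CRAWLER_DOMAINS = {
    "Googlebot": (".googlebot.com", ".google.com"),
    "bingbot": (".search.msn.com",),
}

def classify_request(user_agent: str, ip_address: str) -> str:
    """Very rough first-pass triage of a single request.

    Returns "human", "good bot", or "suspected impersonator".
    Real bot detection also relies on behavioral signals; this only
    checks whether a self-declared crawler's IP resolves back to the
    crawler's domain.
    """
    for bot_name, domains in VERIFIED_CRAWLER_DOMAINS.items():
        if bot_name in user_agent:
            try:
                # Reverse DNS: does the IP resolve to the claimed operator?
                hostname, _, _ = socket.gethostbyaddr(ip_address)
            except (socket.herror, socket.gaierror):
                return "suspected impersonator"
            if hostname.endswith(domains):
                return "good bot"
            # User agent claims to be a known crawler, but the IP does not match.
            return "suspected impersonator"
    # No declared crawler signature; could be a human or an undeclared bot.
    return "human"

# Example: a request claiming to be Googlebot from a non-Google address.
print(classify_request("Mozilla/5.0 (compatible; Googlebot/2.1)", "203.0.113.7"))
```

A production system would go further – forward-confirming the reverse DNS result, rate-limiting, and analyzing request behavior – but even this simple check defeats the most naive user-agent spoofing.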

In a recent interview with CloudTweaks, Incapsula CEO Gur Shatz stated that “the inadequacies of today’s defenses, juxtaposed with the ever-rising value of the information that can be stolen, represent a huge opportunity for cybercriminals. Personal or corporate devices are a tremendous intelligence source, carrying richer and more accurate data than ever before, but protections on these devices still mostly rely on outdated technologies such as passwords.” Compounding this issue is the degree to which so many devices are interconnected, and the fact that the public remains largely unaware of the unrelenting presence of bots and other automated programs that visit their computers and read their data either unnoticed or, worse, with the user’s consent.

Shatz points out that even a company without any major secrets or critical online functionality is still subject to being used as a “mule” to conduct cybercrime. “Even small online businesses such as ecommerce sites are vulnerable,” he says, “because downtime or slowness costs them both money and reputation damage”.

In the short space of one year, the proportion of bots to human users has shifted from roughly 50-50 to 60-40 – a trend that, as with most things technology-related, promises to continue in accordance with Moore’s law, to the point that the vast majority of all web traffic will be automated, and much of it will be up to no good.
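
For the arithmetic behind the headline figure: moving from a roughly even split to 61.5% bot traffic works out to about a 21% relative increase, as this quick sketch shows. The 2012 baseline of 51% is inferred from the “roughly 50-50” split described above, not a figure quoted from the report.

```python
# Bot share of total traffic:
bots_2012 = 0.51    # "roughly 50-50" split a year earlier (inferred baseline)
bots_2013 = 0.615   # 61.5% of all traffic in 2013, per the report

relative_increase = (bots_2013 - bots_2012) / bots_2012
print(f"Relative increase: {relative_increase:.1%}")  # -> 20.6%, i.e. ~21%
```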

By Steve Prentice
