Good Bots And Bad Bots Online: They Outnumber Us

Bot Traffic Report 2013

Bots account for 61.5% of all Internet traffic, according to a new infographic published by online data security company Incapsula. This represents a 21% increase over the past year, and it signifies not only an increase in automated web traffic, but a significant increase in activity by the bad bots – those out to skim information and infiltrate databases and computers everywhere.

Of particular interest is the revelation that half of all bot traffic is “good,” meaning that it comprises search engines and other automated programs that supposedly collect data about us for our benefit. The other half, however, consists of “bad” bots, which Incapsula subdivides by type.

These consist of the following:

  • Scrapers: some of these steal email addresses for spam purposes, while others reverse-engineer pricing and business models – essentially scraping data from existing websites for re-use elsewhere;
  • Hackers: tools that break into other sites to steal credit card data or inject malicious code;
  • Spammers: these steal email addresses and send out billions of useless and annoying email messages, as well as inviting “search engine blacklisting”;
  • Impersonators: these specialize in intelligence gathering, DDoS attacks and bandwidth consumption.

In a recent interview with CloudTweaks, Incapsula CEO Gur Shatz stated that “the inadequacies of today’s defenses, juxtaposed with the ever-rising value of the information that can be stolen, represent a huge opportunity for cybercriminals. Personal or corporate devices are a tremendous intelligence source, carrying richer and more accurate data than ever before, but protections on these devices still mostly rely on outdated technologies such as passwords.” Compounding this issue is the degree to which so many devices are interconnected, and the fact that the public remains largely unaware of the unrelenting presence of bots and other automated programs that visit their computers and read their data either unnoticed or, worse, with the user’s unwitting consent.

Shatz points out that even a company without any major secrets or critical online functionality is still subject to being used as a “mule” to conduct cybercrime. “Even small online businesses such as ecommerce sites are vulnerable,” he says, “because downtime or slowness costs them both money and reputation.”

In the short space of one year, the proportion of bot traffic to human traffic has shifted from roughly 50-50 to 60-40. As with most things technology-related, the trend promises to continue, to the point that the vast majority of all web traffic will be automated – and much of it will be up to no good.

By Steve Prentice
