Category Archives: Security

Net Neutrality Part 2 – Arguments Against Net Neutrality


Net Neutrality Arguments

Net neutrality seems like a simple issue of corporations and big business fighting for government to open up more avenues for profit. However, despite the calls from the majority of the tech community to uphold net neutrality, there are those who oppose the idea, and not just those who are paid by telecoms giants or lobby groups.

In an article for Forbes, Josh Steimle questioned why he seemed to be the only techie against the idea of net neutrality. He argued that there are more than just two sides to the debate; it is not simply a case of being for or against corporations having “control” of what you can see on the internet. He isn’t against net neutrality as a concept; rather, his concerns lie with net neutrality as legislation or public policy. He outlined three reasons for his concerns that I feel perfectly frame the best arguments opposing net neutrality:

More Competition

Monopolies are bad. Everyone can agree on that. They are bad for pricing, competition and innovation, and government-imposed monopolies are even worse. Do you think we would have seen the same growth and innovation in smartphones if the government still owned the phone system? Or the same growth in space tech if NASA still held a monopoly? When deregulation occurs, innovation generally follows.

The reason that telecoms have so much power is not in spite of government regulation; it is because of government regulation. If net neutrality is passed into legislation, it may become more difficult for new companies to offer internet services, and we could end up more beholden to ISPs than before. If, however, telecoms had to compete in a truly free market, providers like Comcast and Time Warner in the US would be replaced by better, cheaper options unless they remained competitive. If you want to break up monopolies you have to eliminate regulations, not enhance them. The problem here is that some change in legislation is needed to open up the market in the first place; otherwise ISPs won’t be challenged in their monopolies.

More Privacy

Free speech cannot exist without privacy, so should we trust the government to be the gatekeeper of our privacy? Under net neutrality legislation the government would either have to trust the ISPs to regulate themselves, or become the regulatory body itself. There is fear in the tech community that government involvement could go further than simple oversight, given the revelations in Glenn Greenwald’s book No Place to Hide that the NSA has been tampering with Internet routers.

What this argument fails to consider is that without some form of regulation and oversight, there are always likely to be abuses of power; that is, unfortunately, the nature of the world we live in. The difference is that when government provides that oversight, we can hold it to account, question it, and even vote it out of power. Corporations and large ISPs don’t answer to the public in the way government does.

More Freedom



Any form of legislation or regulation is a limit on freedom and the free market. In an ideal world, governments would always act with the best interests of their people at heart. This is rarely the case, however; politics is a dirty business, and corporate interests and lobby groups often have more influence than many of us realise. The fear is that any legislation will be shaped by these large groups to benefit them, in ways that may only seem to be for the good of the people. On this view, regulation in any form should be seen as a restriction on freedom in the long run.

However, this seems like a cynical view of government (though it is perhaps not too far from the truth) and, much like the previous argument, it fails to consider that government can be held to account.

Although there are a number of arguments against the adoption of net neutrality, including one ISP that claims it would be a restriction of free speech, the general consensus seems to be that, if done right, it is the most effective way to secure free speech and ensure that the internet remains a level playing field. We have to trust someone to regulate ISPs in some way; is it better to trust the ISPs themselves, or the institutions that the people can hold to account?

By Josh Hamilton

Cloud Predictions 2017: Forrester Research Highlights 10 Trends


Cloud Predictions 2017

Gartner projects that by 2020, corporate no-cloud policies will be as scarce as no-internet policies are today. This is not surprising considering how valuable the cloud and cloud tools already are to many businesses across a range of industries. But although we’re seeing mainstream uptake of popular cloud products and services, cloud developers aren’t resting on their laurels; instead, we’re seeing existing cloud services refined and new ones developed that are likely to keep the cloud top of mind and increasingly appreciated in the years to come.

Trends for 2017 and Beyond


Forrester Research has released a new report outlining cloud predictions for 2017 and highlighting ten trends for the coming year that it believes businesses must act on. It’s expected that the cloud will save users money in many ways, not just through traditional pay-per-use models, but also through the advancement of cost-optimizing best practices. Expense transparency may also be realized through integrated cost management tools. And though Forrester says ‘size still matters,’ it’s clear that mega cloud providers will be balanced by niche providers. The public cloud option, with its scalability and good economics, continues to be a popular choice for enterprises averse to setting up their own private cloud networks, but the customization available through niche cloud provider services promises the smaller dedicated providers a slice of the pie as well.

Furthermore, Forrester suggests that ‘hyper-converged infrastructure will help private clouds get real.’ These systems integrate disparate services on private cloud networks, and Forrester believes hyper-converged infrastructure should be the foundation for private cloud development, ensuring smooth and effective implementations. It’s also likely that containers will ‘shake up’ cloud platform and management strategies: Forrester predicts container-driven software code management will advance, with Linux containers available in the majority of private and public cloud platforms early in 2017. This, however, increases security challenges, and is one motive for the belief that cloud service providers will begin building better security protocols into their offerings.

And further encouraging the cloud shift, Forrester believes that migration is going to become easier thanks to ‘lift-and-shift’ tools. Cloud migration applications are expected to be highly relevant in 2017, enabling smooth implementation and making the switch from public to private cloud, or vice versa, straightforward. Forrester also expects enterprises to avoid large, complex and expensive cloud software suites, and concludes that networking will remain a challenge for the hybrid cloud. We might also see SaaS moving towards regional and industry solutions instead of the prevalent one-stop shops of today. Finally, Forrester suggests we keep an eye on what’s coming out of China, as it’s expected that ‘Chinese firms will be key drivers of global cloud evolution.’

What the Cloud has in Store for Enterprises

Taking a look at enterprise advances, it’s suggested that the cloud market will accelerate more rapidly in 2017 as businesses attempt to improve efficiencies while scaling computing resources for better customer service. Says Forrester analyst Dave Bartoletti, “The number one trend is here come the enterprises. Enterprises with big budgets, data centers, and complex applications are now looking at cloud as a viable place to run core business applications.” Forrester recognizes Amazon Web Services as the originator of the first wave of cloud computing, launching basic storage and computing services back in 2006; ten years on, the results are mind-boggling. With 38% of surveyed North American and European enterprise infrastructure technology directors building private clouds, and a further 32% securing public cloud services, it’s evident that businesses are well into their cloud journey and nudging providers toward greater developments and innovations for the future.

By Jennifer Klostermann

5 Simple Tips to Make Strong and Robust Business Continuity Plans


Business Continuity Plans

Today’s organizations need comprehensive and robust business continuity planning for swift and effective action in case of a disaster or crisis. As trade and supply chains have gone global, businesses now expect crisis response in seconds, not hours, to ensure that the ripple impact is minimized. As organizations go digital, an IT failure can cripple the whole supply chain and business operations, causing extreme losses within hours and requiring countless hours to recover from them. Plans to mitigate IT failures are also affected by the complexity of today’s IT infrastructure. As applications and systems are added based on business and market requirements, newer technologies and infrastructure pose new challenges.

Most businesses leverage cloud based platforms for their enterprise needs at least partially. The cloud helps businesses minimize costs and maximize efficiency; made for speed and convenience, it can scale up and down as needs demand and bring flexibility to business operations. However, the added overhead of managing cloud data centers, planning and performing test exercises across multiple locations and vendors as well as managing a crisis recovery, requires that organizations pay critical attention to their cloud solutions in combination with legacy infrastructure.

Today, an effective business continuity plan requires the dynamic, continuous collection of information across the extended organization. Organizations need to overcome the traditional fragmented approach to business continuity and formulate a business continuity strategy that adheres to the following five-point agenda:

1. Champion Business Continuity at the Highest Level

With senior management sponsorship, the business continuity plan will occupy its rightful position, high up in business priorities. This is important for sufficient budget, resourcing and training to be assigned to it. Senior leaders must set the tone at the top by insisting on robust crisis planning and regular reviews as a standard practice rather than a mere formality.

In August 2016, Delta suffered a major IT outage that resulted in a $100 million loss in revenues for the airline. The impact was far-reaching, affecting check-in systems, flight information screens, the airline’s website and smartphone apps. The disruption to customers was extensive as well.

This is just one example of many; unfortunately, downtime of one type or another is a common situation in business. According to the Continuity Insights and KPMG Global BCM 2016 report, 39 percent of global organizations estimated the cost of business disruption over the last 12 months at $100,000 or less, while 27 percent estimated disruptions costing from $100,000 to $5 million or more. This highlights the need for robust business continuity planning, championed at the highest level.


Types of Instances and Interruptions in Past Year


2. Review, Update and Test Regularly

The business continuity plan is a living document; it isn’t one to be created, filed and never looked at again. Risks evolve. Exercising the plans on a regular schedule will ensure businesses keep pace with the changing environment and understand what’s needed to protect critical infrastructure and preserve operations during a physical or virtual attack. Companies must learn from their own experience. Worryingly, according to Forrester and the Disaster Recovery Journal, 33 percent of businesses that had to invoke a business continuity plan said one lesson learned from the experience was that the plan was out of date. Yet 60 percent never carry out a full simulation of their business continuity plan for the entire organization; most simply walk through the plan as a document review.

It is of utmost importance that business continuity plans be reviewed by senior management and the planning team. Also, test results should be periodically evaluated and reported to the board, to assess the nature and scope of any changes to the organization’s business.

3. Include Partners, Suppliers and Third Parties

Companies don’t pay enough attention to the significant role of partners, suppliers and third parties in their business continuity. Deloitte found that over 94 percent of survey respondents had low to moderate confidence in the tools and technology used to manage third-party risk, and 88 percent felt the same about risk management processes. This, despite 87 percent having experienced a disruption involving a third party in the past three years.

Business continuity planning and disaster recovery have to be part of early third-party discussions, with responsibilities documented in service level agreements. Plans need to be aligned so that it is clear and easy to identify who does what, and where the handover points are when a plan is executed. The tools and systems used for collaboration must support transparency of information so that both parties are able to work from up-to-date information and take swift action in the event of a crisis.

4. Prioritize Ongoing Business Operations

The continuity plan should demonstrate that the business understands the priority level of its systems and that mitigating plans are in place to restore core operations as quickly as possible.


In the case of the Delta crisis, the outage was so extensive that it paralyzed business critical operations. The range of problems that can disrupt business – natural disasters, industrial action, cybercrime, IT failures, political or economic upheaval, suppliers ceasing to trade and so on – is so vast, and the systems and operations that can be impacted can be so wide that prioritization is a must.

A cloud-based option provides many benefits as an off-site back-up solution to ensure the efficacy of your continuity plan. As you develop your plan, ask whether a cloud-based option would increase its efficiency and cost-effectiveness, and cover off essential considerations such as due diligence and the service reliability of the provider. Another option is establishing a back-up plan that is independent of the cloud by leveraging personalized file backups, cross-device continuity solutions and communication software. The main aim is to get back up and running faster, and to limit the time spent without access to critical systems and information, by having a clearly defined continuity plan in place.
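To make the file-backup layer of such a plan concrete, here is a minimal sketch of a timestamped archive routine using only the Python standard library. The function name and directory layout are illustrative, not part of any particular continuity product; the destination could equally be a local drive or a mounted cloud bucket.

```python
import datetime
import pathlib
import tarfile

def backup(source_dir, backup_dir):
    """Archive source_dir into a timestamped tarball -- the simplest
    form of the file-backup layer a continuity plan might rely on."""
    source = pathlib.Path(source_dir)
    dest = pathlib.Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{source.name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        # Store the directory under its own name inside the archive.
        tar.add(source, arcname=source.name)
    return archive
```

A real plan would schedule this, verify restores regularly, and copy the archive off-site; an untested backup is little better than none.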

5. Define the Communications Plan Clearly

The business continuity plan has to be absolutely clear on how all stakeholders are going to be kept informed and how to enable upstream and downstream communication channels in times of crisis. Stakeholders include employees at all levels of the organization, as well as suppliers, partners and customers.

The goal of the communications plan is to outline the channels and mechanisms for the sharing of information that will support efforts to resolve the issue at hand and limit the extent of its damage. How a company handles a crisis has an enormous impact on how it comes out of the incident; people remember how the organization dealt with and reacted to the issue, and how convincing its efforts to make things right were. For this reason, crisis management communications must be engaged at the earliest opportunity.

Service disruption is damaging to all businesses not only in terms of immediate revenue loss but also in the longer term brand and reputational impact. The business continuity plan is an essential, living document that aims to protect the ongoing sustainability of the business. Those that plan and execute well will see better performance in the long-run and be best-placed to weather the storms, whatever form they take.

By Vibhav Agarwal

Is Machine Learning Making Your Data Scientists Obsolete?


Machine Learning and Data Scientists

In a recent study, almost all the businesses surveyed stated that big data analytics were fundamental to their business strategies. Although the field of computer and information research scientists is growing faster than any other occupation, the increasing applicability of data science across business sectors is leading to an exponential deficit between supply and demand.

When a 2012 article in the Harvard Business Review, co-written by U.S. chief data scientist DJ Patil, declared the role of data scientist “the sexiest job of the 21st century,” it sparked a frenzy of hiring people with an understanding of data analysis. Even today, enterprises are scrambling to identify and build analytics teams that can not only analyze the data received from a multitude of human and machine sources, but also can put it to work creatively.

One of the key areas of concern has been the ability of machines to gain cognitive power as their intelligence capacities increase. Beyond the ability to leverage data to disrupt multiple white-collar professions, signs that machine learning has matured enough to execute roles traditionally done by data scientists are increasing. After all, advances in deep learning are automating the time-consuming and challenging tasks of feature engineering.

While reflecting on the increasing power of machine learning, one disconcerting question comes to mind: Would advances in machine learning make data scientists obsolete?

The Day the Machines Take Over


Advances in the development of machine learning platforms from leaders like Microsoft, Google, and a range of startups mean that a lot of work done by data scientists would be very amenable to automation — including multiple steps in data cleansing, determination of optimal features, and development of domain-specific variations for predictive models.

With these platforms’ increasing maturity and ability to create market-standard models and data-exchange interfaces, the focus shifts toward tapping machine-learning algorithms with a “black box” approach and away from worrying about the internal complexities.
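As a rough illustration of the kind of step being automated, consider the determination of optimal features. The toy univariate filter below (not any vendor's actual method, and using invented synthetic data) ranks features by their correlation with the target and keeps the strongest:

```python
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_features(rows, target, k):
    """Rank feature columns by |correlation| with the target and keep
    the top k -- a toy stand-in for automated feature determination."""
    scores = []
    for j in range(len(rows[0])):
        column = [row[j] for row in rows]
        scores.append((abs(pearson(column, target)), j))
    return [j for _, j in sorted(scores, reverse=True)[:k]]

# Synthetic data: feature 0 drives the target, features 1 and 2 are noise.
random.seed(42)
rows = [[random.random(), random.random(), random.random()] for _ in range(200)]
target = [row[0] * 2 + random.gauss(0, 0.05) for row in rows]
print(select_features(rows, target, 1))  # feature 0 should be selected
```

Production platforms do far more (interactions, regularization, cross-validation), but the appeal is the same: the machine, not the data scientist, grinds through candidate features.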

However, as with any breakthrough technology, we need to recognize that the impact of the technology is limited unless it is well-integrated into the overall business flow. Some of the most successful innovations have been driven not by a single breakthrough technology but by reimagining an end-to-end business process through creative integration of multiple existing components. Uber and Netflix offer prime examples of intelligence gleaned from data being integrated seamlessly into a company’s process flow. Data scientists play a key role in this by leveraging data to orchestrate processes for better customer experience and by optimizing through continuous experimentation.

While organizations across industries increasingly see a more strategic role for data, they often lack clarity around how to make it work. Their tendency to miss the big picture by looking for “easy wins” and working with traditional data sources means that data scientists have an opportunity to help frame problems and to clearly articulate the “realm of the possible.”

From Data to Strategy

It is easy to get carried away by the hype and assume that machine learning will be a panacea, solving every problem and sweeping aside the roles of data science practitioners. However, let us recall the AI winters in the mid-’70s, and later in the ’90s, when the journey to the “promised land” did not pan out.


Today, we don’t see the same concerns as in the past — lack of data, data storage costs, limitations of compute power — but we still find true challenges in identifying the right use cases and applying AI in a creative fashion. At the highest level, it helps to understand that machine learning capability needs to translate into one of two outcomes:

  • Interaction: Understanding user needs and building better and more seamless engagement
  • Execution: Meeting customer needs in the most optimal manner with ability to self-correct and fine-tune

Stakeholder management becomes extremely important throughout the process. Framing key business problems as amenable to data-led decision-making (in lieu of traditional gut feel) to secure stakeholder buy-in is critical. Consequently, multiple groups need to be involved in identifying the right set of data sources (or best alternatives) while staying conscious of data governance and privacy considerations. Finally, stakeholders need to be fully engaged to ensure that the insights feed into business processes.

Data Scientists Become Core Change Agents

Given the hype surrounding big data analytics, data scientists need to manage responses that fall on opposite ends of the spectrum by tempering extreme optimism and handling skepticism. A combination of the following skills that go beyond platforms and technology are thus needed:

  • Framing solutions to business problems as hypotheses that will require experimentation, incorporating user input as critical feedback
  • Identifying parameters by which outcomes can be judged and being sensitive to the need for learning and iteration
  • Safeguarding against correlations being read as causal factors
  • Ensuring the right framework for data use and governance, given the potential for misuse
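The correlation-versus-causation safeguard can be made concrete with a small simulation: two series driven by a hidden confounder correlate strongly even though neither causes the other. All numbers below are synthetic, invented for illustration.

```python
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(7)
# A hidden confounder (summer temperature) drives both series;
# neither series causes the other.
temperature = [random.uniform(15, 35) for _ in range(500)]
ice_cream_sales = [10 * t + random.gauss(0, 5) for t in temperature]
sunburn_cases = [3 * t + random.gauss(0, 3) for t in temperature]

r = pearson(ice_cream_sales, sunburn_cases)
print(round(r, 2))  # strong correlation, no causal link between the two
```

A model trained naively on such data would happily "predict" sunburn from ice cream sales; it takes a data scientist to ask whether a confounder is doing the work.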

This requires pivoting a data scientist’s remit in a company from a pure data-analysis function into a more consultative role, engaging across business functions. Data scientists are not becoming obsolete. They are becoming bigger, more powerful, and more central to organizations, morphing from technicians into change agents through the use of data.

By Guha Ramasubramanian

Guha heads Corporate Business Development at Wipro Technologies and is focused on two strategic themes at the intersection of technology and business: cybersecurity framed from a business risk perspective and how to leverage machine learning for business transformation.

Guha is currently leading the development and deployment of Apollo, an anomaly detection platform that seeks to mitigate risk and improve process velocity through smarter detection.

Big Data Comes to Bear on Healthcare


Big Data Healthcare

The National Institutes of Health (NIH) is examining the use of big data for infectious disease surveillance, exploring the use of information taken from social media, electronic health records, and a range of other digital sources to provide detailed and well-timed intelligence around infectious disease threats and outbreaks that traditional surveillance methods cannot. On the other side of the disease spectrum, big data analytics is also helping with the management of diabetes, a disease affecting over 422 million people globally and resulting in 1.5 million deaths per year according to the World Health Organization. Today, big data and big data analytics are delivering a range of innovative health care options, as well as disease monitoring and prevention tools that better the wellbeing of the world’s population.

Where is All This Data Coming From?

Thanks to the digitization of records, the spreading use of sensors, and the prolific use of mobile and standard computing devices, the data that is collected and recorded today is immense. But just because all of this data exists doesn’t mean it’s necessarily useful. Consider the ‘information’ gleaned from Twitter and Facebook posts, Snapchat and text messages, and Google and Siri question and answer sessions: certainly, some of that will be relevant to someone, but the sheer volume of non-qualitative data available can sometimes be a deterrent. Fortunately, the technology that’s evolving to collect all of this data is working hand in hand with big data management and analytics tech to ensure value.

Today, big data applications can predict future actions, and with the widespread use of Internet of Things tech, personalized data can be collected and monitored on an individual level. Applications such as Google Trends provide practical methods for using big data, while big data analytics helps navigate and utilize unstructured data that might otherwise seem irrelevant. And thanks to tools such as Hadoop, developers are able to construct predictive models that help organizations understand user responses and better tailor applications to these results.

Solutions for Healthcare

Already big data plays a role in biomedicine, advancing methodologies and skills and creating new cultures and modes of discovery. Some experts, in fact, believe the advances in medicine suggest we’ll be facing disruption in the industry as new systems and approaches prove their worth. Precision medicine initiatives already involve over a million volunteers in the US alone, along with several NIH-funded cohorts, and it’s likely that we’ll see the sharing of lifestyle information, genomic data, and biological samples linked to electronic health records as these schemes search for superior health care solutions. The benefits of these initiatives are collaborative and cooperative science, more efficient and better-funded research enterprises, and training advances, but all of this needs to be carefully balanced with the necessary privacy and security demands of big data.

Other advantages provided by big data analysis include a better understanding of rare diseases through the precision provided by aggregated integrated data, as well as predictive modeling able to advance diagnosis of illnesses and diseases both common and rare. Though many opportunities available through big data and big data analytics require a particular cultural shift, our high-tech environment already encourages this change.

Concerns and Further Investigations

Although experts see potential in the use of big data in the healthcare field, we’re also cautioned that unconventional data streams may lack necessary demographic identifiers or provide information that underrepresents particular groups. Further, social media can’t always be relied on as a stable data source. Nevertheless, big data research continues in many unique health care areas: multiple studies are investigating social media and online health forums for drug use and the existence of adverse reactions; one European surveillance system is collecting crowdsourced data on influenza; ResistanceOpen monitors antibiotic resistance at regional levels; and many others provide unique insight into our healthcare systems. The combination of traditional and digital disease surveillance methods is promising; as Professor Shweta Bansal of Georgetown University says, “There’s a magnitude of difference between what we need and what we have, so our hope is that big data will help us fill this gap.”
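At its simplest, the keyword-based surveillance described above amounts to counting symptom mentions over time. The sketch below is a toy version of that idea; the keyword list and posts are invented, and a real system would pull from a social-media API and correct for volume and demographics.

```python
import datetime
from collections import Counter

FLU_TERMS = {"flu", "fever", "cough", "influenza", "chills"}

def weekly_signal(posts):
    """Count posts containing flu-related keywords per ISO week --
    a toy version of keyword-based digital disease surveillance."""
    counts = Counter()
    for date, text in posts:
        words = {w.strip(".,!?").lower() for w in text.split()}
        if words & FLU_TERMS:
            counts[date.isocalendar()[1]] += 1
    return counts

# Hypothetical posts; in practice these would come from a social stream.
posts = [
    (datetime.date(2017, 1, 2), "Terrible fever and chills today"),
    (datetime.date(2017, 1, 3), "Stuck at home with the flu"),
    (datetime.date(2017, 1, 4), "New phone arrived!"),
    (datetime.date(2017, 1, 10), "This cough will not go away"),
]
print(weekly_signal(posts))  # flu-related mentions per ISO week
```

A rising week-over-week count is the kind of early signal that, combined with traditional reporting, researchers hope can flag outbreaks sooner.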

By Jennifer Klostermann

Net Neutrality Part 1 – What Is Net Neutrality And Why Is It So Important?


Net Neutrality

Net neutrality is a concept that has been at the centre of much debate recently. It is based on the idea that all internet traffic should be treated equally by your internet service provider. The fact that all traffic has (in most countries across the world) been treated equally is one of the greatest strengths of the internet; it is a level playing field.

However, there is a risk that this could all change. Large ISPs like AT&T, Verizon, and Comcast want to classify different types of traffic and treat them differently, so they can charge you more depending on what you use, and potentially even restrict content they disagree with.

There are several fundamental fears that drive people to advocate for net neutrality. Firstly, they fear that without it, ISPs (Internet Service Providers) could provide a “fast lane” to “favoured” (and, many suspect, sponsored) content. They also fear that it could restrict market accessibility by preventing small businesses from getting a foot on the ladder, and thus lead to ever more entrenched monopolies. Finally, they worry that some ISPs would strategically slow the speed of websites that can’t afford to pay (or that don’t want to), holding young businesses to ransom through poor service.
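To see what “classifying traffic and treating it differently” means mechanically, here is a minimal sketch of a token-bucket shaper, a standard traffic-shaping primitive. The traffic classes, rates and the idea of a sponsored lane are hypothetical illustrations, not taken from any actual ISP's configuration.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: each traffic class refills tokens at
    its own rate, so a 'fast lane' is simply a more generous bucket."""
    def __init__(self, rate_bytes_per_sec, capacity):
        self.rate = rate_bytes_per_sec
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True   # forwarded at full speed
        return False      # queued or dropped: the "slow lane"

# Hypothetical policy: sponsored video gets 10x everyone else's bandwidth.
lanes = {
    "sponsored-video": TokenBucket(rate_bytes_per_sec=10_000_000,
                                   capacity=1_500_000),
    "everyone-else":   TokenBucket(rate_bytes_per_sec=1_000_000,
                                   capacity=150_000),
}

def forward(traffic_class, packet_bytes):
    return lanes[traffic_class].allow(packet_bytes)
```

The point of the sketch is that nothing technical stops this policy; only neutrality rules (or competition) do.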

There are a number of independent sites and organisations that promote and advocate for net neutrality and seek to spread the word about its importance to society. They argue that free and open internet is important because:

  1. Free and open internet is the single greatest technology of our time, and control should not be at the mercy of corporations.
  2. Free and open internet stimulates ISP competition.
  3. Free and open internet helps prevent unfair pricing practices.
  4. Free and open internet promotes innovation.
  5. Free and open internet promotes the spread of ideas.
  6. Free and open internet drives entrepreneurship.
  7. Free and open internet protects freedom of speech.

The site was created by entrepreneur Michael Ciarlo, and his story is a perfect example of why net neutrality is so important: without it, many people might never have found his site, and his message would have been lost. A level playing field of access and bandwidth is key to ensuring the internet continues to promote innovation and collaboration across the world. Advocates suggest that there is a need to preserve the openness of the internet in order to ensure its continued success. They argue that

“The open internet has fostered unprecedented creativity, innovation and access to knowledge and to other kinds of social, economic, cultural, and political opportunities across the globe”

No one disputes that service providers should be able to manage their networks. However, a line needs to be drawn to prevent service providers from becoming the gatekeepers of internet content, limiting access to certain parts of the internet, or even trying to police or censor our online experience.

As you can see from the map here, not many countries so far have put actual legislation in place. Even so, access around the world is at the minute essentially unrestricted and completely neutral in most countries, as you can see from this map, which was assembled using internet speed data rather than legal precedent or legislation. Nevertheless, there have been efforts to legally secure this trend and ensure the survival of net neutrality.


Brazil has arguably been the most successful at codifying and protecting net neutrality, through the passing of what was dubbed Brazil’s “Internet Constitution”. The legislation was adopted in April 2014; it prevents ISPs from charging higher rates for access to content that requires more bandwidth, such as Netflix, limits the gathering of metadata, and holds corporations like Facebook and Google accountable for the security of Brazilians’ data, even when it is stored abroad.

I like to think of the internet as a globalised and modernised version of the American dream. It can provide a fresh start or blank canvas for anyone who wants to sell their product, spread their message, or just go viral with a home video. The quality of the content is what matters, not where it came from.

Keep an eye out for the next part of the series, where we will be exploring the arguments against net neutrality….

By Josh Hamilton

Tesco Bank Breach – Why Fintech Security Is Imperative

Tesco Bank Breach – Why Fintech Security Is Imperative

Fintech Security 

Thousands of Tesco Bank accounts were attacked by fraudsters just days ago, and as a result the online payments of customers’ current accounts were frozen; though regular services are being restored, online and contactless transactions have been suspended. Dubious transactions were apparently seen on around 40,000 accounts, and initial reports suggested theft from 20,000 Tesco Bank clients. However, the latest information available suggests that only 9,000 accounts were involved; although the bank quickly reimbursed them, a total of £2.5 million was pilfered from these luckless clients in the attack. It is a disturbing episode, occurring just as many consumers are beginning to trust some of fintech’s more reputable products.

What Went Wrong?

Details are still scarce, but it’s speculated that this security breach is due to human error (deliberate or accidental) and/or poor data sharing controls. Currently, the National Cyber Security Centre is working with the National Crime Agency as investigations into Britain’s largest banking cyber-attack proceed. Sadly, we should by no means consider this attack a fluke. CloudTweaks received exclusive comment from Kevin O’Brien, CEO and founder of cyber security platform GreatHorn, who says, “Breaches like this are possible in the U.S. in part because bank security routines for debit transactions are woefully inadequate. Even chip-and-pin technology won’t stop this type of threat; perimeter security that protects against access to card data is a good start, but absent behavioral analytics around account usage, fraudulent transactions will generally not be detected or prevented.”
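The behavioral analytics O’Brien refers to can be as simple as comparing each new debit against an account’s historical spending pattern. The sketch below is purely illustrative (the z-score test, threshold, and data shapes are my assumptions, not GreatHorn’s method):

```python
from statistics import mean, stdev

def is_anomalous(history, amount, z_threshold=3.0):
    """Flag a debit whose amount deviates sharply from the
    account's historical spending pattern (simple z-score test)."""
    if len(history) < 5:          # too little history to judge
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

# An account that normally sees small grocery-sized debits:
history = [12.50, 9.99, 14.20, 11.00, 13.75, 10.40]
print(is_anomalous(history, 11.80))   # typical amount -> False
print(is_anomalous(history, 950.00))  # sudden large drawdown -> True
```

A production system would of course model far more signals (merchant, location, velocity), but even this toy check catches the kind of out-of-pattern drawdown that perimeter security alone misses.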

The Threat to Average Consumers

In a case such as this, consumers are left with very little recourse; though stolen funds are being returned to Tesco Bank clients, it’s understood that there was absolutely no client error involved and nothing any of them could have done to better secure their accounts. Says O’Brien, “One of the primary threats to consumers is around illicit use of their debit accounts; seeing this kind of attack compromise a major retailer suggests that we will see an increase in the amount of fraud that is directed at regular users, and likely both immediately and over the long term. One common approach is for thieves to place very small debits against stolen cards, confirming that the cards themselves work, and then follow it with larger drawdown charges months or even years later.”
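The card-testing pattern O’Brien describes (a tiny probe debit, then a large drawdown much later) is mechanical enough to screen for. This is a hypothetical sketch; the thresholds and the transaction format are illustrative assumptions:

```python
from datetime import datetime

def card_testing_alerts(transactions, probe_limit=1.00, large_multiple=50):
    """Flag large charges that follow a tiny 'probe' debit on the
    same card. `transactions` is a list of (timestamp, card_id,
    amount) tuples, oldest first."""
    probes = {}   # card_id -> time of the small test debit
    alerts = []
    for ts, card, amount in transactions:
        if amount <= probe_limit:
            probes[card] = ts                 # possible card test
        elif card in probes and amount >= probe_limit * large_multiple:
            alerts.append((card, ts, amount))  # drawdown after a probe
    return alerts

txns = [
    (datetime(2016, 11, 1), "card-A", 0.50),    # probe debit
    (datetime(2017, 2, 14), "card-A", 480.00),  # drawdown months later
    (datetime(2016, 11, 2), "card-B", 24.99),   # normal spend
]
print(card_testing_alerts(txns))  # flags only card-A's large charge
```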

Tesco Bank chief executive Benny Higgins has assured customers that no personal data has been compromised, a relief for the victims of this latest fraud, but a reminder that the threat of data theft is very real in attacks of this nature. Unfortunately, the technology we trust needs a fair amount of supervision on our part, and just because our fintech products are backed by a respected and reputable player doesn’t mean they’re failsafe.

What’s Next?

A warning for the fintech sector, the Tesco Bank cyber-attack will hopefully encourage new and established organizations to implement more stringent controls. O’Brien remarks, “Overall, this type of threat is a significant one, and should be a warning to the industry that better (and more automated) analysis of security-related activity is a requisite for a modern security posture.” Regrettably, such a fiasco is likely to dent the general consumer’s opinion of fintech products; that developing trust was hard-won to begin with, and with cybercrime driving up the financial costs fintech firms face, it’s possible that this attack and others like it will push customers back to traditional financial systems. Then again, some of our brightest tech talent animates the fintech industry, so with the right regulations and judicious development we can perhaps expect products that are both innovative and unfailingly secure.

By Jennifer Klostermann

The Dark Side of AI Part 2 – What Are We Doing About It?

The Dark Side of AI Part 2 – What Are We Doing About It?

What Are We Doing About It? 

In a way, the creation of Artificial Intelligence (The Dark Side of AI Part 1) has become this century’s nuclear power. Nuclear power could help fuel the world far more sustainably than fossil fuels, but once it exists the threat of nuclear war is always possible; this risk-versus-reward trade-off is mirrored in the search to create artificial intelligence.

In his book Humans Need Not Apply, author Jerry Kaplan suggests that humanity’s future is inside a zoo run by “synthetic intelligences“. He suggests that rather than enslave us, A.I.s are much more likely to keep us on some sort of reserve and give us very little reason to want to leave. Because of the horrifying nature of these scenarios, so often associated with the creation of AI, scientists have long grappled with how to contain the possibility of an out-of-control A.I.

Isaac Asimov proposed three laws of robotics in his 1942 science fiction short story Runaround, laws that many have suggested could be the answer to preventing an AI uprising or takeover (the same three laws that appear in I, Robot…).

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
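The three laws form a strict priority ordering: each later law yields to the ones above it. A deliberately naive action filter might encode that ordering as below; the boolean fields are hypothetical, and the sketch mostly illustrates why such encodings fail, since deciding what counts as “harm” is the hard part:

```python
def action_permitted(action):
    """Return True if `action` passes Asimov's three laws, checked in
    priority order. `action` is a dict of illustrative boolean fields."""
    # First Law: no harm to humans, by act or by inaction.
    if action["harms_human"] or action["allows_harm_by_inaction"]:
        return False
    # Second Law: obey humans, unless that conflicts with the First Law
    # (the conflict case is already excluded by the check above).
    if action["disobeys_human_order"]:
        return False
    # Third Law: self-preservation, subordinate to the first two laws.
    if action["endangers_self_needlessly"]:
        return False
    return True
```

Every field here hides an unresolved judgement call, which is exactly the ambiguity Asimov’s stories exploit.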

However, these laws were designed to be flawed and deliberately vague; otherwise Asimov’s books wouldn’t have made for great reading. The books explore the imperfections, loopholes, and ambiguities inherent to these laws, and ironically they have only taught us how not to deal with A.I. But are there ways to protect ourselves? Or are we all doomed to live in a human zoo?

The UN have recently been discussing a ban on the use of autonomous weapons, in an attempt to head off the prospect of A.I.-versus-A.I. warfare, and have declared that humans must always have meaningful control over machines. Yet that doesn’t ultimately protect us; the UN have notoriously little power, so it is up to science to provide a solution!

A.I. Fears


Many leaders in the field of A.I. and deep learning have come together to sign an open letter, addressed to humanity and the scientific community, to confront the fears associated with Artificial Intelligence. The letter discusses the huge benefits that A.I. could provide and weighs them against the great dangers that come hand in hand with it; the overwhelming message that emerges is one of caution:

“The potential benefits are huge, since everything that civilization has to offer is a product of human intelligence….We recommend expanded research aimed at ensuring that increasingly capable AI systems are robust and beneficial: our AI systems must do what we want them to do.”

The letter itself has been signed by a wide spectrum of scientists and technologists from across the world, including Elon Musk, Stephen Hawking, Google’s director of research Peter Norvig, the co-founders of DeepMind (the British AI company purchased by Google in January 2014), MIT professors, and experts from technology’s biggest corporations, including IBM’s Watson supercomputer team and Microsoft Research. This type of global collaboration and initiative is key to maintaining control of our creations.

This group are not the only people concerned about the implications of creating Artificial Intelligence; many others are actively working on practical solutions. Perhaps the most famous is Google’s work on an A.I. “kill switch”. Developers at DeepMind, Google’s artificial intelligence division, have collaborated with Oxford University researchers to develop a way for humans to keep the upper hand over super-intelligent computers. According to Google’s Laurent Orseau and Stuart Armstrong, members of the Future of Humanity Institute, humanity might need some sort of “big red button” to stop AI from carrying out a “harmful sequence of actions” – in other words, humans hold the ultimate trump card against a rogue A.I.
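In its simplest form, the “big red button” is an externally controlled stop flag that the agent must consult before every action. The toy sketch below is my own illustration, not DeepMind’s design; the hard part of the actual research is ensuring a learning agent never develops an incentive to resist or circumvent the button:

```python
import threading

class InterruptibleAgent:
    """Toy sketch of the 'big red button': the agent checks an
    externally controlled stop flag before each action, so a human
    can always halt the remaining sequence."""
    def __init__(self):
        self._stop = threading.Event()   # the "big red button"
        self.actions_taken = []

    def press_big_red_button(self):
        self._stop.set()

    def run(self, planned_actions):
        for action in planned_actions:
            if self._stop.is_set():      # human override always wins
                break
            self.actions_taken.append(action)
        return self.actions_taken

agent = InterruptibleAgent()
agent.press_big_red_button()
print(agent.run(["step-1", "step-2"]))  # -> [] : nothing executes
```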

The most open and positively reviewed approach to this problem has been the transparent and collaborative one fostered by the Future of Life Institute (FLI). David Parkes, an FLI researcher and Harvard professor, has been attempting to teach the A.I.s in a system to compromise, helping them reason and work together. This research will only become more important as computing power develops.

The best way to ensure that A.I. doesn’t lead to an end to humanity is to create an open and collaborative environment for research and development. Google, Facebook and Microsoft all have researchers exploring machine learning and artificial intelligence techniques, and competitors at OpenAI worry that the currently open nature of this research will close off as the findings grow exponentially in value. Humanity must remain united on this problem, or else, much like the development of nuclear weapons, this could spell the end of the world.

By Josh Hamilton

CloudTweaks Comics
4 Different Types of Attacks – Understanding the “Insider Threat”

Understanding the “Insider Threat”  The revelations that last month’s Sony hack was likely caused by a disgruntled former employee have put a renewed spotlight on the insider threat. The insider threat first received attention after Edward Snowden began to release all sorts of confidential information regarding national security. While many called him a hero, what…

Cloud Computing Then & Now

The Evolving Cloud  From as early as the onset of modern computing, the possibility of resource distribution has been explored. Today’s cloud computing environment goes well beyond what most could even have imagined at the birth of modern computing and innovation in the field isn’t slowing. A Brief History Matillion’s interactive timeline of cloud begins…

Cloud Computing Services Perfect For Your Startup

Cloud Computing Services Chances are if you’re working for a startup or smaller company, you don’t have a robust IT department. You’d be lucky to even have a couple IT specialists. It’s not that smaller companies are ignoring the value and importance of IT, but with limited resources, they can’t afford to focus on anything…

The Storytelling Machine: Big Content and Big Data

Bridging The Gap Between Big Content and Big Data Advances in cloud computing, along with the big data movement, have transformed the business IT landscape. Leveraging the cloud, companies are now afforded on demand capacity and mobile accessibility to their business-critical systems and information. At the same time, the amount of structured and unstructured data…

Internet Of Things – Industrial Robots And Virtual Monitoring

Internet Of Things – Industrial Robots And Virtual Monitoring One of the hottest topics in Information and Communication Technology (ICT) is the Internet of Things (IOT). According to the report of International Telecommunication Union (2012), “the Internet of things can be perceived as a vision with technological and societal implications. It is considered as a…

Digital Marketing Hubs And The Cloud

Digital Market Hubs Gartner’s recently released research, Magic Quadrant for Digital Marketing Hubs, recognizes the big four marketing cloud vendors as leaders, but also points to many challengers. Adobe, Marketo, Oracle, and Salesforce inhabit the leader’s block of the Magic Quadrant, reflecting both their growing capabilities as well as marketing technology platform scopes. Gartner believes…

The Monstrous IoT Connected Cloud Market

What’s Missing in the IoT? While the Internet of Things has become a popular concept among tech crowds, the consumer IoT remains fragmented. Top companies continue to battle to decide who will be the epicenter of the smart home of the future, creating separate ecosystems (like the iOS and Android smartphone market) in their wake.…

Do Small Businesses Need Cloud Storage Service?

Cloud Storage Services Not using cloud storage for your business yet? Cloud storage provides small businesses like yours with several advantages. Start using one now and look forward to the following benefits: Easy back-up of files According to Practicalecommerce, it provides small businesses with a way to back up their documents and files. No need…

Using Big Data To Make Cities Smarter

Using Big Data To Make Cities Smarter The city of the future is impeccably documented. Sensors are used to measure air quality, traffic patterns, and crowd movement. Emerging neighborhoods are quickly recognized, public safety threats are found via social networks, and emergencies are dealt with more quickly. Crowdsourcing reduces commuting times, provides people with better transportation…

New Report Finds 1 Out Of 3 Sites Are Vulnerable To Malware

1 Out Of 3 Sites Are Vulnerable To Malware A new report published this morning by Menlo Security has alarmingly suggested that at least a third of the top 1,000,000 websites in the world are at risk of being infected by malware. While it’s worth prefacing the findings with the fact Menlo used Alexa to…

Cloud Security Risks: The Top 8 According To ENISA

Cloud Security Risks Do cloud security risks ever bother you? It would be weird if they didn’t. Cloud computing has a lot of benefits, but also a lot of risks if done in the wrong way. So what are the most important risks? The European Network Information Security Agency did extensive research on that, and…

5 Things To Consider About Your Next Enterprise Sharing Solution

Enterprise File Sharing Solution Businesses have varying file sharing needs. Large, multi-regional businesses need to synchronize folders across a large number of sites, whereas small businesses may only need to support a handful of users in a single site. Construction or advertising firms require sharing and collaboration with very large (several Gigabytes) files. Financial services…

The Cancer Moonshot: Collaboration Is Key

Cancer Moonshot In his final State of the Union address in January 2016, President Obama announced a new American “moonshot” effort: finding a cure for cancer. The term “moonshot” comes from one of America’s greatest achievements, the moon landing. If the scientific community can achieve that kind of feat, then surely it can rally around…

Using Private Cloud Architecture For Multi-Tier Applications

Cloud Architecture These days, Multi-Tier Applications are the norm. From SharePoint’s front-end/back-end configuration, to LAMP-based websites using multiple servers to handle different functions, a multitude of apps require public and private-facing components to work in tandem. Placing these apps in entirely public-facing platforms and networks simplifies the process, but at the cost of security vulnerabilities. Locating everything…

The Importance of Cloud Backups: Guarding Your Data Against Hackers

The Importance of Cloud Backups Cloud platforms have become a necessary part of modern business with the benefits far outweighing the risks. However, the risks are real and account for billions of dollars in losses across the globe per year. If you’ve been hacked, you’re not alone. Here are some other companies in the past…

3 Keys To Keeping Your Online Data Accessible

Online Data Data storage is often a real headache for businesses. Additionally, the shift to the cloud in response to storage challenges has caused security teams to struggle to reorient, leaving 49 percent of organizations doubting their experts’ ability to adapt. Even so, decision makers should not put off moving from old legacy systems to…

Cloud-Based or On-Premise ERP Deployment? Find Out

ERP Deployment You know how ERP deployment can improve processes within your supply chain, and the things to keep in mind when implementing an ERP system. But do you know if cloud-based or on-premise ERP deployment is better for your company or industry? While cloud computing is becoming more and more popular, it is worth…

Three Tips To Simplify Governance, Risk and Compliance

Governance, Risk and Compliance Businesses are under pressure to deliver against a backdrop of evolving regulations and security threats. In the face of such challenges they strive to perform better, be leaner, cut costs and be more efficient. Effective governance, risk and compliance (GRC) can help preserve the business’ corporate integrity and protect the brand,…

Is Machine Learning Making Your Data Scientists Obsolete?

Machine Learning and Data Scientists In a recent study, almost all the businesses surveyed stated that big data analytics were fundamental to their business strategies. Although the field of computer and information research scientists is growing faster than any other occupation, the increasing applicability of data science across business sectors is leading to an exponential…