Category Archives: Big Data

Microsoft Releases Beta of Microsoft Cognitive Toolkit For Deep Learning Advances

Microsoft Cognitive Toolkit

Microsoft has released an updated version of Microsoft Cognitive Toolkit, a system for deep learning that is used to speed advances in areas such as speech and image recognition and search relevance on CPUs and NVIDIA® GPUs.

The toolkit, previously known as CNTK, was initially developed by computer scientists at Microsoft who wanted a tool to do their own research more quickly and effectively. It quickly moved beyond speech and morphed into an offering that customers – including a leading international appliance maker and Microsoft’s flagship product groups – depend on for a wide variety of deep learning tasks.

“We’ve taken it from a research tool to something that works in a production setting,” said Frank Seide, a principal researcher at Microsoft Artificial Intelligence and Research and a key architect of Microsoft Cognitive Toolkit.

The latest version of the toolkit, which is available on GitHub via an open source license, includes new functionality that lets developers use the Python or C++ programming languages when working with the toolkit. With the new version, researchers can also do a type of artificial intelligence work called reinforcement learning.

Finally, the toolkit delivers better performance than previous versions. It’s also faster than other toolkits, especially when working on big datasets across multiple machines. That kind of large-scale deployment is necessary for the type of deep learning across multiple GPUs that is needed to develop consumer products and professional offerings…

Read Full Article: Microsoft

A President’s Trove of Data

Then vs Now

According to popular opinion, today’s information age affords teens more information than some world leaders had access to 20 years ago. C+R Research has put this hypothesis through its paces, comparing access to information across areas such as private data, classified information, genetics, and public opinion, and finds that in many ways the average smartphone user does, in fact, have access to far more information than those with the highest clearance had two decades ago. However, the accuracy and quality of the data available don’t necessarily compare.

Critical Information vs. the Non-Essentials

C+R Research finds that just about any 13-year-old with a smartphone in 2016 would beat President Bill Clinton’s 1996 intelligence and access in areas such as traffic data, music, trivia, opinion, and even genetics. But then, the president of the United States might not have time to listen to the 30 million songs immediately accessible via Spotify, nor would Air Force One likely be constrained by the same traffic limitations as the rest of us. Of course, political campaign teams of 20 years ago would have drooled over the polling possibilities that Twitter offers today.


On the other hand, President Clinton would have had better access to classified information, data from satellites, and research journals, as well as to private data – though there are rules governing such access, some very important people tend to be ‘incorporated’ into those regulations. Happily, or unhappily, depending on how much privacy you desire, tracking of family members by the Secret Service in 1996 was about as proficient as the smartphone apps we use today to monitor friends and family.

In the end, the 13-year-old wins 7 to 5 in the most general terms, but it’s important to recognize that the broad scope of information available today doesn’t necessarily point to accurate or significant information, two traits President Clinton could be sure of.

By Jennifer Klostermann

Blockchain and the IoT

IoT Blockchain

Blockchain, also known as Distributed Ledger Technology (DLT), is the innovative technology behind Bitcoin. The impact of Bitcoin has been tremendous and, as with any revolutionary technology, it was initially treated with awe and apprehension. Since its open source release back in 2009, Bitcoin has become a transformative force in the global payments system, establishing itself without the aid or support of the traditional financial infrastructure. While initial usage saw huge success in black markets, Bitcoin defied the odds, and the blockchain technology behind it spawned other cryptocurrencies, exchanges, commercial ventures, alliances, consortiums, investments, and uptake by governments, merchants, and financial services worldwide.


On August 12, the World Economic Forum (WEF) published a report on the future of financial infrastructure, and in particular on the transformative role that blockchain technology is set to play. Notably, it analyzes the technology’s impact on the financial services industry and how it can provide more transparency and security. Potential use cases are examined, including insurance, deposits and lending, capital raising, investment management, and market provisioning. The report also looks at the current challenges to widespread implementation of blockchain, many of which will require international legal frameworks, harmonized regulatory environments, and global standardization efforts.

DLT is already having a serious impact on the financial services industry. The WEF report states that 80% of banks will initiate a DLT project by next year, and more than $1.4 billion has already been invested in the technology in the past three years. More than that, governments and law firms are seriously staking their claim in advancing the technology. Law firm Steptoe & Johnson LLP recently announced the expansion of its Blockchain Team into a multidisciplinary practice involving FinTech, financial services, regulatory, and law enforcement knowledge. The firm is also one of the founders of the Blockchain Alliance, a coalition of blockchain companies and law enforcement and regulatory agencies, alongside the U.S. Chamber of Digital Commerce and Coin Center. This expansion is an endorsement of the potential of DLT, within and eventually beyond financial services.



The possible applications of blockchain are already being explored in numerous other sectors: energy, transportation, intellectual property, regulation and compliance, international trade, law enforcement, and government affairs, among many others. Ethereum is one blockchain endeavor that features smart contract functionality. The distributed computing platform provides a decentralized virtual machine to execute peer-to-peer contracts using the Ether cryptocurrency. The Ether Hack Camp is launching a four-week hackathon in November 2016 for DLT using Ether. Currently, the Camp is asking developers to propose ideas to the public; registered fans will vote on them, and those selected will take part in the hackathon. The ideas, already viewable online, are vast and varied, ranging from academic publishing without journals and music licensing reform to decentralized ISPs, voting on the blockchain, alternative dispute resolution, and rural land registries. The first-place idea in November will collect $50,000 USD.
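To make “smart contract functionality” concrete, here is a deliberately simplified Python sketch of the escrow-style logic such contracts encode. Real Ethereum contracts are written in languages like Solidity and executed by the network’s virtual machine; every name below is hypothetical and for illustration only.

```python
class ToyEscrow:
    """Illustrative escrow: funds are held by code and released
    only when the agreed condition is met (not real Ethereum)."""

    def __init__(self, buyer, seller, amount):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount      # imagine this as Ether locked in the contract
        self.delivered = False
        self.paid_out = False

    def confirm_delivery(self, caller):
        # Only the buyer may confirm that the goods arrived.
        if caller != self.buyer:
            raise PermissionError("only the buyer can confirm delivery")
        self.delivered = True

    def release(self):
        # Payment happens automatically once the condition holds;
        # no bank or other intermediary is involved.
        if self.delivered and not self.paid_out:
            self.paid_out = True
            return self.seller, self.amount
        raise RuntimeError("release conditions not met")
```

The point of the sketch is that the rules, once deployed, execute themselves: neither party can skip the delivery condition or pay out twice.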

IBM is one of the most dynamic forerunners currently pushing DLT for the IoT. The firm just announced it is investing $200 million in blockchain technology to drive forward its Watson IoT efforts, and is opening a new office in Germany to serve as headquarters for these blockchain initiatives. The investment is part of the $3 billion that IBM pledged to develop Watson’s cognitive computing for the IoT. The goal of the new investment is to enable companies to share IoT data in a private blockchain. A commercial implementation is already underway with Finnish company Kouvola Innovation, which wants to integrate its capabilities into the IBM Blockchain and link devices for tracking, monitoring, and reporting on shipping container status and location, optimizing packing and the transfer of shipments.

IBM is working hard to align its IoT, AI, and blockchain technologies through Watson. The new headquarters in Germany will be home to Cognitive IoT Collaboratories for researchers, developers, and engineers.

Many of IBM’s current projects are developed leveraging the open source fabric of the Hyperledger Project, a consortium founded by the Linux Foundation to which IBM is a significant contributor, alongside Cisco and Intel. IBM pushed its involvement even further with the June launch of its New York-based Bluemix Garage, which gives developers and researchers the opportunity to use IBM Cloud APIs and blockchain technologies to drive cognitive, IoT, unstructured-data, and social media innovation. Just one month later, IBM announced a cloud service for companies running blockchain technology. The service is underpinned by IBM’s LinuxONE technology, which is specifically designed to meet the security requirements of critical sectors such as financial services, healthcare, and government.

The potential for DLT is certainly broad and rather long-term, and the engagement by the financial services industry is a testament to that promise. While FinTech remains the big focus for blockchain technologies, success there will drive the use of DLT in other areas. The promise of blockchain is to deliver accountability and transparency, although this could be undermined significantly if announcements such as Accenture’s ‘editable’ blockchain become reality. While banks may welcome such a feature, it would be a serious blow to not only the integrity but also the security of blockchain technology.
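Why an ‘editable’ blockchain alarms technologists is easy to see in a toy hash-chain, sketched below in Python. Each block stores the hash of its predecessor, so changing any historical entry visibly breaks every later link. This is a conceptual illustration only, not Bitcoin’s actual data structures.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    # Each new block commits to the hash of the one before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain):
    # Valid only if every block points at the true hash of its predecessor.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
for tx in ["alice->bob:5", "bob->carol:2", "carol->dan:1"]:
    append_block(chain, tx)

print(verify(chain))                   # True
chain[0]["data"] = "alice->bob:500"    # "edit" history
print(verify(chain))                   # False: tampering is evident
```

In a real distributed ledger the same check is performed independently by every participant, which is where the accountability and transparency come from.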

By Michela Menting

Great Cloud Platforms Need to Win the Hearts and Minds of Developers First

Great Cloud Platforms 

Adoption of cloud computing services is growing rapidly all around the world. Companies are realizing that much of the hard, expensive work they used to do internally can now be outsourced to cloud providers, allowing them to focus on what they do best. That’s why tech research firm Gartner projects that over the next five years, the shift to the cloud will amount to a US$1-trillion market.

Everything from running payrolls to marketing, logistics, data analysis and much more is moving to the cloud, and one of the most successful uses of the cloud is Platform-as-a-Service (PaaS). PaaS enables customers to develop, run and manage their own applications without having to invest heavily in the infrastructure required to build and launch a web application.

The key for any platform provider is to win the hearts and minds of the web developers who choose which platform to build on. SAP, the world’s largest enterprise cloud company, with over 320,000 customers and over 110 million cloud users in 190 countries, is using its extensive experience and knowledge in the business space to offer the SAP HANA Cloud Platform, a remarkable service for companies of all sizes. This platform is already being used extensively by developers creating apps for their customers or for their own organizations and employees.


The SAP HANA Cloud Platform enables developers to build business applications in the cloud quickly and easily.

Three features of this platform stand out:

  1. its ability to extend your cloud and on-premise applications to develop customized hybrid solutions,
  2. the awesome feature allowing you to integrate applications seamlessly and securely to synchronize data and processes across cloud, on-premise and third-party applications, as well as
  3. the core feature which allows you to build new enterprise-ready applications rapidly with an open standards platform that brings out the best in developers.

The Director of Group Software at the Danone Group, Ralf Steinbach, says that “with SAP HANA Cloud Platforms, we can quickly develop beautiful, user-friendly applications that are opening new opportunities to connect our customers directly to our back-end systems.”

Cloud services are a rapidly expanding market, and research indicates there are over 150 PaaS offerings to choose from. Too often companies simply choose the PaaS of a cloud-service provider that they’re already working with, without exploring the offerings in-depth and with a long-term focus.

According to John Rymer of Forrester Research, there are three types of developers who make use of PaaS offerings to build apps:

  1. Coders, who want the ability to do it all themselves,
  2. DevOps developers who want the ability to do some coding if they need to but can also plug into some level of abstraction, and
  3. RapidDevs who don’t want to code at all but just to configure a task to the capabilities of the platform.

For each of these types of developers, the SAP HANA Cloud Platform delivers, thanks to its flexibility, while requiring fewer specialized skills and at a lower cost. That flexibility extends to the choice customers have between a private managed cloud, a public pay-as-you-go model, or public cloud infrastructure-as-a-service and platform-as-a-service.

In order for a platform to survive and thrive, it requires developers to regard it as the best choice for what they have to do on a daily basis: easily and quickly deploy applications that leverage a proven in-memory platform for next generation applications and analytics supported by a world-class technical team at every step of the way.

A great way to get started with SAP HANA Cloud Platform is with the user-based packages. Priced per user, they offer the flexibility to choose the package that best fits your needs. You can get started for as little as $25 / user / month and scale as you go, adding more users or upgrading to add more resources when you need them.

For a limited time, you can get 30% off SAP HANA Cloud Platform user-based packages on the SAP Store by using the promo code HCP30.

Sponsored spotlight series by SAP

By Jeremy Daniel

The Intelligent Industrial Revolution

AI Revolution

Prefatory Note: Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years. Some 1,500 AI startups have cropped up. This explosive growth has fueled demand for GTCs all over the world. So far, we’ve held events in Beijing, Taipei, Amsterdam, Tokyo, Seoul and Melbourne. Washington is set for this week and Mumbai next month. I kicked off four of the GTCs. Here’s a summary of what I talked about, what I learned and what I see in the near future as AI, the next wave in computing, revolutionizes one industry after another.

A New Era of Computing


Intelligent machines powered by AI computers that can learn, reason and interact with people are no longer science fiction. Today, a self-driving car powered by AI can meander through a country road at night and find its way. An AI-powered robot can learn motor skills through trial and error. This is truly an extraordinary time. In my three decades in the computer industry, none has held more potential, or been more fun. The era of AI has begun.

Our industry drives large-scale industrial and societal change. As computing evolves, new companies form, new products are built, our lives change. Looking back at the past couple of waves of computing, each was underpinned by a revolutionary computing model, a new architecture that expanded both the capabilities and reach of computing.

In 1995, the PC-Internet era was sparked by the convergence of low-cost microprocessors (CPUs), a standard operating system (Windows 95), and a new portal to a world of information (Yahoo!). The PC-Internet era brought the power of computing to about a billion people and realized Microsoft’s vision to put “a computer on every desk and in every home.” A decade later, the iPhone put “an Internet communications” device in our pockets. Coupled with the launch of Amazon’s AWS, the Mobile-Cloud era was born. A world of apps entered our daily lives and some 3 billion people enjoyed the freedom that mobile computing afforded.

Today, we stand at the beginning of the next era, the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognize patterns from massive amounts of data — has proven to be “unreasonably” effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionize every industry.

GPU Deep Learning “Big Bang”

Why now? As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just several days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition, beating all the human expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.

The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

An End-to-End Platform for a New Computing Model

As a new computing model, GPU deep learning is changing how software is developed and how it runs. In the past, software engineers crafted programs and meticulously coded algorithms. Now, algorithms learn from tons of real-world examples — software writes itself. Programming is about coding instruction. Deep learning is about creating and training neural networks. The network can then be deployed in a data center to infer, predict and classify from new data presented to it. Networks can also be deployed into intelligent devices like cameras, cars and robots to understand the world. With new experiences, new data is collected to further train and refine the neural network. Learnings from billions of devices make all the devices on the network more intelligent. Neural networks will reap the benefits of both the exponential advance of GPU processing and large network effects — that is, they will get smarter at a pace way faster than Moore’s Law.
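As a concrete illustration of that train-then-deploy loop, here is a minimal sketch in Python using PyTorch. The framework, toy data and tiny network are illustrative choices of this edit, not something named in the article.

```python
import torch
import torch.nn as nn

# A tiny network "learns" a rule from examples rather than being hand-coded with it.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Toy training data: the label is 1 exactly when the first feature is positive.
X = torch.randn(512, 4)
y = (X[:, 0] > 0).long()

for epoch in range(50):                 # training: adjust weights from data
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

model.eval()                            # inference: deploy on unseen data
with torch.no_grad():
    sample = torch.randn(1, 4)
    prediction = model(sample).argmax(dim=1).item()
    print(f"first feature {sample[0, 0]:+.2f} -> predicted class {prediction}")
```

The training loop is where the “software writes itself”: the very same code produces different behavior depending on the examples it is shown.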

Whereas the old computing model is “instruction processing” intensive, this new computing model requires massive “data processing.” To advance every aspect of AI, we’re building an end-to-end AI computing platform — one architecture that spans training, inference and the billions of intelligent devices that are coming our way.

Let’s start with training. Our new Pascal GPU is a $2 billion investment and the work of several thousand engineers over three years. It is the first GPU optimized for deep learning. Pascal can train networks that are 65 times larger or faster than the Kepler GPU that Alex Krizhevsky used in his paper. A single computer of eight Pascal GPUs connected by NVIDIA NVLink, the highest throughput interconnect ever created, can train a network faster than 250 traditional servers.

Soon, the tens of billions of internet queries made each day will require AI, which means that each query will require billions more math operations. The total load on cloud services will be enormous to ensure real-time responsiveness. For faster data center inference performance, we announced the Tesla P40 and P4 GPUs. P40 accelerates data center inference throughput by 40 times. P4 requires only 50 watts and is designed to accelerate 1U OCP servers, typical of hyperscale data centers. Software is a vital part of NVIDIA’s deep learning platform. For training, we have CUDA and cuDNN. For inferencing, we announced TensorRT, an optimizing inferencing engine. TensorRT improves performance without compromising accuracy by fusing operations within a layer and across layers, pruning low-contribution weights, reducing precision to FP16 or INT8, and many other techniques.
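One of the techniques listed above — reducing precision to INT8 — can be sketched in a few lines of NumPy. This shows only the general idea of symmetric linear quantization; it is not TensorRT’s actual algorithm or API.

```python
import numpy as np

w = np.random.randn(256).astype(np.float32)   # stand-in for a layer's FP32 weights

# Symmetric linear quantization: map the FP32 value range onto the int8 range.
scale = np.abs(w).max() / 127.0
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# At inference time the int8 values are rescaled back to approximate FP32.
w_restored = w_int8.astype(np.float32) * scale
print("worst-case rounding error:", np.abs(w - w_restored).max())
```

The payoff is that int8 weights take a quarter of the memory of FP32 and map onto much cheaper arithmetic, at the cost of the small rounding error printed above.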

Someday, billions of intelligent devices will take advantage of deep learning to perform seemingly intelligent tasks. Drones will autonomously navigate through a warehouse, find an item and pick it up. Portable medical instruments will use AI to diagnose blood samples onsite. Intelligent cameras will learn to alert us only to the circumstances that we care about. We created an energy-efficient AI supercomputer, Jetson TX1, for such intelligent IoT devices. A credit card-sized module, Jetson TX1 can reach 1 TeraFLOP FP16 performance using just 10 watts. It’s the same architecture as our most powerful GPUs and can run all the same software.

In short, we offer an end-to-end AI computing platform — from GPU to deep learning software and algorithms, from training systems to in-car AI computers, from cloud to data center to PC to robots. NVIDIA’s AI computing platform is everywhere.

AI Computing for Every Industry

Our end-to-end platform is the first step to ensuring that every industry can tap into AI. The global ecosystem for NVIDIA GPU deep learning has scaled out rapidly. Breakthrough results triggered a race to adopt AI for consumer internet services — search, recognition, recommendations, translation and more. Cloud service providers, from Alibaba and Amazon to IBM and Microsoft, make the NVIDIA GPU deep learning platform available to companies large and small. The world’s largest enterprise technology companies have configured servers based on NVIDIA GPUs. We were pleased to highlight strategic announcements along our GTC tour to address major industries:

AI Transportation: At $10 trillion, transportation is a massive industry that AI can transform. Autonomous vehicles can reduce accidents, improve the productivity of trucking and taxi services, and enable new mobility services. We announced that both Baidu and TomTom selected NVIDIA DRIVE PX 2 for self-driving cars. With each, we’re building an open “cloud-to-car” platform that includes an HD map, AI algorithms and an AI supercomputer.

Driving is a learned behavior that we do as second nature, yet one that is impossible to program a computer to perform. Autonomous driving requires every aspect of AI — perception of the surroundings, reasoning to determine the conditions of the environment, planning the best course of action, and continuously learning to improve our understanding of the vast and diverse world. The wide spectrum of autonomous driving requires an open, scalable architecture — from highway hands-free cruising, to autonomous drive-to-destination, to fully autonomous shuttles with no drivers.

NVIDIA DRIVE PX 2 is a scalable architecture that can span the entire range of AI for autonomous driving. At GTC, we announced DRIVE PX 2 AutoCruise designed for highway autonomous driving with continuous localization and mapping. We also released DriveWorks Alpha 1, our OS for self-driving cars that covers every aspect of autonomous driving — detection, localization, planning and action.

We bring all of our capabilities together into our own self-driving car, NVIDIA BB8.

NVIDIA is focused on innovation at the intersection of visual processing, AI and high performance computing — a unique combination at the heart of intelligent and autonomous machines. For the first time, we have AI algorithms that will make self-driving cars and autonomous robots possible. But they require a real-time, cost-effective computing platform.

At GTC, we introduced Xavier, the most ambitious single-chip computer we have ever undertaken — the world’s first AI supercomputer chip. Xavier packs 7 billion transistors — more complex than the most advanced server-class CPU. Miraculously, Xavier has the equivalent horsepower of DRIVE PX 2 launched at CES earlier this year — 20 trillion operations per second of deep learning performance — at just 20 watts. As Forbes noted, we doubled down on self-driving cars with Xavier.

  • AI Enterprise: IBM, which sees a $2 trillion opportunity in cognitive computing, announced a new POWER8 and NVIDIA Tesla P100 server designed to bring AI to the enterprise. On the software side, SAP announced that it has received two of the first NVIDIA DGX-1 supercomputers and is actively building machine learning enterprise solutions for its 320,000 customers in 190 countries.
  • AI City: There will be 1 billion cameras in the world in 2020. Hikvision, the world leader in surveillance systems, is using AI to help make our cities safer. It uses DGX-1 for network training and has built a breakthrough server, called “Blade,” based on 16 Jetson TX1 processors. Blade requires 1/20 the space and 1/10 the power of the 21 CPU-based servers of equivalent performance.
  • AI Factory: There are 2 billion industrial robots worldwide. Japan is the epicenter of robotics innovation. At GTC, we announced that FANUC, the Japan-based industrial robotics giant, will build the factory of the future on the NVIDIA AI platform, from end to end. Its deep neural network will be trained with NVIDIA GPUs, GPU-powered FANUC Fog units will drive a group of robots and allow them to learn together, and each robot will have an embedded GPU to perform real-time AI. MIT Tech Review wrote about it in its story “Japanese Robotics Giant Gives Its Arms Some Brains.”
  • The Next Phase of Every Industry: GPU deep learning is inspiring a new wave of startups — 1,500+ around the world — in healthcare, fintech, automotive, consumer web applications and more., which was recently licensed to test its vehicles on California roads, is tackling the challenge of self-driving cars by applying deep learning to the full driving stack. Preferred Networks, the Japan-based developer of the Chainer framework, is developing deep learning solutions for IoT. BenevolentAI, based in London and one of the first recipients of DGX-1, is using deep learning for drug discovery to tackle diseases like Parkinson’s, Alzheimer’s and rare cancers. According to CB Insights, funding for AI startups hit over $1 billion in the second quarter, an all-time high.

The explosion of startups is yet another indicator of AI’s sweep across industries. As Fortune recently wrote, deep learning will “transform corporate America.”  

AI for Everyone

AI can solve problems that seemed well beyond our reach just a few years back. From real-world data, computers can learn to recognize patterns too complex, too massive or too subtle for hand-crafted software or even humans. With GPU deep learning, this computing model is now practical and can be applied to solve challenges in the world’s largest industries. Self-driving cars will transform the $10 trillion transportation industry. In healthcare, doctors will use AI to detect disease at the earliest possible moment, to understand the human genome to tackle cancer, or to learn from the massive volume of medical data and research to recommend the best treatments. And AI will usher in the fourth industrial revolution — after steam, mass production and automation — intelligent robotics will drive a new wave of productivity improvements and enable mass consumer customization. AI will touch everyone. The era of AI is here.

Syndicated article courtesy of Nvidia

By Jen-Hsun Huang

Jen-Hsun Huang founded NVIDIA in 1993 and has served since its inception as president, chief executive officer and a member of the board of directors.

NVIDIA invented the GPU in 1999 and, from its roots as a PC graphics company, has gone on to become the world leader in AI computing.

Update: Timeline of the Massive DDoS DYN Attacks

DYN DDOS Timeline

This morning at 7am ET a DDoS attack was launched at Dyn (the site is still down at the minute), an Internet infrastructure company headquartered in New Hampshire. So far the attack has come in two waves, the first at 11:10 UTC and the second at around 16:00 UTC. Details are still vague, though a number of theories are starting to surface in the aftermath of the attack. The attack took down numerous websites including Twitter, Amazon, Spotify and Reddit for a period – you can find the full list of affected sites here. PSN and Xbox Live apps have also been affected!


The timeline of events according to the DYN updates is as follows:

11:10 UTC – We began monitoring and mitigating a DDoS attack against our Dyn Managed DNS infrastructure. Some customers may experience increased DNS query latency and delayed zone propagation during this time.

12:45 UTC – This attack is mainly impacting US East and is impacting Managed DNS customers in this region. Our Engineers are continuing to work on mitigating this issue.

13:36 UTC – Services have been restored to normal as of 13:20 UTC.

16:06 UTC – As of 15:52 UTC, we have begun monitoring and mitigating a DDoS attack against our Dyn Managed DNS infrastructure. Our Engineers are continuing to work on mitigating this issue.

16:48 UTC – This DDoS attack may also be impacting Dyn Managed DNS advanced services with possible delays in monitoring. Our Engineers are continuing to work on mitigating this issue.

17:53 UTC – Our engineers continue to investigate and mitigate several attacks aimed against the Dyn Managed DNS infrastructure.

18:23 UTC – Dyn Managed DNS advanced service monitoring is currently experiencing issues. Customers may notice incorrect probe alerts on their advanced DNS services. Our engineers continue to monitor and investigate the issue.

18:52 UTC – At this time, the advanced service monitoring issue has been resolved. Our engineers are still investigating and mitigating the attacks on our infrastructure.

20:37 UTC – Our engineers continue to investigate and mitigate several attacks aimed against the Dyn Managed DNS infrastructure.


The attack came only a few hours after Dyn researcher Doug Madory presented a talk (you can watch it here) on DDoS attacks at a meeting of the North American Network Operators Group (NANOG) in Dallas. Krebs on Security has also drawn links to reports of extortion threats posted on this thread, with the threats clearly referencing DDoS attacks – “If you will not pay in time, DDoS attack will start, your web-services will go down permanently. After that, price to stop will be increased to 5 BTC with further increment of 5 BTC for every day of attack.”

Krebs does, however, distance himself from making any actual claims of extortion: “Let me be clear: I have no data to indicate that the attack on Dyn is related to extortion, to Mirai or to any of the companies or individuals Madory referenced in his talk this week in Dallas.”

However, this isn’t the only theory circulating at the moment. Dillon Townsel of IBM Security has reported that hacking group PoodleCorp is being blamed for the attack because of a cryptic tweet the group posted two days ago: “October 21st #PoodleCorp will be putting @Battlefield in the oven”.

PoodleCorp famously took down the Pokemon Go servers in July. Homeland Security and the FBI are investigating the attack and have yet to determine who was responsible.

Today’s attack is very different to the DDoS style that Anonymous rose to fame with. Instead of attacking and taking out an individual website for short periods of time, hackers took down a massive piece of the internet backbone for an entire morning, not once but twice, with new reports of a potential third wave. At the moment there have been no claims of responsibility for the attack, nor is there any concrete evidence of who perpetrated it.

Dyn are well known for publishing detailed reports on attacks of this nature, so we can only hope they will do the same for their own servers.

Until then you can follow any updates that Dyn are releasing here.

DDoS Attack – Update 10/24/2016

As of 22:17 UTC on October 21st, Dyn declared the massive IoT attack, which had crippled large parts of the internet, to be over. However, details surrounding the attack are still emerging.

In the midst of the chaos, WikiLeaks tweeted: “Mr. Assange is still alive and WikiLeaks is still publishing. We ask supporters to stop taking down the US internet. You proved your point.” – suggesting that they knew who the perpetrators were, and perhaps even that they requested the attack, although this is pure speculation at this point.

A senior U.S. intelligence official told NBC News that the current assessment is that this is a case of “internet vandalism”. At this point, they do not believe that it was any kind of state-sponsored or directed attack.

Hangzhou Xiongmai Technology, who specialise in DVRs and internet-connected cameras, said on Sunday that security vulnerabilities in its products inadvertently played a role in the cyberattack, citing weak default passwords as the cause.

Security researchers have discovered that malware known as Mirai was used to take advantage of these weaknesses by infecting the devices and using them to launch huge distributed denial-of-service attacks. Mirai works by infecting and taking over IoT devices to create a massive connected network, which then overloads sites with requests and takes them offline.
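On the defensive side, the brute-force nature of such floods is why mitigation often begins with simple per-source rate accounting. Below is a toy Python sketch of a sliding-window counter (the window and threshold are invented for illustration); an attack launched from millions of hijacked devices ultimately has to be absorbed and filtered upstream by providers like Dyn, not by a single script.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 100           # per source per window; illustrative threshold

recent = defaultdict(deque)  # source IP -> timestamps of its recent requests

def record_request(src_ip, now=None):
    """Log one request and return True if the source now looks abusive."""
    now = time.time() if now is None else now
    stamps = recent[src_ip]
    stamps.append(now)
    while stamps and stamps[0] < now - WINDOW_SECONDS:
        stamps.popleft()     # forget requests that fell outside the window
    return len(stamps) > MAX_REQUESTS
```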

At this point we do not know when the identity of the hackers will become clear. Watch this page for more updates as they become available.

By Josh Hamilton

Reuters News: Powerful DDoS Knocks Out Several Large-Scale Websites

DDoS Knocks Out Several Websites

Cyber attacks targeting the internet infrastructure provider Dyn disrupted service on major sites such as Twitter and Spotify on Friday, mainly affecting users on the U.S. East Coast.

It was not immediately clear who was responsible. Officials told Reuters that the U.S. Department of Homeland Security and the Federal Bureau of Investigation were both investigating.

The disruptions come at a time of unprecedented fears about the cyber threat in the United States, where hackers have breached political organizations and election agencies.

Homeland Security last week issued a warning about a powerful new approach for blocking access to websites – hackers infecting routers, printers, smart TVs and other connected devices with malware that turns them into “bot” armies that overwhelm website servers in distributed denial-of-service attacks.

Dyn said it had resolved one attack, which disrupted operations for about two hours, but disclosed a second attack a few hours later that was causing further disruptions.

In addition to the social network Twitter and music-streamer Spotify, the discussion site Reddit, hospitality booking service Airbnb and The Verge news site were among the companies whose services were disrupted on Friday. Inc’s web services division, one of the world’s biggest cloud computing companies, also reported a related outage, which it said was resolved early Friday afternoon.

Dyn is a Manchester, New Hampshire-based provider of services for managing domain name servers (DNS), which act as switchboards connecting internet traffic. Requests to access sites are transmitted through DNS servers that direct them to computers that host websites.
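That switchboard role is easy to demonstrate: the short Python snippet below asks the resolver chain to translate a hostname into the addresses that host the site, and it is exactly this step that fails for users when a DNS provider is knocked offline, even while the web servers themselves stay healthy. The hostname here is just an example.

```python
import socket

# Resolve a hostname to the IP addresses that actually host the site.
# If the domain's DNS provider is unreachable, this raises socket.gaierror
# even though the web servers themselves may be perfectly fine.
for entry in socket.getaddrinfo("www.example.com", 443, proto=socket.IPPROTO_TCP):
    family, _type, _proto, _canonname, sockaddr = entry
    print(sockaddr[0])
```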

Its customers include some of the world’s biggest corporations and Internet firms, such as Pfizer, Visa, Netflix, Twitter, SoundCloud and BT.

Dyn said it was still trying to determine how the attack led to the outage but that its first priority was restoring service.

Attacking a large DNS provider can create massive disruptions because such firms are responsible for forwarding large volumes of internet traffic.

Full Article Source: Reuters

Cashless Society Part 2: Pros and Cons

The Cashless Society

Having looked at our movement towards a cashless society in Part 1, I thought we should turn our attention to the consequences of a truly cashless society. Could it be a force for good? Or could it lead to banks and governments abusing the power that comes along with it?

The phasing out of cash in the economy would make the implementation of certain monetary policies, such as negative interest rates, far easier and more effective. Kenneth Rogoff, author of “The Curse of Cash”, cites negative interest rates as an important tool for central banks to restore macroeconomic stability; the incentive to borrow and spend helps stimulate the economy. With all currency held in regulated accounts, the government can tax savings in the name of monetary policy.
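A quick worked example, with figures invented for illustration, shows why a negative rate amounts to a tax on holding money:

```python
balance = 10_000.00
rate = -0.005                      # a -0.5% annual interest rate

for year in range(1, 4):
    balance *= 1 + rate
    print(f"year {year}: {balance:,.2f}")

# year 1: 9,950.00
# year 2: 9,900.25
# year 3: 9,850.75  -> keeping money parked in the account costs the holder,
# nudging them to spend or invest instead.
```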

One of the more widely used arguments in favour of a cashless economy is security. France’s finance minister has recently stated that he plans to “fight against the use of cash and anonymity in the French economy” in order to help fight terrorism and other threats. With the ability to track every transaction that takes place, intelligence services could cut down on crime by monitoring purchases and money transfers. However, Rogoff acknowledges the limitations of this policy: the removal of paper money will only be effective “provided the government is vigilant about playing whac-a-mole as alternative transaction media come into being”. It is naïve to think that crime could be quashed so easily. If interest rates fall too far below zero, it is quite possible that citizens would find an alternative to cash (drug traffickers certainly would). Money has been reinvented time and again throughout history, as shells, cigarettes and cryptographic code. Going cashless has also been touted as more secure from theft, with Apple and Google claiming their payment systems are more secure than regular banking, as well as more convenient than cash.

Yet a number of concerns have been raised about the transition to digital money. Advances in tech have allowed credit and debit card purchases to be tracked and evaluated to gauge the validity of a purchase. So far this has been used to prevent fraud and theft and to protect consumers. However, there is a risk of abuse here: in 2010, for example, Visa and MasterCard gave in to government pressure – not even actual legislation – and barred all online-betting payments from their systems, making it virtually impossible for these gambling sites to operate regardless of their jurisdiction or legality. Scott A. Shay, chairman of Signature Bank, suggested in an article on CNBC that “the day might come when the health records of an overweight individual would lead to a situation in which they find that any sugary drink purchase they make through a credit or debit card is declined”. Although this may seem like a stretch, a government with access to this sort of power could quite easily control individual spending.
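The automated screening described above can be caricatured in a few lines of Python. Real fraud systems weigh far richer signals (merchant, location, device, timing), but the core statistical idea — flag transactions that deviate sharply from a customer’s history — looks roughly like this invented sketch:

```python
import statistics

# A customer's recent purchase amounts (illustrative data).
history = [12.50, 8.99, 23.10, 15.00, 9.75, 31.40, 18.20, 11.05]
mean = statistics.mean(history)
stdev = statistics.stdev(history)

def looks_suspicious(amount, threshold=3.0):
    """Flag a purchase that sits several standard deviations from the norm."""
    return abs(amount - mean) / stdev > threshold

print(looks_suspicious(14.00))   # False: an ordinary purchase
print(looks_suspicious(450.00))  # True: wildly out of pattern
```

The policy question raised in the paragraph above is precisely who gets to define “suspicious” once every purchase flows through such a filter.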

A cashless society would also make it harder for homeless people to re-integrate into society. Having no fixed address already makes holding a bank account incredibly difficult; a cash-free society simply raises the barriers that those on the fringes of society have to navigate. There is also a psychological issue: electronic payment encourages frivolous spending. A student interviewed at the University of Gothenburg commented that she was much more likely to think twice about spending a 500 krona note than about paying with a debit or credit card.

The other side of the coin (pardon the pun) is that this power could be used for good, for example preventing recovering alcoholics from purchasing alcohol. The route this technology takes is, as is often the case, determined by government and societal attitudes. There is more room for abuse here than in most technologies, but the benefits are well documented; used sensibly, it could help prevent terrorism and crime, reduce tax evasion, and curb unhealthy spending habits. Ultimately, a cashless society will be what we make of it.

By Josh Hamilton
