Category Archives: Technology

The Managed DNS Industry

DNS Industry 

The SaaS industry has been going through a major shift in just the last few years, which is redefining how platforms are designed. System and network administrators are demanding all-in-one platforms for a variety of management tasks. The managed DNS industry, for one, has been radically altered by this shift. Both new and existing DNS providers are rolling out integrated platforms, which combine the analytical power of monitoring with advanced query management.

The Internet has been abuzz as skeptical sysadmins question how these integrated platforms can fix issues their predecessors couldn’t. And can you really replace your current toolset with an all-in-one platform?

The principal idea behind these platforms is synergy, a mutually dependent relationship between monitoring and management. This technology is made possible by the cloud, which allows information to be shared between the two services in real time. The cloud foundations for all-in-one platforms have also proven to make these subscription services noticeably cheaper.

So what is this synergistic secret sauce that makes these all-in-one services so revolutionary? In the case of DNS management, network monitoring is integral to efficient query routing. What’s the point of making changes to your network configurations if you can’t monitor and analyze the results? This can also be applied the other way around: what’s the point in monitoring your network if you can’t fix the problems that you identify?

Traffic management should never feel like a shot in the dark; rather, it should be informed and calculated to provide the best result for each individual end-user. The new integrated platform push is forcing admins to rethink how they manage their organizations’ traffic.

The problem is, too many admins think these tools are only used for anticipating DDoS or resolving attacks and outages. To be frank, outages are rare, but they can be devastating. DNS management has shifted from outage resolution to performance optimization. Next-generation managed DNS solutions will take a look at your entire network and implement changes to improve the experience for all of your end-users—individually optimized for each user’s location, browser, IP connectivity, and more.

Admins aren’t wrong for wanting to use query management for security reasons. DNS sits at a critical ingress point for incoming traffic, which means you can filter and root out malicious requests before they even reach your site. But what most admins seem to forget is that these same management tools can be used to eliminate latency and improve network performance.

End-users are demanding faster load times, especially from mobile sites. DNS resolution times are only one portion of load time, but 50% of page load time is taken up by network latency overhead. Admins have to leverage every layer of the stack for optimal performance, or get left behind.

All-in-one management solutions are proving to be invaluable during high traffic periods. You can analyze traffic loads and redirect segments of traffic so that it’s balanced across many different resources or locations. You can also use this technology to minimize resolution times, by ensuring queries are being answered at the nearest possible server, or most optimally performing server (in case the closest one is under strain or underperforming).
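
To make that concrete, here is a minimal sketch of the kind of decision logic such a platform might apply when answering a query: prefer the nearest point of presence, but fall back to the next-closest one when monitoring shows it to be unhealthy or overloaded. The endpoint names, metrics, and thresholds are hypothetical illustrations, not any particular vendor’s API.

```python
# Minimal sketch: pick the DNS answer to return for a query based on
# per-location health and latency metrics. Endpoint names, metrics and
# thresholds are hypothetical illustrations, not any vendor's API.
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str          # point of presence, e.g. "us-east"
    address: str       # A-record value to hand back to the resolver
    rtt_ms: float      # measured latency from this user's region
    load: float        # current utilisation, 0.0 - 1.0
    healthy: bool      # result of the monitoring probe

def choose_answer(endpoints, max_load=0.85):
    """Return the address of the closest healthy, non-overloaded endpoint."""
    candidates = [e for e in endpoints if e.healthy and e.load < max_load]
    if not candidates:                      # everything is saturated or down
        candidates = [e for e in endpoints if e.healthy] or endpoints
    return min(candidates, key=lambda e: e.rtt_ms).address

pops = [
    Endpoint("us-east", "203.0.113.10", rtt_ms=18, load=0.92, healthy=True),
    Endpoint("us-west", "203.0.113.20", rtt_ms=41, load=0.40, healthy=True),
    Endpoint("eu-west", "203.0.113.30", rtt_ms=95, load=0.15, healthy=True),
]
print(choose_answer(pops))   # us-east is nearest but overloaded -> 203.0.113.20
```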

These platforms are also incorporating Artificial Intelligence (AI) to analyze areas causing performance degradation and then make changes to alleviate them before they can have an appreciable effect on end-users. Some AI systems are paired with automated services that recognize performance trends and patterns, then use those analytics to anticipate potential attacks or traffic fluctuations.
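
As a rough illustration of this kind of trend-watching, the sketch below flags resolution-time samples that jump well above an exponentially weighted moving average. Real products use far more sophisticated models; the data and threshold here are invented for illustration.

```python
# Sketch of trend-based alerting: flag resolution-time samples that drift
# well above an exponentially weighted moving average (EWMA). The numbers
# and threshold are illustrative only.
def detect_degradation(samples_ms, alpha=0.2, threshold=1.5):
    ewma = samples_ms[0]
    alerts = []
    for i, sample in enumerate(samples_ms[1:], start=1):
        if sample > ewma * threshold:        # sudden jump vs. recent trend
            alerts.append((i, sample, ewma))
        ewma = alpha * sample + (1 - alpha) * ewma
    return alerts

resolution_times = [22, 24, 23, 25, 24, 26, 61, 65, 24, 23]  # milliseconds
for idx, value, baseline in detect_degradation(resolution_times):
    print(f"sample {idx}: {value} ms vs. baseline {baseline:.1f} ms")
```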

These all-in-one suites have created a new breed of traffic management, called Internet Traffic Optimization Services (ITOS). This new industry seeks to redefine the way admins manage their networks, by harnessing the power of analytics to make informed proactive changes. DNS is a user’s first and most impactful step when accessing a website, which is why ITOS places a strong emphasis on informed DNS management.

In the end, it all comes down to the cold hard stats. To get the most ROI out of a service, you need to look for reliability, cost efficiency, and proven performance improvements. All-in-one and ITOS solutions may still be in their formative years, but they give admins all the tools they need in one platform. Admins can now see the performance impact of their configurations in real time, at a lower cost than non-integrated services.

By Steven Job

Microsoft Releases Beta of Microsoft Cognitive Toolkit For Deep Learning Advances

Microsoft Cognitive Toolkit

Microsoft has released an updated version of Microsoft Cognitive Toolkit, a system for deep learning that is used to speed advances in areas such as speech and image recognition and search relevance on CPUs and NVIDIA® GPUs.

The toolkit, previously known as CNTK, was initially developed by computer scientists at Microsoft who wanted a tool to do their own research more quickly and effectively. It quickly moved beyond speech and morphed into an offering that customers including a leading international appliance maker and Microsoft’s flagship product groups depend on for a wide variety of deep learning tasks.

“We’ve taken it from a research tool to something that works in a production setting,” said Frank Seide, a principal researcher at Microsoft Artificial Intelligence and Research and a key architect of Microsoft Cognitive Toolkit.

The latest version of the toolkit, which is available on GitHub via an open source license, includes new functionality that lets developers use Python or C++ programming languages in working with the toolkit.  With the new version, researchers also can do a type of artificial intelligence work called reinforcement learning.
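
For a sense of what the new Python support looks like, here is a rough sketch of training a tiny classifier. The names follow the CNTK 2.x-era Python documentation (input_variable, layers.Dense, Trainer, and so on); since the beta API was still evolving, treat this as an approximation and check the toolkit’s own tutorials rather than relying on it verbatim.

```python
# Rough sketch of training a tiny classifier with the toolkit's Python API.
# Function names follow the CNTK 2.x-era documentation and may differ
# slightly between beta releases; verify against the official tutorials.
import numpy as np
import cntk as C

features = C.input_variable(2)
labels = C.input_variable(2)

z = C.layers.Sequential([
    C.layers.Dense(16, activation=C.relu),
    C.layers.Dense(2)
])(features)

loss = C.cross_entropy_with_softmax(z, labels)
error = C.classification_error(z, labels)
lr = C.learning_rate_schedule(0.1, C.UnitType.minibatch)
trainer = C.Trainer(z, (loss, error), [C.sgd(z.parameters, lr)])

# Toy data: two clusters separated along the first axis, one-hot labels.
x = np.random.randn(256, 2).astype(np.float32)
y = np.zeros((256, 2), dtype=np.float32)
y[np.arange(256), (x[:, 0] > 0).astype(int)] = 1

for _ in range(200):
    trainer.train_minibatch({features: x, labels: y})
print("training loss:", trainer.previous_minibatch_loss_average)
```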

Finally, the toolkit is able to deliver better performance than previous versions. It’s also faster than other toolkits, especially when working on big datasets across multiple machines. That kind of large-scale deployment is necessary to do the type of deep learning across multiple GPUs that is needed to develop consumer products and professional offerings…

Read Full Article: Microsoft

A President’s Trove of Data

Then vs Now

According to popular opinion, today’s information age affords teens access to more information than some world leaders had 20 years ago. C+R Research has put this hypothesis through its paces, comparing access to information across areas such as private data, classified information, genetics, and public opinion, and finds that in many ways the average smartphone user does, in fact, have access to far more information than those with the highest clearance had two decades ago. However, the accuracy and quality of the data available don’t necessarily compare.

Critical Information vs. the Non-Essentials

C+R Research finds that just about any 13-year-old with a smartphone in 2016 would beat President Bill Clinton’s 1996 intelligence and access in areas such as traffic data, music, trivia, opinion, and even genetics. But then, the president of the United States might not have time to listen to the 30 million songs immediately accessible via Spotify, nor would Air Force One likely be constrained by the same traffic limitations as the rest of us. Of course, political campaign teams of 20 years ago would have drooled over the polling possibilities that Twitter offers today.

Data

On the other hand, President Clinton would have had better access to classified information, data from satellites, and research journals, as well as access to private data – though there are rules governing this, some very important people tend to be ‘incorporated’ into such regulations. Happily, or unhappily depending on how much privacy you desire, tracking of family members via the secret service in 1996 was about as proficient as the smartphone apps we use today to monitor friends and family.

In the end, the 13-year-old wins 7 to 5 in the most general terms, but it’s important to recognize that the broad scope of information available today doesn’t necessarily point to accurate or significant information, two traits President Clinton could be sure of.

By Jennifer Klostermann

Blockchain and the IoT

IoT Blockchain

Blockchain, also known as Distributed Ledger Technology (DLT), is the innovative technology behind Bitcoin. The impact of Bitcoin has been tremendous and, as with any revolutionary technology, it was initially treated with awe and apprehension. Since its open source release back in 2009, Bitcoin has become a transformative force in the global payments system, establishing itself without the aid or support of the traditional financial infrastructure. While initial usage saw huge success in black markets, Bitcoin defied the odds, and the blockchain technology behind it spawned other cryptocurrencies, exchanges, commercial ventures, alliances, consortiums, investments, and uptake by governments, merchants, and financial services worldwide.
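
To make the “distributed ledger” idea concrete, here is a toy sketch of a hash-linked ledger, which shows why such a chain is tamper-evident: altering any historical entry breaks every hash that follows it. This is an illustration only, not Bitcoin’s actual block format or consensus mechanism.

```python
# Minimal illustration of why a hash-linked ledger is tamper-evident.
# Toy model only: no networking, mining, or consensus.
import hashlib, json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev_hash": prev, "transactions": transactions}
    chain.append({**body, "hash": block_hash(body)})

def verify(chain):
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"prev_hash": block["prev_hash"], "transactions": block["transactions"]}
        if block["prev_hash"] != prev or block["hash"] != block_hash(body):
            return False
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))                          # True
chain[0]["transactions"][0]["amount"] = 500   # tamper with history
print(verify(chain))                          # False: later hashes no longer match
```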

On August 12, the World Economic Forum (WEF) published a report on the future of the financial infrastructure, and in particular on the transformative role that blockchain technology is set to play. Notably, it analyzes the technology’s impact on the financial services industry and how it can provide more transparency and security. Potential use cases are examined, including insurance, deposits and lending, capital raising, investment management, and market provisioning. The report also looks at the current challenges to widespread implementation of blockchain, many of which will require international legal frameworks, harmonized regulatory environments, and global standardization efforts.

DLT is already having a serious impact on the financial services industry. The WEF report states that 80% of banks will initiate a DLT project by next year, and more than $1.4 billion has already been invested in the technology in the past three years. More than that, governments and law firms are seriously staking their claim in advancing the technology. Law firm Steptoe & Johnson LLP recently announced the expansion of its Blockchain Team into a multidisciplinary practice involving FinTech, financial services, regulatory, and law enforcement knowledge. The firm is also one of the founders of the Blockchain Alliance, a coalition of blockchain companies and law enforcement and regulatory agencies, alongside the U.S. Chamber of Digital Commerce and Coin Center. This expansion is an endorsement of the potential of DLT, within and eventually beyond financial services.

(Image source: Startupmanagement.org)

The possible applications of blockchain are already being explored in numerous new sectors: energy, transportation, intellectual property, regulation and compliance, international trade, law enforcement, and government affairs, among many others. Ethereum is one blockchain endeavor that features smart contract functionality. The distributed computing platform provides a decentralized virtual machine to execute peer-to-peer contracts using the Ether cryptocurrency. The Ether Hack Camp is launching a four-week hackathon in November 2016 for DLT using Ether. Currently, the Camp is asking developers to propose ideas to the public; registered fans will vote on them, and those selected will take part in the hackathon. The ideas can already be seen online and are vast and varied, ranging from academic publishing without journals and music licensing reform to a decentralized ISP, voting on the blockchain, alternative dispute resolution, and a rural land register. The idea winning first place in November will collect $50,000 USD.

IBM is one of the most dynamic forerunners currently pushing DLT for the IoT. The firm just announced it is investing $200 million in blockchain technology to drive forward its Watson IoT efforts. The firm is opening a new office in Germany, which will serve as headquarters for its new blockchain initiatives. The investment is part of the $3 billion that IBM pledged to develop Watson’s cognitive computing for the IoT. The goal of the new investment is to enable companies to share IoT data in a private blockchain. A commercial implementation is already underway with Finnish company Kouvola Innovation, which wants to integrate its capabilities into the IBM Blockchain and link devices for tracking, monitoring, and reporting on shipping container status and location, optimizing packing, and transferring shipments.

IBM is working hard to align its IoT, AI, and blockchain technologies through Watson. The new headquarters in Germany will be home to Cognitive IoT Collaboratories for researchers, developers, and engineers.

Many of IBM’s current projects are built on Fabric, the open source framework of the Hyperledger Project, a consortium founded by the Linux Foundation to which IBM is a significant contributor alongside Cisco and Intel. IBM pushed its involvement even further with the June launch of its New York-based Bluemix Garage. The idea is to give developers and researchers the opportunity to use IBM Cloud APIs and blockchain technologies to drive cognitive, IoT, unstructured data, and social media technology innovation. Just one month after that launch, IBM announced a cloud service for companies running blockchain technology. The cloud service is underpinned by IBM’s LinuxONE technology, which is specifically designed to meet the security requirements of critical sectors such as financial services, healthcare, and government.

The potential for DLT is certainly broad and rather long-term, but the engagement by the financial services industry is a testament to its potential. While FinTech remains the big focus for blockchain technologies, its success will drive the use of DLT for other areas. The promise of blockchain is to deliver accountability and transparency; although this could be disrupted significantly if announcements, such as the one made by Accenture on ‘editable’ Blockchain, become a reality. While banks may welcome the feature, it would be a serious blow to not only the integrity, but also the security of blockchain technology.

By Michela Menting

Great Cloud Platforms Need to Win the Hearts and Minds of Developers First

Great Cloud Platforms 

Adoption of cloud computing services is growing exponentially all around the world. Companies are realizing that so much of the hard, expensive work that they used to have to do internally can now be outsourced to cloud providers, allowing the companies to focus on what it is that they do best. That’s the reason why tech research firm Gartner projects that over the next five years, the shift to the cloud is looking to be a US$1-trillion market.

Everything from running payrolls, to marketing, logistics, data analysis and much, much more is moving to the cloud, and one of the most successful uses of the cloud is the concept of Platform-as-a-Service (PaaS, as it is known). What this does is enable customers to develop, run and manage their own applications without having to invest heavily in the infrastructure required in order to develop and launch a web application.

The key to building a great platform is winning the hearts and minds of web developers, so that it becomes the platform they choose to build on. SAP, the world’s largest enterprise cloud company with over 320,000 customers and over 110 million cloud users in 190 countries, is using its extensive experience and knowledge in the business space to offer the SAP HANA Cloud Platform, a remarkable service for companies of all sizes. This platform is already being used extensively by developers who are creating apps for their customers or their various organizations and employees.

The SAP HANA Cloud Platform enables developers to build business applications in the cloud quickly and easily.

Three features of this platform stand out:

  1. its ability to extend your cloud and on-premise applications to develop customized hybrid solutions,
  2. the awesome feature allowing you to integrate applications seamlessly and securely to synchronize data and processes across cloud, on-premise and third-party applications, as well as
  3. the core feature which allows you to build new enterprise-ready applications rapidly with an open standards platform that brings out the best in developers.

The Director of Group Software at the Danone Group, Ralf Steinbach, says that “with SAP HANA Cloud Platforms, we can quickly develop beautiful, user-friendly applications that are opening new opportunities to connect our customers directly to our back-end systems.”

Cloud services are a rapidly expanding market, and research indicates there are over 150 PaaS offerings to choose from. Too often companies simply choose the PaaS of a cloud-service provider that they’re already working with, without exploring the offerings in-depth and with a long-term focus.

According to John Rymer of Forrester Research, there are three types of developers who make use of PaaS offerings to build apps:

  1. Coders, who want the ability to do it all themselves,
  2. DevOps developers who want the ability to do some coding if they need to but can also plug into some level of abstraction, and
  3. RapidDevs who don’t want to code at all but just to configure a task to the capabilities of the platform.

For each of these developer types, the SAP HANA Cloud Platform can deliver thanks to its flexibility, while requiring fewer specialized skills and at a lower cost. That flexibility extends to the choices customers are offered between a private, managed cloud, a public pay-as-you-go model, or public cloud infrastructure-as-a-service or platform-as-a-service.

In order for a platform to survive and thrive, it requires developers to regard it as the best choice for what they have to do on a daily basis: easily and quickly deploy applications that leverage a proven in-memory platform for next generation applications and analytics supported by a world-class technical team at every step of the way.

A great way to get started with SAP HANA Cloud Platform is with the user-based packages. Priced per user, they offer the flexibility to choose the package that best fits your needs. You can get started for as little as $25 / user / month, and scale as you go, adding more users or upgrading to add more resources when you need them.
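
As a quick back-of-the-envelope, assuming a flat $25 per user per month (before any promotional discount), the cost scales linearly with the size of the team:

```python
# Back-of-the-envelope cost for the user-based packages described above,
# assuming a flat $25 per user per month; promotional discounts not included.
def monthly_cost(users, price_per_user=25):
    return users * price_per_user

for team in (5, 20, 100):
    print(f"{team:>3} users: ${monthly_cost(team):,}/month, ${monthly_cost(team) * 12:,}/year")
```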

For a limited time, you can get 30% off SAP HANA Cloud Platform user-based packages on the SAP Store by using the promo code HCP30.

Sponsored spotlight series by SAP

By Jeremy Daniel

The Intelligent Industrial Revolution

AI Revolution

Prefatory Note: Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years. Some 1,500 AI startups have cropped up. This explosive growth has fueled demand for GTCs all over the world. So far, we’ve held events in Beijing, Taipei, Amsterdam, Tokyo, Seoul and Melbourne. Washington is set for this week and Mumbai next month. I kicked off four of the GTCs. Here’s a summary of what I talked about, what I learned and what I see in the near future as AI, the next wave in computing, revolutionizes one industry after another.

A New Era of Computing

Intelligent machines powered by AI computers that can learn, reason and interact with people are no longer science fiction. Today, a self-driving car powered by AI can meander through a country road at night and find its way. An AI-powered robot can learn motor skills through trial and error. This is truly an extraordinary time. In my three decades in the computer industry, none has held more potential, or been more fun. The era of AI has begun.

Our industry drives large-scale industrial and societal change. As computing evolves, new companies form, new products are built, our lives change. Looking back at the past couple of waves of computing, each was underpinned by a revolutionary computing model, a new architecture that expanded both the capabilities and reach of computing.

In 1995, the PC-Internet era was sparked by the convergence of low-cost microprocessors (CPUs), a standard operating system (Windows 95), and a new portal to a world of information (Yahoo!). The PC-Internet era brought the power of computing to about a billion people and realized Microsoft’s vision to put “a computer on every desk and in every home.” A decade later, the iPhone put “an Internet communications” device in our pockets. Coupled with the launch of Amazon’s AWS, the Mobile-Cloud era was born. A world of apps entered our daily lives and some 3 billion people enjoyed the freedom that mobile computing afforded.

Today, we stand at the beginning of the next era, the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognize patterns from massive amounts of data — has proven to be “unreasonably” effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionize every industry.

GPU Deep Learning “Big Bang”

Why now? As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just several days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition, beating all the human expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.

The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

An End-to-End Platform for a New Computing Model

As a new computing model, GPU deep learning is changing how software is developed and how it runs. In the past, software engineers crafted programs and meticulously coded algorithms. Now, algorithms learn from tons of real-world examples — software writes itself. Programming is about coding instruction. Deep learning is about creating and training neural networks. The network can then be deployed in a data center to infer, predict and classify from new data presented to it. Networks can also be deployed into intelligent devices like cameras, cars and robots to understand the world. With new experiences, new data is collected to further train and refine the neural network. Learnings from billions of devices make all the devices on the network more intelligent. Neural networks will reap the benefits of both the exponential advance of GPU processing and large network effects — that is, they will get smarter at a pace way faster than Moore’s Law.

Whereas the old computing model is “instruction processing” intensive, this new computing model requires massive “data processing.” To advance every aspect of AI, we’re building an end-to-end AI computing platform — one architecture that spans training, inference and the billions of intelligent devices that are coming our way.

Let’s start with training. Our new Pascal GPU is a $2 billion investment and the work of several thousand engineers over three years. It is the first GPU optimized for deep learning. Pascal can train networks that are 65 times larger or faster than the Kepler GPU that Alex Krizhevsky used in his paper. A single computer of eight Pascal GPUs connected by NVIDIA NVLink, the highest throughput interconnect ever created, can train a network faster than 250 traditional servers.

Soon, the tens of billions of internet queries made each day will require AI, which means that each query will require billions more math operations. The total load on cloud services will be enormous to ensure real-time responsiveness. For faster data center inference performance, we announced the Tesla P40 and P4 GPUs. P40 accelerates data center inference throughput by 40 times. P4 requires only 50 watts and is designed to accelerate 1U OCP servers, typical of hyperscale data centers. Software is a vital part of NVIDIA’s deep learning platform. For training, we have CUDA and cuDNN. For inferencing, we announced TensorRT, an optimizing inferencing engine. TensorRT improves performance without compromising accuracy by fusing operations within a layer and across layers, pruning low-contribution weights, reducing precision to FP16 or INT8, and many other techniques.
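
The precision-reduction and pruning techniques mentioned above are easy to see in miniature. The NumPy sketch below prunes low-magnitude weights and quantizes the rest to INT8 with a single scale factor; it is a generic illustration of the ideas, not how TensorRT itself is implemented.

```python
# Generic illustration of two optimisations mentioned above: pruning
# low-contribution weights and reducing precision from FP32 to INT8.
# NumPy-only toy example; TensorRT's actual implementation differs.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.5, size=(256, 256)).astype(np.float32)

# 1. Prune: zero out weights whose magnitude is in the lowest 30%.
threshold = np.quantile(np.abs(weights), 0.30)
pruned = np.where(np.abs(weights) < threshold, 0.0, weights)

# 2. Quantise: map FP32 values to INT8 with a single scale factor.
scale = np.abs(pruned).max() / 127.0
quantised = np.clip(np.round(pruned / scale), -127, 127).astype(np.int8)
dequantised = quantised.astype(np.float32) * scale

print("weights zeroed:", float((pruned == 0).mean()))
print("bytes FP32 -> INT8:", weights.nbytes, "->", quantised.nbytes)
print("mean abs quantisation error:", float(np.abs(dequantised - pruned).mean()))
```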

Someday, billions of intelligent devices will take advantage of deep learning to perform seemingly intelligent tasks. Drones will autonomously navigate through a warehouse, find an item and pick it up. Portable medical instruments will use AI to diagnose blood samples onsite. Intelligent cameras will learn to alert us only to the circumstances that we care about. We created an energy-efficient AI supercomputer, Jetson TX1, for such intelligent IoT devices. A credit card-sized module, Jetson TX1 can reach 1 TeraFLOP FP16 performance using just 10 watts. It’s the same architecture as our most powerful GPUs and can run all the same software.

In short, we offer an end-to-end AI computing platform — from GPU to deep learning software and algorithms, from training systems to in-car AI computers, from cloud to data center to PC to robots. NVIDIA’s AI computing platform is everywhere.

AI Computing for Every Industry

Our end-to-end platform is the first step to ensuring that every industry can tap into AI. The global ecosystem for NVIDIA GPU deep learning has scaled out rapidly. Breakthrough results triggered a race to adopt AI for consumer internet services — search, recognition, recommendations, translation and more. Cloud service providers, from Alibaba and Amazon to IBM and Microsoft, make the NVIDIA GPU deep learning platform available to companies large and small. The world’s largest enterprise technology companies have configured servers based on NVIDIA GPUs. We were pleased to highlight strategic announcements along our GTC tour to address major industries:

AI Transportation: At $10 trillion, transportation is a massive industry that AI can transform. Autonomous vehicles can reduce accidents, improve the productivity of trucking and taxi services, and enable new mobility services. We announced that both Baidu and TomTom selected NVIDIA DRIVE PX 2 for self-driving cars. With each, we’re building an open “cloud-to-car” platform that includes an HD map, AI algorithms and an AI supercomputer.

Driving is a learned behavior that we do as second nature. Yet one that is impossible to program a computer to perform. Autonomous driving requires every aspect of AI — perception of the surroundings, reasoning to determine the conditions of the environment, planning the best course of action, and continuously learning to improve our understanding of the vast and diverse world. The wide spectrum of autonomous driving requires an open, scalable architecture — from highway hands-free cruising, to autonomous drive-to-destination, to fully autonomous shuttles with no drivers.

NVIDIA DRIVE PX 2 is a scalable architecture that can span the entire range of AI for autonomous driving. At GTC, we announced DRIVE PX 2 AutoCruise designed for highway autonomous driving with continuous localization and mapping. We also released DriveWorks Alpha 1, our OS for self-driving cars that covers every aspect of autonomous driving — detection, localization, planning and action.

We bring all of our capabilities together into our own self-driving car, NVIDIA BB8. Here’s a little video:

NVIDIA is focused on innovation at the intersection of visual processing, AI and high performance computing — a unique combination at the heart of intelligent and autonomous machines. For the first time, we have AI algorithms that will make self-driving cars and autonomous robots possible. But they require a real-time, cost-effective computing platform.

At GTC, we introduced Xavier, the most ambitious single-chip computer we have ever undertaken — the world’s first AI supercomputer chip. Xavier is 7 billion transistors — more complex than the most advanced server-class CPU. Miraculously, Xavier has the equivalent horsepower of DRIVE PX 2 launched at CES earlier this year — 20 trillion operations per second of deep learning performance — at just 20 watts. As Forbes noted, we doubled down on self-driving cars with Xavier.

  • AI Enterprise: IBM, which sees a $2 trillion opportunity in cognitive computing, announced a new POWER8 and NVIDIA Tesla P100 server designed to bring AI to the enterprise. On the software side, SAP announced that it has received two of the first NVIDIA DGX-1 supercomputers and is actively building machine learning enterprise solutions for its 320,000 customers in 190 countries.
  • AI City: There will be 1 billion cameras in the world in 2020. Hikvision, the world leader in surveillance systems, is using AI to help make our cities safer. It uses DGX-1 for network training and has built a breakthrough server, called “Blade,” based on 16 Jetson TX1 processors. Blade requires 1/20 the space and 1/10 the power of the 21 CPU-based servers of equivalent performance.
  • AI Factory: There are 2 billion industrial robots worldwide. Japan is the epicenter of robotics innovation. At GTC, we announced that FANUC, the Japan-based industrial robotics giant, will build the factory of the future on the NVIDIA AI platform, from end to end. Its deep neural network will be trained with NVIDIA GPUs, GPU-powered FANUC Fog units will drive a group of robots and allow them to learn together, and each robot will have an embedded GPU to perform real-time AI. MIT Tech Review wrote about it in its story “Japanese Robotics Giant Gives Its Arms Some Brains.”
  • The Next Phase of Every Industry: GPU deep learning is inspiring a new wave of startups — 1,500+ around the world — in healthcare, fintech, automotive, consumer web applications and more. Drive.ai, which was recently licensed to test its vehicles on California roads, is tackling the challenge of self-driving cars by applying deep learning to the full driving stack. Preferred Networks, the Japan-based developer of the Chainer framework, is developing deep learning solutions for IoT. Benevolent.ai, based in London and one of the first recipients of DGX-1, is using deep learning for drug discovery to tackle diseases like Parkinson’s, Alzheimer’s and rare cancers. According to CB Insights, funding for AI startups hit over $1 billion in the second quarter, an all-time high.

The explosion of startups is yet another indicator of AI’s sweep across industries. As Fortune recently wrote, deep learning will “transform corporate America.”  

AI for Everyone

AI can solve problems that seemed well beyond our reach just a few years back. From real-world data, computers can learn to recognize patterns too complex, too massive or too subtle for hand-crafted software or even humans. With GPU deep learning, this computing model is now practical and can be applied to solve challenges in the world’s largest industries. Self-driving cars will transform the $10 trillion transportation industry. In healthcare, doctors will use AI to detect disease at the earliest possible moment, to understand the human genome to tackle cancer, or to learn from the massive volume of medical data and research to recommend the best treatments. And AI will usher in the fourth industrial revolution — after steam, mass production and automation — intelligent robotics will drive a new wave of productivity improvements and enable mass consumer customization. AI will touch everyone. The era of AI is here.

Syndicated article courtesy of Nvidia

By Jen-Hsun Huang

Jen-Hsun Huang founded NVIDIA in 1993 and has served since its inception as president, chief executive officer and a member of the board of directors.

NVIDIA invented the GPU in 1999 and, from its roots as a PC graphics company, has gone on to become the world leader in AI computing.

Making Enterprise IT Affordable for Small Businesses with the Cloud

Making Enterprise IT Affordable

Recent advancements in cloud technology have made enterprise IT services, like DNS management, a reality for even small businesses.

Customers have started to expect the same levels of online performance from small businesses as they do from enterprises. Everything from application and network performance to DNS resolution times is being held to the same standard as tech giants like Google. If you can’t meet these standards, the Twittersphere will explode, your brand could be damaged, and you could lose revenue… all because you can’t be Google.

Everyone wants to point the finger at the millennials: the demanding generation that expects every business, no matter its size or scale, to have a responsive website, a mobile app, and a social media presence, and expects everything to load within two seconds or less, or else you’ll have to deal with a scathing Yelp review.

But you’d be wrong to assume it’s their fault. Nearly every generation has become accustomed to these demands, to the point where they have become standards for all online businesses. While some demands may seem outlandish, we are only going to focus on the critical ones that apply to all industries and businesses.

If you are a modern business, then you need to make sure your content is readily accessible and loads quickly regardless of a customer’s location or device.

How are small businesses supposed to maintain stride with these performance metrics? Most companies don’t have the resources, connections, or know-how to engineer the same performance as enterprise organizations. Let alone the time to stay on top of Internet trends, vulnerabilities, and regulations.

The Answer is ITOS

The ITOS (Internet Traffic Optimization Services) industry strives to bridge the gap by using cloud technology to help companies of all sizes achieve the same performance goals as enterprises. ITOS uses cloud-hosted management platforms to give small businesses the same global infrastructure as a tech giant, without the tech giant price tag.

Recent studies have shown that migrating to the cloud can and will save your organization money, no matter how large or small your network needs are.

These networks use Anycast technology, which is hosted in the cloud, self-healing, and highly redundant. Anycast networks are able to authoritatively represent a domain’s name servers at multiple points of presence. That means your domain’s DNS information is hosted at dozens of locations around the world, on multiple name servers at any given time. This dramatically reduces the time it takes for clients to resolve your domain because your DNS information is hosted locally. It’s simple physics: the closer you are to your end-users, the faster your site will load.
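
The physics can be put in rough numbers. Light in fibre travels at roughly two-thirds the speed of light in a vacuum, about 200 km per millisecond, so every extra thousand kilometres to the answering name server adds on the order of 10 ms to each DNS round trip. A hypothetical back-of-the-envelope sketch:

```python
# Rough estimate of how distance to the answering name server affects DNS
# round-trip time. 200 km/ms approximates the speed of light in fibre;
# real paths add routing and queuing overhead, so these are lower bounds.
FIBRE_KM_PER_MS = 200

def min_rtt_ms(distance_km):
    return 2 * distance_km / FIBRE_KM_PER_MS   # out and back

for label, km in [("same metro PoP", 50), ("same continent", 2000), ("other side of the world", 12000)]:
    print(f"{label:>24}: >= {min_rtt_ms(km):.1f} ms per DNS lookup")
```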

Now mom-and-pop shops can take advantage of multi-million-dollar networks with infrastructure at dozens of critical peering hubs around the world.

But speed is only one of many benefits that small businesses gain when implementing an ITOS solution. DNS management has dramatically evolved through the migration to cloud-hosted networks, but more importantly through the availability of big data. The cloud has made big data faster, more affordable, and updatable in real time. Now you can use big data analytics to influence routing decisions in real time, gathering critical insights about your end-users’ routing patterns and behaviors and making intelligent routing decisions customized on a per-user basis.

If you want to learn more about how to implement an ITOS solution to improve your business’s online performance, you can download this eBook for free here.

By Steven Job

Why a White Label Cloud for Emerging Economies

White Label Cloud 

Given our starting point, one of the inquiries we field every now and then is: ‘why did we opt to go SaaS?’ By not going the B2C route (like Dropbox and OneDrive), we laid out what we believe to be our roadmap to success. With an eye for emerging economies, I’ll take you through the process of why we chose the SaaS route.

What is White Label?

In the mobile carrier cloud space, one term you will hear tossed around quite a bit is ‘white label’. What white label means for us is that mobile carriers can choose to brand our cloud service as their own. The term has roots in the music industry (for those wondering), and basically it’s an umbrella term for products you can brand as your own.

For reference, here’s our cloud offered as Vestel’s Vestel Cloud. By design, you will not find one hint of Cloudike on the website because the cloud has been branded and marketed as if it were Vestel’s own.

When we thought about how we wanted to proceed, we saw telecoms as our best shot at building a sustainable business. For us, these were the points that helped us favour a B2B2C model rather than a B2C one:

  • For a B2C product in emerging markets, new market entry is a very difficult feat: Everything from local partners to marketing has to be done from the ground up – even finding someone on the ground to manage all these things is a hurdle.
  • Mobile carriers have an existing customer base: This means we can bypass any need to spend tremendous capital on marketing, ads, and other methods to acquire users.
  • ARPU in emerging markets is not particularly high: E-commerce is still a relatively new concept for many in emerging markets, and given lower incomes, consumers are not racing to spend money online. With mobile carriers, however, spending on cloud services can instead be bundled into the mobile phone bill, a process most consumers are already familiar with.

Why the Emerging Market?

From a market standpoint, we found that mobile carriers in the US and Western Europe had adopted cloud services already and that companies were fighting tooth and nail for opportunities.

On the other hand, we examined emerging markets and saw increasing rates of mobile and internet connectivity, both of which were very promising.

If you take a look at this report by the Asia Cloud Computing Association in 2016, nearly every emerging economy has some variation of a government assistance program aimed at increasing web infrastructure and connectivity. Given research that points towards connected users naturally gravitating towards cloud services, we knew it was only a matter of time before these markets reached their potential.

Consider also that mobile carriers are among the few entities in emerging markets that can afford the data centres needed to host a cloud service. If you can put two and two together, you can see our thought process three years ago.

Our Results

So when you factor in the new market entry requirements plus the infrastructure hurdle, the logical direction for us was a cloud platform for mobile carriers.

However, unlike B2C or even SMB B2B, signing a mobile carrier to a service is a far more difficult endeavour. As you can imagine, entities with 100,000+ customers are not going to be easy to sway.

Even as we have refined our pitch to mobile carriers on how to best roll out our service, signing a client is still an ~8-month process. Even with references from other telecoms and major OEMs, there are still many hoops we have to jump through before we have everything ready to go for mobile carrier customers. This includes everything from software security tests and implementation timelines to, of course, the contract negotiations themselves.

That said, we take the wait time as a cost of doing business. Given our product and where we like to operate, we have no doubt that our way is the most secure and success-bound path.

Status Now

I think if there’s one thing we’ve been sure of, it’s that the approach we built three years ago was the right one. We’ve found mobile carriers that need cloud services in emerging markets, and trends such as growing cloud adoption have proven true, as evidenced by our business pipeline.

While we still believe our product has many innovative upgrades to come, we feel that the SaaS-in-emerging-markets model has thus far been the right one.

By Max Azarov
