The Intelligent Industrial Revolution

AI Revolution

Prefatory Note: Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years. Some 1,500 AI startups have cropped up. This explosive growth has fueled demand for GTCs all over the world. So far, we’ve held events in Beijing, Taipei, Amsterdam, Tokyo, Seoul and Melbourne. Washington is set for this week and Mumbai next month. I kicked off four of the GTCs. Here’s a summary of what I talked about, what I learned and what I see in the near future as AI, the next wave in computing, revolutionizes one industry after another.

A New Era of Computing

Intelligent machines powered by AI computers that can learn, reason and interact with people are no longer science fiction. Today, a self-driving car powered by AI can meander through a country road at night and find its way. An AI-powered robot can learn motor skills through trial and error. This is truly an extraordinary time. In my three decades in the computer industry, none has held more potential, or been more fun. The era of AI has begun.

Our industry drives large-scale industrial and societal change. As computing evolves, new companies form, new products are built, our lives change. Looking back at the past couple of waves of computing, each was underpinned by a revolutionary computing model, a new architecture that expanded both the capabilities and reach of computing.

In 1995, the PC-Internet era was sparked by the convergence of low-cost microprocessors (CPUs), a standard operating system (Windows 95), and a new portal to a world of information (Yahoo!). The PC-Internet era brought the power of computing to about a billion people and realized Microsoft’s vision to put “a computer on every desk and in every home.” A decade later, the iPhone put “an Internet communications” device in our pockets. Coupled with the launch of Amazon’s AWS, the Mobile-Cloud era was born. A world of apps entered our daily lives and some 3 billion people enjoyed the freedom that mobile computing afforded.

Today, we stand at the beginning of the next era, the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognize patterns from massive amounts of data — has proven to be “unreasonably” effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionize every industry.

GPU Deep Learning “Big Bang”

Why now? As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just several days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition, beating all the human expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.

The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

An End-to-End Platform for a New Computing Model

As a new computing model, GPU deep learning is changing how software is developed and how it runs. In the past, software engineers crafted programs and meticulously coded algorithms. Now, algorithms learn from tons of real-world examples — software writes itself. Programming is about coding instructions. Deep learning is about creating and training neural networks. The network can then be deployed in a data center to infer, predict and classify from new data presented to it. Networks can also be deployed into intelligent devices like cameras, cars and robots to understand the world. With new experiences, new data is collected to further train and refine the neural network. Learnings from billions of devices make all the devices on the network more intelligent. Neural networks will reap the benefits of both the exponential advance of GPU processing and large network effects — that is, they will get smarter at a pace far faster than Moore’s Law.
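
To make the contrast concrete, here is a minimal sketch of that workflow, written with PyTorch purely for illustration (the article does not prescribe a framework). Instead of hand-coding a rule, the engineer defines a network and shows it examples; the mapping from input to output is learned from the data:

```python
# Illustrative sketch only (assumes PyTorch is installed): learn a simple
# mapping from examples instead of coding the rule by hand.
import torch
import torch.nn as nn

# Toy data: noisy samples of y = 2x + 1. In practice this would be real-world data.
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)   # compare predictions against the examples
    loss.backward()               # compute gradients of the error
    optimizer.step()              # adjust the weights: the "learning" step

# The trained network can now infer on data it has never seen.
print(model(torch.tensor([[0.5]])).item())
```

The engineer never writes the rule itself; the weights that encode it are produced by training, and the same trained network can then be deployed for inference.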

Whereas the old computing model is “instruction processing” intensive, this new computing model requires massive “data processing.” To advance every aspect of AI, we’re building an end-to-end AI computing platform — one architecture that spans training, inference and the billions of intelligent devices that are coming our way.

Let’s start with training. Our new Pascal GPU is a $2 billion investment and the work of several thousand engineers over three years. It is the first GPU optimized for deep learning. Pascal can train networks that are 65 times larger or faster than the Kepler GPU that Alex Krizhevsky used in his paper. A single computer of eight Pascal GPUs connected by NVIDIA NVLink, the highest throughput interconnect ever created, can train a network faster than 250 traditional servers.

Soon, the tens of billions of internet queries made each day will require AI, which means that each query will require billions more math operations. The total load on cloud services will be enormous to ensure real-time responsiveness. For faster data center inference performance, we announced the Tesla P40 and P4 GPUs. P40 accelerates data center inference throughput by 40 times. P4 requires only 50 watts and is designed to accelerate 1U OCP servers, typical of hyperscale data centers. Software is a vital part of NVIDIA’s deep learning platform. For training, we have CUDA and cuDNN. For inferencing, we announced TensorRT, an optimizing inferencing engine. TensorRT improves performance without compromising accuracy by fusing operations within a layer and across layers, pruning low-contribution weights, reducing precision to FP16 or INT8, and many other techniques.
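
Two of the techniques listed above, pruning low-contribution weights and reducing precision to INT8, can be illustrated in a few lines of NumPy. This is a conceptual sketch of the ideas only, not the TensorRT API:

```python
# Conceptual illustration (NumPy), not the TensorRT API: prune small weights
# and quantize the remainder to INT8 for faster, lighter inference.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.1, size=(256, 256)).astype(np.float32)

# Prune: zero out the 30% of weights with the smallest magnitude (low contribution).
threshold = np.quantile(np.abs(weights), 0.30)
pruned = np.where(np.abs(weights) < threshold, 0.0, weights)

# Quantize: map the remaining float range onto signed 8-bit integers with a scale.
scale = np.max(np.abs(pruned)) / 127.0
q_int8 = np.clip(np.round(pruned / scale), -127, 127).astype(np.int8)

# Inference uses the INT8 weights plus the scale, trading a little accuracy
# for much higher throughput and lower memory use.
dequantized = q_int8.astype(np.float32) * scale
print("max quantization error:", float(np.max(np.abs(dequantized - pruned))))
```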

Someday, billions of intelligent devices will take advantage of deep learning to perform seemingly intelligent tasks. Drones will autonomously navigate through a warehouse, find an item and pick it up. Portable medical instruments will use AI to diagnose blood samples onsite. Intelligent cameras will learn to alert us only to the circumstances that we care about. We created an energy-efficient AI supercomputer, Jetson TX1, for such intelligent IoT devices. A credit card-sized module, Jetson TX1 can reach 1 TeraFLOP FP16 performance using just 10 watts. It’s the same architecture as our most powerful GPUs and can run all the same software.

In short, we offer an end-to-end AI computing platform — from GPU to deep learning software and algorithms, from training systems to in-car AI computers, from cloud to data center to PC to robots. NVIDIA’s AI computing platform is everywhere.

AI Computing for Every Industry

Our end-to-end platform is the first step to ensuring that every industry can tap into AI. The global ecosystem for NVIDIA GPU deep learning has scaled out rapidly. Breakthrough results triggered a race to adopt AI for consumer internet services — search, recognition, recommendations, translation and more. Cloud service providers, from Alibaba and Amazon to IBM and Microsoft, make the NVIDIA GPU deep learning platform available to companies large and small. The world’s largest enterprise technology companies have configured servers based on NVIDIA GPUs. We were pleased to highlight strategic announcements along our GTC tour to address major industries:

AI Transportation: At $10 trillion, transportation is a massive industry that AI can transform. Autonomous vehicles can reduce accidents, improve the productivity of trucking and taxi services, and enable new mobility services. We announced that both Baidu and TomTom selected NVIDIA DRIVE PX 2 for self-driving cars. With each, we’re building an open “cloud-to-car” platform that includes an HD map, AI algorithms and an AI supercomputer.

Driving is a learned behavior that we perform as second nature, yet one that is impossible to program a computer to do explicitly. Autonomous driving requires every aspect of AI — perception of the surroundings, reasoning to determine the conditions of the environment, planning the best course of action, and continuous learning to improve our understanding of the vast and diverse world. The wide spectrum of autonomous driving requires an open, scalable architecture — from highway hands-free cruising, to autonomous drive-to-destination, to fully autonomous shuttles with no drivers.

NVIDIA DRIVE PX 2 is a scalable architecture that can span the entire range of AI for autonomous driving. At GTC, we announced DRIVE PX 2 AutoCruise designed for highway autonomous driving with continuous localization and mapping. We also released DriveWorks Alpha 1, our OS for self-driving cars that covers every aspect of autonomous driving — detection, localization, planning and action.

We bring all of our capabilities together into our own self-driving car, NVIDIA BB8. Here’s a little video:

NVIDIA is focused on innovation at the intersection of visual processing, AI and high performance computing — a unique combination at the heart of intelligent and autonomous machines. For the first time, we have AI algorithms that will make self-driving cars and autonomous robots possible. But they require a real-time, cost-effective computing platform.

At GTC, we introduced Xavier, the most ambitious single-chip computer we have ever undertaken — the world’s first AI supercomputer chip. Xavier is 7 billion transistors — more complex than the most advanced server-class CPU. Miraculously, Xavier has the equivalent horsepower of DRIVE PX 2 launched at CES earlier this year — 20 trillion operations per second of deep learning performance — at just 20 watts. As Forbes noted, we doubled down on self-driving cars with Xavier.

  • AI Enterprise: IBM, which sees a $2 trillion opportunity in cognitive computing, announced a new POWER8 and NVIDIA Tesla P100 server designed to bring AI to the enterprise. On the software side, SAP announced that it has received two of the first NVIDIA DGX-1 supercomputers and is actively building machine learning enterprise solutions for its 320,000 customers in 190 countries.
  • AI City: There will be 1 billion cameras in the world in 2020. Hikvision, the world leader in surveillance systems, is using AI to help make our cities safer. It uses DGX-1 for network training and has built a breakthrough server, called “Blade,” based on 16 Jetson TX1 processors. Blade requires 1/20 the space and 1/10 the power of the 21 CPU-based servers of equivalent performance.
  • AI Factory: There are about 2 million industrial robots in operation worldwide. Japan is the epicenter of robotics innovation. At GTC, we announced that FANUC, the Japan-based industrial robotics giant, will build the factory of the future on the NVIDIA AI platform, from end to end. Its deep neural network will be trained with NVIDIA GPUs, GPU-powered FANUC Fog units will drive a group of robots and allow them to learn together, and each robot will have an embedded GPU to perform real-time AI. MIT Tech Review wrote about it in its story “Japanese Robotics Giant Gives Its Arms Some Brains.”
  • The Next Phase of Every Industry: GPU deep learning is inspiring a new wave of startups — 1,500+ around the world — in healthcare, fintech, automotive, consumer web applications and more. Drive.ai, which was recently licensed to test its vehicles on California roads, is tackling the challenge of self-driving cars by applying deep learning to the full driving stack. Preferred Networks, the Japan-based developer of the Chainer framework, is developing deep learning solutions for IoT. Benevolent.ai, based in London and one of the first recipients of DGX-1, is using deep learning for drug discovery to tackle diseases like Parkinson’s, Alzheimer’s and rare cancers. According to CB Insights, funding for AI startups hit over $1 billion in the second quarter, an all-time high.

The explosion of startups is yet another indicator of AI’s sweep across industries. As Fortune recently wrote, deep learning will “transform corporate America.”  

AI for Everyone

AI can solve problems that seemed well beyond our reach just a few years back. From real-world data, computers can learn to recognize patterns too complex, too massive or too subtle for hand-crafted software or even humans. With GPU deep learning, this computing model is now practical and can be applied to solve challenges in the world’s largest industries. Self-driving cars will transform the $10 trillion transportation industry. In healthcare, doctors will use AI to detect disease at the earliest possible moment, to understand the human genome to tackle cancer, or to learn from the massive volume of medical data and research to recommend the best treatments. And AI will usher in the 4th industrial revolution — after steam, mass production and automation — intelligent robotics will drive a new wave of productivity improvements and enable mass consumer customization. AI will touch everyone. The era of AI is here.

Syndicated article courtesy of Nvidia

By Jen-Hsun Huang

Jen-Hsun Huang founded NVIDIA in 1993 and has served since its inception as president, chief executive officer and a member of the board of directors.

NVIDIA invented the GPU in 1999 and, from its roots as a PC graphics company, has gone on to become the world leader in AI computing.

Data Sharing: A Matter of Transparency and Control

Janrain’s Consumer Identity Survey Shows 93% are Concerned How Brands Use/Share Their Online Activity

It comes as no surprise that people suffer from anxiety when sharing their personal information, even with big brands and names in the social media and eCommerce field. What does come as a surprise is the sheer number of netizens who share these feelings.

A recent research report distributed by Marketwired found that more than 93 percent of online users are concerned about how their information is used online. (Below is a colorful infographic created by the group at Janrain.)

So what are some of the reasons behind this hesitation?

(Infographic: Janrain Consumer Identity Survey)

The Five Rules of Security and Compliance in the Public Cloud Era

Security and Compliance 

With technology at the heart of businesses today, IT systems and data are being targeted by criminals, competitors and even foreign governments. Every day, we hear about how another retailer, bank or Internet company has been hacked and private information of customers or employees stolen. Governments and oversight organizations are responding to these attacks with calls for tighter control and regulations, from the Society for Worldwide Interbank Financial Telecommunication (SWIFT) beefing up its requirements for members to new proposed regulations targeting financial institutions in the State of New York. It is no wonder that, as enterprises embrace the public cloud to run their critical applications, compliance remains one of the top concerns.

Biggest Barriers Holding You Back

Enterprises used to regard IT compliance audits and certifications, e.g., HIPAA for hospital IT systems or PCI DSS for banks and e-commerce companies, primarily from the perspective of staying on the right side of the law. But this is changing – companies across all industries are now willing to spend on IT security and compliance, not only to deal with legal requirements but also to win customer trust and ensure that they don’t make headlines for the wrong reasons.

Security and compliance in public-cloud environments are fundamentally different from private datacenter security. Old techniques and controls (e.g., connecting to physical switch TAP/SPAN ports and sniffing traffic, installing gateway firewalls at perimeters) do not work in the cloud any more. With compliance playing a key role in IT security and governance, it is important to keep a few guidelines in mind when it comes to managing public-cloud environments.

1. Start with a dose of security common sense: Common data and information security best practices lie at the heart of compliance standards such as HIPAA and PCI DSS as well as of security frameworks such as the CIS Benchmarks for Amazon Web Services (AWS). For example, compliance rulesets for cloud environments typically stipulate password policies, encryption of sensitive data and configuration of security groups. Enterprise IT and security teams would do well to incorporate these rules into their security management, irrespective of compliance requirements.
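
As one hedged example of such a rule in practice, the short script below (assuming boto3 and configured AWS credentials) flags security groups that leave SSH open to the entire internet, a check along the lines of the CIS Benchmarks for AWS:

```python
# Minimal sketch (boto3 assumed): flag security groups exposing SSH to 0.0.0.0/0.
import boto3

ec2 = boto3.client("ec2")

for group in ec2.describe_security_groups()["SecurityGroups"]:
    for rule in group.get("IpPermissions", []):
        open_to_world = any(
            r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])
        )
        from_port, to_port = rule.get("FromPort"), rule.get("ToPort")
        # FromPort is absent when the rule covers all traffic ("-1" protocol).
        covers_ssh = from_port is None or (from_port <= 22 <= (to_port or from_port))
        if open_to_world and covers_ssh:
            print(f"{group['GroupId']} ({group['GroupName']}): SSH open to the world")
```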

2. Remember the shared-responsibility model: Public cloud providers such as AWS follow a shared-responsibility model; they manage the security of the cloud and leave security in the cloud (environment) to the customer. These clouds have invested heavily to build security into their products and develop customer confidence. AWS has robust controls in place to maintain security and compliance with industry standards such as PCI and ISO 27001. In going from datacenters to public cloud environments, security administrators need to understand what aspects of security compliance they are responsible for in the cloud. This requires cross-functional collaboration between the operations and security teams to map the security controls in the datacenter to those in public-cloud environments.

3. Stay compliant all the time: In the software-defined world of public clouds, where a simple configuration change can expose a private database or application server to the world, there are no second chances. Enterprises are moving from periodic security checks to continuous enforcement and compliance. Businesses that develop and deploy applications in clouds need to bake security and compliance checks into the development and release process. A software build that causes a security regression or does not meet the bar for compliance should not be released to a production environment. Enterprise IT needs to ensure that the tools they use for compliance monitoring and enforcement allow them to check applications for compliance before they are deployed.

4. Automate or die: Manual security and compliance processes don’t work in the dynamic, scalable world of the public cloud. When a business’ cloud environment spans hundreds or thousands of instances across accounts, regions and virtual private clouds, just the process of gathering the data required to run a compliance audit can take days or weeks, driving up the time to compliance and increasing the risk of errors. Even a team of qualified security personnel may not be able to detect vulnerabilities and respond in a timely manner. Automation is key to survival in the public cloud. It is no wonder that Michael Coates, the trust and infosec officer of Twitter, said “Automate or die. This is the biggest thing I stick by in this day and age.” In selecting the tools to manage compliance in cloud environments, enterprise IT must regard automated data aggregation, compliance checking and enforcement of security gold standards as table stakes.
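
A hedged sketch of what that automation can look like (again assuming boto3 and AWS credentials): rather than compiling an audit by hand, a script walks every region and aggregates the data in one pass. Here it simply counts running instances per region; a real pipeline would feed the results into automated compliance checks and remediation:

```python
# Illustrative automation sketch (boto3 assumed): gather inventory across all
# regions automatically instead of assembling an audit manually.
import boto3

regions = [r["RegionName"] for r in boto3.client("ec2").describe_regions()["Regions"]]

for region in regions:
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instances")
    count = 0
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            count += len(reservation["Instances"])
    print(f"{region}: {count} running instances")
```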

5. Don’t just find it, fix it: There is an abundance of security-monitoring products in the market today that allow administrators to find security misconfigurations and vulnerabilities but do not offer the control to fix these issues. These tools are limited in scope and utility and force enterprise IT to use a patchwork of tools to manage the security and compliance lifecycle. Businesses should pick comprehensive “find it, fix it, stay fixed” platforms that do not stop at identifying issues with the environment but offer the tools required to fix them and put safeguards and controls in place to ensure that security best practices are enforced.

Public clouds are transforming the world of enterprise IT by offering unprecedented agility and a pay-as-you-grow operational model. Clouds are also changing the rules of the game for IT security and compliance management by offering new controls and capabilities. The tools and processes that served IT well in datacenter environments will not work in the public cloud. It is time for security and compliance to be transformed as well.

By Suda Srinivasan, Vice President of Growth at Dome9

Suda is the Vice President of Growth at Dome9, where he oversees marketing and customer growth. Prior to Dome9, Suda held a senior marketing role at Nutanix where he was responsible for defining, communicating and driving the execution of the go-to-market strategy for the company’s enterprise cloud platform. Suda is a seasoned leader with extensive experience in technology, having worked in engineering, strategy consulting and marketing roles at Nutanix, Microsoft, Coraid and Deloitte.

Cloud Native Trends Picking Up – Legacy Security Losing Ground

Cloud Native Trends

Once upon a time, only a select few companies like Google and Salesforce possessed the knowledge and expertise to operate efficient cloud infrastructure and applications. Organizations that bought services from those companies benefited from apps offering new levels of flexibility, scalability and cost effectiveness.

These days, the sharp division between cloud and on-premises infrastructure is quickly becoming a thing of the past. In fact, the cloud has become so ingrained in the fabric of the enterprise computing experience that we often don’t even use the term “cloud” as a descriptive qualifier, but rather take it for granted as an inherent and vital component of all IT environments.

In the enterprise, where once traditional on-premises software like Oracle and SAP dominated the IT environment, organizations are now increasingly turning to cloud and cloud native capabilities – that is, applications built from microservices running in containers, or installed in cloud-based virtual machines (VMs) – to achieve greater efficiency and better economic value from IT services.

Why the surge of interest in cloud native technologies? Organizations making new and ambitious forays into the world of cloud native get to press the proverbial “reset button.” For them, it’s an opportunity to do things differently, from customer-facing applications all the way down to the infrastructure layer.

And the advantages are tremendous. The ability to develop and manage applications in a true modular fashion – to troubleshoot and update components up and down the stack without impacting other parts of the application – delivers better efficiency and strong economic benefits, which is why more and more organizations are rolling up their sleeves and diving headfirst into this new arena.

One of the driving forces behind this technological and economic transformation is the proliferation of container technologies like Docker, which help enable automated deployment of cloud native applications. You only have to look at the numbers to appreciate Docker’s exponential growth. By February 2016, 2 billion Docker images had been pulled from Docker Hub.

That number surpassed 5 billion in August 2016, according to statistics published by Docker. If this growth trajectory holds, it’s very likely that by 2020 nearly 100 percent of net new enterprise applications will be cloud native and a significant portion of legacy applications will be migrated to cloud native infrastructure.

The ripple effects of this massive shift are extensive. One of the ramifications is that traditional IT tooling suites are going to lose quite a bit of “real estate.” For example, traditional storage mechanisms will likely give way to software-defined storage. Traditional networking with physical routers connected to physical endpoints will be replaced with virtual overlay networks whose topologies can change in an instant. And security mechanisms that work on traditional host or VM boundaries will need to adopt a new semantic lens to address containers and container-equivalent constructs.

Is this shift occurring already? The short answer is yes. Many user organizations are either already in the middle of the transformation or are actively preparing for this impending reality. Adobe, the Silicon Valley-based digital media company, is moving its hugely popular Creative Cloud services to cloud native infrastructure. Online payroll service provider ADP made an early and critical bet on Docker technology and is transforming many of its applications and services to a cloud native implementation. GE Digital’s Predix system will be largely built on container infrastructure. Even GSA, the largest service provider to the U.S. government, invested heavily in Docker and microservice-related technologies to modernize service delivery to government agencies.

But what may be an even bigger harbinger of changes to come is that many startups aren’t investing in legacy products, but instead are leap-frogging over traditional solutions right into container technologies and cloud native apps. And the startup companies of today will be the new industry visionaries and leaders of tomorrow. While the apex of this technological shift might still be some time in the future, organizations that are laying the foundation for this transformation today will not only have a competitive edge tomorrow, but will also help pioneer an entirely new era of digital transformation.

By Chenxi Wang

Dr. Chenxi Wang, current chief strategy officer at Twistlock, is a security industry veteran and a respected thought leader. She has held strategy leadership positions at Intel and CipherCloud, following a stint as a highly respected industry analyst at Forrester Research. Earlier in her career, Chenxi held a faculty position at Carnegie Mellon University. She has a Ph.D. in Computer Science from the University of Virginia.

Introducing and Implementing Voice Biometrics in Call Centers

Voice Biometrics in Call Centers

It wouldn’t be wrong to say that voice biometrics is the way of the future, when it comes to verifying the identity of customers contacting call centers. Market research firm Forrester, for one, predicts it will be the go-to authentication solution for financial institutions by 2020.

But it is just as accurate to say that voice biometrics is rapidly being recognized as today’s best practice as well. Already, major businesses in such sectors as banking and finance, healthcare, telecom, and other security-sensitive fields are recognizing that voice authentication offers a wide array of compelling benefits.

For one thing, it vastly improves the customer experience by doing away with the unwelcome interrogations that call centers traditionally needed to go through to identify each caller. Since voice authentication relies on the caller’s normal conversation itself, and verifies a caller’s identity in real time without requiring any effort on the caller’s part, the process is frustration-free, unlike a barrage of questions. In fact, most consumers say they prefer voice authentication to jumping through the current hoops. Second, because voice authentication takes into account more than 100 variables of speech in a sophisticated mathematical expression, it offers a degree of accuracy and security that rivals or exceeds the certainty of a fingerprint.

(Infographic Source: NJIT)
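
Under the hood, verification typically reduces to comparing a live voiceprint against the one captured at enrollment. The sketch below is a simplification that assumes a speaker-recognition model has already turned each call into an embedding vector; the names and the 0.8 threshold are illustrative, not taken from any specific product:

```python
# Simplified verification step: compare a live voiceprint embedding against
# the enrolled one with cosine similarity. The vectors here are stand-ins for
# the output of a real speaker-recognition model.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live_embedding: np.ndarray, enrolled_voiceprint: np.ndarray,
           threshold: float = 0.8) -> bool:
    # Accept the caller if the live voiceprint is close enough to the enrolled
    # one; the threshold trades security against convenience.
    return cosine_similarity(live_embedding, enrolled_voiceprint) >= threshold

# Example with synthetic vectors standing in for real model output.
enrolled = np.random.default_rng(1).normal(size=256)
live = enrolled + 0.1 * np.random.default_rng(2).normal(size=256)
print(verify(live, enrolled))
```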

And, in no small matter for businesses, it offers benefits that go directly to the bottom line. By eliminating time spent on verifying identity every time the phone rings, it frees up employees for the revenue-generating activities at the heart of their jobs.

That said, it is still the case that making the transition from the old way to the new and improved way doesn’t come without challenges. Fortunately, with the right guidance for efficient implementation, these adoption challenges become negligible.

Facing the hurdles, and clearing them

  • As much as deploying a voice authentication solution is a technical challenge, it is also a legal one in many jurisdictions. It can’t happen without the consent of the customers, so investigating the requirements and potential issues is an essential starting point.
  • Once legal questions are resolved, the next step is optimizing the process of actually asking for consent. The key to mounting a successful recruitment campaign includes not only making an effective pitch by way of carefully selected channels (mass media, email etc.), but also providing consumers with the information they need about voice biometrics to make an informed decision about whether they want to opt in.
  • Enrolling those who give consent demands yet another optimized process to collect and maintain all the necessary records, but it also calls for attention to a crucial factor. If done less than optimally, gathering customers’ voiceprints can be a lengthy, complex and expensive proposition. The alternative, as pointed out by the experts at NICE, a leading provider of voice biometrics solutions, is to enroll customers “passively.” In an “active” approach, customers might be asked to repeat a phrase a number of times to create a voiceprint, a process that undermines the customer-experience gains voice biometrics offers. The passive approach instead employs a solution that integrates with existing call recording capabilities to leverage historical calls. Once they have given their consent, customers can be enrolled without having to do anything at all.
  • The need for integration doesn’t end with enrollment. It is not uncommon for call centers to need to integrate voice biometrics technology with a number of other systems such as security and CRM software. That can be a lengthy and costly process when an ad-hoc integration is attempted, but selecting a biometrics product that offers ready-made end-to-end support or that features embedded APIs can alleviate the problems.

The advantages outweigh the challenges

Everybody wins with voice biometrics. It puts the customer first, because it eliminates extra steps and frustration. Businesses and customers alike benefit from the added security it provides, and from the shorter call times, which pay off in convenience for the customer and increased ROI for the company – especially when the company has selected a biometrics solution that adapts to all necessary integrations.

By Naomi Webb

3 Keys To Keeping Your Online Data Accessible

Online Data

Data storage is often a real headache for businesses. Additionally, the shift to the cloud in response to storage challenges has caused security teams to struggle to reorient, leaving 49 percent of organizations doubting their experts’ ability to adapt.

Even so, decision makers should not put off moving from old legacy systems to a more flexible and accessible solution — whether public or private. By putting the right system in place, businesses can free up IT staff for more strategic projects, ensure content is available and retrievable whenever and wherever it’s needed, and analyze data effectively for actionable insights.

Keeping Data Accessible

There are several obstacles that must be overcome when keeping data accessible:

1. Storage silos have been a problem since the first digital storage devices hit the market. Directory and folder hierarchical structures were fairly useful when dealing with a limited number of files, all accessed by users or a handful of applications that knew where to find the files.

But in today’s connected world, where remote collaboration and access by many devices and applications is the norm, these hierarchical structures are hindering new workflows and locking files in place (hence the term “silo”).

2. Search issues present a number of operational and financial challenges to businesses. Searching for data from multiple systems spread across several geographic locations is a laborious task, and the need to use both past and present data makes it even trickier.

The data that is searched is often indexed in a database located in a specific application. This valuable “data about the data,” also known as metadata, needs to be stored in a way that enables portability. The Internet of Things has opened businesses to a world of new data possibilities, but going back to a specific application to search for your file or continuously migrating entire data sets to different analysis applications wastes valuable time and introduces the possibility of errors.

3. Scalability dilemmas in storage capacity, both in file count and amount of data, as well as in expansion to different geographies, prevent businesses from keeping pace with modern data accessibility requirements.

Most organizations keep data forever because they don’t know what will have value. There are also many use cases in which government regulations require longer retention times and tighter security, creating a compounding effect on storage needs. This growth, combined with the need to keep the data accessible, poses a serious problem for traditional network attached storage solutions, file systems, and their complex hierarchical structures.

Making Your Storage Efficient

While certainly challenging, these problems are far from insurmountable.  Here are 3 easy-to-implement solutions to help keep storage simple and efficient:

1. Consolidate your data in one storage platform.

The dawn of the cloud was a major breakthrough for data storage, and the first step toward a simplified storage process is to embrace that technology. Sharing resources in a virtual environment is at the heart of the transformation we’re seeing to a more service-based approach in IT.

You can now stand up a storage service within your own data center (a private solution) or use any one of the services on the market (a public solution). If you need to keep your data secure or plan to keep the data for more than three years, private is most likely your best option. However, if you have limited data center space or only need to store data for a few months or years, public is probably the way to go.

2. Leverage metadata.

Data is growing at an astonishing rate, and experts predict the digital universe will reach 44 trillion gigabytes by 2020. But what use is that data if it can’t be found or identified? Metadata is an essential tool for simplifying data storage because it allows managers to quickly and automatically identify characteristics about data in a way that can continuously evolve, providing new views of ever-changing data sets.

The key is making metadata portable and accessible by any application or device in a way that’s easy to protect. For this reason, metadata searching and management must be native features of your storage systems — not just afterthoughts.

3. Adopt object storage.

Object storage is a core feature of many major cloud storage services on the market and is the most efficient and cost-effective way to reliably store and provide access to petabytes of data and trillions of files. Object storage is highly automated, resilient, and highly available, resulting in a vastly improved capacity-to-management resource ratio. It’s common to find one system administrator managing more than 10 PB of storage (compared to 1 PB for a NAS solution).

Object storage stores and retrieves data by a key or name, supplemented with metadata. Think of it like a valet service for your data: when you store something, you get a key or associate a tag (metadata) with it. All you need to do is present the key or search for a specific tag or combination of tags, and the storage system will retrieve the data that matches your request.
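
Here is a minimal sketch of that valet model using an S3-compatible API via boto3 (the bucket name, key and tags are hypothetical): the object is stored under a key with descriptive metadata, then looked up by that key and inspected through its metadata rather than a folder path:

```python
# Illustrative sketch (boto3 assumed; bucket/key names are hypothetical):
# store an object with metadata tags, then read the tags back by key.
import boto3

s3 = boto3.client("s3")

# Store: hand over the data and attach tags (metadata) to the claim ticket (key).
s3.put_object(
    Bucket="media-archive",
    Key="projects/2016/keynote-footage.mp4",
    Body=b"...video bytes...",
    Metadata={"project": "gtc-2016", "camera": "cam-03", "reviewed": "false"},
)

# Retrieve: present the key and read the metadata without downloading the body.
head = s3.head_object(Bucket="media-archive", Key="projects/2016/keynote-footage.mp4")
print(head["Metadata"])
```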

This approach not only makes data easier to find, but it also enables continuous, self-healing protection and virtually unlimited scalability. Certain vendors are also making significant advancements in integrating search and providing interfaces that plug right into existing workflows in a way that’s transparent to current users and applications.

The most effective and simple storage solutions incorporate data consolidation and the use of metadata with object storage. This provides greater data access, better protection from data corruption, and the streamlined performance necessary to keep any amount of data online, accessible, and providing value for growing businesses and organizations.

Whether you want to attribute the quote “With great power comes great responsibility” to Voltaire or Spider-Man, in the world of business, we need to preface that by saying “With great knowledge comes great power.” Once you simplify your storage, it gives you the knowledge to not only help run your business, but to also gain actionable insight and the power to make discoveries that can help you solve problems and propel your business forward.

By Jonathan Ring

Jonathan Ring is co-founder and CEO of Caringo, a leading scale-out storage provider. Prior to Caringo, Jonathan was an active angel investor advising a broad range of companies, and he was a vice president of engineering at Siebel Systems, where he was a member of the executive team that grew Siebel from $4 million to $2 billion in sales. Jonathan’s passion and experience are shaping the future of Caringo.

Benefits of Licensing Software as a Service In The Cloud

Software as a Service In The Cloud

When Microsoft moved to a monthly cloud-based subscription package for its Windows 10 operating system (Secure Productive Enterprise E3 and Secure Productive Enterprise E5), it represented the most significant recent example of software evolving into a software-as-a-service (SaaS) model. Other vendors have also continued to migrate their software and application offerings to SaaS environments.

A handful of key reasons have driven companies such as Microsoft in this direction, all of which greatly benefit businesses of all sizes. First, IT departments are shrinking, and moving software to a cloud-based subscription model allows for easier license management by service providers who act as external IT departments for businesses.

Second, a cloud-based subscription model enables businesses to license software on a per-consumption basis. Projects come and go, and their scale can vary. SaaS models enable organizations to scale their software needs based on their consumption requirements at any given time.

A Cloud-Based Business Philosophy

The decision to move Windows 10 to SaaS was born out of the success Microsoft has had with Office 365, which has been a cloud-based offering for a few years now and is enjoyed by businesses both large and small.

The timing also coincides with the change in business philosophy driven largely by the cloud itself. Businesses of every size are shifting many of their operations to the cloud, and everything from content management and social media management to customer relationship management now resides in the cloud in a SaaS environment.

This shift also impacts a larger technology picture that goes beyond business use. As more software-based resources move to the cloud, this will further shape the broader spectrum of how people, technology and “things” become interconnected, known as the Internet of Things (IoT). SaaS models are at the center of this evolution.

The Need for External IT Departments To Manage Software

Clearly put, the days of the shrink-wrapped box of software are gone, and now everything lives and is licensed in the cloud, managed by an external IT department service provider.

According to research firm Gartner, the shift to the cloud will soon be mandatory. As the firm’s recent press release puts it:

By 2020, a corporate ‘no-cloud’ policy will be as rare as a ‘no-internet’ policy is today, according to Gartner, Inc. Cloud-first, and even cloud-only, is replacing the defensive no-cloud stance that dominated many large providers in recent years. Today, most provider technology innovation is cloud-centric, with the stated intent of retrofitting the technology to on-premises.

The firm goes on to predict how organizations will embrace cloud offerings:

“By 2019, more than 30 percent of the 100 largest vendors’ new software investments will have shifted from cloud-first to cloud-only.”

SaaS models tied to licensing also enable a more seamless user experience across the multiple devices now used in business. From the laptop to the tablet to the mobile device, cloud-centric, subscription-based access to software gives users a consistent experience no matter which device they are on, with virtual access wherever they are. This is also beneficial for workflows that involve remote employees in different regions who all need access to the same files and data.

Adding Services to the Software Experience

Lastly, the word “services” is key in the SaaS relationship. Service providers acting as external IT departments can help manage the software and application experience, which includes security offerings and managing license deployments for scale. And as software vendors such as Microsoft continue to enhance their software offerings, service providers will be the experts that help manage these upgrades and new features for their organizational clients.

By Kim Kuhlmann

Kim Kuhlmann is a Senior Customer Advisor for HPE SLMS Hosting. Through its range of full-service hosted software licensing capabilities and its detailed knowledge of the latest licensing programs from Microsoft and elsewhere, HPE SLMS Hosting offers the expertise service providers need to capitalize on new opportunities and grow their businesses at the pace of the cloud services market overall.

Follow HPE SLMS Hosting on Twitter, Google+ and LinkedIn for additional insight and conversation, and visit the HPE SPaRC resource community at www.hpesparc.com.

Using Private Cloud Architecture For Multi-Tier Applications

Cloud Architecture

These days, Multi-Tier Applications are the norm. From SharePoint’s front-end/back-end configuration, to LAMP-based websites using multiple servers to handle different functions, a multitude of apps require public and private-facing components to work in tandem. Placing these apps in entirely public-facing platforms and networks simplifies the process, but at the cost of security vulnerabilities. Locating everything across back-end networks causes headaches for the end-users who try to access the systems over VPN and other private links.

Many strategies have been implemented to address this issue across traditional datacenter infrastructures. Independent physical networks with a “DMZ” for public-facing components, complex router setups and firewall configurations have all done the job, although they add multiple layers of complexity and require highly specialized knowledge and skill sets.

Virtualization has made management much easier, but virtual administrators are still required to create and manage each aspect of the configuration – from start to finish. Using a private cloud configuration can make the process much simpler, and it helps segment control while still enabling application administrators to get their jobs done.

Multi-tenancy in the Private Cloud

Private cloud architecture allows for multi-tenancy, which in turn allows for separation of the networking, back-end and front-end tiers. Cloud administrators can define logical relationships between components and enable the app admins to manage their applications without worrying about how they will connect to each other.

One example is a web-based application using a MySQL back-end data platform. In a traditional datacenter platform, the app administrators would request connectivity to either isolate the back-end database or to isolate everything and allow only minimal web traffic to cross the threshold. This requires network administrators to spend hours working with the app team to create and test firewalls and other networking rules to ensure the access they need without opening any security holes that could be exploited.

Applying private cloud methodology changes the game dramatically.

Two individual virtual networks can be created by the cloud administrator. Within each network, traffic flows freely, removing the need to manually create networking links between components in the same virtual network entirely. In addition, a set of security groups can be established that will only allow specified traffic to route between the back-end data network and the front-end web server network – specifically ports and protocols used for the transfer of MySQL data and requests. Security groups utilize per-tenant access control list (ACL) rules, which allow each virtual network to independently define what traffic it will and will not accept and route.
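
To make this concrete, here is an illustrative sketch of such per-tenant rules expressed with an AWS-style security-group API through boto3; the group IDs are hypothetical and the private cloud platform in question may expose a different interface, but the semantics are the same. The back-end network accepts MySQL traffic (port 3306) only from the front-end group, while the front-end group accepts web traffic from anywhere:

```python
# Illustrative sketch (boto3 assumed; group IDs are hypothetical placeholders).
import boto3

ec2 = boto3.client("ec2")
frontend_sg = "sg-frontend-web"   # front-end (web server) virtual network
backend_sg = "sg-backend-data"    # back-end (MySQL) virtual network

# Back end: only traffic originating from the web tier may reach MySQL.
ec2.authorize_security_group_ingress(
    GroupId=backend_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 3306, "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": frontend_sg}],
    }],
)

# Front end: web ports open to the outside world, nothing else.
ec2.authorize_security_group_ingress(
    GroupId=frontend_sg,
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
        {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    ],
)
```

Because the MySQL rule references the front-end group rather than an IP range, database traffic is permitted only when it originates from the web tier, which is exactly the behavior described next.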

Private cloud networking

Due to the nature of private cloud networking, it becomes much easier to not only ensure that approved data is flowing between the front and back end networks, but to ensure that traffic only flows if it originates from the application networks themselves. This allows for free-flow of required information but blocks anyone outside the network from trying to enter through those same ports.

In the front-end virtual network, all web traffic ports are opened so that users can access the web servers. As with the back-end network, the front-end network can be configured to reject any other protocol or port, allowing routing from the outside world only to the front-end servers and nowhere else. This has the dual effect of enabling the web servers to do their jobs while preventing other administrators or anyone else in the datacenter from gaining access, minimizing faults due to human error or malicious intent.

Once application and database servers are installed and configured by the application administrators, the solution is complete. MySQL data flows from the back-end network to the front-end network and back, but no traffic from other sources reaches that data network. Web traffic from the outside world flows into and out of the front-end network, but it cannot “leapfrog” into the back-end network because external routes would not be permitted to any other server in the configuration. As each tenant is handled separately and governed by individual security groups, app administrators from other groups cannot interfere with the web application. The admins also cannot cause security vulnerabilities by accidentally opening unnecessary ports across the board because they need them for their own apps.

Streamlined Administration

Finally, the entire process becomes easier when each tenant has access to self-service, only relying on the cloud administrator for configuration of the tenancy as a whole and for the provisioning of the virtual networks. The servers, applications, security groups and other configurations can now be performed by the app administrator, and will not impact other projects, even when they reside on the same equipment. Troubleshooting can be accomplished via the cloud platform, which makes tracking down problems much easier. Of course, the cloud administrator could manage the entire platform, but they no longer have to.

Using a private cloud model allows for greater flexibility, better security, and easier management. While it is possible to accomplish this with a traditional physical and virtual configuration, adding the self-service and highly configurable tools of a private cloud is a great way to take control, and make your systems work the way you want, instead of the other way around.

By Ariel Maislos, CEO, Stratoscale

Ariel brings more than twenty years of technology innovation and entrepreneurship to Stratoscale. After a ten-year career with the IDF, where he was responsible for managing a section of the Technology R&D Department, Ariel founded Passave, now the world leader in FTTH technology. Passave was established in 2001, and acquired in 2006 by PMC-Sierra (PMCS), where Ariel served as VP of Strategy. In 2006 Ariel founded Pudding Media, an early pioneer in speech recognition technology, and Anobit, the leading provider of SSD technology acquired by Apple (AAPL) in 2012. At Apple, he served as a Senior Director in charge of Flash Storage, until he left the company to found Stratoscale. Ariel is a graduate of the prestigious IDF training program Talpiot, and holds a BSc from the Hebrew University of Jerusalem in Physics, Mathematics and Computer Science (Cum Laude) and an MBA from Tel Aviv University. He holds numerous patents in networking, signal processing, storage and flash memory technologies.
