
Beacons Flopped, But They’re About to Flourish in the Future

Cloud Beacons Flying High

When Apple debuted cloud beacons in 2013, analysts predicted 250 million devices capable of serving as iBeacons would be found in the wild within weeks. A few months later, estimates put the figure at just 64,000, with 15 percent confined to Apple stores.

Beacons didn’t proliferate as expected, but a few forward-thinking brands did dip their toes. Macy’s, Lord & Taylor, and Sephora deployed beacons to send in-store notifications. Tech-happy airline Virgin Atlantic used them to offer currency deals and deploy blankets to chilly passengers. For the most part, though, retailers scoffed at the ahead-of-its-time geotargeting tech.

Despite their slow start, beacons are teetering on the edge of a renaissance. Apple and Google keep releasing updates to their beacon ecosystems and wallet platforms that allow many different types of integrations. Major retailers are starting to pay attention, and the barriers blocking beacons are now falling fast.

Great (Unfulfilled) Expectations

Beacons were expected to boom for good reason. Contextual relevance is the holy grail of digital marketing, and beacons deliver personalized information to passersby at opportune moments.

(Image courtesy of Unsplash and the tremendous artists involved in the initiative)

When a connected shopper nears a store, the retailer’s beacon jumps to action. It supplies the customer with information, promo deals, and personalized offers. For the business owner, it paints a data-grounded picture of the shopper’s behavior and desires. Not bad for a small, inexpensive piece of connected hardware.
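
As a rough sketch of what happens on the shopper’s phone, the snippet below parses the publicly documented iBeacon advertisement layout (proximity UUID, major, minor, measured power) and estimates distance with a simple path-loss model. The UUID, zone numbers and RSSI are made up for illustration, and a real app would of course use the platform’s beacon APIs rather than raw bytes.

```python
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Parse the manufacturer-specific data of an iBeacon BLE advertisement.

    Layout: company ID 0x004C (little-endian), type 0x02, length 0x15,
    16-byte proximity UUID, 2-byte major, 2-byte minor (big-endian),
    and a signed 1-byte measured TX power at 1 meter.
    """
    company, sub_type, length = struct.unpack_from("<HBB", mfg_data, 0)
    if company != 0x004C or sub_type != 0x02 or length != 0x15:
        raise ValueError("not an iBeacon advertisement")
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack_from(">HH", mfg_data, 20)
    tx_power = struct.unpack_from("b", mfg_data, 24)[0]
    return proximity_uuid, major, minor, tx_power

def estimate_distance_m(rssi: int, tx_power: int, path_loss_exp: float = 2.0) -> float:
    """Rough distance estimate from a log-distance path-loss model."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

# Hypothetical advertisement seen by a shopper's phone near the store entrance.
frame = (
    b"\x4c\x00\x02\x15"                                          # Apple ID, iBeacon type/length
    + uuid.UUID("f7826da6-4fa2-4e98-8024-bc5b71e0893e").bytes    # store's UUID (made up)
    + b"\x00\x01"                                                # major: store #1
    + b"\x00\x2a"                                                # minor: entrance zone 42
    + b"\xc5"                                                    # measured power at 1 m: -59 dBm
)
beacon_uuid, major, minor, tx = parse_ibeacon(frame)
print(beacon_uuid, major, minor,
      round(estimate_distance_m(rssi=-70, tx_power=tx), 1), "m")
```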

But before beacons could strut their stuff, they were indelibly stained by privacy fears. Tamped down, too, by the costs and timelines of building the necessary infrastructure, beacons lost their steam.

It’s unfortunate, but perhaps unsurprising. Concerns about security loom large in popular imagination. Consumers have long balked at being tracked by phone, and they’re not crazy about pop-up ads, either.

Retailers, though, have since learned to throw in coupons or rewards, raising the share of shoppers amenable to beacons to 70 percent. Absent incentives, Oracle found just 23 percent of consumers were comfortable with the technology tracking their movements in store and online.

Hardware lag also stymied beacons. Hardware adoption always moves more slowly than software, and — although it seems strange now — not everyone had a beacon-sensitive smartphone in 2013, nor did many retailers have the capacity to broadcast beacons.

The fault, in part, lies with Apple. Estimote co-founder Steve Cheney notes that “nearly everything Apple does…works perfectly right out of the box,” but not iBeacons. Retailers couldn’t roll out beacons until developers learned to deploy and secure them at scale, gutting “within weeks” popularity predictions. Other limiting factors like network coverage were beyond Apple’s control. Networks were traditionally fragmented and closed, but they’re opening up fast.

Beacons bombed in 2013, but 2016 is another year. Beacons require a huge, deployed network to be successful. Slowly but surely, that network has been getting implemented and is now about to hit a usable saturation point. Developers have done their digging. Smartphones abound, networks are broader, and Google has addressed the security problem.

Beacons may be gone from the news cycle, but they’re popping up fast in stores. By the first quarter of 2016, 6.2 million sensors had been deployed globally. When the second quarter rolled around, 2 million more had popped up, bringing the global total to 8.27 million. At the current pace of installation, 400 million will be broadcasting globally by 2020.

Beacon Leadership?

In the coming years, we’ll see beacons broaden from brick-and-mortar retail locations to venues of all sorts. Sports, transportation, and banking establishments are also likely turf for beacons.

Beacons offer deals and details now, but in the future, they’ll facilitate transactions. Customers will leave wallets at home in favor of smartphones. Purchases will be simpler, and context-conscious discounts will become part of the shopping experience. CVS Caremark, for example, uses Google Nearby to notify shoppers they can print in-store photos from their phones’ galleries. United Airlines’ app lets travelers access free entertainment before boarding, and The Broad, an art museum, offers patrons an audio tour. The possibilities for contextual enhancement are as diverse as businesses are themselves.

Contextual Sports and Entertainment

In sports and entertainment, venues will use beacons to engage attendees and manage tickets. With a ticket sales app, for example, an attendee will buy a ticket and download it to his digital wallet; the ticket will pop up when he arrives at the venue. Already, North American sports teams have used proximity tech to regain $1 billion in lost ticket sales. Beacons will allow brands, teams, and musicians to push contextual information, deals on refreshments, and emergency notifications directly to fans’ smartphones.


Tourism and transportation are obvious candidates for beacon technology. Apps that suggest events tailored to users’ interests and locations will help tourists explore sights free of interpreters or itineraries. So far, a beacon-infused app offers guided tours of the Berlin zoo; the University of Notre Dame uses Nearby to guide visitors to historical landmarks; and the MyStop service sends transport alerts in London, with plans to expand across the U.K.

The banking sector, too, will find beacons attractive. Millennials — now the largest U.S. age demographic — seldom visit branch locations but retain attachments to them. Beacons may be the thing to finally bring Millennials in for that auto loan, mortgage, or retirement account. Citibank is testing beacons at select New York City locations, where customers use iPads to access enhanced ATMs for services typically provided by tellers.

But with great power comes great responsibility. Businesses must ensure their offers present enough value to merit the interruption. To allay security fears, brands should give customers in-app options to disable beacon transmissions.

Beacons may not change things like the smartphone or internet did, but they’re about to make shopping smoother, marketing more precise, and live events more engaging — and they may even help us ditch those annoying chip cards.

By Tony Scherba
Tony Scherba is the president and a founding partner of Yeti LLC, a product-focused development and design studio in San Francisco.

Tony has been building software since his teen years, and he has led product development efforts for global brands such as Google, Westfield Labs, JBL, MIT, and Sony PlayStation.

Is Machine Learning Making Your Data Scientists Obsolete?

Machine Learning and Data Scientists

In a recent study, almost all the businesses surveyed stated that big data analytics were fundamental to their business strategies. Although the field of computer and information research scientists is growing faster than any other occupation, the increasing applicability of data science across business sectors is leading to an exponential deficit between supply and demand.

When a 2012 article in the Harvard Business Review, co-written by U.S. chief data scientist DJ Patil, declared the role of data scientist “the sexiest job of the 21st century,” it sparked a frenzy of hiring people with an understanding of data analysis. Even today, enterprises are scrambling to identify and build analytics teams that can not only analyze the data received from a multitude of human and machine sources, but also can put it to work creatively.

One of the key areas of concern has been the ability of machines to gain cognitive power as their intelligence capacities increase. Beyond the ability to leverage data to disrupt multiple white-collar professions, signs that machine learning has matured enough to execute roles traditionally done by data scientists are increasing. After all, advances in deep learning are automating the time-consuming and challenging tasks of feature engineering.

While reflecting on the increasing power of machine learning, one disconcerting question comes to mind: Would advances in machine learning make data scientists obsolete?

The Day the Machines Take Over


Advances in the development of machine learning platforms from leaders like Microsoft, Google, and a range of startups mean that a lot of work done by data scientists would be very amenable to automation — including multiple steps in data cleansing, determination of optimal features, and development of domain-specific variations for predictive models.

With these platforms’ increasing maturity and ability to create market-standard models and data-exchange interfaces, the focus shifts toward tapping machine-learning algorithms with a “black box” approach and away from worrying about the internal complexities.
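
To illustrate the flavor of that black-box approach, here is a small scikit-learn sketch in which data cleansing, feature selection and model tuning are composed into one pipeline and searched automatically. It is a generic illustration on synthetic data, not any particular vendor’s platform.

```python
# A sketch of treating model building as a "black box": cleansing, feature
# selection and hyperparameter search are wired into one pipeline and tuned
# automatically (scikit-learn; synthetic data for illustration only).
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, n_features=30, n_informative=8, random_state=0)

pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),    # automated cleansing
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif)),   # automated feature choice
    ("model", LogisticRegression(max_iter=1000)),
])

search = GridSearchCV(
    pipeline,
    param_grid={"select__k": [5, 10, 20], "model__C": [0.1, 1.0, 10.0]},
    cv=5,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```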

However, as with any breakthrough technology, we need to recognize that the impact of the technology is limited unless it is well-integrated into the overall business flow. Some of the most successful innovations have been driven not by a single breakthrough technology but by reimagining an end-to-end business process through creative integration of multiple existing components. Uber and Netflix offer prime examples of intelligence gleaned from data being integrated seamlessly into a company’s process flow. Data scientists play a key role in this by leveraging data to orchestrate processes for better customer experience and by optimizing through continuous experimentation.

While organizations across industries increasingly see a more strategic role for data, they often lack clarity around how to make it work. Their tendency to miss the big picture by looking for “easy wins” and working with traditional data sources means that data scientists have an opportunity to help frame problems and to clearly articulate the “realm of the possible.”

From Data to Strategy

It is easy to get carried away by the initial hype and assume that machine learning will be a panacea that can solve all problems, including settling the concerns around its impact on the roles of data science practitioners. However, let us recall the AI winters in the mid-’70s, and later in the ’90s, when the journey to the “promised land” did not pan out.


Today, we don’t see the same concerns as in the past — lack of data, data storage costs, limitations of compute power — but we still find true challenges in identifying the right use cases and applying AI in a creative fashion. At the highest of levels, it helps to understand that machine learning capability needs to translate into one of two outcomes:

  • Interaction: Understanding user needs and building better and more seamless engagement
  • Execution: Meeting customer needs in the most optimal manner with ability to self-correct and fine-tune

Stakeholder management becomes extremely important throughout the process. Framing key business problems as amenable to data-led decision-making (in lieu of traditional gut feel) to secure stakeholder buy-in is critical. Consequently, multiple groups need to be involved in identifying the right set of data sources (or best alternatives) while staying conscious of data governance and privacy considerations. Finally, stakeholders need to be fully engaged to ensure that the insights feed into business processes.

Data Scientists Become Core Change Agents

Given the hype surrounding big data analytics, data scientists need to manage responses that fall on opposite ends of the spectrum, tempering extreme optimism and handling skepticism. A combination of the following skills, which go beyond platforms and technology, is thus needed:

  • Framing solutions to business problems as hypotheses that will require experimentation, incorporating user input as critical feedback
  • Identifying parameters by which outcomes can be judged and being sensitive to the need for learning and iteration
  • Safeguarding against correlations being read as causal factors (a toy illustration follows this list)
  • Ensuring the right framework for data use and governance, given the potential for misuse
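
As a toy illustration of the correlation point above, the sketch below simulates a hidden confounder (say, store foot traffic) that drives both ad spend and sales: the two observed series correlate strongly even though neither causes the other, and controlling for the confounder makes the apparent link vanish. The scenario and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
foot_traffic = rng.normal(1000, 200, size=365)           # hidden confounder
ad_spend = 0.05 * foot_traffic + rng.normal(0, 5, 365)   # driven by traffic
sales = 0.30 * foot_traffic + rng.normal(0, 30, 365)     # also driven by traffic

# The raw correlation looks impressive...
print("corr(ad_spend, sales):", round(np.corrcoef(ad_spend, sales)[0, 1], 2))

# ...but after removing the confounder's effect (partial correlation via
# residuals), the direct ad_spend -> sales link is essentially zero here.
resid_ads = ad_spend - np.poly1d(np.polyfit(foot_traffic, ad_spend, 1))(foot_traffic)
resid_sales = sales - np.poly1d(np.polyfit(foot_traffic, sales, 1))(foot_traffic)
print("partial corr:", round(np.corrcoef(resid_ads, resid_sales)[0, 1], 2))
```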

This requires pivoting a data scientist’s remit in a company from a pure data-analysis function into a more consultative role, engaging across business functions. Data scientists are not becoming obsolete. They are becoming bigger, more powerful, and more central to organizations, morphing from technicians into change agents through the use of data.

By Guha Ramasubramanian

Guha heads Corporate Business Development at Wipro Technologies and is focused on two strategic themes at the intersection of technology and business: cybersecurity framed from a business risk perspective and how to leverage machine learning for business transformation.

Guha is currently leading the development and deployment of Apollo, an anomaly detection platform that seeks to mitigate risk and improve process velocity through smarter detection.

Technology Influencer in Chief: 5 Steps to Success for Today’s CMOs

Success for Today’s CMOs

Being a CMO is an exhilarating experience – it’s a lot like running a triathlon and then following it with a base jump. Not only do you play an active role in building a company and brand, but the decisions you make have direct impact on the company’s business outcomes for years to follow.

The role of Chief Marketing Officer (CMO) has evolved significantly in the past several years. Previously, the job was predominantly about establishing an identity — advertising, brand management and thinking creatively — but today it’s a lot more complex. CMOs are now charged with a variety of responsibilities that span technology, analytics and growth strategy, and because they’re often held accountable for contribution to company revenue, today’s CMOs are also responsible for optimizing operational processes and demonstrating measurable impact.

My personal roles in marketing have evolved over the years, as well. Before leading worldwide marketing strategy and execution as ThreatMetrix CMO, I directed the go-to-market strategy for IBM’s portfolio of Software-as-a-Service solutions. Prior to IBM, I served as a Vice President of Corporate Marketing at DemandTec and focused my role around building a modern demand generation engine and repositioning my company’s business to drastically increase our revenue.

In all of my years as a marketing leader, I’ve learned a few key lessons about what it takes to stay effective and deliver a positive return on investment for my department. Below are five tips for CMOs working to navigate today’s ever-evolving digital and mobile-focused business landscape:

1. Think like a CFO

To succeed as CMO, it’s crucial to embrace financial metrics and look beyond top-line spending numbers. Get comfortable speaking with your finance department about topics such as return on capital, budget variance, accrual accounting and revenue recognition as it applies to your marketing operations. Not only will this help build your credibility amongst other company executives, but it will also help streamline your decision-making process. Furthermore, being well-versed in the responsibilities of your CFO will broaden your business acumen and can also help refine your marketing strategy.

2. Align with sales

In addition to embracing financial metrics, it’s important to establish a close relationship with your sales department; by deeply understanding the customer journey, you’ll be able to strengthen marketing’s alignment to revenue and ultimately yield a stronger marketing-generated sales pipeline. To build stronger empathy for your sales team, I’ve found it helpful to start by moving outbound sales development representatives out of the sales team and into marketing. Also, make sure you’re comfortable giving detailed product demonstrations (as sales development reps do) and work to ensure you’re prepared to step in and lead the sales department should there ever be a leadership transition.

3. Hire creatively


It may seem obvious, but it’s important to fully understand that the people you hire will drastically impact your productivity and efficacy as CMO. Build a marketing department of leaders who complement one another. Hire people with extensive business and marketing experience and also hire recent college graduates who can offer digital fluency and an aptitude for lifetime learning. Different perspectives from different generations of workers can offer enormous value, often leading to more creative solutions and better business outcomes. Additionally, by bringing together employees with different backgrounds, you as the CMO can unify your department’s strengths to benefit your individual employees and the business as a whole.

4. Embrace ABM

The school of Account Based Marketing (ABM) may just be gaining traction today, but I predict it will be a strategic pillar for all CMOs in the near future. Implementing ABM can result in a variety of benefits, including a more focused sales and marketing strategy, a more closely aligned sales and marketing team, an improved buyer journey and additional revenue.

5. Prioritize content

It’s easy to get caught up in the slew of marketing-specific technologies, analytics solutions and metrics; however, to achieve long-term success, it’s important not to lose sight of one of the marketing basics: content. Prioritize writing excellence amongst your team, and orient your department around content publishing. Maintain an editorial calendar to ensure your content marketing themes are upheld, and make sure the content you’re creating is forward-looking and centers around what the market cares about (rather than just what you know or sell).

The role of CMO can be challenging at times, and while it’s unclear how the job may evolve in the future, it’s certain to grow even more complex and all-encompassing as technology and data continue to permeate our world. However, by practicing greater interdepartmental empathy and combining new industry techniques with effective, traditional methods, success can be realized today. Perhaps most importantly, in moving beyond the Chief ‘Marketer’ title and embracing the comprehensive business influence they’re capable of, today’s CMOs can seize the opportunity to serve as dynamic leaders and key corporate decision-makers along with their CEO and other C-suite colleagues.

By Armen Najarian, CMO, ThreatMetrix

Armen leads worldwide marketing strategy and execution for ThreatMetrix. Previously, he directed the go-to-market strategy for IBM’s $1B portfolio of 100+ SaaS solutions. Armen joined IBM through the $440M acquisition of DemandTec, where as VP of Corporate Marketing he built a modern demand generation engine and repositioned the business supporting a 3x increase in revenue over a 5 year span.

Three Challenges of Network Deployment in Hyperconverged Infrastructure for Private Cloud

Hyperconverged Infrastructure

In this article, we’ll explore three challenges that are associated with network deployment in a hyperconverged private cloud environment, and then we’ll consider several methods to overcome those challenges.

The Main Challenge: Bring Your Own (Physical) Network

One of the main challenges of deploying a hyperconverged infrastructure software solution in a data center is the diversity of physical network configurations it must work with. The smart network layer is the component tasked with automatically learning the physical network layer’s topology and capabilities. Modern data center operations are expected to be automated and fast; there is no place for traditional, customized and cumbersome installation and integration processes. When deploying hyperconverged smart software on top of a data center infrastructure, a fast and automated deployment is a necessity.


In every organization, IT operations leaders have their own philosophy about how to deploy, integrate and manage network traffic. In discussions with enterprise network experts, I’ve found that every leader has a specific “network philosophy” that generally includes statements like the following:

“We believe in running internal and guest networks over the same physical network.”

“We believe in running the external communications over the 1G on-board configuration interface, while the rest of the traffic runs on 10G.”

“We like to keep things super simple and run everything on a single interface.”

  1. Deploying Logical Over Physical

Physical networks consist of groups of appliances that are connected using protocols. Logical networks are constructed out of different types of traffic and are completely agnostic to physical networks, but they still need to run on them.

For example, let’s assume that data center traffic can be segmented into three types: red, green and blue. Let’s also assume that according to the network admin’s philosophy, red is 1G, routed externally, and green and blue are both 10G, isolated and non-routable. It is important to ensure that each node is linked to each of the three different logical networks on certain physical interfaces. We can only connect the logical layer when the physical one is connected. This can be done by separating the types of traffic from the physical source (the node), then allocating each logical type of traffic to a physical network. In the end, each of the networks (red, green and blue) is connected to the related physical interface.
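
To make the red/green/blue example concrete, here is a minimal sketch of how a deployment tool might map each logical network onto a node’s physical interfaces according to the declared philosophy. The interface names, capability model and philosophy structure are hypothetical, not any specific product’s schema.

```python
# Hypothetical "network philosophy" declared once by the admin.
PHILOSOPHY = {
    "red":   {"speed_gbps": 1,  "routable": True},   # external / management traffic
    "green": {"speed_gbps": 10, "routable": False},  # storage traffic
    "blue":  {"speed_gbps": 10, "routable": False},  # guest / VM traffic
}

def map_logical_to_physical(node_interfaces):
    """Assign each logical network to a physical NIC that satisfies its needs."""
    mapping, used = {}, set()
    for logical, need in PHILOSOPHY.items():
        candidates = [
            nic for nic, caps in node_interfaces.items()
            if caps["speed_gbps"] >= need["speed_gbps"]
            and caps["routable"] == need["routable"]
        ]
        if not candidates:
            raise RuntimeError(f"node cannot satisfy logical network {logical!r}")
        free = [nic for nic in candidates if nic not in used]
        chosen = (free or candidates)[0]   # prefer a NIC not already carrying traffic
        mapping[logical] = chosen
        used.add(chosen)
    return mapping

# One node's physical inventory, as discovered by the deployment tool (made up).
node = {
    "eno1":   {"speed_gbps": 1,  "routable": True},
    "ens2f0": {"speed_gbps": 10, "routable": False},
    "ens2f1": {"speed_gbps": 10, "routable": False},
}
print(map_logical_to_physical(node))  # {'red': 'eno1', 'green': 'ens2f0', 'blue': 'ens2f1'}
```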

  2. Automatic and Scalable Deployment

In comparison to custom deployments, which tend to involve cumbersome processes completed mainly by integrators, a hyperconverged smart solution needs to deploy an environment with hundreds of nodes in a matter of minutes. To achieve this, the deployment must be automatic, easy and bulletproof. Additionally, deployment techniques should not require user intervention per node (users should not have to manually configure the network, or analyze how each server is physically connected to the network). Smart hyperconverged solutions need to automatically discover and analyze an underlying network’s infrastructure.

Automatic network deployment also requires an ‘infection’ mode, where several high-availability network seeders infect all of the servers that connect with them, and those servers, in turn, immediately infect their own networks. Once all of the nodes are infected, the hyperconverged solution has access to them and can retrieve and analyze information accordingly. After the seeder absorbs the network philosophy from the infected servers, the current state of the physical network is analyzed. Once the scale goes beyond the capacity of normal broadcast domains, the cluster should cross over broadcast domains and start deploying over L3 and IP networks.
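
A rough sketch of the infection idea: a few seed nodes discover their directly connected neighbors, and each newly reached node keeps propagating until the whole cluster is known. The in-memory neighbor map stands in for whatever discovery mechanism (LLDP, broadcast, and so on) a real solution would use.

```python
from collections import deque

# Hypothetical adjacency learned from the wire (stand-in for LLDP/broadcast discovery).
NEIGHBORS = {
    "seed-1": ["node-a", "node-b"],
    "seed-2": ["node-c"],
    "node-a": ["node-d"],
    "node-b": [],
    "node-c": ["node-d", "node-e"],
    "node-d": [],
    "node-e": [],
}

def infect(seeders):
    """Breadth-first 'infection': every reached node infects its own neighbors."""
    infected, queue = set(seeders), deque(seeders)
    while queue:
        node = queue.popleft()
        for peer in NEIGHBORS.get(node, []):
            if peer not in infected:
                infected.add(peer)
                queue.append(peer)   # the peer now infects its own neighbors
    return infected

print(sorted(infect(["seed-1", "seed-2"])))
```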

  3. Resilient Deployment

When deploying hundreds of nodes in a short period of time, the deployment process needs to adjust to faults and changes. Automatic deployment must assume that nodes may fail during installation, but cluster deployment should still continue. In addition to making the system resilient to errors, it is important to keep relevant services highly available and to auto-detect deployment issues and notify admins.

Returning to our example, let’s say that one of the servers is not connected to the red network, or that one of the servers has the red and green networks crossed. If not corrected in deployment, these errors must be passed to the admin for intervention without affecting the deployment of the rest of the cluster. It is important to note that this is an ongoing process. The system must be able to auto-tune itself according to physical changes and faults to maintain its reliability.
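
Continuing the example, here is a minimal sketch of that ongoing validation step: each node’s observed wiring is checked against the expected logical networks, and a node that is missing a network or has two networks landing on the same interface is flagged for the admin while the rest of the cluster proceeds. The data structures are illustrative only.

```python
EXPECTED = {"red", "green", "blue"}

def validate_node(name, observed):
    """Return a list of problems for one node; an empty list means it can proceed.

    observed maps logical network -> physical interface actually seen on the wire.
    """
    problems = []
    for missing in EXPECTED - observed.keys():
        problems.append(f"{name}: not connected to the {missing} network")
    seen = {}
    for net, nic in observed.items():
        if nic in seen and seen[nic] != net:
            problems.append(f"{name}: {seen[nic]} and {net} crossed on {nic}")
        seen[nic] = net
    return problems

cluster = {
    "node-1": {"red": "eno1", "green": "ens2f0", "blue": "ens2f1"},  # healthy
    "node-2": {"green": "ens2f0", "blue": "ens2f1"},                 # missing red
    "node-3": {"red": "eno1", "green": "ens2f1", "blue": "ens2f1"},  # crossed wiring
}
for node, wiring in cluster.items():
    issues = validate_node(node, wiring)
    print(node, "OK" if not issues else issues)   # flagged nodes go to the admin
```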

Final Note

To align with the data center leaders’ philosophy, a smart hyperconverged solution should enable the input of specific configuration preferences at the start of the process. Once the system goes into its “infection” mode, this specific philosophy can be embedded into the network.

By Ariel Maislos, CEO of Stratoscale

Ariel brings over twenty years of technology innovation and entrepreneurship to Stratoscale.

In 2006 Ariel founded Pudding Media, an early pioneer in speech recognition technology, and Anobit, the leading provider of SSD technology acquired by Apple (AAPL) in 2012. At Apple, he served as a Senior Director in charge of Flash Storage, until he left the company to found Stratoscale. Ariel is a graduate of the prestigious IDF training program Talpiot, and holds a BSc from the Hebrew University of Jerusalem in Physics, Mathematics and Computer Science (Cum Laude) and an MBA from Tel Aviv University. 

What the Dyn DDoS Attacks Taught Us About Cloud-Only EFSS

DDoS Attacks

October 21st, 2016 went into the annals of Internet history for the large-scale Distributed Denial of Service (DDoS) attacks that made popular Internet properties like Twitter, SoundCloud, Spotify and Box inaccessible to many users in the US.

The DDoS attack happened in three waves targeting DNS service provider Dyn, resulting in a total of about three hours of service outage. The attack was orchestrated using a botnet of connected devices including a large number of webcams sold by a single manufacturer, which simultaneously made tens of millions of DNS requests on Dyn’s servers. Given the impact and severity, Dyn was quick to release a statement that more fully explained the incident from their side.

DDoS attacks can be carried out in many ways and can either target individual properties, or services that support multiple Internet properties. DNS services are common targets because they are essential to the operation of cloud-based services.
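
As a small illustration of why a DNS outage alone can take a service down, and of one cheap client-side mitigation, the sketch below tries normal resolution first and falls back to a locally cached list of last-known-good addresses if the resolver fails. The hostname and cached IPs are placeholders.

```python
import socket

# Last-known-good addresses, refreshed whenever resolution succeeds (placeholders).
CACHED_IPS = {"files.example.com": ["203.0.113.10", "203.0.113.11"]}

def resolve_with_fallback(hostname):
    """Resolve via the normal DNS path; fall back to cached IPs if DNS is down."""
    try:
        infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
        ips = sorted({info[4][0] for info in infos})
        CACHED_IPS[hostname] = ips   # refresh the cache on success
        return ips, "dns"
    except socket.gaierror:
        cached = CACHED_IPS.get(hostname)
        if not cached:
            raise
        return cached, "cache"       # DNS is down; use last-known-good addresses

ips, source = resolve_with_fallback("files.example.com")
print(source, ips)
```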

Cyber Attacks are Getting Increasingly Sophisticated

There’s a growing trend of increasingly sophisticated DDoS attacks targeting governments, political organizations, financial institutions and businesses in general. Victims of high-profile breaches in recent years include Target, eBay, Home Depot, JPMorgan Chase, LinkedIn, FDIC and Ashley Madison, but these are only a few notable names.

Even as government and private organizations embrace cloud-based services, attacks such as the one on 10/21 should compel them to reevaluate “all in on the cloud” approaches to platforms, applications and data. While I am not advocating completely pulling back from the cloud and into on-premises systems, this is a situation that pleads for a diversified risk mitigation strategy.

Organizations need to have solutions in place that will not interrupt operations and kill productivity during situations like this. As we have always advocated, a hybrid solution can certainly mitigate risk and give organizations alternative ways to work in the event of attacks or outages.

The Polarity Problem

A major problem for many organizations is their polar philosophies around infrastructure, the thinking that everything has to be in one place or another – either in the cloud or on-premises. Here’s where hybrid approaches come into their own. What if your application ran on the public cloud, but failed over to an on-premises or private cloud instance in the event of a public-cloud outage? What if your content (data) could reside in the cloud, on-premises or in both places simultaneously, depending on how business-critical, voluminous or regulated it is?
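
Here is a minimal sketch of that failover idea: try the public-cloud endpoint first and, if it is unreachable, transparently retry against an on-premises instance. The URLs are placeholders, and a production setup would rely on health checks and DNS or load-balancer policies rather than inline retries, but the principle is the same.

```python
import urllib.request
import urllib.error

ENDPOINTS = [
    "https://files.cloud.example.com",   # public cloud (placeholder)
    "https://files.corp.internal",       # on-premises / private cloud (placeholder)
]

def fetch(path, timeout=3):
    """Try each endpoint in order and return the first successful response body."""
    last_error = None
    for base in ENDPOINTS:
        try:
            with urllib.request.urlopen(base + path, timeout=timeout) as resp:
                return base, resp.read()
        except (urllib.error.URLError, OSError) as exc:
            last_error = exc   # endpoint down; fail over to the next one
    raise RuntimeError(f"all endpoints unavailable: {last_error}")

# Example call (placeholder path, so it is left commented out here):
# base, body = fetch("/api/v1/files/quarterly-report.docx")
```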

Consider the Enterprise File Synchronization and Sharing (EFSS) solution space. Cloud-only providers like Box and Dropbox – that emerged as consumer services and subsequently moved into the business segment – arguably don’t account for the mission-critical use cases of governments and businesses, and their need for business continuity in the event of such outages.

Consider how your organization would be impacted if all its corporate information resided in the cloud, and a DDoS attack or other form of cyber attack (or even a natural calamity) brought the cloud infrastructure down for several hours. How would it affect employee productivity? What would the revenue impact be? How would your brand image be affected?

For most organizations, the impact of a cloud outage will be very significant. As such, exploring hybrid approaches becomes mission critical.

Hybrid is the Answer

MJM, a marketing and communications agency owned by WPP, initially used a cloud-only EFSS service for file sharing and collaboration but moved over to Egnyte a few years ago after realizing that what it really needed was a hybrid file sharing solution. Thankfully it did, because disaster struck in 2012 when Hurricane Sandy devastated the Northeast coastline of the United States. With no internet and with power going in and out, the employees at MJM were still able to work through the disaster and not lose any time or money.


When it comes to the enterprise, we have a steadfast philosophy that:

1) Enterprises need purpose-built solutions. From our inception, we’ve had a razor-sharp focus on serving the file sharing needs of organizations rather than consumers.

2) While we enthusiastically embraced the cloud, we’ve always been aware that our customers need safeguards. Our hybrid approach to file sharing allows customers to leverage the advantages of both cloud and on-premises infrastructures for agility, reliability and business continuity.

If your cloud provider suffers an outage, a hybrid solution can seamlessly fail over to your on-premises infrastructure and ensure that users, business processes and workflows remain unaffected.

It is best to assume that Internet outages are inevitable, and plan for continued access to essential files when your cloud infrastructure or Internet connectivity become unavailable. When the next outage occurs, will you be prepared?

By Kris Lahiri, VP Operations and Chief Security Officer

Kris is a co-founder of Egnyte. He is responsible for Egnyte’s security and compliance, as well as the core infrastructure, including storage and data center operations. Prior to Egnyte, Kris spent many years in the design and deployment of large-scale infrastructures for Fortune 100 customers of Valdero and KPMG Consulting.

Kris has a B.Tech in Engineering from the Indian Institute of Technology, Banaras, and an MS from the University of Cincinnati.

Mission Digital Transformation: Is Your Infrastructure Ready?

Mission Digital Transformation

By and large, most enterprises are developing or executing a digital strategy to transform their businesses. But what is digital transformation? In general, it’s the adoption of technology to deliver new products and experiences through digital channels, either to complement or, in some cases, replace physical interactions. Changing user expectations, new modes of engagement, and the need to improve speed and responsiveness are the main factors driving companies to update outdated processes and develop new applications. For the first time, even large enterprises are moving the focus on technology from the back office to core elements of their brands in order to compete and keep pace with the market.

Who’s Ready to Fully Embrace Digital?

Keeping pace with the evolving digital marketplace requires not only increased innovation, but also updated systems, tools, and teams. In order to deliver on the promise of digital transformation, organizations must also modernize their infrastructure to support the increased speed, scale, and change that comes with it.

A recent survey on digital transformation readiness by SignalFx uncovered that companies of all sizes are investing in at least the fundamental stages of readiness: 79% are implementing or have already implemented a plan to optimize infrastructure to enable digital transformation.

And while 95% of IT Ops and DevOps respondents indicated that their individual roles impact the success of their company’s digital transformation initiatives, the strategy for modernizing infrastructure ahead of digital transformation isn’t exclusively owned by IT or dev management; it is overseen by C-suite executives (CEO, CIO, CTO) half the time, indicating that operational preparedness is understood as essential to success at the highest levels of the business.

The infographic below outlines additional key findings.


The Intelligent Industrial Revolution

AI Revolution

Prefatory Note: Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years. Some 1,500 AI startups have cropped up. This explosive growth has fueled demand for GTCs all over the world. So far, we’ve held events in Beijing, Taipei, Amsterdam, Tokyo, Seoul and Melbourne. Washington is set for this week and Mumbai next month. I kicked off four of the GTCs. Here’s a summary of what I talked about, what I learned and what I see in the near future as AI, the next wave in computing, revolutionizes one industry after another.

A New Era of Computing


Intelligent machines powered by AI computers that can learn, reason and interact with people are no longer science fiction. Today, a self-driving car powered by AI can meander through a country road at night and find its way. An AI-powered robot can learn motor skills through trial and error. This is truly an extraordinary time. Of my three decades in the computer industry, none has held more potential, or been more fun. The era of AI has begun.

Our industry drives large-scale industrial and societal change. As computing evolves, new companies form, new products are built, our lives change. Looking back at the past couple of waves of computing, each was underpinned by a revolutionary computing model, a new architecture that expanded both the capabilities and reach of computing.

In 1995, the PC-Internet era was sparked by the convergence of low-cost microprocessors (CPUs), a standard operating system (Windows 95), and a new portal to a world of information (Yahoo!). The PC-Internet era brought the power of computing to about a billion people and realized Microsoft’s vision to put “a computer on every desk and in every home.” A decade later, the iPhone put “an Internet communications” device in our pockets. Coupled with the launch of Amazon’s AWS, the Mobile-Cloud era was born. A world of apps entered our daily lives and some 3 billion people enjoyed the freedom that mobile computing afforded.

Today, we stand at the beginning of the next era, the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognize patterns from massive amounts of data — has proven to be “unreasonably” effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionize every industry.

GPU Deep Learning “Big Bang”

Why now? As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just several days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition, beating all the human expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.
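
The code below is not AlexNet, but a minimal PyTorch sketch of the same pattern: a small convolutional network trained on batches of images, with all of the work moved onto a GPU when one is available. The data is random and purely illustrative.

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy convolutional classifier (a distant relative of AlexNet's structure).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):                                   # stand-in training loop
    images = torch.randn(64, 3, 32, 32, device=device)    # random "images"
    labels = torch.randint(0, 10, (64,), device=device)
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                                        # gradients computed on the GPU
    optimizer.step()

print(device, float(loss))
```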

The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

An End-to-End Platform for a New Computing Model

As a new computing model, GPU deep learning is changing how software is developed and how it runs. In the past, software engineers crafted programs and meticulously coded algorithms. Now, algorithms learn from tons of real-world examples — software writes itself. Programming is about coding instruction. Deep learning is about creating and training neural networks. The network can then be deployed in a data center to infer, predict and classify from new data presented to it. Networks can also be deployed into intelligent devices like cameras, cars and robots to understand the world. With new experiences, new data is collected to further train and refine the neural network. Learnings from billions of devices make all the devices on the network more intelligent. Neural networks will reap the benefits of both the exponential advance of GPU processing and large network effects — that is, they will get smarter at a pace way faster than Moore’s Law.

Whereas the old computing model is “instruction processing” intensive, this new computing model requires massive “data processing.” To advance every aspect of AI, we’re building an end-to-end AI computing platform — one architecture that spans training, inference and the billions of intelligent devices that are coming our way.

Let’s start with training. Our new Pascal GPU is a $2 billion investment and the work of several thousand engineers over three years. It is the first GPU optimized for deep learning. Pascal can train networks that are 65 times larger or faster than the Kepler GPU that Alex Krizhevsky used in his paper. A single computer of eight Pascal GPUs connected by NVIDIA NVLink, the highest throughput interconnect ever created, can train a network faster than 250 traditional servers.

Soon, the tens of billions of internet queries made each day will require AI, which means that each query will require billions more math operations. The total load on cloud services will be enormous to ensure real-time responsiveness. For faster data center inference performance, we announced the Tesla P40 and P4 GPUs. P40 accelerates data center inference throughput by 40 times. P4 requires only 50 watts and is designed to accelerate 1U OCP servers, typical of hyperscale data centers. Software is a vital part of NVIDIA’s deep learning platform. For training, we have CUDA and cuDNN. For inferencing, we announced TensorRT, an optimizing inferencing engine. TensorRT improves performance without compromising accuracy by fusing operations within a layer and across layers, pruning low-contribution weights, reducing precision to FP16 or INT8, and many other techniques.
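
To make the reduced-precision point concrete, here is a toy NumPy illustration of symmetric INT8 weight quantization: weights are mapped to 8-bit integers with a single per-tensor scale and then dequantized, trading a small amount of accuracy for a representation one quarter the size. This illustrates the idea only and is not how TensorRT is implemented.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.05, size=4096).astype(np.float32)   # a layer's FP32 weights

# Symmetric per-tensor quantization to INT8.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize to see how much information the lower precision costs.
deq = q.astype(np.float32) * scale
max_err = np.abs(weights - deq).max()

print(f"scale={scale:.6f}  max abs error={max_err:.6f}  "
      f"size: {weights.nbytes} B -> {q.nbytes} B")
```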

Someday, billions of intelligent devices will take advantage of deep learning to perform seemingly intelligent tasks. Drones will autonomously navigate through a warehouse, find an item and pick it up. Portable medical instruments will use AI to diagnose blood samples onsite. Intelligent cameras will learn to alert us only to the circumstances that we care about. We created an energy-efficient AI supercomputer, Jetson TX1, for such intelligent IoT devices. A credit card-sized module, Jetson TX1 can reach 1 TeraFLOP FP16 performance using just 10 watts. It’s the same architecture as our most powerful GPUs and can run all the same software.

In short, we offer an end-to-end AI computing platform — from GPU to deep learning software and algorithms, from training systems to in-car AI computers, from cloud to data center to PC to robots. NVIDIA’s AI computing platform is everywhere.

AI Computing for Every Industry

Our end-to-end platform is the first step to ensuring that every industry can tap into AI. The global ecosystem for NVIDIA GPU deep learning has scaled out rapidly. Breakthrough results triggered a race to adopt AI for consumer internet services — search, recognition, recommendations, translation and more. Cloud service providers, from Alibaba and Amazon to IBM and Microsoft, make the NVIDIA GPU deep learning platform available to companies large and small. The world’s largest enterprise technology companies have configured servers based on NVIDIA GPUs. We were pleased to highlight strategic announcements along our GTC tour to address major industries:

AI Transportation: At $10 trillion, transportation is a massive industry that AI can transform. Autonomous vehicles can reduce accidents, improve the productivity of trucking and taxi services, and enable new mobility services. We announced that both Baidu and TomTom selected NVIDIA DRIVE PX 2 for self-driving cars. With each, we’re building an open “cloud-to-car” platform that includes an HD map, AI algorithms and an AI supercomputer.

Driving is a learned behavior that we do as second nature, yet one that is impossible to program a computer to perform. Autonomous driving requires every aspect of AI — perception of the surroundings, reasoning to determine the conditions of the environment, planning the best course of action, and continuously learning to improve our understanding of the vast and diverse world. The wide spectrum of autonomous driving requires an open, scalable architecture — from highway hands-free cruising, to autonomous drive-to-destination, to fully autonomous shuttles with no drivers.

NVIDIA DRIVE PX 2 is a scalable architecture that can span the entire range of AI for autonomous driving. At GTC, we announced DRIVE PX 2 AutoCruise designed for highway autonomous driving with continuous localization and mapping. We also released DriveWorks Alpha 1, our OS for self-driving cars that covers every aspect of autonomous driving — detection, localization, planning and action.

We bring all of our capabilities together into our own self-driving car, NVIDIA BB8.

NVIDIA is focused on innovation at the intersection of visual processing, AI and high performance computing — a unique combination at the heart of intelligent and autonomous machines. For the first time, we have AI algorithms that will make self-driving cars and autonomous robots possible. But they require a real-time, cost-effective computing platform.

At GTC, we introduced Xavier, the most ambitious single-chip computer we have ever undertaken — the world’s first AI supercomputer chip. Xavier is 7 billion transistors — more complex than the most advanced server-class CPU. Miraculously, Xavier has the equivalent horsepower of DRIVE PX 2 launched at CES earlier this year — 20 trillion operations per second of deep learning performance — at just 20 watts. As Forbes noted, we doubled down on self-driving cars with Xavier.

  • AI Enterprise: IBM, which sees a $2 trillion opportunity in cognitive computing, announced a new POWER8 and NVIDIA Tesla P100 server designed to bring AI to the enterprise. On the software side, SAP announced that it has received two of the first NVIDIA DGX-1 supercomputers and is actively building machine learning enterprise solutions for its 320,000 customers in 190 countries.
  • AI City: There will be 1 billion cameras in the world in 2020. Hikvision, the world leader in surveillance systems, is using AI to help make our cities safer. It uses DGX-1 for network training and has built a breakthrough server, called “Blade,” based on 16 Jetson TX1 processors. Blade requires 1/20 the space and 1/10 the power of the 21 CPU-based servers of equivalent performance.
  • AI Factory: There are 2 billion industrial robots worldwide. Japan is the epicenter of robotics innovation. At GTC, we announced that FANUC, the Japan-based industrial robotics giant, will build the factory of the future on the NVIDIA AI platform, from end to end. Its deep neural network will be trained with NVIDIA GPUs, GPU-powered FANUC Fog units will drive a group of robots and allow them to learn together, and each robot will have an embedded GPU to perform real-time AI. MIT Tech Review wrote about it in its story “Japanese Robotics Giant Gives Its Arms Some Brains.”
  • The Next Phase of Every Industry: GPU deep learning is inspiring a new wave of startups — 1,500+ around the world — in healthcare, fintech, automotive, consumer web applications and more. Drive.ai, which was recently licensed to test its vehicles on California roads, is tackling the challenge of self-driving cars by applying deep learning to the full driving stack. Preferred Networks, the Japan-based developer of the Chainer framework, is developing deep learning solutions for IoT. Benevolent.ai, based in London and one of the first recipients of DGX-1, is using deep learning for drug discovery to tackle diseases like Parkinson’s, Alzheimer’s and rare cancers. According to CB Insights, funding for AI startups hit over $1 billion in the second quarter, an all-time high.

The explosion of startups is yet another indicator of AI’s sweep across industries. As Fortune recently wrote, deep learning will “transform corporate America.”  

AI for Everyone

AI can solve problems that seemed well beyond our reach just a few years back. From real-world data, computers can learn to recognize patterns too complex, too massive or too subtle for hand-crafted software or even humans. With GPU deep learning, this computing model is now practical and can be applied to solve challenges in the world’s largest industries. Self-driving cars will transform the $10 trillion transportation industry. In healthcare, doctors will use AI to detect disease at the earliest possible moment, to understand the human genome to tackle cancer, or to learn from the massive volume of medical data and research to recommend the best treatments. And AI will usher in the 4th industrial revolution — after steam, mass production and automation — intelligent robotics will drive a new wave of productivity improvements and enable mass consumer customization. AI will touch everyone. The era of AI is here.

Syndicated article courtesy of Nvidia

By Jen-Hsun Huang

Jen-Hsun Huang founded NVIDIA in 1993 and has served since its inception as president, chief executive officer and a member of the board of directors.

NVIDIA invented the GPU in 1999 and, from its roots as a PC graphics company, has gone on to become the world leader in AI computing.

Data Sharing: A Matter of Transparency and Control

Janrain’s Consumer Identity Survey Shows 93% are Concerned How Brands Use/Share Their Online Activity

It comes as no surprise that people suffer from anxiety when sharing their personal information, even with big brands and names in the social media and eCommerce field. What does come as a surprise is the sheer number of netizens who share these feelings.

A recent research report put out by Marketwired found that more than 93 percent of online users are concerned about how their info is used online. (Below is a colorful infographic created by the group at Janrain.)

So what are some of the reasons behind this hesitation?

