Category Archives: Technology

Growth Hacking Your Startup Into The Cloud

SaaS Growth Hacking

Growth hacking could be just the kick your business’s marketing strategy needs, but knowing that the cheap and prolific tools and tactics work and knowing how to make them work for you are two different things. We take a look at who to get advice from, which brand strategies to replicate, and where to find the how-tos.

A Few of the Experts

Though there’s no definitive list of growth hacking experts, a few have made a name for themselves in the field and continue to develop their art. Sean Ellis is notable not only for coining the term ‘growth hacking’ but also for the marketing triumphs of Dropbox, LogMeIn, and Eventbrite (to name a few). Check out Sean’s Twitter feed for some valuable advice. Neil Patel, co-founder of CrazyEgg and KISSmetrics, is another digital marketer with an excellent reputation and a list of top-tier clients he’s helped to grow. Considered a top influencer on the web by the Wall Street Journal, Neil provides tons of quality growth-hacking content on advertising, SEO, content marketing and much more. Though the range of experts worth tapping extends far beyond those mentioned here, Brian Dean rounds us off with specialist SEO expertise that helps organizations grow their traffic. Brian keeps things simple with a few key strategies and detailed step-by-step tutorials for decisive success.

Influential Brands

But don’t just take the experts’ word for it; it’s always a good idea to look at growth hacking success stories to see what might best fit your own business. Hotmail made excellent use of simple email taglines that invited new users through mail sent out by existing users, and both Gmail and Pinterest exploited the public’s love of exclusivity to create a buzz through invite-only access. Twitter’s automatic suggestions of new users to follow have encouraged broader connections and sustained use, a version of upselling that, when correctly employed, is extremely fruitful. A few brands, including PayPal and Dropbox, have made good use of the fact that nobody can say no to free stuff, with incentive schemes that encourage existing users to refer friends to their services, a growth hack that not only expands the customer base but improves the loyalty of current users.

The How-Tos

For some tips, tricks and detailed growth hacking tutorials, consider the following:

Growth Tribe

Coming to you from Amsterdam, ‘Europe’s 1st Growth Hacking Academy’ offers a wealth of information helping individuals and companies build and use top growth hacking skills.

GrowthRocks

26 online courses to help you make the most out of the available growth hacking tools.

Udemy

With a sizeable range of growth hacking courses available, this online learning platform lets you pick and choose the skills you’d like to master.

Traction

‘How Any Startup Can Achieve Explosive Customer Growth’ – by Gabriel Weinberg and Justin Mares, a practical and tactical must-read.

100 Days of Growth

Sujan Patel and Rob Wormley’s ‘Proven Ways to Grow Your Business Fast.’ Practical advice in an actionable set of guidelines and strategies with examples of successful devices.

Just as growth hacking provides budget-friendly and efficient marketing tools to new and growing businesses, the growth hacking community is eager to share and build their skills inexpensively and abundantly. For those with the will, the resources are waiting.

By Jennifer Klostermann

Cyber Security Tips For Digital Collaboration

Cyber Security Tips

October is National Cyber Security Awareness Month – a joint effort by the Department of Homeland Security and private industry to ensure that citizens and businesses alike have the resources they need to use the Internet safely and securely. Today’s cyber criminals are ingenious and constantly probing for vulnerabilities, and when breaches occur they can put the whole company at risk. Don’t give them the opportunity!

One of the biggest security challenges companies face is that the way we work together has changed dramatically – a transformation that is still ongoing. The term “workplace” is becoming an anachronism as people find new ways to collaborate digitally, anywhere, at any time. Sensitive information needs to be shared among dispersed teams that may include co-workers, partners, customers and other stakeholders. Some of these individuals are vetted and trusted, others…not so much.

Since most security breaches start with human error, now is a fitting time to share some reminders for employees and business users. Think of these as your first line of defense when collaborating in an unsafe world.

Don’t Intermingle Work and Personal Files

Always keep business and personal files separate, otherwise you’re asking for trouble. (A certain presidential candidate learned this the hard way!) For cloud apps, use separate accounts. If work and personal files must be on the same device, store them as far apart as possible, using different directory paths.

Use Strong Passwords and Keep Them Safe

According to Verizon’s 2016 Data Breach Investigations Report, 63% of confirmed data breaches involved leveraging weak, default or stolen passwords. Employees, contractors and everyone else in your business ecosystem should be required to use unique credentials with strong passwords, rather than the name of their pet goldfish over and over. Even if a password is exposed just once, the potential consequences are enough to make a security manager cringe. Remind people that the infamous Target breach began when a hacker stole a heating contractor’s credentials, while at Home Depot, someone used a vendor’s username and password to steal credit card info for more than 50 million people.
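
In practice, “strong” means long and random rather than memorable. As a minimal illustrative sketch (not any particular vendor’s tooling), Python’s standard secrets module can generate such passwords:

import secrets
import string

def generate_password(length=16):
    """Build a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # one unique password per credential, never reused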

Verify Email Addresses Are Correct

According to a Ponemon Institute survey of over 1,000 IT professionals, 63% of respondents have accidentally sent files to the wrong recipients – people who clearly were not authorized to see them. Here’s a simple suggestion: if an employee needs to send an email to someone for the first time, have the intended recipient send an initial email so the employee can respond to it and use that address thereafter. This eliminates the chance they’ll get the address wrong – misspell a company name, forget a dash (or add one), use “.com” instead of “.org”, etc. – and send a file goodness knows where.

Don’t Send Sensitive Files using a Consumer-Grade Service

When employees need to share a file that’s too large for email, it’s tempting to send it through Dropbox, Box or some other consumer-grade file sharing service – or simply park it there for convenience. While many of these consumer-grade services have improved their security measures in recent years, they lack the file-level security and controls necessary for protecting sensitive data. For example, a file may be intended for information only, but people end up saving it, renaming it, forwarding it to others, pasting sections into a competitor’s sales campaign or misusing it in other ways that the sender never intended.

Have Remote Erase Capabilities, or an Effective Alternative

People are always losing their devices – at the airport, in the back of a taxi, at a restaurant, etc. If a device is used to store sensitive data, it also needs a remote wipe feature to be able to erase that data in the event the device is lost or stolen. (NASA learned this lesson the hard way.) Another approach that’s much more flexible is to use information rights management (IRM) software that can delete sensitive files instantly, on any device.

Don’t Share Your Devices with Family and Friends

With the holidays approaching, many people will be receiving new devices (laptops, phones, etc.) as gifts, and family and friends will be pleading for a chance to use them. According to a survey by Kaspersky Lab, one third of respondents reported sharing their personal devices, and of those, 32% took no precautions to protect their information. Why tempt people? In addition, some family members probably have minimal awareness or understanding of today’s cyber threats, and how cunning the perpetrators can be.

Stay Safe Online – and Collaborate with Confidence

Since most security breaches start with human error, educating your staff is an obvious way to reduce the risk. But we also have to remember that training only goes so far – whenever human beings are involved, there’s always the chance of risky behaviors and silly mistakes. And if someone takes advantage of a security lapse to sneak onto your network and steal sensitive data, the damage may not be apparent for weeks or months.

Thus a company has to back up its first line of defense with other measures to keep its information safe. Consider a solution that embeds encryption and user privileges directly into a file, including who is authorized to access it and what operations they can perform with it. These permissions then follow the file wherever it goes, on any device it lands on. If sensitive data falls into the wrong hands, access can be immediately revoked. Companies get control over their files that’s not available with email or traditional file sharing. As business becomes increasingly powered by digital collaboration, it’s the way to keep sensitive information secure while using it to full advantage.
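
A simplified sketch of the idea, assuming an envelope-encryption design (the key_store service, file IDs and helper names here are hypothetical; the example uses the Python cryptography library):

from cryptography.fernet import Fernet

key_store = {}  # hypothetical server-side key service: file_id -> key

def protect(file_id, data):
    key = Fernet.generate_key()
    key_store[file_id] = key           # the key stays with the service, not the file
    return Fernet(key).encrypt(data)   # only ciphertext travels with the file

def open_file(file_id, ciphertext):
    key = key_store.get(file_id)
    if key is None:
        raise PermissionError("access revoked")
    return Fernet(key).decrypt(ciphertext)

def revoke(file_id):
    key_store.pop(file_id, None)  # deleting the key makes every copy unreadable

token = protect("q3-forecast.xlsx", b"sensitive numbers")
print(open_file("q3-forecast.xlsx", token))  # b'sensitive numbers'
revoke("q3-forecast.xlsx")                   # open_file now raises PermissionError

Because the decryption key never travels with the file, deleting it from the key service renders every outstanding copy unreadable, which is what makes instant revocation possible.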

By Daren Glenister

The Managed DNS Industry

DNS Industry 

The SaaS industry has been going through a major shift in just the last few years, which is redefining how platforms are designed. System and network administrators are demanding all-in-one platforms for a variety of management tasks. The managed DNS industry, for one, has been radically altered by this shift. Both new and existing DNS providers are rolling out integrated platforms, which combine the analytical power of monitoring with advanced query management.

The Internet has been abuzz as skeptical sysadmins question how these integrated platforms can fix issues their predecessors couldn’t. And can you replace your current toolset with an all-in-one platform?

The principal idea behind these platforms is synergy, a mutually dependent relationship between monitoring and management. This technology is made possible by the cloud, which allows information to be shared between the two services in real time. The cloud foundations for all-in-one platforms have also proven to make these subscription services noticeably cheaper.

So what is this synergistic secret sauce that makes these all-in-one services so revolutionary? In the case of DNS management, network monitoring is integral to efficient query routing. What’s the point of making changes to your network configurations if you can’t monitor and analyze the results? This can also be applied the other way around: what’s the point in monitoring your network if you can’t fix the problems that you identify?

Traffic management should never feel like a shot in the dark; rather, it should be informed and calculated to provide the best result for each individual end-user. The new integrated platform push is forcing admins to rethink how they manage their organizations’ traffic.

The problem is, too many admins think these tools are only used for anticipating DDoS or resolving attacks and outages. To be frank, outages are rare, but they can be devastating. DNS management has shifted from outage resolution to performance optimization. Next-generation managed DNS solutions will take a look at your entire network and implement changes to improve the experience for all of your end-users—individually optimized for each user’s location, browser, IP connectivity, and more.

Admins aren’t wrong to want query management for security reasons: DNS operates at a critical ingress point for incoming traffic, so you can filter and root out malicious traffic before it even reaches your site. But what most admins seem to forget is that these same management tools can be used to eliminate latency and improve network performance.

End-users are demanding faster load times, especially from mobile sites. DNS resolution times are only one portion of load time, but 50% of page load time is taken up by network latency overhead. Admins have to leverage every layer of the stack for optimal performance, or get left behind.

All-in-one management solutions are proving to be invaluable during high traffic periods. You can analyze traffic loads and redirect segments of traffic so that it’s balanced across many different resources or locations. You can also use this technology to minimize resolution times by ensuring queries are answered at the nearest possible server, or at the best-performing one (in case the closest is under strain or underperforming).
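
The routing logic behind that kind of balancing can be sketched in a few lines; the endpoints and latency figures below are invented for illustration:

servers = [
    {"name": "us-east", "latency_ms": 23, "healthy": True},
    {"name": "us-west", "latency_ms": 48, "healthy": True},
    {"name": "eu-west", "latency_ms": 19, "healthy": False},  # under strain
]

def pick_endpoint(pool):
    """Answer a query from the healthy endpoint with the lowest latency."""
    candidates = [s for s in pool if s["healthy"]]
    if not candidates:
        raise RuntimeError("no healthy endpoints")
    return min(candidates, key=lambda s: s["latency_ms"])

print(pick_endpoint(servers)["name"])  # "us-east": eu-west is faster but unhealthy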

These platforms are also incorporating Artificial Intelligence (AI) to analyze areas causing performance degradation and then make changes to alleviate them before they can cause appreciable effects for end-users. Some AIs are paired with automated services that can recognize performance trends and patterns, then use the analytics to anticipate and even predict potential attacks or fluctuations.
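
A toy version of that trend detection, flagging a latency sample that sits far above its recent baseline (the threshold and measurements are invented; real systems use far richer models):

import statistics

def is_degraded(history, sample, z_threshold=3.0):
    """Flag a sample that sits well above the recent baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1e-9  # avoid dividing by zero
    return (sample - mean) / stdev > z_threshold

recent_latency_ms = [21, 23, 22, 20, 24, 22, 23, 21]
print(is_degraded(recent_latency_ms, 25))  # False: normal jitter
print(is_degraded(recent_latency_ms, 60))  # True: reroute before users notice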

These all-in-one suites have created a new breed of traffic management, called Internet Traffic Optimization Services (ITOS). This new industry seeks to redefine the way admins manage their networks, by harnessing the power of analytics to make informed proactive changes. DNS is a user’s first and most impactful step when accessing a website, which is why ITOS places a strong emphasis on informed DNS management.

In the end, it all comes down to the cold hard stats. To get the most ROI out of a service, you need to look for reliability, cost efficiency, and proven performance improvements. All-in-one and ITOS solutions may still be in their formative years, but they provide admins with all the tools they need in one platform. Now admins can see the performance improvements of their configurations in real time, while paying less than they would for non-integrated services.

By Steven Job

Microsoft Releases Beta of Microsoft Cognitive Toolkit For Deep Learning Advances

Microsoft Cognitive Toolkit

Microsoft has released an updated version of Microsoft Cognitive Toolkit, a system for deep learning that is used to speed advances in areas such as speech and image recognition and search relevance on CPUs and NVIDIA® GPUs.

The toolkit, previously known as CNTK, was initially developed by computer scientists at Microsoft who wanted a tool to do their own research more quickly and effectively. It quickly moved beyond speech and morphed into an offering that customers, including a leading international appliance maker and Microsoft’s flagship product groups, depend on for a wide variety of deep learning tasks.

“We’ve taken it from a research tool to something that works in a production setting,” said Frank Seide, a principal researcher at Microsoft Artificial Intelligence and Research and a key architect of Microsoft Cognitive Toolkit.

The latest version of the toolkit, which is available on GitHub via an open source license, includes new functionality that lets developers use the Python or C++ programming languages when working with the toolkit. With the new version, researchers can also do a type of artificial intelligence work called reinforcement learning.
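
A minimal sketch of what working with the toolkit from Python looks like, assuming the CNTK 2.x API (cntk.input_variable, cntk.layers.Dense); the toy batch is invented:

import numpy as np
import cntk as C  # Microsoft Cognitive Toolkit (CNTK 2.x Python API)

# A tiny feed-forward model: 2 inputs -> 1 sigmoid output.
x = C.input_variable(2)
model = C.layers.Dense(1, activation=C.sigmoid)(x)

# Evaluate the untrained network on a toy batch (weights are random).
batch = np.array([[0.5, 0.8], [0.1, 0.2]], dtype=np.float32)
print(model.eval({x: batch}))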

Finally, the toolkit is able to deliver better performance than previous versions. It’s also faster than other toolkits, especially when working on big datasets across multiple machines. That kind of large-scale deployment is necessary to do the type of deep learning across multiple GPUs that is needed to develop consumer products and professional offerings…

Read Full Article: Microsoft

A President’s Trove of Data

Then vs Now

According to popular opinion, today’s information age affords teens access to more information than some world leaders had 20 years ago. C+R Research has put this hypothesis through its paces, comparing access to information across areas such as private data, classified information, genetics, and public opinion, and finds that in many ways the average smartphone user does, in fact, have access to a lot more information than those with the highest clearance had two decades ago. However, the accuracy and quality of the data available don’t necessarily compare.

Critical Information vs. the Non-Essentials

C+R Research finds that just about any 13-year-old with a smartphone in 2016 would beat President Bill Clinton’s 1996 intelligence and access in areas such as traffic data, music, trivia, opinion, and even genetics. But then, the president of the United States might not have time to listen to the 30 million songs immediately accessible via Spotify, nor would Air Force One likely be constrained by the same traffic limitations as the rest of us. Of course, political campaign teams of 20 years ago would have drooled over the polling possibilities that Twitter offers today.

On the other hand, President Clinton would have had better access to classified information, data from satellites, and research journals, as well as access to private data – though there are rules governing this, some very important people tend to be ‘incorporated’ into such regulations. Happily, or unhappily, depending on how much privacy you desire, tracking of family members via the Secret Service in 1996 was about as proficient as the smartphone apps we use today to monitor friends and family.

In the end, the 13-year-old wins 7 to 5 in the most general terms, but it’s important to recognize that the broad scope of information available today doesn’t necessarily point to accurate or significant information, two traits President Clinton could be sure of.

By Jennifer Klostermann

Blockchain and the IoT

IoT Blockchain

Blockchain, also known as Distributed Ledger Technology (DLT), is the innovative technology behind Bitcoin. The impact of Bitcoin has been tremendous and, as with any revolutionary technology, it was initially treated with awe and apprehension. Since its open source release back in 2009, Bitcoin has become a transformative force in the global payments system, establishing itself without the aid or support of the traditional financial infrastructure. While initial usage saw huge success in black markets, Bitcoin defied the odds, and the blockchain technology behind it spawned other cryptocurrencies, exchanges, commercial ventures, alliances, consortiums, investments, and uptake by governments, merchants, and financial services worldwide.
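
The core data structure is simple enough to sketch: each block commits to the hash of its predecessor, so altering any historical record invalidates every block after it. A minimal single-node illustration in Python (real ledgers add distribution, consensus and signatures):

import hashlib
import json

def make_block(data, prev_hash):
    """A block commits to its contents and to the previous block's hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return {"data": data, "prev_hash": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

genesis = make_block("genesis", "0" * 64)
b1 = make_block("Alice pays Bob 5", genesis["hash"])
b2 = make_block("Bob pays Carol 2", b1["hash"])

# Tampering with history breaks the chain: b2 no longer points at a valid hash.
b1["data"] = "Alice pays Bob 500"
rehashed = make_block(b1["data"], b1["prev_hash"])["hash"]
print(rehashed == b2["prev_hash"])  # False -> the alteration is detectable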

On August 12, the World Economic Forum (WEF) published a report on the future of the financial infrastructure, and in particular on the transformative role that blockchain technology is set to play. Notably, it analyzes the technology’s impact on the financial services industry and how it can provide more transparency and security. Potential use cases are examined, including insurance, deposits and lending, capital raising, investment management, and market provisioning. The report also looks at the current challenges to a widespread implementation of blockchain, many of which will require international legal frameworks, harmonized regulatory environments, and global standardization efforts.

DLT is already having a serious impact on the financial services industry. The WEF report states that 80% of banks will initiate a DLT project by next year, and more than $1.4 billion has already been invested in the technology in the past three years. More than that, governments and law firms are seriously staking their claim in advancing the technology. Law firm Steptoe & Johnson LLP recently announced the expansion of its Blockchain Team into a multidisciplinary practice involving FinTech, financial services, regulatory, and law enforcement knowledge. The firm is also one of the founders of the Blockchain Alliance, a coalition of blockchain companies and law enforcement and regulatory agencies, alongside the U.S. Chamber of Digital Commerce and Coin Center. This expansion is an endorsement of the potential of DLT, within and eventually beyond financial services.

(Image: the 2016 blockchain landscape. Source: Startupmanagement.org)

The possible applications of blockchain are already being explored in numerous new sectors: energy, transportation, intellectual property, regulation and compliance, international trade, law enforcement, and government affairs, among many others. Ethereum is one blockchain endeavor that features smart contract functionality. The distributed computing platform provides a decentralized virtual machine to execute peer-to-peer contracts using the Ether cryptocurrency. The Ether Hack Camp is launching a four-week hackathon in November 2016 for DLT using Ether. Currently, the Camp is asking developers to propose ideas to the public; these will be voted on by registered fans, and those selected will take part in the hackathon. The ideas can already be seen online and are vast and varied, ranging from academic publishing without journals and music licensing reform to decentralized ISPs, voting on the blockchain, alternative dispute resolution, and a rural land register. The idea winning first place in November will collect $50,000 USD.

IBM is one of the most dynamic forerunners currently pushing DLT for the IoT. The firm just announced it is investing $200 million in blockchain technology to drive forward its Watson IoT efforts. It is opening a new office in Germany, which will serve as headquarters for new blockchain initiatives. The investment is part of the $3 billion that IBM pledged to develop Watson’s cognitive computing for the IoT. The goal of the new investment is to enable companies to share IoT data in a private blockchain. A commercial implementation is already underway with Finnish company Kouvola Innovation, which wants to integrate its capabilities into the IBM Blockchain and link devices for tracking, monitoring, and reporting on shipping container status and location, optimizing packing, and transfer of shipments.

IBM is working hard to align its IoT, AI and blockchain technologies through Watson. The new headquarters in Germany will be home to Cognitive IoT Collaboratories for researchers, developers and engineers.

Many of IBM’s current projects are developed leveraging the fabric of the open source Hyperledger Project, a consortium founded by the Linux Foundation in which IBM is a significant contributor, alongside Cisco and Intel. IBM pushed its involvement even further with the June launch of its New York-based Bluemix Garage. The idea is to allow developers and researchers the opportunity to use IBM Cloud APIs and blockchain technologies to drive cognitive, IoT, unstructured data, and social media technology innovation. Just one month after the launch, IBM announced a cloud service for companies running blockchain technology. The cloud service is underpinned by IBM’s LinuxONE technology, which is specifically designed to meet the security requirements of critical sectors, such as financial, healthcare, and government.

The potential for DLT is certainly broad and rather long-term, but the engagement by the financial services industry is a testament to its promise. While FinTech remains the big focus for blockchain technologies, its success will drive the use of DLT in other areas. The promise of blockchain is to deliver accountability and transparency, although this could be disrupted significantly if announcements such as Accenture’s ‘editable’ blockchain become a reality. While banks may welcome the feature, it would be a serious blow not only to the integrity but also to the security of blockchain technology.

By Michela Menting

Great Cloud Platforms Need to Win the Hearts and Minds of Developers First

Great Cloud Platforms 

Adoption of cloud computing services is growing exponentially all around the world. Companies are realizing that much of the hard, expensive work they used to do internally can now be outsourced to cloud providers, allowing them to focus on what they do best. That’s why tech research firm Gartner projects that over the next five years, the shift to the cloud will amount to a US$1-trillion market.

Everything from running payrolls to marketing, logistics, data analysis and much, much more is moving to the cloud, and one of the most successful uses of the cloud is the concept of Platform-as-a-Service (PaaS, as it is known). PaaS enables customers to develop, run and manage their own applications without having to invest heavily in the infrastructure required to develop and launch a web application.

For a platform to succeed, it must win the hearts and minds of the web developers who choose what to build on. SAP, the world’s largest enterprise cloud company, with over 320,000 customers and over 110 million cloud users in 190 countries, is using its extensive experience and knowledge of the business space to offer the SAP HANA Cloud Platform, a remarkable service for companies of all sizes. The platform is already being used extensively by developers creating apps for their customers or for their own organizations and employees.

The SAP HANA Cloud Platform enables developers to build business applications in the cloud quickly and easily.

Three features of this platform stand out:

  1. its ability to extend your cloud and on-premise applications to develop customized hybrid solutions,
  2. the awesome feature allowing you to integrate applications seamlessly and securely to synchronize data and processes across cloud, on-premise and third-party applications, as well as
  3. the core feature which allows you to build new enterprise-ready applications rapidly with an open standards platform that brings out the best in developers.

The Director of Group Software at the Danone Group, Ralf Steinbach, says that “with SAP HANA Cloud Platforms, we can quickly develop beautiful, user-friendly applications that are opening new opportunities to connect our customers directly to our back-end systems.”

Cloud services are a rapidly expanding market, and research indicates there are over 150 PaaS offerings to choose from. Too often companies simply choose the PaaS of a cloud-service provider that they’re already working with, without exploring the offerings in-depth and with a long-term focus.

According to John Rymer of Forrester Research, there are three types of developers who make use of PaaS offerings to build apps:

  1. Coders, who want the ability to do it all themselves,
  2. DevOps developers who want the ability to do some coding if they need to but can also plug into some level of abstraction, and
  3. RapidDevs who don’t want to code at all but just to configure a task to the capabilities of the platform.

For each of these types of developers, the SAP HANA Cloud Platform can deliver, thanks to a flexibility that requires fewer skills and comes at a lower cost. That flexibility extends to the choices customers are offered: a private, managed cloud; a public pay-as-you-go model; or even public cloud infrastructure-as-a-service or platform-as-a-service.

For a platform to survive and thrive, developers must regard it as the best choice for what they have to do on a daily basis: easily and quickly deploy applications that leverage a proven in-memory platform for next-generation applications and analytics, supported by a world-class technical team at every step of the way.

A great way to get started with SAP HANA Cloud Platform is with the user-based packages. Priced per user, they offer the flexibility to choose the package that best fits your needs. You can get started for as little as $25 / user / month, and scale as you go, adding more users or upgrading to add more resources when you need them.

For a limited time, you can get 30% off SAP HANA Cloud Platform user-based packages on the SAP Store by using the promo code HCP30.

Sponsored spotlight series by SAP

By Jeremy Daniel

The Intelligent Industrial Revolution

AI Revolution

Prefatory Note: Over the past six weeks, we took NVIDIA’s developer conference on a world tour. The GPU Technology Conference (GTC) was started in 2009 to foster a new approach to high performance computing using massively parallel processing GPUs. GTC has become the epicenter of GPU deep learning — the new computing model that sparked the big bang of modern AI. It’s no secret that AI is spreading like wildfire. The number of GPU deep learning developers has leapt 25 times in just two years. Some 1,500 AI startups have cropped up. This explosive growth has fueled demand for GTCs all over the world. So far, we’ve held events in Beijing, Taipei, Amsterdam, Tokyo, Seoul and Melbourne. Washington is set for this week and Mumbai next month. I kicked off four of the GTCs. Here’s a summary of what I talked about, what I learned and what I see in the near future as AI, the next wave in computing, revolutionizes one industry after another.

A New Era of Computing

Intelligent machines powered by AI computers that can learn, reason and interact with people are no longer science fiction. Today, a self-driving car powered by AI can meander through a country road at night and find its way. An AI-powered robot can learn motor skills through trial and error. This is truly an extraordinary time. In my three decades in the computer industry, none has held more potential, or been more fun. The era of AI has begun.

Our industry drives large-scale industrial and societal change. As computing evolves, new companies form, new products are built, our lives change. Looking back at the past couple of waves of computing, each was underpinned by a revolutionary computing model, a new architecture that expanded both the capabilities and reach of computing.

In 1995, the PC-Internet era was sparked by the convergence of low-cost microprocessors (CPUs), a standard operating system (Windows 95), and a new portal to a world of information (Yahoo!). The PC-Internet era brought the power of computing to about a billion people and realized Microsoft’s vision to put “a computer on every desk and in every home.” A decade later, the iPhone put “an Internet communications” device in our pockets. Coupled with the launch of Amazon’s AWS, the Mobile-Cloud era was born. A world of apps entered our daily lives and some 3 billion people enjoyed the freedom that mobile computing afforded.

Today, we stand at the beginning of the next era, the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognize patterns from massive amounts of data — has proven to be “unreasonably” effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionize every industry.

GPU Deep Learning “Big Bang”

Why now? As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just a few days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition, beating all the hand-crafted expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.

The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

An End-to-End Platform for a New Computing Model

As a new computing model, GPU deep learning is changing how software is developed and how it runs. In the past, software engineers crafted programs and meticulously coded algorithms. Now, algorithms learn from tons of real-world examples — software writes itself. Programming is about coding instruction. Deep learning is about creating and training neural networks. The network can then be deployed in a data center to infer, predict and classify from new data presented to it. Networks can also be deployed into intelligent devices like cameras, cars and robots to understand the world. With new experiences, new data is collected to further train and refine the neural network. Learnings from billions of devices make all the devices on the network more intelligent. Neural networks will reap the benefits of both the exponential advance of GPU processing and large network effects — that is, they will get smarter at a pace way faster than Moore’s Law.
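
The contrast with hand-coded instructions can be made concrete with a toy example: instead of writing the function y = 2x, a one-parameter “network” learns it from examples by gradient descent (the data and learning rate here are invented for illustration):

import numpy as np

rng = np.random.default_rng(0)
w = rng.normal()                      # randomly initialized weight
xs = np.array([1.0, 2.0, 3.0, 4.0])  # training examples
ys = 2.0 * xs                         # the pattern hidden in the data

for _ in range(100):                  # gradient descent on squared error
    grad = np.mean(2 * (w * xs - ys) * xs)
    w -= 0.05 * grad

print(round(float(w), 3))  # ~2.0: the "program" was learned from data, not written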

Whereas the old computing model is “instruction processing” intensive, this new computing model requires massive “data processing.” To advance every aspect of AI, we’re building an end-to-end AI computing platform — one architecture that spans training, inference and the billions of intelligent devices that are coming our way.

Let’s start with training. Our new Pascal GPU is a $2 billion investment and the work of several thousand engineers over three years. It is the first GPU optimized for deep learning. Pascal can train networks that are 65 times larger or faster than the Kepler GPU that Alex Krizhevsky used in his paper. A single computer of eight Pascal GPUs connected by NVIDIA NVLink, the highest throughput interconnect ever created, can train a network faster than 250 traditional servers.

Soon, the tens of billions of internet queries made each day will require AI, which means that each query will require billions more math operations. The total load on cloud services will be enormous to ensure real-time responsiveness. For faster data center inference performance, we announced the Tesla P40 and P4 GPUs. P40 accelerates data center inference throughput by 40 times. P4 requires only 50 watts and is designed to accelerate 1U OCP servers, typical of hyperscale data centers. Software is a vital part of NVIDIA’s deep learning platform. For training, we have CUDA and cuDNN. For inferencing, we announced TensorRT, an optimizing inferencing engine. TensorRT improves performance without compromising accuracy by fusing operations within a layer and across layers, pruning low-contribution weights, reducing precision to FP16 or INT8, and many other techniques.
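
The reduced-precision idea can be sketched in miniature: map FP32 values onto INT8 with a single scale factor and reconstruct approximations at inference time. This is a generic symmetric-quantization illustration with invented weights, not TensorRT’s actual implementation:

import numpy as np

weights = np.array([0.02, -1.30, 0.75, 0.001, -0.45], dtype=np.float32)

scale = np.abs(weights).max() / 127.0                      # one scale per tensor
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale

print(q)                                    # int8 values: 4x smaller than FP32
print(np.abs(weights - dequantized).max())  # small reconstruction error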

Someday, billions of intelligent devices will take advantage of deep learning to perform seemingly intelligent tasks. Drones will autonomously navigate through a warehouse, find an item and pick it up. Portable medical instruments will use AI to diagnose blood samples onsite. Intelligent cameras will learn to alert us only to the circumstances that we care about. We created an energy-efficient AI supercomputer, Jetson TX1, for such intelligent IoT devices. A credit card-sized module, Jetson TX1 can reach 1 TeraFLOP FP16 performance using just 10 watts. It’s the same architecture as our most powerful GPUs and can run all the same software.

In short, we offer an end-to-end AI computing platform — from GPU to deep learning software and algorithms, from training systems to in-car AI computers, from cloud to data center to PC to robots. NVIDIA’s AI computing platform is everywhere.

AI Computing for Every Industry

Our end-to-end platform is the first step to ensuring that every industry can tap into AI. The global ecosystem for NVIDIA GPU deep learning has scaled out rapidly. Breakthrough results triggered a race to adopt AI for consumer internet services — search, recognition, recommendations, translation and more. Cloud service providers, from Alibaba and Amazon to IBM and Microsoft, make the NVIDIA GPU deep learning platform available to companies large and small. The world’s largest enterprise technology companies have configured servers based on NVIDIA GPUs. We were pleased to highlight strategic announcements along our GTC tour to address major industries:

AI Transportation: At $10 trillion, transportation is a massive industry that AI can transform. Autonomous vehicles can reduce accidents, improve the productivity of trucking and taxi services, and enable new mobility services. We announced that both Baidu and TomTom selected NVIDIA DRIVE PX 2 for self-driving cars. With each, we’re building an open “cloud-to-car” platform that includes an HD map, AI algorithms and an AI supercomputer.

Driving is a learned behavior that we do as second nature, yet one that is impossible to program a computer to perform. Autonomous driving requires every aspect of AI — perception of the surroundings, reasoning to determine the conditions of the environment, planning the best course of action, and continuously learning to improve our understanding of the vast and diverse world. The wide spectrum of autonomous driving requires an open, scalable architecture — from highway hands-free cruising, to autonomous drive-to-destination, to fully autonomous shuttles with no drivers.

NVIDIA DRIVE PX 2 is a scalable architecture that can span the entire range of AI for autonomous driving. At GTC, we announced DRIVE PX 2 AutoCruise designed for highway autonomous driving with continuous localization and mapping. We also released DriveWorks Alpha 1, our OS for self-driving cars that covers every aspect of autonomous driving — detection, localization, planning and action.

We bring all of our capabilities together into our own self-driving car, NVIDIA BB8.

NVIDIA is focused on innovation at the intersection of visual processing, AI and high performance computing — a unique combination at the heart of intelligent and autonomous machines. For the first time, we have AI algorithms that will make self-driving cars and autonomous robots possible. But they require a real-time, cost-effective computing platform.

At GTC, we introduced Xavier, the most ambitious single-chip computer we have ever undertaken — the world’s first AI supercomputer chip. Xavier packs 7 billion transistors — more complex than the most advanced server-class CPU. Miraculously, Xavier has the equivalent horsepower of DRIVE PX 2, launched at CES earlier this year — 20 trillion operations per second of deep learning performance — at just 20 watts. As Forbes noted, we doubled down on self-driving cars with Xavier.

  • AI Enterprise: IBM, which sees a $2 trillion opportunity in cognitive computing, announced a new POWER8 and NVIDIA Tesla P100 server designed to bring AI to the enterprise. On the software side, SAP announced that it has received two of the first NVIDIA DGX-1 supercomputers and is actively building machine learning enterprise solutions for its 320,000 customers in 190 countries.
  • AI City: There will be 1 billion cameras in the world in 2020. Hikvision, the world leader in surveillance systems, is using AI to help make our cities safer. It uses DGX-1 for network training and has built a breakthrough server, called “Blade,” based on 16 Jetson TX1 processors. Blade requires 1/20 the space and 1/10 the power of the 21 CPU-based servers of equivalent performance.
  • AI Factory: There are 2 billion industrial robots worldwide. Japan is the epicenter of robotics innovation. At GTC, we announced that FANUC, the Japan-based industrial robotics giant, will build the factory of the future on the NVIDIA AI platform, from end to end. Its deep neural network will be trained with NVIDIA GPUs, GPU-powered FANUC Fog units will drive a group of robots and allow them to learn together, and each robot will have an embedded GPU to perform real-time AI. MIT Tech Review wrote about it in its story “Japanese Robotics Giant Gives Its Arms Some Brains.”
  • The Next Phase of Every Industry: GPU deep learning is inspiring a new wave of startups — 1,500+ around the world — in healthcare, fintech, automotive, consumer web applications and more. Drive.ai, which was recently licensed to test its vehicles on California roads, is tackling the challenge of self-driving cars by applying deep learning to the full driving stack. Preferred Networks, the Japan-based developer of the Chainer framework, is developing deep learning solutions for IoT. Benevolent.ai, based in London and one of the first recipients of DGX-1, is using deep learning for drug discovery to tackle diseases like Parkinson’s, Alzheimer’s and rare cancers. According to CB Insights, funding for AI startups hit over $1 billion in the second quarter, an all-time high.

The explosion of startups is yet another indicator of AI’s sweep across industries. As Fortune recently wrote, deep learning will “transform corporate America.”  

AI for Everyone

AI can solve problems that seemed well beyond our reach just a few years back. From real-world data, computers can learn to recognize patterns too complex, too massive or too subtle for hand-crafted software or even humans. With GPU deep learning, this computing model is now practical and can be applied to solve challenges in the world’s largest industries. Self-driving cars will transform the $10 trillion transportation industry. In healthcare, doctors will use AI to detect disease at the earliest possible moment, to understand the human genome to tackle cancer, or to learn from the massive volume of medical data and research to recommend the best treatments. And AI will usher in the 4th industrial revolution — after steam, mass production and automation — intelligent robotics will drive a new wave of productivity improvements and enable mass consumer customization. AI will touch everyone. The era of AI is here.

Syndicated article courtesy of Nvidia

By Jen-Hsun Huang

Jen-Hsun Huang founded NVIDIA in 1993 and has served since its inception as president, chief executive officer and a member of the board of directors.

NVIDIA invented the GPU in 1999 and, from its roots as a PC graphics company, has gone on to become the world leader in AI computing.
