Category Archives: Cloud Computing

Robotics, AI, FinTech, and IoT – Most Significant Emerging Technological Trends

NEW YORK, Sept. 13, 2016 /PRNewswire/ — Global X Funds, the New York-based provider of exchange-traded funds (ETFs), today launched the Global X FinTech Thematic ETF (Nasdaq: FINX), the Global X Robotics & Artificial Intelligence Thematic ETF (Nasdaq: BOTZ), and the Global X Internet of Things Thematic ETF (Nasdaq: SNSR). The three new ETFs join Global X’s Thematic suite, which now has fifteen funds and approximately $1 billion in assets under management as of September 6, 2016. Global X’s Thematic suite includes funds that have been available to investors since 2010.

The Global X Internet of Things Thematic ETF aims to offer exposure to companies that stand to potentially benefit from the broader adoption of the Internet of Things (IoT). This includes the development and manufacturing of semiconductors and sensors, integrated products and solutions, and applications serving smart grids, smart homes, connected cars, and the industrial Internet. The ecosystem of wirelessly connected devices and objects is expected to total over 50 billion by 2020, with an estimated economic impact of $3.9-$11.1 trillion by 2025, according to McKinsey.


“Robotics & AI, FinTech, and the Internet of Things are among the most significant emerging technological trends in the world, as they are set to disrupt a broad range of industries and change how we interact with ordinary things like banks, cars, and even refrigerators,” said Jay Jacobs, director of research at Global X. “Our aim with launching these funds is to provide investors with tools to efficiently gain exposure to the companies that are well-positioned to grow from these technological revolutions…”

Read Full Release: PR Newswire

New Bromium Labs Threat Report

2016 Threat Report

The semi-annual Bromium Labs Threat Report has just been released, providing an analysis of the cyber-attacks and threats that have struck enterprise security over the last six months. It found an eruption of ransomware usage as well as an increase in app, browser, and plug-in vulnerabilities, and notes that while Microsoft strengthens security, nefarious forces are changing tack and concentrating on ‘drive-by download attacks.’

Significant Conclusions

Though it’s clear that criminals are working harder than ever to get their hands on protected data, it’s not all bad news. The Bromium Labs Threat Report also notes that although the number of vulnerabilities is constantly rising, they aren’t all being exploited. Unfortunately, there have been several high-profile data breaches and ransomware attacks of late, leaving enterprise security in a somewhat precarious position. Commenting exclusively to CloudTweaks, Bromium EVP and Chief Security Architect Rahul Kashyap states, “We’re only halfway through 2016, and our analysis shows numbers of vulnerabilities surpassing 2015 rates. But at the same time, there are less exploits across the board with the exception of Flash, which continues to have high ROI for hackers. Security is improving, but old attack techniques like phishing and watering hole attacks are still plaguing enterprises. It goes without question that we can expect attackers to evolve in response to heightened security. We need isolation and instant protection to secure our networks and data.”

Specific discoveries by Bromium Labs include:

  • A rise in vulnerabilities, with 516 reported to the National Vulnerability Database in the first half of 2016, as compared to 403 vulnerabilities reported over all of 2015.
  • Fewer exploitable vulnerabilities in popular software systems than in previous years, potentially due to the additional attention software vendors are giving to security.
  • Adobe Flash had 31 exploits in the first half of 2016, up from eight in all of 2015, resulting in some security vendors blocking or ending support for Flash. Regrettably from a security standpoint, Flash remains popular with end users and so continues to be a top target for criminals.
  • The most used exploit kits include Neutrino and Rig; the Angler and Nuclear kits also featured but disappeared in early June, possibly due to crackdowns on cybercrime groups.
  • Since the beginning of 2016, many new ransomware families have been in circulation, the current leader being Locky, with 755 tracked instances that also infect RAM disks and removable drives.


Tackling the Threats

Though the dangers are becoming more sophisticated and insidious, Kashyap believes real efforts are being made to secure networks and IT infrastructure. “As an industry, we’ve always said there’s no one silver bullet to address the complexities of attacks that are affecting our business. However, our latest research shows that enterprises and vendors alike are stepping up to do a better job at securing their networks and data. But there’s still work to be done.” It’s expected that over the next 12 months social engineering tactics will continue to be exploited by attackers, and “instant protection, detection, and remediation is more critical than ever.”

Bromium Labs finds most AV vendors are executing multiple updates per day in an attempt to keep up with machine-timescale attacks, but with new malware observable for less than 60 seconds before it transforms into a victim-specific variant, current malware detection capabilities are found to be lacking. The suggested best strategy is a dramatic reduction of the attack surface, isolating attacks and limiting possible damage and spread. Taking a new approach, Bromium’s micro-virtualization technology is advancing endpoint security: its solution automatically isolates each user task in a lightweight, CPU-enforced micro-VM. For all of Bromium Labs’ security insights and judgements, download the full Bromium Labs Threat Report.

By Jennifer Klostermann

Digital Twin And The End Of The Dreaded Product Recall

The Digital Twin 

How smart factories and connected assets in the emerging Industrial IoT era, along with the automation of machine learning and advances in artificial intelligence, can dramatically change the manufacturing process and put an end to dreaded product recalls in the future.

In recent news, Samsung Electronics Co. initiated a global recall of 2.5 million of its Galaxy Note 7 smartphones after finding that the batteries of some phones exploded while charging. The recall is expected to cost the company close to $1 billion.

This is not a one-off incident.

Product recalls have plagued the manufacturing world for decades, from the food and drug industries to automotive, causing huge losses and risk to human life. In 1982, Johnson & Johnson recalled 31 million bottles of Tylenol, with a retail value of $100 million, after seven people died in the Chicago area. In 2000, Ford recalled 20 million Firestone tires, losing around $3 billion, after 174 people died in road accidents caused by faulty tires. In 2009, Toyota recalled 10 million vehicles over numerous issues, including faulty gas pedals and airbags, resulting in a $2 billion loss from repair expenses and lost sales, in addition to its stock price dropping more than 20%, or $35 billion.

Most manufacturers have very stringent quality control processes for their products before they are shipped. So how and why do faulty products that pose serious risks to life and business still make it to market?

Koh Dong-jin, president of Samsung’s mobile business, said that the cause of the battery issue in the Samsung Galaxy Note 7 was “a tiny problem in the manufacturing process and so it was very difficult to find out“. This is true for most of the recalls that happen. It is not possible to manually detect these seemingly “tiny” problems early enough, before they result in catastrophic outcomes.

But this won’t be the case in the future.

The manufacturing world has seen four transformative revolutions:

  • The 1st Industrial Revolution brought in mechanization powered by water and steam.
  • The 2nd Industrial Revolution saw the advent of the assembly line powered by gas and electricity.
  • The 3rd Industrial Revolution introduced robotic automation powered by computing networks.
  • The 4th Industrial Revolution has taken this to a completely different level, with smart and connected assets powered by machine learning and artificial intelligence.

It is this 4th Industrial Revolution, which we are just embarking on, that has the potential to transform the face of the manufacturing world and create new economic value to the tune of tens of trillions of dollars globally from cost savings and new revenue generation. But why is this the most transformative of all the revolutions? Because it is this revolution that has transformed lifeless mechanical machines into digital life-forms with the birth of the Digital Twin.


A Digital Twin is a computerized companion (or model) of a physical asset that uses multiple internet-connected sensors on that asset to represent its near real-time status, working condition, position, and other key metrics, helping us understand the health and functioning of the asset at a granular level. This lets us understand assets and asset health the way we understand humans and human health, with the ability to perform diagnosis and prognosis like never before.
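To make the idea concrete, here is a minimal, illustrative sketch in Python of a digital twin as a data structure that ingests sensor readings and reports a simple health check. The class, method, and sensor names are hypothetical and not drawn from any particular product.

from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List

@dataclass
class DigitalTwin:
    asset_id: str
    # rolling window of recent sensor readings, keyed by sensor name
    readings: Dict[str, List[float]] = field(default_factory=dict)

    def ingest(self, sensor: str, value: float, window: int = 100) -> None:
        """Record a near real-time reading for one sensor."""
        buf = self.readings.setdefault(sensor, [])
        buf.append(value)
        del buf[:-window]  # keep only the most recent readings

    def health(self, limits: Dict[str, float]) -> Dict[str, bool]:
        """Naive health check: is each sensor's recent mean within its limit?"""
        return {s: mean(v) <= limits.get(s, float("inf"))
                for s, v in self.readings.items() if v}

twin = DigitalTwin("press-line-07")
twin.ingest("bearing_temp_c", 71.2)
twin.ingest("bearing_temp_c", 74.9)
print(twin.health({"bearing_temp_c": 80.0}))  # {'bearing_temp_c': True}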

How can this solve the recall problem?

Sensor-enabling the assembly line and creating Digital Twins of all the individual assets and workflows provides timely insight into the tiniest of issues that can otherwise easily be missed in a manual inspection process. Causes can be detected and potential product quality issues predicted right on the assembly line, as early as possible, so that manufacturers can take proactive action to resolve them before they start snowballing. This not only prevents recalls but also reduces scrap on the assembly line, taking operational efficiency to unprecedented heights.

So what is the deterrent? Why has this problem not been solved by most organizations that have smart-enabled their factories?

The traditional approach to data science and machine learning doesn’t scale for this problem. Traditionally, predictive models are created by taking a sample of data from a sample of assets, and these models are then generalized to predict issues on all assets. While this can detect common, known issues (which would otherwise be caught in the quality control process itself), it fails to detect the rare events that cause massive recalls. Rare events have failure patterns that don’t commonly occur in the assets or on the assembly line. Highly sensitive generalized models can be built to detect any and all deviations, but they would generate a flood of false-positive alerts, which creates a different set of problems altogether. The only way to get accurate models that flag only true issues is to model each asset and workflow channel individually, understand its normal operating conditions, and detect its deviations. This is what makes the challenge beyond human scale: with hundreds, thousands, or millions of assets and components, it is impossible to keep generating and updating models for each one manually. It requires automating the predictive modeling and machine learning process itself, as putting human data scientists in the loop doesn’t scale.
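As a rough illustration of the “one model per asset” idea, the Python sketch below trains a separate unsupervised anomaly detector for each asset using scikit-learn’s IsolationForest. The asset names, sensor counts, and data are invented, and the choice of algorithm is an assumption for illustration only, not a description of any vendor’s actual method.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated per-asset sensor history: rows = time steps, columns = sensors
assets = {
    "robot-arm-01": rng.normal(loc=50.0, scale=2.0, size=(500, 3)),
    "robot-arm-02": rng.normal(loc=65.0, scale=5.0, size=(500, 3)),
}

# Train one model per asset so each learns its own "normal"
models = {asset_id: IsolationForest(random_state=0).fit(history)
          for asset_id, history in assets.items()}

# Score a fresh reading against the matching asset's model only
new_reading = np.array([[49.8, 51.2, 90.0]])        # last sensor value spikes
flag = models["robot-arm-01"].predict(new_reading)  # -1 = anomaly, 1 = normal
print("anomaly" if flag[0] == -1 else "normal")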

But aren’t there standard approaches or scripts to automate predictive modeling?

Yes, there are. However, plain-vanilla automation of the modeling process, which simply runs all permutations of algorithms and hyper-parameters, again doesn’t work. The number of assets (and hence of individual models), the frequency at which models need to be updated to capture newer real-world events, the volume of data, and the wide variety of sensor attributes all create prohibitive computational complexity (think millions or billions of permutations), even for someone with infinite infrastructure to process them. The only solution is Cognitive Automation: an intelligent process that mimics how a human data scientist leverages prior experience to run fewer experiments and reach an optimal ensemble of models in the fastest possible way. In short, this is about teaching machines to do machine learning and data science like an A.I. Data Scientist.
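A back-of-the-envelope sketch makes the scale problem visible. All of the numbers below are assumptions chosen purely for illustration, but they show why exhaustive per-asset search explodes and how much a prior-guided search can cut the work.

algorithms        = 10        # candidate algorithm families
hyperparam_combos = 200       # grid points per algorithm
assets            = 100_000   # individual assets/components to model
retrains_per_year = 52        # weekly refresh to capture new events

exhaustive_fits = algorithms * hyperparam_combos * assets * retrains_per_year
print(f"exhaustive search: {exhaustive_fits:,} model fits per year")  # 10,400,000,000

# A prior-guided ("cognitive") search tries only a handful of configurations
# per asset, chosen from what worked on similar assets before.
guided_trials = 8
guided_fits = guided_trials * assets * retrains_per_year
print(f"prior-guided search: {guided_fits:,} model fits per year")    # 41,600,000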

This is the technology that is required to give Digital Twin a true life-form that delivers the end business value – in this case to prevent recalls.

Does it sound like sci-fi?

It isn’t, and it is already happening thanks to advances in machine learning and artificial intelligence. Companies like Google are using algorithms to create self-driving cars and beat world champions at complex games. At the same time, we at DataRPM are using algorithms to teach machines to do data analysis and detect asset failures and quality issues on the assembly line. This dramatically improves operational efficiency and prevents product recalls.

The future, where the dreaded product recalls will be a thing of the past, is almost here!

By Ruban Phukan, Co-Founder and Chief Product & Analytics Officer, DataRPM

Microsoft Dynamics CRM Online Selected By HP To Transform Sales And Partner Engagement

REDMOND, Wash. — Sept. 12, 2016 — Microsoft Corp. has entered a six-year agreement with HP Inc. to deploy Microsoft Dynamics to thousands of employees across HP, dramatically enhancing collaboration across marketing, sales and service operations. With Dynamics, as well as Azure, Office 365 and other Microsoft Cloud solutions, HP has invested in the sales and service collaboration platform it needs to deliver a seamless sales experience for customers and partners while increasing the company’s performance and economies.

“We have chosen Microsoft Dynamics as our CRM solution for our direct selling, partners and services,” said Jon Flaxman, chief operating officer, HP. “This brings us a cloud-based solution that delivers a more effective and efficient collaboration engine across our business.”

HP is undergoing a journey to transform its sales and partner environment, driving increased productivity and collaboration in a virtually all-digital world. As part of this transformation, the company is moving to a more integrated sales experience for both HP sales reps and the channel partner community.

Complementing Dynamics CRM, Office 365 provides worldwide sales, service and marketing professionals at HP with an immersive, connected productivity experience for teamwork and collaboration. In addition, Power BI will empower HP marketers to uncover powerful business insights and predictions. Azure will provide the IT organization with a global, open, hybrid cloud for all of the solutions, while also giving HP a platform for new capabilities and services at a low total cost of ownership.

“HP continues to innovate in its customer engagement, with the tools and business processes it provides to its employees and partner community and, of course, the products and services it delivers,” said Judson Althoff, executive vice president of Worldwide Commercial Business at Microsoft. “We share this dedication to digital transformation with HP and are incredibly proud to work with it as it delivers amazing technology experiences to people around the globe.”

Read more at: Microsoft News

The History of Containers and Rise of Docker

Containers 101

Docker started out as a means of creating single-application containers but has since grown into a widely used dev tool and runtime environment. It has been downloaded around two billion times, and Redmonk has said that “we have never seen a technology become ubiquitous so quickly.” The Docker registry stores container images and provides a central point of access that can be used to share containers. Users can either push images into the registry or pull images from it and deploy directly from the registry. Despite its widespread growth and acceptance, Docker still retains its free, open-source roots and hosts a free public registry for containers from which anyone can obtain official Docker images. Below is an infographic discovered via Twistlock which gives a really nice overview of container technologies.
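As a small illustration of pulling and running an official image from the public registry, the sketch below uses the Docker SDK for Python. It assumes the docker package is installed (pip install docker) and a local Docker daemon is running; the image name is just an example.

import docker

client = docker.from_env()                           # connect to the local daemon
image = client.images.pull("alpine", tag="latest")   # fetch from the public registry
print(image.tags)

# Run a throwaway container from the pulled image and capture its output
output = client.containers.run("alpine:latest",
                               ["echo", "hello from a container"],
                               remove=True)
print(output.decode().strip())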


By Jonquil McDaniel

Write Once, Run Anywhere: The IoT Machine Learning Shift From Proprietary Technology To Data

The IoT Machine Learning Shift

While early artificial intelligence (AI) programs were one-trick ponies, typically able to excel at only one task, today it’s about becoming a jack of all trades. Or at least, that’s the intention. The goal is to write one program that can solve multi-variant problems without needing to be rewritten when conditions change—write once, run anywhere. Digital heavyweights—notably Amazon, Google, IBM, and Microsoft—are now open sourcing their machine learning (ML) libraries in pursuit of that goal as competitive pressures shift the focus for differentiation from proprietary technologies to proprietary data.

Machine learning is the study of algorithms that learn from examples and experience, rather than relying on hard-coded rules that do not always adapt well to real-world environments. ABI Research forecasts ML-based IoT analytics revenues will grow from $2 billion in 2016 to more than $19 billion in 2021, with more than 90% of 2021 revenue attributable to more advanced analytics phases. Yet while ML is an intuitive and organic approach to what was once a very rudimentary and primal way of analyzing data, it is worth noting that the ML/AI model creation process itself can be very complex.


The techniques used to develop machine learning algorithms fall under two umbrellas:

  • How they learn: based on the type of input data provided to the algorithm (supervised learning, unsupervised learning, reinforcement learning, and semi-supervised learning)

  • How they work: based on the type of operation, task, or problem performed on I/O data (classification, regression, clustering, anomaly detection, and recommendation engines)

Once the basic principles are established, a classifier can be trained to automate the creation of rules for a model. The challenge lies in learning and implementing the complex algorithms required to build these ML models, which can be costly, difficult, and time-consuming.
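As a toy illustration of the supervised classification case, the sketch below lets a decision tree learn its own rule from labeled examples instead of relying on hand-coded thresholds. The features, labels, and the "hidden rule" the model recovers are invented for the example.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Features: [temperature, vibration]; label: 1 = faulty device, 0 = healthy
X = rng.normal(size=(400, 2))
y = ((X[:, 0] + X[:, 1]) > 1.0).astype(int)   # the hidden rule the model must learn

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")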

Engaging the open-source community accelerates the development and integration of machine learning technologies by an order of magnitude without the need to expose proprietary data, a trend that Amazon, Google, IBM, and Microsoft swiftly pioneered.

At more than $1 trillion, these four companies have a combined market cap that dwarfs the annual gross domestic product of more than 90% of countries in the world. Each also open sourced its own deep learning library in the past 12 to 18 months: Amazon’s Deep Scalable Sparse Tensor Network Engine (DSSTNE; pronounced “destiny”), Google’s TensorFlow, IBM’s SystemML, and Microsoft’s Computational Network Toolkit (CNTK). And others are quickly following suit, including Baidu, Facebook, and OpenAI.

But this is just the beginning. To take the most advanced ML models used in IoT to the next level (artificial intelligence), modeling and neural network toolsets (e.g., syntactic parsers) must improve. Open sourcing such toolsets is again a viable option, and Google is taking the lead by open sourcing its neural network framework SyntaxNet, driving the next evolution in IoT from advanced analytics to smart, autonomous machines.

But should others continue to jump on this bandwagon and attempt to shift away from proprietary technology and toward proprietary data? Not all companies own the kind of data that Google collects through Android or Search, or that IBM picked up with its acquisition of The Weather Company’s B2B, mobile, and cloud-based web-properties. Fortunately, a proprietary data strategy is not the panacea for competitive advantage in data and analytics. As more devices get connected, technology will play an increasingly important role for balancing insight generation from previously untapped datasets, and the capacity to derive value from the highly variable, high-volume data that comes with these new endpoints—at a cloud scale, with zero manual tuning.



Collaborative economics is an important component in the analytics product and service strategies of these four leading digital companies, all seeking to build a greater presence in IoT and, more broadly, the convergence of the digital and the physical. But “collaboration” should be placed in context. Once one company open-sourced its ML libraries, the others were forced to release theirs as well. Millions of developers are far more powerful than a few thousand in-house employees. Open sourcing also offers these companies tremendous benefits because they can use the new tools to enhance their own operations. For example, Baidu’s Paddle ML software is being used in 30 different online and offline Baidu businesses ranging from health to financial services.

And there are other areas for these companies to invest resources that go beyond the analytics toolsets. Identity management services, data exchange services and data chain of custody are three key areas that will be critical in the growth of IoT and the digital/physical convergence. Pursuing ownership or proprietary access to important data has its appeal. But the new opportunities in the IoT landscape will rely on great technology and the scale these companies possess for a connected world that will in the decades to come reach hundreds of billions of endpoints.

By Ryan Martin and Dan Shey

Ryan Martin, Senior Analyst at ABI Research, covers new and emerging mobile technologies, including wearable tech, connected cars, big data analytics, and the Internet of Things (IoT) / Internet of Everything (IoE). 

Ryan holds degrees in economics and political science, with an additional concentration in global studies, from the University of Vermont and an M.B.A. from the University of New Hampshire.

Why Do Television Companies Need A Digital Transformation

Cloud TV

Over just a few years, the world of television production, distribution, and consumption has changed dramatically. In the past, with only a few channels to choose from, viewers watched news and entertainment television at specific times of the day or night. They were also limited by where and how they could watch: options included staying home, going to a friend’s house, or perhaps going to a restaurant or bar to watch a special game, show, news story, or event. The TV industry has since completed the move from standard definition to high definition, and the discussion is now about 4K and 8K video standards. But before all of this can happen, analog-based broadcasting needs to transform digitally. That means the TV industry unavoidably needs a disruptive transformation of its ICT platform to cope with new processes of acquisition, production, distribution, and consumption.


Fast-forward to today, and you have a very different scenario. Thanks to the rise of the Internet – and, in particular, mobile technology – people have nearly limitless options for their news and entertainment sources. Not only that, but they can choose to get their news and other media on TV or on a variety of smart devices, including phones, tablets, smart watches, and more.

Improved Business Value From New Information and Communication Technologies (ICT)

The world has changed, and continues to change, at a rapid pace. This change has introduced a number of challenges to businesses in the television industry. Making the digital media transformation can do a number of things to resolve these challenges and improve your business and viewership.

With leading new ICT, you can see significant business value and improved marketing and production strategies. For example, making this transformation can vastly improve your television station’s information production and service capabilities. It can also smooth the processes involved in improving broadcasting coverage and performance.

With these improvements, your station will have faster response times when handling time-sensitive broadcasts. This delivers to your audience the up-to-the-minute coverage and updates they want across different TV and media devices and platforms.

Improved Social Value with New ICT

A television station that refuses to change and evolve with viewers’ continuously evolving needs and wants will find itself falling behind competitors. However, a TV station that understands the necessity of making the digital media transformation will have significantly improved social value with its audiences.


Television stations that embrace new technology, digital media, storage, cloud computing and sharing will see massive improvements in social value. Consider that this transformation enables your station to produce timely and accurate reports faster, giving your audience the freshest information and entertainment.

By bringing news and entertainment media to your audience when, where and how they want and need it, you can enrich their lives and promote a culture of information sharing that will also serve to improve your ratings and business. With technologies like cloud-based high-definition video production and cloud-based storage and sharing architectures, you can eliminate many of the challenges and pain points associated with reporting news and bringing TV entertainment to a large audience.

Why Do Television, Media, and Entertainment Companies Need a Digital Transformation?

Consider the basic steps that a TV news station must take to get the news to its audience:

  • Acquisition
  • Production
  • Distribution
  • Consumption

For television stations that have not yet embraced a digital media transformation, these steps do not just represent the process of delivering news media to the public. They also represent a series of pain points that can halt progress and delay deadlines. These include:

  • Traditional AV matrices use numerous cables, are limited by short transmission distance for HD signals and require complicated maintenance, slowing down 4K video evolution.
  • Delays when attempting to transmit large video files from remote locations back to the television station.
  • Delays when reporters edit videos because office and production networks in TV stations are separated from each other, requiring them to move back and forth between the production zone and the office zone in their building to do research.
  • Delays due to the time it takes to transmit a finished program (between six and twenty-four minutes, depending on the length and whether or not it is a high-definition video) to the audience.
  • 4K video production has much higher requirements on bandwidth and frame rates.

These challenges all occur in traditional structures and architectures for media handling, but they quickly dissolve when a TV station makes the digital transformation and begins using a cloud-based architecture with new ICT.

Keeping Up With Viewer Demand via Ultra High Definition (UHD) Omnimedia

Increasingly, viewers demand more individualized experiences. These include interactive programming, rich media, and UHD video, and they want it all across every applicable device. Delivering UHD omnimedia is only possible through new ICT, as older IT infrastructures simply cannot scale to the levels necessary to keep up with viewer demand.

Fortunately, through cloud-based architectures and faster sharing, networks and stations may not only keep up with consumer demand but actually surpass it. For example, when using 4K formatting, your station can provide viewers with the highest resolution possible (4096 x 2160 pixels), and your video formatting will be easily scalable for different platforms for the most convenient viewing possible.

Furthermore, by becoming an omnimedia center, your station can enjoy the benefits of converged communications. Essentially, this means that you will be creating information and/or entertainment that can be used in multiple different ways for television, social media, news sites, etc., giving you more coverage and exposure than ever before.

What Is Required to Make the Transformation to Digital Media?

Cloud computing and embracing 4K for video formatting are both essential to digital media transformation, but they are not all that is necessary. Aside from these two elements, television stations can take advantage of advances in technology in a number of ways to improve their marketing and production strategies through the use of new ICTs.

For example, thin clients and cloud computing can enable video editing anywhere and anytime, increasing efficiency. To reduce the latency between thin clients and the cloud, today’s new ICT architectures draw on enhanced display protocols, virtual machines, and GPU virtualization to enable smooth, audio/video-synchronized editing of 8-track HD video, or even 6-track 4K video editing on clients via the industry’s only IP storage system.

As mentioned earlier, through cloud computing, it is no longer necessary to physically transport video from a news site to the station. Likewise, it is no longer necessary to do all production work and research in separate areas. Thanks to cloud storage and sharing, these pain points can easily be eliminated, as sharing and sending information becomes much simpler and faster.

An all-IP based video injection process is a must if TV stations want to lower network complexity and simplify system maintenance. There are two ways to approach this:

  1. IP cables can replace traditional SDI signals: each cable transmits one channel of 4K video, whereas SDI requires four cables to transmit the same video. Using IP cables can thus reduce the number of necessary cables by up to 92%, improve O&M efficiency by 60%, and simplify system interworking and interaction.
  2. With the help of mobile broadband, WAN-accelerated networks, and smartphones or tablets, journalists in the field can now shorten the video submission process by 90%. Most importantly, cloud computing allows journalists to edit video anywhere and anytime. With the help of fast transcoding resources in the cloud, real-time video reporting is now possible.

Another major factor in any digital media transformation is big data and data analytics. By collecting and analyzing information about your station’s viewers, you can create more personalized viewing experiences. Netflix offers perhaps one of the best-known examples of this: it has created algorithms based on previous customer behavior to predict whether or not a viewer will enjoy a certain film or show, and which media to recommend to each viewer.
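As a toy illustration of this kind of behavior-based recommendation (not Netflix’s actual algorithm), the sketch below predicts a rating for an unseen show from the ratings of viewers with similar tastes. The ratings matrix is invented purely for illustration.

import numpy as np

# rows = viewers, columns = shows; 0 means "not yet watched"
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 2],
    [1, 0, 5, 4],
], dtype=float)

def predict(viewer: int, show: int) -> float:
    """Weighted average of other viewers' ratings, weighted by taste similarity."""
    target = ratings[viewer]
    score, weight = 0.0, 0.0
    for other, row in enumerate(ratings):
        if other == viewer or row[show] == 0:
            continue
        mask = (target > 0) & (row > 0)          # shows both viewers have rated
        if not mask.any():
            continue
        sim = np.dot(target[mask], row[mask]) / (
            np.linalg.norm(target[mask]) * np.linalg.norm(row[mask]))
        score += sim * row[show]
        weight += sim
    return score / weight if weight else 0.0

print(f"predicted rating for viewer 0, show 2: {predict(0, 2):.2f}")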


Through these and other information and communication technologies, such as the Internet of Things (IoT), SDN (software-defined networking), and improved mobile broadband, television stations can bring faster, more accurate, and more convenient news and entertainment to their customers and viewers.

Who Is Leading the Way in the Transformation?

In my opinion, the company with complete, agile innovation across cloud-pipe-device collaboration will lead the way in this transformation. One such company, China’s Huawei, is now trying to create an ecosystem for global channel partners and solution partners across the news and media entertainment industry, and it provides an open ICT platform that encourages media industry developers to continue innovating their products. With strong development in cloud-based architectures, SDN, mobile broadband, and IoT, developers and partners are able to create comprehensive solutions that empower media stations of all kinds to move into the future.

What do you think of the digital media transformation in the Television Industry?

(Originally published September 7th, 2016)

By Ronald van Loon

Huawei Announces BES Cloud

Cloud Business Enabling System

SHANGHAI, Sept. 9, 2016 /PRNewswire/ — Huawei announced its BES Cloud solution at HUAWEI CONNECT 2016. BES Cloud represents Huawei’s ongoing commitment to the development of industry-leading BSS solutions through the provision of a SaaS-based model for its new-generation BSS, BES (Business Enabling System). Built on a digital-native architecture, BES Cloud leverages global cloud infrastructure as well as a flexible and scalable platform to offer a highly configurable suite of OOTB (out-of-the-box) software features.

To meet the needs of the digital economy, Huawei’s solution helps to accelerate digital transformation across a global footprint. The system can be deployed and go live in three months whilst reducing TCO (Total Cost of Ownership) by up to 45%. In addition to driving operational efficiencies, BES Cloud supports IT and business agility with seamless upgrade paths and an integrated feature set of best practices that shortens the time to market for launching new products and services whilst enabling innovative new business and engagement models.

Maurice Ma (right), VP of Huawei Carrier Software BU and Jian Guan (left), General Manager of BES as a Service Product, Huawei Carrier Software BU, answer the journalists’ questions

Huawei BES Cloud includes the BSS Lite Cloud and Commerce Cloud. BSS Lite Cloud addresses traditional BSS systems’ long release periods and high costs by providing an end-to-end solution that offers a full suite of BSS applications, from CRM to billing, for emerging brands as well as small and medium-sized mobile operators. Core benefits include shortening system deployment time by 60% on average and significantly reducing operating costs.

Commerce Cloud, which can be integrated with existing on-premise systems, is an agile and lightweight solution that addresses the challenges faced by operators as they transition to a digital operating model. With a customizable UI and web front end, Commerce Cloud features a cloud-based multi-tenant architecture that employs flexible metadata-driven modeling and auto-scaling. Building on an extensive set of embedded best practices, Huawei’s absolute focus on customer-centric design principles is a critical factor in its mission to deliver solutions that enable an optimized customer experience over digital channels. By supporting personalized customer interactions, Commerce Cloud helps to improve NPS while also helping to increase order conversion rates in a fully orchestrated omni-channel environment.

About Huawei

Huawei is a leading global information and communications technology (ICT) solutions provider. Driven by customer-centric innovation and open partnerships, Huawei has established an end-to-end ICT solutions portfolio that gives customers competitive advantages in telecom and enterprise networks, devices and cloud computing. Our innovative ICT solutions, products and services are used in more than 170 countries and regions, serving over one-third of the world’s population.

For more information, please visit Huawei online at
