Category Archives: Big Data

Security and the Potential of 2 Billion Device Failures

IoT Device Failures

Over the past three years I have posted on a number of Internet of Things (and the broader NIST-defined Cyber-Physical Systems) topics. I have talked about drones, wearables and many other aspects of the Internet of Things.

One of the integration problems has been the number of protocols the various devices use to communicate with one another. The rise of protocol gateways in the cloud service provider market is therefore an incredibly good thing. Basically, this allows an organization to map sensors and other IoT/CPS device outputs to a cloud gateway that will connect, transfer and communicate with the device – regardless of the device’s protocol of choice.
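
To make that concrete, here is a minimal sketch (in Python, with hypothetical device names and payload formats) of what a gateway's normalization step might look like: messages arriving over different protocols are mapped to one canonical record before anything downstream sees them.

```python
import json
from datetime import datetime, timezone

# Hypothetical normalizers keyed by the transport the device happens to use.
# Each one maps a raw, protocol-specific payload to the same canonical shape,
# so downstream services never need to know which protocol the device spoke.
def normalize_mqtt(topic: str, payload: bytes) -> dict:
    body = json.loads(payload)
    return {"device_id": topic.split("/")[-1], "metric": body["m"], "value": body["v"]}

def normalize_coap(uri_path: str, payload: bytes) -> dict:
    device_id, metric, value = payload.decode().split(",")
    return {"device_id": device_id, "metric": metric, "value": float(value)}

NORMALIZERS = {"mqtt": normalize_mqtt, "coap": normalize_coap}

def ingest(protocol: str, route: str, payload: bytes) -> dict:
    """Turn any supported device message into one canonical record."""
    record = NORMALIZERS[protocol](route, payload)
    record["received_at"] = datetime.now(timezone.utc).isoformat()
    return record

# Two devices speaking different protocols produce identical record shapes.
print(ingest("mqtt", "plant1/sensors/pump-17", b'{"m": "temp_c", "v": 71.4}'))
print(ingest("coap", "/telemetry", b"valve-03,pressure_kpa,402.5"))
```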

Racing out of the Gate


What the new gateways do is remove integration as a stumbling block for ongoing and future IoT solutions. Pick the wrong horse in the initial protocol race? With a gateway, it doesn’t matter. You can, over time, replace the devices deployed with the orphaned protocol and move forward with your system. The cloud service provider protocol gateway gives you the flexibility to also consider deploying multiple types of sensors and protocols, instead of limiting your organization to one.

The question going forward is this: does the integration provided by the gateway give rise to the broader concept of an IoT broker? This is where the services offered by IoT devices could be parsed out and shared among the organizations and companies that are members of the broker. Think of it as a buyer’s club for sensors.

From my perspective, the issue that keeps me awake at night is IoT device security. For the most part, IoT devices are ‘fire and forget’. Yes, occasionally you may have to change a battery, replace a cellular connection, or update how the device is deployed. Some devices simply aren’t going to be attacked, because there is nothing to gain. I read an article about hacking a river monitoring system to cause a flood downstream. I thought about that for a long time, and realized that the reality of flooding is that we usually know it is coming, and people would be out taking manual measurements anyway. But there are other ways to mount an effective attack through the IoT.

It is the security of IoT devices that will become more and more troublesome, first because their number is growing rapidly: from 10 billion or so deployed in 2015 to more than 40 billion projected by 2020. That is four times as many devices in the next four years.

The reality of hardware is that many devices deployed today will still be deployed in four years, since device costs and capital expenses for hardware are typically spread over three to five years. A large share of the devices that will be in service in 2020 are therefore already deployed; it isn’t a run to the cliff followed by a leap to 40 billion devices.

2 Billion Device Failures


What scares me is this: of the 10 billion or so devices deployed today, logically about 2 billion will fail and another 2 billion will be replaced naturally. That leaves 6 billion devices running with today’s security solutions – solutions that will rapidly become obsolete, and an expensive installed base to replace. This is where the gateways mentioned earlier reappear. Today they are a way to bring multiple IoT protocols together; in the future they will become the best line of defense for already-deployed devices.

Deploying secure solutions at the gateway level will be the best defense against attacks on IoT devices that do not have integrated security. The next-best thing would be the deployment of devices with easily replaced security modules, but that is a consideration for upcoming devices – not ones deployed today.
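
As a rough illustration of what “secure at the gateway” can mean for devices that cannot defend themselves, the sketch below assumes a hypothetical policy of allow-listed device IDs, payload schema checks, and rate limiting applied before anything is forwarded upstream; it is not any particular vendor’s implementation.

```python
import json
import time

# Hypothetical policy the gateway enforces on behalf of legacy devices that
# cannot be patched: known device IDs, expected fields, and a message rate cap.
ALLOWED_DEVICES = {"pump-17", "valve-03"}
EXPECTED_FIELDS = {"metric", "value"}
MAX_MSGS_PER_MINUTE = 60
_last_seen = {}

def gateway_filter(device_id: str, payload: bytes) -> bool:
    """Return True only if the message is safe to forward upstream."""
    if device_id not in ALLOWED_DEVICES:
        return False                       # unknown device
    try:
        body = json.loads(payload)
    except ValueError:
        return False                       # malformed payload
    if set(body) != EXPECTED_FIELDS:
        return False                       # unexpected shape, possible probe
    window = _last_seen.setdefault(device_id, [])
    now = time.time()
    window[:] = [t for t in window if now - t < 60] + [now]
    return len(window) <= MAX_MSGS_PER_MINUTE  # throttle floods

print(gateway_filter("pump-17", b'{"metric": "temp_c", "value": 71.4}'))  # True
print(gateway_filter("intruder", b'{"metric": "temp_c", "value": 1}'))    # False
```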

A secure IoT future – enabled by a simple cloud gateway.

By Scott Andersen

Negotiating Wearable Device Security

Wearable Device Security

Recent studies have highlighted gaps in security and privacy created by wearable technology, with one report from the US Department of Health and Human Services noting that many of the new devices available which “collect, share and use health information are not regulated by the Health Insurance Portability and Accountability Act (HIPAA).” With personal information collected and shared more than ever, regulations governing the security and privacy of such data struggle to keep up with the potential risks. This particular report suggests, “To ensure privacy, security, and access by consumers to health data, and to create a predictable business environment for health data collectors, developers, and entrepreneurs to foster innovation, the gaps in oversight identified in this report should be filled.” Pertinent questions, however, remain. Who is responsible for ensuring adequate privacy and security concerns are addressed? And precisely where are all of these gaps?

Widespread Concerns


Concerns aren’t only for the vulnerability of health data, though it should be understood that much of this information is highly sensitive and necessarily requires first-class security measures. Research from Binghamton University and the Stevens Institute of Technology has pointed to the potential for wearable devices to leak passwords. Using data from wearable tech sensors, including smartwatches and fitness trackers, researchers were able to crack PINs on the first attempt 80% of the time. Of course, some might shrug and suggest they care very little if hackers have access to how many steps they’ve taken on any particular day, but let’s not forget the data available to anyone who cracks the code of a smartwatch, nor how many of us reuse PINs across devices. Says Yan Wang, assistant professor of computer science within the Thomas J. Watson School of Engineering and Applied Science at Binghamton University, “Wearable devices can be exploited. Attackers can reproduce the trajectories of the user’s hand then recover secret key entries to ATM cash machines, electronic door locks, and keypad-controlled enterprise servers. The threat is real, although the approach is sophisticated.”

Business Adoption of Wearable Tech

A range of benefits exists for the adoption of wearable tech within companies, including improved productivity, better employee safety, and enhanced customer engagement. However, the security concerns around wearable tech in the workplace are at least as pronounced as those in personal environments, if not more so. Network security in particular is put under strain, with the appropriate configuration of an organization’s network being a key fortification. Because many of the wearable devices we’re using today have poor or no encryption, data interception is easier, and company networks that were otherwise well secured become vulnerable. Moreover, most wearables arrive with software that is unique and difficult to update, resulting in an ecosystem of dissimilar devices, each with its own distinctive weaknesses requiring tailored security adjustments.

The Fix?

There is, unfortunately, no one-size-fits-all solution to the security and privacy issues of our wearables, and besides, any solution today will need updates and amendments tomorrow. But the future of wearables is by no means a bleak one. Responsible designers and developers are accounting for today’s concerns with more robust security processes for the next generations of devices, and networks are already being restructured to guard against wearable vulnerabilities.

Wang points to two attacking scenarios, internal and sniffing attacks, the first typically perpetrated through malware and the second via wireless sniffers that eavesdrop on sensor data sent via Bluetooth. Solutions to such assaults include improved encryption between host operating systems and wearable devices, and the injection of “a certain type of noise to data so it cannot be used to derive fine-grained hand movements.” And for businesses keen to adopt BYOD policies, the implementation of channels outside of the company network specifically for wearable devices can ensure limited access to sensitive data.
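
The noise-injection idea can be illustrated with a toy example. This is not the researchers’ actual method, just a hedged sketch assuming raw accelerometer triples: small random perturbations are added before the data leaves the device, so coarse fitness metrics survive while fine-grained hand trajectories become harder to reconstruct.

```python
import random

def add_noise(samples, scale=0.05):
    """Perturb each accelerometer reading with small uniform noise.

    Coarse aggregates (averages, step counts) change little, but the exact
    hand trajectory needed to recover a PIN entry becomes much harder to
    reconstruct from the transmitted stream.
    """
    return [(x + random.uniform(-scale, scale),
             y + random.uniform(-scale, scale),
             z + random.uniform(-scale, scale)) for x, y, z in samples]

raw = [(0.01, 0.02, 0.98), (0.03, 0.01, 0.97), (0.02, 0.04, 0.99)]
protected = add_noise(raw)
print(protected)
# Mean vertical acceleration is nearly unchanged, so fitness features survive.
print(sum(z for _, _, z in raw) / len(raw), sum(z for _, _, z in protected) / len(protected))
```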

Finding the middle ground between the benefits of wearable device usage and the vulnerabilities they introduce is likely to be a painstaking negotiation at first. But the more policies are defined and put into effect, the better networks are delineated, and the stronger wearable encryption and protection become, the easier the process will be and the greater our rewards.

By Jennifer Klostermann

New Bromium Labs Threat Report

2016 Threat Report

The semi-annual Bromium Labs Threat Report has just been released, providing an analysis of the cyber-attacks and threats that have struck enterprise security in the last six months. It finds an eruption of ransomware usage as well as an increase in app, browser, and plug-in vulnerabilities, and notes that while Microsoft strengthens security, nefarious forces are changing tack and concentrating on ‘drive-by download attacks.’

Significant Conclusions

Though it’s clear that criminals are working harder than ever to get their hands on protected data, it’s not all bad news. The Bromium Labs Threat Report also notes that although the number of vulnerabilities is constantly rising, they aren’t all being exploited. Unfortunately, there have been several high-profile data breaches and ransomware attacks of late, leaving enterprise security in a somewhat precarious position. Commenting exclusively to CloudTweaks, Bromium EVP and Chief Security Architect Rahul Kashyap states, “We’re only halfway through 2016, and our analysis shows numbers of vulnerabilities surpassing 2015 rates. But at the same time, there are less exploits across the board with the exception of Flash, which continues to have high ROI for hackers. Security is improving, but old attack techniques like phishing and watering hole attacks are still plaguing enterprises. It goes without question that we can expect attackers to evolve in response to heightened security. We need isolation and instant protection to secure our networks and data.”

Specific discoveries by Bromium Labs include:

  • A rise in vulnerabilities, with 516 reported to the National Vulnerability Database in the first half of 2016, as compared to 403 vulnerabilities reported over all of 2015.
  • Fewer exploitable vulnerabilities in popular software systems than in previous years, potentially due to the additional attention software vendors are giving to security.
  • Adobe Flash had 31 exploits in the first half of 2016, up from eight in 2015, resulting in some security vendors blocking or ending support for Flash. Regrettably from a security standpoint, Flash remains popular with end users and so continues to be a top target for criminals.
  • The most used exploit kits include Neutrino and Rig; the Angler and Nuclear kits also featured but disappeared in early June, possibly due to crackdowns on cybercrime groups.
  • Since the beginning of 2016, many new ransomware families have been circulated, the current leader being Locky with 755 tracked instances infecting RAM disks and removable drives.


Tackling the Threats

Though the dangers are becoming more sophisticated and insidious, Kashyap believes real efforts are being made to secure networks and IT infrastructure. “As an industry, we’ve always said there’s no one silver bullet to address the complexities of attacks that are affecting our business. However, our latest research shows that enterprises and vendors alike are stepping up to do a better job at securing their networks and data. But there’s still work to be done.” It’s expected that over the next 12 months social engineering tactics will continue to be exploited by attackers, and “instant protection, detection, and remediation is more critical than ever.”

Bromium Labs finds that most AV vendors are executing multiple updates per day in an attempt to keep up with machine-timescale attacks, but with new malware observable for less than 60 seconds before it transforms into a victim-specific variant, current detection capabilities are found to be lacking. The suggested best strategy is a dramatic reduction of the attack surface, isolating attacks and limiting possible damage and spread. Taking a new approach, Bromium’s unique micro-virtualization technology is advancing endpoint security: its solution automatically isolates each user task in a lightweight, CPU-enforced micro-VM. For all of Bromium Labs’ security insights and judgements, download the full Bromium Labs Threat Report.

By Jennifer Klostermann

Digital Twin And The End Of The Dreaded Product Recall

The Digital Twin 

How smart factories and connected assets in the emerging Industrial IoT era, along with the automation of machine learning and advances in artificial intelligence, can dramatically change the manufacturing process and put an end to dreaded product recalls in the future.

In recent news, Samsung Electronics Co. initiated a global recall of 2.5 million of its Galaxy Note 7 smartphones after finding that the batteries of some of the phones exploded while charging. The recall is expected to cost the company close to $1 billion.

This is not a one-off incident.

Product recalls have plagued the manufacturing world for decades, from the food and drug industries to automotive, causing huge losses and risk to human life. In 1982, Johnson & Johnson recalled 31 million bottles of Tylenol, with a retail value of $100 million, after 7 people died in the Chicago area. In 2000, Ford recalled 20 million Firestone tires, losing around $3 billion, after 174 people died in road accidents attributed to faulty tires. In 2009, Toyota issued a recall of 10 million vehicles due to numerous issues, including gas pedal and airbag faults, resulting in a $2 billion loss from repair expenses and lost sales, in addition to its stock price dropping more than 20%, or $35 billion.

Most manufacturers have very stringent quality control processes for their products before they are shipped. How and why, then, do faulty products that pose serious risks to life and business still make it to market?

Koh Dong-jin, president of Samsung’s mobile business, said that the cause of the battery issue in the Samsung Galaxy Note 7 was “a tiny problem in the manufacturing process and so it was very difficult to find out“. This is true for most of the recalls that happen. It is not possible to manually detect these seemingly “tiny” problems early enough, before they result in catastrophic outcomes.

But this won’t be the case in the future.

The manufacturing world has seen four transformative revolutions:

  • The 1st Industrial Revolution brought in mechanization powered by water and steam.
  • The 2nd Industrial Revolution saw the advent of the assembly line powered by gas and electricity.
  • The 3rd Industrial Revolution introduced robotic automation powered by computing networks.
  • The 4th Industrial Revolution has taken this to a completely different level, with smart and connected assets powered by machine learning and artificial intelligence.

It is this 4th Industrial Revolution, which we are just embarking on, that has the potential to transform the face of the manufacturing world and create new economic value to the tune of tens of trillions of dollars globally, from cost savings and new revenue generation. Why is this the most transformative of all revolutions? Because it turns lifeless mechanical machines into digital life-forms with the birth of the Digital Twin.


A Digital Twin is a computerized companion (or model) of a physical asset that uses multiple internet-connected sensors on the asset to represent its near real-time status, working condition, position, and other key metrics, revealing the health and functioning of the asset at a granular level. It lets us understand assets and asset health the way we understand humans and human health, with the ability to perform diagnosis and prognosis like never before.
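
A minimal sketch of the idea in code, with hypothetical field names and thresholds, might look like this: a software object per asset that is updated from the sensor stream and exposes a simple health indicator.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class DigitalTwin:
    """Toy digital twin: mirrors one physical asset from its sensor stream."""
    asset_id: str
    temperature_c: list = field(default_factory=list)
    vibration_mm_s: list = field(default_factory=list)

    def update(self, temperature_c: float, vibration_mm_s: float) -> None:
        self.temperature_c.append(temperature_c)
        self.vibration_mm_s.append(vibration_mm_s)

    def health(self) -> str:
        # Hypothetical thresholds: real twins would use per-asset-type models.
        if mean(self.vibration_mm_s[-10:]) > 7.0 or self.temperature_c[-1] > 90:
            return "degraded"
        return "healthy"

twin = DigitalTwin("press-42")
twin.update(82.0, 3.1)
twin.update(91.5, 8.8)
print(twin.asset_id, twin.health())   # the rising readings flag the asset as degraded
```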

How can this solve the recall problem?

Sensor-enabling the assembly line and creating a Digital Twin of every individual asset and workflow provides timely insight into the tiniest issues, which can otherwise be easily missed in a manual inspection process. Causes can be detected and potential product quality issues predicted right on the assembly line, as early as possible, so that manufacturers can take proactive action before problems start snowballing. This not only prevents recalls but also reduces scrap on the assembly line, taking operational efficiency to unprecedented heights.

What is the deterrent? Why has this problem not been solved by most organizations that have smart-enabled their factories?

The traditional approach to data science and machine learning does not scale for this problem. Traditionally, predictive models are created by taking a sample of data from a sample of assets, and these models are then generalized to predict issues on all assets. While this can detect common, known issues – which would typically be caught in the quality control process anyway – it fails to detect the rare events that cause the massive recalls. Rare events have failure patterns that do not commonly occur across the assets or the assembly line. Highly sensitive generalized models can be created to flag any and all deviations, but they would generate a flood of false positive alerts, which causes a different series of problems altogether.

The only way to get accurate models that detect only true issues is to model each asset and workflow channel individually, learn its normal operating conditions, and detect its own deviations. This is what makes the challenge beyond human scale: when there are hundreds, thousands, or millions of assets and components, it is impossible to keep generating and updating models for each one manually. It requires automation of the predictive modeling and machine learning process itself, as putting human data scientists in the loop does not scale.
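
A hedged sketch of the per-asset approach, assuming scikit-learn is available and using made-up temperature data, shows why one model per asset matters: the same reading can be normal for one machine and anomalous for another.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical sensor histories: each asset has its own "normal" operating range.
fleet_history = {
    "press-01": rng.normal(loc=70.0, scale=1.0, size=(500, 1)),   # runs cool
    "press-02": rng.normal(loc=85.0, scale=1.5, size=(500, 1)),   # runs hot
}

# One model per asset, trained only on that asset's own behaviour.
models = {
    asset_id: IsolationForest(contamination=0.01, random_state=0).fit(history)
    for asset_id, history in fleet_history.items()
}

# 86 C is perfectly normal for press-02 but an anomaly for press-01:
# a single fleet-wide model would have to blur that distinction.
for asset_id, model in models.items():
    print(asset_id, "anomaly" if model.predict([[86.0]])[0] == -1 else "normal")
```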

But aren’t there standard approaches or scripts to automate predictive modeling?

Yes, there are. However, plain-vanilla automation of the modeling process, which simply runs all permutations of algorithms and hyper-parameters, again doesn’t work. The number of assets (and therefore of individual models), the frequency at which models must be updated to capture new real-world events, the volume of data, and the wide variety of sensor attributes all create prohibitive computational complexity (think millions or billions of permutations), even with infinite infrastructure to process them. The only solution is Cognitive Automation, an intelligent process that mimics how a human data scientist leverages prior experience to run fewer experiments and reach an optimal ensemble of models in the fastest possible way. In short, this is about teaching machines to do machine learning and data science, like an A.I. Data Scientist.
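
DataRPM’s Cognitive Automation itself is proprietary, but the general idea of using prior experience to run fewer experiments can be sketched roughly as follows (all configurations, scores, and data here are illustrative): configurations that worked on previously modelled assets are tried first, and the search stops early once a model is good enough.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.model_selection import ParameterSampler

# Toy stand-in for "prior experience": configurations that scored well on
# previously modelled assets get tried first on the next asset.
PRIOR_GOOD_CONFIGS = [
    {"n_estimators": 100, "max_samples": 256},
    {"n_estimators": 200, "max_samples": 128},
]
SEARCH_SPACE = {"n_estimators": [50, 100, 200, 400], "max_samples": [64, 128, 256]}

def score(model, train, validation_normal):
    """Higher is better: fraction of held-out normal data the model accepts."""
    model.fit(train)
    return (model.predict(validation_normal) == 1).mean()

def auto_model(train, validation_normal, target=0.98, budget=6, seed=0):
    candidates = PRIOR_GOOD_CONFIGS + list(
        ParameterSampler(SEARCH_SPACE, n_iter=budget, random_state=seed))
    best = None
    for params in candidates[:budget]:
        model = IsolationForest(random_state=seed, **params)
        s = score(model, train, validation_normal)
        if best is None or s > best[0]:
            best = (s, model)
        if s >= target:   # good enough: stop early instead of exhausting the grid
            break
    return best

rng = np.random.default_rng(1)
data = rng.normal(70, 1, size=(600, 1))
print(auto_model(data[:500], data[500:])[0])
```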

This is the technology required to give the Digital Twin a true life-form that delivers the end business value – in this case, preventing recalls.

Does it sound like sci-fi?

It isn’t, and it is already happening with the advances in machine learning and artificial intelligence. Companies like Google are using algorithms to create self-driving cars and beat world champions in complex games. At the same time, we at DataRPM are using algorithms to teach machines to do data analysis and detect asset failures and quality issues on the assembly line. This dramatically improves operational efficiency and prevents product recalls.

The future, where the dreaded product recalls will be a thing of the past, is almost here!

By Ruban Phukan, Co-Founder and Chief Product & Analytics Officer, DataRPM

Microsoft Dynamics CRM Online Selected By HP To Transform Sales And Partner Engagement

REDMOND, Wash. — Sept. 12, 2016 — Microsoft Corp. has entered a six-year agreement with HP Inc. to deploy Microsoft Dynamics to thousands of employees across HP, dramatically enhancing collaboration across marketing, sales and service operations. With Dynamics, as well as Azure, Office 365 and other Microsoft Cloud solutions, HP has invested in the sales and service collaboration platform it needs to deliver a seamless sales experience for customers and partners while increasing the company’s performance and economies.

“We have chosen Microsoft Dynamics as our CRM solution for our direct selling, partners and services,” said Jon Flaxman, chief operating officer, HP. “This brings us a cloud-based solution that delivers a more effective and efficient collaboration engine across our business.”

HP is undergoing a journey to transform its sales and partner environment, driving increased productivity and collaboration in a virtually all-digital world. As part of this transformation, the company is moving to a more integrated sales experience for both HP sales reps and the channel partner community.

Complementing Dynamics CRM, Office 365 provides worldwide sales, service and marketing professionals at HP with an immersive, connected productivity experience for teamwork and collaboration. In addition, Power BI will empower HP marketers to uncover powerful business insights and predictions. Azure will provide the IT organization with a global, open, hybrid cloud for all of the solutions, while also giving HP a platform for new capabilities and services at a low total cost of ownership.

“HP continues to innovate in its customer engagement, with the tools and business processes it provides to its employees and partner community and, of course, the products and services it delivers,” said Judson Althoff, executive vice president of Worldwide Commercial Business at Microsoft. “We share this dedication to digital transformation with HP and are incredibly proud to work with it as it delivers amazing technology experiences to people around the globe.”

Read more at: Microsoft News

The History of Containers and Rise of Docker

Containers 101

Docker started out as a means of creating single-application containers, but it has since grown into a widely used dev tool and runtime environment. It has been downloaded around two billion times, and RedMonk has said that “we have never seen a technology become ubiquitous so quickly.” The Docker registry stores container images and provides a central point of access that can be used to share containers. Users can either place images into the registry or obtain images from it and deploy directly from the registry. Despite its widespread growth and acceptance, Docker still retains its free open source roots and hosts a free public registry for containers from which anyone can obtain official Docker images. Below is an infographic, discovered via Twistlock, which gives a really nice overview of container technologies.
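
Alongside that overview, the pull/run/share workflow described above can also be scripted. The sketch below assumes a locally running Docker Engine and the official Docker SDK for Python (`pip install docker`); the private registry name is a placeholder.

```python
import docker

# Talk to the local Docker Engine (assumes it is running and accessible).
client = docker.from_env()

# Pull an official image from the public Docker Hub registry...
image = client.images.pull("alpine", tag="3.19")

# ...run it as a throwaway container and capture its output...
output = client.containers.run(image, ["echo", "hello from a container"], remove=True)
print(output.decode().strip())

# ...and (optionally) retag and push it to your own registry to share it.
image.tag("registry.example.com/myteam/alpine", tag="3.19")   # placeholder registry
# client.images.push("registry.example.com/myteam/alpine", tag="3.19")  # requires login
```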


By Jonquil McDaniel

Write Once, Run Anywhere: The IoT Machine Learning Shift From Proprietary Technology To Data

The IoT Machine Learning Shift

While early artificial intelligence (AI) programs were one-trick ponies, typically able to excel at only one task, today it’s about becoming a jack of all trades. Or at least, that’s the intention. The goal is to write one program that can solve multi-variant problems without the need to be rewritten when conditions change—write once, run anywhere. Digital heavyweights—notably Amazon, Google, IBM, and Microsoft—are now open sourcing their machine learning (ML) libraries in pursuit of that goal as competitive pressures shift focus from proprietary technologies to proprietary data for differentiation.

Machine learning is the study of algorithms that learn from examples and experience, rather than relying on hard-coded rules that do not always adapt well to real-world environments. ABI Research forecasts ML-based IoT analytics revenues will grow from $2 billion in 2016 to more than $19 billion in 2021, with more than 90% of 2021 revenue attributed to more advanced analytics phases. Yet while ML is an intuitive and organic approach to what was once a very rudimentary and primal way of analyzing data, it is worth noting that the ML/AI model creation process itself can be very complex.


The techniques used to develop machine learning algorithms fall under two umbrellas:

  • How they learn: based on the type of input data provided to the algorithm (supervised learning, unsupervised learning, reinforcement learning, and semi-supervised learning)

  • How they work: based on type of operation, task, or problem performed on I/O data (classification, regression, clustering, anomaly detection, and recommendation engines)

Once the basic principles are established, a classifier can be trained to automate the creation of rules for a model. The challenge lies in learning and implementing the complex algorithms required to build these ML models, which can be costly, difficult, and time-consuming.
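
As a small, concrete example of the supervised case described above, the sketch below (assuming scikit-learn is installed) trains a classifier from labelled examples instead of hand-coded rules and then applies it to unseen data.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Supervised learning: the algorithm sees labelled examples (features + class)
# and learns the decision rules itself instead of having them hard-coded.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# The same fitted model then classifies new, unseen measurements.
print(clf.predict(X_test[:3]))
```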

Engaging the open-source community adds an order of magnitude more capacity to the development and integration of machine learning technologies without the need to expose proprietary data, a trend that Amazon, Google, IBM, and Microsoft swiftly pioneered.

At more than $1 trillion, these four companies have a combined market cap that dwarfs the annual gross domestic product of more than 90% of countries in the world. Each also open sourced its own deep learning library in the past 12 to 18 months: Amazon’s Deep Scalable Sparse Tensor Network Engine (DSSTNE; pronounced “destiny”), Google’s TensorFlow, IBM’s SystemML, and Microsoft’s Computational Network Toolkit (CNTK). And others are quickly following suit, including Baidu, Facebook, and OpenAI.

But this is just the beginning. To take the most advanced ML models used in IoT to the next level (artificial intelligence), modeling and neural network toolsets (e.g., syntactic parsers) must improve. Open sourcing such toolsets is again a viable option, and Google is taking the lead by open sourcing its neural network framework SyntaxNet, driving the next evolution in IoT from advanced analytics to smart, autonomous machines.

But should others continue to jump on this bandwagon and attempt to shift away from proprietary technology and toward proprietary data? Not all companies own the kind of data that Google collects through Android or Search, or that IBM picked up with its acquisition of The Weather Company’s B2B, mobile, and cloud-based web properties. Fortunately, a proprietary data strategy is not the panacea for competitive advantage in data and analytics. As more devices get connected, technology will play an increasingly important role in balancing insight generation from previously untapped datasets with the capacity to derive value from the highly variable, high-volume data that comes with these new endpoints—at cloud scale, with zero manual tuning.



Collaborative economics is an important component in the analytics product and service strategies of these four leading digital companies all seeking to build a greater presence in IoT and more broadly the convergence of the digital and the physical. But “collaboration” should be placed in context. Once one company open-sourced its ML libraries, other companies were forced to release theirs as well. Millions of developers are far more powerful than a few thousand in-house employees. As well, open sourcing offers these companies tremendous benefits because they can use the new tools to enhance their own operations. For example, Baidu’s Paddle ML software is being used in 30 different online and offline Baidu businesses ranging from health to financial services.

And there are other areas for these companies to invest resources that go beyond the analytics toolsets. Identity management services, data exchange services and data chain of custody are three key areas that will be critical in the growth of IoT and the digital/physical convergence. Pursuing ownership or proprietary access to important data has its appeal. But the new opportunities in the IoT landscape will rely on great technology and the scale these companies possess for a connected world that will in the decades to come reach hundreds of billions of endpoints.

By Ryan Martin and Dan Shey

Ryan Martin, Senior Analyst at ABI Research, covers new and emerging mobile technologies, including wearable tech, connected cars, big data analytics, and the Internet of Things (IoT) / Internet of Everything (IoE). 

Ryan holds degrees in economics and political science, with an additional concentration in global studies, from the University of Vermont and an M.B.A. from the University of New Hampshire.

Why Do Television Companies Need A Digital Transformation

Cloud TV

Over just a few years, the world of television production, distribution, and consumption has changed dramatically. In the past, with only a few channels to choose from, viewers watched news and entertainment television at specific times of the day or night. They were also limited by where and how they could watch. Options included staying home, going to a friend’s house, or perhaps going to a restaurant or bar to watch a special game, show, news story, or event. The TV industry has only just completed the move from standard definition to high definition, and the discussion has already shifted to the 4K and 8K video standards. But before any of that can happen, analog broadcasting needs to be transformed digitally. That means the TV industry unavoidably needs a disruptive transformation of its ICT platform to cope with new processes of acquisition, production, distribution, and consumption.


Fast-forward to today, and you have a very different scenario. Thanks to the rise of the Internet – and, in particular, mobile technology – people have nearly limitless options for their news and entertainment sources. Not only that, but they can choose to get their news and other media on TV or on a variety of smart devices, including phones, tablets, smart watches, and more.

Improved Business Value From New Information and Communication Technologies (ICT)

The world has changed, and continues to change, at a rapid pace. This change has introduced a number of challenges to businesses in the television industry. Making the digital media transformation can do a number of things to resolve these challenges and improve your business and viewership.

With leading new ICT, you can see significant business value and improved marketing and production strategies. For example, making this transformation can vastly improve your television station’s information production and service capabilities. It can also smooth the processes involved in improving broadcasting coverage and performance.

With these improvements, your station will have faster response times when handling time-sensitive broadcasts. This delivers to your audience the up-to-the-minute coverage and updates they want across different TV and media devices and platforms.

Improved Social Value with New ICT

A television station that refuses to change and evolve with viewers’ continuously evolving needs and wants will find itself falling behind competitors. However, a TV station that understands the necessity of making the digital media transformation will enjoy significantly improved social value with its audiences.


Television stations that embrace new technology, digital media, storage, cloud computing and sharing will see massive improvements in social value. Consider that this transformation enables your station to produce timely and accurate reports faster, giving your audience the freshest information and entertainment.

By bringing news and entertainment media to your audience when, where and how they want and need it, you can enrich their lives and promote a culture of information sharing that will also serve to improve your ratings and business. With technologies like cloud-based high-definition video production and cloud-based storage and sharing architectures, you can eliminate many of the challenges and pain points associated with reporting news and bringing TV entertainment to a large audience.

Why Do Television, Media, and Entertainment Companies Need a Digital Transformation?

Consider the basic steps that a TV news station must take to get the news to their audience:

  • Acquisition
  • Production
  • Distribution
  • Consumption

For television stations that have not yet embraced a digital media transformation, these steps do not just represent the process of delivering news media to the public. They also represent a series of pain points that can halt progress and delay deadlines. These include:

  • Traditional AV matrices use numerous cables, are limited by short transmission distance for HD signals and require complicated maintenance, slowing down 4K video evolution.
  • Delays when attempting to transmit large video files from remote locations back to the television station.
  • Delays when reporters edit videos, because office and production networks in TV stations are separated from each other, requiring staff to move back and forth between the production zone and the office zone in their building to do research.
  • Delays due to the time it takes to transmit a finished program (between six and twenty-four minutes, depending on the length and whether or not it is a high-definition video) to the audience.
  • 4K video production has much higher requirements on bandwidth and frame rates.

These challenges all occur in traditional structures and architectures for media handling, but they quickly dissolve when a TV station makes the digital transformation and begins using a cloud-based architecture with new ICT.

Keeping Up With Viewer Demand via Ultra High Definition (UHD) Omnimedia

Increasingly, viewers demand more individualized experiences: interactive programming, rich media, and UHD video, delivered across all applicable devices. Delivering UHD omnimedia is only possible through new ICT, as older IT infrastructures simply cannot scale to the levels necessary to keep up with viewer demands.

Fortunately, through cloud-based architectures and faster sharing, networks and stations may not only keep up with consumer demand but actually surpass it. For example, when using 4K formatting, your station can provide viewers with the highest resolution possible (4096 x 2160 pixels), and your video formatting will be easily scalable for different platforms for the most convenient viewing possible.
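
As a small illustration of that scalability point, here is a sketch that derives downscaled renditions of a 4K (4096 x 2160) master for different platforms while preserving the aspect ratio; the target widths are illustrative, not a broadcast standard.

```python
MASTER_W, MASTER_H = 4096, 2160   # 4K master resolution cited above

# Illustrative target widths for other delivery platforms.
TARGET_WIDTHS = {"broadcast_hd": 1920, "tablet": 1280, "phone": 854}

def rendition(width: int):
    """Scale the 4K master down to `width`, preserving aspect ratio (even height)."""
    height = round(MASTER_H * width / MASTER_W)
    return width, height - (height % 2)   # keep height even for video encoders

for platform, w in TARGET_WIDTHS.items():
    print(platform, rendition(w))
```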

Furthermore, by becoming an omnimedia center, your station can enjoy the benefits of converged communications. Essentially, this means that you will be creating information and/or entertainment that can be used in multiple different ways for television, social media, news sites, etc., giving you more coverage and exposure than ever before.

What Is Required to Make the Transformation to Digital Media?

Cloud computing and embracing 4K for video formatting are both essential to digital media transformation, but they are not all that is necessary. Aside from these two elements, television stations can take advantage of advances in technology in a number of ways to improve their marketing and production strategies through the use of new ICTs.

For example, thin clients and cloud computing can enable video editing anywhere and anytime, increasing efficiency. To reduce latency between the thin clients and the cloud, today’s new ICT architectures combine enhanced display protocols with virtual machine and GPU virtualization technology, enabling smooth, audio/video-synchronized editing of 8-track HD video, or even 6-track 4K video editing on thin clients backed by an IP storage system.

As mentioned earlier, through cloud computing, it is no longer necessary to physically transport video from a news site to the station. Likewise, it is no longer necessary to do all production work and research in separate areas. Thanks to cloud storage and sharing, these pain points can easily be eliminated, as sharing and sending information becomes much simpler and faster.

An all-IP based video injection process is a must if TV stations want to lower network complexity and simplify system maintenance. There are two ways to approach this:

  1. For example, IP cables can replace traditional SDI signals. Each cable transmits 1 channel of 4K video signal. (SDI requires 4 cables to transmit the same video.) Thus, using IP cables can reduce the number of necessary cables by up to 92%, improving O&M efficiency by 60%, and bringing convenience to system interworking and interaction.
  2. With the help of mobile broadband, WAN accelerated networks, smart phones or tablets, journalists in the field can now shorten the video submission process by 90%. Most importantly, cloud computing allows journalists to edit video anywhere and anytime. With the help of fast trans-coding resources in the cloud, real time video reporting is now possible.

Another major factor in any digital media transformation is big data and data analytics. By collecting and analyzing information on your station’s viewers, you can create more personalized viewing experiences. Netflix offers perhaps one of the best-known examples of this: it has built algorithms based on previous customer behavior to predict whether a viewer will enjoy a certain film or show, and which media to recommend to each viewer.
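
Netflix’s actual algorithms are proprietary, but the underlying idea of predicting taste from prior viewing behavior can be shown with a toy item-similarity recommender over a made-up watch-history matrix:

```python
import numpy as np

# Rows = viewers, columns = shows; 1 means the viewer watched it. All data is made up.
shows = ["news", "drama", "sci-fi", "cooking"]
history = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 0],
])

# Cosine similarity between shows, based purely on who watched what.
norms = np.linalg.norm(history, axis=0)
similarity = (history.T @ history) / np.outer(norms, norms)

def recommend(viewer: int) -> str:
    """Suggest the unwatched show most similar to what this viewer already watched."""
    scores = similarity @ history[viewer]
    scores[history[viewer] == 1] = -1.0   # never recommend something already seen
    return shows[int(np.argmax(scores))]

print(recommend(0))   # viewer 0 watched news + drama -> likely sci-fi
```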


Through these and other information and communication technologies, such as the Internet of Things (IoT), SDN (software-defined networking), improved mobile broadband, etc., television stations can bring faster, more accurate, and more convenient news and entertainment to their customers and viewers.

Who Is Leading the Way in the Transformation?

In my opinion, the company that can deliver agile innovation across cloud, pipe, and device collaboration will lead the way in this transformation. One such company, Huawei in China, is now trying to create an ecosystem for global channel and solution partners across the news and media entertainment industry, providing an open ICT platform that encourages media industry developers to keep innovating. With strong development in cloud-based architectures, SDN, mobile broadband, and IoT, developers and partners are able to create comprehensive solutions that empower media stations of all kinds to move into the future.

What do you think of the digital media transformation in the Television Industry?

(Originally published September 7th, 2016)

By Ronald van Loon
