Category Archives: Technology

Shaking Up The Cloud Technology Marketplace

Cloud Technology Marketplace

Cloud continues its devastating rearrangement of the technology marketplace. As legacy vendors struggle to compete, many deck chairs are getting moved about – some pretty spectacularly. In the meantime, the boat is still sinking.

We have seen how SaaS (Software as a Service) is tearing up traditional software firms, which must adapt or perish. Computer hardware companies are also in real peril. When customers move to public cloud IaaS (Infrastructure as a Service) they no longer buy hardware. Worse yet, the cloud service providers don’t buy it from them either. They have equipment built for them, to their specifications, right down to the chip sets.

Likewise, one by one, the big IT outsourcing houses built on legacy hardware and software are winking out, too. They used to be giants striding across the global landscape. Now, they only get mentioned in news releases with so much spin on them it’s embarrassing: EDS, ACS, Perot Systems, and of course – IBM.


Everywhere you look you see the old ITO (Information Technology Outsourcing) model that sold the concept of “your mess for less” disappearing. They offered beleaguered CXOs relief from the worry and hassle of running IT with the promise of reduced IT expenses. The outsourcers could do this through scale in operations and purchasing, plus squeezing labor costs through juniorization and offshore labor arbitrage.

And they were valuable, or at least seemed to be at the time. HP bought EDS in 2008 for almost $13.9 billion. Xerox acquired ACS in 2009 for $6.4 billion. Also in 2009, Dell acquired Perot for $3.9 billion. In 2008, IBM had a record year, and its Strategic Outsourcing signings were up 20% worldwide and 44% in North America. There didn’t seem to be any clouds on the horizon (pun intended). After all, Amazon Web Services was just two years old in 2008 and just a blip.

Fast forward to today. Oh, and you might need a scorecard to keep track of the dizzying rearrangement of deck chairs. Just the other day, Hewlett Packard Enterprise – the technology half of the old, venerable HP – announced that it was spinning off its troubled services business, which includes what is left of EDS, to CSC. Ironically, HP and CSC are calling the resulting entity “Spinco” for now. (I swear I did not make that up!) Of course, it is a wonderful win-win for both sides and reflects great management insight. Well, at least they got a bump in their stock prices.

HP itself had split into two companies just the year before. (Pre-split HP had written down the value of EDS by $8 billion just four years after purchasing it.) And CSC also split itself into two companies during the same period. Seems pretty straightforward, right? But, as they say, wait – there is more! Xerox sold ACS to Atos, a French company, for a little over $1 billion. Recall that Xerox paid $6.4 billion for ACS. And a few months after the sale was revealed, Xerox announced that it too was splitting into two companies. Let’s summarize: HP, Xerox and CSC each split themselves into two parts and then shuffled some of the pieces among themselves or sold them.

Dell, which went private in 2013 and agreed in 2015 to merge with the legacy storage provider EMC (more deck chairs), recently said it would sell Perot Systems to NTT. At least it appears Dell only lost $800 million on that one – still, that is 20%. And IBM, what can we say about IBM? Big Blue isn’t so big anymore. It recently posted a 14-year low in quarterly sales. Besides nobody wanting its hardware or software anymore, a key contributor to this decline is Global Services (which includes outsourcing), where the backlog is shrinking (translation: no new sales).

How did this meltdown take place in such a short time? Cloud is a dagger to the heart of the ITO value proposition. Remember, this was: reduce management hassle and save money through scale and cost cutting. But you still got your mess, just for less. Cloud brings a scale and efficiency to operations that not even the largest outsourcers can match, because its operating model is so different.


Cloud assumes machines will fail, and therefore they should be cheap, run very efficiently and use very little labor. Management hassle is non-existent. Software manages the machines, and applications automatically move to another healthy machine if one fails. There are no repairs; failed machines are simply thrown out. All of this means there are very few people. All the labor arbitrage in the world does not help you when the competition is at, or near, zero.
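To make the contrast concrete, here is a minimal sketch, in Python, of the operating model just described: software notices a dead machine and moves its applications to a healthy one, with no human in the loop. Every name and the simulated failure rate here are illustrative, not any particular provider’s system.

```python
import random

# Three cheap machines and the apps currently running on them.
machines = {"m1": ["app-a", "app-b"], "m2": ["app-c"], "m3": []}

def is_healthy(machine_id: str) -> bool:
    """Stand-in for a real health probe (heartbeat, HTTP check, etc.)."""
    return random.random() > 0.1  # ~10% simulated failure rate

def reschedule(failed: str) -> None:
    """Move every workload off a dead machine; it is discarded, not repaired."""
    orphans = machines.pop(failed)
    if not machines:
        raise RuntimeError("no capacity left")
    for app in orphans:
        target = min(machines, key=lambda m: len(machines[m]))  # least-loaded survivor
        machines[target].append(app)
        print(f"moved {app}: {failed} -> {target}")

# The control loop: software, not people, handles failure.
for machine_id in list(machines):
    if not is_healthy(machine_id):
        reschedule(machine_id)
```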

But where cloud’s value really kills the legacy outsourcing model is in its flexibility and agility. Remember, in the legacy model it’s still your mess. All the laments about IT departments being too slow are still true with ITO. With cloud there is no wait to order and install machines. They are already there, waiting for someone to use them. The beauty of it is you don’t pay for them until you actually use them, and you stop paying when you shut them off. That kind of flexibility means you can be faster to market with new products, functions and features. Cloud does everything ITO tried to do and then goes one better.

Software, hardware and outsourcing are all being beaten up pretty badly – what do you think will be next? It seems obvious that cloud would be doing this in the high-tech arena, but the impact of cloud is far more wide-reaching than that. Look for restructuring and M&A activity; often there are cloud-powered invaders shaking things up. Your industry isn’t rearranging deck chairs, is it?

Originally published on July 14th, 2016.  You can periodically see John’s insights show up here on CloudTweaks through our syndication program.

By John Pientka

Cloud-based GRC Intelligence Supports Better Business Performance

Cloud-based GRC Intelligence

All businesses need a strategy and processes for governance, risk and compliance (GRC). Many still view GRC activity as a burdensome ‘must-do,’ approaching it reactively and managing it with non-specialized tools. GRC is a necessary business endeavor but it can be elevated from a cost drain to a value-add activity. By integrating GRC holistically throughout the organization, and by minimizing manual and duplicative processes through the use of cloud-based tools, firms can benefit from actionable business intelligence and support rapid and informed decision-making.


Companies are questioning whether their GRC can be managed more efficiently, because the number and range of regulatory mandates and risk concerns demanding action keeps growing. As problems arise and multiple regulatory and compliance issues need to be addressed, manual approaches to the day-to-day practice of GRC can swamp organizations through the constant need to monitor information sources for changes. Too often, the planning and implementation of changes takes place only as the demands arise.

Where GRC management and implementation occur within business silos, there is duplication of effort and inefficient cost control. There is also limited capability for knowledge sharing and learning across the organization.

Businesses need to tap into a staggering number of information sources to stay alert to changes in regulatory and legal requirements. If this is a manual activity, it’s a considerable burden. A MetricStream survey of 123 compliance professionals late last year looked at regulatory change management alone and found that it is indeed a resource-hungry endeavor for businesses of all sizes.

Manual mind-set

Over 45 percent of small businesses reported devoting one or two employees to the activity, a third of mid-sized businesses said they devote between three and ten, and over 30 percent of large businesses said they have 21 or more.



That’s a lot of resource to tie up on regulatory activity, and a great deal of that resource’s time is spent on manual tasks. Over 50 percent of respondents said monitoring regulatory intelligence sources for a new regulation, or changes to existing regulations, is part of their role. The three principal information sources are regulatory agency filings and releases, industry and trade associations, and trade industry publications.

Cloud-based technology enables businesses to tap into collated GRC information sources. These can supply multiple information needs in one place and provide updates on regulatory compliance, risk, vendor due diligence and IT risk and compliance. With information brought together in this way, businesses can save valuable time and resource that would be spent on manual searches.

By integrating GRC knowledge with business systems and operations the whole activity can become even more streamlined and effective. The content picked up from the intelligence portal can be streamed automatically as alerts or email notifications.

From these automated alerts, businesses can quickly identify where action needs to be taken, notify relevant departments and individuals, and address process and system workflows. Not only does this proactive approach to GRC management save cost and time on manual information scanning, it also helps businesses stay ahead of changes in regulation and compliance. This knowledge arrives in almost real time, and the action to preserve corporate compliance is quicker and more efficient.
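As a rough illustration of that automation, here is a minimal Python sketch that assumes the intelligence portal publishes an RSS/Atom feed; the feed URL, keywords and inboxes below are hypothetical, and a real deployment would use the vendor’s own notification API.

```python
import feedparser  # pip install feedparser

FEED_URL = "https://example.com/grc-intelligence/feed"  # hypothetical portal feed

# Which inbox cares about which topic (illustrative routing rules).
ROUTING = {
    "privacy": "legal@example.com",
    "capital": "finance@example.com",
    "safety": "operations@example.com",
}

seen = set()  # in production this would be persisted between runs

def poll_once() -> None:
    """Fetch the feed and route any unseen regulatory item to the right team."""
    for entry in feedparser.parse(FEED_URL).entries:
        uid = entry.get("id") or entry.get("link")
        if uid in seen:
            continue
        seen.add(uid)
        title = entry.get("title", "")
        for keyword, inbox in ROUTING.items():
            if keyword in title.lower():
                # Stand-in for the portal's alert/email notification call.
                print(f"ALERT -> {inbox}: {title}")

poll_once()  # in practice, run on a schedule (e.g., every 15 minutes)
```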

Integration and automation

In the same MetricStream survey, nearly half (48 percent) of compliance respondents advised they still use office productivity software like spreadsheets to track regulatory changes. These traditional methods fail to address large and small enterprises’ need for compliance issues to be addressed at scale. However, cloud-based GRC intelligence can integrate with GRC systems to seamlessly update multiple policies, procedures and controls. Data sets can be applied across applications, including enterprise risk management (ERM) and compliance systems. The trained risk and compliance personnel who have spent so much of their time market-watching can turn to more productive endeavors such as analysis, forecasting and implementation.

To a great extent, this takes a change of mind-set. It’s a cultural shift to stop viewing GRC as a cumbersome burden, and to instead incorporate it into the fabric of the organization. By adopting such an approach though, and by embracing automation to manage GRC, companies can derive real business value from taking a proactive approach.

GRC intelligence as a governance layer in the cloud can enhance business operations. Rapid change identification and ongoing analysis of GRC data yield quicker and better decision-making. Integrated and automated GRC management puts companies in a better position to protect themselves from contraventions and to perform well. By identifying risk patterns and trends, and using these in business planning, the organization can improve its change response.

Fully rounded visibility into risk and compliance demands and GRC activities across the business is a step toward viewing GRC holistically instead of in business silos. With GRC intelligence, businesses can make more informed decisions, and that means reduced risk and better business performance for competitive advantage.

By Vidya Phalke

Amazon Web Services (AWS), Data and Medical Breakthroughs

Data and Medical Breakthroughs

Amazon Web Services (AWS), in conjunction with Intel, is bringing the power of the cloud to the doctor’s office. In a newly released e-book entitled “Personalized Medicine and Genomic Research: Profiles in Cloud-Enabled Scientific Discovery”, AWS profiles four organizations in the medical field at the forefront of cloud-based bio-information systems. Advancements in cloud computing (see “Curing Cancer with Big Data”) allow medical professionals to take advantage of whole genome sequencing, which benefits both patient care and experience. Medical data can be stored more securely and cheaply, collaboration with other doctors is instant and seamless, genetic test results take hours instead of days, and treatment plans tailored to your specific genome become available.


Researchers at GenomeNext can complete entire genomic sequences in a matter of hours using AWS cloud technology. The National Institutes of Health’s Human Microbiome Project is using it to take a closer look at the ecosystem of microbes alive in our bodies. The Inova Translational Medicine Institute uses the technology to track billions of genetic variants in one of the world’s largest genetic databases. The University of California San Diego’s Center for Computational Biology & Informatics analyzes these vast data sets on the cloud to preemptively identify disease-causing genetic patterns.

The e-book, available for free from Amazon, highlights how these organizations are utilizing the power of the cloud to fight cancer, reduce preterm births, and detect and treat congenital disorders in newborns. Cloud-based data sets are even helping detect genetic diseases like Spinal Muscular Atrophy earlier, and they provide hope for cures in the future.

By Thomas Dougherty

Are CEOs Missing Out On Big Data’s Big Picture?

Big Data’s Big Picture

Big data allows marketing and production strategists to see where their efforts are succeeding and where they need some work. With big data analytics, every move you make for your company can be backed by data and analytics. While every business venture involves some level of risk, big data shrinks that risk considerably, thanks to information and insights on market trends, customer behaviour, and more.

Unfortunately, however, many CEOs seem to think that big data is available to all of their employees as soon as it’s available to them. In one survey, nearly half of all CEOs polled thought that this information was disseminated quickly and that all of their employees had the information they needed to do their jobs. In the same survey, just over a quarter of employees agreed.

Great Leadership Drives Big Data

In entirely too many cases, CEOs look at big data as something that spreads in real-time and that will just magically get to everyone who needs it in their companies. That’s not the case, though. Not all employees have access to the same data collection and analytics tools, and without the right data analysis and data science, all of that data does little to help anyone anyway.

In the same study, 63% of businesses with high-performing, data-driven marketing strategies had initiatives launched by their own corporate leaders, and over 40% of those companies also had centralized departments for data and analytics. The corporate leadership in these businesses understood that simply introducing a new tool to their companies’ marketing teams wouldn’t do much for them. They also needed to implement the leadership and structure necessary to make those tools effective.



Great leaders see big data for what it is – a tool. If they do not already have a digital strategy – including digital marketing and production teams, as well as a full team for data collection, analytics, data science, and information distribution – then they make the moves to put the right people in the right places with the best tools for the job.

Vision, Data-Driven Strategy, and Leadership Must Fit Together

CEOs should see vision, data-driven strategy, and leadership as the three legs of a stool: without any one of them, the stool falls over. To succeed, a company needs a strong corporate vision, and the corporate leadership must keep this vision in mind at all times when making changes to strategy, implementing new tools and technology, and approaching big data analytics.

At the same time, marketing and production strategies must be data-driven, and that means that the employees who create and apply these strategies must have full access to all of the findings of the data collection and analysis team. They must be able to make their strategic decisions based directly on collected data on the market, customer behaviour, and other factors.

To do all this, leadership has to be in place to organize all of these strategic initiatives and to ensure that all employees have everything they need to do their jobs and move new strategies forward.

Have you implemented a digital strategy for your business? What’s changed since you’ve embraced your strategy, and what are your recommendations for strategy and data-driven technology for business owners and executives like yourself?

Let us know what you think and how you’ve used your digital strategy to set your business apart from the competition.

(Originally published July 12th, 2016)

By Ronald van Loon

Addressing Security, Quality and Governance of Big Data

Addressing Data Quality

Article sponsored by SAS Software and Big Data Forum

Big Data is quickly being recognized as a valuable influencer of business strategy, able to improve productivity, streamline business processes, and reduce costs. However, not all data holds the same value and organizations need to take care to address the quality of the data they’re exploiting, while carefully managing security and governance concerns.

Data Quality

Blindly trusting business reports to be based on sound, quality information can lead not only to embarrassment but also to business decline should the foundational data be found lacking. For this reason, ensuring that the data your organization employs in its analytics and reporting is both relevant and of high quality is of the utmost importance. While only using high-quality data is a sound principle, it is more easily said than done. Understanding where data originated, who has control of and accountability for it, and precisely what your organization’s data quality standards should be are significant tasks that must be undertaken. Moreover, while software exists to help with data correction and error analysis, such tools only address part of the problem. To best meet the challenge of ensuring high-quality data, businesses need a plan that helps identify quality issues, chase down their causes, and create resolution strategies.
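To make one piece of such a plan concrete, here is a minimal Python sketch of rule-based quality screening that flags suspect records for a data steward instead of silently “correcting” them; the fields and rules are illustrative only.

```python
# Illustrative records; in practice these come from source systems.
records = [
    {"customer_id": "C001", "email": "ann@example.com", "revenue": 1200.0},
    {"customer_id": "C001", "email": "ann@example.com", "revenue": 1200.0},  # duplicate
    {"customer_id": "C002", "email": "", "revenue": -50.0},  # two rule violations
]

def quality_issues(rec: dict) -> list:
    """Apply simple business rules; real plans document and govern these."""
    issues = []
    if not rec["email"]:
        issues.append("missing email")
    if rec["revenue"] < 0:
        issues.append("negative revenue")
    return issues

seen_ids = set()
for rec in records:
    problems = quality_issues(rec)
    if rec["customer_id"] in seen_ids:
        problems.append("duplicate customer_id")
    seen_ids.add(rec["customer_id"])
    if problems:
        # Flag for a data steward rather than auto-"fixing" silently.
        print(rec["customer_id"], "->", ", ".join(problems))
```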


Carol Newcomb, Senior Data Management Consultant at SAS, suggests a sustainable plan for managing data quality, warning that the process is not likely to be simple: it includes many steps, such as implementing rules for collecting or creating data, rules for data standardization and summarization, rules for integrating data with other sources, and hierarchy management.

Newcomb asserts that an effective, sustainable data quality plan should include the following five elements:

  • Elevate the visibility and importance of data quality.
  • Formalize decision making through a data governance program.
  • Document the data quality issues, business rules, standards and policies for data correction.
  • Clarify accountability for data quality.
  • Applaud your successes.

Considering reports from Gartner analysts that by 2017, one-third of the world’s largest companies will experience information crises due to an inability to adequately value, govern and trust their enterprise information, it’s crucial that businesses put data quality programs in place.

Data Regulations, Security & Privacy

The ITRC’s latest Data Breach Report points to 500 data breaches in the first half of this year, with more than 12 million records compromised. Such breaches expose Social Security numbers, medical records, financial information and more, putting individuals at risk. It’s no wonder privacy is such a concern as the mountains of data increase day by day. Though the ability to track customers and predict future behaviors is of great benefit to many companies, it’s a trade-off for most consumers, as privacy is eclipsed by the convenience and assistance that data analysis and prediction provide. And although organizations such as Google promise data anonymization, it’s impossible to know how carefully our privacy is guarded, while the increasing popularity of the Internet of Things means our engagement with the internet and data collection is only escalating.



This extensive data does offer many worthwhile benefits, not least patient monitoring for superior and prompt medical care, and real-time data use in classrooms to ensure education systems are functioning efficiently. And so perhaps data governance and regulation can help mitigate the risks without abandoning the rewards. As Scott Koegler of Big Data Forum states, “Pulling actionable but protected data is key to developing advanced business practices while protecting customer identity.” A ‘Zero Trust Data’ policy is a potential solution, wherein access to data is denied by default and always requires verification. In this way, data is made available to those legitimately requesting it, but information is correctly de-identified.
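As a rough sketch of how a ‘Zero Trust Data’ policy might look in code (the policy table and field names are illustrative, not any specific product): access is denied unless an explicit grant exists, and granted responses are de-identified before they leave the data store.

```python
# Explicit grants; anything not listed is denied by default.
GRANTS = {("analyst", "claims"): "deidentified"}

def fetch(role: str, dataset: str, rows: list) -> list:
    """Verify every request; de-identify before the data leaves the store."""
    mode = GRANTS.get((role, dataset))  # deny by default
    if mode is None:
        raise PermissionError(f"{role} has no grant for {dataset}")
    if mode == "deidentified":
        # Strip direct identifiers from each row before release.
        return [{k: v for k, v in r.items() if k not in ("name", "ssn")} for r in rows]
    return rows

rows = [{"name": "Ann", "ssn": "123-45-6789", "diagnosis": "J45", "zip3": "981"}]
print(fetch("analyst", "claims", rows))  # identifiers removed
# fetch("intern", "claims", rows)        # would raise PermissionError
```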

By designing efficient data management policies and properly adhering to regulations, companies are able to reap the benefits of Big Data without infringing on the privacy and security of individuals’ data. Furthermore, applying quality control systems to data collection and management enhances the value businesses draw from their data without needlessly breaching confidentiality. Ultimately, the time expended in data security, quality, and governance benefits the businesses who use it as well as the individuals and communities to whom it relates.

By Jennifer Klostermann

IoTT, The Internet of Things, Tomorrow

What Should Your Home Be Telling You?

Home. The place where you lay your head to sleep, where a roof of some form protects you. It’s where you leave the things that are valuable to you as you head off each day to live your life. If you were to strike up a conversation with your home about its day, what information would you like your home to convey to you? How would you like to get that information? These walls can’t talk.



There is value in real-time information – things you want to know about as soon as possible. Examples from your house would be things like smoke detection and security; you need to know right away if your house is on fire or someone is trying to break in. Critical information should be delivered using a persistent messaging system that won’t stop until you verify receipt. Think of what a great excuse you’ll always have: “I am sorry, I have to leave this very exciting meeting because my home is paging me that someone is trying to break into the house.” Forget those fake phone calls and made-up excuses.

Less critical conversations can also take place, informing you of what’s in the refrigerator and even information on the products inside. Imagine being at the grocery store and being able to remotely look inside of your refrigerator. Refrigerators of today feature cameras inside as well as barcode readers that can tell you specific information about products like expiration date, price, nutritional information, product reviews, and more.

With the Internet of Things, you’ll never forget the milk again.


Given these preliminary conversations, what should your house be talking to you about?

  • Home Security – remote connection to video surveillance systems and sensors
  • Critical information – Fire alarms, burglar alarms, leaking pipes, electrical outages and other critical pieces of information
  • Home Information – Non-essential information like replacing air and furnace filters, refilling the egg tray, temperature, energy use, and pet information
  • Environmental Information – Particulate and dust levels in the air, plus detection of CO2, radon, and other harmful gases

All of these home information systems are available today via their individual systems. What the Internet of Things will do is provide seamless control and notification across all aspects of your home’s functions, in a single platform. This will soon become a reality, so get used to receiving text messages from your house. It all sounds useful until your house starts sending you baseball picks and you’re forced to write a book: “Conversations with My House: Why I Stopped Changing My Furnace Filter.”
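For readers who like to peek under the hood, here is a minimal Python sketch of the severity-based routing described above: critical events page persistently until acknowledged, while non-essential items are batched into a digest. The event names and retry policy are illustrative.

```python
import time

CRITICAL = {"smoke", "intrusion", "water_leak"}
digest = []  # non-essential items, delivered once a day

def notify(event: str, detail: str, acknowledged=lambda: False) -> None:
    """Route by severity: page persistently for critical events, batch the rest."""
    if event in CRITICAL:
        for attempt in range(1, 4):  # bounded here; a real pager keeps going
            print(f"PAGE #{attempt}: {event} - {detail}")
            if acknowledged():  # stop once receipt is verified
                return
            time.sleep(1)  # pause between pages
    else:
        digest.append(f"{event}: {detail}")

notify("intrusion", "back door sensor tripped")
notify("furnace_filter", "replacement due in 3 days")
print("daily digest:", digest)
```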

By Scott Andersen

Steps To Ensure A Successful Cloud IAM Environment

Cloud IAM Environment

Sales and implementations of identity and access management (IAM) solutions have drastically increased over the last couple of years, as these solutions have become the standard for organizations’ access management and security. This is primarily because organizations in every industry, and their leaders, are realizing how beneficial such solutions can be. Instead of using older, out-of-date manual processes to manage user accounts and passwords, IAM solutions allow for automation of the entire user account lifecycle.

With the dramatic increase in the use of cloud applications, organizations need a method to easily manage both cloud systems and their in-house applications. A cloud IAM environment ensures that the company efficiently automates its account management lifecycle for both in-house and cloud applications, so that only one solution is needed.

If your organization is still using manual processes and is beginning to look at IAM solutions and how they can help your company, it is beneficial to know what a successful cloud IAM environment is and the basic steps and considerations your organization should contemplate to achieve a successful implementation. Here is a brief overview of what to consider when beginning to look at vendors and implementing an IAM solution.

IAM Considerations

The first step is to find a vendor that offers an IAM solution that works well with the needs of your organization. Your organization should make a list of all the important processes it performs manually, as well as any issues that need a solution, so that you know exactly what your top priorities are.


You should then make a list of all of the applications that the company uses, both cloud and in-house. Be sure that the vendor you are working with has, or can build, connectors to these cloud applications.

You can then begin researching what types of solutions are needed to solve your top concerns and issues. Identity and access management, as a term, covers multiple components, solutions and modules. Here are a few of the main components, as well as what can be achieved with a successful cloud IAM environment.

  • Account Management — This is the management of creating accounts, making changes when necessary and disabling accounts once the end user is no longer working at the organization. A source system, such as HR, is connected with all cloud and in-house applications that your organization utilizes. This allows any change made in the source system to be automatically reflected in all connected systems, so that no manual actions need to be taken. For example, when a new employee is onboarded, their information is simply entered into the HR system and accounts in every application they need are automatically generated for them, without human intervention. (A minimal sketch of this flow appears after this list.)
  • Role-based Access Control/Access Management — This is the management of access rights. Within an organization, there are many different types and levels of access that employees may require, and they all need to have access to the correct systems and applications. Just as with in-house applications, it is important that users have the exact access they need in cloud applications. This component not only ensures that access is correct, it can also assist with the automation of account change requests. For example, an employee can request an access change via a portal and the request is automatically routed to the correct manager for approval. Once approved, the change will automatically be carried out within the network or appropriate application.
  • Compliance Management — This component is used to monitor what is taking place in the IT infrastructure and to make changes where appropriate. Some organizations may want to monitor who has access to what, and may need to comply with certain rules and regulations. Many cloud IAM vendors allow admins to easily generate a report of exactly who is accessing which applications and what changes they are making. This is beneficial in two ways: First, it allows the organization to ensure security and provide an easy trail for audit purposes; second, it allows them to easily see which applications are actually being used, for licensing reasons. An organization may be paying for expensive licenses to applications that users aren’t even accessing. Reporting such as this keeps the network and cloud secure and accurate.
  • Password/Authentication Management — This component is the management of the user’s credentials for accessing the applications they need. It also encompasses certain solutions used to make the login procedure both more convenient for the user, as well as more secure. One of these is a web-based single sign-on (SSO) solution to allow end users to easily access cloud applications. Users simply access a portal where all of their available applications are located. They provide a single set of credentials for authentication to the portal and can then access any of their applications by simply clicking on an icon. This allows them to easily access their applications from anywhere that they are working, whether inside or outside of the company’s network. Many vendors also offer the ability for users to download an app on their device and the app will prompt the user to enter the single set of credentials to get to a portal where they can access their applications. This is extremely convenient for users who are using tablets or smartphones.
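As promised in the first bullet, here is a minimal Python sketch of the account management flow, assuming a simplified connector model (real IAM suites ship their own connectors): the HR source system drives account creation and disabling across every connected application.

```python
# The HR system is the source of truth for who should have accounts.
hr_records = {"jsmith": "active", "mjones": "terminated"}

# Accounts that currently exist in each connected application.
app_accounts = {"crm": {"jsmith"}, "email": {"jsmith", "mjones"}}

def sync() -> None:
    """Propagate every HR change to every connected system, no manual steps."""
    for user, status in hr_records.items():
        for app, accounts in app_accounts.items():
            if status == "active" and user not in accounts:
                accounts.add(user)  # provision the new hire everywhere
                print(f"created {user} in {app}")
            elif status == "terminated" and user in accounts:
                accounts.remove(user)  # disable on the way out
                print(f"disabled {user} in {app}")

sync()
```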

The next step is to decide which of these components you need and in what order of importance. Many IAM vendors are very flexible and will allow your organization the opportunity to customize your solution and implementation to meet your company’s needs and timeframe. Often, organizations are nervous about an IAM solution because they fear that such solutions can be costly and time-consuming to implement, taking money from other important budgets. This is actually a misconception. When an IAM implementation is done in modules or phases, the sponsoring organization can choose to purchase only those that it needs, and can then implement the most important aspects of the solution first.

Information Security


Another factor to consider is the security of the network. Your organization might want to work with the vendor to ensure certain extra security measures, or tailor the solution based on the industry you work in or the data you handle.

Certain modules in an IAM solution already increase security dramatically without any extra measures. For example, the web SSO component allows users on the go to log in with one single password to access a portal of all of their cloud applications. This not only improves efficiency; it also helps with security, since it eliminates the need for end users to write down their passwords.

If security is a top priority, though, certain features can be added, such as two-factor authentication on top of the password solutions. This requires a user to provide, for example, a password and another form of identification, such as a fingerprint scan or PIN, to further validate that they are the correct user.
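As an illustration of how such a second factor commonly works, here is a minimal Python sketch using the open TOTP standard (RFC 6238) via the pyotp library. This shows one widespread approach, not any particular vendor’s implementation.

```python
import pyotp  # pip install pyotp

# Enrollment: generate a per-user secret once and share it with the user's
# authenticator app (usually rendered as a QR code from this URI).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("provisioning URI:", totp.provisioning_uri(name="jsmith", issuer_name="ExampleCorp"))

# Login: after the password check succeeds, require the current 6-digit code.
code = totp.now()  # in reality, typed in by the user from their device
if totp.verify(code):
    print("second factor accepted")
else:
    print("access denied")
```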

Overall, a successful cloud IAM environment will allow your organization to easily and efficiently manage cloud and in-house applications while also increasing organizational security. Use these basic guidelines to find the vendor that works best for your organization.

By Dean Wiech

Microsoft And General Electric Partnership A Big Step For The Industrial IoT

Microsoft and General Electric

It was recently revealed that Microsoft and General Electric (GE) will partner to bring GE’s Predix software to Microsoft’s Azure cloud.

Predix is an industrial Internet of Things (IoT) platform, a cloud-based platform-as-a-service (PaaS) that allows industrial machinery to connect to the Internet. Rather than targeting home and personal connectivity, where the focus of IoT is often placed, GE is hoping to use the Predix platform to better catalyze industrial organizations’ efforts to create IoT networks. To this end, it ensures that most devices used in industrial contexts can be standardized, monitored and optimized more easily and effectively. It will also help in collating massive amounts of data, saving both time and money in the industry.
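As a rough illustration of the kind of plumbing such a platform standardizes, here is a minimal Python sketch of a machine-side telemetry agent; the ingestion URL and payload shape are hypothetical stand-ins, not the actual Predix or Azure API.

```python
import time
import requests  # pip install requests

INGEST_URL = "https://iiot.example.com/v1/telemetry"  # hypothetical endpoint

def read_sensor() -> dict:
    """Stand-in for sampling a real sensor (vibration, temperature, ...)."""
    return {"asset_id": "turbine-42", "temp_c": 71.3, "ts": time.time()}

def publish(reading: dict) -> None:
    """Ship one reading to the cloud; surface ingestion failures to the agent."""
    resp = requests.post(INGEST_URL, json=reading, timeout=5)
    resp.raise_for_status()

for _ in range(3):  # a real agent would loop forever
    publish(read_sensor())
    time.sleep(60)  # one reading per minute per asset
```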

The move by GE to make Predix widely available is not new; prior to the partnership with Microsoft, it could already be run on Amazon’s or Oracle’s clouds. However, partnering with Microsoft is a significant event that benefits both parties. For Microsoft, the hope is that more industrial users will begin to use Azure. For GE, it highlights the company’s “software ambitions” as opposed to purely industrial ones; GE estimates $6 billion in digital revenue for 2016, in which Predix plays a crucial role. According to GE CEO Jeff Immelt, they see “in Azure, not only a great cloud technology and [the] ability to globalize quickly. But, it’s a platform on which other platforms can go.” Moreover, the “relationship…with Microsoft will allow [GE] to move more quickly in places that are important for use, which is really all over the world.”

Beyond what the partnership means for the two companies, however, is what it means for the wider public. The partnership is ultimately part of a larger movement toward the IoT. It has been predicted that by 2020 close to 21 billion devices could be connected and in use in the IoT, and of these, a third will be used in business.

By Jason de Klerk
