Category Archives: Big Data

Despite Record Breaches, Secure Third Party Access Still Not An IT Priority

Secure Third Party Access Still Not An IT Priority

Research has revealed that third parties cause 63 percent of all data breaches. From HVAC contractors to IT consultants to supply chain analysts and beyond, the threats posed by third parties are real and growing. In its 2016 global survey of third party risk, Deloitte reported that 87 percent of respondents had faced a disruptive incident involving third parties in the last two to three years.


In May this year, the Ponemon Institute published the results of a 617-person survey in which 75 percent of IT and security professionals said the risk of a breach from a third party is serious and increasing.

The infamous Target breach that occurred during the 2013 holiday shopping season is a prime example of a catastrophic third party data breach. Target confirmed that payment card information from roughly 40 million customers was stolen, as well as 70 million customer records. The root cause of the data breach was compromised network credentials that linked back to the company’s third party HVAC systems subcontractor. The breach cost Target millions of dollars, damaged its brand and reputation, and led to the resignation of both its CEO and CIO. In the past 12 months, organizations represented in the Ponemon report spent an average of $10 million each to respond to a security incident that was the result of negligent or malicious third parties.

Despite these warnings, a recent study conducted by the Soha Third Party Advisory Group, which consists of industry security and IT experts from Aberdeen Group; Akamai; Assurant, Inc.; BrightPoint Security; CKure Consulting; Hunt Business Intelligence; PwC; and Symantec, found that just two percent of respondents consider third party access a top priority in terms of IT initiatives and budget allocation. The report, which surveyed more than 200 IT and security C-level executives, directors, and managers at enterprise-level companies, uncovered a few reasons for this apathy.

Breaches Happen to Other Organizations

While CVS, American Express and Experian are just a few of the recognizable organizations that have recently suffered a significant third party breach, the negative news stories published about them and others have not done much to motivate today’s IT personnel. Sixty-two percent of respondents to the Advisory Group report said they do not expect their organization to be the target of a serious breach due to third party access, yet they believe 79 percent of their competitors will suffer a serious data breach in the future. Interestingly, 56 percent acknowledged they had concerns about their ability to control and/or secure their own third party access.

Providing Third Party Access Is Difficult

Providing secure access to applications spread across many clouds or multiple data centers, for contractors and suppliers who do not work for you and who use devices IT knows nothing about, is a challenge. The Third Party Advisory Group report found that most of those polled consider providing third party access a complex and tedious process. The survey found IT needs to touch five to 14 network and application hardware and software components to provide third party access. Fifty-five percent said providing third party access to new supply chain partners or others was a “Complex IT Project,” and on average they have to touch 4.6 devices, such as VPNs, firewalls, directories, and more. Forty percent described the process as tedious or painful, and 48 percent described it as an ongoing annoyance. This is a problem that will not go away anytime soon: 48 percent of respondents saw third party access grow over the past three years, and 40 percent expect that growth to continue over the next three years.

People Are Not Afraid of Losing Their Jobs

When the Advisory Group survey asked IT professionals, “If a data breach occurred in your area of responsibility, would you feel personally responsible?” 53 percent said they would, because they felt it would reflect poorly on their job performance. However, only 8 percent thought they might lose their jobs if a data breach occurred on their watch. The survey showed that IT professionals take their jobs seriously, but it is unclear who is being held accountable for data breaches and how this ambiguity might affect attitudes and behavior in keeping organizations safe from outside threats.

Must-Have Features for Secure Third Party Access

When evaluating a secure third party access platform, it’s important that the solution be able to navigate and manage a complex maze of people, processes and technologies. The solution should provide a convenient, simple and fast way to manage the platform, its policies and security. At a minimum, the solution under evaluation should include features such as the following:

  • Identity Access: Identity access confirms that the third party vendor accessing the IT network has the right to do so. The goal is to provide authenticated end user access only to the specific applications the vendor needs, not to the whole network (a minimal sketch of this deny-by-default scoping follows this list).
  • Data Path Protection: Rather than building a unique access string through an organization’s firewall, data path protection allows existing security measures to stay as they are, without having to be altered. This feature provides a secure pathway for vendors to access the parts of the network that they need for work purposes. And in the event that credentials are compromised, the direct pathway prevents outside attackers from scanning through the network.
  • Central Management: Keeping track of vendor access can be a challenge, but a centrally managed solution allows organizations to manage and control third party access in a simple and uncluttered fashion. Eliminating that complexity means easy, functional connections and fundamentally better security, along with detailed audit, visibility, control and compliance reporting.
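
To make the first of these features concrete, here is a minimal sketch, in Python, of identity-scoped application access: each vendor identity maps to an explicit allow-list of applications, and anything outside that list is denied by default. The vendor IDs, application names, and policy format are invented for illustration and are not any particular product’s API.

# Hypothetical sketch of application-scoped third party access.
# Vendor IDs, app names, and the policy format are illustrative only.
from dataclasses import dataclass, field

@dataclass
class VendorPolicy:
    vendor_id: str
    allowed_apps: set = field(default_factory=set)  # apps this vendor may reach

POLICIES = {
    "hvac-contractor": VendorPolicy("hvac-contractor", {"building-telemetry"}),
    "it-consultant": VendorPolicy("it-consultant", {"ticketing", "monitoring"}),
}

def authorize(vendor_id: str, app: str) -> bool:
    """Grant access only to the specific applications a vendor needs,
    never to the whole network: unknown vendors and apps are denied."""
    policy = POLICIES.get(vendor_id)
    return policy is not None and app in policy.allowed_apps

print(authorize("hvac-contractor", "building-telemetry"))  # True
print(authorize("hvac-contractor", "payment-systems"))     # False: outside the allow-list

Deny-by-default is the important property here: even if a vendor’s credentials are compromised, the attacker only reaches the applications on that vendor’s list, not the broader network.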

The divide between IT priorities and the need to mitigate third party data breaches affects all industries. IT professionals must recognize that the threat from third parties accessing their infrastructure is very real. The good news is that with the right access platform with the appropriate feature sets, organizations can significantly mitigate their risk.

By Mark Carrizosa, chief information security officer (CISO) and vice president of security for Soha Systems.

Mark joined Soha in 2015 from Walmart, where, as principal security architect, he developed and implemented the company’s global e-commerce security architecture framework. Prior to Walmart, Carrizosa was an operational risk consultant at Wells Fargo, where he analyzed the company’s infrastructure and application compliance to improve the security risk posture of both customer-facing and internal systems.

Education Through Collaboration And The Cloud

Education And The Cloud

Online education, supported by cloud computing, has seen much growth due to the spread of massive open online courses (MOOCs) hosted in the cloud and a changing learning environment in which today’s tech-savvy students make use of their own devices to facilitate their learning. Providing cost-effective availability and scalability to e-learning programs, cloud computing additionally delivers access to streaming video, simulations, and virtual learning worlds. With the benefit of collaboration in the cloud, it’s now easier for groups of students to collect and analyze data together, and interaction with educators can happen seamlessly through these same channels.


As education globalizes, MOOCs are changing how courses are structured and delivered. Budget limitations coupled with the need for wide-ranging delivery of programs have made MOOCs a particularly relevant practice.


(Infographic Source: MOOCs)

While some consider the spread of MOOCs a threat to traditional schools and universities, many established organizations are instead putting MOOCs to good use as they modernize their existing structures. Unfortunately, research into MOOCs has found a high rate of abandonment due to factors such as low quality, lack of recognition, poor student motivation, and theoretical teaching without the benefit of practical application. However, Gartner finds that MOOCs have renewed interest in online learning while significantly changing course boundaries, and their considerable impact on digitalized learning in higher education means progressive CIOs are adopting new models and technologies to strengthen online learning at their institutions.

BYOD and Mobile Learning

Bring your own device (BYOD) is an approach that has taken hold in many organizations, educational institutions included, and it’s predicted that by 2017 half of employers will require employees to source their own devices for work purposes. Many students already see technology as an essential learning tool that offers peer collaboration and communication, as well as a diverse range of approaches to gathering information. The use of such technology further supports many of the key principles of effective learning, which include applying theory to practice, motivating students, encouraging reflection and creativity, and promoting dialogue and collaboration.

Though mobile learning has previously been limited by the processing and storage capacity of the devices in use, the cloud can assist by providing adequate computing resources and scalability. With the more resource-intensive computing tasks executed in the cloud, applications can more easily run on mobile devices, and students can retrieve and share content stored in the cloud from wherever they like, whenever they choose. New trends are also expanding the field: some applications now use geolocation to offer courses appropriate to a learner’s location, societal norms, and background, and though big data has always been an important part of e-learning, 2016 is expected to be a year in which app analytics improve mobile training strategies.

Cloud Study Groups

Both traditional and online programs make use of collaborative learning groups, either involving the entire class or, for more detailed analysis of materials, smaller groups. The concept of a study group is nothing new, and its benefits include the development of critical thinking skills, co-construction of knowledge and meaning, and transformative learning. This valuable learning facility becomes more flexible in the cloud. Some students shy away from study groups because they consider them inflexible, are wary of taking part for fear their teammates will slow them down, believe workloads will not be equally shared, or have difficult relationships with specific peers; cloud study groups offer some solutions. Groups can be created without considering student location, which means those with the most comparable aims and suitable partnerships can collaborate and learn together. Further, with appropriate applications in place, the cloud helps build study groups that are flexible yet tracked, ensuring work product is correctly attributed.

We’ve already seen many changes in education through online learning; the cloud is the next step, ensuring more efficient delivery and improved collaboration for wider dissemination.

By Jennifer Klostermann

How To Humanize Your Data (And Why You Need To)

How To Humanize Your Data

The modern enterprise is digital. It relies on accurate and timely data to support the information and process needs of its workforce and its customers. However, data suffers from a likability crisis. It’s as essential to us as oxygen, but because we don’t see it, we take it for granted. Because we take it for granted, we don’t often think we need to go open a window for fresh air.

We work with data constantly but often don’t see how that data is used. Thus, it’s difficult to visualize the tedious work behind the product, which affects what we are “selling” as an experience or an outcome.

Consider a company that calls itself “the company that cares for its workforce.” This is the type of company that should know its employees. But consistently getting names, notifications, and other basic information wrong could make that company seem like it doesn’t care.

Why Humanizing Data Is Important

Why would an organization expend the effort to humanize data? The simple answer is that it reduces risk, improves business performance, and, to an increasing degree, is a necessity in the transition to digital excellence. Every industry will have laggards who change slowly, but the best examples are organizations that recognize that data is underutilized within their businesses.


This is important to acknowledge: according to Gartner research, inaccurate and low-quality data can cost the average enterprise millions of dollars in lost benefits per year.

More specifically, data needs to feel personal because:

  • Intangibles are often dynamic and require context. For example, consider the data that makes up a customer record. Attributes such as a company name, address, and contact information seem basic, but the reality is these attributes change at an alarming rate when you look at a customer population as a whole. Customers have growing, shrinking, and changing businesses, too.
  • A customer isn’t just a record. Attached to a customer are orders, opportunities, and interactions that all build on the context of how a company serves and benefits from a customer. Learning and visualizing that context takes time, especially if it’s only supported through tribal knowledge (see the sketch after this list).
  • It helps to bridge the likability gap and manage the organizational change to a digital enterprise. Stories that humanize data are essential in outlining the expectations for adding value, managing risk, and providing services to the customer. Humanizing data provides meaningful stories that quickly capture the context of data value and use.
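
As a rough illustration of the second point, the sketch below (written in Python, with field names invented for this example) models a customer not as a bare record but as a record plus its attached orders and interactions, and flags entries whose basic attributes haven’t been verified recently, the kind of drift described above.

# Illustrative only: a customer as a record plus attached context,
# with a simple staleness check on the basic attributes.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Customer:
    name: str
    address: str
    last_verified: date                                # when name/address were last confirmed
    orders: list = field(default_factory=list)         # attached context: orders
    interactions: list = field(default_factory=list)   # attached context: calls, emails, visits

def needs_review(customer: Customer, max_age_days: int = 365) -> bool:
    """Flag records whose basic attributes haven't been verified within a year."""
    return date.today() - customer.last_verified > timedelta(days=max_age_days)

acme = Customer("Acme Corp", "12 Main St", date(2014, 3, 1),
                orders=["PO-1001"], interactions=["renewal call"])
print(needs_review(acme))  # True: the basic data has drifted and needs a human look

A flag like this is not the whole answer, but it prompts the humans who know the account to restore the context the record has lost.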

How to Humanize Data

It doesn’t have to be complicated. These three basics will take your business a long way.

1. Spread the word. I’ve seen some spectacular examples of organizations humanizing data through videos, tent cards, gamification, and other messaging techniques that revolve around the “so what” and connect the data driving the event. There is no universal way to do this; each company culture responds to communication differently. However, companies simply need to make sure their methods start conversations.

2. Start during the orientation. Most organizations expend their change management efforts when change becomes necessary for more experienced workers. But there is a steady stream of younger, more digitally savvy employees entering the picture. Spell check and text messages are the norm for them, and these team members will be trusted to steward data in systems that were implemented around the time they got their first cell phones. The systems are neither smart nor agile and don’t have a sense of humor for what Siri misinterpreted. They do exactly what you tell them, whether you mean to say it or not.

3. Communicate every data issue by starting with the human elements and outcomes. Repetition and practice are important for reinforcement. Those who can best communicate contextual use and value of information and insights will determine leadership in the digital era. If we want to develop digital leaders, we have to practice and become versed in understanding data context and how it applies to problem-solving and innovation in the future.

There’s little question that data is vital for today’s companies. That’s why it’s troubling that so few companies are using it well. For example, only 36 percent of companies use it to guide strategic initiatives. Further, 41 percent of high-growth firms reported that data quality issues represent a barrier to using it for strategic planning.

Humanizing data might be the solution. It will give employees a sense of how important it is, which in turn will make them more likely to identify with data outcomes. In the long run, humanizing data will lead to a leaner, more efficient company.


By Will Crump

As president and CEO of DATUM, Will Crump brings more than 15 years of experience in building high-performance, cross-functional teams to compete in global venues. He is a sought-after voice in the areas of software product development, OEM and enterprise B2B web application technologies. For information on advisory services and DATUM’s SaaS, Information Value Management, visit DATUM’s website.

The Week In Tech – NVidia AI Inception Program and Google Goes Wireless?

The Week In Tech

Much of the news coming out of the technology sector this week focused on looking ahead to new challenges, and deploying new technologies to solve difficult global problems.

Here are some of the standout developments in the technology sector this week.

Nvidia, the American pioneer in GPU computing, announced the establishment of a comprehensive global program to support startups that are driving breakthroughs in artificial intelligence and data science.

The Nvidia Inception Program “provides unique tools, resources and opportunities to the waves of entrepreneurs starting new companies, so they can develop products and services with a first-mover advantage.” Startups that qualify will receive access to Nvidia’s deep-learning technologies, its global network, technical training and funding. Kimberley Powell, senior director of business development, said, “We’re committed to helping the world’s most innovative companies break new ground with AI and revolutionize every industry.”

Google Going Wireless

Alphabet executive chairman Eric Schmidt told a gathering of shareholders that the company is getting serious about beaming the next generation of wireless technology into people’s homes. The Google Fiber project, which initially launched in 2010, is now looking for ways to provide wireless internet service without having to lay fiber-optic cables and dig up infrastructure to do so. “There appears to be wireless solutions that are point to point that are inexpensive now because of the improvements in semiconductors,” said Schmidt.

The company is planning to have a test network up and running by year’s end in and around the Kansas City area.


(Image Source: Wikipedia)

Verizon Still Leading Candidate

Speculation has been rife for some time over who would acquire Yahoo’s core internet business since the once-mighty search company announced earlier this year that it was putting those assets up for auction. This week, it emerged that the leading contender is telecommunications company Verizon.

The Wall Street Journal reports: “Verizon, which acquired AOL Inc. last year for $4.4 billion, is seen as having the clearest path to turning around Yahoo. The telecom giant likely would combine Yahoo’s web properties, which together attract more than a billion users a month, with its growing business in online ads. That would enable Verizon to offer more than at least some other bidders.”

Potential suitors are expected to bid between US$2 billion and US$3 billion, a figure that has fallen sharply since CEO Marissa Mayer’s sale presentation revealed the true state of the company’s advertising revenue business.

By Jeremy Daniel

Setting Up Your Marketing Cloud – Finding The Right Tools

Marketing Tools That Fit

Recently, an acquisition agreement for the purchase of marketing automation vendor Marketo by a private equity firm was announced, with a price tag of $1.79 billion. This is the latest buyout of a marketing automation vendor, with past deals including Oracle’s purchase of Eloqua, Adobe’s acquisition of Neolane, and IBM’s acquisition of Silverpop. Effectively making use of martech and the marketing cloud is becoming more necessary and relevant for businesses of all sizes.

Setting Up Your Marketing Cloud


(Image Source: Shutterstock)

For effective and comprehensive marketing strategies, the cloud platform a business chooses should have most, if not all, of the tools marketers require. This includes efficient means of data management and analytics, social media and mobile marketing, audience segmentation, and marketing automation. Setting up such a cloud is a massive undertaking, and ensuring the skills to operate and run it are available, with policies in place for effective data management, is just the tip of the iceberg. Finding the right solution won’t be easy, and experts suggest discussing platform experiences with organizations that are actually using them rather than relying only on vendor presentations. It’s also important to understand your own weaknesses, because no two platforms are alike; rather than widening gaps, businesses can implement solutions that shore up weak spots while enhancing strengths. Suggests TJ Hunter, senior director of customer marketing at Rosetta Stone, “Look to see where your biggest return on marketing is, and then the next closest companion channel in terms of what drives revenue for your organization, and find a platform that marries those two channels.”

Grappling with Data

Neil Michel, Chief Strategy Officer at Wire Stone, believes big data isn’t necessarily a problem most marketers are facing; rather, he suggests that general marketing data is not being properly analyzed. Big data involves massive volumes, varieties, and velocities of data, typically unstructured, and marketers aren’t alone in the quest to thoroughly exploit its potential. However, Michel suggests that marketers probably aren’t missing many insights by failing to utilize everything from sensor data streams to brand mentions in social conversations. Instead, he proposes more effectively employing untapped marketing data that’s already available.

Mining insights from data that’s already structured still requires the right skills and technology, as well as an incurable curiosity, but when performed successfully it lets organizations make smarter decisions and deliver superior customer experiences. Effective data analysis should result in increased revenues and improved loyalty, because organizations can offer the right customer the best product at a time, place, and through a channel that won’t be ignored. On the admin side, shrewd data analysis also helps lower costs per lead, shortens time to sale, and decreases account servicing costs.
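
As a toy example of the kind of first-pass analysis this implies, the Python snippet below (rows, channels, and figures all invented) ranks marketing channels by the revenue recorded against them, the sort of quick look behind Hunter’s advice about finding your biggest return and its closest companion channel.

# Toy example: rank channels by revenue from already-structured marketing data.
# The rows and figures are invented for illustration.
from collections import defaultdict

rows = [
    {"channel": "email",  "revenue": 120_000},
    {"channel": "search", "revenue": 310_000},
    {"channel": "social", "revenue": 95_000},
    {"channel": "search", "revenue": 150_000},
    {"channel": "email",  "revenue": 80_000},
]

revenue_by_channel = defaultdict(int)
for row in rows:
    revenue_by_channel[row["channel"]] += row["revenue"]

# The top channel and its closest companion are where consolidating
# onto a single marketing cloud is likely to pay off first.
ranked = sorted(revenue_by_channel.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0], ranked[1])  # ('search', 460000) ('email', 200000)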

Who’s Who

IBM, Adobe, Salesforce, HP, and Oracle all offer their own digital marketing platforms, and they’re just a few of the largest and earliest entrants to the market. Experian, Sitecore, SAP, and HubSpot have emerged more recently as fierce competitors, and many startups have launched limited but more affordable products suitable for small and medium businesses. As marketers come to grips with the martech environment, their requirements are evolving, and marketing cloud vendors need to provide simple, centralized systems with access to powerful data as well as effective execution tools to meet these needs. This, of course, has led to a rapid change in the size and scale of offerings from marketing cloud vendors. Today, the top vendors offer content management tools that can be deployed across different channels, social media tools for listening and engaging, analytics that allow customer profile development, and multi-channel marketing automation, but they’re not stopping there. We’ll see innovative and exciting capabilities such as predictive analytics and customer journey mapping in future solutions, as well as a selection of tools not yet imagined.

By Jennifer Klostermann

How Strategy – Not Technology – Is The Real Driver For Digital Transformation

The Real Driver For Digital Transformation

Business owners and executives today know the power of social media, mobile technology, cloud computing, and analytics. If you pay attention, however, you will notice that truly mature and successful digital businesses do not jump at every new technological tool or platform.

While they do not sit and wait for months or years to create social media pages or to take advantage of new analytical services, they do approach every piece of technology that they use with a solid strategy. Why? Marketing, production, and brand management require concrete planning to be effective and coherent. Implementing new technology without a set strategy is a recipe for failure – or, at the very least, for ineffective use of an otherwise powerful tool.

The Importance of Digital Strategy and Vision

To make the most use out of the technologies and tools available to your business today, you must have a coherent and cohesive digital strategy. Companies that have good digital strategies are said to be “digitally mature” and are more likely to embrace the most strategic technologies as they are developed, rather than casting about, trying everything, and failing to use most of it to their advantage.

(Image: Predictive Marketing. Source: Shutterstock)

A good digital strategy is born out of a vision for the company. Savvy leaders will understand that they must first envision the form they want their business to take, the presence they want it to have online and in the physical world, and the brand tone and voice they will use to engage with customers across all media. This is the basis for a strong strategy that will carry you through software and hardware updates, new tools, social media platforms, and much more.

Technology Gives You Analytics – Strategy Shows You How to Use Them

Now, we are not saying that technology is unimportant. In fact, without data streams and analytics, you would have a much more difficult time collecting the information you need on your customers, website traffic, and the market in general. Without analytical tools like these, you would have a much harder time finding the data to make your next strategic move.

However, you might think of your analytics and data streams as the tools to fix your car and your strategy as your mechanic’s knowledge and experience. You could have all of the tools necessary to change the struts on your wheels, replace the alternator, or do anything else to repair your car, but those tools will do nothing for you if you don’t have the knowledge and experience necessary to perform those jobs.

With a solid strategy, you’ll have a guide for how to use the tools that technology gives you. You’ll see how your business can embrace these tools and platforms, how it will change and evolve, and how to continue to use them in the future as they become a part of your business. Without strategy, you might get lucky and choose the right platform, the right analytics tools, and the right interpretations of the data in front of you…but it’s highly unlikely.

Businesses that put strategy before technology and then use that strategy to embrace and fully utilize that technology show a digital maturity that will drive them into the future and help them to maintain sustainable growth and success.

Have you implemented a digital strategy for your business? What’s changed since you’ve embraced your strategy, and what are your recommendations for strategy and data-driven technology for business owners and executives like yourself?

Let us know what you think and how you’ve used your digital strategy to set your business apart from the competition.

By Ronald van Loon

(Originally published on LinkedIn Pulse. You can periodically read Ronald’s syndicated articles here on CloudTweaks. Contact us for more information on these new programs)

The Cancer Moonshot: Collaboration Is Key

Cancer Moonshot

In his final State of the Union address in January 2016, President Obama announced a new American “moonshot” effort: finding a cure for cancer. The term “moonshot” comes from one of America’s greatest achievements, the moon landing. If the scientific community can achieve that kind of feat, then surely it can rally around the goal of finding a cure for cancer. President Obama put his second-in-command, Vice President Joe Biden, in charge of “Mission Control” for the cancer moonshot efforts.

Though this is certainly an ambitious undertaking, what’s encouraging is that the project isn’t starting from scratch. Researchers and clinicians have already made remarkable progress in the form of research, clinical trials, drug development and more, and there have already been many achievements that propel this effort toward its goal. For example, the successful mapping of the human genome more than a decade ago provided a tremendous jumping-off point for customized cancer treatments and potential cures. But in order to land this moonshot, there must be significant innovation in how all of these stakeholders communicate, collaborate and share important information.


(Image Source: Shutterstock)

Silo-breaking: as vital as funding?

Two of the biggest challenges of this project are to provide increased funding to the strategic participants, and to increase collaboration and information sharing among the numerous research teams and clinicians around the world. Vice President Biden has said that he wants to “break down silos and bring all the cancer fighters together—to work together, share information, and end cancer as we know it.” The goal is to double the pace of progress, or as he put it: “to make a decade’s worth of advances in five years.”

Those of us in the cloud computing community are especially invested in the efforts to increase coordination, eliminate silos and open up access to information. These things can only be done through improving upon and innovating technology solutions, so that storing and managing data doesn’t kill productivity. Let’s consider some of the issues that will affect what underlying technologies can be utilized to further drive collaboration and support access to information.

Protecting massive amounts of private data

A project of this magnitude will have massive amounts of data, generated by a multitude of sources. These large data sets must use common data elements (data descriptors, or metadata) to ensure that researchers are comparing apples to apples. Toward this end, the National Cancer Institute (NCI) has developed the Common Data Elements (CDE) to serve as a controlled vocabulary of data descriptors for cancer research. The CDE will help facilitate data interchange and inter-operability between cancer research centers.
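
A rough sketch of the idea behind common data elements: before records are exchanged, each field is checked against a shared, controlled vocabulary so that collaborating centers really are comparing apples to apples. The Python sketch below uses an invented vocabulary; the actual NCI CDEs are far richer and more formally specified.

# Illustrative sketch of validating records against a shared data dictionary.
# The vocabulary here is invented; real NCI Common Data Elements are far richer.
CONTROLLED_VOCABULARY = {
    "tumor_stage": {"I", "II", "III", "IV"},
    "biopsy_result": {"benign", "malignant", "indeterminate"},
}

def validate(record: dict) -> list:
    """Return the fields whose values fall outside the shared vocabulary."""
    problems = []
    for name, allowed in CONTROLLED_VOCABULARY.items():
        value = record.get(name)
        if value not in allowed:
            problems.append(f"{name}={value!r} not in {sorted(allowed)}")
    return problems

incoming = {"tumor_stage": "IIa", "biopsy_result": "malignant"}
print(validate(incoming))  # ["tumor_stage='IIa' not in ['I', 'II', 'III', 'IV']"]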

Big data and learning algorithms will enable researchers to identify patterns and anomalies that, for instance, may help to identify patients who can benefit from standard treatments, or whose tumors require a different approach. Given that these large data sets will contain highly personal patient health information that can’t be anonymized, they will need to be protected with the strongest measures for data privacy to protect patients’ rights and to maintain HIPAA compliance.

Preserving data integrity through controlled access


Of course, data integrity is of paramount concern. The data and other forms of information will come from numerous sources, and technology solutions will be needed to ensure that it maintains its consistency—that it isn’t inappropriately accessed, changed or corrupted. This means that access control to research information is critical. Yes, the project aims to increase sharing of data, but it needs to be shared with the right people in the right ways. Much of the information will be in documents, not databases, which makes access control, version control, and document retractions and expirations important features for the underlying collaboration technology. And of course, all this must be done with strict HIPAA compliance and patient privacy.
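
As a minimal sketch of what that might look like at the document level (hypothetical Python types, not any particular product), each share is a grant naming a user, a permission, and an expiry, and access is refused once the grant lapses or is retracted.

# Hypothetical sketch of document sharing with expiring, revocable grants.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Grant:
    user: str
    permission: str          # "read" or "edit"
    expires_at: datetime
    retracted: bool = False

def can_access(grants: List[Grant], user: str, permission: str,
               now: Optional[datetime] = None) -> bool:
    """Allow access only while a matching grant is active and not retracted."""
    now = now or datetime.utcnow()
    return any(g.user == user and g.permission == permission
               and not g.retracted and now < g.expires_at
               for g in grants)

grants = [Grant("researcher@site-b.example", "read",
                datetime.utcnow() + timedelta(days=30))]
print(can_access(grants, "researcher@site-b.example", "read"))   # True
grants[0].retracted = True                                       # document retraction
print(can_access(grants, "researcher@site-b.example", "read"))   # False

In a fuller system, version history and audit logging would sit alongside these grant records.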

Setting content free to get work done

The time spent on collaborating and sharing information has ballooned by 50 percent or more over the last two decades, according to Harvard Business Review. Too much time is wasted trying to piece together disconnected information among team members who are scattered across the globe, leaving little time for actual work to get done.

Teams need virtual workspaces built for specific business and clinical research processes. Think in terms of flowing content across the extended ecosystem, instead of just improving systems of record behind the firewall. To take on this initiative, the clinical research community requires what some industry analysts call “systems of engagement,” meaning information only comes to life when it is put to use and acted upon. But many technologies fail to account for specific use cases (such as global clinical research) or the security and compliance needs of information in motion (such as confidential patient data).

In this race to exterminate cancer, the first challenge that must be resolved is to control the flow of information across the complete content lifecycle — even after external sharing — while also setting that information free so those who access it can increase productivity. Solving the collaboration challenge will ultimately allow researchers to remain focused on the important work of the cancer moonshot initiative.

The countdown is on…

By Daren Glenister

So How Will Big Data Impact Our Lives?

The Big Data Impact

Last week we took a closer look at a few of the differences between business analytics and data science. Clearly there is a great deal of interest in this field, especially as leading analyst research firm IDC predicts that worldwide big data revenues will reach nearly $187 billion by 2019.

TechCrunch recently spoke with Matt Turck, who said: ‘We are entering the most exciting time for big data.’ In 2010, only 2.5% of the Series A market was committed to big data; today, the sector amounts to more than 7.5% of total venture investments.

So how will big data impact our lives? The infographic below takes a closer look.

