Category Archives: Big Data

Backblaze Study: Backing Up The Big Data Backup


Backing up data is essential in any business, but with hard drive failure rates purported to be high, having a backup plan for the backup itself is just as important.

In fact, hard drive manufacturers themselves will not deny the possibility of failure, but knowing when a drive will fail is the technical version of asking ‘how long is a piece of string?’ Relying on manufacturer statistics alone may not be sufficient.

The ongoing project by Backblaze…

Backblaze has been monitoring the failure rate of the huge bank of hard drives that continually back up its online business. To help others understand what happens and why, it publishes its statistics every quarter.

The statistics make for interesting reading. Taking note of both brand and model, the data highlights how hard their 40,000+ hard drives are working, and which are failing – and when.

The research began in 2013, and this infographic looks at data from the first quarter of 2015. Castle Computers has done some number crunching of its own, taking a different angle on the statistics. We have determined not only which brand appears to be performing better, but also which hard drive size is the optimum buy based on the latest figures from Backblaze.

With many other online businesses doing the same, there will soon be a wealth of statistics, data and research by which consumers can make the best choice for themselves, their backup needs and their budget.


How Big Data Is Influencing Web Design


For all you non-techies…

You’re probably wondering what big data is (I know I was… a few years back), so let’s get the definitions out of the way so we’re on the same page, okay?

Big data is A LOT of data – really, it is. It is a catchy and rather new term that refers to a large volume of data, both structured and unstructured, that is impossible to process using traditional software techniques. It is measured in terms like exabytes, where an exabyte is 1,024 petabytes and a petabyte is 1,024 terabytes (now you see why it’s called big?).
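
To make those multiples concrete, here is a quick sanity check of the storage-unit ladder, using the binary (1,024×) multiples the definition above uses:

```python
# Walk the storage unit ladder using binary (1,024x) multiples.
units = ["KB", "MB", "GB", "TB", "PB", "EB", "ZB"]
size = 1024  # bytes in a kilobyte
for unit in units:
    print(f"1 {unit} = {size:,} bytes")
    size *= 1024
```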


While the term does seem to lean toward the massive amounts of information themselves, it is sometimes used by vendors when referring to the technology needed to handle large amounts of data. So it can refer to both the data and the technology needed for it, okay?

Now that we have that out of the way…

You can appreciate the need to develop the best possible web designs that can handle these massive amounts of data, right? And it’s not just about containing the data but using it in a way that will help you actualize the goals of the website.

You see, back in the day when the Internet was still a new development, people were impressed by the ability to have a lot of information at the click of their mouse (all you young ones can Google what a mouse was). It was the quantity of data that was impressive but now, anyone can access this information pretty much on their own; so what is the incentive to get on your website?

Design, design, design!


It is what you do with the data that will determine whether you get as many visitors to your site as you can or not. This refers to both the presentation and the arrangement of your data in a way that is useful to your clients, and this will be determined by your web design – something the techies call data-driven design.

The benefits of letting your data determine your web design are endless, from enhancing iterative design to getting prompt information on how people interact with the design and the information. You will also gather data that you can use for future work, and possibly automate web development down the line.
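
As a rough illustration of what “prompt information on how people interact with the design” can look like, here is a minimal sketch that tallies clicks from a hypothetical interaction-event log; the event names and fields are invented:

```python
# Count how users interact with each page element so the next design
# iteration can be based on measurements rather than guesswork.
# The event log below is hypothetical.
from collections import Counter

events = [
    {"element": "signup-button", "action": "click"},
    {"element": "signup-button", "action": "hover"},
    {"element": "pricing-link", "action": "click"},
    {"element": "signup-button", "action": "click"},
]

clicks = Counter(e["element"] for e in events if e["action"] == "click")
for element, n in clicks.most_common():
    print(f"{element}: {n} clicks")
```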

This data-driven design is not an entirely foreign concept. Many other industries have applied it to improve their products and services. For instance, in the healthcare industry, big data in the form of medical history, employment and residential information, among others, is used to improve patient care and to determine the best possible course of treatment. It is the same concept, only applied to web design.

Critical Questions

Before you embark on designing for big data, there are a few questions you will have to answer.

1. What are your objectives?

What exactly would you like the design to achieve for you? Do you just want to join different datasets or find information about high-value clients faster? Begin with the end in mind but also be open to redefining those goals as you go through the data you have. You just might find something interesting.

2. What is the nature of your data?

You need to know where your data is coming from, how much data is moving through your system and where this data integrates within the current system. This will help you consolidate all the data about each client from all the different sources and convergence points within your current system. This way, you don’t lose any information.
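
As a sketch of that consolidation step, the snippet below merges per-client records from a few hypothetical sources (CRM, web analytics, support desk) into one profile per client; the source data and field names are illustrative only:

```python
# Consolidate per-client records from several sources so no
# information is lost. All data here is invented for illustration.
from collections import defaultdict

crm = [{"client_id": 1, "name": "Acme"}]
web = [{"client_id": 1, "last_visit": "2015-10-01"}]
support = [{"client_id": 1, "open_tickets": 2}]

profiles = defaultdict(dict)
for source in (crm, web, support):
    for record in source:
        profiles[record["client_id"]].update(record)

print(profiles[1])
# {'client_id': 1, 'name': 'Acme', 'last_visit': '2015-10-01', 'open_tickets': 2}
```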

3. Which platform?

This will be determined by the kind of data you have as well as its volume. Based on this information, you can then decide what will work with your specific needs in order to provide adequate support.

Important things to remember

Once you have all these questions answered, there are a few things you will have to keep in mind as you start designing your website.

  • Easy does it

Don’t flood all your information onto your new platform at once. Bring it in slowly to see how it works and how the staff responds to it. Start offline and in small batches before moving to real-time processing of large quantities of data (see the batching sketch after this list).

  • Develop an updating system

Your data will need refreshing from time to time in order to remain current and relevant. You will need to find out how you can make these updates to your big data platform as easily and quickly as possible.

  • Get a feedback loop

You will need datasets that can make existing systems smarter, and this is done using a feedback loop. These smaller datasets can improve existing applications by providing real-time information from systems that they were previously impervious to or entirely unaware of.

  • Establishing and evaluating analytics

As your feedback loop grows large enough to link all your data sources, you can start data mining and conducting behavioral analysis to make better predictions and optimize resources. This way, you can stay on top of trends within your industry and give yourself a competitive edge.

  • Privacy is paramount

With all that information, there is a real risk of compromising your users’ privacy; that must never happen! Emphasize user privacy in your design at every level, especially in niche segments. This is especially true if sensitive user information will be needed during transactions with the website, like bank accounts, residential addresses and so on.

  • Training the users

Big data web design will change how things work, because it necessitates greater access by end users in order to deliver real-time information. As a result, it is important for organizations to educate their staff on how to use big data as a team to achieve the set objectives.

  • It’s an ongoing process

Transitioning to big data web design is an ongoing process. There will be some kinks to iron out and you will have to constantly evaluate the system to see if you are getting what you need from it.
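
To illustrate the “easy does it” point above, here is a minimal batching sketch: it processes a dataset offline in fixed-size chunks, the kind of small-batch run you might try before committing to a real-time pipeline. The batch size and the `process()` placeholder are invented:

```python
# Process a large dataset offline in fixed-size chunks before
# committing to a real-time pipeline. process() is a placeholder.
def batches(records, size=1000):
    """Yield successive fixed-size chunks from a list of records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def process(batch):
    return len(batch)  # stand-in for real work

records = list(range(10_500))
for i, batch in enumerate(batches(records)):
    print(f"batch {i}: processed {process(batch)} records")
```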

It’s all about the experience

At the end of the day, using big data (also known as business intelligence) to develop a web design is all about creating an experience that is attractive and satisfactory to the user. Clients are often more impressed with a personalized experience than a standard, albeit efficient one. The good news is you have all that data; you just need to be smart about how you use it.

By Jack Dawson

Jack is a web developer and UI/UX specialist at BigDropInc, a design, branding and marketing firm he founded 9 years ago. He likes to share knowledge and points of view with other developers and consumers across various platforms.

IBM Insight 2015: Large Focus On Big Data


IBM Insight 2015, taking place in Las Vegas between the 25th and 29th of October, focuses on big data, analytics, and the inevitable insights. Stating it would be focusing on a world-changing matter, IBM has made it apparent that Big Data analytics is its target, with Watson, IBM’s cognitive technology, turning into a powerful processing engine that helps users solve everyday problems. Last year, IBM concentrated on Platform as a Service. This year, the emphasis has moved to services and APIs, a potential shift from the traditional model of big-business problem solving to a commodity-oriented platform.

IBM Datacap Mobile


The new data capture solution, IBM Datacap Mobile, is embedded with cognitive computing capabilities and promises to help organizations capture and extract additional information and understanding from enterprise documents. Today, unstructured information accounts for approximately 90% of enterprise information, in a variety of forms including documents, images, and rich media. IBM Datacap Mobile applies a combination of natural language processing, advanced imaging, and machine learning technologies to automatically classify and understand documents, helping organizations quickly and correctly determine appropriate action.
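
IBM has not published Datacap’s internals, so the snippet below is only a generic illustration of the underlying technique – training a text classifier to route unstructured documents – using scikit-learn with made-up training data:

```python
# Generic document classification in the spirit of cognitive capture:
# route unstructured documents (e.g. claims vs. invoices) with a
# trained text classifier. Training data is invented; scikit-learn
# is assumed to be installed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = [
    "claim for water damage to kitchen",
    "invoice for October consulting services",
    "claim regarding vehicle collision",
    "invoice: hardware purchase order",
]
labels = ["claim", "invoice", "claim", "invoice"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["storm damage claim for roof"]))  # ['claim']
```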

The solution is particularly relevant in sectors such as banking and finance, healthcare, and insurance, and Paul Deffinger, Director ECM Applications at Great American Insurance, states, “In the insurance industry, it is important to accurately capture and understand unstructured data. This is especially true when it comes to claims because they often include a high percentage of unstructured data in the form of correspondence. A cognitive capture solution would be extremely useful in helping us to automate the processing of these documents.” The integration of cognitive capabilities within document capture reduces the need for time-consuming and expensive manual intervention while preserving accuracy and advancing speed. Cognitive capture additionally allows businesses to learn as they process transactions, continuously improving speed and precision.

IBM Insight Cloud Services


Collaborating with Twitter and The Weather Company, IBM has announced a transformational approach to exploiting data: IBM Insight Cloud Services. This new service promises to help clients turn streams of unstructured data into insights, changing critical business outcomes across industries such as insurance, retail, and media and entertainment. Insight Cloud Services help companies realize the benefits of cognitive technology, learning from a variety of data sets and receiving feedback from outcomes, and can be accessed through several offerings, including IBM Insights APIs for Developers, IBM Insights Data Packages for Weather, and IBM Industry Analytics Solutions.

Joel Cawley, GM of Information and Insights as a Service at IBM, asserts, “Insight Cloud Services help clients create actionable insights from the noisy reality of the world. IBM is applying data science expertise and advanced analytics to exploit external data, find and connect the signals in that data to create new insights, and then deliver these insights embedded in clients’ business processes.” Built from a combination of technologies and resources from IBM’s Analytics portfolio and cloud infrastructure, the service has been tested and proven to support more than 15 billion API calls per day, with tremendously low latency and high availability replicated through global deployments.
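
For a sense of scale, a back-of-the-envelope calculation shows what 15 billion API calls per day means as a sustained average request rate:

```python
# Convert the stated daily load into an average per-second rate.
calls_per_day = 15_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400
print(f"{calls_per_day / seconds_per_day:,.0f} requests/second on average")
# ~173,611 requests/second
```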

Apache Spark

Apache Spark is now available as a service on IBM’s Bluemix cloud platform. The open source Spark processing engine was first developed by the AMPLab at UC Berkeley and was designed for data science uses, and its in-memory processing capabilities allow performance to exceed that of the MapReduce engine by up to 100 times, particularly in applications involving machine learning. Big Blue announced it has redesigned over 15 core analytics and commerce solutions with Spark, and of IBM Analytics on Apache Spark, the new Spark-as-a-service offering, Rob Thomas, vice president of Product Development at IBM Analytics, says, “It’s been an incredible success since we put this in beta back in June. We’ve had more than 5,000 developers coming in and building applications with it.”

Thomas affirms that Spark has allowed the simplification of the architecture of many software solutions and cloud data services, resulting in simplified operations and reduced build and deployment times. “For data scientists and engineers who want to do more with their data, the power and appeal of open source innovation for technologies like Spark is undeniable,” says Thomas. “IBM is committed to using Spark as the foundation for its industry-leading analytics platform, and by offering a fully managed Spark service on IBM Bluemix, data professionals can access and analyze their data faster than ever before, with significantly reduced complexity.”
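
As a flavor of why in-memory processing matters, here is a minimal PySpark sketch (generic Spark, not IBM’s managed service itself): the dataset is cached once and then reused across computations without re-reading from disk:

```python
# Minimal PySpark example of Spark's in-memory style: cache an RDD
# once, then run repeated computations on the cached data.
from pyspark import SparkContext

sc = SparkContext("local[*]", "demo")
numbers = sc.parallelize(range(1_000_000)).cache()  # keep in memory

print(numbers.sum())                                 # first pass
print(numbers.filter(lambda n: n % 2 == 0).count())  # reuses cached data
sc.stop()
```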

By Jennifer Klostermann

Entering Godzilla-Mode With The Zettabyte In 2016


It’s no secret that the data scene is growing to epic proportions, and the days of small storage devices are quickly becoming a thing of the past. Most organizations fail to stray any further than Terabyte options, but what happens when they do?

This is when things begin to get really interesting.

Jeff Davidson recently stated, “By some estimates, the data-storage curve is rocketing upward at the rate of 800 percent per year. Organizations are collecting so much data they’re overwhelmed.”

Overwhelmed may not quite hit the mark – in fact, more and more companies are looking at larger options: the Petabyte, with a capacity of 1,000 Terabytes, and the Exabyte, amounting to no less than 1,000 Petabytes. And now there’s an even larger measure waiting on the horizon.

It’s said to be capable of storing 1,000 times more than its predecessor the Exabyte (which can hold almost 40k years’ worth of HD video). The metaphorical ground shudders as it approaches, the data scene is leaning forward in its chair with anticipation, and the internet is taking a large gulp in preparation for its arrival… Ladies and gentlemen, the Zettabyte has almost landed, thanks in no small part to the sheer magnitude of internet usage to date, which is pushing the Exabyte to its limit.
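
That “40k years of HD video” figure can be roughly sanity-checked. The arithmetic below uses decimal units (1 EB = 10^18 bytes) and assumes a 6.6 Mbps HD stream; both the units and the bitrate are our assumptions, not the article’s:

```python
# Rough check of "almost 40k years of HD video per exabyte".
exabyte = 10**18               # bytes, decimal units (assumption)
bitrate = 6.6e6 / 8            # assumed HD video rate in bytes/second
seconds_per_year = 365 * 24 * 3600

years = exabyte / (bitrate * seconds_per_year)
print(f"~{years:,.0f} years of HD video per exabyte")  # ~38,000 years
```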

Check out the infographic below, courtesy of Cisco, which details the sheer expansiveness of our data as we know it.



By Stacy Coe

Furthering Business By Seeing Beyond


Here is a typical customer service story that illustrates the gap between the power of modern commerce and the struggling mindset of business.

John is a retail customer, who, like many people, enjoys shopping at specific stores. He re-visits these stores often, out of habit and convenience. He recently purchased a coffee maker from a homewares store in his neighborhood just one week prior to moving house. He brought it home, but did not open it. It remained in its original packaging. Two weeks later, after moving into his new house, he found the coffee maker, removed it from its box and plugged it in. It did not work. He called customer service and was told to take it to a local affiliate – a store that was not a direct part of the chain, but that sold some of the chain’s branded merchandise.



When John arrived at the affiliate store, with his coffee maker in hand, the young sales clerk informed him that although the store was connected to the homewares chain, she was not able to accept the appliance, since her store did not directly deal with this particular brand. She politely suggested he return to the main store back in his old neighborhood. John left the store, with his coffee maker under his arm. He felt a little under-appreciated and consequently decided to switch his loyalty to their competition.

Question: what – if anything – could the young sales clerk have done differently to stop John from leaving the brand?

This type of customer service scenario happens very often. It is the end result of an absence of long-range thinking on the part of higher-ups in the retail chain, a subsequent lack of education of front-line retail staff and a lack of time. Store associates seldom have the time or the permission to think proactively.

Data is King


The people who manage this affiliate store overlooked a key component of the new business economy. Even if the coffee maker was rightfully not a brand that they supported, the cost of returning it on behalf of the customer pales in comparison to what John would have left them in return: data. Customer relationships and customer data carry a far greater value than any individual transactions. Business, both in the B2C (retail) and B2B (industrial/commercial) spheres, relies increasingly on big data and analytics. This is the material that helps further individual customer relationships, spreading them out into additional channels.

Data allows vendors to outperform. For example:

Up-selling: “John, your 4-cup coffee machine is good, but have you considered an 8-cup model, so you can make enough for guests?”

Cross-selling: “John, most people who buy this type of coffee maker also buy this amazing kettle, made by the same manufacturer, with six different water temperature settings.”

Data-based selling: “John, last time you shopped at the main store, you bought a highway safety kit for your car. Do you know about our really great thermos cups? They’re perfect for enjoying that great coffee safely while you’re driving.”

Subscription services: “John, we’ve partnered with this premium coffee supplier who sends coffee by courier. Not only will you never run out, they always send an additional sampler with every shipment.”

Freemium: “John, I know you might never have tried coffee shipments by courier before, so we are happy to send the first 1-week package at no charge. You can order online if you like it.”

Loyalty: “John, if you choose to order your coffee online, maybe you want to try our loyalty app. It works on your smartphone and you get points and rewards with every purchase.”

Mobile Commerce: “John, since you’re thinking about the loyalty app, you might want to think about our full, downloadable native app that shows the specials throughout the entire store, but primarily the areas that we know you like the most, like coffee and cars. If you set the permissions, it will also know when you physically enter the store and you will get 15% off automatically.”

New Service Lines: “John, we are offering gourmet dessert preparation classes online in conjunction with a local catering school. Perhaps you or a family member might wish to sign up, to learn how to make great desserts to go with that wonderful coffee.”

The sales clerk in this scenario was only doing what she had been instructed to do, which points to a deficiency of vision in the management hierarchy. John should not have been allowed to leave the store without the clerk entering his account code to find out who he was, how long he had been a customer of the main store, and to identify and deliver these types of up-sell opportunities right there and then. The clerk should have been educated to understand that rejecting a customer for any reason will result in a high possibility of losing that customer, whereas helping him would have opened up more channels of loyalty and business.
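
As a sketch of what such a terminal lookup might surface, the snippet below maps a customer’s purchase history to cross-sell suggestions via an affinity table; both tables are invented for illustration:

```python
# Look up a customer's purchase history and map it to cross-sell
# suggestions. Both tables are invented for illustration.
history = {"john": ["coffee maker", "highway safety kit"]}

cross_sell = {
    "coffee maker": ["variable-temperature kettle", "coffee subscription"],
    "highway safety kit": ["thermos travel cup"],
}

def suggestions(customer):
    seen = history.get(customer, [])
    return [item for product in seen for item in cross_sell.get(product, [])]

print(suggestions("john"))
# ['variable-temperature kettle', 'coffee subscription', 'thermos travel cup']
```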

Data is king. Customer data gives company representatives at any level the opportunity to fully understand the needs of each client/customer and to address them in a high-touch, contextual manner. That is the currency of modern commerce.

For more on this topic, please visit the series sponsored by HP Enterprise Services.

By Steve Prentice

Infographic – Western Digital’s $19 Billion SanDisk Acquisition


IRVINE, Calif. and MILPITAS, Calif. — Oct. 21, 2015 — Western Digital® Corporation (NASDAQ: WDC) and SanDisk Corporation (NASDAQ: SNDK) today announced that they have entered into a definitive agreement under which Western Digital will acquire all of the outstanding shares of SanDisk for a combination of cash and stock. The offer values SanDisk common stock at $86.50 per share or a total equity value of approximately $19 billion, using a five-day volume weighted average price ending on October 20, 2015 of $79.60 per share of Western Digital common stock. If the previously announced investment in Western Digital by Unisplendour Corporation Limited closes prior to this acquisition, Western Digital will pay $85.10 per share in cash and 0.0176 shares of Western Digital common stock per share of SanDisk common stock; and if the Unisplendour transaction has not closed or has been terminated, $67.50 in cash and 0.2387 shares of Western Digital common stock per share of SanDisk common stock. The transaction has been approved by the boards of directors of both companies.
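
The quoted per-share values can be verified against the $79.60 five-day average with a couple of lines of arithmetic:

```python
# Check the per-share consideration quoted in the release against the
# $79.60 five-day volume weighted average for Western Digital stock.
wdc_vwap = 79.60

if_unisplendour_closes = 85.10 + 0.0176 * wdc_vwap
if_it_does_not = 67.50 + 0.2387 * wdc_vwap

print(f"${if_unisplendour_closes:.2f}")  # $86.50
print(f"${if_it_does_not:.2f}")          # $86.50
```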



The combination is the next step in the transformation of Western Digital into a storage solutions company with global scale, extensive product and technology assets, and deep expertise in non-volatile memory (NVM). With this transaction, Western Digital will double its addressable market and expand its participation in higher-growth segments. SanDisk brings a 27-year history of innovation and expertise in NVM, systems solutions and manufacturing. The combination also enables Western Digital to vertically integrate into NAND, securing long-term access to solid state technology at lower cost…

Read Full Release: Western Digital

Integrating Supply Chain Solutions On The Cloud


When oversimplified, Sales & Operations Planning (S&OP) is the iterative process behind optimizing Sales, Marketing, Product Management, Finance, Operations and Post-Sales. It provides executives from cross-functional organizations a framework to collaborate, maximize utilization of resources and ultimately optimize productivity across the entire value chain.


Why should this be of interest to you?

In a complex, multi-matrix enterprise environment, your goal may be to measure current and future demand variables across each node of your value chain while maintaining accuracy and consistency.

Or in an isolated functional organization, you find yourself responsible for improving any one of the following (or so you are told):

  • Generate demand
  • Improve sales
  • Increase revenue
  • Reduce the cost of business operations
  • Optimize throughput (plants, production lines, time-to-market, etc.) and the like

I am assuming (and sincerely hope) you already leverage technology to optimize your specific business area or areas. If not, this blog can serve as a guide during your technology selection process.

Let’s step back…so we can move forward.

The Road so far for S&OP Technologies:

Since its conception by Richard Ling in the 1980s, S&OP solutions have evolved to a significantly mature state. We may also know them as SCM systems. SCM’s core strength is its ability to tailor ad-hoc processes to meet desired results while accommodating varied levels of organizational maturity. This has allowed it to evolve continuously and iteratively (we call it agile now). However, when compared to other technology solutions, its evolution over the last 25 years appears dwarfed. Why?

Our flaws are often the mirror image of our strengths. And so it has been for SCM Solutions.



Through its evolution into Supply Chain Collaboration (SCC), S&OP has remained focused on hyper-optimizing specific functional areas with minimal process automation, limiting itself to critical business information. Although this selective bias gave us the ability to measure some of the core KPIs (revenue, performance, etc.), it made it impossible or cost-prohibitive to measure the intangibles: constraints, channel behaviors, and cross-functional efficiencies.

Only a handful, perhaps 20% of the Fortune 100, may have the capital to muscle up and create this custom visibility across isolated applications. The rest of us will have to look to the future…

We can better understand this in S&OP’s drive to morph into Integrated Business Planning (IBP) through the early-to-late 2000s. IBP (often touted as the big brother of S&OP) is a term coined by the Oliver Wight group and widely adopted by leading solution providers.

“The whole is greater than the sum of its parts” — Aristotle. And that’s how S&OP technologies will continue to evolve.

The differentiating factors:

1. Existing investments in isolation (ERP, CRM, APM etc.) CANNOT be scaled up efficiently to meet IBP needs. The need for an integrated ecosystem has driven the surge in API-driven frameworks across all products.

2. Best-in-class S&OP applications are already available, but they are optimal only in a silo and too difficult (or expensive) to integrate tightly.

3. Internal maturity alone will not determine success; improve the maturity of your ecosystem of partners, suppliers, 3PLs, etc. as well.

4. Optimize the speed and cost of doing business: choose products which allow Rapid Deployment, a Mobile-Enabled Enterprise, Omni-Channel Management, Private/Hybrid Cloud, etc.

5. Real-time Demand Planning: Extend your core Application modules to mobile endpoints.

6. Data and everything in between: with access to disposable (read: relatively inexpensive) cloud storage, we can now track all business-behavioral data. Mine it with your own algorithms (iterative evolution) to proactively detect meaningful changes, like shifts in channel behavior (see the sketch after this list).

7. Leverage IoT: IoT will significantly improve visibility of goods across the fulfillment and transportation phases. Industrial IoT (IIoT) will disrupt and revolutionize the service levels which 3PLs are able to provide today (…a future article).
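
Here is the sketch referenced in item 6: a minimal example of mining stored behavioral data for meaningful changes, flagging weeks where a channel’s order volume deviates sharply from its trailing average. The data and threshold are invented:

```python
# Flag weeks where a sales channel's order volume deviates sharply
# from its trailing average. Data and threshold are invented.
weekly_orders = [120, 118, 125, 122, 119, 180, 121]  # one sales channel

window = 4
for i in range(window, len(weekly_orders)):
    baseline = sum(weekly_orders[i - window:i]) / window
    if abs(weekly_orders[i] - baseline) > 0.25 * baseline:
        print(f"week {i}: {weekly_orders[i]} orders vs baseline {baseline:.0f}")
# week 5: 180 orders vs baseline 121
```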

Our Future:

With the stabilization of best-in-class frameworks across isolated applications (thanks to SaaS), we find a spike in the overall maturity curve for applications within a functional area. But these isolated SaaS applications may not provide the maximum intrinsic value, because of the issues we discussed above.

It is no surprise that SCM applications demonstrated 10.8 percent annual growth, accumulating $9.9B in revenue in 2014, while the SCM cloud segment demonstrated above-market growth of 17 percent.

Companies positioned with a technology stack, instead of isolated applications, can provide exponential value, primarily by leveraging their ecosystems and the tightly coupled process integrations pre-baked into the environment. Hence, the top three market leaders – SAP, Oracle and JDA Software – controlled a whopping 44.8% market share in 2014.

By Sourin Paul

New IoT And Big Data Offerings Announced At Dell World 2015


Dell World 2015 officially opens in Austin, Texas at 9:00 am on Wednesday 21 October, but ahead of the official conference launch, Dell has announced new services including dedicated IoT solutions for buildings and manufacturing, Big Data analytics capabilities, and cloud client offerings.


The new Edge Gateway 5000 Series is Dell’s first purpose-built IoT gateway for factory and building automation, developed to allow customers to make the most of enormous amounts of data collected by sensors. Using local analytics and other middleware, the device receives, aggregates and analyzes data, relaying only useful parts to either the cloud or data center with the use of Dell Statistica data analytics.
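
As a sketch of the pattern such a gateway embodies (not Dell’s implementation), the snippet below aggregates raw sensor readings locally and forwards only a compact summary plus out-of-range alarms upstream; the thresholds and sensor feed are illustrative:

```python
# Edge-gateway pattern: aggregate raw sensor readings locally and
# forward only the useful part upstream. Values are illustrative.
readings = [21.0, 21.2, 20.9, 35.4, 21.1]  # one minute of temperatures

average = sum(readings) / len(readings)
alarms = [r for r in readings if r > 30.0]  # out-of-range readings

upstream = {"avg_temp": round(average, 1), "alarms": alarms}
print(upstream)  # only this summary is sent to the cloud/data center
```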


“Organizations are struggling to make the best decisions regarding the data volume and complexity created by the vast numbers of sensors, embedded systems, and connected devices now on the network,” remarks Andy Rhodes, executive director of Commercial IoT Solutions at Dell. “As more of the data is processed in real time at the edge of the network, the gateway becomes the spam filter for IoT.”

Big Data

Dell Statistica 13.0 and new analytics-as-a-service offerings target industry verticals such as healthcare, insurance, and banking. A user-friendly solution requiring no coding, Statistica 13.0 has been designed to simplify and improve processes by which organizations deploy predictive models directly to data sources inside firewalls, the cloud, and partner ecosystems, and integrates seamlessly with open source R.

The Dell analytics-as-a-service solution being offered has been designed to help customers predict business outcomes, extract valuable insights, and improve the precision and efficacy of critical business processes. Says John K. Thompson, general manager of advanced analytics at Dell Software, “In the modern data economy, the ability to gain predictive insight from all data is critical to building an agile, connected and thriving data-driven enterprise.”


Expanding its cloud offerings, Dell has announced the Wyse 5050 AIO zero client for VMware, suggesting it’s the most important advancement of its OptiPlex commercial PC portfolio in the last five years. Given Dell’s recent acquisition of EMC, it’s perhaps unsurprising that the expansion is optimized for VMware. The Wyse 5050 AIO, based on Dell’s P2414H monitor, includes a zero client on the back of the monitor, six USB ports (with four located on the sides of the device), and a built-in power supply.

And the ninth release of Wyse Cloud Client Manager has also been publicized, managing mobile devices, mobile workspaces, and thin clients from a single console. The new release is professed to simplify device management and offers centralized reporting as well as control of multiple branch offices. Administrators are now able to organize devices into groups, and delegate administration of groups through role-based access control. Additionally, support for iOS 9 and Windows 10 has been added, along with enhanced application management functionality.

By Jennifer Klostermann
