Category Archives: Cloud Computing

Big Problems, Big Payoff: Setting Up Your Own Cloud Server

Your boss just informed you that you need to start migrating all of your server applications to the cloud. How would you go about doing that? Is it even possible? Your mind starts to whirl with the issues you can already see cropping up over the following weeks and months.


(Image Source: Shutterstock)

Setting up a cloud server on your existing infrastructure is easier and less painful than you might think. However, there are a number of potential problems you’ll want to plan for ahead of time, and your new cloud server will require a substantial upfront investment to get it up and running. Here are some of the issues you might encounter:

The Basics 

Let’s go over what you do know at this point. You need to take your existing server infrastructure and applications and transform them into a cloud-based platform. You can keep your existing Windows or Linux operating system, but you’ll need to install a hypervisor-based server application, which allows you to monitor all of your virtual machines at the same time.

You can install your hypervisor in one of two places. A native (bare-metal) hypervisor is installed directly on the server hardware without a separate host operating system, which should be sufficient for most needs. Hosted hypervisors run on top of an existing operating system and provide greater flexibility: for example, you could run multiple operating systems both with and without hypervisors, or even mix different hypervisors if you prefer.

As far as your cloud migration is concerned, the hypervisor you choose is the single most important application you’ll install. It will allow you to effortlessly maintain your server and troubleshoot any issues in a fraction of the time.
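As a concrete illustration of the kind of monitoring a hypervisor enables, here is a minimal sketch that lists the virtual machines on a host. It assumes a KVM host with the libvirt Python bindings installed and reachable at the default qemu:///system URI; adjust the connection URI for your own environment.

```python
# Minimal monitoring sketch: list the virtual machines on a KVM host via libvirt.
# Assumes the libvirt-python bindings are installed; qemu:///system is the
# default local URI and may differ in your environment.
import libvirt

def report_vms(uri="qemu:///system"):
    conn = libvirt.openReadOnly(uri)  # read-only access is enough for monitoring
    if conn is None:
        raise RuntimeError("could not connect to the hypervisor")
    for dom in conn.listAllDomains():
        state, _reason = dom.state()
        status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
        print(f"{dom.name():<20} {status}")
    conn.close()

if __name__ == "__main__":
    report_vms()
```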

The Hard Part’s Done

Virtualization has been around for a lot longer than the concept of cloud computing; it’s the cloud side that most network administrators have difficulty with at first. Once you have your hypervisors and operating systems installed in the correct hierarchy, all that remains to get your cloud server fully operational is to provide your company’s employees with remote access. Fortunately, you shouldn’t run into much trouble with this step, and you’ll likely find it more tedious than anything else.

After you have the entire cloud server set up and configured to your specifications, you’ll want to take this time and test, test, and test some more. Remember that you aren’t just adding a single new application — you’re completely rebuilding your server from the ground up. Applications probably won’t work at first, and putting in the extra hours now will help your cloud server work perfectly or at least well enough when you turn it on for real.

Scaling Your Hardware Up

Every business wants to grow. They want more employees, more revenue, and more space. Cloud computing makes growth easy. Let’s say that your company averaged 50 percent resource usage two years ago, 70 percent usage last year, and 90 percent usage this year. You’ve probably already started receiving complaints from employees who can’t load software or files quickly.
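To put those numbers in perspective, a quick back-of-the-envelope projection shows how little headroom is left. The roughly linear growth assumed below is only an illustration built from the 50/70/90 percent figures above, not a forecasting model.

```python
# Rough capacity projection from the usage figures above (50% -> 70% -> 90%).
# The linear-growth assumption is illustrative only.
usage_by_year = {1: 0.50, 2: 0.70, 3: 0.90}   # year index -> average utilization

years = sorted(usage_by_year)
growth = (usage_by_year[years[-1]] - usage_by_year[years[0]]) / (years[-1] - years[0])
next_year = usage_by_year[years[-1]] + growth

print(f"Average growth per year:   {growth:.0%}")
print(f"Projected usage next year: {next_year:.0%}")  # ~110%: past capacity
```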

You can easily upgrade a cloud server with additional processors, memory, and hard drives without breaking a sweat, whereas dedicated servers require significantly more time and money to upgrade. Because new hardware is pooled across all applications at once, you’ll never need to carve out resources for one application or another.

The Ups and Downs of Sharing Resources 

We all learned that sharing was beneficial when we were just a few years old, but we would occasionally grow upset when we had to part with a favorite toy. Just as in real life, sharing server resources benefits all virtual applications most of the time.

Imagine that you have 10 different applications that each use a maximum of 10 GB of memory but an average of just 3 GB. Instead of installing 100 GB of memory, you could likely get away with installing 50 or 60 GB and still have headroom. Applications won’t always use the same amount of memory or hard drive space, and your cloud server will seamlessly shift resources from application A to application B if employees aren’t currently using application A.
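The arithmetic behind that sizing decision is simple enough to sketch. The 2x burst headroom below is an assumption added for illustration; the other numbers come from the paragraph above.

```python
# Sizing sketch for the memory overcommit example above:
# 10 applications, each peaking at 10 GB but averaging only 3 GB.
apps = 10
peak_gb = 10
avg_gb = 3
headroom = 2.0          # assumed 2x margin over the average for bursts

provision_for_peaks = apps * peak_gb      # dedicated-server style sizing
average_demand = apps * avg_gb            # what is actually used, on average
shared_pool = average_demand * headroom   # overcommitted cloud sizing

print(f"Provision for every peak:  {provision_for_peaks} GB")
print(f"Average concurrent demand: {average_demand} GB")
print(f"Shared pool with headroom: {shared_pool:.0f} GB")   # ~60 GB, as above
```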

As previously mentioned, sharing is good most of the time, but when can it cause a problem? Let’s say your company hosts a website on a cloud server, and Black Friday is coming up. If your hardware isn’t up to the challenge, the influx of new customers might drag your server to a standstill. If all of your resources are tied up, employees won’t be able to get work done, and customers won’t be able to make any additional purchases.

Time to Invest in a Generator

Nature happens, and it’s up to you to prepare for unforeseen events. If the power goes out during the night, your employees won’t be able to work from home unless your cloud server has a generator backup. If your company already uses a generator, you’re good to go. If not, now might be a good time to upgrade your facilities.

Ensuring that Employees Have Sufficient Access

Software as a service, or SaaS, applications are fantastic because they don’t require software installations on every single computer. However, they only work when your employees have reliable Internet access. Keep that in mind if your employees travel regularly; you might want to remind them to book Wi-Fi-enabled flights when possible.

You Never Have Enough Security 

Do you have enough security? Trick question! You will never have enough security. Although your cloud services should be relatively isolated from one another through software, remember that they’re still on the same hardware. If you’re migrating to cloud computing, now is the time to boost those encryption standards.

A data protection manager, or DPM, will also help protect and keep track of your data from one central location. Different DPMs can protect different workloads like SQL and Exchange, but you might have to install two or more DPMs to protect all of your data.

Rapid Updates 

Speaking of security, most cloud platforms will automatically update your applications. That’s less work for you and your IT staff, and you’ll never need to fix critical vulnerabilities in the middle of the night again. In fact, cloud platforms are so convenient that you’ll have plenty of time to devote to other projects.

Employee Training 

You’ll need to provide seminars, tutorials, and other resources to educate employees on how to use your cloud applications. Expect hundreds of questions in the first few days after you migrate to a new platform, but most confusion should clear up within a week or two. However, employee training is one of those things IT departments typically forget to prepare for.

The Bottom Line 

Making the switch to cloud computing readies your business for future growth while slashing IT costs. We could extol the virtues of cloud computing all day, but the fact is that migrating to a new platform will require time, effort, and a healthy dose of troubleshooting. Once you get your new server off the ground, it should be smooth sailing from there.

By Anthony Lévesque / GloboTech Communications

Cloud Infographic: 2013 Cyber Security Intelligence Index

Based on daily security monitoring of more than 4,000 clients, IBM has determined that DDoS attacks are on the rise. According to the IBM Cyber Security Intelligence Index, the average large company must filter through 1,400 cyber attacks weekly. Yet many organizations lack the on-site expertise or the right IT skills and tools required to combat them. Many also have no incident response program in place, or rely on existing programs that are out of date, rarely tested, or not updated to address the rapidly growing threats.


Infographic Source: IBM

CloudBerry Lab Takes You To The Glacier

The world of data management and storage continues to change and flex as it enters the realm of the cloud, with sophisticated technologies running in a never-ending horse-race against the increased demands of the end users. One constant remains:  people need access to data, and they also need to store data, which is why they turn to cloud storage and backup specialists.


CloudBerry Lab is one such specialist. They work a great deal in the Windows Azure environment, providing low-cost and reliable backup for small and mid-sized businesses. Alexander Negrash, Marketing Manager at CloudBerry Lab, describes their service as “an easy and comfortable way for users to move data,” and they recognize the Windows Azure environment as a dynamic cloud solution that many of their customers are already comfortable with. This allows them to implement a multi-cloud backup strategy to meet a wide range of storage needs and operating platforms.

CloudBerry Lab also recognizes that cost will always remain an issue for many corporate customers, especially given that the amount of data that needs to be accessed and stored increases with each passing day. So, in addition to their affiliation with Windows Azure, they assist by giving their clients access to Amazon Glacier.

Glacier is a fascinating spin on the concept of archiving, especially when one considers the effort that must have gone into the branding: the world’s largest river (Amazon) tied to the world’s slowest river (a glacier). But semantics aside, Glacier offers a remarkable visual and practical antithesis to the high-speed, amorphous entity that is the cloud. Although it, too, is cloud-based, Glacier essentially takes snapshots of existing data sets and then swallows them into a state of permanent immobility, encased, as it were, inside a giant block of ice. This is not real-time backup. This is month-end archiving. And once the data is frozen, it can be retrieved, but not immediately. Currently a retrieval request takes up to 24 hours to complete. It is ice, after all, not cloud vapor.
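For readers curious what “asking the ice to thaw” looks like in practice, here is a minimal sketch using the boto3 SDK. The vault name and archive ID are placeholders, and this illustrates the Amazon Glacier API in general rather than CloudBerry Lab’s own implementation.

```python
# Minimal sketch: requesting an archive retrieval from Amazon Glacier via boto3.
# The vault name and archive ID are placeholders; a retrieval job runs for
# hours before the data can actually be downloaded.
import boto3

glacier = boto3.client("glacier")

job = glacier.initiate_job(
    accountId="-",                               # "-" means the calling account
    vaultName="month-end-archives",              # placeholder vault name
    jobParameters={
        "Type": "archive-retrieval",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",       # placeholder archive ID
        "Description": "restore last month's snapshot",
    },
)
print("Retrieval job started:", job["jobId"])

# Hours later, once describe_job reports the job as completed, fetch the data:
# status = glacier.describe_job(accountId="-", vaultName="month-end-archives",
#                               jobId=job["jobId"])
# if status["Completed"]:
#     output = glacier.get_job_output(accountId="-", vaultName="month-end-archives",
#                                     jobId=job["jobId"])
```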

Amazon Glacier has an added benefit beyond its huge storage capacity: it is very inexpensive. Current pricing is set at $0.01 per gigabyte per month, with durability of 99.999999999 percent against data loss. CloudBerry Lab offers an Amazon Glacier backup service to its customers, thus completing a spectrum of backup and storage options, from fast to slow, short-term to long-term.

Negrash points out that CloudBerry Lab also encrypts data securely before it is sent to the cloud. This differs from many other storage providers that offer encryption but perform it within the same data centres where the data is stored. By encrypting on the client’s end, before anything is sent to the cloud, Negrash says, a greater level of security is achieved.
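The principle of encrypting before the data ever leaves the client is easy to demonstrate. The sketch below uses the widely available cryptography package and a hypothetical file name; it illustrates the general idea, not CloudBerry Lab’s actual implementation.

```python
# Illustration of client-side encryption before upload (not CloudBerry's code):
# the key stays on the client, and only ciphertext ever reaches the cloud.
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # keep this locally; never hand it to the provider
cipher = Fernet(key)

with open("payroll.db", "rb") as f:          # hypothetical file to back up
    ciphertext = cipher.encrypt(f.read())

with open("payroll.db.enc", "wb") as f:      # this blob is what gets uploaded
    f.write(ciphertext)

# Restoring requires the locally held key:
# plaintext = cipher.decrypt(ciphertext)
```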

As always, however, Negrash says customers should do some research when deciding on their backup and storage options, including just where and how their long-term storage should occur, given that the data to be stored will continue to expand on a weekly or monthly basis. Of course, the specialists at CloudBerry Lab are happy to provide assistance, and a great deal of information is available on their website. You can also download CloudBerry Backup with a free 15-day trial.

By Steve Prentice

Post Sponsored By CloudBerry Lab

FireHost Reveals Increases In Cyberattacks In Its Superfecta Report

There are always wolves at the door, and they are relentless in both their creativity and determination when it comes to getting in. This is the message that can be gleaned from a report released Tuesday, October 22, by FireHost, a provider of managed, secure cloud IaaS. Their Superfecta report highlights upticks in Cross-Site Scripting (XSS) and SQL Injection activity that specifically target applications carrying sensitive information about organizations and their customers.

“The adoption of cloud computing, mobile applications and virtualized enterprise architectures have led to an expansion of applications that are connected to Internet resources,” explained FireHost founder and CEO Chris Drake. He and his team have noticed the attacks becoming more prevalent and automated. Of the nearly 32 million attacks that FireHost blocked in the third quarter of 2013 alone (a 32 percent increase over Q2 2013), the rise in attempted SQL Injection and Cross-Site Scripting attacks signals that what was once the domain of the sophisticated hacker has become commoditized, posing a greater risk to any business with hosted resources.

According to Jeremiah Grossman, founder and CTO of WhiteHat Security, the hacker community is becoming particularly creative in combining and integrating CSRF, XSS and Directory Traversal attacks to inject code designed to penetrate the databases that underpin many mission-critical, web-based applications.
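Neither the report excerpt nor this article includes code, but the SQL Injection class of flaw under discussion boils down to queries built from raw strings. A minimal before-and-after sketch using Python’s built-in sqlite3 module (an illustration with made-up data, not FireHost’s) shows the difference parameterized queries make.

```python
# Illustration of the SQL Injection flaw class discussed above, using Python's
# built-in sqlite3 module. The table and values are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-1111-1111-1111')")

user_input = "x' OR '1'='1"   # attacker-controlled value

# Vulnerable: the input is concatenated straight into the SQL, so the
# injected OR clause dumps every row in the table.
rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()
print("string-built query returned:", rows)

# Safer: a parameterized query treats the input as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query returned:", rows)   # []
```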

Kurt Hagerman, Director of Information Security for FireHost, points out that all is far from lost. What is required, he suggests, is a greater level of communication and understanding between IT and the C-suite. Investment in security, he says, should stay proportional to investments made in infrastructure such as networks, but this can only happen if both sides are talking regularly. Often, he points out, a company’s IT group is left to make its own decisions, and, because of the wide range of issues a typical IT department has to handle, the requests for support that filter up to the senior levels are disjointed and lack overarching context.

All the while, the bad guys continue to insert malicious code into web pages, online forms and directory files, with the tenacity of hungry predators.

The strategy, Hagerman suggests, is to shift from “thinking IT” to “thinking governance.” He points out that attacks on a company’s vulnerable application-layer assets aren’t just about data loss; they hold the potential to destroy a brand. He highlights the famous 2007 case in which TJX Companies Inc., parent company of T.J. Maxx, Winners and HomeSense, discovered a breach of its credit card processing system: the theft of unencrypted track-2 data compromised over 45 million credit and debit card numbers and resulted in a class action lawsuit. Although this case is now six years old, it shows that innovative thieves stop at nothing to discover cracks in the system, and that a company’s hard-earned reputation can take years to recover. Smaller companies, with shallower pockets, might never get the chance to recover at all.

Grossman and Hagerman suggest that the FireHost Superfecta report be seen more as a business enabler than a sky-is-falling, doom-and-gloom scenario. The point of elevating security sophistication, they say, is to make the bad guys work harder, to a point at which it is no longer worth their time to try to break down your barricades. Although that might seem like common sense, too many companies still underspend on security.

“Traditionally, we see the lion’s share of technology budget being spent on creating or obtaining applications. After that, infrastructure and hosting solutions receive the most financial attention. Investments in security and preventative measures come in last in most cases,” said Drake.

In addition to being a deterrent to thieves, a governance approach to security also helps to head off costly fines for data breaches that may be imposed by banking regulators, healthcare/privacy authorities and many others.

“Today, in many organizations, as much as $1 out of every $10 invested in enterprise infrastructure technology is allocated to protect network resources. Only $1 out of $100 is invested in web application security. This unbalanced approach does not reflect the newly emerging threat landscape,” said Drake.

In short, the reality of cyberattacks is that they are becoming more frequent and varied. Hagerman points out that in just half a year, the number of blocked attacks he has overseen has doubled. “A proactive and up-to-date defence is the reality of doing business,” he says.

Kurt Hagerman will be a featured speaker at the AKJ e-Crime and Information Security Mid-Year Conference in London, UK on October 24, 2013.

By Steve Prentice

Passing Big Data Through A Drinking Straw

Big Data has corporate heads buzzing with excitement, since it promises to uncover golden nuggets of information from an ocean of mundane and redundant data. But here’s the problem nagging everybody: Big Data is big, as in it can reach “we-can’t-come-up-with-enough-names” bytes big. And with current upload speeds nowhere near as fast as download speeds, all the fancy analytics software and techniques aren’t going to do us any good if we can’t get our data where we need it.


It is called the Skinny Straw, or Drinking Straw, problem, and it is the biggest and most obvious problem facing Big Data. The analogy is simple: imagine passing an elephant through a drinking straw. Sure, you can grind the elephant into very tiny bits so it fits through the straw, but how long is that going to take? I admit that was a little gory; the real analogy is filling a swimming pool through a drinking straw, but you get the picture. The straw represents bandwidth, and how small it is compared to the amount of data that needs to get to the other side.

The only real solution we can think of right off the bat is to get a bigger straw, but that usually requires major infrastructure upgrades on the part of the ISP or backbone provider, and we are talking about extreme amounts of cash (or credit, if that’s how you roll). There are also obvious technology limitations: we can upgrade to the best there is, and it still might not be enough. Some Big Data providers have tried their own proprietary ideas to get around this issue, or at least lessen it to some degree.

Here are some techniques being used in the industry right now:

  1. Data compression and de-duplication techniques make transfers faster. That’s the “grind the elephant into bits and push it through the straw as fast as possible” solution.
  2. There is the “tinker with current protocols” approach, which combines the reliability of TCP connections with the speed and bandwidth of UDP transfers into something called FASP. This keeps communication fast and secure while doing away with much of the handshaking that TCP requires.
  3. Various other protocol optimizations can also help work around the problem. But one method really worth mentioning is the tried-and-tested SneakerNet approach: providers let customers mail their hard drives to the company so the data can be transferred, then mail the drives back. Even taking delivery time into account, this is often the fastest way to move extremely large amounts of data; a back-of-the-envelope comparison follows below.
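To see why shipping drives can win, compare the two options with rough numbers. The data volume, sustained uplink speed, and courier time below are assumptions chosen only to show the scale of the gap.

```python
# Back-of-the-envelope comparison: pushing data through the "straw" versus
# shipping hard drives. Data size, link speed and courier time are assumptions.
data_tb = 50                 # assumed data set size, in terabytes
uplink_mbps = 100            # assumed sustained upload speed, in megabits/second
courier_days = 2             # assumed round-trip shipping time

bits_to_move = data_tb * 8 * 10**12            # terabytes -> bits (decimal units)
upload_seconds = bits_to_move / (uplink_mbps * 10**6)
upload_days = upload_seconds / 86_400

print(f"Uploading {data_tb} TB at {uplink_mbps} Mbps: ~{upload_days:.0f} days")
print(f"Shipping the drives instead: ~{courier_days} days")
```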

By Abdul Salam

(Image Source: ShutterStock)

Virtualization As A Key Component Of Cloud Computing

Virtualization is one of the key components of the cloud computing paradigm, especially in the infrastructure-as-a-service model, where it is essential for providing a large pool of computing resources. Some experts even define cloud computing simply as virtualized hardware and software plus advanced monitoring and resource management technologies.


Virtualization has changed the ICT landscape; its benefits include reduced hardware vendor lock-in, faster disaster recovery, the ability to extend the life of legacy applications, and lower operational costs.

However, implementing virtualization in an enterprise environment is rarely simple. If you use virtualization on your own machine only to run some legacy applications or to keep multiple operating systems available, the technology is very simple: install your chosen virtualization software, create or modify an image, and start working. Even for absolute beginners, there are plenty of free online tutorials. On the other hand, if you deal with multiple database systems and complex enterprise applications, some on-premise and some deployed on different cloud services, the story becomes quite different. In that case you will need a great amount of computing resources and sophisticated virtualization expertise. Therefore, any initiative that freely provides computing resources, complex virtualization software, and expert knowledge is welcome.

One of the most exciting parts of working in ICT is the ability to attend conferences where you can discuss your work and complex problems with people in your field. At many businesses, however, conference budgets have been cut. Part of the solution could be virtual online conferences that provide networking opportunities without leaving your office. I’ve attended a few of these online events, and my experience has been positive: I have always managed to learn something useful and meet interesting people in my field.

There’s a good event coming up on October 22 for anyone interested in learning more about virtualization. Online VMware Forum 2013 is free to attend, and the best part is that attendees can experiment with VMware products for free in online hands-on labs. A lab is up and running in minutes with full technical capabilities, and at the same time you can chat live with VMware experts who can answer your questions. You can also navigate a 3D virtual environment with interactive booths and test the virtualization solutions without installing anything on your machine. By attending you can also learn how to simplify your IT infrastructure and hear VMware and virtualization experts discuss how to solve complex IT problems.

The agenda for the event includes themes such as vSphere and vCloud Suite, Virtualization Management, Virtualization 101, Cloud Management, End User Computing, and Business Continuity/Disaster Recovery.

You can find more information and sign-up details for this free event here.

By Darko Androcec

(Post Sponsored By VMware)

Why NSA Revelations Will Be Good For Cloud Security

Edward Snowden’s recent disclosures, including concerns about the NSA’s ability to break certain types of encryption, and the extent of surveillance on cloud service providers, put the entire cloud industry into an uproar.

The bad news is that this has eroded companies’ trust that their data can be secure in the cloud. In fact, industry analysts are predicting that these disclosures will cost US cloud service providers between $22 billion and $35 billion in revenue by 2016.

But there is light at the end of this tunnel, and what will emerge is a safer, more resilient cloud.

Is Encryption Dead?

In short, no. Expert cryptographer and author of the book “Practical Cryptography,” Bruce Schneier, recently blogged: “Whatever the NSA has up its top-secret sleeves, the mathematics of cryptography will still be the most secure part of any encryption system. I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts.”

Snowden himself has commented, “Properly implemented strong crypto systems are one of the few things that you can rely on.”

Consequently, we will see continued adoption of encryption technologies in the cloud to protect data in transit and at rest in these shared storage infrastructures.

Encryption Will Evolve

The evolution of encryption algorithms is nothing new. In recent years, as computing power has grown, we’ve seen the migration from DES, to 3DES, to AES-128/256. These longer key lengths are the ‘math’ that prevents computer systems from being able to ‘guess’ an encryption key. The good news here is that as computer systems get more powerful, they can use encryption with longer key lengths easily, without degrading performance.

Further, encryption standards are approved by independent bodies like the National Institute of Standards and Technology (NIST), and are put up for extensive public review before they are published. While those who lean toward conspiracy theories hint at intentional ‘backdoors’ built into these algorithms that can be exploited by the NSA or others, it’s highly unlikely these wouldn’t be found during the review process. These reviews will continue to play a critical role as encryption technologies adapt in the future. Furthermore, the details and implementation of encryption algorithms, such as AES, are public domain.

The Importance of Key Management

If you use AES encryption with a 256-bit key but your encryption system protects those keys with only an eight-character password, then you have effectively reduced the strength of your encryption significantly, since a hacker only needs to guess your password instead of the actual key. This is why managing and storing these keys securely is so critical.
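The gap the paragraph describes is easy to quantify. The printable-ASCII character set and the eight-character length below are assumptions used only to show the scale.

```python
# Rough comparison of an 8-character password's keyspace to a 256-bit AES key.
# The 95-character printable-ASCII alphabet and the length are illustrative.
import math

charset_size = 95
password_length = 8

password_keyspace = charset_size ** password_length
password_bits = math.log2(password_keyspace)

print(f"8-character password: ~2^{password_bits:.0f} possibilities")  # ~2^53
print("256-bit AES key:       2^256 possibilities")
print("The effective strength is the weaker of the two: the password.")
```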

Threats from Abroad

Data has become a treasure trove, and the cloud can make an even sweeter target. You can be sure that if the NSA is interested in your data, others are as well. Make sure you clearly understand your cloud service provider’s (CSP) service level agreements, particularly as related to security measures. The cloud will become too cost effective to avoid for most organizations, so continued pressure from cloud clients will be the best way to gain security improvements.

Bring Your Own Security

While many CSPs – like Google – have introduced encryption in their cloud offerings, you still need to look a bit deeper. Google’s encryption may protect you from a hacker who manages to get access to their infrastructure, but it won’t prevent Google from giving your data to the Feds. To be sure you are the only one with access to your data, use strong encryption with a good key management system, and make sure YOU keep the keys, not your CSP.


You can use the cloud, but remember that security is ultimately your responsibility.

  • Encrypt any data you put in the cloud that you want to be private.
  • Use strong crypto (for example, AES-256 or RSA-2048) to protect the data.
  • Use a strong key management solution that supports multi-tenancy, strong separation and audit of administrative roles.
  • Use a key management system that you retain outside of your CSP, and that is independent of your provider.


By Steve Pate

Steve, co-founder and CTO of HighCloud Security, has more than 25 years of experience in designing, building, and delivering file system, operating system, and security technologies, with a proven history of converting market-changing ideas into enterprise-ready products. Before HighCloud Security, he built and led teams at ICL, SCO, VERITAS, HyTrust, Vormetric, and others. Steve has published two books on UNIX kernel internals and UNIX file systems. He earned his bachelor’s in computer science from the University of Leeds.
