Category Archives: Technology



The cyberwar is on!

At this stage of the game, the stakes are higher than ever, and safeguarding networks from cyberattacks is a devilish combination of Chicken and Cat-and-Mouse. Attacks are now so commonplace that many serious cybersecurity breaches go unreported by the mainstream media.

Despite rapid advancements in IT security technology, hackers continue to invent ever more sophisticated ways of infiltrating systems around the globe. No one is safe. The awareness is there and the threats are very real, yet some companies still believe they have it covered.

The fact is that it is a second-by-second battle. Entire global networks can be compromised by a single click of a mouse or press of a button at any given time. Everything is at risk: content, data, and network security for businesses, consumers, and even governments. Although cyberterrorists always seem to find a way in, leaving the experts scratching their heads, the outlook is not nearly as grim as one might suspect.

According to an August 22, 2016 magazine article, “3 Hot Cybersecurity Stocks,” hacking is now a global growth industry! The cyberwar is on, which is why cybersecurity is the fastest growing sector in technology today. Not only do the good guys want to stop the bad guys—they want to make a profit doing it. It is a win-win from a business standpoint. Attached is a security infographic discovered via IDG Enterprise.


Close to 170 cyberattacks against corporations and other industry leaders have been documented by the Privacy Rights Clearinghouse thus far in 2016, with the massive breach of the DNC leading the pack as one of the most shocking.

What Companies Should Be Doing

Upwards of 40 percent of investment firms reported increasing their cybersecurity budgets for the year. Given the state of matters, IT security experts have a lot to say to those firms lagging behind. Cybersecurity is a daily investment every company can’t afford not to make.

From small startups to large corporations, everyone in business needs to put network and system security at the top of their priority list. The areas most in need of heightened security are application encryption, event and information management systems, user access management, data-at-rest defenses, and tokenization (the process of substituting a sensitive data element with a non-sensitive equivalent).
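Tokenization is simpler than it sounds. As a rough sketch of the idea (the class and token format here are invented for illustration, not any vendor’s API), it boils down to exchanging a sensitive value for a random stand-in and keeping the mapping in a guarded vault:

```python
import secrets

class TokenVault:
    """Toy token vault: swap a sensitive value for a random, meaningless token.
    Real systems back this store with strict access controls and encryption."""

    def __init__(self):
        self._forward = {}   # token -> sensitive value
        self._reverse = {}   # sensitive value -> token, so repeats reuse one token

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        token = "tok_" + secrets.token_hex(8)   # random, reveals nothing about the value
        self._forward[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._forward[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")   # downstream systems store only the token
assert vault.detokenize(t) == "4111-1111-1111-1111"
```

Because the token is random rather than derived from the card number, a stolen database of tokens is worthless without access to the vault itself.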

Even the little guy on the block should invest in professional IT security management. Your neighbor who just graduated from an online cybersecurity school might not yet have the chops to keep up with the lightning speed at which this cyberwar is escalating. You need serious guards against serious attackers who take no prisoners and spare nothing to get at information and control, sometimes just for the fun of it.

By CJ Callen



Big Data Analytics Career Path

Since the inception of Big Data, we have witnessed a data revolution around us. Big Data Analytics is one of the biggest trends in the IT sector at the moment, and the buzz around it is not going to subside anytime soon. Big Data is everywhere nowadays, and if industry forecasts are to be believed, the Big Data Analytics market will continue to grow as businesses realize the importance of making data-driven decisions.

Wondering whether to take up a career in Big Data Analytics? If you are not convinced yet, here are six reasons why making the switch to Big Data Analytics is a smart career move in 2017.

1. Exponential Growth Rate

The Big Data sector has grown six times faster than the IT industry average over the past couple of years. According to market experts, it will sustain that momentum and continue to outpace other IT sectors by a significant margin in the years to come. Currently the Big Data Analytics market is around one-tenth the size of the global IT market, but it is expected to grow to at least one-third by the end of 2020. So if you are looking to build a career in one of the fastest growing IT sectors, there is no better alternative than Big Data Analytics.

2. Soaring Demand for Qualified Professionals

Big Data initiatives were once reserved for giants like Microsoft and Amazon. That’s not the case anymore. Today, organizations of all sizes are venturing out to leverage the power of Big Data. As more and more IT companies get involved in Big Data projects, there is now a big demand for engineers who can help process data at large scale.

A closer look at the prominent job sites can give you a sense of the huge demand. There has been a steady increase in the number of job opportunities related to Big Data Analytics and the trend is still swinging upwards. It’s no wonder that qualified Big Data professionals are becoming the hottest targets of IT recruiters from all around the globe.

3. Lack of Skilled Professionals

Data is useless without the skill to process and analyze it. Being a relatively young field, the Big Data sector is currently witnessing a severe shortage of technical expertise. While demand is going up steadily, there is a huge deficit on the supply side, and many positions remain unfilled due to the scarcity of skilled professionals. In an industry where opportunities are plentiful but skills are scarce, finding a suitable job shouldn’t be too difficult for candidates with the right qualifications.

4. Fat Paychecks

Thanks to the inadequate supply of Big Data skills, companies are willing to shell out lucrative salaries to attract professionals with the right kind of technical expertise. A quick look at the current salary trend for Big Data professionals shows strong growth. According to a recent survey conducted by the US-based recruitment agency Burtch Works, Big Data professionals tend to be the highest compensated group of IT employees, with an average salary of $115,000 – around 30% more than that of other IT professionals at the same experience level.

5. Adaptation Across Different Industry Verticals

The astonishing growth of Big Data is largely attributed to its relevance across different industry verticals. All types of businesses can use the insights derived from Big Data to get an edge over their competitors. Besides the technology sector, Big Data is increasingly being utilized by other industry verticals for business intelligence, predictive analytics and data mining tasks.

Healthcare, consumer appliances, energy sector, manufacturing industry, and banking are a few of the verticals where Big Data has made its presence felt. As a result, the rate of Big Data implementation has increased by leaps and bounds. So the professionals with Big Data expertise can choose which industry to work for, based on their own preferences.

6. Wide Range of Roles and Responsibilities

Last, but not the least, Big Data professionals have a wide variety of positions to choose from. From Data Engineer to Business Analyst, Visualization Specialist to Machine Learning Expert, and Analytics Consultant to Solution Architect – there are so many options available for the aspiring professionals to align their career paths according to their interests and preferences.

Conclusion: Big Data Analytics helps organizations derive meaningful insights from raw data to make the right business decisions at the right time. In today’s competitive job market, technology professionals with the right kind of skill set are finding themselves in high demand as businesses look to harness the power of Big Data. With the increase in demand, shortage of talent, bigger paychecks, multiple job titles and relevance to different industry verticals, Big Data Analytics is certainly a smart career choice in 2017.

By Jack Danielson



Cloud Has Been A Godsend

“Cloud has been a godsend for folks trying to implement systems quickly and for us to secure workloads better,” said CIA Chief Information Security Officer Sherrill Nicely at a recent conference. Surprised that our spies – ahem, the US intelligence community (IC) – use the cloud? What about security? Who built it and runs it? The story behind it, and its rapid and successful deployment, has a lot of lessons for all of us.

Yes, the CIA is a special place, and yes, it has very special security needs. But if we think about it, the agency processes vast amounts of information and needs to be agile and flexible to respond to evolving threats and situations. The cloud provides that agility and flexibility, plus a virtually unlimited sea of capacity. How do you take advantage of that and still meet those special security needs? The answer was to build a “community cloud” that not only the CIA but the whole IC of 17 agencies could use.

Here is one of the best definitions, from the National Institute of Standards and Technology (NIST):

A community cloud in computing is a collaborative effort in which infrastructure is shared between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third party and hosted internally or externally. It is controlled and used by a group of organizations that have a shared interest.

Sure sounds like just what the spooks would want. How they went about getting it was both straightforward and a bit industry shattering. Like most government agencies, the CIA ran a procurement. Now, Federal procurements are much like a Kabuki dance, with very specific steps and regulations that must be followed to the letter. The government does not necessarily end up buying the best tech; often it simply selects the vendor who knows the process best.

Nonetheless, the ball got rolling in 2012, and true to the process, Microsoft and AT&T protested the CIA’s request-for-proposal specifications in mid-2012, forcing the CIA to pull the procurement and rework it. AWS (Amazon Web Services) then won the contract in early 2013, only to have the process slowed again by protests and legal proceedings from the only other remaining bidder, IBM.

IBM was and is a big government contractor. AWS at the time was not so much. The odds were that IBM would take this candy away from baby AWS. At first, that’s what certainly seemed to play out when the GAO – first stop in the protest process – declared for IBM. But AWS did not take it lying down and sued in federal court – the next step. To everyone’s surprise, the judge not only gave AWS the contract award but also slammed IBM for some sketchy proposal tactics. This was an industry moment of truth. Mighty Big Blue was not only defeated but all agreed the Amazon solution was better and IBM had tried to cheat to beat it!

AWS got the green light to start work in late 2013 and by early 2015 – less than 18 months – it was up and operational. Then AWS took it a step further with the CIA’s blessing. In the commercial world, AWS operates the AWS Marketplace. The AWS Marketplace was launched in 2012 to accommodate and foster the growth of AWS services from third-party providers that have built their own solutions on top of the Amazon Web Services platform. It provides a one-stop shop to get all kinds of applications and services.

AWS said why not do the same thing for the intelligence community. It launched the IC Marketplace allowing spy agencies – led by the CIA – to evaluate and buy common software, developer tools and other products that meet stringent security standards. This really shakes up the usual Federal software procurement process and enables even more of the flexibility and agility that were the original goals. Once your offering has been vetted for the Marketplace any properly cleared shop can try and buy.

Pretty nifty, eh? When was the last time you thought of your IT as a godsend?

Originally published August 18th, 2016

By John Pientka

Artificial Intelligence In The Enterprise


Artificial Intelligence

Since the dawn of the computer age we have been enthralled by the prospect of Artificial Intelligence. It dominated the science fiction of the 1950s and 1960s, and it was a passion so strong that it bled into the fabric of day-to-day existence. Everyone wanted their own robot servant, everyone got a little giddy at the prospect of a walking, talking, thinking robot.

The Golden Age of Science Fiction did get a few things right about the future. But while the modern world is not too far removed from the imagined utopias of Arthur C. Clarke and H. G. Wells, we are lacking that one key feature: an intelligent, calculating, man-made brain.

At least, that’s what many of us believe. The truth is that AI is all around us. It powers many of the things we rely on and the companies that we love. Without it, we wouldn’t be able to use the internet or play computer games, and the manufacturing world would look decidedly poorer as well.

So, while we don’t quite have a robot in every home, our lives are still ruled by some form of AI. It may not be as advanced or as intelligent as those aforementioned sci-fi authors had hoped, but it’s more advanced than many of us realize.

And at least we don’t have to worry about that AI overthrowing humanity and taking over the world. Not yet, anyway.




By David Jester



Two-Factor Authentication

Two-factor authentication. Most of us think we know what it is, but a recent news event brought something alarming to my attention: even huge companies misunderstand what two-factor authentication means, and your personal information could end up at risk because of it.

Let’s start with this recent report: United Airlines changed its security protocols.

Originally, the account holder only needed a username and password. Years ago, that was enough. But, in today’s world, simply displaying a username and password is not sufficient for keeping you protected.

Changes have been made to the security structure of United Airlines’ accounts. Instead of just typing in your password and username, they have now integrated two additional security questions for you to answer. Sounds great, right? Well, maybe not so much.


Answering two security questions, in addition to your password and username, is nothing new. Many people do this on a daily basis. A couple of examples of these security questions include: “What elementary school did you attend?” and “What was the name of your first pet?”. While these do offer another layer of security, some are up in arms over how thin the security blanket really is with these types of questions. Not only that, but United Airlines is claiming these additional security questions are a form of “two-factor authentication.” In reality, they aren’t even close.

What Is Two-Factor Authentication?

Two-factor authentication is a much more robust and secure form of protection. At the heart of two-factor authentication is the mantra Jon Evans puts forth on TechCrunch (post cited above): “Something you know, something you have.” A third factor (“something you are”) may also be used in conjunction with the first two. “Something you know” is anything from a PIN to a password or even a pattern of some kind. Most people are used to putting in a PIN when they use their credit or debit card to buy gas, for example.

The second factor is “something you have.” This is a physical factor. Most of the time, the physical factor takes the form of a card. Sticking with the gas example, swiping your card at the reader acts as the physical factor. In other instances, you may be given a physical token for one-time use. Either way, a physical factor is in play.
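Authenticator apps and hardware tokens generate exactly this kind of “something you have” proof. As a rough illustration (a minimal sketch, not United’s or any vendor’s actual implementation), here is a time-based one-time password (TOTP) generator following RFC 6238, using only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Derive an RFC 6238 time-based one-time password from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() if at is None else at) // step   # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", time 59s, 8 digits
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # 94287082
```

The server and the device share the secret once; after that, producing the current code proves possession of the device, which a memorized password or security question never can.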

“Something you are” is a bit more advanced. If you’ve ever used a fingerprint scanner to get into work (along with a card), you’ve experienced two-factor authentication. Along with your fingerprint, your voice is also a form of “something you are.” Both forms are becoming ever more present in the digital world and are key to keeping you safe.

Now you know what two-factor authentication is. But how does it help you? And why is it important? Let’s take a look at the pros and cons of the two security questions versus two-factor authentication.

The Difference

A pro for the basic security questions is that it does offer an extra layer of protection to your account. However, that’s really where the pros end. And that’s not good.

A pro of the two-factor authentication method is simple: physicality. Hackers have an extremely difficult time breaking into an account protected by two-factor authentication. Why? Because they actually need your physical card (or token), a recording of your voice, or a copy of your fingerprint. Essentially, the hacker would need to meet you physically and steal your belongings to hack into your account.

Looking at the cons, the two-factor authentication method has very few. The cons of the basic security questions, however, are obvious. Not only can hackers break into your account remotely using these security questions, but the questions themselves pose a problem.

People tend to choose the first two questions to answer. They rarely look at the list and pick two meaningful questions. Thus, hackers have a much easier chance of guessing your answers.

It’s All Up to You Now

In the world of cybersecurity, two-factor authentication is the way to go. With a physical component attached, this method is a much safer choice for your business. If you have any accounts that do not use two-factor authentication, make sure you answer security questions that appear further down the list. According to a senior writer at CNET, the best way to create a safe and secure password is to pick four random words. This will give you a leg up on the criminals.

With the advent of two-factor authentication, hacking should become more difficult. Only time will tell.

By Kayla Matthews

Tweaking with Application Assessment Tools


Application Assessment

We have all seen the TV commercial where impossible situations are solved quickly by simply pressing a button marked “Easy.” For many organizations, the cloud presents a difficult transition. Over the past few years, as a consultant helping organizations consider cloud computing, I have developed a number of useful tools to help customers make the leap.

A number of the tools I’ve developed have nothing to do with technology; they are focused on the business reality of the organization rather than the technical reality. In fact, I have a tools-based process to help customers take a look at their environment and ultimately get to where they want to be. One of the tools I have been working with for the past five years is Application Assessment.

Many years ago I developed an application assessment process. That process was designed to map organization requirements to application capabilities in order to produce a view of what an organization really needed to migrate. There is another piece to that process that hasn’t been published until now: the concept of application improvement.


As you move applications from your on-premises data center to the cloud, the first thing you are told is to prepare, or “cloudify,” the application: that is, to enable the capabilities cloud applications have that traditional on-premises applications normally don’t (unless it is a cloud app running on premises for whatever reason). This tool involves starting with questions for IT and the business about the application.

Spinning of Wheels

The concept and the process are holistic. The goal is to see your application end to end. As you consider your application, you can consider its components, and it is possible that you can speed up any one component. Take a motorcycle as an example. As an “improvement,” you could speed up the front wheel so that it spins faster than the rear wheel. However, the result would be that the governor for the rear wheel either overheats and seizes, or burns out so that you can no longer apply the rear brake. In either case, you wouldn’t have sped up the motorcycle, and the process improvement would lead to additional repairs.

Speed up both wheels by reducing friction when the brakes aren’t applied and you will speed up that motorcycle. The holistic approach then takes a view of the application and everything it touches. The overall process has a number of tools that gather the data you will need for this particular tool. The goal of this tool is to evaluate specific applications and the impact of speeding up all, or part, of a specific application. The intent of this tool is not to gather data, but to drive the process of determining whether or not we can speed up a specific application. It asks four questions:

1. What are the components of the application overall?

2. What are the components that wait for other components in either assisting or building this application’s output?

3. Can we speed up the components that produce wait times in the application overall?

4. If we speed up the components that delay the application now, will the overall application speed up?

The last question is the most important of the four; the first three give us the possible answers and the last one gives us the final answer. Again, our goal is not to speed up the front wheel of our application motorcycle but to speed up the entire motorcycle.
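The four questions amount to asking whether a component sits on the application's critical path. A toy model (component names and timings invented purely for illustration) makes the motorcycle point concrete:

```python
# Model an application as components with durations and "waits-for" dependencies.
# End-to-end time is the longest dependency chain (the critical path), so speeding
# up a component off that path does not speed up the whole "motorcycle".

def finish_time(durations, deps, node, memo=None):
    """Earliest completion time of `node`, given it must wait for its dependencies."""
    memo = {} if memo is None else memo
    if node not in memo:
        memo[node] = durations[node] + max(
            (finish_time(durations, deps, d, memo) for d in deps.get(node, [])),
            default=0,
        )
    return memo[node]

durations = {"db": 40, "api": 20, "cache": 5, "ui": 10}
deps = {"api": ["db", "cache"], "ui": ["api"]}   # ui waits on api; api waits on db and cache

print(finish_time(durations, deps, "ui"))        # 70: the chain db -> api -> ui dominates

durations["cache"] = 1                           # speed up the "front wheel"
print(finish_time(durations, deps, "ui"))        # still 70: cache was never the bottleneck

durations["db"] = 20                             # speed up the component others wait on
print(finish_time(durations, deps, "ui"))        # 50: the whole application speeds up
```

Only the third change answers question 4 with a yes, which is exactly the distinction the tool is built to surface.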

Knowing the long-term goals of the organization and the overall capabilities of every application makes the transition easier. Good luck – and remember: don’t speed something up just because you can. Speed applications up because it makes your entire process faster. Nothing is worse than waiting for data. Data that waits for use is out of date.

By Scott Andersen

To Migrate or to Not Migrate: In-House vs. Outsourced Cloud Computing


In-House vs. Outsourced Cloud Computing

Through working as an executive in the managed DNS industry for over 15 years, I have become something of an expert in managing costs while maintaining high performance and uptime standards. With the recent push to the cloud, I have been urged to evaluate the cost efficiencies of operating an in-house versus a cloud-hosted network. But most importantly, I have been asked to discuss the performance benefits of each, and whether those benefits are worth the price tag.

I have managed networks ranging from a rack in my basement to a multi-million dollar network spanning 16 facilities on five different continents. When you manage large-scale networks, you have to learn many skills that venture far beyond network engineering. You have to learn how to pick the right hardware, strike deals with different providers, and of course do all of this while staying cost efficient.

Outsourced Cloud Computing


The recent push to migrate to cloud infrastructure has won over the majority of top online retailers. Some converts have gone so far as to move their entire on-premises systems to the cloud and are boasting significant performance improvements. Ecommerce giants, like Etsy, are using the cloud to host big data analytics that predict what customers will want to purchase next. Big data analytics require massive amounts of storage and bandwidth, better served by cloud-based solutions.

Organizations that deliver large content loads to international audiences, like Netflix, have moved to the cloud because their on-prem systems couldn’t scale quickly enough. Netflix announced that it had finally completed its seven-year migration to the cloud earlier this year. Big moves like Netflix’s are ideal for companies that need to expand at a rapid pace, because the cloud offers a flexible environment engineered for growth.

Costs of Moving to the Cloud

But usually, these decisions all come down to price. So I’m going to cut to the chase and show you a rough breakdown of how much it costs to move to the cloud, and how that compares to hosting an in-house system.

We host our network from 16 different facilities around the world. Over the years we have dealt with pretty much everything when it comes to hosting your own infrastructure. For this example, I’m going to use a rough average of what our infrastructure requires.

Let’s say that in our environment, eight servers would cost us roughly $25,000. That’s your total upfront cost to purchase and deploy your servers. Now you have to think about hosting, which can and will fluctuate. A typical month requiring a “good” amount of bandwidth would cost roughly $300 per server.

So that’s 8 servers x $300 = $2,400 a month

You also have to take into account the cost of lighting up each server, which requires staff hours and initial setup and administration fees. You will also need a place to store your servers, racks, routers, and switches. Once everything is set up, you will also have to pay for additional staff hours to maintain all of your equipment. All of this together will run you about $450 a month per server.

Now we have 8 servers x $450 cost of initial light-up and maintenance = $3,600 per month

Don’t forget to add the initial $25,000 for servers + $2,400 for hosting + $3,600 for lighting up = $31,000 for your first month. Each subsequent month will cost $6,000 to maintain your infrastructure.

On the other hand, let’s take a look at what it would cost to host the same number of servers and bandwidth on cloud infrastructure. It would cost roughly $450 per month for each system alone.

That would be 8 servers x $450 for each system = $3,600 per month

Like an in-house system, you also have to factor in the cost of bandwidth. Depending on your usage, this could range anywhere from $600 to $1000 a month.

That would run you 8 servers x $800 for average bandwidth = $6,400 per month

Add that $6,400 to your $3,600 per system = $10,000 per month

If you look at the cost of each service side by side: an in-house system costs substantially more up front ($31,000 in the first month), but less each following month ($6,000). Moving to the cloud avoids that up-front premium, but the monthly cost ($10,000) is about two-thirds higher than the cost of maintaining an in-house system. Furthermore, the monthly estimate for a cloud system doesn’t account for staff hours for maintenance; while this is on an as-needed basis, it could significantly increase monthly costs.
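Pulling the estimates above into one place, a quick sketch (using only the article's rough figures, not quotes from any provider) shows the cumulative costs and the month where the in-house system becomes the cheaper total:

```python
# Rough cost model from the figures above: 8 servers, in-house vs cloud.
UPFRONT_INHOUSE = 25_000          # purchase and deploy 8 servers
MONTHLY_INHOUSE = 2_400 + 3_600   # hosting + light-up/maintenance = $6,000
MONTHLY_CLOUD = 3_600 + 6_400     # per-system cost + average bandwidth = $10,000

def cumulative(months):
    """Total spend after `months` for each option."""
    in_house = UPFRONT_INHOUSE + MONTHLY_INHOUSE * months
    cloud = MONTHLY_CLOUD * months
    return in_house, cloud

print(cumulative(1))   # (31000, 10000): the article's first-month figures

# First month where total in-house spend drops below total cloud spend
month = next(m for m in range(1, 61) if cumulative(m)[0] < cumulative(m)[1])
print(month)           # 7
```

With these numbers, the cumulative cost curves cross within the first year, which is why the commitment horizon matters so much in the in-house versus cloud decision.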

It’s also important to remember that in-house solutions are not for everyone. These systems demand at least a two-year commitment to earn back the cost of resources. The up-front costs for an in-house system may sound a little daunting, but you will end up earning back your investment quicker than you would with a cloud-based system. You also don’t have to pay a monthly cost for capacity, since you are using existing infrastructure that you already paid for up front.

On the other hand, if you only require a short-term or temporary commitment, then cloud-based infrastructure is the best solution. If you have an app you need to test for a few weeks or months, then I would recommend looking at the cloud. The cloud is also a great way to see how your application or software would respond to different scalability requirements. You can also use the cloud if you are unsure what requirements you may need for an in-house system.


Startups tend to turn to cloud-based systems because they don’t require a large up-front premium and can scale quicker than an in-house system. Cloud-based infrastructure also requires fewer staff hours, costs less in energy, and offers more redundancy at a lower price.

However, the cloud should only be used for a finite amount of time. Most VCs don’t care about this and will continue to pump money into the cloud. But if you want to be sustainable for the long haul and achieve significantly greater ROI, then an in-house system is the best solution.

Some organizations continue to stray away from completely moving to the cloud, because they have already made a significant investment into their on-prem systems. It’s rare that companies are able to repurpose any of the equipment in their on-prem systems that they have spent years acquiring and maintaining.

For these organizations, either staying with an in-house system or using a hybrid system would be best. Hybrid infrastructure uses a balance of both cloud-based and in-house systems. This provides organizations with the elasticity of the cloud, while still allowing them to maintain their own infrastructure. One of the most efficient methods we have seen is a customer using an on-prem network but moving some traffic to the cloud during high-traffic periods, like Black Friday.


Whatever solution you choose for your organization, remember that you have to balance performance with cost. Even if you think one solution will help your business grow faster, you might see a lower ROI because the costs for that infrastructure are too high.

By Steven Job

What You Need To Know About Choosing A Cloud Service Provider


Selecting The Right Cloud Services Provider

How to find the right partner for cloud adoption on an enterprise scale

The cloud is capable of delivering many benefits, enabling greater collaboration, business agility, and speed to market. Cloud adoption in the enterprise has been growing fast. Worldwide spending on public cloud services will grow at a 19.4% compound annual growth rate, from $70 billion in 2015 to more than $141 billion in 2019, according to IDC.

“Over the past several years, the software industry has been shifting to a cloud-first (SaaS) development and deployment model. By 2018, most software vendors will have fully shifted to a SaaS/PaaS code base,” said Frank Gens, chief analyst at IDC.

But the boosts in efficiency and your bottom line that cloud adoption brings are not a foregone conclusion. In order to realize those benefits, it’s vital to find a reliable cloud services provider or integrator. The right partner can provide a platform that enables digital transformation and fosters innovation. As you begin your search, here are some key concepts that should be at the forefront of your mind.

Build trust and security

Security may not be the barrier to cloud adoption that it once was. Almost 65% of IT and security professionals surveyed by Skyhigh Networks agreed the cloud is either as secure as or more secure than on-premises software. However, the firm also found that the average organization experiences 19.6 cloud-related security incidents every month. Attitudes may be shifting, but security concerns still loom large for many companies.

It’s important to find a cloud services provider that you can really trust. Seek a partner with proven security expertise, a solid platform for data sovereignty, and an impeccable track record. Ensure that they understand security is an ongoing battle, and have a continually evolving long-term release plan in place to address potential security issues.

Don’t sacrifice flexibility

One of the main advantages of the cloud is the fast scalability and business agility it can provide, so it’s important not to get walled in. Hybrid capability is important, and you want to be able to transition quickly and easily when you see a potential advantage.

The services you adopt should allow you to leverage the public cloud and integrate partner services. You want something that supports the use of any public cloud, allowing for new service adoption down the line, but also leveraging essentials like Microsoft Azure. Consider how to handle peak demand and cater for customer preferences. It’s all about achieving the right balance to enable your business to grow and innovate.
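One common way to keep that flexibility is an abstraction layer: application code talks to a neutral interface, and concrete providers plug in behind it. A hedged sketch (the class names are illustrative, not real SDK clients):

```python
# Sketch of a provider-agnostic abstraction layer to avoid lock-in.
# InMemoryStore stands in for a real backend; an Azure- or AWS-backed
# class would implement the same interface.

from abc import ABC, abstractmethod

class ObjectStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; swapping providers then touches one line."""
    def __init__(self):
        self._blobs = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]

store: ObjectStore = InMemoryStore()  # later: an Azure- or S3-backed store
store.put("report.csv", b"q3,141")
print(store.get("report.csv"))  # b'q3,141'
```

The design choice here is deliberate: the cost of the extra interface is small, and it keeps the door open to new service adoption down the line.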

Standardization is good

For the sake of clarity and cost, standardized services are desirable. Consider the compatibility, safety, interoperability, repeatability, and quality that standardization can provide. A platform like Office 365 will deliver a consistent experience for all of your customers and employees, regardless of the platform or the device they’re using.

Pick and choose the right blend of standardized software and open source technologies to create bundles that deliver the features you need without sacrificing the flexibility that enables you to stay competitive. Customized solutions are expensive, inflexible, and they lock you in to your partner’s roadmap.

Global coverage and stability

You need to be able to deploy, manage and upgrade your software and applications easily. Look for reliability and a strong history of release stability to minimize disruption. You also want a partner with a good balance between compute workload and location. International coverage can boost performance significantly by delivering compute where the users actually are. A single, centralized location is a major bottleneck.
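The "deliver compute where the users are" idea can be reduced to a simple selection rule. This is illustrative only; real platforms do this with DNS- or anycast-based routing, and the region names and latency figures below are hypothetical:

```python
# Illustrative: pick the serving region with the lowest measured
# round-trip latency (ms) for a given user.

def nearest_region(latencies_ms: dict) -> str:
    return min(latencies_ms, key=latencies_ms.get)

# Hypothetical measurements for a user in Europe:
measured = {"us-east": 180.0, "eu-west": 25.0, "ap-south": 240.0}
print(nearest_region(measured))  # eu-west
```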

Set business objectives

Performance reports, resource monitoring and service level agreements are all important, but you need to set tangible business goals at the outset and put metrics in place to test the effectiveness of your cloud services. You should have a deep understanding of the business advantages you’re expecting to achieve, and so should your partner. Look beyond the technical statistics and ask what they can do for your business.
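One way to make service level agreements tangible is to translate the SLA percentage into a concrete downtime budget you can measure your provider against. A small sketch, assuming a 30-day month:

```python
# Convert an SLA percentage into a monthly downtime budget --
# a concrete metric for testing a provider against its promises.

def monthly_downtime_minutes(sla_percent: float) -> float:
    minutes_per_month = 30 * 24 * 60  # 43,200 min, assuming a 30-day month
    return minutes_per_month * (1 - sla_percent / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% -> {monthly_downtime_minutes(sla):.1f} min/month")
```

The difference between 99% and 99.9% is hours versus minutes of allowed downtime per month, which is exactly the kind of business-level number to negotiate, not just a technical statistic.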

A cloud services aggregator can take advantage of economies of scale to deliver services far more cheaply than you can ever manage internally, but finding the right partner is about more than cost. They need to be trustworthy, security-conscious, reliable and globally distributed. You need to retain the flexibility to adopt the emerging technologies that can drive innovation and creativity in your business.

By Nicholas Lee
