On Rob Kaufmann’s Thesis: NAS vs. Cloud Part 1

A few weeks back, KPI Analytics employee Rob Kaufmann suggested that network attached storage (NAS) solutions for data serving could, in many cases, offer greater advantages than Cloud services. Specifically, he argued that NAS would be preferable for those serving massive files or very sensitive data. This was debated at a press conference on July 17th.

I’m here to suggest that although Kaufmann makes some interesting points, his NAS route isn’t as promising as Cloud – even given the nascent form of the latter. On many of the counts Kaufmann cites as reasons to adopt NAS in preference to Cloud, technologies are already mobilising to close the gap. That said, NAS certainly has its place in smaller-scale set-ups. Let’s look at a few different arrangements.

Kaufmann’s principal argument is ‘the bandwidth problem’. A recent Cisco research study into data creation predicted a global data production rate of 1.3 trillion gigabytes by 2016 – three times the rate in 2009. Couple this with another finding – that the number of web-connected devices was likely to rise from around 2.2 billion in 2011 to 5.1 billion by 2016 – and ‘the bandwidth problem’ does look like a genuine threat to an internet-connection-heavy Cloud service. Our current infrastructure can barely keep up with this rate of expansion – even in highly developed countries, the gap between peak and off-peak broadband capacity is considerable – and so, Kaufmann suggests, Cloud is not a service to be putting all your faith in. Network attached storage, by contrast, being connected locally (and in server clusters and so on), never suffers from this problem: the bandwidth of your network is under your own control.
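Just to put those figures in perspective, here’s a quick back-of-envelope sketch using only the numbers above (and reading the 1.3-trillion-gigabyte figure as an annual rate); the per-device averages are my own rough arithmetic, not Cisco’s.

```python
# Back-of-envelope arithmetic from the Cisco figures quoted above.
# The per-device averages are rough illustrations, not figures from the study.

annual_data_2016_gb = 1.3e12                    # 1.3 trillion GB, read as a yearly rate
annual_data_2009_gb = annual_data_2016_gb / 3   # roughly a third of the 2016 rate
devices_2016 = 5.1e9                            # projected web-connected devices, 2016

per_device_per_year_gb = annual_data_2016_gb / devices_2016
per_device_per_month_gb = per_device_per_year_gb / 12

print(f"Implied 2009 rate: {annual_data_2009_gb:.2e} GB/year")
print(f"Average per device in 2016: {per_device_per_year_gb:.0f} GB/year "
      f"({per_device_per_month_gb:.0f} GB/month)")
```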

There are a few things to say about this claim. Firstly, Kaufmann’s argument seems to overlook the recent successes of server- and client-side virtualisation technologies. These aren’t technologies just starting out – they’re tried, tested and, in some enterprises, well established. By clustering remote servers across multiple geographical areas, most bandwidth issues can be avoided. In fact, most of the bandwidth is consumed by the synchronisation protocols that keep the virtualised servers coherent; the amount the user actually draws is fairly small by comparison.
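To see the shape of that argument, here’s a purely illustrative toy model – every number in it is an assumption I’ve picked for the example, not a measurement – showing how synchronisation traffic can dominate each node’s link while the user-facing share, spread across a geo-distributed cluster, stays small.

```python
# Illustrative-only model of bandwidth on a geo-distributed virtualised cluster.
# All figures below are assumptions chosen for the example, not measurements.

nodes = 4                        # servers spread across regions
link_capacity_mbps = 1000        # capacity of each node's uplink
sync_traffic_mbps = 600          # replication/coherence traffic per node
user_traffic_total_mbps = 400    # aggregate user-facing traffic for the service

user_per_node_mbps = user_traffic_total_mbps / nodes
sync_share = sync_traffic_mbps / (sync_traffic_mbps + user_per_node_mbps)
utilisation = (sync_traffic_mbps + user_per_node_mbps) / link_capacity_mbps

print(f"User traffic per node: {user_per_node_mbps:.0f} Mbps")
print(f"Share of node bandwidth spent on synchronisation: {sync_share:.0%}")
print(f"Per-node link utilisation: {utilisation:.0%}")
```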

In a smaller, more focused enterprise, NAS can make bandwidth concerns a non-issue: with the drives sitting locally, speed isn’t painfully limited by an external network (such as the wider internet).
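If you want to see that difference on your own kit, a rough sketch like the one below times a read from a locally mounted NAS share against a download over the internet; the path and URL are placeholders to swap for your own.

```python
# Rough timing sketch: local NAS read vs. download over the internet.
# '/mnt/nas/sample.bin' and the URL are placeholders for your own share and endpoint.
import time
import urllib.request

def time_local_read(path, chunk=1024 * 1024):
    start, total = time.perf_counter(), 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    return total, time.perf_counter() - start

def time_remote_read(url, chunk=1024 * 1024):
    start, total = time.perf_counter(), 0
    with urllib.request.urlopen(url) as resp:
        while True:
            data = resp.read(chunk)
            if not data:
                break
            total += len(data)
    return total, time.perf_counter() - start

for label, (size, secs) in {
    "NAS share": time_local_read("/mnt/nas/sample.bin"),
    "Cloud endpoint": time_remote_read("https://example.com/sample.bin"),
}.items():
    print(f"{label}: {size / secs / 1e6:.1f} MB/s")
```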

Kaufmann then cites another reason to adopt NAS in place of Cloud: full control of redundancy, security and backup procedures. But we are living in an age where these things should be automated. It’s considerably better to ‘set and forget’ automated backup procedures than to oversee them manually. There’s less risk. There’s less data exposure. And it’s very, very unlikely to fail (especially if we’re talking sizeable virtualisation, where the failure of a single node has little impact on data integrity). Am I just hoping we’ll ‘leave it all to the machines’? Isn’t that a bit Terminator 3? No! Of course it isn’t! ‘Leaving it to the machines’ is surely the appropriate thing to do in so highly automated an industry. We rely on automated routines for virtually every user interface we interact with in business. Computers are light years beyond the days when providing a stable GUI took all their attention – they can handle little chores like backup routines (in fact, that’s exactly what they’re good at).
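For a flavour of what ‘set and forget’ can look like at small scale, here’s a minimal sketch of an incremental copy to a NAS mount; the source and destination paths are assumptions, and in practice you’d hand the script to cron or Task Scheduler rather than run it by hand.

```python
# Minimal 'set and forget' backup sketch: copy new or changed files to a NAS mount.
# SOURCE and DEST are placeholder paths; schedule the script with cron/Task Scheduler.
import shutil
from pathlib import Path

SOURCE = Path("/home/team/projects")       # data to protect (placeholder)
DEST = Path("/mnt/nas/backups/projects")   # NAS-mounted target (placeholder)

def backup(source: Path, dest: Path) -> int:
    copied = 0
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        target = dest / src_file.relative_to(source)
        # Copy only files that are missing or newer than the existing backup copy.
        if not target.exists() or src_file.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, target)
            copied += 1
    return copied

if __name__ == "__main__":
    print(f"Backed up {backup(SOURCE, DEST)} file(s)")
```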

Again, this isn’t a wholesale argument – enterprises may still see benefits from adopting NAS, especially if they haven’t opted for high bandwidth from their internet provider. The same goes for backups: where the option exists, a local NAS backup is handy to have as a secondary copy, and NAS itself makes a trusty primary storage location.

In the next article I’ll take a look at two additional points that Kaufmann makes, and explain why both need more substance to be convincing, given the services some providers already offer.

By Joanna Stevenson

Joanna studied mechanical engineering in London and currently works for an energy research and consulting firm. She enjoys writing tech and business articles in her free time, and aspires to be an intrepid tech and gaming enthusiast with the exploratory spirit and witty prose of her favourite author, Robert Louis Stevenson – Treasure Island for the tech world.
