The Secrets to Achieving Cloud File Storage Performance Goals

Storage Performance with Cost Reduction

Gartner predicts that by 2025, 80 percent of enterprises will have shut down their traditional data centers; as of 2019, 10 percent had already shifted their data centers and storage to the cloud. Meanwhile, the volume of data generated is doubling roughly every two years. This exponential growth drives up the cost of storing and managing data in the cloud, trapping businesses in a cycle that requires exponentially more funds each year.

At a time when businesses are more reliant than ever on data performance while also scrutinizing costs more closely, how can IT professionals and departments deliver on both?

Addressing this question is arguably the biggest challenge in adopting the cloud. It isn’t the technology shift – it’s finding the right balance of cost vs. performance and availability that justifies the move to the cloud. Evaluating the many decisions that will play a role in the management of data in the cloud is daunting, but with an understanding of how services and features can reduce the amount of storage needed and improve data performance, enterprises can create a reliable and cost-effective cloud environment.

Leverage Features to Optimize Data Sets

When transitioning to the cloud, businesses expect to find the same tools and capabilities they had with on-premises solutions, but those capabilities are not always present in cloud-native storage offerings. When evaluating service providers, prioritize the following three features, as they can have a large impact on improving data performance and reducing cost.

The first feature is deduplication. Deduplication compares data block by block and determines which duplicate blocks to eliminate. By identifying and eliminating redundant copies, it can reduce data storage by 20 to 30 percent in most cases.
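To make the mechanism concrete, here is a minimal Python sketch of block-level deduplication. It is illustrative only; the fixed block size and in-memory index are simplifying assumptions, not how any particular vendor implements the feature.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; production systems vary

def deduplicate(data: bytes):
    """Split data into fixed-size blocks and store each unique block once."""
    store = {}   # content hash -> block bytes (the deduplicated store)
    recipe = []  # ordered hashes needed to reconstruct the original data
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:  # only previously unseen blocks consume storage
            store[digest] = block
        recipe.append(digest)
    return store, recipe

# A payload with heavy repetition: 100 copies of the same 4 KiB block
# are stored just once.
data = (b"A" * BLOCK_SIZE) * 100
store, recipe = deduplicate(data)
print(f"logical blocks: {len(recipe)}, stored blocks: {len(store)}")
```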

Next is compression. Compression reduces the number of bits needed to represent a file or piece of data. By running compression across all files, organizations can dramatically shrink the storage they require, cutting data storage costs by 50 to 75 percent depending on the types of files being stored.
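To see the effect directly, this small Python sketch compares raw and compressed sizes using zlib. The sample data is hypothetical; repetitive, text-like data compresses well, while already-compressed media barely shrinks.

```python
import zlib

# Repetitive, text-like data (e.g., logs) is a best case for compression.
text = b"timestamp=2021-01-01 level=INFO msg=request served\n" * 1000

compressed = zlib.compress(text, level=6)
savings = 100 * (1 - len(compressed) / len(text))
print(f"raw: {len(text)} bytes, compressed: {len(compressed)} bytes "
      f"({savings:.0f}% savings)")
```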

Finally, data tiering is a must. It may surprise people that roughly 80 percent of an organization's data has not been accessed in the past 90 days. Infrequent access doesn't mean the data should be eliminated or has no value. Data tiering manages aging data without discarding it, moving data sets from more expensive, higher-performance storage to less expensive storage. Not all data needs to be high performance, so don't pay for unnecessary levels of performance.
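A tiering policy can be as simple as checking last-access times. The Python sketch below is a hypothetical illustration: the 90-day threshold, the tier names, and the move_to_tier placeholder are assumptions standing in for whatever mechanism a given platform provides.

```python
import os
import time

HOT_TIER_DAYS = 90  # assumed cutoff: data untouched this long goes cold

def choose_tier(path: str) -> str:
    """Pick a storage tier for a file based on its last access time."""
    age_days = (time.time() - os.path.getatime(path)) / 86400
    return "hot" if age_days <= HOT_TIER_DAYS else "cold"

def move_to_tier(path: str, tier: str):
    # Placeholder: in practice this would call the storage platform's API,
    # e.g., changing an object's storage class or copying to an archive bucket.
    print(f"would move {path} -> {tier} tier")

def retier(directory: str):
    """Demote any file that has aged out of the hot tier."""
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and choose_tier(path) == "cold":
            move_to_tier(path, "cold")
```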

Through effective use of these three features, deduplication, compression, and data tiering, organizations can reduce data storage costs by up to 80 percent, all while ensuring better data performance.
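Note that these savings compound rather than add. As a quick sanity check, the figures below are illustrative midpoints from the ranges above, not guarantees:

```python
dedup = 0.25        # ~20-30% reduction from deduplication
compression = 0.60  # ~50-75% reduction from compression

remaining = (1 - dedup) * (1 - compression)  # fraction of original bytes kept
print(f"combined savings: {100 * (1 - remaining):.0f}%")  # -> 70%
# Tiering the rarely accessed remainder onto cheaper storage is what
# pushes the effective cost savings toward 80 percent.
```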

Evaluate Data Performance Needs

In addition to understanding how to optimize data, IT teams need to evaluate the level of performance each data set requires. Businesses rely on data to manage daily operations, revenue growth, and customer satisfaction and retention. With mission-critical, revenue-generating applications housed in the cloud, data performance and reliability are a must, and unplanned downtime can be catastrophic.

Organizations expect their data to perform consistently, even under heavy loads, unpredictable interference from noisy cloud neighbors, occasional cloud hardware failures, sporadic cloud network glitches, and other anomalies that come with the territory of large-scale datacenter operations.

To meet customer and business service-level agreements (SLAs), cloud-based workloads must be carefully designed, and the core of these designs is how data will be managed. Selecting the right file service component is one of the most critical decisions a cloud architect must make.

But ensuring data performance in the cloud is often easier said than done once businesses no longer control the IT infrastructure the way they did with on-premises systems.

So how can enterprises negotiate competing objectives around cost, performance, and availability when they are no longer in control of the hardware or virtualization layers in their datacenter? And how can these variables be controlled and adapted over time to keep things in balance? In a word: control. Correctly choosing where to give up control and where to maintain control over key aspects of the infrastructure stack supporting each workload is critical to finding balance between cost and performance.

There are service providers that will enable businesses to maintain control over their data, similar to when they managed file storage and applications in their own datacenter. Instead of turning control over to the cloud vendors, seek providers that enable their customers to maintain control over the file storage infrastructure. This gives the businesses storing data the flexibility to keep costs and performance in balance over time.

More Storage is Not the Answer

One allure of the cloud is that it’s (supposedly) going to simplify everything into easily managed services, eliminating the worry about IT infrastructure costs and maintenance. For non-critical use cases, managed services can, in fact, be a great solution. But what about when IT teams need to control costs, performance and availability?

Unfortunately, managed services must be designed and delivered for the “masses,” which means tradeoffs and compromises must be made. And to make these managed services profitable, significant margins must be built into the pricing models to ensure the cloud provider can grow and maintain its services.

In the case of public cloud shared file services, such as AWS® Elastic File System (EFS) and Azure NetApp® Files (ANF), performance throttling is required to prevent thousands of customer tenants from overrunning the limited resources that are actually available. To get more performance, businesses must purchase and maintain more storage capacity, even if they don’t need additional storage. As storage capacity inevitably grows, so do the costs. And to make matters worse, much of that data is inactive most of the time, so businesses pay for data storage every month that is rarely, if ever, even accessed. And cloud vendors have no incentive to help businesses reduce these excessive storage costs, which just keep going up as data continues to grow each day.
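A rough back-of-the-envelope calculation shows how this coupling plays out. The rate below is an assumption for the sketch, loosely modeled on capacity-scaled throughput schemes like EFS bursting mode; check current provider documentation for real numbers.

```python
# Assumed illustrative rate: baseline throughput that scales with capacity.
MIB_PER_SEC_PER_TIB = 50

def capacity_needed_tib(target_mib_per_sec: float) -> float:
    """TiB of storage you must pay for to reach a target baseline throughput."""
    return target_mib_per_sec / MIB_PER_SEC_PER_TIB

# Hitting 500 MiB/s of baseline throughput means provisioning 10 TiB of
# capacity, whether or not you have 10 TiB of data worth storing.
print(f"{capacity_needed_tib(500):.0f} TiB required for 500 MiB/s")
```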

This scenario has played out countless times, but a critical distinction needs to be made: businesses should not have to buy more storage to get better performance. Capacity and performance are not one and the same. Scrutinize costs, seek opportunities to optimize data, and pay for performance, not storage.

Cloud migration projects can be complex and daunting, leading IT teams to take shortcuts to get everything up and running. Teams commonly choose cloud file services as an easy first step without considering important elements of the process and how those elements can dramatically affect the performance and reliability of their data, as well as the overall cost of the migration. With a clear understanding of how to optimize data sets, what performance each data set actually requires, and why additional storage isn't the answer, businesses can reduce costs significantly. Data generation and the demand for consistent data performance continue to grow exponentially, but IT budgets don't have to grow at the same rate.

By Rick Braddy
