The Secrets to Achieving Cloud File Storage Performance Goals

Storage Performance with Cost Reduction

By 2025, according to Gartner, 80 percent of enterprises will shut down their traditional data centers; as of 2019, 10 percent had already shifted their data centers and storage to the cloud. Meanwhile, the volume of data generated is doubling roughly every two years. This exponential growth drives up the cost of storing and managing data in the cloud, trapping businesses in a cycle that demands exponentially more funding each year.

At a time when businesses are more reliant than ever on data performance while also increasingly scrutinizing costs, how can IT professionals and departments deliver on both?

Addressing this question is arguably the biggest challenge in adopting the cloud. It isn’t the technology shift – it’s finding the right balance of cost vs. performance and availability that justifies the move to the cloud. Evaluating the many decisions that will play a role in the management of data in the cloud is daunting, but with an understanding of how services and features can reduce the amount of storage needed and improve data performance, enterprises can create a reliable and cost-effective cloud environment.

Leverage Features to Optimize Data Sets

When transitioning to the cloud, businesses expect to see some of the same tools and capabilities available to them via on-premises solutions, but those capabilities are not always present in cloud-native storage solutions. When evaluating service providers, prioritize the following three features, as each can have a large impact on improving data performance and reducing cost.

The first feature is deduplication. Deduplication compares data block by block and determines which duplicate blocks to eliminate, so repeated content is stored only once. In most cases, it can reduce data storage by 20 to 30 percent.
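As a minimal sketch of the idea (not any vendor's implementation; the block size and data structures here are illustrative assumptions), the following Python snippet fingerprints fixed-size blocks with a hash so that identical blocks are stored only once:

    import hashlib

    BLOCK_SIZE = 4096  # illustrative block size; real systems vary

    def dedupe(paths):
        """Store each unique block once, keyed by its SHA-256 fingerprint."""
        store = {}    # fingerprint -> the single stored copy of that block
        recipes = {}  # file path -> ordered fingerprints to rebuild the file
        for path in paths:
            fingerprints = []
            with open(path, "rb") as f:
                while block := f.read(BLOCK_SIZE):
                    digest = hashlib.sha256(block).hexdigest()
                    store.setdefault(digest, block)  # duplicates stored only once
                    fingerprints.append(digest)
            recipes[path] = fingerprints
        return store, recipes

Files that share blocks, such as many copies of the same document, then consume the space of one copy plus a small per-file recipe.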

Next is compression. Compression reduces the number of bits needed to represent a file or piece of data. By running compression across all files, the amount of storage required can be dramatically reduced, cutting the cost to store data by 50 to 75 percent, depending on the types of files being stored.
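The savings depend heavily on how repetitive the data is, as this self-contained Python example shows (it uses the standard zlib library; the sample data is made up for illustration):

    import zlib

    def compression_ratio(data: bytes, level: int = 6) -> float:
        """Return the compressed size as a fraction of the original size."""
        return len(zlib.compress(data, level)) / len(data)

    # Repetitive data (logs, text, databases) compresses far better than
    # already-compressed formats such as JPEG images or video.
    logs = b"2019-06-01 INFO request served in 12ms\n" * 10_000
    print(f"log-like data compresses to {compression_ratio(logs):.0%} of original size")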

Finally, data tiering is a must. It may surprise people that roughly 80 percent of an organization's data has not been accessed in the past 90 days. While that data may not be frequently accessed, it doesn't mean it should be eliminated or has no value. Data tiering manages aging data without eliminating it, moving data sets from more expensive, higher-performance storage to less expensive storage. Not all data needs to be high performance, so don't pay for unnecessary levels of performance.
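As a rough sketch of the policy (real tiering engines operate inside the storage layer, often at the block level; the directory paths and 90-day threshold below are illustrative assumptions), cold files can be identified by last access time and moved to cheaper storage:

    import os
    import shutil
    import time

    HOT_TIER = "/mnt/fast"    # hypothetical high-performance tier
    COLD_TIER = "/mnt/cheap"  # hypothetical low-cost tier
    MAX_AGE = 90 * 24 * 3600  # 90 days, in seconds

    def tier_cold_files() -> None:
        """Move files not accessed in 90 days to the cheaper tier (kept, not deleted)."""
        now = time.time()
        for name in os.listdir(HOT_TIER):
            path = os.path.join(HOT_TIER, name)
            if os.path.isfile(path) and now - os.stat(path).st_atime > MAX_AGE:
                shutil.move(path, os.path.join(COLD_TIER, name))

Note that many filesystems are mounted with access-time tracking disabled, which is one reason production tiering relies on richer access metadata than this sketch does.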

Through effective use of these three features – deduplication, compression, and data tiering – organizations can reduce data storage costs by up to 80 percent, all while ensuring better data performance.
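To see how those savings compound, take illustrative mid-range figures rather than guarantees: a 30 percent reduction from deduplication followed by a 70 percent reduction from compression leaves 0.70 × 0.30 ≈ 0.21 of the original footprint, roughly an 80 percent reduction, before tiering moves any of the remaining inactive data to cheaper storage.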

Evaluate Data Performance Needs

In addition to an understanding of how to optimize data, IT teams need to evaluate the level of performance needed for each data set. Businesses rely on data to manage daily operations, revenue growth, and customer satisfaction and retention. With mission-critical, revenue-generating applications housed in the cloud, data performance and reliability are a must, and unplanned downtime can be catastrophic.

Organizations expect their data to perform consistently, even under heavy loads, unpredictable interference from noisy cloud neighbors, occasional cloud hardware failures, sporadic cloud network glitches, and other anomalies that just come with the territory of large-scale datacenter operations.

To meet customer and business service-level agreements (SLAs), cloud-based workloads must be carefully designed, and at the core of these designs is how data will be managed. Selecting the right file service component is one of the most critical decisions a cloud architect must make.

But ensuring data performance in the cloud is often easier said than done, because businesses no longer control the IT infrastructure the way they did with on-premises systems.

So how can enterprises negotiate competing objectives around cost, performance, and availability when they are no longer in control of the hardware or virtualization layers in their datacenter? And how can these variables be adjusted over time to keep things in balance? In a word: control. Choosing correctly where to give up control and where to maintain it over key aspects of the infrastructure stack supporting each workload is critical to striking the right balance between cost and performance.

There are service providers that enable businesses to maintain control over their data, much as they did when they managed file storage and applications in their own datacenter. Instead of turning control over to the cloud vendors, seek providers that let customers retain control over the file storage infrastructure. This gives businesses the flexibility to keep costs and performance in balance over time.

More Storage is Not the Answer

One allure of the cloud is that it’s (supposedly) going to simplify everything into easily managed services, eliminating the worry about IT infrastructure costs and maintenance. For non-critical use cases, managed services can, in fact, be a great solution. But what about when IT teams need to control costs, performance and availability?

Unfortunately, managed services must be designed and delivered for the “masses,” which means tradeoffs and compromises must be made. And to make these managed services profitable, significant margins must be built into the pricing models to ensure the cloud provider can grow and maintain its services.

In the case of public cloud shared file services, such as Amazon® Elastic File System (EFS) and Azure NetApp® Files (ANF), performance throttling is required to prevent thousands of customer tenants from overrunning the limited resources actually available. To get more performance, businesses must purchase and maintain more storage capacity, even if they don't need the additional storage. As storage capacity inevitably grows, so do the costs. To make matters worse, much of that data is inactive most of the time, so businesses pay every month for storage that is rarely, if ever, accessed. And cloud vendors have no incentive to help businesses reduce these excessive storage costs, which keep climbing as data grows each day.
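As a rough, illustrative calculation (the figures approximate EFS's published bursting-mode behavior and may have changed, so check current documentation): if baseline throughput scales at about 50 MiB/s per TiB of stored data, a workload that needs a sustained 100 MiB/s but stores only 500 GiB would have to hold roughly 2 TiB – four times the capacity it actually uses – purely to buy the throughput, and pay for that extra capacity every month.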

This scenario has occurred countless times, but there is a critical distinction to be made: businesses should not need to buy more storage to get better performance. Storage capacity and performance are not one and the same. It's important to scrutinize costs, seek opportunities to optimize data, and pay for performance, not storage.

Cloud migration projects can be complex and daunting, leading IT teams to take shortcuts to get everything up and running. These teams commonly choose cloud file services as an easy first step to a migration without considering important elements of the process and how dramatically they can impact the performance and reliability of their data, as well as the overall cost of the migration. With a clear understanding of how to optimize data sets, of the level of performance each data set actually requires, and of the fact that additional storage isn't the answer, businesses can reduce costs significantly. Data generation and the demand for consistent data performance continue to grow exponentially, but IT budgets don't have to grow at the same rate.

By Rick Braddy
