July 15, 2020

The Secrets to Achieving Cloud File Storage Performance Goals

By Rick Braddy

Storage Performance with Cost Reduction

By 2025, according to Gartner, 80 percent of enterprises will have shut down their traditional data centers; as of 2019, 10 percent had already shifted their data centers and storage to the cloud. Additionally, the volume of data generated is doubling roughly every two years. This exponential growth drives up the cost of storing and managing data in the cloud, trapping businesses in a cycle that requires more funding every year.

At a time when businesses are more reliant on data performance while also increasingly scrutinizing costs, how can IT professionals and departments deliver on both cost and performance?

Addressing this question is arguably the biggest challenge in adopting the cloud. It isn’t the technology shift – it’s finding the right balance of cost vs. performance and availability that justifies the move to the cloud. Evaluating the many decisions that will play a role in the management of data in the cloud is daunting, but with an understanding of how services and features can reduce the amount of storage needed and improve data performance, enterprises can create a reliable and cost-effective cloud environment.

Leverage Features to Optimize Data Sets

When transitioning to the cloud, businesses expect to see some of the same tools and capabilities they had with on-premises solutions, but those capabilities are not always present in cloud-native storage offerings. When evaluating service providers, prioritize the following three features, as they can have a large impact on improving data performance and reducing cost.

The first feature is deduplication. Deduplication compares data block by block, identifies repetitive blocks and files, and eliminates the redundant copies. In most cases, it can reduce data storage by 20 to 30 percent.
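
As a rough illustration of the idea (not how any particular vendor implements it), the Python sketch below splits files into fixed-size blocks, hashes each block, and stores only one copy per unique hash; the block size and file paths are hypothetical.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size for this illustration


def dedupe(paths):
    """Toy block-level deduplication: keep one copy of each unique block, keyed by its hash."""
    store = {}  # SHA-256 digest -> block bytes
    total = 0
    for path in paths:
        with open(path, "rb") as f:
            while block := f.read(BLOCK_SIZE):
                total += 1
                store.setdefault(hashlib.sha256(block).hexdigest(), block)
    if total:
        print(f"{total} blocks scanned, {len(store)} stored "
              f"({1 - len(store) / total:.0%} reduction)")
    return store
```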

Next is compression. Compression reduces the number of bits needed to represent a file or piece of data. Running compression across all files can dramatically reduce the amount of storage required, cutting the cost to store data by 50 to 75 percent, depending on the types of files being stored.
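
The effect is easy to demonstrate with a standard codec. The sketch below uses Python's built-in gzip module on a hypothetical file; the 50 to 75 percent figure should be read as typical for text-heavy data rather than a guarantee.

```python
import gzip
import os


def compression_savings(path):
    """Compress a copy of the file with gzip and return the fraction of space saved."""
    with open(path, "rb") as src, gzip.open(path + ".gz", "wb") as dst:
        dst.write(src.read())
    return 1 - os.path.getsize(path + ".gz") / os.path.getsize(path)


# Hypothetical usage:
# print(f"saved {compression_savings('app.log'):.0%}")
```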

Finally, data tiering is a must. It may surprise people that roughly 80 percent of an organization's data has not been accessed in the past 90 days. Infrequent access doesn't mean that data should be eliminated or has no value, though. Data tiering manages aging data without eliminating it, moving data sets from more expensive, higher-performance storage to less expensive storage. Not all data needs to be high performance, so don't pay for unnecessary levels of it.
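
In its simplest form, tiering is just a policy applied to access times. The minimal sketch below uses hypothetical mount points and a local move in place of a real storage API, but it captures the logic: anything untouched for 90 days migrates to a cheaper tier.

```python
import os
import shutil
import time

HOT_TIER = "/mnt/hot"        # hypothetical fast, expensive tier
COLD_TIER = "/mnt/cold"      # hypothetical cheaper, slower tier
AGE_LIMIT = 90 * 24 * 3600   # "not accessed in 90 days", in seconds


def tier_cold_files():
    """Move any file untouched for 90 days from the hot tier to the cold tier."""
    now = time.time()
    for name in os.listdir(HOT_TIER):
        path = os.path.join(HOT_TIER, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > AGE_LIMIT:
            shutil.move(path, os.path.join(COLD_TIER, name))
```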

Through effective use of these three features (deduplication, compression, and data tiering), organizations can reduce data storage costs by up to 80 percent, all while ensuring better data performance.
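
As a back-of-the-envelope check on how those savings compound (every percentage and unit price below is an assumption for the example, not a vendor figure):

```python
# Illustrative arithmetic only; all rates and prices are assumptions.
raw_tb = 100                                 # starting data set, in TB
after_dedupe = raw_tb * (1 - 0.20)           # 20% removed by deduplication
after_compress = after_dedupe * (1 - 0.50)   # 50% further saved by compression
hot_tb = after_compress * 0.20               # 20% of what remains stays on the fast tier
cold_tb = after_compress * 0.80              # the inactive 80% moves to cheaper storage

monthly_cost = hot_tb * 0.30 + cold_tb * 0.10   # assumed relative prices per TB-month
baseline_cost = raw_tb * 0.30                   # everything left untouched on the fast tier
print(f"{1 - monthly_cost / baseline_cost:.0%} lower monthly storage cost")  # ~81%
```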

Evaluate Data Performance Needs

In addition to understanding how to optimize data, IT teams need to evaluate the level of performance each data set actually requires. Businesses rely on data to manage daily operations, revenue growth, and customer satisfaction and retention. With mission-critical, revenue-generating applications housed in the cloud, data performance and reliability are a must, and unplanned downtime can be catastrophic.

Organizations expect their data to perform consistently, even under heavy loads, unpredictable interference from noisy cloud neighbors, occasional cloud hardware failures, sporadic cloud network glitches, and other anomalies that come with the territory of large-scale datacenter operations.

To meet customer and business service level agreements (SLAs), cloud-based workloads must be carefully designed, and at the core of these designs is how data will be managed. Selecting the right file service component is one of the most critical decisions a cloud architect must make.

But ensuring data performance in the cloud is often easier said than done when businesses no longer control the underlying IT infrastructure the way they did with on-premises systems.

So how can enterprises negotiate competing objectives around cost, performance, and availability when they are no longer in control of the hardware or virtualization layers in their datacenter? And how can these variables be controlled and adapted over time to keep things in balance? In a word: control. Correctly choosing where to give up control and where to maintain control over key aspects of the infrastructure stack supporting each workload is critical to finding balance between cost and performance.

There are service providers that will enable businesses to maintain control over their data, similar to when they managed file storage and applications in their own datacenter. Instead of turning control over to the cloud vendors, seek providers that enable their customers to maintain control over the file storage infrastructure. This gives the businesses storing data the flexibility to keep costs and performance in balance over time.

More Storage is Not the Answer

One allure of the cloud is that it’s (supposedly) going to simplify everything into easily managed services, eliminating the worry about IT infrastructure costs and maintenance. For non-critical use cases, managed services can, in fact, be a great solution. But what about when IT teams need to control costs, performance and availability?

Unfortunately, managed services must be designed and delivered for the “masses,” which means tradeoffs and compromises must be made. And to make these managed services profitable, significant margins must be built into the pricing models to ensure the cloud provider can grow and maintain its services.

In the case of public cloud shared file services, such as AWS® Elastic File System (EFS) and Azure NetApp® Files (ANF), performance throttling is required to prevent thousands of customer tenants from overrunning the limited resources that are actually available. To get more performance, businesses must purchase and maintain more storage capacity, even if they don’t need additional storage. As storage capacity inevitably grows, so do the costs. And to make matters worse, much of that data is inactive most of the time, so businesses pay for data storage every month that is rarely, if ever, even accessed. And cloud vendors have no incentive to help businesses reduce these excessive storage costs, which just keep going up as data continues to grow each day.
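
As a simplified illustration of how capacity-tied throughput plays out (the scaling rate below is an assumption chosen for the example, not a quote of any provider's current limits or pricing):

```python
# Hypothetical capacity-tied throughput model; the rate is assumed, not a vendor quote.
RATE_MB_S_PER_TB = 50      # assumed baseline throughput granted per TB stored
stored_tb = 2              # data the business actually needs to keep
target_mb_s = 500          # throughput the workload actually needs

baseline_mb_s = stored_tb * RATE_MB_S_PER_TB    # 100 MB/s earned by the real data
required_tb = target_mb_s / RATE_MB_S_PER_TB    # 10 TB needed to unlock 500 MB/s
padding_tb = required_tb - stored_tb            # 8 TB bought purely for throughput
print(f"{padding_tb:.0f} TB of unused capacity purchased to reach {target_mb_s} MB/s")
```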

This scenario has played out countless times, but a critical distinction needs to be made: businesses should not have to buy more storage to get better performance. Capacity and performance are not one and the same. It's important to scrutinize costs, seek opportunities to optimize data, and pay for performance, not storage.

Cloud migration projects can be complex and daunting, leading IT teams to take shortcuts to get everything up and running. These teams commonly choose cloud file services as an easy first step to a migration without considering important elements of the process and how they can dramatically impact the performance and reliability of their data, as well as the overall cost of the migration. With a clear understanding of how to effectively optimize data sets, the performance required for varying data sets, and the fact that additional storage isn't always the answer, businesses can reduce costs significantly. Data generation and the demand for consistent data performance continue to grow exponentially, but IT budgets don't have to grow at the same rate.


Rick Braddy, Founder and CTO of Buurst

Rick Braddy is an IT and software industry leader with more than 40 years of experience and a track record of success as an IT executive. Rick co-founded Buurst, a leading enterprise-class data performance company formerly known as SoftNAS, in 2012 and currently serves as the CTO. He led the development of the original product, which directly contributed to the formation of the Cloud NAS market in 2014.

Rick has led the company through various phases of growth and established it as the #1 Cloud NAS on AWS and Microsoft Azure for the most mission-critical, high-performance applications customers move to the public cloud today.