Allocating Cloud Costs
Enterprises often use tags to organize and optimize cost allocation. Although AWS services such as EC2, EBS, RDS, and S3 support tagging, many others do not, making it difficult to allocate their costs. We covered the basics of cloud resource tagging in an earlier post. Now we'd like to examine a few comprehensive cost allocation scenarios involving enterprise-grade clouds and thousands of Amazon EC2 instances.
Proportional and Static Business Rules For Cost Allocation
Untaggable cloud resources can be allocated in the same proportions as a taggable resource type, such as EC2 or RDS. For example, data transfer or AWS Support costs can be split across business units in proportion to each unit's EC2 compute allocation. However, this can disproportionately tax business units whose EC2 usage doesn't reflect their actual consumption of the untaggable service.
To demonstrate how this 'proportional' rule works, let's look at Company X, which comprises three business units: Marketing, Sales, and Engineering. Company X would like to measure its AWS Support costs, which are untaggable. Because the three units consume equal shares of EC2, the proportional rule assigns each of them 33.3% of the company's AWS Support cost. However, the Engineering department doesn't use AWS Support and is therefore unfairly charged for an unused service. To get around this problem, the company applies a static, custom cost allocation rule that charges each business unit according to its expected or actual usage. That way, the Marketing and Sales departments together account for the total cost of AWS Support, at 50% each, and Engineering accounts for 0%.
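The two rules above boil down to simple arithmetic. Here's a minimal sketch in Python; the figures and unit names are hypothetical, not Company X's real data:

```python
# Sketch of the 'proportional' and 'static' allocation rules described above.
# All usage figures and weights are illustrative assumptions.

def allocate_proportionally(untagged_cost, tagged_usage):
    """Split an untaggable cost in proportion to each unit's tagged usage."""
    total = sum(tagged_usage.values())
    return {unit: untagged_cost * usage / total
            for unit, usage in tagged_usage.items()}

def allocate_statically(untagged_cost, weights):
    """Split an untaggable cost by fixed, custom weights (weights sum to 1.0)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return {unit: untagged_cost * w for unit, w in weights.items()}

# Proportional rule: equal EC2 usage means each unit pays a third of Support,
# even though Engineering never uses AWS Support.
ec2_usage = {"Marketing": 1000.0, "Sales": 1000.0, "Engineering": 1000.0}
print(allocate_proportionally(300.0, ec2_usage))

# Static rule: custom weights zero out Engineering's share.
print(allocate_statically(300.0,
                          {"Marketing": 0.5, "Sales": 0.5, "Engineering": 0.0}))
```

The static rule fixes the fairness problem, but only as long as the hand-picked weights keep matching reality.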
Combining Proportional and Static Rules
Unfortunately, a static rule is not particularly useful for companies that have dynamic environments with fluctuating resource usage in their applications or cost centers. Instead, they need to set up a dynamic rule to ensure meaningful cost allocation in their environments. For example, one of Cloudyn’s customers had several applications running on EC2 and also used another untaggable cloud service where usage was in direct proportion to their EC2 consumption. They came up with a dynamic rule where the untaggable resource was allocated dynamically based on each application’s EC2 usage. If this dynamic rule weren’t implemented, their financial department would have needed to manually update the aforementioned static rule every month to reflect the reality of the untaggable service’s usage.
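A dynamic rule is just the proportional calculation re-run against each period's fresh usage data. A minimal sketch, with hypothetical monthly figures and application names:

```python
# Sketch of a dynamic allocation rule: each month, the untaggable service's
# cost is re-split in proportion to that month's per-application EC2 usage,
# so no one has to update a static rule by hand. Figures are hypothetical.

def dynamic_allocation(monthly_ec2_usage, monthly_untagged_cost):
    """Return each application's share of the untagged cost, per month."""
    result = {}
    for month, usage in monthly_ec2_usage.items():
        total = sum(usage.values())
        cost = monthly_untagged_cost[month]
        result[month] = {app: cost * u / total for app, u in usage.items()}
    return result

usage = {
    "Jan": {"app-a": 600.0, "app-b": 400.0},
    "Feb": {"app-a": 250.0, "app-b": 750.0},  # usage shifted; the rule adapts
}
costs = {"Jan": 100.0, "Feb": 120.0}
print(dynamic_allocation(usage, costs))
```

In February the split follows app-b automatically, where a static 60/40 rule would have silently drifted out of date.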
Combining Linked Accounts and Tagging
Amazon offers companies two ways to compartmentalize different business teams or applications. An AWS customer with a single account shared by multiple applications or business units can only allocate cost and usage by tagging resources. Many AWS customers, however, use linked accounts, which let them break out usage and billing per business unit or per application. We believe the optimal solution for calculating aggregate cost allocation is to use both linked accounts and tagging.
Nevertheless, AWS customers that use both linked accounts and tagging can still run into conflicts. If the same tag appears in more than one linked account, costs can't be allocated properly by tag alone. As a temporary fix, usage and cost should be allocated per linked account rather than by the duplicated tag. After applying this temporary rule, companies should be sure to remove any duplicate tags.
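The temporary fix amounts to keying cost aggregation by the (linked account, tag) pair instead of by tag alone. A sketch with hypothetical account IDs, tag values, and line items:

```python
# Sketch of the temporary fix described above: when the same tag value
# appears in more than one linked account, aggregate cost by the
# (linked account, tag) pair so the duplicates can't collide.
# All records below are hypothetical.
from collections import defaultdict

def allocate_by_account_and_tag(line_items):
    """line_items: iterable of (linked_account, tag, cost) tuples."""
    totals = defaultdict(float)
    for account, tag, cost in line_items:
        totals[(account, tag)] += cost
    return dict(totals)

items = [
    ("acct-111", "team:web", 40.0),
    ("acct-222", "team:web", 25.0),  # same tag, different linked account
    ("acct-111", "team:data", 10.0),
]
print(allocate_by_account_and_tag(items))
```

Grouping by tag alone would have merged the two "team:web" entries into a single meaningless 65.0; the compound key keeps each account's spend separate until the duplicate tags are cleaned up.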
The cloud’s dynamic environment can make cost allocation a complex task, especially when it comes to enterprises that need to maintain their business rules. A new approach to cloud monitoring needs to be considered in order to optimize the usage and cost of each specific business unit across an organization.
Today, enterprises can choose to allocate costs based on tags or linked accounts. It's important that they undertake comprehensive, dynamic tagging to maintain the most accurate cost allocations for cloud computing.
By Sharon Wagner
Sharon is the founder and CEO of Cloudyn, a leading provider of cloud analytics and optimization tools for multi-cloud deployments. He is a leading expert and key patent holder in SLA technologies and previously worked at CA Technologies within its cloud-connected enterprise business unit.