How to start saving money on your Public Cloud costs


Public Cloud offers instant accessibility, consumption-based Operational Expenditure, and a modern, agile IT platform that positions you well for the future.

However, simply replicating the on-premises world in the cloud causes a significant cost headache. A large number of product vendors follow their traditional deployment manuals, manuals written for non-Cloud platforms, resulting in over-provisioning and the re-introduction of IT sprawl. This is something I commonly see in initial customer assessments.

Admitting you have a problem

Most of my customers know there is a lot of optimisation to be had, even if they are unable to articulate it in data and are unsure how to identify the opportunities.

What can we do about it? Well, the tools to make the job pretty easy already exist, such as Trusted Advisor and Cost Explorer in AWS.

To begin with, we carry out some analysis: looking at deployed resources, ensuring we have some decent tagging in place, and segmenting the data by usage and the nature of each resource.
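The segmentation step can be sketched in a few lines. This is a minimal, illustrative example, assuming resource records exported from a cost report; the field names ("owner", "monthly_cost") are hypothetical, not a real AWS export schema.

```python
from collections import defaultdict

# Hypothetical resource records, as might appear in a cost report export.
resources = [
    {"id": "i-0a1", "owner": "team-web", "environment": "prod", "monthly_cost": 310.0},
    {"id": "i-0b2", "owner": "team-web", "environment": "dev", "monthly_cost": 95.0},
    {"id": "i-0c3", "owner": None, "environment": "dev", "monthly_cost": 140.0},
]

def segment_costs(resources, tag):
    """Sum monthly cost per value of the given tag; untagged resources
    are grouped under 'UNTAGGED' so gaps in tagging stay visible."""
    totals = defaultdict(float)
    for r in resources:
        totals[r.get(tag) or "UNTAGGED"] += r["monthly_cost"]
    return dict(totals)

print(segment_costs(resources, "owner"))
# {'team-web': 405.0, 'UNTAGGED': 140.0}
```

Surfacing an explicit "UNTAGGED" bucket is deliberate: untagged spend is usually where the unowned, retirable resources hide.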

Next, we can identify the actions to take based on the data:

· Retire resources that are no longer required or have no (traceable) owners

· Right-size — remove unnecessary waste from resources with too much capacity; ensure the resource type matches the workload requirements

· Reserve — Reserved Instances can bring large savings. Once resources are right-sized and a longer-term need is established, reserve them

· Spot Instances — where possible, look to use Spot Instances instead of On-Demand. ‘Spot’ suits specific workload types but offers huge savings

· Savings plans — commit to a baseline level of compute spend in exchange for significantly reduced rates

· Automate provisioning cycles and autoscaling. Plan and understand how automation can alert and/or scale resources for intra-day and periodic cycles of demand
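The decision flow above can be expressed as a simple triage rule. This is a sketch only: the CPU thresholds and observation window are illustrative assumptions, not official AWS guidance, and a real assessment would weigh memory, I/O and cost data too.

```python
def recommend_action(peak_cpu_pct, months_observed):
    """Map utilisation metrics to one of the actions above.
    Thresholds are illustrative assumptions, not official guidance."""
    if peak_cpu_pct < 2:
        return "retire"        # effectively idle, no measurable workload
    if peak_cpu_pct < 40:
        return "right-size"    # far more capacity than the workload needs
    if months_observed >= 6:
        return "reserve"       # steady long-term need: commit for a discount
    return "keep on-demand"    # too early to commit

print(recommend_action(peak_cpu_pct=1.2, months_observed=12))   # retire
print(recommend_action(peak_cpu_pct=25.0, months_observed=12))  # right-size
print(recommend_action(peak_cpu_pct=85.0, months_observed=12))  # reserve
```

Running a rule like this across a utilisation export gives a first-pass action list that a human can then validate against workload owners.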

It’s a results-based business

At one customer, 90% of the 170 EC2 instances were under-utilised. 50% of the total were less than 5% utilised at their historic, measurable peak. 20% of the total were effectively idle, unused for months.

Of the 350TB of storage, only 19GB had been allocated to HDD; the rest was all SSD. All 350TB were directly attached EBS volumes, with only 200GB found in S3. 150TB of SSD storage held snapshots dating back a year, despite policy requiring retention of no more than a calendar month. Another 50TB had no owner and was subsequently found to be a copy of an obsolete on-premises back-up.
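A retention breach like the one above is easy to catch programmatically. The sketch below assumes snapshot records as simple dicts; in practice they would come from an API call such as EC2's DescribeSnapshots, and the 31-day window mirrors the calendar-month policy mentioned above.

```python
from datetime import datetime, timedelta, timezone

def expired_snapshots(snapshots, max_age_days=31, now=None):
    """Return the IDs of snapshots older than the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [s["id"] for s in snapshots if s["start_time"] < cutoff]

# Illustrative records with a fixed 'now' so the result is deterministic.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
snaps = [
    {"id": "snap-old", "start_time": datetime(2023, 6, 1, tzinfo=timezone.utc)},
    {"id": "snap-new", "start_time": datetime(2024, 5, 20, tzinfo=timezone.utc)},
]
print(expired_snapshots(snaps, now=now))
# ['snap-old']
```

Scheduling a check like this, rather than running it once, is what keeps a year's worth of forgotten snapshots from accumulating again.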

Applying a simple process, alongside knowledge and experience, has identified huge savings for a number of customers. After a week of analysis, easy-to-implement quick wins totalled more than £1 million in annual savings for one High Street customer.

In a similar exercise with another customer, it took just a few days to identify £500k of potential annual savings through right-sizing, retiring and reserving.

So potential savings are easy to identify, but realising them does need commitment and a little experience. Public cloud cost optimisation should be treated as an ongoing activity, not a one-off exercise, so that you really start to save money.
