
Cut the spend on your AWS Infrastructure!

With the difficult COVID-19 situation we are in, almost every business is thinking about cutting costs, and CFOs are looking at every possible cost reduction option. The easy reflex is to cut headcount, but come on! The present financial crisis is not the time for that. Let's think sensibly and stand shoulder to shoulder to support each other.

So, where should business leaders and CFOs look first? Of course, at your cloud infrastructure spending!

A study by Gartner, Inc. forecasts that worldwide public cloud services will grow from a total of $227.8 billion in 2019 to a whopping $266.4 billion in 2020.

We all know AWS (Amazon Web Services) is the undisputed leader in the industry, with about 47% of the market share. Its customers range from small and medium businesses running a single EC2 instance to enterprises running multi-region workloads and spending several thousand dollars per year.

We all receive emails from the public cloud providers about the tools they offer to check usage and suggest cost reductions, but we may not have taken them too seriously. The IMF chief recently warned that this pandemic will unleash the worst recession since the Great Depression. So why don't we optimize our spending?

As an AWS Partner that has been supporting customers with AWS consulting and cost optimization for the last 9 years, Urolime now receives numerous requests from AWS customers to optimize their cloud spend. Here I would like to share a few best practices worth considering:

The 11 AWS cost optimization best practices

1. Right-size EC2 instances

One of the most common cost optimization best practices recommended by AWS is right-sizing: matching the instance size to the workload it actually runs. Most of the time, customers don't really know the best instance type to use, so they start with the biggest one, thinking they never want to run out of CPU or RAM.
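As a rough illustration (not part of any official AWS tool), the boto3 sketch below flags running instances whose average CloudWatch CPU utilization over the last two weeks stays very low; the 5% threshold and the two-week window are arbitrary assumptions you would tune for your own workloads.

```python
import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
start = end - timedelta(days=14)   # look-back window (assumption)
CPU_THRESHOLD = 5.0                # average CPU % considered "oversized" (assumption)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        datapoints = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=start,
            EndTime=end,
            Period=86400,          # one datapoint per day
            Statistics=["Average"],
        )["Datapoints"]
        if datapoints:
            avg_cpu = sum(p["Average"] for p in datapoints) / len(datapoints)
            if avg_cpu < CPU_THRESHOLD:
                print(f"{instance_id} ({instance['InstanceType']}): "
                      f"avg CPU {avg_cpu:.1f}% - candidate for downsizing")
```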

2. Purchasing Reserved Instances (RIs)

I believe every AWS customer has heard about RIs. This is one of the easiest ways to reduce your AWS cost; at the same time, if you do not plan and utilize them optimally, they can actually increase your AWS bill.
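If you want a data-driven starting point, Cost Explorer exposes RI purchase recommendations through its API. A minimal boto3 sketch; the one-year term, no-upfront payment option, and 30-day look-back are just example values:

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_reservation_purchase_recommendation(
    Service="Amazon Elastic Compute Cloud - Compute",
    LookbackPeriodInDays="THIRTY_DAYS",   # base the recommendation on the last 30 days
    TermInYears="ONE_YEAR",               # example term
    PaymentOption="NO_UPFRONT",           # example payment option
)

for recommendation in response.get("Recommendations", []):
    for detail in recommendation.get("RecommendationDetails", []):
        print(
            detail["InstanceDetails"]["EC2InstanceDetails"]["InstanceType"],
            "x", detail["RecommendedNumberOfInstancesToPurchase"],
            "- estimated monthly savings:",
            detail["EstimatedMonthlySavingsAmount"],
        )
```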

3. Scheduling on/off times

I still remember our team writing scripts with the AWS CLI to automate this as a one-click solution for the development team; now much of it is available right from the dashboard. Scheduling the start/stop of non-production environments such as development, staging, and QA is one of the best ways to reduce your AWS cost, and it can save you 60% to 70% of the cost of running those non-production instances.
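For reference, a minimal boto3 sketch of such a scheduler (run, for example, from a cron job or a scheduled Lambda). It assumes non-production instances carry an Environment tag with values like dev, staging, or qa, which is purely a naming convention for this example:

```python
import boto3

ec2 = boto3.client("ec2")

# Find running instances tagged as non-production (tag name and values are assumptions).
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Environment", "Values": ["dev", "staging", "qa"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]

if instance_ids:
    # Stop them outside working hours; a matching start_instances call
    # would run in the morning.
    ec2.stop_instances(InstanceIds=instance_ids)
    print("Stopped:", instance_ids)
```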

4. Upgrade instances to the latest generation

Amazon Web Services releases new generations of instances quite often, and they tend to offer improved performance and functionality compared to their predecessors. The options here are to either upgrade existing instances to the latest generation, or downsize instances with borderline utilization metrics, so you benefit from the same level of performance at a lower cost.
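Changing an EBS-backed instance to a newer generation is a stop, modify, start cycle. A rough boto3 sketch; the instance ID and the target type (t3.large) are placeholders, and you should first confirm the AMI supports the newer generation (for example, ENA networking for Nitro-based types):

```python
import boto3

ec2 = boto3.client("ec2")
instance_id = "i-0123456789abcdef0"   # placeholder instance ID

# The instance must be stopped before its type can be changed.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Move to a newer-generation type (example target).
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "t3.large"},
)

ec2.start_instances(InstanceIds=[instance_id])
```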

5. Delete unattached EBS volumes

If you have been an AWS customer for a long time, I am sure you have created and deleted EC2 instances many times. The common mistake here is forgetting to select the “delete on termination” box when the instance is launched. If the box wasn't checked, the EBS volume still exists after the instance is terminated, and you will be charged for it until you delete it.
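A quick way to find these is to list volumes in the "available" state, i.e. not attached to any instance. A minimal boto3 sketch; the delete call is commented out so you can review the list first:

```python
import boto3

ec2 = boto3.client("ec2")

# Volumes with status "available" are not attached to any instance.
volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]

for volume in volumes:
    print(f"Unattached: {volume['VolumeId']} ({volume['Size']} GiB)")
    # After reviewing, delete the ones you no longer need:
    # ec2.delete_volume(VolumeId=volume["VolumeId"])
```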

6. Delete orphaned snapshots, always have a retention policy

Most customers just want to make sure backups are happening, without a proper plan or retention policy. If you have one and maintain it, that's great! Otherwise the number of snapshots will grow, and so will your monthly bill. Usually you only need the most recent snapshots to restore data if something goes wrong, and although snapshots don't cost much individually, you could save thousands of dollars by deleting those that are no longer required.
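A simple retention script can enforce this. The sketch below keeps snapshots newer than 30 days (an arbitrary example retention) and only prints the older ones; uncomment the delete call once you are sure nothing listed is still needed, and remember that snapshots backing registered AMIs cannot be deleted until the AMI is deregistered.

```python
import boto3
from datetime import datetime, timedelta, timezone

ec2 = boto3.client("ec2")
RETENTION_DAYS = 30                                   # example retention policy
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

# Only look at snapshots owned by this account.
snapshots = ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]

for snapshot in snapshots:
    if snapshot["StartTime"] < cutoff:
        print(f"Older than {RETENTION_DAYS} days: {snapshot['SnapshotId']} "
              f"(created {snapshot['StartTime']:%Y-%m-%d})")
        # ec2.delete_snapshot(SnapshotId=snapshot["SnapshotId"])
```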

7. Terminate zombie assets

You need to keep an eye on unused assets and clean them up frequently. The common unused assets we find in new customer environments are idle ELBs and ALBs, unassociated Elastic IPs, orphaned snapshots, unattached EBS volumes, unused NAT gateways, etc.
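Elastic IPs are a classic example, since AWS charges for allocated addresses that are not associated with a running instance. A small boto3 sketch to list them, with the release call left commented out:

```python
import boto3

ec2 = boto3.client("ec2")

# Elastic IPs without an association are sitting idle (and billed).
for address in ec2.describe_addresses()["Addresses"]:
    if "AssociationId" not in address:
        print(f"Idle Elastic IP: {address['PublicIp']}")
        # ec2.release_address(AllocationId=address["AllocationId"])
```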

8. Move infrequently-accessed data to lower cost tiers

Based on how frequently the data in your S3 buckets is accessed, you need to decide which tier suits your use case. The savings from storing infrequently accessed, non-critical data in a lower cost tier can be substantial. Which storage tier is most suitable depends on factors such as how often the data is accessed and how quickly the business would need to retrieve it in the event of a disaster; a minimal lifecycle-rule sketch follows the list of tiers below.

The six tiers of storage:

  • S3 Standard
  • S3 Intelligent-Tiering
  • S3 Infrequent Access (IA)
  • S3 Infrequent Access (IA, Single Zone)
  • S3 Glacier
  • S3 Glacier Deep Archive
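Rather than moving objects by hand, you can attach a lifecycle configuration to the bucket. A minimal boto3 sketch; the bucket name, prefix, and the 30/90/365-day thresholds are placeholder values for illustration:

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",                      # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-logs",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},       # placeholder prefix
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},         # delete after a year
            }
        ]
    },
)
```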

9. Use Spot Instances

Amazon EC2 Spot Instances allow you to access spare Amazon EC2 computing capacity. Since Spot Instances are available with up to 90% discount when compared to On-Demand pricing, you can significantly reduce the cost of running your applications, grow your application’s compute capacity and throughput for the same budget, and enable new types of cloud computing applications.
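For stateless or fault-tolerant workloads, requesting Spot capacity can be as simple as adding InstanceMarketOptions to a normal run_instances call. The AMI ID, instance type, and interruption behaviour below are example values only:

```python
import boto3

ec2 = boto3.client("ec2")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",     # placeholder AMI ID
    InstanceType="m5.large",             # example instance type
    MinCount=1,
    MaxCount=1,
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {
            "SpotInstanceType": "one-time",
            "InstanceInterruptionBehavior": "terminate",
        },
    },
)
```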

10. Monitor, track, and analyze your service usage

You can use Amazon CloudWatch to collect and track metrics, monitor log files, set alarms, and automatically react to changes in your AWS resources. You can also use Amazon CloudWatch to gain system-wide visibility into resource utilization, application performance, and operational health.
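As one small example (not from any AWS template), you could create a CloudWatch alarm that notifies an SNS topic when an instance's average CPU stays very low for a full day, flagging it for right-sizing or shutdown; the instance ID, threshold, and SNS topic ARN are placeholders.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="low-cpu-i-0123456789abcdef0",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=3600,                       # hourly datapoints
    EvaluationPeriods=24,              # sustained for a full day
    Threshold=5.0,                     # % CPU (example threshold)
    ComparisonOperator="LessThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:cost-alerts"],      # placeholder SNS topic
)
```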

With Trusted Advisor, you can provision your resources following best practices to improve system performance and reliability, increase security, and look for opportunities to save money. You can also turn off non-production instances and use Amazon CloudWatch and Auto Scaling to match capacity to increases or reductions in demand.

11. Analyze your costs and usage using Cost Explorer

AWS Cost Explorer gives you the ability to analyze your costs and usage. Using a set of default reports, you can quickly get started with identifying your underlying cost drivers and usage trends. From there, you can slice and dice your data along numerous dimensions to dive deeper into your costs.

One powerful cost management feature available in Cost Explorer is the ability to filter and group your data by resource tags. Tagging lets you create resource groups that match your business structures, making it easier to map your resources and workloads to the appropriate cost center.
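To make this concrete, here is a minimal boto3 sketch that pulls a month's unblended cost grouped by a cost-allocation tag; the tag key CostCenter and the date range are example values, and the tag must already be activated as a cost allocation tag in the Billing console.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2020-03-01", "End": "2020-04-01"},   # example month
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "CostCenter"}],            # example tag key
)

for group in response["ResultsByTime"][0]["Groups"]:
    tag_value = group["Keys"][0]
    amount = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{tag_value}: ${float(amount):.2f}")
```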

It is also worth reviewing your existing infrastructure architecture to make sure it is highly available and scalable. With horizontal scaling, you can cut down the number of instances running at any given time and let AWS Auto Scaling add capacity only when extra traffic actually arrives.

At Urolime, we have consistently seen a significant cost reduction of 25% to 35% in most customer environments after cost optimization. And let me remind you, cost optimization is a continuous process: you always need to monitor and identify unused or underutilized resources, and there are many ways to automate this. Optimize your AWS spending today!

If you need some help, feel free to contact us any time! Stay healthy and stay safe!
