FY20-Q1 VPE OKR: Make GitLab.com profitable => 100%
Key Result: Recognize TBD% cost reduction on a per-user basis. => 70%
- We started a working group for cost savings and another for kickstarting growth
- Several other cost savings items are in-flight: https://gitlab.com/groups/gitlab-com/-/boards/993893?label_name=Cloud%20Spend
- Highlights include the following savings:
| Savings item | Date | Category | Savings |
|---|---|---|---|
| Removed a redundant 800 TB registry bucket on S3 | 2019-03-26 | AWS Storage | $17,700 |
| Reduced instance sizes back to `n1-standard-2` | 2019-03-27 | GCP Compute | $51,000 |
| Locked in a 3-year CUD for `gitlab-ci` | 2019-03-29 | GCP Compute | $142,365 |
| Downsized DR nodes | 2019-04-15 | GCP Compute | $16,000 |
| Locked in a 3-year CUD for `gitlab-production` | 2019-04-17 | GCP Compute | $60,853 |
| Disabled VPC flow logs | 2019-04-29 | GCP Compute + Storage | $25,000 |
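As a quick sanity check, the highlighted line items can be totaled (a minimal sketch; the dollar amounts are copied from the table, and the shortened labels in the `savings` mapping are illustrative, not an official data source):

```python
# Sum the highlighted cost-savings line items from the table above.
# Amounts are the dollar figures listed per item; keys are shortened labels.
savings = {
    "S3 registry bucket removal": 17_700,
    "Instance downsizing to n1-standard-2": 51_000,
    "gitlab-ci 3-year CUD": 142_365,
    "DR node downsizing": 16_000,
    "gitlab-production 3-year CUD": 60_853,
    "VPC flow logs disabled": 25_000,
}

total = sum(savings.values())
print(f"Total highlighted savings: ${total:,}")  # Total highlighted savings: $312,918
```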
- The working group we formed proved effective
- We brought our FY20 spend from the $11.6M baseline down to $9.8M, a 15.5% reduction
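The headline reduction can be checked with a few lines (a sketch; the $11.6M and $9.8M figures come from the text above):

```python
# Verify the FY20 spend reduction from the $11.6M baseline to $9.8M.
baseline = 11.6e6
actual = 9.8e6

reduction_pct = (baseline - actual) / baseline * 100
print(f"Reduction: {reduction_pct:.1f}%")  # Reduction: 15.5%
```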
- Andrew: There is broader awareness of the cost implications that our engineering and product decisions have on cloud spend.
- We didn't do a good enough job setting up our measurements and accounting methods at the start of the process. As a result, we had a hard time answering the question "what percentage did we save?", which led to confusion and differing opinions.
- Andrew: Without proper data-warehousing capabilities, our ability to analyze usage and cost patterns remains limited. Even relatively simple questions are difficult to answer, especially since we are querying an OLTP database for historical usage trends.
- Andrew: As a future improvement to our maturity in this area, it would be good to establish a framework for planning ahead on new features, so that we know how product changes will impact our cloud spend going forward. Examples of such planning could include:
  - How much should we expect enabling Elasticsearch for all Git repositories on GitLab.com to cost?
  - How do changes in our free CI usage policy affect our cloud spend?
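A framework like the one Andrew describes could start as simple back-of-the-envelope modeling. The sketch below is purely illustrative: the index size, the per-GB rate, and the `estimate_monthly_storage_cost` helper are hypothetical placeholders, not actual GitLab.com numbers or current GCP pricing.

```python
# Back-of-the-envelope estimate of the monthly storage cost of indexing
# all Git repositories in Elasticsearch. All inputs are hypothetical.
def estimate_monthly_storage_cost(index_gb: float, usd_per_gb_month: float) -> float:
    """Return the estimated monthly storage cost in USD."""
    return index_gb * usd_per_gb_month

# Hypothetical inputs: a 500 TB index at $0.04 per GB-month.
cost = estimate_monthly_storage_cost(index_gb=500_000, usd_per_gb_month=0.04)
print(f"Estimated monthly cost: ${cost:,.0f}")  # Estimated monthly cost: $20,000
```

Even a crude model like this, reviewed before a feature ships, would let us attach an expected spend delta to each product decision.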