# GitLab CI Cache Service
## Problem to solve
In the current caching architecture, the runner uploads the cache as one Binary Large Object (BLOB) to a pre-signed upload URL.
While this works well for caches smaller than 100 MB, cache objects that are hundreds of megabytes in size can cause problems for users. Those problems include slower performance, because larger files take longer to upload and download, and, in the case of AWS, cache upload failures when the file exceeds AWS's configured size limits. Public cloud providers such as AWS and Google Cloud address the problem of large file uploads with multipart uploads.
A multipart upload is a process in which an object is uploaded in multiple parts that are then reassembled into the final object. Multipart uploads reduce the time to upload a large object, resulting in a more efficient and resilient process: parts can be uploaded in parallel, and a failed part can be retried without re-uploading the whole object.
After numerous async discussions and an R&D [spike](https://gitlab.com/gitlab-org/gitlab-runner/-/issues/29356) investigating how to add multipart upload support to GitLab Runner, the leading [proposal](https://gitlab.com/gitlab-org/gitlab-runner/-/issues/29356#note_1466106960) is to develop an external caching service for GitLab CI.