Add max chunked size for uploading to Object Storage
<!-- This template is a great use for issues that are feature::additions or technical tasks for larger issues.-->

### Proposal

<!-- Use this section to explain the feature and how it will work. It can be helpful to add technical details, design proposals, and links to related epics or issues. -->

We should let the uploader accept a configurable maximum chunk size, so that uploads can stay within a maximum request body size and avoid upload failures.

<!-- Consider adding related issues and epics to this issue. You can also reference the Feature Proposal Template (https://gitlab.com/gitlab-org/gitlab/-/blob/master/.gitlab/issue_templates/Feature%20proposal.md) for additional details to consider adding to this issue. Additionally, as a data oriented organization, when your feature exits planning breakdown, consider adding the `What does success look like, and how can we measure that?` section. -->

### Rationale

Cloudflare currently limits the maximum POST body size to 100 MB for free-plan users. This means uploading to a custom S3 endpoint (such as self-hosted MinIO or RADOS) through Cloudflare's free plan is not possible unless the body is chunked into parts of at most 100 MB each. Otherwise Cloudflare returns a 413 Request Entity Too Large status code, and some pipelines cannot proceed because they fail to upload large artifacts. I therefore recommend letting us control the chunk size, just as we can already control `multipart_chunk_size_mb` in the Task Runner Backup configuration.

### Workaround

Disable Cloudflare 😂
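As a rough sketch of the idea (a hypothetical helper, not GitLab's actual uploader code), splitting a payload into parts no larger than a configured maximum could look like this:

```python
def split_into_chunks(data: bytes, max_chunk_size: int) -> list[bytes]:
    """Split a payload into parts of at most max_chunk_size bytes each."""
    if max_chunk_size <= 0:
        raise ValueError("max_chunk_size must be positive")
    return [data[i:i + max_chunk_size]
            for i in range(0, len(data), max_chunk_size)]

# Example: a 250 MB artifact split for a 100 MB proxy body limit
MAX_CHUNK = 100 * 1024 * 1024  # e.g. Cloudflare's free-plan POST body limit
payload = b"x" * (250 * 1024 * 1024)
parts = split_into_chunks(payload, MAX_CHUNK)
print(len(parts))  # 3 parts: 100 MB + 100 MB + 50 MB
```

Each part would then be sent as a separate multipart-upload request, keeping every individual request body under the proxy's limit.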