Automatically compress with GZip/Zopfli & Brotli
Why at the API?
Other options to handle content-encoding compression:
- User build step
  - Con: Forces every user to figure this out independently. Total human time cost is proportional to the user base, which would slow its growth.
  - Pro: Precise control? Not sure whether that's a pro or a con.
- API client (e.g. http2live CLI tool)
  - Con: Additional processing time on the client, additional bandwidth to deploy files, complication in the client software, and possible future compatibility issues between different clients.
  - Pro: Processing is handled by the end user; less CPU load on the API server.
- API server
  - Con: Extra storage needed for every hosted file in GZ + BR.
  - Pro: No complications for the client. Just deploy one file and it gets served to visitors as BR or GZ (if it has a compressible content-type like HTML/CSS/JS/JSON/TXT/etc).
- Edge server on-the-fly
  - Con: Brotli/Zopfli (at max strength) are very CPU-intensive, so the compression level would need to be reduced. Could also get tricky to handle on the edge, given the cluster mode.
  - Pro: Lower storage costs by not storing compressed variants for unused file/encoding permutations.
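The "API server" option above boils down to precompressing each deployed file once and storing the variants. A minimal sketch, using only stdlib gzip as a stand-in (Brotli and Zopfli would come from third-party libraries; the `precompress` name is illustrative, not actual API code):

```python
import gzip

def precompress(data: bytes) -> bytes:
    # Compress once at deploy time at max level; mtime=0 keeps the
    # output deterministic so repeated deploys produce identical bytes.
    return gzip.compress(data, compresslevel=9, mtime=0)

original = b"<html>" + b"hello world " * 100 + b"</html>"
compressed = precompress(original)
assert gzip.decompress(compressed) == original
print(len(original), "->", len(compressed))
```

The CPU cost is paid once per deploy rather than per request, which is what makes max-strength Zopfli/Brotli affordable in this option but not in the on-the-fly one.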
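Whichever option stores the variants, the serving side still has to pick an encoding per request from the `Accept-Encoding` header and the content-type whitelist mentioned above. A hedged sketch of that negotiation (the function name and whitelist are assumptions for illustration):

```python
# Content-types worth compressing, per the HTML/CSS/JS/JSON/TXT list above.
COMPRESSIBLE = {
    "text/html", "text/css", "application/javascript",
    "application/json", "text/plain",
}

def choose_encoding(accept_encoding: str, content_type: str) -> str:
    """Pick br, gzip, or identity for a response."""
    if content_type not in COMPRESSIBLE:
        return "identity"
    # Parse the comma-separated header, dropping any ;q= weights.
    offered = {token.split(";")[0].strip()
               for token in accept_encoding.split(",")}
    # Prefer Brotli (typically smaller), fall back to gzip.
    for encoding in ("br", "gzip"):
        if encoding in offered:
            return encoding
    return "identity"

print(choose_encoding("gzip, deflate, br", "text/html"))  # -> br
print(choose_encoding("gzip", "application/json"))        # -> gzip
print(choose_encoding("gzip, br", "image/png"))           # -> identity
```

A fuller implementation would honor `q` weights and `identity;q=0`, but this captures the core decision the API or edge server makes on every request.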
Edited by Sebastiaan Deckers