Geo: Buffer large proxied payloads into a file and stream in chunks

Right now the push-to-secondary code buffers the entire payload of the info/refs and git-receive-pack requests into RAM, Base64-encodes it, and sends it to the primary. For large pushes this can lead to high RAM usage.

We should consider:

  1. Creating a temporary file
  2. Using HTTP chunked-encoding to send the data in batches

This should probably be done in conjunction with moving the proxy functionality to Workhorse (https://gitlab.com/gitlab-org/gitlab-ee/issues/7405), but there's no reason it can't also be done in Unicorn.

Edited Jul 13, 2019 by Stan Hu