Try to discard data we don't need as soon as possible while importing

We sometimes see memory issues when importing large projects, and here's an example: https://gitlab.com/gitlab-org/gitlab-ce/issues/33135#note_42497226

A quick and simple improvement might be to change this pattern:

```ruby
response.body.each do |raw|
  # ...
end
```

Into this:

```ruby
body = Github::Client.new(options).get(url).body

while raw = body.shift
  # ...
end
```
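As a self-contained illustration of the difference (assuming `body` is a plain Array): `each` keeps every element reachable through the array for the entire loop, while `shift` removes each element from the array before we handle it, so already-processed elements become eligible for collection.

```ruby
# Hypothetical stand-in for a parsed response body: an array of rows.
body = Array.new(5) { |i| "row-#{i}" }

processed = []

# Destructive iteration: each shift removes the element from `body`,
# so the array no longer holds a reference to processed rows.
while raw = body.shift
  processed << raw
end

puts body.size       # 0 - the array is drained as we go
puts processed.size  # 5 - rows are now only reachable via `processed`
```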

This way, as we walk through the body, the GC can collect any elements we no longer need. Note that we also need to discard the response object itself as early as possible. We have code like:

```ruby
url = response.rels[:next]
```

Lines like this need to run before we iterate through the body, so that the reference to the response can be dropped and the response itself released.
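Putting the pieces together, here is one possible shape for the reordered flow. The response object below is a stub standing in for whatever `Github::Client.new(options).get(url)` returns in the real code; `rels[:next]` and `body` are the accessors from the snippets above.

```ruby
require 'ostruct'

# Stub standing in for the real HTTP response object.
response = OpenStruct.new(
  rels: { next: 'https://api.github.com/page2' },
  body: [{ id: 1 }, { id: 2 }]
)

# Read everything we need from the response up front...
url  = response.rels[:next]
body = response.body

# ...then drop the reference, so the response (headers, raw payload,
# etc.) can be garbage-collected while we walk the body.
response = nil

while raw = body.shift
  # process raw ...
end
```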
