Performance issues when loading a large number of wiki pages

Overview

When loading a large number of wiki pages/commits, Unicorn can hit its memory limits or its request timeout and kill the request.
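Both limits come from the Unicorn configuration. As a rough illustration (the values below are examples, not necessarily what this test instance runs; in a source install they live in config/unicorn.rb):

# Illustrative Unicorn settings, not the exact values of this instance.
# Any request that runs longer than `timeout` seconds is killed by the
# master process, which is what happens once the wiki has enough pages.
worker_processes 2
timeout 60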

Reproduce

  1. Create a large number of pages and commits for a GitLab wiki:
git clone git@gitlab-testing.local:mygroup/new-group/test.wiki.git
cd test.wiki
  • Auto-create commits/pages with the script below:
#!/bin/bash
for i in {1..800}
do
    # Random 32-character page name
    FILENAME=$(cat /dev/urandom | env LC_CTYPE=C tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
    # Write the page first, then commit it
    echo "Welcome $i times" > "$FILENAME.md"
    git add .
    git commit -a -m "Welcome $i times"
done
  2. Attempt to load all pages via the UI at /wikis/pages
  3. Load via the Rails console:
wiki = Project.find_by_full_path('mygroup/new-group/test').wiki
wiki.wiki.pages.map { |page| WikiPage.new(wiki, page, true) }
  4. Watch memory get eaten (see the sketch below for one way to measure it)
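One rough way to quantify step 4 is to sample the process RSS around the load in the same Rails console session (a sketch; rss_mb is a helper defined only here, and the project path is the one from the reproduction above):

# Helper (defined only for this sketch): current RSS of the console process in MB
def rss_mb
  `ps -o rss= -p #{Process.pid}`.to_i / 1024
end

wiki = Project.find_by_full_path('mygroup/new-group/test').wiki

before = rss_mb
pages  = wiki.wiki.pages.map { |page| WikiPage.new(wiki, page, true) }
after  = rss_mb

puts "#{pages.size} pages loaded, RSS grew from #{before} MB to #{after} MB"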

Ruby-prof

  • Input
irb(main):001:0> app.get("/mygroup/new-group/test/wikis/0WbWItAUO5n3AjWKSxEO2UdCOtzOVp45?private_token=#{User.find_by_username('root').authentication_token}", nil, {'X-Profile-Token' => "#{Gitlab::RequestProfiler.profile_token}"})
Started GET "/mygroup/new-group/test/wikis/0WbWItAUO5n3AjWKSxEO2UdCOtzOVp45?private_token=[FILTERED]" 
Processing by Projects::WikisController#show as HTML
  Parameters: {"private_token"=>"[FILTERED]", "namespace_id"=>"mygroup/new-group", "project_id"=>"test", "id"=>"0WbWItAUO5n3AjWKSxEO2UdCOtzOVp45"}
Completed 200 OK in 4089ms (Views: 1990.3ms | ActiveRecord: 40.3ms | Elasticsearch: 0.0ms)
=> 200
  • Output

[ruby-prof output screenshot]

Full output: https://drive.google.com/file/d/0B_4wYK1qcPT1ZmRBWl96NDl6LUk
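The profile above was captured through the X-Profile-Token request profiler. The same code path can also be profiled directly from the Rails console with ruby-prof (which GitLab bundles for that profiler); a sketch, reusing the snippet from the reproduction steps:

require 'ruby-prof'

wiki = Project.find_by_full_path('mygroup/new-group/test').wiki

# Profile building WikiPage objects for every page, the same work the
# pages listing does, without going through Unicorn or the controller.
result = RubyProf.profile do
  wiki.wiki.pages.map { |page| WikiPage.new(wiki, page, true) }
end

RubyProf::FlatPrinter.new(result).print(STDOUT, min_percent: 1)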

Links