GitLab Import/Export Not Working in HA Mode
I ran into an issue attempting to import an exported GitLab tarball (the Import/Export feature introduced in 8.9). I was just testing the feature for usability, exporting from and importing into the same CE instance.
The export worked correctly, but the import failed. The error in the log file was something similar to:
/var/log/gitlab/gitlab-rails/production.log:Import/Export error raised on /opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/import_export/command_line_util.rb:32:in `execute': tar (child): /tmp/RackMultipart20160801-20204-1kqba1x.gz-import: Cannot open: No such file or directory
/var/log/gitlab/gitlab-rails/production.log:Import/Export error raised on /opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/import_export/file_importer.rb:19:in `rescue in import': Unable to decompress /tmp/RackMultipart20160801-20204-1kqba1x.gz-import into /var/opt/gitlab/gitlab-rails/shared/tmp/project_exports/test/test_project_4
Our HA setup consists of two front-end GitLab instances behind an HAProxy server. The front ends mount directories from a server running PostgreSQL and an NFS server.
The problem is that the import uploads the tarball via Rack's multipart handling (sorry if my terminology is off; I'm not a Ruby developer). That code uses Ruby's Tempfile class and, from what I can tell, no tmpdir is passed to Tempfile.new, so it defaults to /tmp on whichever node received the upload. The next request, which untars the upload, may or may not be routed to the same server (in my setup it never is), so it always fails with 'Cannot open: No such file or directory'.
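The behaviour can be demonstrated with plain Ruby: Tempfile.new without a directory argument falls back to Dir.tmpdir, which honours the TMPDIR environment variable and otherwise defaults to /tmp. A minimal sketch (the shared-directory path is just for illustration):

```ruby
require "tempfile"
require "tmpdir"

# With no directory argument, the tempfile lands in Dir.tmpdir
# (usually /tmp) -- local to the node that handled the request,
# so a second front end will never see it.
local = Tempfile.new("RackMultipart")
puts local.path

# Passing an explicit directory (e.g. a tmp dir on the shared NFS
# mount) would keep the upload visible to every front end.
shared_dir = ENV.fetch("TMPDIR", Dir.tmpdir)
shared = Tempfile.new("RackMultipart", shared_dir)
puts shared.path.start_with?(shared_dir)
```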
I "solved" my issue by setting a TMPDIR environment variable at startup that points to a tmp directory on the mounted NFS share, but there is likely a more elegant solution (a config file setting?).
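For an Omnibus install, I believe the same effect can be achieved through /etc/gitlab/gitlab.rb rather than a startup wrapper, using the gitlab_rails['env'] setting; the NFS path below is an assumption for illustration:

```ruby
# /etc/gitlab/gitlab.rb -- sketch, assuming /mnt/nfs/gitlab-tmp is a
# directory on the NFS share that both front ends mount.
gitlab_rails['env'] = {
  'TMPDIR' => '/mnt/nfs/gitlab-tmp'
}
```

followed by a `gitlab-ctl reconfigure` on each front end.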