Overriding robots.txt via custom_gitlab_server_config returns 403
As proposed in #3955 (closed), you can override the default robots.txt with a customized one by adding this to /etc/gitlab/gitlab.rb:

nginx['custom_gitlab_server_config'] = "\nlocation =/robots.txt { alias /etc/gitlab/nginx/gitlab-robots.txt; }\n"
I don't know since when this is broken, but I just found out that this now returns HTTP/1.1 403 Forbidden instead of the file on my GitLab Omnibus instance (v13.0.5). It still worked with the version that was latest on 2019-12-11, which is when I set this up. The file /etc/gitlab/nginx/gitlab-robots.txt has the same permissions and owner as /opt/gitlab/embedded/service/gitlab-rails/public/robots.txt.
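One possible explanation (an assumption, not confirmed for this report): matching file permissions are not enough for an nginx alias, because the nginx worker user also needs execute/traverse permission on every parent directory of the target, and /etc/gitlab is commonly restricted to root (mode 0700). You can test this hypothesis with something like `sudo -u gitlab-www cat /etc/gitlab/nginx/gitlab-robots.txt` (the gitlab-www user name is the usual Omnibus nginx user, but verify it on your install). If traversal is the problem, a sketch of a workaround is to serve the file from a directory the nginx user can already reach; the path /var/opt/gitlab/nginx/gitlab-robots.txt below is hypothetical and must exist and be readable by the nginx user:

```ruby
# /etc/gitlab/gitlab.rb -- sketch, assuming /var/opt/gitlab/nginx/ is
# readable (and traversable) by the bundled nginx worker user.
# Copy your custom robots file there first, e.g.:
#   install -o root -g gitlab-www -m 0640 \
#     /etc/gitlab/nginx/gitlab-robots.txt /var/opt/gitlab/nginx/gitlab-robots.txt
nginx['custom_gitlab_server_config'] =
  "\nlocation = /robots.txt { alias /var/opt/gitlab/nginx/gitlab-robots.txt; }\n"
```

Run `gitlab-ctl reconfigure` afterwards so the nginx config is regenerated.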