Feature Request: Dynamically Generated robots.txt With Option to Exclude Forks

As part of our GitLab installation, we index public repositories with a search appliance.

One of the issues is that a search hit may appear in N locations: the canonical repository, but also in each of its M forks. Fortunately, our search indexer respects GitLab's robots.txt file, so I'm requesting an option to dynamically generate that file, with the ability to exclude forked repositories.
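As a stopgap until something like this exists in GitLab itself, here is a minimal sketch of generating such a robots.txt externally via the GitLab REST Projects API. It assumes a `GITLAB_URL` and a `GITLAB_TOKEN` environment variable (names chosen here for illustration), and relies on the `forked_from_project` field that the Projects API returns for forked projects; adjust to taste.

```python
#!/usr/bin/env python3
"""Sketch: build a robots.txt that disallows crawling of forked projects.

Assumes a GitLab instance at GITLAB_URL and a token with read_api scope.
Forks are detected via the `forked_from_project` field on the Projects API.
"""
import os
import requests

GITLAB_URL = os.environ.get("GITLAB_URL", "https://gitlab.example.com")
TOKEN = os.environ["GITLAB_TOKEN"]  # personal access token (assumed name)


def iter_projects():
    """Yield all public projects visible to the token, page by page."""
    page = 1
    while page:
        resp = requests.get(
            f"{GITLAB_URL}/api/v4/projects",
            headers={"PRIVATE-TOKEN": TOKEN},
            params={"per_page": 100, "page": page, "visibility": "public"},
            timeout=30,
        )
        resp.raise_for_status()
        yield from resp.json()
        # GitLab reports the next page (or an empty value) in this header.
        next_page = resp.headers.get("X-Next-Page", "")
        page = int(next_page) if next_page else 0


def build_robots_txt():
    lines = ["User-agent: *"]
    for project in iter_projects():
        # Forked projects carry a `forked_from_project` entry.
        if project.get("forked_from_project"):
            lines.append(f"Disallow: /{project['path_with_namespace']}")
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    print(build_robots_txt())
```

The output could be written to a file served at /robots.txt (e.g. from a cron job), but having GitLab generate it natively would keep the exclusion list in sync with forks as they are created and deleted.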
