commit 595a93ee by Ben Bodenmiller

    disallow irrelevant pages by default in robots

    Update default robots.txt rules to disallow irrelevant pages that search
    engines should not care about. This still allows important pages such as
    files, commit details, merge requests, issues, and comments to be crawled.
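The rule style the commit describes can be sketched as a robots.txt fragment. The paths below are hypothetical illustrations of "irrelevant" pages, not the actual rules shipped in this commit:

```
# Hypothetical sketch: block form and search-endpoint noise while
# leaving repository content (files, commits, issues, etc.) crawlable.
User-Agent: *
Disallow: /search          # search result pages
Disallow: /autocomplete/   # AJAX helper endpoints
Disallow: /*/new           # creation forms
Disallow: /*/edit          # edit forms
```

Anything not matched by a Disallow rule remains crawlable by default, so no explicit Allow lines are needed for the important pages.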
Directory contents:

    uploads
    404.html
    422.html
    500.html
    502.html
    apple-touch-icon-precomposed.png
    apple-touch-icon.png
    deploy.html
    favicon.ico
    logo.svg
    robots.txt
    static.css