Remove robots.txt from urls.py

urls.py has an entry to serve a robots.txt under HyperKitty's namespace. This currently doesn't work because the actual file is located in the static directory, not in templates.
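For illustration, here is a minimal sketch of the kind of entry being described, assuming it uses Django's TemplateView (the exact form in HyperKitty's urls.py may differ). Because the template loaders only search the templates directories, a robots.txt that lives under static/ is never found and the view fails with TemplateDoesNotExist:

```python
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # Fails if robots.txt only exists under static/, since the template
    # loaders do not look there.
    path(
        "robots.txt",
        TemplateView.as_view(template_name="robots.txt", content_type="text/plain"),
    ),
]
```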

Either way, I don't think HyperKitty should serve it by default. If the archives are public, chances are people want them to be indexed by search engines. If they don't want that, they should define a robots.txt site-wide. Since it's a static file, that is easier to do with the native webserver, as sketched below.
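As an example of the site-wide approach, a webserver such as nginx can serve the file directly, bypassing Django entirely (the file path here is purely hypothetical):

```nginx
# Serve a site-wide robots.txt straight from disk.
location = /robots.txt {
    alias /srv/www/robots.txt;
}
```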