Create sitemap file
HyperKitty can generate public archives. When search engines crawl these archives, they can generate a lot of load. That load can be avoided by maintaining a sitemap file listing all web pages that HyperKitty generates. Search engines then periodically reload the sitemap, learn from it which pages were modified when, how often each page is expected to change, etc., and index only what has changed since the last crawl (i.e. the previous date), instead of crawling everything.
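For illustration, a minimal sitemap entry following the sitemaps.org protocol might look like this (the list URL is hypothetical; the element names are from the protocol specification):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://lists.example.org/archives/list/dev/2023/3/</loc>
    <lastmod>2023-03-04</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```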
I suggest creating two separate sitemap files per mailing list: one referencing all archive pages from before 1 January of the current year, and one referencing all mails from after 1 January of the current year. The latter file will be updated quite often, but it will be small, so modifying it will use fewer resources.
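The split could be sketched roughly as follows. This is only an illustration of the idea, not HyperKitty code; the URLs and the `build_sitemap`/`split_by_year` helpers are hypothetical:

```python
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(entries):
    """Render (url, lastmod, changefreq) tuples as sitemap XML."""
    urls = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        "  </url>"
        for url, lastmod, changefreq in entries
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>\n"
    )


def split_by_year(entries, year):
    """Split entries at 1 January of `year` into (older, recent) lists."""
    cutoff = date(year, 1, 1)
    older = [e for e in entries if e[1] < cutoff]
    recent = [e for e in entries if e[1] >= cutoff]
    return older, recent


# Hypothetical archive pages for one mailing list.
entries = [
    ("https://lists.example.org/archives/list/dev/2022/",
     date(2022, 12, 31), "yearly"),
    ("https://lists.example.org/archives/list/dev/thread/abc/",
     date(2023, 3, 4), "daily"),
]
older, recent = split_by_year(entries, 2023)
archive_xml = build_sitemap(older)  # large file, rarely changes
recent_xml = build_sitemap(recent)  # small file, updated often
```

The rarely-changing file covers the bulk of the archive, while the small current-year file absorbs the frequent updates cheaply.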
The sitemap format is described at https://www.sitemaps.org/protocol.html. If somebody is willing to implement this, I will be more than glad to answer any questions.