Switch test path for large directory for both GPT and SiteSpeed.
Testing further with SiteSpeed today found it was directly impacted by gitlab-org/gitlab#211366 (closed). The path we're using to test the file lists in both API and Web is massive, containing 1795 files, which triggers O(n) performance degradation. The page sequentially loads the files 25 at a time in order, so it takes about 2.1 minutes on that directory and breaks SiteSpeed, which has a 30-second timeout.
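The arithmetic behind the failure can be sketched as follows. The file count, batch size, total load time, and timeout come from the numbers above; the per-batch latency is derived from them for illustration, not measured directly:

```python
import math

FILES = 1795               # files in the test directory
BATCH_SIZE = 25            # files loaded per sequential request
TOTAL_SECONDS = 2.1 * 60   # observed total load time (~2.1 minutes)
TIMEOUT_SECONDS = 30       # SiteSpeed's timeout

# Sequential paging means ceil(1795 / 25) = 72 requests, one after another.
batches = math.ceil(FILES / BATCH_SIZE)
per_batch = TOTAL_SECONDS / batches

print(f"{batches} sequential requests, ~{per_batch:.2f} s each on average")
print(f"exceeds SiteSpeed timeout: {TOTAL_SECONDS > TIMEOUT_SECONDS}")
```

Even with a modest per-request cost, 72 sequential round trips puts the page well past a 30-second budget, which is why only shrinking the directory (or fixing the O(n) behaviour upstream) unblocks the pipeline.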
In the issue above the dev teams are aware of the problem, and there's now an Epic to track it.
As it stands today, though, the directory we're testing with is breaking our pipeline, and it's arguably not representative: it's an extreme situation, too much so for our unwritten target of "large but realistic". We should therefore switch both the GPT and SiteSpeed tests to a directory that contains fewer files but is still large enough to exercise performance.