Use limit for search count queries
What does this MR do?
The search query is especially slow when a user searches for a generic string that matches many records; in such cases the search can take tens of seconds or time out. To speed up the search query, we now search only for the first 1000 records. If there are more than 1000 matching records, we display "1000+" instead of the precise total count, on the assumption that at that scale the exact count is not important to the user.
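The counting trick above can be sketched in plain Ruby. This is an illustrative stand-in for the real database query: fetching at most `limit + 1` records tells us whether more results exist without counting them all (names like `limited_count` and `COUNT_LIMIT` are hypothetical, not from the actual change).

```ruby
COUNT_LIMIT = 1000

# Return an exact count up to the limit, or "1000+" beyond it.
# `matches` stands in for the relation the search query would return.
def limited_count(matches, limit: COUNT_LIMIT)
  # Fetch one record past the limit; that extra record is the signal
  # that more matches exist than we are willing to count.
  fetched = matches.first(limit + 1)
  fetched.size > limit ? "#{limit}+" : fetched.size.to_s
end

limited_count((1..5).to_a)    # exact count for small result sets
limited_count((1..2000).to_a) # capped display for large result sets
```

In the real query this corresponds to applying `LIMIT 1001` instead of running an unbounded `COUNT(*)`.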
Because even the limited search was not fast enough for issues, a 2-phase approach is used there: first we run a simpler/faster query over public issues only; if the result exceeds the limit, we just return the limit. If the number of matching results is below the limit, we re-run the more complex search query (which also includes confidential issues). Re-running the complex query should be fast enough in that case, because the number of matching issues is below the limit. We could optimize this further by searching only confidential issues in the second query, but for simplicity we re-run the original query.
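A minimal sketch of the 2-phase idea, again over in-memory data rather than the real queries. The `Issue` struct, the `confidential` flag check, and title matching are illustrative assumptions standing in for the cheap public-issues query and the more complex full query.

```ruby
Issue = Struct.new(:title, :confidential)

ISSUE_COUNT_LIMIT = 1000

def issues_count(issues, query, limit: ISSUE_COUNT_LIMIT)
  # Phase 1: the cheap query — public issues only.
  public_matches = issues.reject(&:confidential)
                         .select { |i| i.title.include?(query) }
  # Already over the limit on public issues alone: no need for phase 2.
  return "#{limit}+" if public_matches.size > limit

  # Phase 2: the public count is under the limit, so re-running the
  # more complex query (public + confidential) is cheap enough.
  all_matches = issues.select { |i| i.title.include?(query) }
  all_matches.size > limit ? "#{limit}+" : all_matches.size.to_s
end
```

With a tiny limit you can see both phases at work: one public match keeps us under the limit, then the full query pushes the total past it.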
Because the exact total_count is now limited, this patch also switches to "prev/next" pagination.
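"Prev/next" pagination fits this change because it never needs a total count: fetching one record past the page size reveals whether a next page exists. A hedged sketch (the `page` helper and its return shape are illustrative, not the actual pagination code):

```ruby
# Paginate without ever computing a total count: fetch per_page + 1
# records and use the extra one as the "has next page" signal.
def page(records, page_number, per_page: 20)
  offset = (page_number - 1) * per_page
  window = records.drop(offset).first(per_page + 1)
  {
    items: window.first(per_page),
    prev?: page_number > 1,            # a previous page exists unless we are on page 1
    next?: window.size > per_page      # the extra record means more results follow
  }
end
```

This is why dropping the exact count does not hurt navigation: "prev" and "next" are decidable from the current page alone.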
Are there points in the code the reviewer needs to double check?
Check that the 2-phase issue search works as expected, and that pagination works as expected.
Why was this MR needed?
We need to further improve slow search queries.
Does this MR meet the acceptance criteria?
- Changelog entry added, if necessary
- Documentation created/updated
- API support added
- Tests added for this feature/bug
- Has been reviewed by Backend
- Has been reviewed by Database
- Conforms to the merge request performance guides
- Conforms to the style guides
- Squashed related commits together
- Internationalization required/considered