Limit size of search query to avoid stack level too deep error

Goal

Make GitLab search more stable and robust against unintentional or intentional abuse, and against scenarios where "noisy neighbors" (users, projects, groups) can disrupt search with a single action.

Background

The Create teams have been asked, via Epic &1737, to proactively find application limits within the Editor code. The Editor team identified limits in create-stage#40.

Solution

Application Limit: When Elasticsearch is disabled, searching with 1000+ terms (separated by a comma and a space) results in a "stack level too deep" error. A search with around 900 terms takes roughly 40ms, which may be acceptable.

What is the risk of not adopting this new limit? The severity is low, as the error occurs in constant time and does not hit the database.

Estimated Weight: 1

Configurable / Not Configurable: Not configurable

Are there existing limits? No

Include this in the implementation: When this limit is triggered, the search should fail and log errors, the failure should be monitorable, and it should fail as gracefully as possible for the user.
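A minimal Ruby sketch of the behavior described above: cap the number of search terms before the query is parsed, log when the cap is exceeded so the event is monitorable, and raise a dedicated error so the caller can fail gracefully. The constant, error class, and method names here are illustrative assumptions, not GitLab's actual implementation.

```ruby
# Hypothetical term-count limit for search queries (names are assumptions).
MAX_SEARCH_TERMS = 1000

class QueryTooLargeError < StandardError; end

# Splits the raw query into terms and rejects queries over the limit.
# Returns the term list when the query is within bounds.
def validate_search_terms!(query)
  terms = query.split(/[,\s]+/).reject(&:empty?)
  if terms.size > MAX_SEARCH_TERMS
    # Log first so the rejection is monitorable, then fail fast for the user.
    warn "search query rejected: #{terms.size} terms (limit #{MAX_SEARCH_TERMS})"
    raise QueryTooLargeError,
          "Search queries are limited to #{MAX_SEARCH_TERMS} terms"
  end
  terms
end
```

Checking the term count up front keeps the rejection in constant time relative to the rest of the search pipeline, matching the low-severity assessment above.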
Edited Nov 04, 2019 by Dylan Griffith (ex GitLab)