Make parallel limit user configurable and do not restrict to 50 max
Proposal
We need to run one-off jobs on our production cluster to perform specific computations. As part of our GitLab pipeline, we have built a Docker image which we run on each node where the computation will happen. When the computation has completed, the container is shut down. The computation might take several days to finish on each machine.
Instead of rolling our own framework to distribute and run the Docker images across the various machines, we plan on using GitLab's `parallel` feature, since it already has error handling, status updates, etc. integrated. We would then only need to install gitlab-runner on each machine, and GitLab would take care of the entire orchestration (see the sketch below).
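A minimal sketch of the kind of `.gitlab-ci.yml` job we have in mind, assuming a hypothetical image name and compute script; the job is fanned out with `parallel`, and each instance picks its share of the work via the predefined `CI_NODE_INDEX`/`CI_NODE_TOTAL` variables:

```yaml
compute:
  image: registry.example.com/our-group/compute-job:latest  # hypothetical image
  parallel: 500          # what we need; currently capped at 50
  timeout: 7 days        # runs may take several days (subject to project/runner max timeout)
  script:
    # hypothetical entry point: each of the N parallel jobs works on its own slice
    - ./run_computation.sh --index "$CI_NODE_INDEX" --total "$CI_NODE_TOTAL"
```

With one runner registered per host, GitLab would schedule one job per machine and report each job's status in the pipeline.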
A past change limited the number of parallel jobs to 50 (gitlab-foss!22907 (merged)).
An issue about increasing this limit was also created, but it was closed after a year without a stated reason (#228561 (closed)).
As we plan to run our process on more than 500 hosts, please make this limit user configurable (or at least configurable via the instance configuration file), as illustrated below.
For self-hosted instances, the resulting memory constraints shouldn't be an issue.
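Purely as an illustration of the request, a hypothetical instance-level setting could look like the following; the file location and key name are invented here and do not exist in GitLab today:

```yaml
# hypothetical instance configuration, for illustration only
ci:
  # maximum value accepted for the `parallel` keyword; would default to the current 50
  max_parallel_jobs: 500
```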
Relevant discussion: