New parameter to override `--page`/`--per-page` to fetch more than the maximum `per-page` results
## Problem to solve
Right now one can retrieve at most `--per-page` items per execution. This includes the delete commands.

For `list` commands, fetching many items at once is useful for exports or pipes.

If one wants to delete more than `--per-page` items, one has to call `glab` multiple times and must keep track of already-deleted and still-to-delete items themselves.
## Proposal
A new parameter `--limit`, which overrides `--page` and `--per-page`. Using `--limit` together with one of the pagination parameters should lead to an error. `--limit` allows exceeding the maximum of 100 items, and `--limit <= 0` returns all items (use at your own risk). `--limit` does not support pagination; it always returns the first results. Later improvements may introduce a `--skip`-like parameter to skip the first items. A pagination-like mechanism wouldn't make sense, because it would mimic `--page`/`--per-page` completely (see "Alternative" below).
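The mutual-exclusion rule above could be sketched as a small validation helper. This is a hypothetical sketch, not glab's actual code; the function name and boolean parameters are assumptions made for illustration.

```go
package main

import (
	"errors"
	"fmt"
)

// validatePagination enforces the proposed rule: --limit is mutually
// exclusive with --page and --per-page. The flag names are part of this
// proposal, not of any current glab release.
func validatePagination(limitSet, pageSet, perPageSet bool) error {
	if limitSet && (pageSet || perPageSet) {
		return errors.New("--limit cannot be combined with --page or --per-page")
	}
	return nil
}

func main() {
	fmt.Println(validatePagination(true, true, false))  // prints the error
	fmt.Println(validatePagination(true, false, false)) // prints <nil>
}
```

Since glab is built on Cobra, the real implementation could likely lean on Cobra's built-in mutual-exclusion support instead of a hand-rolled check.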
Alternative: reuse `--page`/`--per-page` for this purpose. Allow `--per-page` to exceed 100 and even to be set to 0. In the latter case `--page` would be useless (you cannot select page 2 after page 1 has already returned everything).
## Further details
- `--limit` (and maybe `--skip`) would separate the CLI interface from the actual GitLab API implementation details. It is unclear whether that is actually a benefit, though.
- Right now it is, for example, not possible to remove all failed pipelines at once; `glab ci delete --status=failed --limit=0` would solve this.
- For values `<= 100` the implementation could use the default pagination, but for values `> 100` it would need a separate loop.
- It is backward compatible.
- This covers all `list` and `delete` commands (where one can delete lists). Right now deleting a list of items is only supported for pipelines; this proposal does not aim to change that.
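The "separate loop" for values above 100 could look roughly like the sketch below. Everything here is an assumption for illustration: `fetchPage` stands in for one GitLab API call (`?page=N&per_page=M`, capped at 100 items per page) and serves a fake dataset of 250 integers so the loop can be exercised.

```go
package main

import "fmt"

const maxPerPage = 100 // GitLab's hard cap on per_page

// fetchPage is a stand-in for a single GitLab API request. It pages
// through a fake dataset of 250 items (0..249).
func fetchPage(page, perPage int) []int {
	const total = 250
	start := (page - 1) * perPage
	if start >= total {
		return nil
	}
	end := start + perPage
	if end > total {
		end = total
	}
	items := make([]int, 0, end-start)
	for i := start; i < end; i++ {
		items = append(items, i)
	}
	return items
}

// fetchAll implements the proposed --limit semantics: limit <= 0 means
// "all items"; otherwise keep requesting full pages and truncate the
// result to the limit.
func fetchAll(limit int) []int {
	perPage := maxPerPage
	if limit > 0 && limit < maxPerPage {
		perPage = limit // a single short request suffices
	}
	var out []int
	for page := 1; ; page++ {
		batch := fetchPage(page, perPage)
		out = append(out, batch...)
		if len(batch) < perPage {
			break // short page: no more items on the server
		}
		if limit > 0 && len(out) >= limit {
			break // collected enough
		}
	}
	if limit > 0 && len(out) > limit {
		out = out[:limit]
	}
	return out
}

func main() {
	fmt.Println(len(fetchAll(0)))   // whole fake dataset
	fmt.Println(len(fetchAll(150))) // capped by the limit
}
```

For `limit <= 100` this degenerates to a single request, matching the bullet above; only larger limits (or `limit <= 0`) need the multi-page loop.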