Performance monitoring
Goals:
- avoid performance regressions
- provide feedback on proposed optimizations
- provide a single command which tests several versions of BuildStream and outputs a set of results for analysis
- allow running only a subset of tests
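The "single command" goal can be sketched as a small driver that runs each selected test against each BuildStream version and collects timings. This is a hypothetical sketch only: `run_benchmarks`, `Result`, and the `runner` callback are illustrative names, not an existing API, and the real runner would invoke BuildStream inside the matching Docker image.

```python
# Hypothetical sketch of the single-command benchmark driver.
# All names here are illustrative; the real harness lives in the
# benchmarks repo linked below.
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class Result:
    version: str
    test: str
    elapsed_seconds: float


def run_benchmarks(versions, tests, runner):
    """Run each selected test against each BuildStream version.

    `runner(version, test)` is a placeholder for the real work of
    invoking BuildStream (e.g. inside a Docker image for `version`).
    """
    results = []
    for version in versions:
        for test in tests:
            start = time.perf_counter()
            runner(version, test)
            elapsed = time.perf_counter() - start
            results.append(Result(version, test, elapsed))
    return results


# Example with a no-op runner standing in for a real BuildStream invocation:
results = run_benchmarks(["1.2.0", "master"], ["build-100-files"],
                         lambda version, test: None)
print(json.dumps([asdict(r) for r in results], indent=2))
```

Keeping the driver version-agnostic (versions are just labels passed to the runner) is what lets one command compare arbitrary BuildStream builds.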
Components:
- Project generation script/library
- BuildStream log scraper: a script in the contrib/ dir which parses BuildStream stdout and outputs a well-known JSON format
- Test harness which wraps project generation, obtains multiple versions of BuildStream (using Docker images), and drives log generation: https://gitlab.com/BuildStream/benchmarks/merge_requests/1
- Host machine analysis: https://gitlab.com/BuildStream/benchmarks/merge_requests/1
- Output analysis, e.g. a table
- Automation: https://gitlab.com/BuildStream/benchmarks/merge_requests/1
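The log scraper component above could look roughly like this: read BuildStream's stdout line by line and emit matching records as JSON. Note the line format matched here (`[HH:MM:SS] message`) is an assumption for illustration only, not BuildStream's actual log format; the real scraper would match whatever the chosen BuildStream versions print.

```python
# Sketch of the log-scraper idea: turn benchmark-relevant lines from
# BuildStream stdout into a well-known JSON structure.
# ASSUMPTION: "[HH:MM:SS] message" is a made-up log format for this sketch.
import json
import re

LINE_RE = re.compile(r"^\[(\d{2}:\d{2}:\d{2})\]\s+(.*)$")


def scrape(stdout_text):
    """Return a list of {timestamp, message} dicts for lines that match."""
    records = []
    for line in stdout_text.splitlines():
        m = LINE_RE.match(line)
        if m:
            records.append({"timestamp": m.group(1), "message": m.group(2)})
    return records


sample = "[00:00:01] Build started\n[00:00:42] Build succeeded\nsome noise\n"
print(json.dumps(scrape(sample), indent=2))
```

Emitting a stable JSON schema here is what decouples the scraper from the analysis tools: any later component only needs to understand the JSON, not BuildStream's raw output.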
Configurable aspects:
- Scale of generated projects, e.g. 1 file, 10 files, 100 files ... more data points allow analyzing how a feature scales, but also mean more data to store and process.
- Which test(s) to run
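A minimal sketch of the configurable-scale project generation, under stated assumptions: the directory layout and file contents below are placeholders, and a real generator would emit a valid BuildStream `project.conf` and `.bst` element files rather than plain text.

```python
# Sketch of generating throwaway test projects at configurable scale
# (1, 10, 100 files, ...). Layout and contents are illustrative only.
import os
import tempfile


def generate_project(root, num_files):
    """Create a dummy project with `num_files` source files under `root`."""
    files_dir = os.path.join(root, "files")
    os.makedirs(files_dir, exist_ok=True)
    # Stand-in project.conf; the real schema is BuildStream's, not shown here.
    with open(os.path.join(root, "project.conf"), "w") as f:
        f.write("name: benchmark-project\n")
    for i in range(num_files):
        with open(os.path.join(files_dir, f"file-{i}.txt"), "w") as f:
            f.write(f"content {i}\n")
    return root


with tempfile.TemporaryDirectory() as tmp:
    generate_project(tmp, 10)
    print(len(os.listdir(os.path.join(tmp, "files"))))  # count of generated files
```

Parameterizing only on `num_files` keeps the generator trivially scriptable from the harness, so each scale point (1, 10, 100, ...) is just another call.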
Out of scope initially:
- Simulation of low network speed / latency
We expect some of this to live in contrib/benchmarking inside the buildstream.git tree. The project generation tools and the test harness may instead live in a separate repository, at least initially.