
Add performance benchmarks to scanner service and summarize in report

Problem to solve

The benchmarking suite used for SAST in the IDE currently computes only one benchmark: round-trip time.

We should compute additional benchmarks to help identify where to focus our future tuning efforts.

For example, the /scan web method of our scanner service has an associated "scan time"; subtracting it from the "round-trip time" gives the "network latency".

In addition to new benchmarks, each benchmark should be summarized with additional statistics. Currently only the mean of each benchmark is reported. It would be helpful to also include the standard deviation, median, and quartiles.
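
As a rough illustration, here is a minimal sketch (in Python, not the benchmarking suite's actual code) of how samples for each benchmark could be summarized and how network latency could be derived; the sample values and the summarize() helper are purely illustrative:

```python
# Minimal sketch (not the benchmarking suite's actual code) of summarizing
# benchmark samples and deriving network latency. Sample values and the
# summarize() helper are purely illustrative.
import statistics

def summarize(samples: list[float]) -> dict:
    """Mean, standard deviation, median, and quartiles for one benchmark."""
    q1, q2, q3 = statistics.quantiles(samples, n=4)  # three quartile cut points
    return {
        "mean": statistics.mean(samples),
        "stdev": statistics.stdev(samples),
        "median": statistics.median(samples),
        "quartiles": (q1, q2, q3),
    }

round_trip_times = [0.92, 1.05, 0.88, 1.11, 0.97]  # seconds, example data
scan_times       = [0.61, 0.70, 0.58, 0.74, 0.65]  # as reported by /scan
network_latency  = [rt - st for rt, st in zip(round_trip_times, scan_times)]

report = {
    "round_trip_time": summarize(round_trip_times),
    "scan_time": summarize(scan_times),
    "network_latency": summarize(network_latency),
}
```

Note that statistics.quantiles(..., n=4) returns the three cut points (Q1, median, Q3) and, like statistics.stdev, needs at least two samples per benchmark, so the suite would need a minimum iteration count per file.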

Proposal

  • add a boolean benchmark parameter to the service /scan web method that computes additional benchmarks and adds them to the returned JSON (a measurement sketch follows this list)
    • real_time: the total "wall clock" time spent scanning the file
    • sys_time: the system time spent scanning, including time spent by any child processes spawned for the scan
    • user_time: the user time spent scanning
    • memory_used: an estimate of the memory used during the scan
    • io_read_size: total data read from file
    • io_write_size: total data written to file
    • io_read_time: time spent reading from file
    • io_write_time: time spent writing to file
  • use the updated /scan API in the benchmarking suite and aggregate the results in the report.
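
For the service side, here is a minimal sketch of how the benchmark parameter could gather these numbers around a scan. The run_scan helper and the "benchmarks" key are hypothetical; times and peak memory come from os.times() and resource, the I/O byte counts from the Linux-only /proc/self/io file (which covers the service process itself, not child scanners), and the I/O read/write times are omitted because they would need instrumentation inside the scanner:

```python
# Minimal sketch (not the service's actual implementation) of collecting the
# proposed benchmarks around a scan when benchmark=true is passed to /scan.
import os
import resource  # Unix-only
import time

def run_scan(path: str) -> dict:
    """Placeholder for the real scanner invocation; returns the scan result."""
    with open(path, "rb") as f:
        f.read()
    return {"vulnerabilities": []}

def read_proc_io() -> dict:
    """Cumulative I/O counters for this process (Linux-only /proc/self/io)."""
    counters = {}
    with open("/proc/self/io") as f:
        for line in f:
            key, value = line.split(":")
            counters[key.strip()] = int(value)
    return counters

def scan_with_benchmarks(path: str) -> dict:
    io_before = read_proc_io()
    cpu_before = os.times()
    start = time.perf_counter()

    result = run_scan(path)

    real_time = time.perf_counter() - start
    cpu_after = os.times()
    io_after = read_proc_io()
    # Peak RSS since process start (kilobytes on Linux): only a rough
    # upper-bound estimate of the memory used by this particular scan.
    rss_self = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    rss_children = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss

    result["benchmarks"] = {
        "real_time": real_time,
        # user/sys time deltas include waited-for child processes.
        "user_time": (cpu_after.user - cpu_before.user)
                     + (cpu_after.children_user - cpu_before.children_user),
        "sys_time": (cpu_after.system - cpu_before.system)
                    + (cpu_after.children_system - cpu_before.children_system),
        "memory_used": max(rss_self, rss_children),
        "io_read_size": io_after["read_bytes"] - io_before["read_bytes"],
        "io_write_size": io_after["write_bytes"] - io_before["write_bytes"],
        # io_read_time / io_write_time would need instrumentation in the
        # scanner itself and are omitted from this sketch.
    }
    return result
```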