
Tests/python: sane output

Context

In !1257 (merged), the full output of all Python tests was enabled in the CI.

Eventually, some logs grew so long that they would fail the CI. Therefore, in !2615 (merged), the output of pytest was redirected into a file that was printed at the end of the job's execution, which was more or less unreadable.

This MR just reverts to the default behavior of pytest in the CI. Here's how it looks: https://gitlab.com/nomadic-labs/tezos/-/jobs/1797003877

By default, the output looks like tezt's output in the CI: one line per test that succeeds, and if a test fails then its output is printed. One difference is that we do not store the log of all tests (including passed tests) in a separate file as is done for tezt. But I doubt the logs of passed tests are very useful. Anyhow, I can add that in a follow-up, although there does not seem to be an option in pytest for this.
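To make the change concrete, here is a purely illustrative sketch of the two approaches as CI job fragments. It is not the actual `.gitlab-ci.yml`: the pytest invocation, the log file name, and the dotted job names are assumptions; only `tail -n 100 tests_python/tmp/*` is taken from the previous setup.

```yaml
# Illustrative only -- not the real .gitlab-ci.yml.

# Previous approach (!2615): capture everything in a file and print part
# of it afterwards, which made the job log hard to read.
.pytest_redirected_output:
  script:
    - pytest tests_python > tests_python/tmp/pytest_output.log 2>&1
  after_script:
    # How the file was printed at the end of the job is assumed here.
    - tail -n 100 tests_python/tmp/*

# This MR: rely on pytest's default reporting, which prints one line per
# passing test and the captured output of failing tests only.
.pytest_default_output:
  script:
    - pytest tests_python
```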

Another possible issue is that the log of some failing tests could be too long to print in the job's log. However, if no tests fail we will not be bitten by this. In the worst case, we will have a job that fails first because a test fails, and then again because the log is too long. Eventually, we can add a buffer in pytest, as is done in tezt.

Other stuff

I realized that the pytest setup could be simplified in various ways now that we have added balancing and coverage (a sketch of the resulting layout follows the list):

  • the variables PYTEST_SUITE* are no longer used and have been removed
  • I moved the coverage handling from .integration_python_template to integration:pytest:, since we do not care about coverage from integration:pytest_examples: (the latter really only tests the test framework)
  • since we no longer run tail -n 100 tests_python/tmp/*, there is no need to create tests_python/tmp/empty__to_avoid_glob_failing
  • there is no need to reference the after_script of .template__cobertura, since we no longer override that key from the template
  • only the job integration:pytest: creates artifacts, so I moved the artifacts definition there
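For reference, here is a hypothetical sketch of the layout after these simplifications. Only the job and template names come from this MR; all keys, paths, and comments below are assumptions rather than the real configuration.

```yaml
# Hypothetical sketch, not the real .gitlab-ci.yml.
.integration_python_template:
  # Shared setup only: no coverage handling, no artifacts, no PYTEST_SUITE*
  # variables, no tests_python/tmp/empty__to_avoid_glob_failing file, and no
  # reference to the after_script of .template__cobertura.
  script:
    - pytest tests_python

integration:pytest:
  extends: .integration_python_template
  # Coverage handling (omitted here) and the artifacts definition now live
  # only in this job; the path below is a placeholder.
  artifacts:
    paths:
      - tests_python/
    when: always

integration:pytest_examples:
  extends: .integration_python_template
  # No coverage and no artifacts: this job only exercises the test framework.
```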

Manually testing the MR

Checklist

  • Document the interface of any function added or modified (see the coding guidelines)
  • Document any change to the user interface, including configuration parameters (see node configuration)
  • Provide automatic testing (see the testing guide).
  • For new features and bug fixes, add an item in the appropriate changelog (docs/protocols/alpha.rst for the protocol and the environment, the Development Version section of CHANGES.md for everything else).
  • Select suitable reviewers using the Reviewers field below.
  • Select as Assignee the next person who should take action on that MR