Further remote execution testing

Background

Now that we are testing remote execution in CI (!1239 (merged)), perhaps we should consider broadening our remote execution tests.

Currently we have two "simple" remote execution tests in tests/remoteexecution/simple.py which:

  1. Build autotools with remote execution, check out the resulting artifact and verify that it contains the expected files.
  2. Build autotools remotely (again), then shell into the built element, run /usr/bin/hello and check that it produces the expected output.
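
For reference, the existing tests look roughly like the hypothetical sketch below. The real fixtures, project path and element names live in tests/remoteexecution/simple.py and its conftest (where the cli fixture is already configured to point at the remote execution service); the checkout invocation and the expected output string here are assumptions.

import os

import pytest

from buildstream.testing import cli  # noqa: F401  (pytest fixture)

# Hypothetical project path; the real tests point this at their example project
DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "project")


@pytest.mark.datafiles(DATA_DIR)
def test_remote_autotools_build(cli, datafiles):
    project = str(datafiles)
    checkout = os.path.join(project, "checkout")

    # Build the element against the configured remote execution service
    result = cli.run(project=project, args=["build", "autotools/amhello.bst"])
    result.assert_success()

    # Check the artifact out and verify it contains the expected files
    # (the exact checkout invocation may differ between BuildStream versions)
    result = cli.run(project=project, args=[
        "artifact", "checkout", "autotools/amhello.bst", "--directory", checkout,
    ])
    result.assert_success()
    assert os.path.exists(os.path.join(checkout, "usr", "bin", "hello"))


@pytest.mark.datafiles(DATA_DIR)
def test_remote_autotools_run(cli, datafiles):
    project = str(datafiles)

    result = cli.run(project=project, args=["build", "autotools/amhello.bst"])
    result.assert_success()

    # Shell into the built element and check the program's output
    # (the exact expected string is an assumption)
    result = cli.run(project=project, args=["shell", "autotools/amhello.bst", "--", "/usr/bin/hello"])
    result.assert_success()
    assert "Hello World" in result.output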

This is a great start to remote execution testing in BuildStream, but it is far from comprehensive. I propose that we simply add some more tests.

Task description

I'd imagine there is an endless number of tests we could write for this; however, to start with, I propose we add the following tests:

  • Build something 'bigger' than autotools.
    • The autotools example is great; however, we could try to build something a bit 'bigger', for example a subset of freedesktop. I'd recommend avoiding the base ostree import for CI jobs, though, as it is slow.
  • Build something via a junction.
    • We already have a junction example in our docs, which simply builds an element that depends on autotools via a junction; we could reuse this.
  • Build autotools with buildtree caching enabled, i.e. bst --cache-buildtrees always build autotools/amhello.bst
    • This should mean that the buildtree of the artifact is cached locally once it has been built remotely; we should verify that it is there.
    • We'd have to obtain the correct digest and then extract the builddir from the artifact (see the sketch after this list).
    • Some of our integration tests already do something similar.
  • Write an element that will fail when built, and verify that the resulting failed artifact contains a buildtree.
    • Again, we'd have to obtain the correct digest and then extract the builddir from the artifact.
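
To make the last two items more concrete, here is a hypothetical sketch. The element names, the asserted file and the extract_buildtree() helper are all assumptions; the real helper would obtain the buildtree digest from the artifact and stage it out of the local cache, much like our existing integration tests do.

import os

import pytest

from buildstream.testing import cli  # noqa: F401  (pytest fixture)

DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "project")


def extract_buildtree(cli, project, element):
    # Placeholder helper: look up the artifact's buildtree digest and stage
    # that directory tree out of the local cache, as our integration tests do.
    raise NotImplementedError


@pytest.mark.datafiles(DATA_DIR)
def test_remote_build_caches_buildtree(cli, datafiles):
    project = str(datafiles)
    element = "autotools/amhello.bst"

    # Build remotely, asking BuildStream to always cache the buildtree locally
    result = cli.run(project=project, args=["--cache-buildtrees", "always", "build", element])
    result.assert_success()

    # The buildtree of the remotely built artifact should now be cached locally
    buildtree = extract_buildtree(cli, project, element)
    assert os.path.exists(os.path.join(buildtree, "configure"))  # assumed file


@pytest.mark.datafiles(DATA_DIR)
def test_failed_remote_build_caches_buildtree(cli, datafiles):
    project = str(datafiles)
    element = "autotools/amhello-fail.bst"  # hypothetical element crafted to fail

    result = cli.run(project=project, args=["--cache-buildtrees", "always", "build", element])
    assert result.exit_code != 0

    # Even a failed artifact should carry a buildtree we can inspect
    buildtree = extract_buildtree(cli, project, element)
    assert os.path.isdir(buildtree)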

I'm open to any more suggestions, but I would like to avoid flooding this issue with so many additional tests that it never closes.


Aside: Running the remote execution test-suite locally

To run the remote execution tests, you need a BuildGrid instance up and running on port 50051, with at least one bot connected to it. I personally did not find it trivial to get tox -- --remote-execution tests/remoteexecution to produce 'passing' tests, so here is what my final setup looked like:

  1. These tests require a couple of environment variables: export ARTIFACT_CACHE_SERVICE=http://localhost:11001 REMOTE_EXECUTION_SERVICE=http://localhost:50051
    • Note that I'm not sure you even need ARTIFACT_CACHE_SERVICE, but it doesn't hurt to set it.
  2. I had to slightly modify Martin's docker-compose manifest. To use my modified version, clone my fork (https://gitlab.com/jennis/buildgrid.hub.docker.com/tree/jennis/work_with_bst_tests) and check out the branch jennis/work_with_bst_tests (see the example commands just below).
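
For example, something like the following should fetch the modified compose setup (assuming the standard GitLab clone URL for that fork):

git clone https://gitlab.com/jennis/buildgrid.hub.docker.com.git
cd buildgrid.hub.docker.com
git checkout jennis/work_with_bst_tests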

Now you should be able to run the remote execution tests. From one terminal:

docker-compose -f Composefile.buildstream.yml up --scale bots=2

Then, from another terminal, change into your BuildStream clone and run:

export ARTIFACT_CACHE_SERVICE=http://localhost:11001 REMOTE_EXECUTION_SERVICE=http://localhost:50051
tox -- --remote-execution tests/remoteexecution
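
If you only want to run a subset of the tests while iterating, you can point pytest at a single test file in the same way, for example:

tox -- --remote-execution tests/remoteexecution/simple.py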

Alternatively, you can install BuildGrid and BuildBox (fuse), then start your own local BuildGrid instance (with all the correct configuration) and a BuildBox fuse worker. Just follow the README instructions for both repos, but I found this approach difficult to get working.
