As a developer with an application composed of many microservices, I need to test a project by running integration tests involving other projects. e.g. a front-end React app depends on a back-end API, which itself depends on several back-end services. We should support running tests that involve all services. With Docker, it is possible to use Docker Compose to spin up a collection of services, each from its own GitLab project, with its own images in the GitLab Container Registry.
- We basically support this already, although we could document it better.
- Triggers and cross-project dependencies might help triggering tests when dependencies change.
- But there are things we can do to make this work well without any cross-project triggers. e.g. the front-end app would have a `docker-compose.yml` that knows how to run all the integration tests the front-end needs, pulling in images for the other services as required. Any change to the front-end would then run a full integration test with all services. Likewise, the API project would have its own `docker-compose.yml` that specified how to run integration tests whenever the API changed. The Docker Compose configurations would likely be very similar, but not necessarily identical, so there should be some way to share configuration between projects (includes, submodules).
- One advantage of this approach is that integration-test failures would be associated with the project that introduced the change, rather than being buried behind a falsely-green pipeline that triggers a failure in a different project's pipeline.
- The above is easy when changes are isolated to a single project, but what if there are two or more projects that need their changes tested together? Can we use comments in merge requests like "Depends on foo/!1234 (merged)" to know how to test these?
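The front-end scenario described above could be sketched roughly as follows. This is only an illustration: the group/project paths, image names, ports, and the `integration-tests` service are all hypothetical, not an agreed convention.

```yaml
# docker-compose.yml in the front-end project (hypothetical names throughout)
version: "3"
services:
  auth-service:
    # Pulled from a back-end project's GitLab Container Registry
    image: registry.gitlab.example.com/mygroup/auth-service:latest
  api:
    image: registry.gitlab.example.com/mygroup/api:latest
    depends_on:
      - auth-service
  frontend:
    # Built from the current checkout, so the change under test is included
    build: .
    environment:
      API_URL: http://api:8080
  integration-tests:
    build:
      context: .
      dockerfile: Dockerfile.test
    depends_on:
      - frontend
```

The CI job would then log in to the registry and run something like `docker-compose run integration-tests`, with the job's exit status coming from the test container.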
Links / references
- Master issue: gitlab-ee#933
If the vision of GitLab is to use Docker Compose, what is the role of GitLab services (in `.gitlab-ci.yml`) in all this? At my company I could have done it without Docker Compose and just used GitLab services instead, but they are far too limited for integration testing, so I used Docker Compose. I was blocked by this (for DNS resolution):
I think this needs clarification.
@alexispires Excellent question, and we don't have an answer yet. We don't like building functionality into GitLab services that has already been developed elsewhere, but there are also significant advantages to using services over Docker Compose. Perhaps services are a gentle introduction to get started, while advanced usage requires Compose. Or perhaps there's some way to leverage Compose directly in the runners. Let's use this issue to discuss.
One thing I don't like is needing to run Docker Compose as a script on the runner. I wonder about specifying the Docker Compose file in `.gitlab-ci.yml` instead of using `image`. Then the runner could spin everything up itself, saving a layer of Docker-in-Docker.
It gets a bit more complicated when the `docker-compose.yml` effectively needs to be dynamic, e.g. to reference an image that was created in a previous CI step.
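One existing way to get that dynamism is Compose's environment-variable interpolation, combined with CI-provided variables such as `CI_REGISTRY_IMAGE` and `CI_COMMIT_SHA`. A minimal sketch, assuming an earlier job in the pipeline built and pushed an image tagged with the commit SHA (that tagging scheme is an assumption, not a GitLab convention):

```yaml
# docker-compose.yml
version: "3"
services:
  app:
    # Interpolated by Compose from the job's environment at run time;
    # resolves to the image pushed by the earlier build job
    image: "${CI_REGISTRY_IMAGE}:${CI_COMMIT_SHA}"
```

If the runner itself parsed the Compose file, it would need to perform the same interpolation against the job's variables.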
Very interesting topic. I was also recently blocked by DNS when using `docker network`. Using `services` is not straightforward at all when you also need Docker-in-Docker. I agree this is something we should work on, but it would also be great to keep in mind that `.gitlab-ci.yml` is somewhat decoupled from Docker itself: we have no Docker-specific configuration (although we obviously use Docker under the hood). There is certainly room for improvement here.
On the decoupling from Docker, which is great: perhaps instead of defining services we could ask GitLab to import them from a provider? For example, importing from `docker-compose.yml` via a new top-level key:
```yaml
# .gitlab-ci.yml
docker-compose:
  file: docker-compose.yml
  import: # when no array provided, import all?
    - api
    - database
```
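For comparison, roughly the same setup expressed with today's `services:` keyword might look like this (the image paths and script are hypothetical; `name`/`alias` are existing GitLab CI service options):

```yaml
# .gitlab-ci.yml today, without the proposed docker-compose key
integration-tests:
  image: registry.example.com/mygroup/frontend-test:latest
  services:
    - name: registry.example.com/mygroup/api:latest
      alias: api
    - name: postgres:11
      alias: database
  script:
    - ./run-integration-tests.sh
```

The duplication between this and a project's `docker-compose.yml` is what the proposed import key would eliminate.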
We created GitLab QA, which is an integration-testing framework for GitLab. It does not use Docker Compose yet, but it has many elements in common with what this issue describes.