
QA - Adds cluster agent e2e testing

João Alexandre Cunha requested to merge qa/adds-cluster-agent-e2e into master

Description of the test

gitlab-org/quality/testcases#1106 (closed)

Also relates to epic: &4949

Development Tasks

  • creates an empty project
  • creates a Clusters::Agent and a Clusters::AgentToken.
  • creates a K3s cluster.
  • deploys a secret containing the Clusters::AgentToken#token to K3s (see the first sketch after this list).
  • deploys agentk to K3s (validated only with GDK; it works thanks to the loopback alias).
  • figure out whether the above also works in the real QA pipeline without a loopback alias. It should, provided the GitLab instance deployed by the pipeline has a publicly resolvable hostname.
  • pushes a manifest.yaml file to the repo.
  • pushes a .gitlab/agents/my-agent/config.yaml file to the repo.
  • validates programmatically that the files were correctly synced. It should be fairly simple to write a script that watches for the resource defined in manifest.yaml to come up (see the sync-check sketch after this list).
  • add a check that skips the test if KAS is not deployed, or mark the spec as allowed to fail.
  • Decide how to automatically determine the agentk version to deploy. A simple script that downloads and reads GITLAB_KAS_VERSION from the GitLab repository's master branch might do (see the version sketch after this list), though it would require a GDK spec to have a network connection; maybe we're OK with that. This is already being discussed in a follow-up issue, #292935 (closed).
  • Find a way to test this against staging before merging, since the kas-address will differ from GDK's. I think (and hope) this is the only difference we might face when testing against staging.
  • Test logging in with a regular user instead of an admin. This will be super important, since I think we skip admin-user tests on live environments, but it should also work with an admin. 🤔
  • Remove the created project from the sandbox group via an after hook to help keep the house clean (see the cleanup sketch after this list).
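
For reference, here is a minimal sketch (not the final spec code) of how the agent and token could be created and the token handed to the cluster. The createClusterAgent and clusterAgentTokenCreate GraphQL mutations are the documented way to create these records, but the project path, agent name, environment variables and secret/namespace names below are placeholder assumptions.

```ruby
# Sketch only: registers the agent via GraphQL, then stores its token in the
# K3s cluster as a secret for agentk to consume.
require 'net/http'
require 'json'
require 'uri'

GITLAB_URL   = ENV.fetch('GITLAB_URL', 'http://gdk.test:3000')
ACCESS_TOKEN = ENV.fetch('GITLAB_QA_ACCESS_TOKEN')

def graphql(query)
  uri = URI("#{GITLAB_URL}/api/graphql")
  request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json',
                                     'Authorization' => "Bearer #{ACCESS_TOKEN}")
  request.body = { query: query }.to_json
  response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(request)
  end
  JSON.parse(response.body)
end

# Create the Clusters::Agent record for the project.
agent = graphql(<<~GQL).dig('data', 'createClusterAgent', 'clusterAgent')
  mutation {
    createClusterAgent(input: { projectPath: "my-group/my-agent-project", name: "my-agent" }) {
      clusterAgent { id }
      errors
    }
  }
GQL

# Create a Clusters::AgentToken; `secret` is the one-time token value.
token = graphql(<<~GQL).dig('data', 'clusterAgentTokenCreate', 'secret')
  mutation {
    clusterAgentTokenCreate(input: { clusterAgentId: "#{agent['id']}" }) {
      secret
      errors
    }
  }
GQL

# Store the token in the cluster; agentk is then deployed with this secret
# mounted as its token file and pointed at the instance's kas-address.
system('kubectl', 'create', 'namespace', 'gitlab-agent')
system('kubectl', 'create', 'secret', 'generic', 'gitlab-agent-token',
       "--from-literal=token=#{token}", '-n', 'gitlab-agent')
```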
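
The sync check mentioned above could be as simple as polling for the resource defined in manifest.yaml. The file contents below (a plain ConfigMap and a minimal gitops section) are illustrative only, as are the project path and resource names.

```ruby
# Sketch only: example file contents plus a polling check that the resource
# from manifest.yaml eventually appears in the cluster.
require 'timeout'

AGENT_CONFIG = <<~YAML # pushed to .gitlab/agents/my-agent/config.yaml
  gitops:
    manifest_projects:
      - id: my-group/my-agent-project
YAML

MANIFEST = <<~YAML # pushed to manifest.yaml in the same project
  apiVersion: v1
  kind: ConfigMap
  metadata:
    name: agent-e2e-demo
    namespace: default
  data:
    key: value
YAML

def synced?
  # Succeeds once agentk has applied the manifest.
  system('kubectl', 'get', 'configmap', 'agent-e2e-demo', '-n', 'default',
         out: File::NULL, err: File::NULL)
end

Timeout.timeout(180) do
  sleep 5 until synced?
end
puts 'manifest.yaml resource is up; sync validated'
```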
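
The version lookup floated in the GITLAB_KAS_VERSION item could look roughly like this, assuming the agentk image tag can be derived directly from that file (and accepting the network dependency it introduces):

```ruby
# Sketch only: read GITLAB_KAS_VERSION from the gitlab master branch so the
# deployed agentk image matches the KAS version shipped with GitLab.
require 'net/http'
require 'uri'

kas_version = Net::HTTP.get(
  URI('https://gitlab.com/gitlab-org/gitlab/-/raw/master/GITLAB_KAS_VERSION')
).strip

agentk_image =
  "registry.gitlab.com/gitlab-org/cluster-integration/gitlab-agent/agentk:v#{kas_version}"
puts agentk_image
```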
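
Finally, a hedged sketch of the cleanup hook from the last item, assuming the project is fabricated through the QA resource framework, whose API-fabricated resources expose remove_via_api!:

```ruby
# Sketch only: fabricate the test project via the API and remove it again
# after the example so the sandbox group stays clean.
let(:project) do
  Resource::Project.fabricate_via_api! do |project|
    project.name = 'cluster-agent-e2e'
  end
end

after do
  project.remove_via_api!
end
```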

Check-list

  • Confirm the test has a testcase: tag linking to an existing test case in the test case project.
  • Note if the test is intended to run in specific scenarios. If a scenario is new, add a link to the MR that adds the new scenario.
  • Follow the end-to-end tests style guide and best practices.
  • Use the appropriate RSpec metadata tag(s).
  • Ensure that a created resource is removed after test execution.
  • Verify the tags to ensure the test runs on the desired test environments.
  • If this MR has a dependency on another MR, such as a GitLab QA MR, specify the order in which the MRs should be merged.
  • (If applicable) Create a follow-up issue to document the special setup necessary to run the test: ISSUE_LINK
