GitLab.org / GitLab · Issue #20351
Closed
Issue created Dec 06, 2017 by Shinya Maeda (@shinya.maeda), Maintainer

Respect `environment_scope` with `only: kubernetes: active` strategy (gitlab-ci.yml)

Summary

When a job has the `only: kubernetes: active` strategy, the job is executed only when the project has an enabled Kubernetes instance (`Platform::Kubernetes` or `KubernetesService`).

Here is the documentation.


See the example below. The job is created only when the pipeline has been scheduled or runs for the master branch, and only if the Kubernetes service is active in the project.

```yaml
job:
  only:
    refs:
      - master
      - schedules
    kubernetes: active
```

And recently, we implemented multiple Kubernetes clusters per project. A project can now have multiple Kubernetes instances, each belonging to a different environment.

But today, `kubernetes: active` just checks whether the project has any clusters at all. This creates a subtle edge case.

For example:

  1. A project has a cluster with the `review/*` environment scope.

gitlab-ci.yml

```yaml
deploy to production:
  environment:
    name: production
  only:
    kubernetes: active
```

The job will be executed because the project has a cluster, but no cluster matches the `production` environment, so the job will fail or cause unexpected behavior.
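
For illustration, here is a minimal sketch of the kind of scope matching involved, assuming environment scopes are treated as shell-style wildcard patterns; the helper name is hypothetical, not GitLab's actual implementation:

```python
# Hypothetical sketch: match an environment name against a cluster's
# environment_scope, treating the scope as a shell-style wildcard pattern.
from fnmatch import fnmatch

def cluster_matches_environment(environment_scope, environment_name):
    """True if a cluster with the given scope covers the given environment name."""
    return fnmatch(environment_name, environment_scope)

print(cluster_matches_environment("review/*", "review/my-feature"))  # True
print(cluster_matches_environment("review/*", "production"))         # False -> no matching cluster
```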

Solution

Check `environment:name` when the `kubernetes: active` strategy exists.

e.g.

  1. If `kubernetes: active` exists and `environment:name` exists, we check the activeness of the corresponding (scope-matched) cluster.
  2. If `kubernetes: active` exists and `environment:name` does not exist, we check whether any clusters exist (see the sketch below).
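
To make the two cases concrete, here is a rough sketch of the proposed check, assuming clusters are plain records with an `environment_scope` and an enabled flag; the names and data shapes are illustrative only, not the real GitLab code:

```python
# Hypothetical sketch of the proposed `only: kubernetes: active` check.
from fnmatch import fnmatch

def kubernetes_active(clusters, environment_name=None):
    # Case 2: no environment:name -> keep today's behaviour, any enabled cluster counts.
    if environment_name is None:
        return any(c["enabled"] for c in clusters)
    # Case 1: environment:name given -> only a cluster whose scope matches it counts.
    return any(c["enabled"] and fnmatch(environment_name, c["environment_scope"])
               for c in clusters)

clusters = [{"environment_scope": "review/*", "enabled": True}]
print(kubernetes_active(clusters, "production"))  # False -> "deploy to production" is not created
print(kubernetes_active(clusters))                # True  -> jobs without an environment still run
```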

Possible workaround

As a workaround, users would need to create a cluster with a wildcard environment scope (`*`) for now. IIRC, our gitlab-org/gitlab project uses the same workaround for the same reason.
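
In terms of the sketch above, a `*` scope matches every environment name, which is why the wildcard cluster makes `kubernetes: active` behave as expected:

```python
from fnmatch import fnmatch

print(fnmatch("production", "*"))  # True -- a '*' scoped cluster covers every environment
print(fnmatch("review/app", "*"))  # True
```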

Related

  • https://gitlab.com/gitlab-org/gitlab-ee/merge_requests/3603#note_50150076
  • https://gitlab.com/gitlab-org/gitlab-ee/issues/3734

/cc @bikebilly @ayufan @grzesiek

Edited Nov 05, 2020 by Viktor Nagy (GitLab)