Closed
Created Dec 06, 2017 by Shinya Maeda (@shinya.maeda), Maintainer

Respect `environment_scope` with `only: kubernetes: active` strategy (gitlab-ci.yml)

Summary

When a job uses the `only: kubernetes: active` strategy, the job is executed only when the project has an enabled Kubernetes integration (Platform::Kubernetes or KubernetesService).

See the documentation for this behavior.


See the example below. The job is created only when the pipeline is scheduled or runs for the master branch, and only if the Kubernetes service is active in the project.

job:
  only:
    refs:
      - master
      - schedules
    kubernetes: active

We also recently implemented multiple Kubernetes clusters per project: a project can now have multiple Kubernetes instances, each belonging to a different environment (via its `environment_scope`).

But today `kubernetes: active` only checks whether the project has any cluster at all, which creates an edge case.

For example, suppose a project has a cluster with a `review/*` environment scope and the following gitlab-ci.yml:

deploy to production:
  environment:
    name: production
  only:
    kubernetes: active

The job will be executed because the project has a cluster, but no cluster actually matches the `production` environment, so the job will fail or behave unexpectedly.
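The mismatch can be illustrated with a small sketch. This assumes `environment_scope` is matched with shell-style wildcards; `fnmatch` is only an approximation of GitLab's actual scope matching, and the variable names are hypothetical:

```python
from fnmatch import fnmatch

# Hypothetical setup: the project's only cluster is scoped to review/*.
cluster_scopes = ["review/*"]

# Current behavior: `kubernetes: active` passes because *some* cluster exists.
any_cluster = len(cluster_scopes) > 0
print(any_cluster)  # True -> the job is created

# But no cluster scope actually covers the `production` environment.
matches = any(fnmatch("production", scope) for scope in cluster_scopes)
print(matches)  # False -> the deploy job has no cluster to target
```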

Solution

Check `environment:name` when the `kubernetes: active` strategy is used:

  1. If `kubernetes: active` exists and `environment:name` exists, check whether the cluster corresponding to that environment is active.
  2. If `kubernetes: active` exists and `environment:name` does not exist, check whether any cluster exists.
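The two rules above can be sketched as follows. This is a minimal illustration, not GitLab's implementation: `kubernetes_active` is a hypothetical helper, and `fnmatch` only approximates how `environment_scope` wildcards are matched:

```python
from fnmatch import fnmatch

def kubernetes_active(cluster_scopes, environment_name=None):
    """Hypothetical check for the proposed `kubernetes: active` semantics."""
    if environment_name is None:
        # Rule 2: no environment:name -> any existing cluster counts.
        return len(cluster_scopes) > 0
    # Rule 1: environment:name given -> some cluster scope must cover it.
    return any(fnmatch(environment_name, scope) for scope in cluster_scopes)

print(kubernetes_active(["review/*"], "production"))    # False: job skipped
print(kubernetes_active(["review/*"], "review/app-1"))  # True: scope matches
print(kubernetes_active(["review/*"]))                  # True: any cluster
print(kubernetes_active(["*"], "production"))           # True: wildcard scope
```

The last call also shows why the wildcard-scope workaround below works: a `*` scope matches every environment name.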

Possible workaround

As a workaround, users can create a cluster with a wildcard environment scope (`*`) for now. IIRC, our gitlab-org/gitlab project uses the same workaround for the same reason.

Related

  • https://gitlab.com/gitlab-org/gitlab-ee/merge_requests/3603#note_50150076
  • https://gitlab.com/gitlab-org/gitlab-ee/issues/3734

/cc @bikebilly @ayufan @grzesiek

Edited Nov 05, 2020 by Viktor Nagy
Reference: gitlab-org/gitlab#20351