Add cross-request caching for CI includes

What does this MR do and why?

Adds commit-SHA-keyed Redis caching for CI include content to reduce Gitaly load during pipeline creation.

Problem: Every pipeline creation makes fresh Gitaly calls to fetch included files, even when the same content was recently fetched. When Gitaly is saturated, this causes intermittent timeouts (Gitlab::Ci::Config::External::Context::TimeoutError) for pipelines with many includes.

Solution: Cache file content in Redis using project_id:sha:path as the key. Because a Git SHA immutably identifies its content, a cached entry is guaranteed to be correct and never needs invalidation.

Impact: Subsequent pipelines that reference the same SHA and path combination serve content from Redis cache (4-hour TTL) instead of hitting Gitaly.
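The key scheme above could be sketched as follows. This is an illustrative sketch, not the actual implementation; the helper name, key prefix, and constant are hypothetical, while the project_id:sha:path structure and the 4-hour TTL come from this MR:

```ruby
# Illustrative only: key prefix and helper name are assumptions.
CACHE_TTL = 4 * 60 * 60 # 4-hour TTL, per the MR description

# Content is keyed by project, commit SHA, and file path. Because the SHA
# pins the exact tree, an entry can never go stale within its TTL.
def include_cache_key(project_id, sha, path)
  "ci_include:#{project_id}:#{sha}:#{path}"
end
```

Two pipelines on the same SHA that include the same path therefore compute the same key and share the cached content.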

Issue: Request timed out when fetching configuration f... (#588313)

Implementation

This MR introduces CachedContentFetcher, a shared service that:

  1. Checks Redis cache for requested paths
  2. Batches cache misses into a single Gitaly blobs_at call
  3. Writes fetched content back to cache

Both project includes (include:project) and component includes (include:component) now use this service.
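The three steps above could be sketched roughly as below. This is a hedged sketch, not the real CachedContentFetcher: a Hash stands in for Redis, and the gitaly collaborator is a stub whose blobs_at method models the single batched Gitaly call:

```ruby
# Sketch of the fetcher's three steps; class internals are illustrative.
class CachedContentFetcher
  def initialize(cache:, gitaly:)
    @cache = cache   # Redis in production; a Hash here for illustration
    @gitaly = gitaly # must respond to blobs_at(keys) -> { key => content }
  end

  def fetch(keys)
    # 1. Check the cache for every requested path
    hits = keys.each_with_object({}) do |key, acc|
      content = @cache[key]
      acc[key] = content if content
    end

    # 2. Batch all cache misses into a single Gitaly call
    misses = keys - hits.keys
    fetched = misses.empty? ? {} : @gitaly.blobs_at(misses)

    # 3. Write fetched content back to the cache
    fetched.each { |key, content| @cache[key] = content }

    hits.merge(fetched)
  end
end
```

The batching in step 2 is what keeps Gitaly load bounded: however many includes miss the cache, they cost one round trip rather than one per file.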

We're caching only the include types that reference stable SHAs with high reuse potential. Local includes fetch from frequently changing project SHAs (every new commit is a new SHA), while project and component includes reference stable tags and branches in config repositories, so they achieve much higher cache hit rates.

Sequential cache checking for components

Component includes check two possible template locations (simple: templates/foo.yml, complex: templates/foo/template.yml). To avoid unnecessary Gitaly calls when the simple path is cached, we check cache sequentially:

  1. Check simple path cache -> return if found
  2. Check complex path cache -> return if found
  3. Fetch both paths from Gitaly if both missed

This achieves **zero Gitaly calls** on subsequent pipeline runs while still batching both paths when needed. Improved component batching will be added in a subsequent MR: !230165
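The sequential check could be sketched as follows. This is an illustrative sketch under the same assumptions as above (a Hash for Redis, a stub for Gitaly); the function name is hypothetical:

```ruby
# Illustrative sketch of the sequential component cache check.
def fetch_component(cache, gitaly, simple_path, complex_path)
  # 1. Simple template location cached? Return it: zero Gitaly calls.
  if (content = cache[simple_path])
    return [simple_path, content]
  end

  # 2. Complex template location cached? Return it: zero Gitaly calls.
  if (content = cache[complex_path])
    return [complex_path, content]
  end

  # 3. Both missed: fetch both candidate paths in one batched Gitaly call
  #    and write whatever exists back to the cache.
  blobs = gitaly.blobs_at([simple_path, complex_path])
  blobs.each { |path, content| cache[path] = content if content }
  found = blobs[simple_path] ? simple_path : complex_path
  [found, blobs[found]]
end
```

Checking the two cache keys sequentially costs two cheap Redis reads in the worst case, but it means a warm cache never touches Gitaly at all, which is the property the MR is after.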

Feature flags

| Flag | Type | Scope |
| --- | --- | --- |
| ci_cache_project_includes (new) | ops | Gates caching for include:project |
| ci_optimize_component_fetching (existing, at 100%) | development | Gates caching for include:component |

Both flags will remain in the codebase long-term so caching can be disabled quickly if Redis capacity issues arise. They'll be enabled at 100% on gitlab.com but not rolled out to self-managed installations (which may have less Redis capacity than gitlab.com).

ci_optimize_component_fetching will be replaced with an ops flag in a subsequent MR.
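For illustration, the gating might look like the sketch below. The Feature module here is a stand-in stub for GitLab's real feature-flag API, and the guard method name is hypothetical:

```ruby
# Stub standing in for GitLab's Feature module, for illustration only.
module Feature
  FLAGS = { ci_cache_project_includes: true }.freeze

  def self.enabled?(name, _actor = nil)
    FLAGS.fetch(name, false)
  end
end

# Hypothetical guard: caching is used only when the ops flag is on,
# so it can be switched off instantly if Redis comes under pressure.
def cached_fetch_enabled?(project)
  Feature.enabled?(:ci_cache_project_includes, project)
end
```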

Edited by Avielle Wolfe
