XSS in k8s proxy when running Gitlab behind Akamai

⚠️ Please read the process on how to fix security issues before starting to work on the issue. Vulnerabilities must be fixed in a security mirror.

HackerOne report #3277291 by joaxcar on 2025-07-29, assigned to @katwu:

Report | Attachments | How To Reproduce

Report

Summary

TLDR:
It is still possible to cache content served by the k8s proxy endpoint. This is not a big issue in and of itself, but combined with a bug in the content-type validation and a Firefox quirk, it can still lead to stored XSS.

Hi team, I found what I think is a final vector for XSS in the k8s proxy. After the fix in 18.2.1, most caching issues should be solved. There is, however, one big CDN that is still at risk, and that is Akamai.

The fix clears out the basic cache headers such as cache-control and surrogate-control, as well as two targeted cache control headers (see https://www.akamai.com/blog/news/targeted-cache-control): CDN-cache-control and cloudflare-cdn-cache-control. The problem is that this is not an exhaustive list of targeted cache control headers, and such a list will be hard to maintain as the mechanism gains wider adoption.

Two such headers that are missing from the list today are akamai-cache-control, which is reportedly still honored, and edge-control, a powerful Akamai header that can override normal cache-control headers. See https://techdocs.akamai.com/property-mgr/docs/know-caching:

"On the Akamai network, the Edge-Control settings take precedence over any Cache-Control and Expires headers as well as over many caching-related configuration settings."

This is mainly a side note to show that caching on this endpoint is still possible. On its own that should not be a concern, since the patch also removes x-accel-redirect headers. But one base issue that I mentioned in my original report is still present: the ability to circumvent the content-type restrictions and serve content without a content-type header. In Firefox, this can lead to XSS via MIME sniffing.

The main issue

Since my first report on this endpoint, GitLab now serves all responses here with x-content-type-options: nosniff, and there is a content-type check in https://gitlab.com/gitlab-org/cluster-integration/gitlab-agent/-/blob/master/internal/module/kubernetes_api/server/proxy.go

	allowedResponseContentTypes = []string{  
		runtime.ContentTypeJSON,  
		runtime.ContentTypeYAML,  
		runtime.ContentTypeProtobuf,  
		runtime.ContentTypeCBOR,  
		"text/plain",  
	}  

that is used here

func checkContentType(h http.Header, allowed ...string) error {  
	// There should be at most one Content-Type header, but it's not our job to do something about it if there is more.  
	// We just ensure they are all allowed.  
nextContentType:  
	for _, ct := range h[httpz.ContentTypeHeader] {  
		mediatype, _, err := mime.ParseMediaType(ct)  
		if err != nil && err != mime.ErrInvalidMediaParameter {  
			// Parsing error and not a MIME parameter parsing error, which we ignore.  
			return fmt.Errorf("check Content-Type: %w", err)  
		}  
		for _, a := range allowed {  
			if mediatype == a {  
				// This one is allowed, onto the next Content-Type header  
				continue nextContentType  
			}  
		}  
		return fmt.Errorf("%s not allowed: %s", httpz.ContentTypeHeader, mediatype)  
	}  
	return nil  
}

where the issue lies in this line: for _, ct := range h[httpz.ContentTypeHeader]. The problem is that the function only returns an error when a disallowed content-type header is present; it happily accepts a response that has no content-type header at all.

If there is no content-type header, the range has nothing to iterate over and the function returns nil. This lets us serve responses that lack a content-type yet still carry a nosniff header.

Firefox quirk

Now to Firefox. Firefox has a few places where an HTML author can give "type hints" on links and frames, telling the browser how to interpret the content if no content type is given. Here is an example that works in the current version of Firefox:

<a type=text/html href="https://gl8.j15.se/-/kubernetes-agent/k8s-proxy/api/v1/pods">a</a>  

The type=text/html attribute tells Firefox to interpret the linked document as HTML if the response does not specify its own content type. Combined with the missing content-type header on the k8s endpoint, this triggers XSS.

Also note that this HTML tag is valid in Markdown on GitLab.com.

Steps to reproduce

One caveat: Akamai is not a very trial-friendly vendor, so for the caching part you will have to trust the documentation for now (there are numerous ways that caching could still be in place). I will instead show a mocked version of the Firefox XSS.

**USE FIREFOX**

  1. Set up a self-hosted GitLab instance. I don't know if you need SSL configured, but it's probably good to set it up (my test setup has it)
  2. SSH into the server and open /etc/gitlab/gitlab.rb and enable CSP by adding these lines
gitlab_rails['content_security_policy'] = {  
  'enabled' => true,  
  'report_only' => false,  
}
  3. Save the file and then run sudo gitlab-ctl reconfigure. This gives the instance a ".com"-matching CSP
  4. Now log in as a regular user
  5. Create a new group mynewgroup
  6. Create a project in the group called config
  7. In the project mynewgroup/config, create a file named .gitlab/agents/test-agent/config.yaml with this content (replace the group name if yours differs)
user_access:  
  access_as:  
    agent: {}  
  groups:  
    - id: mynewgroup  
  8. Go to https://gitlab.example.com/mynewgroup/config/-/clusters, click Connect a cluster, and name it test-agent
  9. In the popup, under the heading Register agent with the UI, click Register
  10. Copy the Agent access token
  11. In a terminal on your local computer, create a file called glab-agentk-token-local and paste the token from step 10 into it
  12. Download the attached file server.py into the same directory
  13. Open another terminal tab in the same directory. In one tab run python3 server.py; in the other, run this command (it is written for Linux; on macOS replace 127.0.0.1:9999 with host.docker.internal:9999, and replace gitlab.example.com with your server domain)
docker run \  
--network host \  
--rm \  
-it \  
-v ./glab-agentk-token-local:/etc/agentk/secrets/token \  
-e POD_NAMESPACE=agentk-nsname \  
-e POD_NAME=agent-podname registry.gitlab.com/gitlab-org/cluster-integration/gitlab-agent/agentk:latest  \  
--kas-address=wss://gitlab.example.com/-/kubernetes-agent/ \  
--token-file=/etc/agentk/secrets/token \  
-s 127.0.0.1:9999  

You should now see the test-agent go to the connected state in the browser (if not, refresh the browser).
  14. Now go to https://gitlab.example.com/mynewgroup/config/-/environments and click New Environment
  15. Give the environment a name, add https://example.com as the External URL, and select test-agent as the GitLab Agent.
  16. Click Create and take note of the ID
  17. Open the devtools console and evaluate this template literal (replace the host and the agent ID); the console prints the resulting URL

`https://gitlab.example.com/-/kubernetes-agent/k8s-proxy/api/v1/pods?gitlab-agent-id=1&gitlab-csrf-token=${document.getElementsByName('csrf-token')[0].content}`   
  18. Take the URL that is generated and save it
  19. Click Edit on the environment page and add this description, replacing the href with the URL from step 17
<a type=text/html href="https://gitlab.example.com/-/kubernetes-agent/k8s-proxy/api/v1/pods?gitlab-agent-id=<AGENT_ID>&gitlab-csrf-token=<CSRF_TOKEN>">test XSS</a>  
  20. Click save
  21. Now click the link rendered in the description and the XSS should pop.

NOTE again that this example uses a copied CSRF token. The real issue arises when a cache is involved: the regular request with the CSRF token in the header (see all other reports on this endpoint) gets cached and then triggers this behaviour for other visitors.

Screen_Recording_2025-07-30_at_00.53.09.mov

What is the current bug behavior?

K8s proxy responses can still be served without a content-type header. In combination with targeted cache headers, this can lead to XSS with a full CSP bypass on self-hosted servers.

What is the expected correct behavior?

If a content-type such as text/plain were always forced, even when the upstream response lacks one, the issue would not exist.

Output of checks
System information  
System:         Ubuntu 24.10  
Proxy:          no  
Current User:   git  
Using RVM:      no  
Ruby Version:   3.2.5  
Gem Version:    3.6.9  
Bundler Version:2.6.5  
Rake Version:   13.0.6  
Redis Version:  7.2.9  
Sidekiq Version:7.3.9  
Go Version:     unknown

GitLab information  
Version:        18.2.1-ee  
Revision:       289574e3868  
Directory:      /opt/gitlab/embedded/service/gitlab-rails  
DB Adapter:     PostgreSQL  
DB Version:     16.8  
URL:            https://gl8.j15.se  
HTTP Clone URL: https://gl8.j15.se/some-group/some-project.git  
SSH Clone URL:  git@gl8.j15.se:some-group/some-project.git  
Elasticsearch:  no  
Geo:            no  
Using LDAP:     no  
Using Omniauth: yes  
Omniauth Providers: 

GitLab Shell  
Version:        14.43.0  
Repository storages:  
- default:      unix:/var/opt/gitlab/gitaly/gitaly.socket  
GitLab Shell path:              /opt/gitlab/embedded/service/gitlab-shell

Gitaly  
- default Address:      unix:/var/opt/gitlab/gitaly/gitaly.socket  
- default Version:      18.2.1  
- default Git Version:  2.50.1.gl1  

Impact

Stored XSS with full CSP bypass

Attachments

Warning: Attachments received through HackerOne, please exercise caution!

  • server.py
  • Screen_Recording_2025-07-30_at_00.53.09.mov

How To Reproduce

Please add reproducibility information to this section:
