panic: runtime error: index out of range during PurgeUploads
A customer reported (Zendesk, internal only) that calling the bulk delete API returns a 202, but no tags are actually deleted. The API was called using:
curl --request DELETE --data 'name_regex=.*' --data 'keep_n=1' --header "PRIVATE-TOKEN: private-token" "https://gitlab.example.com/api/v4/projects/4786/registry/repositories/990/tags"
In the Sidekiq current logs the customer provided, we can see multiple occurrences of the following error:
2020-04-17_09:04:56.97452 time="2020-04-17T09:04:56.97448611Z" level=info msg="PurgeUploads starting: olderThan=2020-04-10 09:04:56.974458747 +0000 UTC m=-603719.958049450, actuallyDelete=true"
2020-04-17_09:05:12.35987 panic: runtime error: index out of range
2020-04-17_09:05:12.35988
2020-04-17_09:05:12.35989 goroutine 53 [running]:
2020-04-17_09:05:12.35989 github.com/docker/distribution/registry/storage.getOutstandingUploads.func2(0x13d60a0, 0xc0002f5980, 0x13d60a0, 0xc0002f5980)
2020-04-17_09:05:12.35990 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/purgeuploads.go:107 +0x538
2020-04-17_09:05:12.35990 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).doWalk.func1(0xc000135c00, 0xc000124301, 0x1)
2020-04-17_09:05:12.35991 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1346 +0x77b
2020-04-17_09:05:12.35991 github.com/docker/distribution/vendor/github.com/aws/aws-sdk-go/service/s3.(*S3).ListObjectsV2PagesWithContext(0xc0000104c0, 0x13d54a0, 0xc000034930, 0xc000145950, 0xc0003df3b0, 0x0, 0x0, 0x0, 0xc0003d32b0, 0x40)
2020-04-17_09:05:12.35992 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/vendor/github.com/aws/aws-sdk-go/service/s3/api.go:6113 +0x15f
2020-04-17_09:05:12.35993 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).doWalk(0xc000146c80, 0x13d54a0, 0xc0000aa150, 0xc000506ab8, 0xc000383380, 0x33, 0x11095f5, 0x1, 0xc000690b70, 0x0, ...)
2020-04-17_09:05:12.35995 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1305 +0x397
2020-04-17_09:05:12.35997 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).doWalk.func1(0xc00036e380, 0xc000078201, 0x1)
2020-04-17_09:05:12.35998 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1360 +0x889
2020-04-17_09:05:12.35998 github.com/docker/distribution/vendor/github.com/aws/aws-sdk-go/service/s3.(*S3).ListObjectsV2PagesWithContext(0xc0000104c0, 0x13d54a0, 0xc0000aa150, 0xc0001454a0, 0xc000145680, 0x0, 0x0, 0x0, 0xc0003bd1a0, 0xc00001e300)
2020-04-17_09:05:12.35999 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/vendor/github.com/aws/aws-sdk-go/service/s3/api.go:6113 +0x15f
2020-04-17_09:05:12.36000 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).doWalk(0xc000146c80, 0x13d55e0, 0xc000149900, 0xc000506ab8, 0xc00001e3f1, 0x20, 0x11095f5, 0x1, 0xc000690b70, 0x0, ...)
2020-04-17_09:05:12.36002 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1305 +0x397
2020-04-17_09:05:12.36003 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).Walk(0xc000146c80, 0x13d55e0, 0xc000149900, 0xc0003e1d60, 0x20, 0xc000690b70, 0xc0004d5b30, 0x40d475)
2020-04-17_09:05:12.36004 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1169 +0x178
2020-04-17_09:05:12.36005 github.com/docker/distribution/registry/storage/driver/s3-aws.(*driver).WalkParallel(0xc000146c80, 0x13d55e0, 0xc000149900, 0xc0003e1d60, 0x20, 0xc000690b70, 0xc000001601, 0xc000690b70)
2020-04-17_09:05:12.36009 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/driver/s3-aws/s3.go:1186 +0x58d
2020-04-17_09:05:12.36010 github.com/docker/distribution/registry/storage.getOutstandingUploads(0x13d55e0, 0xc000149900, 0x13ea5c0, 0xc00031aeb0, 0x2, 0x2, 0x0, 0x7fa58f371eb8)
2020-04-17_09:05:12.36011 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/purgeuploads.go:104 +0x2f9
2020-04-17_09:05:12.36012 github.com/docker/distribution/registry/storage.PurgeUploads(0x13d55e0, 0xc000149900, 0x13ea5c0, 0xc00031aeb0, 0xbf9c2b8e3a150f7b, 0xfffddaeb7e5b4d56, 0x1cd49c0, 0xc000365e01, 0xc00004f760, 0xc00004f798, ...)
2020-04-17_09:05:12.36014 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/storage/purgeuploads.go:60 +0x152
2020-04-17_09:05:12.36015 github.com/docker/distribution/registry/handlers.startUploadPurger.func1(0x13ef0c0, 0xc000144be0, 0x13d55e0, 0xc000149900, 0x13ea5c0, 0xc00031aeb0, 0x2260ff9290000, 0x1cf1c00, 0x4e94914f0000)
2020-04-17_09:05:12.36017 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/handlers/app.go:1072 +0x21a
2020-04-17_09:05:12.36019 created by github.com/docker/distribution/registry/handlers.startUploadPurger
2020-04-17_09:05:12.36019 /var/cache/omnibus/src/registry/src/github.com/docker/distribution/registry/handlers/app.go:1065 +0x2d1
Best I can tell, this happens because the file value the purge walk operates on is nil or an empty string, so indexing into it panics with "index out of range". See the open PR https://github.com/docker/distribution/pull/3078#issue-356334161 for some details on this.
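For illustration, here is a minimal, self-contained Go sketch of that failure mode. It is not the registry's actual source: the hasReservedPrefix helper and the leading-underscore check are assumptions standing in for whatever per-component check the upload-purge walk performs.

package main

import (
	"fmt"
	"path"
)

// hasReservedPrefix is a hypothetical stand-in for the kind of check the
// upload-purge walk applies to each path component it visits.
func hasReservedPrefix(fullPath string) bool {
	_, file := path.Split(fullPath)
	// If fullPath ends in a separator (or the storage driver returns an
	// empty key), file is "" and file[0] panics with "index out of range".
	return file[0] == '_'
}

func main() {
	fmt.Println(hasReservedPrefix("docker/registry/v2/repositories/foo/_uploads"))
	// Trailing slash -> empty file component -> panic: runtime error: index out of range
	fmt.Println(hasReservedPrefix("docker/registry/v2/repositories/foo/_uploads/"))
}

In a sketch like this, guarding the index with a length check (for example, len(file) > 0 && file[0] == '_') avoids the panic.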
Are there any workarounds for this, or any additional information I can gather from the customer to help with troubleshooting?