Nested Redis instance password secrets appear to be ignored with the global secret being used instead
## Summary
While testing an environment with two Redis instances (one for Cache and one for Persistent), AUTH errors were seen when trying to connect to the Cache instance: `WRONGPASS invalid username-password pair or user is disabled`.
In this setup, Persistent acts as the default instance and Cache handles the `cache` data, as its name suggests. This is configured by setting the global settings for all classes plus a nested `cache` config accordingly, as shown in the configuration section below.
Looking at the Webservice and Sidekiq deployment YAML, however, the cache config appears to be ignored, with the global secret name inserted instead:
```yaml
- secret:
    name: gitlab-redis-persistent-password  # <---- should be gitlab-redis-cache-password
    items:
      - key: password
        path: redis/cache-password
- secret:
    name: gitlab-redis-persistent-password
    items:
      - key: password
        path: redis/redis-password
```
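For reference, given the configuration further below, the first projection would be expected to reference the cache secret instead. A sketch of the expected rendered output (not verified against a fixed chart version):

```yaml
- secret:
    name: gitlab-redis-cache-password    # nested cache.auth secret
    items:
      - key: password
        path: redis/cache-password
- secret:
    name: gitlab-redis-persistent-password    # global auth secret
    items:
      - key: password
        path: redis/redis-password
```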
Additionally, the `dependencies` container didn't pick this up, only checking the Persistent Redis instance and not the separate Cache instance:
```
Begin parsing .erb templates from /var/opt/gitlab/templates
Writing /srv/gitlab/config/cable.yml
Writing /srv/gitlab/config/database.yml
Writing /srv/gitlab/config/gitlab.yml
Writing /srv/gitlab/config/redis.cache.yml
Writing /srv/gitlab/config/resque.yml
Begin parsing .tpl templates from /var/opt/gitlab/templates
Copying other config files found in /var/opt/gitlab/templates to /srv/gitlab/config
Copying smtp_settings.rb into /srv/gitlab/config
Checking: resque.yml, cable.yml
+ SUCCESS connecting to 'redis://10.43.26.119:6379' from resque.yml, through 10.43.26.119
+ SUCCESS connecting to 'redis://10.43.26.119:6379' from cable.yml, through 10.43.26.119
Checking: main
Database Schema - main (gitlabhq_production) - current: 20231115151449, codebase: 20231115151449
```
It's unknown at this time when this specifically started. In our test environments we've been using the same password for both instances, so this has gone unnoticed on our end. It was only seen recently when we tried to set up via GCP Memorystore, which sets up passwords separately. Anecdotally, GCP Memorystore has worked in the past, so this appears to be a regression from at least this year.
Finally, this may extend to [other Redis classes](https://docs.gitlab.com/charts/charts/globals.html#multiple-redis-support) and is probably worth checking as well.
## Steps to reproduce
Configure a Charts setup to point to two separate Redis instances, one global and one cache, *with* different passwords for each.
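One way to confirm the issue without a live cluster is to render the chart locally and grep the output for the secret name backing each Redis password path. A sketch, with assumptions: the release/chart names are placeholders, and the stand-in manifest below mimics the affected volume projection in place of real `helm template` output.

```shell
# Render the chart with the values below, e.g.:
#   helm template gitlab gitlab/gitlab -f values.yaml > rendered.yaml
# The stand-in manifest here mimics the affected Webservice volume projection.
cat > rendered.yaml <<'EOF'
- secret:
    name: gitlab-redis-persistent-password
    items:
      - key: password
        path: redis/cache-password
EOF

# Each Redis password path should be backed by its own instance's secret;
# if redis/cache-password is paired with the persistent secret, the bug is present.
grep -B3 'redis/cache-password' rendered.yaml
```

The same grep against the real rendered manifests shows which secret each password file is projected from.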
## Configuration used
```yaml
redis:
  auth:
    key: password
    secret: gitlab-redis-persistent-password
  cache:
    auth:
      key: password
      secret: gitlab-redis-cache-password
    host: <redacted>
    port: "6379"
    scheme: redis
  host: <redacted>
  port: "6379"
  scheme: redis
```
## Current behavior
The Redis Cache password is configured using the global secret instead of the nested `cache` secret.
## Expected behavior
The Redis Cache password should be configured from the secret specified in the nested `cache.auth` settings.
## Versions
- Chart: 7.6.1
- Platform:
- Cloud: GKE (confirming on AWS as well)
- Kubernetes: (`kubectl version`)
- Client: `v1.28.4`
- Server: `v1.27.3-gke.100`
- Helm: (`helm version`)
- `version.BuildInfo{Version:"v3.13.2", GitCommit:"2a2fb3b98829f1e0be6fb18af2f6599e0f4e8243", GitTreeState:"clean", GoVersion:"go1.21.4"}`
## Relevant logs
<details><summary>Webservice dependencies pod</summary>
```
Begin parsing .erb templates from /var/opt/gitlab/templates
Writing /srv/gitlab/config/cable.yml
Writing /srv/gitlab/config/database.yml
Writing /srv/gitlab/config/gitlab.yml
Writing /srv/gitlab/config/redis.cache.yml
Writing /srv/gitlab/config/resque.yml
Begin parsing .tpl templates from /var/opt/gitlab/templates
Copying other config files found in /var/opt/gitlab/templates to /srv/gitlab/config
Copying smtp_settings.rb into /srv/gitlab/config
Checking: resque.yml, cable.yml
+ SUCCESS connecting to 'redis://<redis_persistent_ip>:6379' from resque.yml, through <redis_persistent_ip>
+ SUCCESS connecting to 'redis://<redis_persistent_ip>:6379' from cable.yml, through <redis_persistent_ip>
Checking: main
Database Schema - main (gitlabhq_production) - current: 20231115151449, codebase: 20231115151449
```
</details>
<details><summary>Webservice pod</summary>
```
{"component": "gitlab","subcomponent":"exceptions_json","severity":"ERROR","time":"2023-12-12T14:57:49.005Z","correlation_id":"bdf478a8-5132-45ab-96d3-aa9d492399ef","exception.class":"Redis::CommandError","exception.message":"WRONGPASS invalid username-password pair or user is disabled.","exception.backtrace":["lib/gitlab/instrumentation/redis_interceptor.rb:10:in `block in call'","lib/gitlab/instrumentation/redis_interceptor.rb:42:in `instrument_call'","lib/gitlab/instrumentation/redis_interceptor.rb:9:in `call'","config/initializers/zz_metrics.rb:45:in `connect'","lib/gitlab/instrumentation/redis_interceptor.rb:10:in `block in call'","lib/gitlab/instrumentation/redis_interceptor.rb:42:in `instrument_call'","lib/gitlab/instrumentation/redis_interceptor.rb:9:in `call'","lib/feature.rb:259:in `block in current_feature_value'","lib/feature.rb:274:in `with_feature'","lib/feature.rb:255:in `current_feature_value'","lib/feature.rb:102:in `enabled?'","lib/feature.rb:115:in `disabled?'","lib/gitlab/lograge/custom_options.rb:39:in `call'","lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call'","lib/gitlab/middleware/memory_report.rb:13:in `call'","lib/gitlab/middleware/speedscope.rb:13:in `call'","lib/gitlab/database/load_balancing/rack_middleware.rb:23:in `call'","lib/gitlab/middleware/rails_queue_duration.rb:33:in `call'","lib/gitlab/etag_caching/middleware.rb:21:in `call'","lib/gitlab/metrics/rack_middleware.rb:16:in `block in call'","lib/gitlab/metrics/web_transaction.rb:46:in `run'","lib/gitlab/metrics/rack_middleware.rb:16:in `call'","lib/gitlab/middleware/go.rb:20:in `call'","lib/gitlab/middleware/query_analyzer.rb:11:in `block in call'","lib/gitlab/database/query_analyzer.rb:37:in `within'","lib/gitlab/middleware/query_analyzer.rb:11:in `call'","lib/gitlab/middleware/multipart.rb:173:in `call'","lib/gitlab/middleware/read_only/controller.rb:50:in `call'","lib/gitlab/middleware/read_only.rb:18:in `call'","lib/gitlab/middleware/same_site_cookies.rb:27:in 
`call'","lib/gitlab/middleware/path_traversal_check.rb:48:in `call'","lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call'","lib/gitlab/middleware/basic_health_check.rb:25:in `call'","lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call'","lib/gitlab/middleware/request_context.rb:15:in `call'","lib/gitlab/middleware/webhook_recursion_detection.rb:15:in `call'","config/initializers/fix_local_cache_middleware.rb:11:in `call'","lib/gitlab/middleware/compressed_json.rb:44:in `call'","lib/gitlab/middleware/rack_multipart_tempfile_factory.rb:19:in `call'","lib/gitlab/middleware/sidekiq_web_static.rb:20:in `call'","lib/gitlab/metrics/requests_rack_middleware.rb:79:in `call'","lib/gitlab/middleware/release_env.rb:13:in `call'"],"user.username":null,"tags.program":"web","tags.locale":"en","tags.feature_category":null,"tags.correlation_id":"bdf478a8-5132-45ab-96d3-aa9d492399ef","extra.storage":"feature_flag"}
```
</details>