Object storage not working

Context: I'm running GitLab in a Podman container (I've tried multiple images, both CE and EE).

GitLab loads successfully, and I'm able to run it self-hosted with everything working fine... except the ability to store objects in S3.

My container definition:

      containers.podman.podman_container:
        name: gitlab
        user: root
        image: gitlab/gitlab-ee:latest
        ports:
          - "0.0.0.0:8082:80"
          - "0.0.0.0:2222:22"
        volumes:
          - /home/admin/_volumes/gitlab/config/:/etc/gitlab/
          - /home/admin/_volumes/gitlab/logs/:/var/log/gitlab/
          - /home/admin/_volumes/gitlab/data/:/var/opt/gitlab/
        env:
            GITLAB_OMNIBUS_CONFIG: | 
              external_url 'http://{{ HOST_IP }}/gitlab/'
              # Consolidated object storage configuration
              gitlab_rails['object_store']['enabled'] = true
              gitlab_rails['object_store']['connection'] = {
                'provider' => 'AWS',
                'region' => 'us-east-1',
                'aws_access_key_id' => '77777777777777777',
                'aws_secret_access_key' => '7777777777777777777777777777777777777777'
              }
              # OPTIONAL: The following lines are only needed if server side encryption is required
              gitlab_rails['object_store']['storage_options'] = {
                'server_side_encryption' => 'aws:kms',
                'server_side_encryption_kms_key_id' => 'arn:aws:kms:us-east-1:777777777:key/77777-7777-7777-7777-777777777'
              }
              gitlab_rails['object_store']['objects']['artifacts']['bucket'] = 'wktaanamktetl-gitlab-object-store-artifcats'
              gitlab_rails['object_store']['objects']['external_diffs']['bucket'] = 'wktaanamktetl-gitlab-object-store-external-diffs'
              gitlab_rails['object_store']['objects']['lfs']['bucket'] = 'wktaanamktetl-gitlab-object-store-lfs'
              gitlab_rails['object_store']['objects']['uploads']['bucket'] = 'wktaanamktetl-gitlab-object-store-uploads'
              gitlab_rails['object_store']['objects']['packages']['bucket'] = 'wktaanamktetl-gitlab-object-store-packages'
              gitlab_rails['object_store']['objects']['dependency_proxy']['bucket'] = 'wktaanamktetl-gitlab-object-store-dependency-proxy'
              gitlab_rails['object_store']['objects']['terraform_state']['bucket'] = 'wktaanamktetl-gitlab-object-store-terraform-state'
              gitlab_rails['object_store']['objects']['ci_secure_files']['bucket'] = 'wktaanamktetl-gitlab-object-store-ci-secure-files'
              gitlab_rails['object_store']['objects']['pages']['bucket'] = 'wktaanamktetl-gitlab-object-store-pages'

        shm_size: 2gb
        restart_policy: always
        state: started
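
As far as I can tell from the Omnibus docs, the consolidated form also supports per-type toggles and proxied downloads; I've left these at their defaults, but I'm listing them here in case one of them matters (this is my understanding of the options, not something I've verified):

```ruby
# Consolidated-form settings I have NOT set (shown with what I believe are
# the defaults). Per-type 'enabled' flags default to true once
# gitlab_rails['object_store']['enabled'] = true is set.
gitlab_rails['object_store']['proxy_download'] = false
gitlab_rails['object_store']['objects']['artifacts']['enabled'] = true
```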

I've also tried this without the volumes for /var/log/gitlab and /var/opt/gitlab.

The issue seems to be related to Workhorse trying to reach Redis:

==> /var/log/gitlab/gitlab-workhorse/current <==
{"error":"keywatcher: pubsub receive: EOF","level":"error","msg":"","time":"2023-04-11T00:40:57Z"}
{"address":"/var/opt/gitlab/redis/redis.socket","level":"info","msg":"redis: dialing","network":"unix","time":"2023-04-11T00:40:57Z"}

When I exec into the container and ping Redis:

redis-cli -s /var/opt/gitlab/redis/redis.socket ping

I get PONG.
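
The S3 credentials themselves can also be sanity-checked from a `gitlab-rails console` inside the container; this builds the same kind of Fog connection that GitLab builds from the config above (keys redacted, bucket name from my config):

```ruby
# Run inside `gitlab-rails console`; the fog-aws gem is bundled with GitLab.
connection = Fog::Storage.new(
  provider: 'AWS',
  region: 'us-east-1',
  aws_access_key_id: '77777777777777777',                              # redacted
  aws_secret_access_key: '7777777777777777777777777777777777777777'    # redacted
)
# nil here would mean the bucket is unreachable with these credentials.
connection.directories.get('wktaanamktetl-gitlab-object-store-uploads')
```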

The S3 connection itself seems to work, because overnight the uploads folder actually got populated back when I had originally started the container with a single bucket containing multiple directories:

/uploads/user/avatar/2/alert-bot.png and /uploads/user/avatar/3/support-bot.png

I then switched to multiple buckets as another test.

PROBLEM:

So: when I upload a picture to the repo, S3 remains empty.
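
A check I can run from `gitlab-rails console` is which store GitLab recorded for the newest upload (sketch based on my understanding of the `Upload` model, not verified):

```ruby
# Inside `gitlab-rails console` -- inspect the most recent upload record.
u = Upload.last
puts u.path   # where GitLab put the file
puts u.store  # 1 = ObjectStorage::Store::LOCAL, 2 = ObjectStorage::Store::REMOTE
```

If `store` comes back as 1, GitLab is still writing to local disk rather than S3, which would point at the object-storage config not being picked up rather than at the bucket itself.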
