Handle Cloudflare Logs for Dedicated account
Summary
We need a way to access logs from the Cloudflare Dedicated account.
Currently, DotCom uses Cloudflare Logpush to send our logs to a storage bucket; these logs are loaded into BigQuery by SREs as needed. See the runbooks documentation for more information: https://gitlab.com/gitlab-com/runbooks/-/blob/master/docs/cloudflare/logging.md
Proposal
We would like to follow this same setup for the new Dedicated account; however, logs should go into tenant-specific AWS/GCP buckets to stay within data residency requirements. These buckets already exist as part of a tenant; we will need to add a new top-level directory.
This is also captured on the Blueprint.

- Setup logpush on existing tenant zones to push logs into the corresponding bucket
- Determine how to load those logs into BigQuery/AWS Redshift (or somewhere appropriate for the Dedicated team to parse them)
- Include logpush settings in the Dedicated module, so future tenants have it configured automatically
- Determine where to store the account/audit logs (top level, not tenant specific)
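As a rough sketch of the per-zone logpush step, the Cloudflare Terraform provider exposes a `cloudflare_logpush_job` resource. The zone ID, bucket name, dataset, and field list below are placeholders, not decided values, and bucket destinations (GCS/S3) generally require completing Cloudflare's ownership challenge before the job is accepted.

```hcl
# Hypothetical sketch: push HTTP request logs for one tenant zone into
# that tenant's bucket. Zone ID, bucket name, and fields are placeholders.
resource "cloudflare_logpush_job" "tenant_http_requests" {
  zone_id = var.tenant_zone_id
  name    = "tenant-http-requests"
  dataset = "http_requests"
  enabled = true

  # Cloudflare expects a gs:// (GCS) or s3:// (S3) destination URI;
  # the path segment would map to the new top-level directory per tenant.
  destination_conf = "gs://${var.tenant_bucket}/cloudflare-logs/http_requests"

  # Which fields to include in each log line.
  logpull_options = "fields=ClientIP,ClientRequestHost,EdgeResponseStatus,RayID&timestamps=rfc3339"
}
```

If this lives in the Dedicated tenant module, future tenants would pick it up automatically, which is one of the success criteria below.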
Links
Success criteria
- logpush is configured for each zone to send logs to the corresponding tenant's bucket
- This happens automatically when future tenants are created (probably as part of the terraform module)
- Documentation exists so that SREs can query these logs
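For the GCP side of the querying step, one option (a sketch, not the decided approach) is a BigQuery external table over the tenant bucket, so SREs can query logs in place without a separate load job. The project, dataset, table, and bucket names below are assumptions; Logpush output is newline-delimited JSON, gzip-compressed.

```hcl
# Hypothetical sketch: expose a tenant's Logpush output as a BigQuery
# external table. Names and paths are placeholders.
resource "google_bigquery_table" "cloudflare_http_requests" {
  project    = var.tenant_project_id
  dataset_id = "cloudflare_logs"
  table_id   = "http_requests"

  external_data_configuration {
    autodetect    = true
    source_format = "NEWLINE_DELIMITED_JSON"
    compression   = "GZIP"
    source_uris   = ["gs://${var.tenant_bucket}/cloudflare-logs/http_requests/*"]
  }
}
```

An equivalent approach for AWS tenants (e.g. Redshift Spectrum or Athena over the S3 bucket) would still need to be determined, per the proposal above.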