Resolve "Final Checks"
Closes #22
Remote Logging
Having an issue saving the GCP connection id; might need to worry about this later.

- confirm remote logging works
- separate buckets for testing and default
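For reference, remote logging to GCS is driven by a few `airflow.cfg` settings; this is a minimal sketch only, and the bucket name and connection id here are assumptions (the real connection is the GCP one mentioned above):

```ini
[core]
# Ship task logs to GCS instead of the local filesystem (Airflow 1.10-era settings).
remote_logging = True
# Hypothetical bucket -- the checklist calls for separate buckets per environment.
remote_base_log_folder = gs://airflow-logs-testing
# Assumed connection id for the GCP connection that is currently failing to save.
remote_log_conn_id = google_cloud_default
```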
Alternate Solution
- mount logs in a shared/persistent volume as a stop-gap
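The change summary below mentions a persistent volume claim manifest for exactly this stop-gap; a minimal sketch, with the name, size, and access mode all assumed, might look like:

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: airflow-logs  # hypothetical name
spec:
  # ReadWriteMany so the scheduler, webserver, and workers can all share the logs.
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 10Gi  # assumed size
```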
Deployment
- point the GOOGLE_APPLICATION_CREDENTIALS env var to the right place
- update the watcher container to also get the credentials for k8s so airflow can create/destroy pods
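The usual pattern for the first item is to mount the service-account key from a secret and point the env var at the mounted path. A sketch of the relevant deployment-spec pieces, with the container, volume, and secret names all assumptions:

```yaml
spec:
  containers:
    - name: webserver  # hypothetical container name
      env:
        # Point the Google SDK at the mounted service-account key.
        - name: GOOGLE_APPLICATION_CREDENTIALS
          value: /var/secrets/google/key.json
      volumeMounts:
        - name: gcp-sa-key
          mountPath: /var/secrets/google
          readOnly: true
  volumes:
    - name: gcp-sa-key
      secret:
        secretName: gcp-service-account  # hypothetical secret name
```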
Env Vars
testing

- add the NAMESPACE env var as a secret

default

- add the NAMESPACE env var as a secret
- add the SLACK_API_TOKEN as a secret
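One way to define these is a Secret manifest per environment; a sketch for the default environment, with the secret name assumed and the real values living in 1pass:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airflow-secrets  # hypothetical name
  namespace: default
type: Opaque
stringData:
  NAMESPACE: default
  SLACK_API_TOKEN: <redacted>  # real value stored in 1pass, not in the repo
```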
Running DAGs
- fix the import errors around the local imports (being taken care of here: https://gitlab.com/gitlab-data/analytics/merge_requests/812)
- make sure production is loading DAGs correctly
- make sure Slack failure notifications are working correctly in prod
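For the Slack failure check, an `on_failure_callback` typically builds a message from the task context and posts it to Slack's `chat.postMessage` API using the SLACK_API_TOKEN secret above. This is only a sketch: the channel name and helper functions are hypothetical, and the callback signature assumes an Airflow-style context dict.

```python
import json
import os
from urllib import request


def build_failure_message(dag_id, task_id, execution_date, log_url):
    """Build the Slack payload for a task-failure notification."""
    return {
        "channel": "#analytics-pipelines",  # hypothetical channel
        "text": (
            f":red_circle: Task failed: `{dag_id}.{task_id}` "
            f"on {execution_date}\n<{log_url}|View logs>"
        ),
    }


def notify_slack_on_failure(context):
    """Airflow-style on_failure_callback: post the failure to Slack."""
    ti = context["task_instance"]
    payload = build_failure_message(
        context["dag"].dag_id, ti.task_id, context["execution_date"], ti.log_url
    )
    req = request.Request(
        "https://slack.com/api/chat.postMessage",
        data=json.dumps(payload).encode(),
        headers={
            # SLACK_API_TOKEN comes from the env-var secret above.
            "Authorization": f"Bearer {os.environ['SLACK_API_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    request.urlopen(req)
```

The payload builder is kept separate from the HTTP call so the message format can be tested without hitting Slack.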
Misc.
- add a copy of the testing secrets to 1pass
- add a copy of the default secrets to 1pass
Change Summary
Removed the secrets template, as the real secrets are now stored in 1pass. Added a persistent volume claim manifest, which is how the logs are being persisted. Updated the deployment file so that k8s can be interacted with from within the pod using the correct service account.
Edited by Thomas La Piana