Commit 7d385382 authored by Donna Alexandra

Merge branch 'donna/update-with-eoc-general-slack-channel' into 'main'

chore: Update EOC slack channel reference

See merge request !11534
parents 7e310c71 4e3e8f33
Pipeline #1653432810 passed
@@ -87,7 +87,7 @@ For Sev3 and Sev4 incidents, the EOC is also responsible for [Incident Manager R
 1. **As EOC, your highest priority for the duration of your shift is the stability of GitLab.com.**
 1. When there is uncertainty about the cause of a degradation or outage, the **first action of the EOC** is to evaluate whether any changes can be reverted. It is always appropriate to toggle (to the previous state) any recently changed application feature flags, without asking for permission and without hesitation. The next step is to review Change Requests and validate the eligibility criteria for application rollbacks.
 1. The SSOT for who is the current EOC is the [GitLab Production](https://gitlab.pagerduty.com/service-directory/PATDFCE) service definition in PagerDuty.
-1. SREs are responsible for arranging coverage if they will be unavailable for a scheduled shift. To make a request, send a message indicating the days and times for which coverage is requested to the `#reliability-lounge` Slack channel. If you are unable to find coverage reach out to a Reliability Engineering Manager for assistance.
+1. SREs are responsible for arranging coverage if they will be unavailable for a scheduled shift. To make a request, send a message indicating the days and times for which coverage is requested to the `#eoc-general` Slack channel. If you are unable to find coverage, reach out to the [EOC coordinator](#engineer-on-call-coordinator) for assistance.
 1. Alerts routed to PagerDuty require acknowledgment within 15 minutes; otherwise they are escalated to the on-call Incident Manager.
 1. Alert-manager alerts in [`#alerts`](https://gitlab.slack.com/archives/alerts) and [`#feed_alerts-general`](https://gitlab.slack.com/archives/feed_alerts-general) are an important source of information about the health of the environment and should be monitored during working hours.
 1. If the PagerDuty alert noise is too high, your task as EOC is to reduce that noise by either fixing the system or tuning the alert.
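The SSOT point above (PagerDuty as the source for who is the current EOC) can be checked programmatically via PagerDuty's public REST API (`GET /oncalls`). A minimal sketch; the API token and the escalation-policy ID are placeholders and would need to come from the real GitLab Production service definition:

```python
import json
import urllib.request

PAGERDUTY_API = "https://api.pagerduty.com/oncalls"


def current_eoc(oncalls_response: dict):
    """Return the display name of the first on-call user found in a
    PagerDuty /oncalls response body, or None if nobody is on call."""
    for entry in oncalls_response.get("oncalls", []):
        user = entry.get("user") or {}
        if user.get("summary"):
            return user["summary"]
    return None


def fetch_oncalls(token: str, escalation_policy_id: str) -> dict:
    """Query the PagerDuty REST API for active on-calls, scoped to a
    single escalation policy. Both arguments are placeholders here:
    pass a real read-only API token and the policy ID taken from the
    GitLab Production service definition."""
    url = f"{PAGERDUTY_API}?escalation_policy_ids[]={escalation_policy_id}"
    req = urllib.request.Request(url, headers={
        "Authorization": f"Token token={token}",
        "Accept": "application/vnd.pagerduty+json;version=2",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Usage would be `current_eoc(fetch_oncalls("PD_API_TOKEN", "POLICY_ID"))`; splitting the pure response-parsing out of the network call keeps the lookup easy to test without hitting PagerDuty.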