Increased Error Rate Across Web Fleet
Summary
A single IP running DirBuster against the /explore endpoint caused a spike in web 500 responses. The IP was blocked via Cloudflare, errors dropped to 0, and no legitimate users were affected.
Timeline
All times UTC.
2020-04-06
- 19:15 - PagerDuty alerted
- 19:18 - EOC started investigating
- 19:34 - Incident declared from Slack
- 19:38 - An IP was identified as crawling the /explore endpoint, causing a large number of 500s
- 19:38 - The IP was blocked via Cloudflare and all errors dropped to 0
Details
There was a large jump in web 500 responses. This appears to have been caused by someone running DirBuster-1.0-RC1 against the /explore/ path.
After the IP was blocked, all errors dropped to 0. No legitimate users were affected by this, only the malicious IP.
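For reference, this kind of block can be applied through Cloudflare's IP Access Rules API (POST to the zone's firewall/access_rules/rules endpoint). A minimal sketch of the request body follows; the IP shown is a documentation placeholder, not the actual offending address, and the helper name is our own:

```python
import json

def cloudflare_block_rule(ip, note):
    """Build the JSON body for a Cloudflare IP Access Rule that blocks
    a single IP via POST /zones/{zone_id}/firewall/access_rules/rules."""
    return {
        "mode": "block",
        "configuration": {"target": "ip", "value": ip},
        "notes": note,
    }

# Placeholder IP (documentation range), not the real crawler address.
payload = cloudflare_block_rule("203.0.113.7", "DirBuster scan of /explore")
print(json.dumps(payload, indent=2))
```

The body would be sent with the usual Cloudflare auth headers; in this incident the rule was created through the Cloudflare dashboard rather than the API.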
Source
Incident declared by alex in Slack via /incident declare
command.
Resources
- If the Situation Zoom room was utilised, the recording will be automatically uploaded to the Incident room Google Drive folder (private)
Edited by Alex Hanselka