Deploy fluentd and perform local log parsing and shipping to Elastic Cloud
The simplest and fastest way of solving our immediate problems with logs is to leverage fluentd and Elastic Cloud.
- `fluentd` will be deployed with `chef-client`
- `fluentd` will track and parse system-level logs on all nodes (auth logs, syslog in general, etc.)
- `fluentd` will also track and parse application-specific logs (gitlab-shell, postgresql, redis, etc.)
- `fluentd` will have a modular configuration approach so we can choose what logs to track, and where, based on chef roles
- `fluentd` will write logs directly to the Elastic Cloud cluster via HTTPS + authentication (see the sketch after this list)
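As a rough illustration of the node-level configuration, a minimal `fluentd` sketch tailing syslog and shipping to Elastic Cloud over HTTPS could look like the following (paths, endpoint, and credentials are placeholders, not our actual values; the output side assumes the `fluent-plugin-elasticsearch` plugin):

```
# Minimal sketch -- every value below is a placeholder, not a real setting.

# Track and parse the system syslog on the node.
<source>
  @type tail
  path /var/log/syslog
  pos_file /var/lib/fluentd/syslog.pos
  tag system.syslog
  format syslog
</source>

# Ship everything to the Elastic Cloud cluster over HTTPS with basic auth.
<match **>
  @type elasticsearch
  scheme https
  host CLUSTER_ID.us-east-1.aws.found.io  # placeholder Elastic Cloud endpoint
  port 9243
  user fluentd                            # placeholder; real credentials come
  password CHANGE_ME                      # from chef-vault (see steps below)
  logstash_format true
</match>
```

The modular, per-role part can then be a matter of which snippets chef drops into an include directory, since `fluentd`'s `@include` directive accepts globs (e.g. `@include conf.d/*.conf`).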
To put all of the above in place, the required steps are the following:
- URGENT: `logstash` node cannot cope with all the current log parsing load gitlab-cookbooks/gitlab-elk!93 (merged)
- #164 (closed) Create a `gitlab_fluentd` cookbook
- #164 (closed) Make the logging configuration modular (each role will track different log files)
- Create `staging` cluster in Elastic Cloud
- Handle Elastic Cloud credentials with `chef-vault` in `staging` (see the recipe sketch after this list)
- #167 (closed) Start gathering `system` logs in `staging`
- #167 (closed) Start gathering `gitlab` logs in `staging`
- #167 (closed) Start gathering `haproxy` logs in `staging`
- Create `production` cluster in Elastic Cloud
- Handle Elastic Cloud credentials with `chef-vault` in `production`
- Start gathering `system` logs in `production`
- Start gathering `gitlab` logs in `production`
- Start gathering `haproxy` logs in `production`
- Double-check which applications we still need to gather logs from
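For the `chef-vault` steps above, a recipe sketch along these lines is what I have in mind (the vault/item names, attribute keys, and template are hypothetical, just to show the shape of it):

```ruby
# Hypothetical recipe for the gitlab_fluentd cookbook: fetch Elastic Cloud
# credentials from chef-vault and render them into the fluentd output config.
chef_gem 'chef-vault' do
  compile_time true
end

require 'chef-vault'

# Assumed vault 'credentials' with item 'elastic-cloud'; not our real names.
elastic = ChefVault::Item.load('credentials', 'elastic-cloud')

template '/etc/fluentd/conf.d/output-elastic-cloud.conf' do
  source 'output-elastic-cloud.conf.erb'  # hypothetical template in the cookbook
  mode '0600'                             # keep credentials out of world-readable files
  variables(
    host:     elastic['host'],
    user:     elastic['user'],
    password: elastic['password']
  )
  notifies :restart, 'service[fluentd]'
end

service 'fluentd' do
  action [:enable, :start]
end
```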