Stream stdout in job logs
Our current job log UI has a few limitations:
- We have a log size limit, and once a job reaches it you can't see new log entries anymore. Here's an example job that exceeded the log limit and had the rest of its output discarded.
- Because we poll for up-to-date log entries, we put extra load on the server.
To solve both issues, we could switch to streaming stdout from the runner. It would work as follows:
- Pipe the stdout stream coming from the runner and tee (split) it into two streams
- Write one end of the stream to persistent storage: a file or the DB (with a size limit, as before)
- Use the other end to send bytes directly to the client, applying transformations on the fly (keep the line index in memory and wrap every new line in our HTML structure)
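The tee step above could be sketched roughly like this. This is a minimal illustration, not GitLab code: the size limit, `wrap_line` markup, and storage object are all assumed names for demonstration.

```python
import io

# Hypothetical size cap for persistent storage; tiny here for demonstration.
LOG_SIZE_LIMIT = 32

def wrap_line(index, line):
    # Wrap every line in the HTML structure the job log UI expects
    # (illustrative markup, not the real GitLab structure).
    return f'<span class="log-line" data-index="{index}">{line}</span>'

def tee_stdout(stdout, storage):
    """Read the runner's stdout line by line, persist up to the size
    limit, and yield HTML-wrapped lines for the client-facing stream."""
    written = 0
    for index, line in enumerate(stdout):
        line = line.rstrip("\n")
        # One end of the tee: persistent storage, capped at the limit.
        if written + len(line) + 1 <= LOG_SIZE_LIMIT:
            storage.write(line + "\n")
            written += len(line) + 1
        # Other end: always stream to the client, keeping the line index.
        yield wrap_line(index, line)

stdout = io.StringIO("step 1: checkout\nstep 2: build\nstep 3: test\n")
storage = io.StringIO()
html_lines = list(tee_stdout(stdout, storage))
```

Note that the third line no longer fits under the cap, so it is streamed to the client but never written to storage, which is exactly the behavior described for the end of the log.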
For the end user it would work like this:
- When you open the page, the last N lines from the log file are sent with the document (exactly as it works now)
- For new lines, connect to a stream (via SSE, WebSockets, or a long-lived request) and append the streamed HTML right where the current log ends.
- Once the log reaches the size limit, we stop writing to persistent storage but keep streaming to the client. That way we don't overload our storage, and the client can still see the job's final output.
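If we went with the SSE option, each wrapped log line would become one event in the standard SSE wire format. A minimal sketch of the event framing (the `event` name is an assumption; the `id` field lets a reconnecting client resume from its last received line via `Last-Event-ID`):

```python
def sse_event(html_line, event_id):
    # SSE wire format: each event is "field: value" lines terminated
    # by a blank line. The id enables resume-after-reconnect.
    return f"id: {event_id}\nevent: log\ndata: {html_line}\n\n"

# Frame two hypothetical wrapped log lines as SSE events.
events = [sse_event(f"<span>line {i}</span>", i) for i in range(2)]
```

On the browser side this maps directly onto `EventSource`, which handles reconnection and `Last-Event-ID` for free; WebSockets or a long-lived request would need that resume logic implemented by hand.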