Secondary LLM streaming in chat and display of state messages
Description
- Implemented a secondary stream over WebSocket (`secondary_llm_message_stream`).
- Added hook state: `currentSecondaryStreamMessage`, `secondaryStreamBuffer`; no history persistence.
- Both primary and secondary streams use rAF smoothing; the secondary stream overwrites its buffer on every chunk, so no duplicates accumulate (see the hook sketch under Examples below).
- UI: collapsible “Thought process” panel (gray, smaller font, scrollable, auto‑scrolls); a component sketch is included under Examples.
- The final message clears both streaming states to avoid duplicates.
- State messages are shown at the end of the chat in italic gray and linger ~1.5 s after the value becomes null (see the linger sketch under Examples).
- Wired into `AssistantWidget`, `Messages`, `BotSample/ChatWindow`.
Related issue
https://gitlab.com/postgres-ai/ace/-/issues/173+
Examples
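A minimal sketch of how the secondary-stream hook could look, assuming the server sends JSON frames with a `type` field and a cumulative `content` payload. The hook name `useSecondaryStream`, the frame shape, and the `llm_message_final` type are illustrative assumptions, not the actual implementation; the real hook also handles the primary stream, which is omitted here.

```tsx
import { useCallback, useEffect, useRef, useState } from 'react'

// Hypothetical frame shape; the real payload may differ.
type StreamFrame = {
  type: 'llm_message_stream' | 'secondary_llm_message_stream' | 'llm_message_final'
  content: string
}

export function useSecondaryStream(socket: WebSocket) {
  // Rendered value, updated at most once per animation frame.
  const [currentSecondaryStreamMessage, setCurrentSecondaryStreamMessage] = useState('')
  // Latest text from the socket; overwritten (not appended) on every chunk,
  // so repeated cumulative payloads cannot produce duplicated text.
  const secondaryStreamBuffer = useRef('')
  const rafId = useRef<number | null>(null)

  // Flush the buffer into React state on the next animation frame (rAF smoothing).
  const scheduleFlush = useCallback(() => {
    if (rafId.current !== null) return
    rafId.current = requestAnimationFrame(() => {
      rafId.current = null
      setCurrentSecondaryStreamMessage(secondaryStreamBuffer.current)
    })
  }, [])

  useEffect(() => {
    const onMessage = (event: MessageEvent) => {
      const frame: StreamFrame = JSON.parse(event.data)
      if (frame.type === 'secondary_llm_message_stream') {
        secondaryStreamBuffer.current = frame.content // overwrite, no dupes
        scheduleFlush()
      } else if (frame.type === 'llm_message_final') {
        // The final message clears the secondary streaming state so the
        // persisted message is not rendered twice.
        secondaryStreamBuffer.current = ''
        setCurrentSecondaryStreamMessage('')
      }
    }
    socket.addEventListener('message', onMessage)
    return () => {
      socket.removeEventListener('message', onMessage)
      if (rafId.current !== null) cancelAnimationFrame(rafId.current)
    }
  }, [socket, scheduleFlush])

  return { currentSecondaryStreamMessage }
}
```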
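A rough sketch of the collapsible “Thought process” panel (gray, smaller font, scrollable, auto‑scrolling). The component name, markup, and inline styles are illustrative only; the real panel lives in the chat message components.

```tsx
import { useEffect, useRef, useState } from 'react'

// Hypothetical presentational component; props and styling are assumptions.
export function ThoughtProcessPanel({ text }: { text: string }) {
  const [open, setOpen] = useState(true)
  const bodyRef = useRef<HTMLDivElement>(null)

  // Keep the panel scrolled to the latest streamed text.
  useEffect(() => {
    const el = bodyRef.current
    if (el) el.scrollTop = el.scrollHeight
  }, [text])

  if (!text) return null

  return (
    <div style={{ color: '#888', fontSize: '0.85em' }}>
      <button onClick={() => setOpen((v) => !v)}>
        Thought process {open ? '▾' : '▸'}
      </button>
      {open && (
        <div ref={bodyRef} style={{ maxHeight: 160, overflowY: 'auto' }}>
          {text}
        </div>
      )}
    </div>
  )
}
```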
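A possible shape for the ~1.5 s linger behavior of state messages, sketched as a small hook; `useLingeringStateMessage` is a hypothetical name, and the real logic may live directly inside the message components. The returned string can then be rendered at the end of the message list in italic gray.

```tsx
import { useEffect, useRef, useState } from 'react'

// Keeps the last non-null state message visible for ~1.5 s after the
// server clears it (sends null), so short-lived statuses do not flicker.
export function useLingeringStateMessage(stateMessage: string | null, lingerMs = 1500) {
  const [visible, setVisible] = useState<string | null>(stateMessage)
  const timer = useRef<ReturnType<typeof setTimeout> | null>(null)

  useEffect(() => {
    if (stateMessage !== null) {
      // New status: show it immediately and cancel any pending hide.
      if (timer.current) clearTimeout(timer.current)
      setVisible(stateMessage)
      return
    }
    // Status cleared: keep showing the previous text briefly.
    timer.current = setTimeout(() => setVisible(null), lingerMs)
    return () => {
      if (timer.current) clearTimeout(timer.current)
    }
  }, [stateMessage, lingerMs])

  return visible
}
```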
Checklist
- All proposed text changes have been reviewed by an LLM for grammar
- This MR contains text changes and they have been reviewed OR there are no text changes
- This MR contains GUI/CLI changes and they have been reviewed OR there are no GUI/CLI changes
- This MR contains API changes and they have been reviewed OR there are no API changes
