UI over the Wire
tl;dr
Server-side render one or more Vue applications directly in Ruby (on Rails), serve the result embedded as HTML in the response, and automatically hydrate the application once loaded.
Watch a video about this RFC on YouTube.
Watch a video where I demo the proof of concept.
Problem statement
GitLab has a heterogeneous frontend stack. Some parts are implemented with jQuery, some are Rails views (HAML) plus JavaScript bundled as assets, and some are written as Vue applications. Even if we get rid of the last jQuery occurrences, it is highly unlikely that we can move to a uniform frontend stack in a reasonable amount of time. This has direct implications for how we serve client-side executed source code (JavaScript). In particular, we lack the ability to stream the rendered UI directly to the browser.
Servers (Apache, nginx) and browsers work in a stream-based fashion: the browser takes content from the HTTP response body and immediately parses and renders the included (HTML) markup as it arrives (stream-based rendering). This means a browser can display visual elements to the user before the full response has loaded.
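The contrast can be sketched in Ruby. A Rack-compatible response body hands chunks to the client as they are produced, so the browser can start parsing the first chunk before the last one even exists. The `StreamedView` class below is purely illustrative, not part of any real framework:

```ruby
# Illustrative only: a Rack-compatible streaming body. The server calls
# #each and writes every yielded chunk to the socket immediately, so the
# browser can already parse the above-the-fold markup while the rest of
# the page is still being rendered server-side.
class StreamedView
  def initialize(chunks)
    @chunks = chunks
  end

  def each
    @chunks.each { |chunk| yield chunk }
  end
end

body = StreamedView.new([
  "<html><body><h1>Issues</h1>",       # above-the-fold content, sent first
  "<div id='editor-app'>...</div>",    # rendered later, streamed later
  "</body></html>"
])

received = []
body.each { |chunk| received << chunk } # what the server does internally
```

A fully client-rendered Vue view gives up exactly this property: nothing is painted until the bundle has arrived and executed.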
In comparison, rendering a UI on the client side means accepting additional roundtrips and deferred compilation and rendering. For a view built as a Vue application, this results in:
- No visual feedback before the dependencies (JavaScript bundles) are loaded, parsed and rendered
- No real above-the-fold optimization (the full view is always rendered as a single chunk and inserted into the DOM)
Proposal
For the longest time, I thought about our heterogeneous frontend setup as a problem, but I think we could instead make it our superpower. The feedback from various software developers at GitLab is that it can take longer to build a Vue application than to do the same in a simple Rails view. In other words, many developers prefer to have an option when it comes to choosing a stack to work with (Vue or Rails views). As much as I have tried to invalidate this assessment in various conversations with colleagues, it is simply a reality in GitLab's codebase and organization. Without a major strategic shift towards thinking of GitLab as a platform (uniform API strategy → uniform frontend strategy → Vue), we need to think about how we can provide the best experience to users and developers in both worlds. As Rails views are well supported by the framework, this leaves us with the support for Vue applications.
In December 2021, the release of a Rails "HTML over the wire" library (Hotwire) inspired many developers to rethink how they build frontends. And indeed, it also gave me the idea for what I'd like to call UI over the wire.
Based on an experiment, I actually have trouble seeing complex stateful applications being fully built with e.g. Turbo and Stimulus.
Besides the technical feasibility, we also have an enormous codebase that would need to be rewritten (which is, like everything, doable, but not desirable). But there is one concept from Hotwire/Turbo that is particularly interesting: decomposing views into frames. If we look at how we integrate Vue applications into GitLab today, there are some conceptual similarities with Turbo Frames. Most often, we just "mount" a Vue application into a certain position within the DOM tree. Turbo does the same, but follows a more dynamic approach (it can append, replace, delete, …). It also adds another concept by supporting live views via streaming-based content sections. Turbo Streams operate in a push-based manner: server-side state changes are pushed to the client. The Turbo specification does not dictate what you send to the client, as long as it is HTML and attached to a Turbo Frame or Stream ID.
UI over the wire is similar to what Turbo does. It allows us to take an existing Vue application and, instead of manually mounting it on the client side, serve the pre-rendered application as HTML directly into the target Turbo Frame, then rehydrate it afterwards.
<!--my_view.html.erb-->
<%= isorun_app_tag("my_app") %>
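A sketch of what such a tag helper could emit. The helper name comes from the prototype gem, but the wrapper markup and the `data-isorun-app` attribute below are assumptions of mine: the idea is that the response already contains the rendered HTML, plus a marker the client-side runtime can use to find and hydrate the application in place.

```ruby
# Hypothetical sketch of the helper's output; the real isorun gem may
# emit different markup. The server-rendered HTML is wrapped in a
# container carrying enough metadata for client-side hydration.
def isorun_app_tag(app_name, rendered_html)
  <<~HTML
    <div data-isorun-app="#{app_name}">
      #{rendered_html}
    </div>
  HTML
end

tag = isorun_app_tag("my_app", "<h1>Hello, World!</h1>")
```

On load, the client bundle would query for `[data-isorun-app]` elements and hydrate each one, much like Turbo resolves frames by ID.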
We have previously shied away from server-side rendering due to its enormous complexity and its implications for infrastructure (a Node.js-like reverse proxy for all requests, authentication, …). Instead of rendering the full application in a separate service, we could provision V8 Isolates as dedicated rendering containers for Vue during a regular request/response cycle. In the same way we render a partial with ERB or HAML today, we would render a full Vue application into a buffer (String) and serve it to the client.
Key Benefits
- Ready for the future: Server Components are the next hot optimization for rich client-side applications: https://nextjs.org/docs/advanced-features/react-18/server-components. Your server needs to speak JavaScript for this!
- Less client-server roundtrips to fetch data: Due to being embedded into the Ruby process, we have access to the same resources (Model, Database, GraphQL Resolver) and can directly fetch data from the source. No network roundtrips!
- Full control over the JavaScript runtime: The runtime lives in the same Ruby process as the rest of the application.
Decomposition
This is similar to our existing codebase: we usually mount a Vue application to a certain anchor within the DOM, e.g.
const element = document.getElementById("editor-app");
mount(Editor, element);
Usage via tag helper
<!-- issues.html.erb -->
<%= isorun_app_tag("sidebar") %>
<%= isorun_app_tag("editor") %>
Render partial
<!-- issues.html.erb -->
<div>
  <h1>Issues View</h1>
  <div id="editor-app">
    <%= render isorun_app: EditorApp.render(@state) %>
  </div>
  <div id="sidebar-app">
    <%= render isorun_app: SidebarApp.render(@state) %>
  </div>
</div>
As response from the controller
# issues_controller.rb
class IssuesController < ApplicationController
  def editor
    state = SomeModel.find(…)
    editor = App.load("./javascript/editor.vue")

    respond_to do |format|
      format.isorun_app { render isorun_app: editor.render(state) }
    end
  end

  def sidebar
    state = SomeModel.find(…)
    sidebar = App.load("./javascript/sidebar.vue")

    respond_to do |format|
      format.isorun_app { render isorun_app: sidebar.render(state) }
    end
  end
end
Render flow
Dealing with network requests
Most modern JavaScript (Vue) applications access data via APIs. In our case this is either Apollo using GraphQL or arbitrary HTTP requests (REST, …). When server-side rendering apps, the application still needs to access those APIs to fetch the data required for rendering. This adds network time to the SSR process, which is undesirable: it increases latency and introduces additional failure modes.
In a controlled render runtime, we can intercept network calls (e.g. intercept all GraphQL requests) and forward the query directly to the Rails GraphQLController. Thanks to this almost direct database access, the overall roundtrip time for rendering the app should decrease.
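A minimal sketch of this interception idea, assuming the render runtime exposes a hook for outgoing requests (`on_fetch` and `execute_graphql` are hypothetical names; a real implementation would call the Rails GraphQL schema, e.g. `Schema.execute`, instead of the stand-in resolver below):

```ruby
require "json"

# Stand-in for executing a query against the app's GraphQL schema
# in-process; a real implementation would call Schema.execute here.
def execute_graphql(query)
  return { "data" => { "issue" => { "title" => "SSR spike" } } } if query.include?("issue")

  { "data" => {} }
end

# Hypothetical hook: the V8 runtime reports every fetch() the app makes.
# GraphQL queries are answered locally; anything else is rejected, so no
# network roundtrip can happen during SSR.
def on_fetch(url, body)
  if url.end_with?("/api/graphql")
    payload = JSON.parse(body)
    JSON.generate(execute_graphql(payload["query"]))
  else
    raise "network access not allowed during SSR: #{url}"
  end
end

response = on_fetch(
  "https://gitlab.example/api/graphql",
  JSON.generate({ "query" => "{ issue { title } }" })
)
```

Rejecting all non-GraphQL traffic also gives us a strong guarantee: a render cycle can never be slowed down by an unexpected external request.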
What are V8 Isolates?
You have probably heard about Cloudflare Workers, Shopify Oxygen or Deno Deploy (Netlify/Supabase). They all are built on top of the V8 engine and utilize V8 Isolates.
V8 orchestrates isolates: lightweight contexts that group variables with the code allowed to mutate them. You could even consider an isolate a “sandbox” for your function to run in.
A single runtime can run hundreds or thousands of isolates, seamlessly switching between them. Each isolate’s memory is completely isolated, so each piece of code is protected from other untrusted or user-written code on the runtime. Isolates are also designed to start very quickly. Instead of creating a virtual machine for each function, an isolate is created within an existing environment. This model eliminates the cold starts of the virtual machine model.
In comparison to VMs, Docker containers or Node.js applications, V8 Isolates offer a more lightweight compute platform with a better memory footprint (if done right) and fewer cold-start problems. A single runtime can handle a very large number of isolates (think tabs in a browser). This is like building our own Node.js, Deno or Bun (without the APIs), but with Ruby (and Rails) as the primary container. In other words, the runtime lifecycle is fully controlled by the Ruby process, and adding isolates and IO is done strictly through a Ruby interface.
Implement a JavaScript renderer with V8 Isolates
Ruby allows us to extend its capabilities with native extensions. The same way we maintain connections to Postgres or MySQL, we can spin up a V8 runtime and provide many isolates to render Vue applications on demand (just like we render ERB or HAML templates).
Prototype
This is a proof of concept that utilizes Deno's Rust V8 bindings to create a Ruby gem that is able to read, parse, and compile a JavaScript file and print the result.
# Taken from the gem's RSpec suite; described_class refers to the
# VM class under test.
code = File.read("examples/vanillajs/index.js")

vm = described_class.new
actual = vm.run(code)

expected = "<h1>Hello, World!</h1>"
actual == expected # => true
Preliminary work (projects doing something similar)
Resource control
V8 Isolates potentially allow fine-grained resource control (CPU and memory). For example:
- It is possible to have a dedicated thread/Isolate, so that some slow code executed in one Isolate does not block the execution of another Isolate.
- Isolates that exceed a defined memory limit can be evicted.
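The eviction policy above can be sketched as a small pool. The `Isolate` objects here are plain Ruby stand-ins with an assumed `heap_bytes` attribute; a real implementation would query V8's heap statistics, and the 64 MiB budget is an arbitrary example:

```ruby
# Illustrative pool that evicts isolates exceeding a memory budget.
# Isolate is a stand-in; real code would read V8 heap statistics.
Isolate = Struct.new(:app_name, :heap_bytes)

class IsolatePool
  MEMORY_LIMIT = 64 * 1024 * 1024 # 64 MiB per isolate (assumed budget)

  def initialize
    @isolates = []
  end

  def add(isolate)
    @isolates << isolate
  end

  # Drop any isolate over budget; it would simply be re-created from its
  # bundle on the next render request.
  def evict_oversized!
    evicted, keep = @isolates.partition { |i| i.heap_bytes > MEMORY_LIMIT }
    @isolates = keep
    evicted.map(&:app_name)
  end

  def size
    @isolates.size
  end
end

pool = IsolatePool.new
pool.add(Isolate.new("editor", 16 * 1024 * 1024))    # within budget
pool.add(Isolate.new("sidebar", 128 * 1024 * 1024))  # over budget
evicted = pool.evict_oversized!
```

Because isolates are cheap to recreate, eviction is a safe default: the worst case is one slower (cold) render for the evicted app.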
Cold-start problem
V8 implements an optimizing compiler: during execution, hot code paths are tracked and compiled into native code at runtime. Until the application is compiled to native code, rendering can take longer. This is known as the cold-start problem. If this turns out to be a real-life problem with SSR, we can mitigate the effects in various ways:
- Pre-warm the application with a synthetic request (render cycle)
- Use V8 Snapshots to persist the compiled state of an application and load the binary at runtime
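The first mitigation could look like this at boot time. `FakeRenderer` is a stand-in for a V8-backed renderer; the point is that one synthetic render forces V8 to run, profile, and JIT-compile the hot paths before the first real request arrives:

```ruby
# Sketch of pre-warming: render once with synthetic state at server boot
# so the first user request does not pay the optimizing-compiler cost.
# FakeRenderer stands in for a real V8-backed renderer.
class FakeRenderer
  attr_reader :renders

  def initialize
    @renders = 0
  end

  def render(state)
    @renders += 1 # in a real renderer, JIT compilation happens as a side effect
    "<div>#{state[:title]}</div>"
  end

  def warm?
    @renders.positive?
  end
end

renderer = FakeRenderer.new
html = renderer.render({ title: "warm-up" }) # synthetic request at boot
```

In a Rails app this would live in an initializer, so the warm-up cost is paid once per process start rather than per request.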
Distribution
In order for Isolates to be able to render a Vue application, we must provide all of its dependencies (npm packages). As we already bundle JavaScript with Webpack, we can also produce dedicated bundles for server-side rendering and load them at server startup time.
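Loading the SSR bundles at boot could be as simple as reading each one into memory once, so isolates can be created from the source without disk IO on the request path. The registry below is a sketch; the `*.ssr.js` naming convention is an assumption:

```ruby
require "tmpdir"

# Sketch: load all server-side bundles once at startup and keep them in
# memory. The *.ssr.js naming convention is illustrative.
class BundleRegistry
  def initialize
    @bundles = {}
  end

  def load_all(dir)
    Dir.glob(File.join(dir, "*.ssr.js")).each do |path|
      name = File.basename(path, ".ssr.js")
      @bundles[name] = File.read(path)
    end
  end

  def fetch(name)
    @bundles.fetch(name) { raise "no SSR bundle for #{name}" }
  end
end

# Demonstrate with a temporary directory standing in for the asset dir.
registry = BundleRegistry.new
Dir.mktmpdir do |dir|
  File.write(File.join(dir, "editor.ssr.js"), "export default {}")
  registry.load_all(dir)
end
source = registry.fetch("editor")
```

Combined with V8 snapshots, this would also be the natural place to persist and restore the compiled state of each bundle.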
Talks & Docs
- https://www.infoq.com/presentations/cloudflare-v8/
- https://v8.github.io/api/head/classv8_1_1Isolate.html
- https://dev.to/tomlienard/v8-isolates-are-taking-over-the-world-3h4m
- https://wintercg.org/
- https://github.com/lagonapp/lagon
- https://www.lambrospetrou.com/articles/golang-v8-isolates/
Other solutions?
There are plenty of ways to optimize load times: reducing bundle size, above-the-fold optimization, lazy-loaded images, preloading JavaScript bundles, offline/service workers. None of them solves the fundamental issue of the UI not being rendered immediately.