Put commit markdown in the database, rather than Redis
In https://gitlab.com/gitlab-org/gitlab-ce/issues/54140, we started caching commits in Redis through `cache_markdown_field`.
We mostly cache markdown in the database, rather than Redis, for two main reasons:

- It's faster to pull the HTML at the same time as the rest of the object, instead of making one Postgres call and one Redis call
- It's expensive to store things in Redis
We do store commits for merge requests in the database, as `merge_request_diff_commits`, but that's a bit special-purpose.
I wonder if we could benefit from introducing a database table like:
```ruby
create_table :cached_commits do |t|
  t.binary :sha
  t.text :title
  t.text :title_html
  t.text :full_title
  t.text :full_title_html
  t.text :description
  t.text :description_html
  t.integer :cached_markdown_version
end
```
We'd index on SHA, and use this to store the same values we're currently putting in Redis.
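To make the idea concrete, here's a minimal pure-Ruby sketch (no Rails, in-memory hash standing in for the table) of the lookup logic such a table would support. All names here — `CURRENT_MARKDOWN_VERSION`, `render_html`, `cached_title_html` — are hypothetical stand-ins, not GitLab code:

```ruby
# Stand-in for the application's current markdown version.
CURRENT_MARKDOWN_VERSION = 2

CachedCommit = Struct.new(:sha, :title, :title_html, :cached_markdown_version)

# In-memory stand-in for the cached_commits table, keyed by SHA.
CACHE = {}

def render_html(markdown)
  # Placeholder for the real markdown rendering pipeline.
  "<p>#{markdown}</p>"
end

def cached_title_html(sha, title)
  entry = CACHE[sha]
  # Reuse the cached HTML only if it was rendered with the current version;
  # otherwise re-render and overwrite the stale row.
  return entry.title_html if entry && entry.cached_markdown_version == CURRENT_MARKDOWN_VERSION

  html = render_html(title)
  CACHE[sha] = CachedCommit.new(sha, title, html, CURRENT_MARKDOWN_VERSION)
  html
end
```

The `cached_markdown_version` column is what lets us invalidate rows in bulk when the rendering pipeline changes, rather than expiring by time.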
This loses us some automatic expiry: when force-pushes, GC, or project deletions remove a set of commits, we'd need to remove the entries from the table manually, or perhaps run a regular cleanup job that limits their age.
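A cleanup job could be as simple as an age cutoff. A sketch, assuming the table gains a `created_at` column (not in the migration above) and using plain hashes to stand in for rows; `MAX_AGE` and `stale_shas` are hypothetical names:

```ruby
# 30 days, in seconds — a hypothetical retention window.
MAX_AGE = 30 * 24 * 60 * 60

# entries: array of { sha:, created_at: } hashes standing in for table rows.
# Returns the SHAs whose cache rows are older than MAX_AGE and should be deleted.
def stale_shas(entries, now: Time.now)
  entries.select { |e| now - e[:created_at] > MAX_AGE }.map { |e| e[:sha] }
end
```

In the real job this would translate to a single `DELETE ... WHERE created_at < ?` pass, so it stays cheap even without per-key TTLs.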
Naively, this is roughly performance-neutral: it replaces an additional call to Redis with an additional call to the database.
One (small) advantage is that the database is replicated for Geo, but Redis is not, so Geo secondaries would get to share this cache.