Open projects need moderation. Dealing with the kinds of incidents that have happened recently on GitLab.com and similar sites (see #18608 for more specific discussion) requires fine-grained, easy-to-use moderation tools. No one likes waking up to thousands of angry or troll comments, and it is even worse when there's nothing you can do about it.
Different kinds of projects will need different moderation workflows based on their own governance, CoC, etc. The kinds of moderation situations I want to focus on are:
- large public projects, many contributors
  - examples: npm, rails, gitlab
  - actions should be public and auditable
  - decisions need to be made quickly
  - clearly show what behavior is unacceptable
- small projects, targeted externally
  - examples: pronoun.is, bumblebee, rouge
  - silent deletion: no troll satisfaction
  - private audit logs
  - maximize signal/noise by cutting out obvious trolling
I think we are capable of designing a simple toolset that meets a wide variety of different needs. In particular, I would like to propose a toolkit built from the following features:
- Content deletion, mod editing
  - immediately remove or edit out offensive content
- Non-moderator "report"
  - notifies moderators of offensive content
- Bans
  - restricts issue, comment, and MR creation for the entire user / org context (still allows visibility and cloning)
  - restricts mentioning any content in the user / org context elsewhere on the site
  - optionally temporary, re-allowing access after a set amount of time
  - unless the content is deleted, shows up inline in the comment ("this user was banned [for x time] for this content")
- Locks (#18608)
  - disallows anyone from adding comments to an issue or MR
  - optionally temporary
- Extract derail
  - moves comments to a new issue
- Audit log (optionally public or private)
  - when public, a moderator's actions are visible, as well as users' moderation history
  - when private, visible only to moderators
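To make the audit-log idea concrete, here is a minimal sketch of what a single log entry might carry, in plain Ruby. All names and fields here are assumptions for illustration, not GitLab's actual schema:

```ruby
require "time"

# Hypothetical moderation-log entry; field names are illustrative only.
ModerationAction = Struct.new(
  :moderator,   # username of the acting moderator
  :target_user, # username the action applies to (nil for thread locks)
  :project,     # project the action is scoped to
  :action,      # :delete, :edit, :ban, :lock, or :move
  :expires_at,  # Time when a temporary action lapses; nil means permanent
  :created_at,  # when the action was taken
  keyword_init: true
)

# "Optionally temporary" in the proposal: an action with no expiry is
# permanent; otherwise it is active only until expires_at.
def action_active?(entry, now = Time.now)
  entry.expires_at.nil? || now < entry.expires_at
end
```

Under this sketch a one-hour ban expires on its own; a public audit log is just a listing of these records, and a private one is the same listing behind a moderator-only check.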
For self-hosted instances that do not wish to have moderation, I would suggest one of the following:
- making the UI elements involved unobtrusive enough so as not to get in the way
- providing a global switch to turn off all moderation
Of these, I prefer the first if possible, since even within self-contained corporate instances the ability to lock threads or move conversations will be useful.
I see these tools being useful for both kinds of projects. The workflows would be:
workflow for large public projects:
- Project owners select mods (owner implies moderator)
- Uses public audit log
- Lock threads when there is nothing more to be said, or temporarily to provide space for maintainers to deliberate about a decision
- Mods use temporary bans at will, while taking their time to decide for permanent bans
- Mods extract derails into separate issues to maintain focus
- Mods edit out particularly offensive content (images etc)
workflow for small personal projects:
- Usually one owner/moderator
- Private audit log ("have I dealt with this person before?")
- Owner deletes troll content, doesn't have to deal with it at all
- Owner can easily delete content + block user in one action
```
= moderator is logged in

new comment/issue from @example:
  [offensive content]
  UX options: [moderate] [move] [moderation log]

[moderate]: (popup)
  Moderate content:
  (*) No action
  ( ) edit content: [editor (grey unless selected)]
  ( ) delete content
  [ ] Ban @example from (user/org) (for [1hr/6hr/24hr/1wk/1mo])
  [ ] Close thread

[move]: (small popup)
  Move to: [autocomplete issues, starting with recent, empty = new]
  - comment disappears from thread
  - comment appears in new issue in the order it would have given the original timestamp
  - bonus: bulk move

[moderation log]: (new page)
  Previous moderation actions for @example in context of [project]

= non-moderator is logged in

new comment/issue from @example:
  [offensive content]
  UX options: [report] ([moderation log] if public)

[report]: (popup)
  Report @example for: [dropdown of reasons, including other]
  [comment, optional]

= banned user is logged in

"Leave a comment" is grey / button red
"You are banned from this project [for x time] [for [content]]"
[report] is grey or not present

= banned user tries to create content elsewhere that mentions this project using GLFM

"You cannot mention [project], because you are banned [for x time] [for [content]]"

= after mod edit

"[Edited by @moderator at TIMESTAMP]"

= after ban (unless deleted)

"[@example was banned [for x time] for this content]"

= after comment deletion

"(a comment was deleted)"

= after issue/MR deletion

- doesn't show up in issues list
- 404 when manually navigating to the page
```
Proposed Implementation Plan
- Implement moderation log backend, scoped to project
- Visual and UX design for locking
- Implement locks (#18608) and set them up to leave logs
- Visual and UX design for moderation buttons, popup
- Implement bans with optional duration, adding log entries
- Add secondary indices to moderation logs - moderator and moderated user
- Design & implement frontends for moderation log in 4 contexts:
- for project
- for organization/user
- filtered by moderated user
- filtered by moderator
- Visual and UX design for editing features of moderation
- Implement content deletion and mod editing, with log entries
- Implement non-moderator reporting
- Does not add to public log, decide on messaging pathway
- Design "Extract derail" moving comments feature
- Implement "Extract derail" and log entries
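The "Extract derail" step can be prototyped independently of the rest: selected comments leave the source thread and are merged into the target issue in original-timestamp order, which is the ordering behavior the proposal describes. A toy sketch in plain Ruby (comments modeled as hashes; all names are assumptions):

```ruby
# Hypothetical "Extract derail": comments are hashes with :id and
# :created_at. Selected comments are removed from the source thread and
# interleaved into the target issue by their original timestamps.
def extract_derail(source, target, selected_ids)
  moved, kept = source.partition { |c| selected_ids.include?(c[:id]) }
  [kept, (target + moved).sort_by { |c| c[:created_at] }]
end
```

Sorting by the original `created_at` rather than the move time is what makes the moved comments appear "in the order it would have given the original timestamp".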
Made the issue visible
This is a lot of stuff together, but much of it is separable - we could develop each feature in turn and build up to the full feature set.
I agree a lot with the need for a mod, and of course along with that come the tools the mod needs. I think all of this is great! Once this is broken into pieces, I think it would be a good time to talk about the UX here.
This sounds good to me.
With regards to extracting comments, I think Discourse does a pretty good job of this if we want to explore similar existing solutions.
Given these extra options for comments and threads, now might be a good time to add an ellipsis button to comments for hiding less common actions like reporting/editing/deleting/moving.
Changed title: Draft proposal for moderation tools → [meta] Moderation tools
@jneen some questions:
- Do moderators need to be a new role? Can't we just have everyone with master access be a moderator? Currently those people can already remove comments.
- The moderation log: do you imagine it as a freely editable field scoped by group / project for each user? Some of the functions imply that there is such a thing; others seem to imply more of an activity log of user actions.
I like all ideas. I've added the locking link to the issue body.
> Do moderators need to be a new role? Can't we just have everyone with master access be a moderator?
I could see making moderators equivalent to master access - I was imagining large orgs with dedicated community managers, but that might not be a use case that exists.
As for the log, I see that as more of an activity log - not manually managed. So: "here's a list of all the things this moderator has done in this project" and "here's a list of all the moderation actions taken with respect to this user in this project".
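The two activity-log views described here, together with the secondary indices mentioned in the implementation plan, amount to two filters over the same records. A sketch over plain hashes (all field names are assumptions for illustration):

```ruby
# "Here's a list of all the things this moderator has done in this project."
def actions_by_moderator(log, moderator, project)
  log.select { |e| e[:moderator] == moderator && e[:project] == project }
end

# "Here's a list of all the moderation actions taken with respect to
# this user in this project."
def actions_on_user(log, user, project)
  log.select { |e| e[:target_user] == user && e[:project] == project }
end
```

In a relational backend these two views would be served by the two secondary indices from the plan: one on (moderator, project) and one on (moderated user, project).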
Assignee removed