!bobaland
OP

The First BobaBan Postmortem

Hello Boobies,

as you might have heard if you hang out in !salt or in our social Discord server, we had to take strong moderator action for the first time since this website's inception. This was not a decision taken lightly, but the rules violation was so blatant and systematic that it left no room for hesitation.


In this thread, I'd like to do a small postmortem of what happened. In tech, a postmortem is a document that blamelessly outlines an incident and discusses what went wrong, what went right, and, most importantly, what can be learned (or built) going forward to prevent the same incident from recurring.


[Sections Inside]

  • What Happened
  • What Went Right, Wrong & Lucky
  • Potential Action Items
»I'll be writing this here throughout the evening
»so the sections won't appear immediately
  1. OP

    What Happened

    On the afternoon of January 8th, 4 older threads in !salt were suddenly dug up with clearly inflammatory comments. To avoid drawing further attention to them, I'm not linking to the threads directly, but rest assured their URLs have been archived.


    The language in the comments clearly violated the rules highlighted in the welcome packet, like "don't be an asshole" and "don't yuck someone else's yum". The rapid succession of threads, the similarity among the comments, and the clear-cut wording made it impossible to attribute this to careless remarks. To give a sense of the language used, comments included references to "needing validation for your dumb fetish", "stop liking garbage", "RPF being cringe and its fans being unable to not be creepy", as well as calling someone "peak cringe".


    How it Unfolded

    The incident was first brought to my attention on our Discord server by a Boobie asking about our report process. They were directed to the feedback form for an anonymous report, asked not to publicize the incident too widely (to keep the situation from escalating), and encouraged not to engage further with the disruptive user.


    At the same time, other Boobies reported having noticed the same worrying posts, which prompted me to investigate immediately. As written above, this was a clear-cut case of unacceptable comments.


    The actions taken were:


    1. I publicly identified the comments as unacceptable on the threads themselves, and invited people not to engage while I looked into the situation. I did this as Webmaster when possible (on threads I hadn't previously joined), or with my already-assigned identity.
    2. I confirmed that the comments were coming from the same person by looking at the author in the backend. This was done without de-anoning the user.
    3. Upon confirming that the comments were from the same user, I de-anoned the user in question and blocked their account from logging into our authentication system (which had no visible impact on BobaBoard itself beyond their credentials being restricted). Identity was only confirmed for the reported threads, and no other content created by the user has been de-anonymized.
    4. I confirmed under the individual comments that action had been taken.
    5. I reached out to the user to inform them of the decision, and to invite them to discuss the situation if necessary.
    »these are posts i really wish i could edit if necessary
    »feel free to point out what i might have missed on the comments
    »if you feel it's relevant
    »Also this is out here for transparency and review, so feel free to ask questions
  2. OP

    What Went Right

    • The community correctly identified the comments as unacceptable and brought them up for review.
    • Our Code of Conduct, while rudimentary, covered the situation and gave us something to point at for enforcement.
    • Suspending the account was a quick process.
    • The situation was addressed without escalation on the threads themselves.

    What Went Wrong

    • Our reporting system doesn't distinguish between "this is worrying, but take a look at your leisure" and "this is an active incident, please take a look as soon as possible".
    • People had to ask where to report an incident, which might point to the need for more visible tools.
    • Webmaster had to de-anon on threads she had already joined in order to comment.
    • Since this was our first incident, queries had to be written by hand to cross-reference the author of the comments (and then de-anon); see the sketch right after this list.
    • While the de-anoning process had been outlined before, it isn't currently written down in an easily-accessible and visible place for people to review.
    • More generally, the creation of moderation tools was delayed by Webmaster's Terrible, Horrible, No Good, Very Bad End/Beginning of Year.
    • Figuring out the chain of recommendations/events that led to this user's invite is not an easy process (and is currently ongoing).
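
    To give a concrete idea of what "queries had to be written by hand" means, here's a rough sketch of the kind of check that was needed. The table and column names (comments, author_id) are made up for illustration and don't reflect the actual schema; the point is that the check only counts distinct authors and doesn't reveal who they are.

        // Sketch only: given the IDs of the reported comments, check whether
        // they all come from the same account WITHOUT surfacing who that is.
        // Table/column names are hypothetical.
        import { Pool } from "pg";

        const pool = new Pool(); // connection settings come from the usual PG* env vars

        export async function reportedCommentsShareAuthor(commentIds: string[]): Promise<boolean> {
          const result = await pool.query(
            `SELECT COUNT(DISTINCT author_id) AS distinct_authors
               FROM comments
              WHERE id = ANY($1)`,
            [commentIds]
          );
          // A single distinct author means the reports can be treated as one
          // incident; de-anoning stays a separate, deliberate step after this.
          return Number(result.rows[0].distinct_authors) === 1;
        }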

    Where we were Lucky

    • Webmaster happened to be available; had she not been, there would have been a significant delay before the situation could be addressed.
    • The comments were made in a short amount of time, which made it easy to decide to cross-reference and check that they came from the same person.
    • The situation was extremely clear-cut.
    »feel free to add more points if you think of them!
    »I might do the "next steps" tomorrow cause I'm super tired
  3. OP

    Action Items

    1. Further formalizing our CoC was proposed in !volunteers. I don't think that's strictly necessary as a consequence of this specific incident (see the discussion), but it's work that should be done regardless.
    2. Delineate the de-anoning process somewhere in the welcome guide for full transparency. We might want to develop better guidelines as to when this is warranted, but at this stage "at the discretion of the Webmaster" is still a valid choice.
      1. Make sure to delineate explicitly what we de-anon (posts flagged for abuse, if confirmed to be coming from the same person) and what we don't de-anon (personal posts that have not been reported).
    3. Create an integrated reporting system, with tiers of urgency and a field for a description of the incident. Urgent messages can ping the Webmaster (and other mods in the future) on Discord for a quick look (see the first sketch at the end of this post).
    4. Create a system to flag posts for abuse and automatically surface "repeat offenders" when they reach a certain threshold (see the second sketch at the end of this post). De-anoning shouldn't be automatic, but the system should allow for human review of the flagged posts.
      1. Open Question: what levels of transparency should we have for flagging actions? Should flagging guidelines be public?
    5. Add the ability to have per-post identity in an already-joined thread for specific roles. This can, for example, be implemented as a special permission assigned to a role.
    6. Add the ability to lock threads or comment chains to give the Webmaster more time to act. Tie this ability to a permission, so different tiers of mods might be able to take this action without having full access to all moderation options.
    7. Create a document matching user accounts with their onboarding history and (potentially) Discord ID, to be used in case urgent action is needed or to further investigate this type of incident.
    »as usual, open for comments
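
    For action item 3, here's a rough sketch of what a tiered report could look like and how an "active incident" one could ping Discord. Every name here (ReportUrgency, submitReport, the MOD_DISCORD_WEBHOOK_URL variable) is hypothetical and only illustrates the shape of the feature, not a final design.

        // Sketch of a tiered reporting flow; all names are made up.
        type ReportUrgency = "when-you-can" | "active-incident";

        interface IncidentReport {
          urgency: ReportUrgency;
          description: string;   // free-form description of the incident
          threadUrls: string[];  // where the problem is happening
        }

        // Placeholder persistence: a real version would store the report in a
        // mod-visible review queue.
        async function saveToReviewQueue(report: IncidentReport): Promise<void> {
          console.log("Queued report:", report);
        }

        // Non-urgent reports just land in the review queue; urgent ones also
        // ping the Webmaster (and, later, other mods) through a Discord webhook.
        export async function submitReport(report: IncidentReport): Promise<void> {
          await saveToReviewQueue(report);
          if (report.urgency === "active-incident") {
            await fetch(process.env.MOD_DISCORD_WEBHOOK_URL!, {
              method: "POST",
              headers: { "Content-Type": "application/json" },
              body: JSON.stringify({
                content: `Active incident reported: ${report.description}`,
              }),
            });
          }
        }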
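
    And for action item 4, a sketch of how "repeat offenders" could be surfaced for human review once their flagged posts cross a threshold. Again, the table name (post_flags), the column names, and the threshold value are made up; note that nothing here de-anons anyone, it only returns opaque account IDs for a mod to look at.

        // Sketch: surface accounts whose flagged posts cross a review threshold.
        // Table/column names and the threshold are hypothetical.
        import { Pool } from "pg";

        const pool = new Pool();
        const FLAG_THRESHOLD = 3; // arbitrary example value

        export async function accountsNeedingReview(): Promise<string[]> {
          const result = await pool.query(
            `SELECT author_id
               FROM post_flags
              GROUP BY author_id
             HAVING COUNT(*) >= $1`,
            [FLAG_THRESHOLD]
          );
          // Opaque account IDs only: matching an ID to an identity remains a
          // manual, documented step (see the de-anoning guidelines above).
          return result.rows.map((row) => row.author_id);
        }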