I am new to the moderation forum but am interested in the dream.
After reading through the forum and browsing ethereum.world, I wanted to understand the practical aspects of moderation.
The first place I went was the Transparency Log, but I didn't find it helpful at all. The number of user reports associated with a post is useful, but an explanation without context is not.
Although I can see the content of posts that are kept, I cannot see the content of delisted posts. I understand that showing delisted posts comes with a host of complications, but as a new moderator, and in the general interest of transparency and accountability, shouldn't it be possible to review past decisions, if I really wanted to, without directly contacting the moderator?
Currently, the Transparency Log only conveys that the platform is actively moderating; it does not seem to do anything beyond that.
How can we improve the Transparency Log?
María here, from the AKASHA Foundation. I'm a UX Researcher & Designer, and I've been helping with the Moderating Open Design challenge.
It's great that you noticed this issue too. In your view, it is important that, as a new moderator, you be able to review past delisted posts so you can understand previous decisions without needing to contact another moderator.
The Transparency Log could allow moderators to "watch again" delisted content to fully understand its context. I believe this mechanism would have some technical implications, such as keeping the list of reported and delisted content available at all times.
Great observations, and thank you for pointing this out! As a moderator you have access to all content that has been moderated: you can review the content, the number of reports and the reasons given, as well as the reason a post was kept or delisted. While a "kept" post remains visible, the delisted ones can't, for obvious reasons, be shown. We realize this is a problem for transparency and due diligence, especially because there is currently no way to even follow up on a delisted post. There is no case number or anything of the sort, so even if you wanted to, and assuming such events become more frequent in the future, how could you even point to "that" delisted post when you have no hint of its content?

The first simple fix is to include a "case number"; we could use the hash under which the post is indexed, for example. While we still would not display the content, one could point to it and say: "I want to know more about the process of why post 'case number xyz' has been delisted."

As a next step we could introduce a form of transparency report that is issued and reveals more details about the delisted content, without doing so in a way that is harmful or exposes too many possibly disturbing details. I think a well-functioning moderation team, running its "moderation business" properly, would do exactly this. In fact, this is already kind of happening in this forum: you can review the entries we made for moderated cases, and all of them were discussed here.
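To sketch what such a case number could look like (a minimal illustration only; the hash value, field names, and format below are my own assumptions, not part of the platform):

```python
import hashlib

def case_number(index_hash: str, prefix_len: int = 12) -> str:
    """Derive a short, shareable case number from the hash under which
    a post is indexed. The content itself is never revealed; the case
    number only lets people point at a specific moderation decision.
    (Hypothetical helper; the re-hashing step and length are assumptions.)"""
    # Re-hash the index hash so the case number cannot be used to look
    # up the original content directly, then truncate for readability.
    digest = hashlib.sha256(index_hash.encode("utf-8")).hexdigest()
    return f"CASE-{digest[:prefix_len]}"

# A hypothetical Transparency Log entry for a delisted post:
entry = {
    "case": case_number("bafy-example-index-hash"),  # hypothetical hash
    "action": "delisted",
    "reports": 4,
    "reason": "spam",
    # note: no content field — delisted content stays hidden
}
```

The point of the sketch is that the entry carries a stable identifier one can cite in a follow-up ("why was CASE-… delisted?") without exposing anything about the content.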
Hey Maria & Martin,
I hope you guys don't mind me sharing some of my feelings haha, but I guess I started this thread because I felt a certain way about not being able to see the delisted content. On reflection, I think that feeling arose for a few reasons.
As someone new to the community, I was curious about what disturbing or scanty content was being delisted, but I also genuinely wanted to learn what the current moderation norms are and decide whether I agree with them. There is also this feeling of a weird logical entitlement (probably misguided); the best way I can describe it is that I know I could have seen the content in my feed, but now I can't, largely due to arbitrary timing.
Another observation, on reflection, is that the explanation and context weren't the most important thing to me. I kinda trust Martin, so I knew the decision was probably very reasonable, and ultimately I was okay with not knowing the context.
My assumption is that maintaining a public or limited list of delisted content, aside from the technical implications, runs counter to the initial goals of content moderation. Though I think this is very much an assumption and can be challenged.
If moderators can see all available information (content, process, decision maker, etc.) and general users can only see limited information (decision maker, reason), is there a view that the Transparency Log simply functions as a filter of information?