A review of the Meta-Proposal for BlueSky

I was delighted to find that BlueSky, an initiative launched by Jack Dorsey to “transition the social web from platforms to protocols”, places a specific emphasis on content curation and moderation in its design. Here I want to highlight several points from an article written in response to the BlueSky initiative by the Berkman Klein Center and compare them with our own approach and ideas. The Meta-Proposal for BlueSky focuses specifically on alternative content curation, moderation, and business models for decentralized social media. You can find the original document here: BlueSky Meta-Proposal

On Content Curation

The BlueSky Meta-Proposal states, “Infrastructure can become violent when we assume that content is neutral.”

This statement aligns with the AKASHA Foundation’s notion that there is a need for content curation. “Censorship resistance” mustn’t be misunderstood: the reality is that harmful content exists, it can spread rapidly through the technologies we create, and we need to find systems to mitigate this. This became clear in our AKASHA conversation on Monster Island with Aaron Rabinowitz, a case study of an online forum that deliberately abstained from any interference with people’s comments, which ultimately led to radicalization and real-world consequences.

Here are some of the learnings we took from Aaron’s presentation:

  • Irrelevant and harmful content will overwhelm relevant and productive content. Moderation is essential to achieve a functional level of discourse.
  • Moderation is an evolving process and needs to continually adapt to new concerns, social technology, and legal requirements.
  • It is impossible to please every member of a group or to create rules that cover every circumstance. People will test the boundaries of rules and search for edge cases. There will always be individuals who are unsatisfied with moderation decisions.
  • Moderation approaches and guidelines should be developed by the community and support the values of the community.

The critical challenge of decentralized social media is how to allow for curation in the absence of a centralized operator. The authors’ proposal to create a decentralized marketplace for curation and moderation therefore resonates with the AKASHA Foundation’s vision. In fact, in our vision for the AKASHA Framework, we want to give the individuals and organizations that deploy decentralized communities the tools to define and update their own moderation and curation systems.

Challenges emerge when communities scale and diverse values are held within the same network. Will these challenges lead to forks within communities, or will individuals prefer to define their own content filters as suggested by MIT’s Gobo project?

How would one implement such community-driven curation efforts? We believe our proposal for governing decentralized social communities through individual legal entities such as a DAA (read up on our proposal for that here) offers a solution.

On Moderation

The BlueSky proposal suggests that “moderation should center around restricting harassment & harm.” The AKASHA Foundation is in wholehearted agreement; we define moderation below:

Moderating is essential for social networking. Moderating facilitates cooperation and prevents abuse. Social networking can only descend into antisocial networking without moderating. All social groupings you have enjoyed in your life have various moderating mechanisms. Moderating processes in decentralized social networking need requisite variety. Moderation needs to be customizable and adaptable as communities change.

The Berkman Klein Center authors make an interesting proposal centered around designing for friction. Current social networks rely on virality; in contrast, their proposal argues for a “cost” associated with creating new profiles.

We agree: future decentralized communities should indeed find new ways to enable discovery while adding the necessary friction to inhibit bad actors from spreading their malice.

What could such a “cost” look like? A purely monetary cost may be an oversimplification. It may instead be achieved indirectly by requiring profiles to have specific traits in order to “level up”. For instance, on our experimental community deployment Ethereum World, requiring profiles to add ENS names (which involves costs) could be one solution. Another is using validated profiles from an on-chain identity provider: such profiles already come with a “cost”, sometimes just a social one, since tainting them with bad behavior carries a reputational price. Overall, it is clear that the cost for bad actors to create new profiles en masse should be prohibitive.
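
As a rough sketch of what such a gate could look like in code, assuming an ethers.js provider and a hypothetical canLevelUp helper (neither is part of the AKASHA Framework API), a community could require a primary ENS name before a profile levels up:

```typescript
import { ethers } from "ethers";

// Hypothetical gate (not an AKASHA Framework API): a profile may only
// "level up" once its address reverse-resolves to a primary ENS name,
// since acquiring one carries a real cost.
async function canLevelUp(
  provider: ethers.JsonRpcProvider,
  address: string
): Promise<boolean> {
  // lookupAddress returns null when no primary ENS name is set.
  const ensName = await provider.lookupAddress(address);
  return ensName !== null;
}
```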

However, I disagree with another proposal made there: that bad behavior would lead to immediate, visible “tagging”. I believe such a flag should instead be brought to the attention of a neutral moderator, or a group thereof, acting on clearly defined rules. Instead of “negative tagging”, positive tagging may highlight mutual interests, and sharing batched POAPs from past events may be used to discover participants with common interests. Some “leveling up” may be achieved through receiving POAPs for “good actions”, helping to increase reach. Demoting bad actors by revoking previously earned badges may also be possible, instead of “feathering” a participant for bad behavior.
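
To make this concrete, here is a hypothetical reputation sketch; the Badge shape and reachMultiplier function are illustrative assumptions of ours, not part of any POAP or AKASHA API. Reach grows with earned badges and shrinks when badges are revoked, so demotion stays quiet instead of publicly tagging a profile:

```typescript
// Hypothetical reputation model with illustrative names: revoking a
// badge quietly reduces reach rather than "feathering" the profile.
interface Badge {
  id: string;
  weight: number;   // e.g. a POAP earned for a "good action"
  revoked: boolean; // revocation demotes without visible stigma
}

function reachMultiplier(badges: Badge[]): number {
  const score = badges
    .filter((badge) => !badge.revoked)
    .reduce((sum, badge) => sum + badge.weight, 0);
  // Logarithmic growth gives diminishing returns to badge hoarding.
  return 1 + Math.log1p(score);
}
```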

The proposed concept of modular moderation filters is encouraging; it is great to find that this concept has made its way into the architecture overview of the BlueSky ADX experiment as well. In our initial experiments with the AKASHA Framework, we imagine moderation filters could exist as independent integrations.
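
To illustrate what we mean by independent integrations, here is a minimal sketch of a filter plugin contract; the names are assumptions of ours, not the AKASHA Framework or ADX API:

```typescript
// Illustrative plugin contract for a modular moderation filter.
interface Post {
  id: string;
  author: string;
  text: string;
}

type Verdict = "allow" | "flag" | "hide";

interface ModerationFilter {
  name: string;
  evaluate(post: Post): Promise<Verdict>;
}

// Each community composes its own ordered list of filters; the most
// restrictive verdict wins.
async function moderate(
  post: Post,
  filters: ModerationFilter[]
): Promise<Verdict> {
  const severity: Record<Verdict, number> = { allow: 0, flag: 1, hide: 2 };
  let result: Verdict = "allow";
  for (const filter of filters) {
    const verdict = await filter.evaluate(post);
    if (severity[verdict] > severity[result]) result = verdict;
  }
  return result;
}
```

Under such a contract, communities would differ only in which filters they install and how they order them.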

One challenge that we still need to work out is how varying flagging criteria among communities, or even apps, will be assessed. If values vary, flags may simply be misleading from one community to another, making a homogeneous user experience and neutral filtering criteria challenging.
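
One conceivable mitigation, sketched below with entirely made-up community names and labels, is for each community to publish a mapping from its local flag labels onto a small shared vocabulary, so that flags remain comparable when content crosses community borders:

```typescript
// Hypothetical cross-community flag translation; all names are made up.
type SharedReason = "harassment" | "spam" | "nsfw" | "other";

const communityMappings: Record<string, Record<string, SharedReason>> = {
  "community-a": { abuse: "harassment", shilling: "spam" },
  "community-b": { rude: "harassment", ads: "spam" },
};

function normalizeFlag(community: string, localLabel: string): SharedReason {
  return communityMappings[community]?.[localLabel] ?? "other";
}
```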

On Business Models

Web2 social media companies have business models where the user is not the customer but the product. In the attention economy, user data is extracted for sale to the highest bidder. We agree on the necessity of a new paradigm with non-profit legal entities to implement decentralized social tools and platforms. This is precisely why the AKASHA Foundation is focusing on the creation of the AKASHA Framework to build decentralized social platforms, paired with the creation of legal entities for governance. These could, for instance, be implemented through a Decentralized Autonomous (Swiss) Association, a DAA.

We agree with avoiding monetization through tokenization or direct rent-seeking by organizing as nonprofit entities (whether AKASHA or BlueSky). Independent developers building on the AKASHA Framework may still seek their own methods to monetize the applications they construct.

The AKASHA Framework encourages native integration with EVM-compatible smart contracts and blockchains. This enables different styles of monetization for creators and their services. Tipping, product/service purchases, subscription models, and membership dues will all be possible with the AKASHA Framework, in alignment with the Berkman Klein proposal.
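
As a small example of the simplest of these, here is what tipping could look like with ethers.js in a browser context; the tipCreator helper is an illustration of ours, not an AKASHA Framework API:

```typescript
import { ethers } from "ethers";

// Minimal tipping sketch: sends ETH straight to a creator's address.
// Assumes an injected browser wallet such as MetaMask.
async function tipCreator(creator: string, amountEth: string): Promise<void> {
  const provider = new ethers.BrowserProvider((window as any).ethereum);
  const signer = await provider.getSigner();
  const tx = await signer.sendTransaction({
    to: creator,                         // an address or an ENS name
    value: ethers.parseEther(amountEth), // e.g. "0.01"
  });
  await tx.wait(); // wait for on-chain confirmation
}
```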

Open Challenges we see at AKASHA

When looking at the AKASHA Framework, interoperability and scalability with other decentralized social applications and platforms are of the utmost importance. We are open to working with BlueSky or any other initiative on exploring bridges and open standards. Standardization of interfaces will be crucial for facilitating interoperability and creating a usable counter-proposal to today’s prevailing centralized social networks.

Questions that are guiding us are:

  • How may we handle moderation on inputs from other communities on different platforms (Web2 and Web3) that may be integrated into the user’s personal feed?
  • How may we communicate and signal across different communities and platforms when different annotations and flagging rules may be used?
  • Who will handle the increasing number of requests for moderation?
  • Who may curate flagged posts?
  • How can the architecture for moderating and curating scale across different networks and thousands of requests per second?