This month’s conversation was with Aaron Rabinowitz!
Aaron is a secular moral philosophy educator with 10+ years of experience helping students develop their capacities for flourishing and value-centered community organizing.
He’s currently working on a PhD in education at Rutgers University, with a focus on developing a new pedagogy of luck to help generate greater compassion, humility, and personal fulfillment. Aaron hosts Embrace the Void and co-hosts Philosophers in Space.
This was a great conversation with Aaron and Philip. I would love to connect and talk more about this because I have a pretty unique vision with the same end goal as Aaron on community building and moderating it, and would love to debate further. I personally agree with him regarding far-right propaganda causing a lot of the issues that lead to violence, but I would argue that this is a societal issue stemming from the entire socioeconomic system, at least as it is designed in America, and not just from the far right. We have a very polarizing system where you are forced to decide whether you are on the right or the left, and you are alienated from whichever side you choose against. It’s my belief that this stems from corporate lobbying on both sides of the political spectrum. Political debates on TV polarize people against each other, and that can eventually lead to violence. We see the “other” as the problem rather than as one of “us”. I feel this is an intentional outcome for those with the power to spend their money on politicians, creating such societal issues in order to keep and strengthen that power. Thoughts?
Thank you for the conversation and for having me there (and in this space). While I was fascinated by the Monster Island write-up, I remain a little skeptical of the lessons learned. It’s easy to see the right as the worse offender because of dog-whistle memes or outright hateful messages, but it’s also easy to create a homogeneous space by over-policing rhetoric from the left. I’ve been in online spaces that actively and aggressively moderated words like cancer or crazy. Having shibboleths like that can very quickly annoy all but the most ardent lefties out of a space entirely. I’ve been a moderator for an online community for about 9 years, and it’s very easy to see that some behaviors and speech need to be nuked from orbit. It is harder to see the slow strangling and winnowing of communities that occurs from over-moderation and purity testing.
I agree that we have witnessed growing polarization of politics, most clearly in the US, and where the US leads the UK follows. Yet, as with all complex systems, it is tricky to distinguish cause and effect. The advent of social media can be seen as a cause, but one can equally point to other causes whose effects are naturally channeled through whatever communication channels are available.
While we all have a huge ambition right here, it is, I think, bite-sized compared with attempting a fix of the entire sociopolitical-economic landscape!! We can strive to make social networking more respectful of community norms. We can rethink the digitalization of human identity. We can design digital systems to encourage nonviolent / compassionate communication. It won’t be easy, but it’s a massive transformation we can definitely make happen.
I’m glad that you’ve brought up the use of “over” in the context of moderating, @daine. And of course, for everyone who thinks a community is being over-moderated, I’m guessing we might conclude there’s someone who considers it under-moderated! Rightfully, it’s the subjectivity that matters.
I keep returning to offline community for some guidance and design inspiration. As moderating (facilitating cooperation and attenuating abuse) is a feature of all human community, I find myself asking: What offline mechanisms have we not yet digitalized? And what are the unprecedented qualities of online community that could be causing or amplifying the pain we’re witnessing?
By way of contributing one observation to the questions I’ve posed there, I would say — look, here we are on Discourse. Note how Jeff Atwood (Discourse founder) has refused to introduce threading. The conversation right here right now is, well, dare I say it, conversational. It is not about upvoting or downvoting. It is not about trying to be clever or attention grabbing. It’s conversation playing out just as conversation always has, each person taking turns. No contribution is limited by character count, but the conversationalists know implicitly that a looooooong response just won’t be read by anyone.
[Checks and realises that this is my third response in a row and decides to stop right now.]
It’s possible that for every person who thinks there is over-moderation there is someone who thinks there is under-moderation, but that does not mean the answer is necessarily in the middle. If we want truly diverse and functional discourse spaces, we need to be willing to understand and engage with some difficult trade-offs. Most of us have seen the horror show of chaotic environments with zero effective moderation. Whether by trolls or by extremists, these places are made into Monster Island knockoffs. I’m merely pointing out that a mirror image of that also exists. If we allow moderation to preemptively designate any opposing view as hate speech, or every person with a heretical opinion as a Nazi, we quickly create a space devoid of anything but the leftist party line. And given that those insular communities are no longer in dialogue with the wider world, they lose their ability to influence the massive majority of people in the middle. We end up with a navel-gazing media that is shocked to find it has lost the 2016 election, for example.
I agree that it is tricky to distinguish cause and effect, especially in a world where we need to see or provide empirical evidence before we believe in cause and effect. My theory is that social media is not directly responsible for the polarization of politics or ideologies; it has only connected the various bubbles we live in as a human society and effectively burst them. Where we already held polarizing opinions, we are now simply exposed to opinions that, within our own bubbles, we may never have encountered. We all seemingly fell into our own echo chambers, and the advent of social media showed us that we haven’t progressed as a society as universally as some of us thought. I would love to see or do more research on this and connect on it, to see how it could pertain to the development of Akasha.
The Akasha.org website defines the word akasha as a network connecting humanity with itself and with infinite knowledge. If this is what the Akasha foundation is to become, then moderation can be very tricky. I believe the exposure of our prejudices, and the rehabilitation of them, are necessary for the long-term growth of such an endeavor. My views on this are less about what it could mean for me in my life and more about what the Akasha foundation could do for humanity generations into the future. I would love to discuss this further, because I have a background that gives me perspective on, and understanding of, most sides of this debate.
Hi folks! Thanks for the follow-up comments. Just to clarify my position, I don’t emphasize the problem on the right because I think there’s no problem on the left, or because I think politics is always as simple as left vs. right. I emphasize it because of the specific contingent reality that Americans are living through right now, with a right wing that is radicalized enough to storm the Capitol building and elected officials who will encourage them to do so. My goal is not to other people on the right; they absolutely deserve compassion. My goal is to help people understand that things on the American right are significantly worse than you might assume, even in a post-Trump world. Happy to hear more thoughts on that.
Let me ask whether you agree with this hypothesis:
The state of the current “right” in the US might be a result of the dystopian way online social networks have been designed in the first place.
The “right” is more scrupulous in using the psychological tools / social-engineering machinery to seed division that benefits its stakeholders. This basically means buying into social-engineering services via advertisements, the business model of current social media.
A consequence of, and a necessary precondition for, the success of this “investment” in social engineering is that the protagonists, i.e. US society, do not realize it.
So before jumping into further discussion of the state of the left and right in the US, it might be important to establish whether my hypotheses hold, or whether we can agree on them.
I write this because I see your comment was also made from “within” this perspective: the state of the right and left in the US.