Teaching Mastodon: Ethics, Moderation, and the Work of Participation
NB: We hope this post is the first of several discussing the use of alternative social media in the classroom. In this post, ASM Research Network member JJ Sylvia IV discusses teaching Mastodon in a social media ethics course at Fitchburg State University.
In teaching Mastodon as part of a social media ethics class, I’ve learned that its ethical promises of greater user agency, reduced data exploitation and decentralized governance are inseparable from user labor in practice. This labor includes learning federation norms, building meaningful networks without algorithmic assistance, and participating in governance and moderation activities. Students coming from mainstream social networking sites often notice and sometimes resist or resent the additional labor Mastodon requires.
Over the last few years, I’ve been experimenting with how to incorporate more alternative social media theory and practice into my teaching. My most ambitious attempt occurred in a graduate course in Social Media Ethics that I teach for the M.S. in Applied Communication with an emphasis in Social Media. Beginning in the summer of 2023, I launched a Mastodon instance created specifically for the class, with lessons and readings on the ethical differences between mainstream social media platforms and Mastodon, including their moderation policies. I’ll share my approach, along with some lessons learned and suggestions for others who may want to incorporate a similar assignment.
Why a Class Instance?
The first three weeks of my 7-week class focused on social media ethics related to platform moderation. I set up a new Mastodon instance specifically for my class. I created a unique instance both because I wanted students to feel ownership of the platform as they wrote their own moderation policy for it (and moderation is always applied locally) and because it avoided the overhead of having students select an instance to join when they signed up. This instance was federated broadly. Because students were required to participate publicly, they had the option to use a pseudonym.
Onboarding
In the first week, students watched a video I recorded showing a step-by-step process of how to sign up for and use our Mastodon instance. I also linked to guides for finding accounts to follow, along with a list of accounts posting about communication and media studies topics. Students were required to follow accounts and/or hashtags of their choosing, as well as try out features like commenting on and boosting other posts.
The labor of network building became highly visible in this onboarding step. The lack of an algorithm to help new users connect with people they already followed elsewhere or with like-minded communities has consistently been the most pressing issue raised by students. Despite the resources I shared for building a network, students reported being left with a negative first impression of the platform, primarily because they did not immediately see engaging content. Mastodon shifts this early-stage work from the platform to the user and community. These critiques made clear that students are accustomed to algorithmic onboarding that reduces the need for intentional community building. In future iterations of this lesson, I plan to explain this difference to students in the first week, more clearly articulating how they might see the slowness of building a community through their own agency as part of the ethics lesson related to using Mastodon. Both the effort of building a community and the lack of an endless scroll can be more clearly framed as potentially beneficial when we think differently about social media and how and why we use it.
Looking back, if I were to run this assignment again, I would include a small “starter pack” for students, with a set list of accounts and hashtags for them to follow that would immediately populate their timeline. I believe this would serve as a compromise that takes some of the stress out of building community, especially in a short 7-week course, while still leaving a significant amount of agency for network building to students.
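For instructors who want to try this, one low-effort way to distribute a starter pack is Mastodon’s CSV import for following lists (Preferences → Import and export → Import). Below is a minimal Python sketch that writes an import-ready file; the account handles are placeholders, and the header row matches recent Mastodon following-list exports, so verify it against your own instance’s export format before relying on it.

```python
import csv

# Hypothetical starter-pack accounts; replace with real handles
# relevant to your course.
STARTER_ACCOUNTS = [
    "someone@mastodon.social",
    "mediastudies@scholar.social",
    "example@hci.social",
]

def write_starter_pack(path="starter-pack.csv"):
    """Write a CSV students can import via Preferences ->
    Import and export -> Import -> 'Following list'."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        # Header used by recent Mastodon following-list exports.
        writer.writerow(["Account address", "Show boosts",
                         "Notify on new posts", "Languages"])
        for account in STARTER_ACCOUNTS:
            writer.writerow([account, "true", "false", ""])

write_starter_pack()
```

Each student would download the file once and import it, immediately populating their home timeline while leaving them free to prune and extend the list themselves.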
Comparing Moderation Policies
In the second week of the course, students were asked to compare the moderation policies of one mainstream social media site and a Mastodon instance of their choosing, other than our class instance. Most chose mastodon.social because of its size and prominence. They also played a content moderation simulation game designed to demonstrate the stress that content moderators regularly face.
In comparing platforms, one student noted that Mastodon’s privacy and moderation features were excellent, and that they were not bombarded with advertisements, which meant less competition for their attention.
Another student compared Mastodon to (then) Twitter, noting that although Twitter’s centralized structure provides consistency in moderation and content curation, they preferred Mastodon’s decentralized nature, which promotes user privacy, data ownership, and unique communities. In short, students seemed to prefer Mastodon’s moderation policies to those of more mainstream sites, including Instagram and Twitter. Centralized platforms concentrate governance in the company itself, while Mastodon devolves more of it to individual communities and instances, which can feel empowering but also burdensome.
Creating a Moderation Policy
The lesson on Mastodon culminated with having students collaboratively draft a moderation policy for our instance. Students pitched ideas on an interactive site (Kialo) that allowed commenting and voting. We adopted the highest rated policies. These included:
- Spam and Misinformation: Resist posting false content intended to be misleading or harmful.
- Respectful Behavior: Treat each member with respect. Do not engage in personal attacks, dogpiling, threats, or bullying.
- Prohibited Content: Hate speech, violence, discrimination, or hostility based on attributes such as race, ethnicity, gender, or sexual orientation is forbidden. Sexually explicit content, including content that promotes the sexual exploitation of minors or violates child protection laws, is prohibited.
- Server Rules: The following content is not allowed on this server: 1. posts that contain racism, sexism, homophobia, transphobia, or xenophobia, 2. posts that contain harassment, dogpiling, or doxxing of other users, 3. posts that incite violence or promote violent ideologies, 4. posts that intentionally share false or misleading information.
There is some overlap in these policies, and another round of fine-tuning the top policies would have been helpful.
In discussing these top-rated policies, students noticed that none included an enforcement mechanism. We realized we had norms but no process specifying who receives reports, what actions exist, what timelines apply, and how appeals work. One other policy that didn’t receive full support from the class did include a suggestion for enforcement:
“Reporting and Moderation: We strive to respect the right to free speech but recognize that there are instances that go beyond that. If you would like to report a user who may have violated our server rules, please send an email to [redacted]. Please include the post in question and the reason for the report. Reports are usually processed within 24 hours. For first-time violations, we may decide to delete the offending content. Repeat violators may be suspended.”
If I teach this again, I would change one thing. Because we had our own instance, we never saw any violations of the governance rules we adopted as we continued to use the platform for the rest of the class. In the future, I would consider not only requiring an enforcement section, but also having students analyze pre-created posts that had been “reported” as possible violations of the policies they adopted. A mechanism for stress-testing the governance rules they created would make this assignment even stronger, and would help students better appreciate the labor that goes into creating and enforcing such policies.
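As a sketch of what such a stress test could look like, here is a toy Python triage script that flags hypothetical “reported” posts against rule names mirroring the class policies. The keyword lists and sample posts are invented for illustration; real moderation decisions require context and human judgment that keyword matching cannot capture, which is itself part of the labor lesson.

```python
# Toy stress test for a moderation policy: match hypothetical
# reported posts against rule keywords. Rule names mirror the class
# policies; the keyword lists and sample posts are invented.
POLICY = {
    "Spam and Misinformation": ["miracle cure", "guaranteed returns"],
    "Respectful Behavior": ["idiot", "loser"],
    "Prohibited Content": ["<slur placeholder>"],
}

def triage(post: str) -> list[str]:
    """Return the names of rules a reported post may violate,
    as a starting point for human review -- not a verdict."""
    text = post.lower()
    return [rule for rule, terms in POLICY.items()
            if any(term in text for term in terms)]

reports = [
    "This miracle cure fixes everything, guaranteed returns!",
    "You're such a loser.",
    "Here is my thoughtful post about federation.",
]
for post in reports:
    print(post, "->", triage(post))
```

In class, students would run their adopted rules against a shared set of borderline posts and then argue about the false positives and false negatives the script produces, surfacing exactly the ambiguity that written policies hide.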
Conclusions
While students reported seeing the theoretical benefits of an alternative platform like Mastodon, they ultimately seemed put off by how much labor it took to participate. In the years since I first ran this assignment, I have begun to see more clearly some parallels between these challenges and those associated with the levels of civic participation needed when living in a democracy. Therefore, I’ve begun to see a larger ethical framing emerge around the importance of participation in the systems we use during the course of our daily lives. In the future, I plan to emphasize this larger ethical point about the agency and labor needed to support those systems we often take for granted, and the ethical challenges that arise when we ignore them.
For Further Reading
Below is the full schedule for the first three weeks of my course, including readings and assignments related to Mastodon. I’m sharing it here for instructors who want to adapt the module.
Week 1
Reading: Ess, “Digital Media Ethics: Overview, Frameworks, Resources” in Digital Media Ethics. Note: All readings are done with collaborative annotations through Perusall.
Lecture Video: I demonstrate how to start using Mastodon.
Links: How do I discover accounts to follow on Mastodon and the Fediverse? and The Commodon List, a list of accounts that post about communication and media studies.
Assignment: For this assignment, you’re going to learn about the social networking experience known as Mastodon. You will first sign up for an account on our class instance of Mastodon (redacted link) and create a profile and introductory post. Your task is then to find and follow accounts on a different server, engaging in conversations. You can do this by commenting on posts, initiating discussions, or posting original content related to the community’s focus. You can also Boost posts (this is similar to retweeting on Twitter). Remember to maintain ethical and respectful behavior during all interactions.
Document your experiences and reflect on them. How does this community engage? How does the federated structure of the Fediverse impact these differences? Examine the ethical implications of federated networks. Discuss how the decentralized nature of the Fediverse might influence privacy, censorship, content moderation, and user behavior. Write a comprehensive series of posts (a thread) on your primary Mastodon account summarizing your experiences and insights using Mastodon. Share the link to that thread here.
Week 2
Readings:
- Schultze and Mason, “Studying Cyborgs: Re-Examining Internet Studies as Human Subjects Research,” in Journal of Information Technology
- Sylvia, “The Ethical Implications of A/B and Multivariate E-Commerce Optimization Testing,” in Ethical Issues in E-Business: Models and Frameworks
- Bay, “Social Media Ethics: A Rawlsian Approach to Hypertargeting and Psychometrics in Political and Commercial Campaigns,” in ACM Transactions on Social Computing
- Video: “Fixing Social Media with Ethan Zuckerman” by Manning College of Information and Computer Science
- Short case studies:
- Boka, “Facebook’s Research Ethics Board Needs to Stay Far Away from Facebook,” in Wired
- Zimmer, “OkCupid Study Reveals the Perils of Big-data Science,” in Wired
- Ellenberg, “What’s Even Creepier than Target Guessing That You’re Pregnant?” In Slate
- Play https://moderatormayhem.engine.is/
Assignment: In this assignment, you will use critical thinking to perform an analysis of various social media platforms’ and Mastodon instances’ policies. First, you’ll need to select 1) a mainstream centralized social media platform (e.g., Facebook, Twitter, Instagram, TikTok) and 2) a Mastodon instance (e.g., mastodon.social, scholar.social). You will then investigate and analyze the community guidelines, content policies, and moderation techniques of each.
Prepare a written report that compares and contrasts the two policies, focusing on the following:
- Goals and objectives of the policies
- User and content restrictions
- Reporting procedures
- Policy enforcement techniques
- Transparency and appeals processes
Consider the effectiveness of these policies and their potential ethical impacts on user behavior, user experience, content diversity, freedom of speech, and online safety. The final document should be 750-1,250 words, follow APA formatting and contain links to both policies.
Week 3
Readings:
- Pierce, David. “So Where Are We All Supposed to Go Now?” The Verge, July 3, 2023.
- Doctorow, Cory. “The ‘Enshittification’ of TikTok.” Wired. January 23, 2023.
- Liebeskind, Sam. “The Summer of Decentralized Social (a Research Deep Dive),” 2023. (Originally a Google Doc, now posted on New Republic.)
- Doctorow, Cory. “Pluralistic: What the Fediverse (Does/n’t) Solve,” December 23, 2022.
- Zulli, Diana, Miao Liu, and Robert W. Gehl. “Rethinking the ‘Social’ in ‘Social Media’: Insights into Topology, Abstraction, and Scale on the Mastodon Social Network.” New Media & Society, 2020.
- Recommended: Rajendra-Nicolucci, Chand, and Ethan Zuckerman. “An Illustrated Field Guide to Social Media,” 2021.
Assignment: You’ll access Kialo below, registering a name when you first log in. Based on your previous analyses of moderation policies and our readings, you’ll use Kialo to write suggestions for moderation policies. You’ll then read what your classmates have suggested and give pros and cons for including those policies. We’ll actually implement the ones that are most popular. You’ll need to check this assignment more than once over the course of the week, to make sure you see everything being posted by classmates.
Comments
For each of these posts, we also post to Mastodon. If you have a fediverse account and reply to the ASM Network Mastodon post, your reply will show up as a comment on this blog unless its visibility is set to followers-only or direct message. Content warnings will work. You can delete your comment by deleting it through Mastodon.
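For the technically curious, this comment mechanism can be approximated with Mastodon’s public status-context endpoint (`GET /api/v1/statuses/:id/context`), which returns a post’s replies. The sketch below parses a sample payload; the instance URL, status id, and visibility handling are assumptions to check against the API documentation for your Mastodon version.

```python
# Sketch of how a blog could pull fediverse replies as comments via
# Mastodon's public status-context endpoint. The sample payload below
# is invented; the commented-out fetch shows the live-request shape.
import json
from urllib.request import urlopen

def context_url(instance: str, status_id: str) -> str:
    return f"https://{instance}/api/v1/statuses/{status_id}/context"

def extract_comments(context: dict) -> list[dict]:
    """Keep only public replies, mirroring the blog's behavior of
    hiding followers-only and direct replies."""
    return [
        {"author": s["account"]["acct"], "html": s["content"]}
        for s in context.get("descendants", [])
        if s.get("visibility") == "public"
    ]

sample = {
    "descendants": [
        {"account": {"acct": "reader@example.social"},
         "content": "<p>Great post!</p>", "visibility": "public"},
        {"account": {"acct": "friend@example.social"},
         "content": "<p>private note</p>", "visibility": "direct"},
    ]
}
print(extract_comments(sample))

# Live fetch (uncomment with a real instance and status id):
# data = json.loads(urlopen(context_url("mastodon.social", "123")).read())
# print(extract_comments(data))
```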