To cope, some companies have tried to replace human moderators with algorithms. The results have been mixed at best. Some of the most high-profile failures were at Facebook, where algorithms censored archaeological images showing a 30,000-year-old nude figurine while allowing live video of suicides to circulate widely. Facebook promised last year to hire thousands of human moderators — and, in some cases, to provide them with trauma therapy.
Those are good first steps for disaster-response moderation, but we also need to revive what Ms. West called the tummler part of the job. It’s a tough gig, but it can be done, especially if companies admit that there is no one-size-fits-all solution for moderation.
This is why human moderators are so valuable: they can understand what’s important to the community they’re moderating. On the Reddit forum r/science, for example, moderators will delete posts that aren’t based on peer-reviewed scientific research. And on the fan-fiction site Archive of Our Own (AO3), where many people prefer to post stories under pseudonyms, members can be banned for revealing the legal name of another member.
A well-trained moderator enforces these rules not just to delete abuse, but to build up a unique community. At AO3, for example, there is a class of moderator called a “tag wrangler,” whose job is to make sure stories are labeled properly for users who don’t want “Iron Man” fic mixed in with “Iron Giant” fic. Or “Iron Chef”! The forum is also recruiting bilingual moderators who can answer questions and post items of interest for its growing community on Weibo, China’s most popular microblogging site.
Monique Judge, an editor at the black news site The Root, told me that she and her colleagues are inundated with racist comments. But instead of banning the commenters, or deleting their words, The Root lets them stand. “We let those stay so that people can see how ignorant they are,” she said. “I feel like those comments are just our reality as black journalists. No matter what we talk about, people will say, ‘Don’t discuss this because you’re black.’”
Ms. Judge’s point is that context matters. Racist comments mean one thing in The Root’s community, where black perspectives are centered, and quite another on Twitter, where they are not.
Moderators aren’t the only ones responsible, though. They are effective only if they have the support of their employers. Anil Dash, a social critic and podcaster who runs the app development community Glitch, once argued, in an essay that has become a classic among moderators, that if a website’s comment section is full of jerks, “It’s your fault.”