Nextdoor moderators scramble to address QAnon after Capitol attack

For months, Nextdoor moderators have struggled to address QAnon content on the platform's neighborhood sites. But after last week's deadly attack on the Capitol, tensions between moderators and the company's policy team may have reached a breaking point.

Moderators have been asking Nextdoor to ban QAnon content since at least October, according to forum screenshots obtained by The Verge. Last week, moderators began pressuring the company directly in the National Leads Forum, a private forum for moderators on the site. In the posts, moderators expressed concern that Nextdoor's misinformation policies did not fully bar discussions of conspiracy theories like QAnon.

Following last week’s pro-Trump riot at the Capitol, one user returned to an early QAnon thread, writing, “I am bumping this up. It’s January 8th. Any policies yet? After the past week, we need some. I also wrote an email to Next Door Leadership about this three months ago and got no response.”

It wasn't until five days after the riot that Nextdoor finally responded to the request, referring moderators back to the company's policy on violent content. In that post, Caty K., Nextdoor's head of community, wrote, "I want to reiterate that the broader Nextdoor team is committed to the safety of all members and communities on the platform." She continued, "The violent events that took place at the US Capitol last week are no exception."

But some Nextdoor moderators say the company's misinformation policies don't meaningfully address QAnon and haven't been communicated well enough to help communities deal with the conspiracy theory. The company's misinformation policy asks moderators to report individuals who distribute "misinformation related to the Election and COVID-19," but it does not directly address conspiracy theories like QAnon. After the attack on the Capitol, many QAnon theories carry an implicit risk of inciting violence, yet moderators find it hard to justify removing them as straightforwardly violent content. At the same time, current Nextdoor moderation policies do not ban discussions of conspiracy theories outright.

“The problem is this policy is written so specific to election and Covid-19 information and does not mention any violation that can be used for things like misinformation around politics and inciting fear in the community,” one moderator wrote in the thread.

"Facebook has announced that it will be automatically removing content with the phrase 'Stop The Steal' and #StopTheSteal," Steve C., a California lead, responded. "Does Nextdoor plan to do the same?"

On Monday, in response to a thread titled "FB has banned all QAnon Content – what is ND policy?" Caty wrote that "Nextdoor views QAnon as a hate group." She continued, "If you see content or members advocating these ideologies, please report them to our team and we will handle. I recognize we do not have a list of groups available for you all to reference, and I will work on that to make things clearer, but for now this comment serves the purpose of confirming that QAnon content should be removed."

On Wednesday, Nextdoor confirmed to The Verge that it classifies QAnon as a hate group. Still, there has been no effort to communicate the QAnon policy to everyday users, and as of publication, Nextdoor has not updated the misinformation policies on its website to reflect that classification. "Right now we don't have plans to email it out [to moderators]," Caty said in response to a post asking whether the decision would be communicated beyond the forum.

Nextdoor also referred The Verge to its misinformation and violent content policies. "Any post or content on Nextdoor that organizes or calls for violence will be immediately taken down," a Nextdoor spokesperson told The Verge. "Nextdoor's Neighborhood Operations Team also uses a combination of technology and member reports to proactively identify and remove content."

Nextdoor has struggled to establish clear moderation policies in the past. Nextdoor neighborhoods are primarily self-governed, and unpaid "community leads" are in charge of reporting and removing content in their communities. This has led to content being wrongfully removed or wrongly left up. Last June, The Verge reported that posts supporting the Black Lives Matter movement were being wrongly taken down by Nextdoor moderators.

In October, Recode reported that QAnon-related content flourished on the platform in the weeks leading up to the 2020 US presidential election. In one instance, Recode said that a user bombarded Nextdoor on Twitter for weeks before the platform removed a post "containing QAnon talking points."

According to Nextdoor’s rules, discussions of national politics are banned on the main community feed. As a result, public and private groups have grown to house these discussions. In forum posts obtained by The Verge, community moderators expressed worry over private groups that could be housing violent or extremist posts.

"How can we ensure locked Groups are not participating in harmful discussions?" Jennifer V., an Arizona moderator, wrote in a forum post Tuesday. "We have a LOT of pro-Trump/Patriot Groups that I worry about. I also worry about other Leads or Community Reviewers seeing me report the Groups and the backlash."

"My concern is QAnon content, as well as other content with conspiracy theories, promotions of violence, etc., that is in *private* groups that won't get reported because the members of the group WANT that content," Carol C., a Colorado moderator, wrote in the forums last week. "I saw some of this type of content in the public political groups that have since gone private."

Source: www.theverge.com
