© 2024 Public Radio Tulsa
It's 'Our Fault': Nextdoor CEO Takes Blame For Deleting Of Black Lives Matter Posts

Nextdoor CEO Sarah Friar, here in July 2019, tells NPR the popular neighborhood app is taking steps to address reports of racial profiling and censorship on the platform. (Patrick T. Fallon / Bloomberg via Getty Images)

As protests swept the nation following the police killing of George Floyd, there was a surge of reports that Nextdoor, the hyperlocal social media app, was censoring posts about Black Lives Matter and racial injustice.

In an interview with NPR, Nextdoor CEO Sarah Friar said the company should have moved more quickly to protect posts related to Black Lives Matter by providing clearer guidance.

It "was really our fault" that moderators on forums across the country were deleting those posts, she said.

People of color have long accused Nextdoor, which serves as a community bulletin board in more than 265,000 neighborhoods across the U.S., of doing nothing about users' racist comments and complaints. But Nextdoor came under especially heavy criticism in May after the company voiced public support for the Black Lives Matter movement.

Unpaid volunteers, known as leads, moderate posts on Nextdoor. Friar said they were deleting posts about Black Lives Matter because they were following outdated rules stating that national conversations have no place in neighborhood forums. Those guidelines have now been revised to state that conversations about racial inequality and Black Lives Matter are allowed on Nextdoor.

"We did not move quickly enough to tell our leads that topics like Black Lives Matter were local in terms of their relevance," Friar said. "A lot of our leads viewed Black Lives Matter as a national issue that was happening. And so, they removed that content, thinking it was consistent with our guidelines."

She added that the new rules make one thing clear: "Black Lives Matter is a local topic."

Friar said that Nextdoor is taking several more steps to improve the moderation of comments. It will soon offer unconscious bias training to all moderators. It will also launch a campaign to enlist more Black moderators. And it is ramping up efforts to detect and remove instances of racial profiling.

Apologizing, then asking for help from Black users

Neighbors take to Nextdoor to search for a local plumber, find a babysitter or sell a piece of furniture. But the app also has gained notoriety for spreading panicked messages that carry racist overtones.

In recent weeks, as the national conversation has centered on racial injustice, Black users have shared their stories of abandoning Nextdoor. One person wrote on Twitter that they stopped using it after reading repeated complaints about "large groups of black teens walking in their neighborhood." Another tweeted that their neighbors would write messages such as "Saw a black youth hanging out next door. Calling the cops."

Mayisha Fruge, 42, a Black mother of two in San Diego, Calif., who is active on Nextdoor, said those kinds of posts sound familiar.

About 90 percent of her neighbors come across as good, decent people on the app, she said.

"That other 10 percent? They must be hiding behind the computer. I never would have thought that my neighborhood had those types of people, racist people in it," she told NPR.

In one post, a neighbor was suspicious of a Black person who was simply taking a stroll. Another asked whether Black Lives Matter protesters have jobs.

"I said, what does this have to do with equality and justice?" Fruge said.

Friar has apologized to Black users who have said they do not feel welcomed or respected on the app, vowing that racism has no place on Nextdoor.

She also announced that Nextdoor was cutting ties to law enforcement by ending a "forward to police" feature that let users report observed activity to authorities.

But Friar told NPR that Nextdoor's efforts to combat racism on the app will go even further.

Nextdoor has enlisted Stanford University psychology professor Jennifer Eberhardt to help slow down the speed of comments to tamp down on racial profiling, and it's working with her to make unconscious bias training available to hundreds of thousands of moderators.

It is a change that some Nextdoor users have demanded. In an online petition, they criticized the app's "murky" guidelines for content moderation, which users said led to abuse and the silencing of Black voices.

In response to Nextdoor's commitments, the Atlanta-based group Neighbors for More Neighbors, which helped organize the petition, applauded the news but remained cautious.

"This is a positive step towards creating a true community forum where all people in our neighborhoods feel safe to participate," said activist Andrea Cervone with the group. "We will be keeping an eye on the company to make sure they continue forward and fulfill these public commitments."

In Northwest Indiana, Jennifer Jackson-Outlaw gave the company's announcements a lukewarm reception. Jackson-Outlaw, a Black woman who became fed up with Nextdoor and deleted the app, said Nextdoor's mostly white executive suite needs a shakeup to effect real cultural change at the company.

"It's important to not only have representation as far as those who are the moderator, but also those who are in the leadership of the company who may be more be well-versed on some of the issues," she said.

At Nextdoor, Friar has kicked off an effort to recruit more Black leads. This includes inviting especially active Black users to become moderators and starting outreach campaigns to encourage Black users to join the app.

"We recognize that is an underrepresented group on Nextdoor," Friar said of Black users. "There are others of course, but we want to start there because we really feel that the Black Lives Matter movement is so critical and important right now just to the health of our country."

Friar described Nextdoor's content moderation as "a layered cake," saying it involves local moderators, artificial intelligence tools and the company's human reviewers.

She said that the app's AI programs are being fine-tuned to better detect both explicit racism and posts that engage in racial profiling, or what she called "coded racist content." Nextdoor is now dedicating more staff to focus on attempting to ferret out racist content on the app.

"We're really working hard to make sure racist statements don't end up in the main news feed, making sure that users that don't act out the guidelines aren't on the platform anymore," Friar said. "It is our No. 1 priority at the company to make sure Nextdoor is not a platform where racism survives."

Confronting the 'Karen problem'

Though anecdotal evidence suggests Nextdoor's user base is largely white, Friar said the company has no internal metrics about the race of its users.

The app does not ask about race when users sign up, a decision that Friar said may soon change as the company examines how best to hold itself accountable in its push to diversify the platform.

"We are debating that," she said. "Because if we want to measure our success of being a diverse platform, perhaps that's something we do need to ask."

Critics of Nextdoor, including U.S. Rep. Alexandria Ocasio-Cortez, D-N.Y., have drawn attention to the app's so-called Karen problem. It's a term that has come to describe a middle-aged, privileged white woman with racist habits, whether overt or subtle.

When asked if Nextdoor has a Karen problem, Friar deflected by saying any intolerance or racism on the app is a snapshot of issues plaguing the entire country, not problems confined to the neighborhood platform.

"Does the U.S. have that problem? Yes, it's out there," Friar said. "But I think we're working as hard as we can to make sure neighbors are doing right by each other, that they're being civil, being respectful and that they're not falling back to calling each other names but rather trying to deeply understand."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Bobby Allyn
Bobby Allyn is a business reporter at NPR based in San Francisco. He covers technology and how Silicon Valley's largest companies are transforming how we live and reshaping society.