Facebook Will Hire 3,000 Staffers to Review Violent Content – Mark Zuckerberg

CEO Mark Zuckerberg says the company needs to do better after a string of episodes in which users broadcast killings, suicides, and other violent acts.

Zuckerberg said recent videos, including the horrific footage of a man in Thailand killing himself and his 11-month-old daughter, were “heartbreaking.” “Just last week, we got a report that someone on Facebook Live was considering suicide,” he said. “We immediately reached out to law enforcement, and they were able to prevent him from hurting himself.”

In a Facebook post on Wednesday, Zuckerberg said the company will add 3,000 people to its community operations team around the world, in addition to the 4,500 it employs today, to review “the millions of reports we get every week and improve the process for doing it quickly.”

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

In his Facebook post announcing the move, Zuckerberg acknowledged the recent string of violent videos that have raised questions about the company’s moderation practices:

Over the last few weeks, we’ve seen people hurting themselves and others on Facebook—either live or in a video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.

Two videos of a Thai man killing his 11-month-old daughter in April were available for 24 hours before being removed and were viewed over 370,000 times. 

Facebook’s chief operating officer Sheryl Sandberg said: “Keeping people safe is our top priority. We won’t stop until we get it right.”

Zuckerberg said Facebook workers review “millions of reports” every week. In addition to removing videos of crime or getting help for people who might hurt themselves, he said the reviewers will “help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation.”

There have been many horrifying incidents posted on Facebook in recent months. An Alabama man took his own life on Facebook Live last month, and there have been three Facebook Live-related suicides, including that of a 14-year-old girl in Miami. On Easter Sunday, a Cleveland man uploaded a clip showing himself shooting and killing a random 74-year-old man on the street.

Wednesday’s announcement is a clear sign that Facebook continues to need human reviewers to monitor content, even as it tries to shift some of the work to software, due in part to the platform’s sheer size and the volume of material people post.
