Facebook to hire 3,000 more workers to remove violent videos
Facebook CEO Mark Zuckerberg announced in a post on his official Facebook page that the company will hire 3,000 new employees to monitor content and remove violent videos from the site.
It is evident from the post that Zuckerberg has been deeply affected by the violent murder and suicide videos people have posted on Facebook over the last few weeks. Here's what he wrote:
"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg said in a Facebook post.
"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down."
The new hires will join Facebook's community operations team, adding to the 4,500 employees who already review posts. They will review the "millions of reports" Facebook receives each week about posts that may violate its terms of service. The move will hopefully "improve the process for doing it quickly," Zuckerberg said.
"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else," he said in the Facebook post.
In addition to hiring more reviewers, Zuckerberg said Facebook will also be "building better tools to keep our community safe."
"We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help," he said.
"No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need."
The company does not allow videos and posts that glorify violence, but such content is typically reviewed, and potentially removed, only after users report it.