Facebook hiring 3,000 to combat inappropriate streaming
Social network Facebook announced on Wednesday that it would hire 3,000 employees to monitor content on its site, after recent incidents in which users live streamed acts of violence.
The tech giant’s CEO Mark Zuckerberg said the new hires would join the existing 4,500-strong team of content reviewers, in response to criticism over videos of murders and suicides appearing on the site.
US police led a nationwide hunt for a man who streamed his murder of an elderly man in Ohio last month, prompting Facebook to vow to improve its monitoring of the service.
More recently, a Thai man broadcast the killing of his own daughter via the social network, and the video remained online for more than a day before it was removed.
In a post on his own Facebook page, Zuckerberg said the additional staff would help the company detect and remove inappropriate material more quickly.
"Over the next year, we'll be adding 3,000 people to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly," Zuckerberg wrote.
"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down."
Facebook's difficulties come as other social media and internet companies face similar questions about how they monitor and filter content on their sites. Google-owned YouTube saw a raft of advertisers pull their spending earlier this year after some ads appeared alongside extremist content.