Thursday May 04, 2017

Facebook Will Make More People Watch Violent Videos so You Don't Have To

Facebook currently employs a moderation team of 4,500 people to review videos and other flagged reports, and Zuckerberg is adding another 3,000 to curb the growing prevalence of disturbing content. He also hints at new tools that will make such material easier to report, though he offers little detail on how those will differ from the current system. Do you have the stomach for this kind of job?


It is unclear whether these new team members will be hired as staff or as contractors, like the company's current moderators. According to Zuckerberg, the bolstered community operations team will be able to root out offensive content on Facebook more effectively, before too many people see it. Since launching Facebook Live last spring, the company has been mired in controversy over the feature. Everything from sexual assault to suicide has appeared on the platform, and many critics say the company rushed the product to market without adequately equipping itself for these challenges.