Covid-19 has led Facebook to limit its use of human moderators, with the company directing them to focus on initial reviews of the most severe content violations reported by its users. As a result, Facebook has relied less on human moderators to look at appeals involving other types of content.
Facebook CEO Mark Zuckerberg said on Tuesday that the Covid-19 pandemic has reduced the company's ability to moderate content on its social networks due to its limited use of human moderators, according to CNBC.
When lockdown measures began in mid-March, Facebook and its partners sent content moderators home to keep them safe, Zuckerberg said.
Content moderators are responsible for removing pornography, terrorism, hate speech and other unwanted content from across the site. Facebook employs 15,000 of these contractors at 20 sites globally, according to the Washington Post.
“Our effectiveness has certainly been impacted by having less human review during Covid-19, and we do unfortunately expect to make more mistakes until we're able to ramp everything back up,” said Zuckerberg.
Because of this limitation, Facebook decided to prioritize human moderators for initial reviews of the most severe content violations reported by its users.
Zuckerberg said he expects the amount of appealed content to be much lower in the company's August report.
Already, that drop in content-appeal reviews can be seen in the Tuesday…