Facebook is in a race to become a safer social platform and improve its ability to remove objectionable content. On Wednesday, the site announced it was recruiting about 3,000 additional moderators. The new Facebook recruits will join the 4,500 workers the site already employs to monitor reports from users warning about content that breaches Facebook's terms and conditions.
The decision to hire more moderators follows severe criticism directed at Mark Zuckerberg after recent cases of murder and suicide were broadcast live on the site.
If there is one problem that Facebook has struggled to deal with, it is objectionable pictures and videos. Every day, users on the site report content that violates Facebook’s terms of service. However, it is always up to the site’s moderation team to review and act accordingly. And of late they have been struggling to do exactly that.
Zuckerberg says the company is working on making it simpler to report such clips so that moderators can do the right thing quickly and easily, whether that means removing a post or helping a person in need.
Hiring 3,000 Facebook recruits on top of the existing workforce might be what the site needs to police its platform effectively. If the move works, Facebook will be hailed as the pioneer of how big internet companies can deal with content problems on their platforms.
Zuckerberg's move is laudable, but he is not resting easy yet. According to him, the company still has a lot of work to do to prevent objectionable content from finding a home on the site.
Facebook's inability to moderate content on the site has long been an open secret. In 2016, the BBC raised the alarm about Facebook groups being used by sexual predators to sell pictures of exploited minors.
Though the head of public policy at Facebook promised to remove the content, a follow-up investigation found that the site had failed to get rid of most of it, even after the BBC had flagged the offending images through Facebook's own reporting systems.
Recently, Facebook came under fire when users filmed shocking events including murder and rape and proceeded to upload them on the site. Critics argued that the company was not doing enough to tackle the problem of abusive behavior on the social site.
In response, Damian Collins, chairman of the UK House of Commons media committee, said that he doubted the effectiveness of Facebook's moderation team. According to Collins, Facebook's failure to remove offending content even after several complaints had been raised undermines users' confidence in reporting it.
What does the move mean for Facebook and its users? For Facebook, the additional manpower to police its platform translates into faster review of content to determine which posts violate its standards. Extra hands and eyes will also make it easier to contact law enforcement agencies in emergencies. For users, the 3,000 new recruits will make it easier to report offending content and have such problems addressed effectively.
I have three takes from this:
- There are 3,000 people out there who will say thank god spammers exist, because now they have a job thanks to them.
- I don't think 3,000 people are enough, considering how huge the Facebook platform is and how much people can post and share.
- Those who spam hardcore should probably take it easier: Facebook bases most of its bans on reports from others, and with the added manpower more reports will get reviewed, so more accounts are in danger.