Facebook 'grateful' to Channel 4 Dispatches team for undercover reporting of content moderation practices days after 'fake news' row

Facebook has said it is “grateful” to Channel 4 Dispatches for going undercover and exposing, for the first time, some of the ways moderators are told to permit far-right pages or hate speech on the platform.

Inside Facebook: Secrets of the Social Network will show undercover footage filmed inside Facebook’s largest centre for UK content moderation, based in Dublin, where the work is outsourced to a company called Cpl Resources.

The programme broadcasts tonight at 9pm on Channel 4.

It follows a week of negative headlines for Facebook, after the social media giant said banning pages that share fake news was not “the right way to go” in the ongoing fight against disinformation.

Dispatches sent an undercover reporter to work as a content moderator at the centre in Ireland, where he was trained how to decide whether content reported by Facebook users should be allowed to remain on the site.

The reporter discovered that pages belonging to far-right groups with large numbers of followers – including the page of former English Defence League leader Tommy Robinson – were allowed to stay up despite violating Facebook’s rules on multiple occasions.

Facebook’s most popular pages cannot be deleted by ordinary content moderators and are instead referred to a special queue to be assessed by Facebook staff.

One moderator told the undercover reporter: “If you start censoring too much then people lose interest in the platform… It’s all about making money at the end of the day.”

However, Richard Allan, Facebook’s vice president of public policy, told Dispatches: “If the content is indeed violating it will go… I want to be clear this is not a discussion about money, this is a discussion about political speech.

“People are debating very sensitive issues on Facebook, including issues like immigration. And that political debate can be entirely legitimate.

“I do think having extra reviewers on that when the debate is taking place absolutely makes sense and I think people would expect us to be careful and cautious before we take down their political speech.”

The undercover investigation also found that moderators were told not to delete violent content involving children as long as it was not posted with a celebratory caption.

They were also told that racially abusive content targeting ethnic and religious immigrants was permitted under Facebook’s policies, and that trainers had instructed moderators to ignore racist content.

In a statement published today, Monika Bickert, vice president of global policy management at Facebook, said the company had already begun taking action in response to the issues raised by Channel 4 during its investigation.

Bickert said: “People all around the world use Facebook to connect with friends and family and openly discuss different ideas, but they will only share when they are safe.

“That’s why we have clear rules about what’s acceptable on Facebook and established processes for applying them. We are working hard on both, but we don’t always get it right.

“This week a TV report on Channel 4 in the UK has raised important questions about those policies and processes, including guidance given during training sessions in Dublin.

“It’s clear that some of what is in the program does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again.”

Facebook has also faced a backlash this week after its official Twitter account told US journalists it did not ban pages for sharing fake news – despite currently running a global advertising campaign saying “fake news is not our friend”.

The adverts add that Facebook is “committed to reduce its spread” and is therefore working with more fact-checkers globally, improving its technology, and giving users background information on articles in their news feed.

At an event last week designed to promote Facebook’s commitment to tackling fake news, CNN media reporter Oliver Darcy asked the company why far-right site Infowars was still allowed on its platform.

After Darcy tweeted that he “didn’t get a good answer”, Facebook responded by defending its decision on the grounds of free speech.

It said: “We see pages on both the left and the right pumping out what they consider opinion or analysis – but others call fake news. We believe banning these pages would be contrary to the basic principles of free speech.

“Instead, we demote individual posts etc. that are reported by Facebook users and rated as false by fact checkers. This means they lose around 80 per cent of any future views. We also demote pages and domains that repeatedly share false news.”

In response to subsequent criticism from New York Times technology columnist Kevin Roose, Facebook added: “We just don’t think banning pages for sharing conspiracy theories or false news is the right way to go. They seem to have YouTube and Twitter accounts too – we imagine for the same reason.”

