Fadzai Madzingira, Facebook’s Associate Manager in charge of Content Policy

How Facebook is preventing fake news and content that violates standards from spreading

Every day, people go on Facebook to share their stories, see the world through the eyes of others and connect with friends. The conversations that happen on Facebook reflect the diversity of a community of many people communicating across countries and posting everything from text to photos and videos.

In Ghana, misinformation and disinformation have become an issue as a result of social media technology and increased internet penetration. Social media has become a major medium for spreading false information.


For that reason, Facebook, a major social media platform, has initiated moves towards preventing fake or false news from going viral on its platform and has recently expanded its third-party fact-checking programme to Ghana.

This is aimed at helping to assess the accuracy and quality of news people find on Facebook, while reducing the spread of misinformation.
Using feedback from Facebook users, the company plans to refer potentially false stories to fact-checkers for review and removal.

When third-party fact-checkers fact-check a news story, Facebook will show their findings in Related Articles immediately below the story in News Feed. Page admins and people on Facebook will also receive notifications if they try to share a story, or have shared one in the past, that has been determined to be false, empowering people to decide for themselves what to read, trust and share.

Fact-checkers

Facebook in 2018 launched the third-party fact-checking programme across five countries in Africa: Kenya, Nigeria, South Africa, Cameroon and Senegal.

In October this year, it extended the programme to 10 other African countries: Ghana, Ethiopia, Zambia, Somalia, Burkina Faso, Uganda, Tanzania, the Democratic Republic of Congo, Cote d’Ivoire and Guinea Conakry.

In partnership with Agence France-Presse (AFP), the France 24 Observers, PesaCheck and Dubawa, Facebook will now be able to assess the accuracy and quality of news people find on its site, while reducing the spread of misinformation.

If one of the fact-checking partners identifies a story as false, Facebook will display it lower in the News Feed and significantly reduce its distribution.

In Ghana, independent fact-checking will now be available through Dubawa. Its Programme Officer, Caroline Anipah, said in a statement that the programme would help curtail misinformation and disinformation and raise the quality of information available to the public.

In Ethiopia, Zambia, Somalia and Burkina Faso, the fact-checking will be done through AFP; in Uganda and Tanzania, through both PesaCheck and AFP; in the Democratic Republic of Congo and Cote d’Ivoire, through the France 24 Observers and AFP; and in Guinea Conakry, through the France 24 Observers.

Community standards

To enforce its Community Standards by removing content that is not allowed, so that people continue to feel empowered to communicate on the platform, Facebook has also initiated moves to create more awareness of those standards.

At a recent workshop in Nairobi, Kenya, Facebook explained to participants how it developed its Community Standards, which outlined what was and was not allowed on the platform and how they applied around the world to all types of content, including, for example, content that might not be considered hate speech but might still be removed for breaching the bullying policies.

Explaining, Fadzai Madzingira, Facebook’s Associate Manager in charge of Content Policy, said there was a content policy team that received reports but, unfortunately, many people “were not aware that we have rules of what is not allowed on the platform, but they are also not aware of how to flag content to us.”

The workshop for the journalists in Nairobi was, therefore, to raise awareness of the Community Standards and how to report violations.

Reporting violations

Ms Madzingira told the Daily Graphic on the sidelines of the workshop that, unfortunately, only a few people in Africa were reporting violations. “Some content is incredibly black and white and very easy to take a decision to remove or not, while others are difficult,” she said.

She said content on adult nudity and pornography that clearly fell into the categorisation of violation was easy to remove, but there were others that were borderline and required content reviewers to resort to other tools and support to arrive at a decision.

She insisted that anyone could report content, since reporting was absolutely confidential, adding that “your uncle will not know that you reported his content if it violates, since reporting is confidential.”

The move is an indication of Facebook’s commitment and dedication to helping tackle false news and content that violates standards on the platform. Misinformation is a problem, and these are important steps in continuing to address this issue.

Third-party fact-checking alone may not be the solution, but it is one of many initiatives and programmes to help improve the quality of information people see on Facebook. The safer the platform is, the more openly people will be able to express themselves.

Writer's email: [email protected] 
