The upcoming elections are already dominating discussions on social media. There are fears, however, that disinformation could have a negative impact on the integrity of the election process. Pic: frayintermedia
With South Africa’s elections approaching, issues surrounding voting processes and campaigns are already dominating Facebook conversations across the country. According to Jocelyne Muhutu-Remy, who manages Facebook's strategic media partnerships in Africa, topics like education, security, the economy, labour issues and corruption are at the heart of many online discussions as South Africans prepare to head to the polls on May 8.
Facebook users aged between 18 and 34 are involved in 60% of the election interactions seen on the platform. According to a platform analysis of online discussions between January and March 2019, education is the hottest topic – with up to 1.6 million conversations about the issue on some of the days monitored. While women made up 52% of Facebook users talking about the election online, men were slightly more engaged with election content, accounting for 57% of all online interactions.
The youth is driving the elections conversation on Facebook. Source: Facebook Journalism Project
According to Muhutu-Remy, this means women and the younger generations are not as disengaged and disenfranchised as many would like to believe, but whether their online engagement will have a real-world impact remains to be seen. According to the IEC there has been a surge in youth registration, but millions of young people who are eligible to vote have still not registered.
Women make up 55% of registered voters on South Africa's voters' roll. Pic: www.iec.org (Screenshot)
Online interactions, however, often do have a real-world impact. For this reason, Facebook wants to form stronger partnerships with South African news organisations to ensure that the platform is not used to compromise the integrity of the upcoming elections. A Facebook Journalism Project training workshop was held on April 3, 2019 at the Hyatt Regency Hotel in Rosebank, Johannesburg.
The workshop focused on tools and best practices for journalists preparing to cover South Africa's upcoming general elections. Facebook's head of news partnerships Nick Wrenn said Africa is important for the company, and they plan to spend more time engaging with journalists and news organisations in the region.
“Election integrity is important globally. We need to make sure that we are working with as many different industry stakeholders, especially in countries where there are important elections. This is to make sure that our platform isn't abused or misused by people who want to disturb a democratic election process.”
Facebook has been focusing on removing content that poses harm or jeopardises safety, both online and in the real world, as well as reducing misinformation and educating users about the information they see in their news feeds.
The moderation process entails taking down posts that contain hate speech, especially when the content directly attacks or incites harm to others on the basis of race, serious disability or disease, ethnicity, religion, national origin, sexual orientation or gender.
Facebook's algorithms push content from friends and family to the top of the news feed. News makes up only 4% of what appears in users' timelines. Source: Facebook Journalism Project
Wrenn admits that challenges remain, despite Facebook expanding its moderation team to 30 000 members and trying to include members from each country and region in which the platform is represented. When platform content is flagged, it is escalated to a team member who handles the complaint. Many users and advocacy groups, however, say this is not enough, and that content posted in non-European languages is often not properly assessed.
“I think it's fair to say there have been times when we have been chasing events and reacting to them rather than being proactive,” Wrenn acknowledged. “We are doing a lot more work now to try to get ahead of the risk before anything happens, but we are not perfect and we are not going to catch everything.”
Facebook uses artificial intelligence technology to detect fake accounts and uncover coordinated abusive behaviour, disrupting bad actors by taking down fake accounts and removing posts that violate community standards. Wrenn says this means content is often removed before it is even reported by other users.
“Through a combination of machine learning and improved human moderation, including having people speaking languages and knowing local cultural nuances, we are able to get ahead of the problems. But clearly we still have problems that need to be addressed.”
While posts containing misinformation are not removed, they do not appear as high in users' news feeds. Once a post has been fact-checked and flagged as false news, its relevance is lowered; a false rating reduces the reach of false news by about 80%. Posts from fact-checkers debunking a false post, on the other hand, are given more relevance in news feeds.
By prioritising fact-checking and lowering the relevance of falsehoods online, Facebook hopes it can help lessen the impact of disinformation going forward.