A new analysis from Global Witness reveals that Facebook (NASDAQ:META) failed to detect blatant election-related misinformation in advertisements ahead of Brazil’s 2022 election, continuing an “alarming” pattern of missing content that breaches its policies. The ads contained false information about the upcoming election, such as stating the wrong election date, describing incorrect voting methods, and calling the election’s integrity into question.
This is the fourth time the London-based NGO has tested Meta’s (NASDAQ:META) capacity to catch egregious rule violations on the most popular social media site, and the fourth time Facebook has failed. In three previous tests, Global Witness submitted advertisements containing violent hate speech to see whether Facebook’s controls, whether human reviewers or artificial intelligence, would catch them. They did not.
Jon Lloyd, senior advisor at Global Witness, said Facebook (NASDAQ:META) has designated Brazil as one of its priority countries, where it is committing extra resources to combat election-related disinformation, and that the group therefore wanted to thoroughly test Meta’s systems with enough time for the company to respond.
Lloyd added that with the U.S. midterm elections approaching, Meta (NASDAQ:META) must get this right immediately.
Brazil will hold general elections on October 2 amid heightened tensions and a wave of disinformation that threatens to taint the democratic process. Facebook (NASDAQ:META) is the country’s most popular social media network. In a statement, Meta said it has “extensively prepared for the 2022 election in Brazil.” The company added that it has launched tools that promote reliable information and label election-related posts, created a direct channel through which the Superior Electoral Court can report potentially harmful content for review, and continues to work closely with Brazilian authorities and researchers.
Facebook Advertisement
In 2020, as it does in the United States, Facebook (NASDAQ:META) began requiring advertisers who wish to run political or election-related ads in Brazil to complete an authorization process and include “Paid for by” disclaimers on those ads. The added safeguards follow the 2016 U.S. presidential election, during which Russia used rubles to pay for political advertisements designed to sow discord among Americans.
Global Witness stated that it violated these guidelines when it submitted the test advertisements (which were approved for publication but never actually published). The fact that the ads were submitted from outside Brazil, from Nairobi and London, should have raised red flags.
In addition, the group was not required to include a “Paid for by” disclaimer on the ads, and it did not use a Brazilian payment method, both of which are safeguards Facebook says it has in place to prevent bad actors from misusing its platform to interfere in elections around the world.
Lloyd said: “What’s quite clear from the results of this investigation and others is that their content moderation capabilities and the integrity systems that they deploy in order to mitigate some of the risks during election periods, it’s just not working.”
Featured Image: Megapixl @Colour59