Canadians will see less political content in their Facebook feeds than they did in past elections, and the social media giant will not hesitate to take down posts that promote misinformation about subjects such as the election or COVID-19, says the head of the company’s Canadian operations.
Kevin Chan said even posts by political parties or candidates could be removed if they violate Facebook’s rules. However, he said the company will also take into account the right to free speech.
“Politicians are not exempt from enforcement against breaches of our harmful misinformation policy,” Chan said in an interview with CBC News. “So if people, anybody on Facebook is saying things about the coronavirus that can lead to physical harm, we will remove it.”
For example, posts that claim that COVID-19 vaccines kill or seriously harm people will be removed, as will claims that wearing masks doesn’t prevent the spread of COVID-19.
Those upset with a decision by Facebook to remove a post can appeal to the independent oversight board set up by the company, Chan said.
It’s all part of Facebook’s Canadian Election Integrity Initiative being made public later today. The initiative consists of a seven-pillar plan that ranges from working with Elections Canada to provide accurate information about voting to combating any attempts by foreign actors to disrupt or influence the election.
Facebook will unveil details today of its plan to prevent bad actors from using its platform to influence the federal election or spread misinformation. (Paul Sakuma / The Associated Press)
Facebook’s plan includes a pilot project that began in February to reduce the number of political posts that automatically appear in the feeds of Canadian Facebook users.
Chan said Canadians told Facebook they wanted to see less politics in their feeds.
“One of the things we are doing, and we’ve been doing since February, is looking at ways in which we kind of reduce the distribution of political content on Facebook,” Chan said. “That’s not to say that we are removing political content. All that is still there, it is just a question of whenever people say that they want to see five or six things in a session, they have to be ranked some way.”
Chan said users can change their settings to see more political content if they wish or they can visit political pages. Users can also choose to see fewer political ads.
Chan said the measure will apply generally to all political posts, regardless of political party.
However, Aengus Bridgman, a researcher with McGill University’s Media Ecosystem Observatory, said live testing Facebook’s pilot project to reduce political content during an election campaign “has huge misuse potential.”
“It’s Facebook playing God a little bit here during an election,” Bridgman said. “They’re choosing to do this, and Canadians who rely on Facebook for their political information are going to be subject to that algorithm.”
Bridgman said Facebook should provide details about the algorithm it is using to reduce political content and allow academic experts to audit it.
He said Facebook and other social media companies have done a lot over the past five years to address the problem of their platforms being used by bad actors to influence elections or spread misinformation. However, he said, they still have a way to go.
“They have not been successful on really curbing this stuff on their platforms, and it continues to be an enormous problem.”
Facebook isn’t the only social media company taking steps to prevent its platform from being misused during the federal election campaign.
Cam Gordon, spokesman for Twitter, said the company will be monitoring four main categories of posts during the election: misleading information about how to participate in the election, voter suppression and intimidation, misleading information about outcomes, and false or misleading affiliation.
Researcher Aengus Bridgman with the Media Ecosystem Observatory in Montreal says social media companies are voluntarily acting to prevent misuse of their platforms in a bid to avoid government regulation. (Louis-Marie Philidor/CBC)
Molly Morgan, spokeswoman for Google and YouTube, said the company regularly removes content that violates its policies.
“During elections, our teams work around the clock to ensure our policies and systems are protecting the integrity of our platform, preventing any abuse of our systems and surfacing authoritative election-related information,” she wrote. “We remain vigilant and are committed to maintaining the important balance of openness and responsibility on Election Day and beyond.”
Bridgman said part of the reason social media companies such as Facebook have been voluntarily taking action is they are trying to prevent governments from moving to regulate them.
“The bottom line is that this is attempts by the platforms to stave off more concerted regulation and oversight of their moderation and censorship policies.”
Elizabeth Thompson can be reached at [email protected]