We’ve made significant investments to help reduce the spread of misinformation and connect people to reliable information across our platforms.
We remove content that violates our Community Standards, which helps to protect people’s safety and security on our platforms. This includes removing harmful misinformation that could lead to imminent violence or physical harm, such as misinformation about COVID-19 and vaccines. We also remove misinformation that could prevent people from voting, like misrepresenting the dates, locations, times, and methods for voting.
For false claims that don’t violate our Community Standards, we rely on our global network of more than 80 independent fact-checking partners, working in over 60 languages, to identify, review, and rate viral misinformation across our platforms. When a fact-checker rates a piece of content as false, we significantly reduce its distribution so that fewer people see it. We notify people who try to share the content, or who previously shared it, that the information is false, and we apply a warning label that links to the fact-checker’s article disproving the claim.
Our efforts to fight misinformation and connect Canadians with credible information include:
- Expanding Fact-Checking Partnerships in Canada: Our global fact-checking program is a key piece of our strategy to reduce the spread of misinformation on Facebook. In the lead-up to the last Canadian election, Agence France-Presse signed on as our fact-checking partner in Canada.
- Since the last federal election, we’ve added a new Canadian partner, Radio-Canada’s Les Décrypteurs, in February 2020. We also added two new ratings in August 2020 to give our fact-checking partners more latitude to better reflect their research.
- Digital Literacy: In the lead-up to the 2019 federal election, we partnered with MediaSmarts on a bilingual advertising campaign to help people make more informed decisions about what to read, trust, and share. We have continued our longstanding partnership with MediaSmarts and will relaunch a similar digital literacy campaign during the next federal election.
- Content Transparency: In September 2018, we launched a Context Button to provide people with more background information about the publishers and articles they see in News Feed so they can decide what to read, trust and share. And in June 2020, we started to roll out a notification screen that lets people know when news articles they are about to share are older than 90 days.
- Protecting News-related Page Categories: Designation as a News Page can signal an inherent level of credibility to people, and we want to ensure that only Pages that primarily publish news are identifiable as such on Facebook. Starting in Canada, we limited the ability of any Page to self-select a category related to news. Now, in order to have a news-related designation, Pages must be registered in the news Page index.
- Reducing Political Content in News Feed: One common piece of feedback we hear from Canadians is that they don’t want political content to take over their News Feed. Based on this feedback, in February 2021 we shared that we were starting to test, in Canada, a variety of ways to rank political content in people’s feeds using different signals, before deciding on the approaches we’ll use going forward.
- Detection and Removal of Fake Accounts with Artificial Intelligence: We block millions of fake accounts every day so they can’t spread misinformation. As of Q1 2021, we had taken action against 1.3 billion fake accounts. We also reported in our Q1 2021 Community Standards Enforcement Report that our proactive detection rate was 99.8%, meaning we found and flagged 99.8% of violating accounts before they were reported to us.