Tackling Coordinated Inauthentic Behaviour

Keeping our platforms safe is an ongoing task. We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. We’re making significant progress and are committed to staying ahead of this.

We define coordinated inauthentic behaviour (CIB) as coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behaviour in the context of domestic, non-government campaigns and 2) coordinated inauthentic behaviour on behalf of a foreign or government actor.

  • Coordinated Inauthentic Behaviour (CIB): When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

  • Foreign or Government Interference (FGI): If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures, including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

  • Continuous Enforcement: We monitor for efforts to re-establish a presence on Facebook by networks we previously removed. Using both automated and manual detection, we continuously remove accounts and Pages connected to networks we took down in the past.

Since 2017, our security teams at Facebook have identified and removed over 150 covert influence operations for violating our policy against CIB.

We’re also investing heavily in security so we can find and address these kinds of threats.

  • We’ve improved our AI so that we can more effectively detect and block fake accounts, which are the source of a lot of the inauthentic activity.

  • We’ve more than tripled the number of people who work on security and safety issues overall to more than 35,000, including security experts, AI and machine learning engineers, and content reviewers.

These teams are constantly working to identify threats — identifying patterns, researching threat actors, designing new detection methods, and looking for any small mistakes from those who want to inflict harm.

In countering covert influence operations, we focus on behaviour, not content, because that is the most effective way to stop the abuse; our investigative work and enforcement are therefore location- and content-agnostic. Our teams actively look for the elements common to every information operation:

  • Coordination among accounts, among Pages, or among offline groups;

  • Manipulation or deception; and

  • A strategic goal to influence public discourse.

Over the past four years, we’ve shared our findings about the CIB networks we detect and remove from our platforms. As part of our regular CIB reports, we share information about the networks we take down over the course of a month, making it easier for people to see the progress we’re making in one place.