The tallies were based on the number of "coordinated inauthentic behavior" networks removed by Facebook, a term it uses for a type of influence operation that relies on fake accounts to mislead users and manipulate public debate for strategic ends.
Facebook began cracking down on these influence operations after 2016, when US intelligence concluded that Russia used the platform as part of a cyber-influence campaign that aimed to help former President Donald Trump win the White House, a claim Moscow has denied.
The company said Russia, followed by Iran, topped the list of sources of coordinated inauthentic behavior, and that most of this activity constituted foreign interference rather than domestic campaigns. Top targets of foreign operations included Ukraine, the United Kingdom, Libya and Sudan.
But the company also said that about half of the influence operations it has removed since 2017 around the world were conducted by domestic, not foreign, networks.
"IO [influence operations] really started out as an elite sport. We had a small group of nation states in particular that were using these techniques. But more and more we're seeing more people getting into the game," Nathaniel Gleicher, Facebook's head of security policy, told reporters on a conference call.
Facebook said the domestic influence operations that targeted the United States were operated by conspiratorial or fringe political actors, PR or consulting firms and media websites.
Myanmar was the country targeted by the most domestic inauthentic networks, according to Facebook's count, though these networks were relatively small.
Gleicher said threat actors had pivoted from large, high-volume campaigns to smaller and more targeted ones, and that the platform was also seeing a rise in commercial influence operations.
"I actually think the majority of what we're seeing here, these aren't actors that are motivated by politics. In terms of volume, a lot of this is actors that are motivated by money," he said. "They're scammers, they're fraudsters, they're PR or marketing firms that are looking to make a business around deception."
Facebook investigators also said they expected it would get harder to discern what was part of a deceptive influence campaign as threat actors increasingly use "witting and unwitting people to blur the lines between authentic domestic discourse and manipulation."
The report included more than 150 coordinated inauthentic networks identified and removed by Facebook since 2017.