Musk Has Reduced Twitter’s Ability to Spot Foreign Disinformation, a Former Data Scientist Says

It’s not clear how Twitter can filter out foreign disinformation now that CEO Elon Musk has gutted the teams meant to prevent a repeat of Russia’s effort to sway the 2016 presidential election, says one former senior Twitter data scientist.

In the wake of that election, the company took various measures to ward off threats to itself and its users. Perhaps most importantly, executives established a Trust and Safety Council to advise them on issues like harassment, stalking, doxxing, child abuse, self-harm, and foreign election interference.

Melissa Ingle—who was until November a senior data scientist at Twitter—worked closely with the council as she wrote and monitored machine-learning algorithms to detect political misinformation on Twitter.

“We tended to pay closer attention to a country when the election was approaching. For example, elections in Brazil, the US midterms, etc.,” Ingle said in an interview.

By 2022, she said, foreign-political-influence detection at Twitter was “a three-legged stool,” each part reinforcing the entire structure. The first was the Trust and Safety Council and the researchers and culture and language experts who helped them understand the nuances of particular countries’ influence operations.

“I’m not an expert on politics in ANY country, and these people would help us understand what to look for,” she wrote.

Ingle and other data scientists made up the second leg. She described her work as writing artificial-intelligence algorithms using natural language processing to flag possible influence efforts.

“The algorithms are necessary because of the volume of tweets – there were approx 30M tweets per hour. No human workforce could cover this,” she wrote.
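Twitter’s actual models are not public, but the kind of pipeline Ingle describes, automatically scoring tweets against terms linked to known influence campaigns so human reviewers only see a small flagged subset, can be illustrated with a toy sketch. The watchlist terms and weights below are invented for the example; a production system would learn them from labeled data rather than hard-code them.

```python
# Toy sketch of a term-weighted tweet flagger. The terms and weights are
# invented examples, not Twitter's real signals.
WATCHLIST = {"rigged": 2.0, "stolen election": 3.0, "deep state": 2.5}

def score_tweet(text: str) -> float:
    """Return a crude influence-likelihood score for one tweet."""
    lowered = text.lower()
    return sum(weight for term, weight in WATCHLIST.items() if term in lowered)

def flag_batch(tweets, threshold=2.5):
    """Keep only tweets whose score crosses the human-review threshold."""
    return [t for t in tweets if score_tweet(t) >= threshold]

tweets = [
    "Lovely weather in Austin today",
    "The stolen election proves the deep state is real",
]
flagged = flag_batch(tweets)  # only the second tweet crosses the threshold
```

At roughly 30 million tweets an hour, the point of a filter like this is triage: the model cheaply discards the overwhelming majority of traffic so that human moderators, the third leg of the stool, review only what scores high.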

The final leg was human moderators, who applied their judgment to internal and external reports of disinformation.

“You need all three groups,” she said, and by the 2022 US midterms, they were all there.

Musk has since attacked each of those legs.

She says that Musk laid off all three data scientists working on algorithms related to disinformation, and that about 60% of the 550 employees working on trust and safety issues are gone as well. Musk also disbanded the volunteer Trust and Safety Council (about 100 members) and cut 80% of human reviewers. “I’m not clear on how he intends to combat misinfo with the current staff,” she said.

Ingle said that even though the models she created at Twitter still exist, they need constant updating to remain accurate and relevant.

“As the nature of political discourse changes, a data scientist needs to be there to update the model. A machine-learning algorithm would not know to look for a term that is new, and old terms fall out of use. That’s why it’s important to have people to write and update these things,” she said.
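Her point about vocabulary drift can be shown with the same kind of toy flagger used above: a model built before a new slogan emerges scores it zero until a human updates the term list. The terms here are placeholders invented for illustration.

```python
# Toy illustration of vocabulary drift; the terms are invented placeholders.
watchlist = {"old slogan": 2.0}

def score(text: str, terms: dict) -> float:
    lowered = text.lower()
    return sum(w for t, w in terms.items() if t in lowered)

new_tweet = "share the new slogan everywhere"
before = score(new_tweet, watchlist)   # the model is blind to the new term

# A data scientist adds the emerging term and retires the stale one.
watchlist["new slogan"] = 2.5
watchlist.pop("old slogan")
after = score(new_tweet, watchlist)    # now the tweet is detectable
```

Without someone on staff to make that second step, the model keeps running but steadily stops matching the discourse it was built to monitor.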

But that’s not all that Musk is doing. He’s also taken a swipe at Twitter’s work with the US government to track foreign influence operations. Since 2017, the FBI’s Foreign Influence Task Force, or FITF, has worked with Twitter and other social media companies to “identify and counteract malign foreign influence operations targeting the United States.” Musk has leaked internal Twitter communications in a bid to paint information sharing between the FBI and Twitter as nefarious (without actually exposing anything illegal or even, arguably, surprising).

So Twitter’s ability to spot or respond to foreign disinformation is now severely compromised. But China, Russia, Iran, and other state actors have not abandoned their efforts to sway US public opinion on social media. On Monday, cybersecurity company Mandiant released a new report showing that actors connected to the governments of Russia, China, and Iran continued their efforts to influence the US population. But they’ve adapted their tactics, with a new focus on undermining trust in elections generally.

Twitter did not respond to a request for comment.

John Hultquist, head of intelligence analysis at Mandiant, told Defense One, “Operations during the midterms have never been as strong as those seen during presidential elections, but we did see efforts by all the usual suspects this year. It’s interesting that some of these actors were more interested in suggesting they’d had effects than actually trying to affect election outcomes.”

Said Hultquist of foreign disinformation efforts generally: “We expect more will emerge.”