In the first quarter of 2025, Meta identified and dismantled three covert influence operations originating from Iran, China, and Romania. These campaigns aimed to manipulate public opinion in Romania, in Azerbaijan and Turkey, and in Myanmar, Taiwan, and Japan by disseminating misleading content through fake personas across multiple social media platforms.
Romanian Influence Operation
A network comprising 658 Facebook accounts, 14 Pages, and two Instagram accounts targeted Romanian audiences. The operators used fictitious profiles to manage Pages, share content, and comment on posts by politicians and news outlets. These fake accounts posed as local residents, posting about sports, travel, and local news to appear authentic. Despite these efforts, most of the comments drew no engagement from genuine users. The campaign also maintained a presence on TikTok, X (formerly Twitter), and YouTube to bolster its credibility. Meta noted that the operators employed sophisticated operational security measures, such as routing traffic through proxy IP addresses, to conceal their identities. The content focused primarily on Romanian news and current events, including election-related topics.
Azerbaijan and Turkey Influence Operation
Another operation, originating from Iran, targeted Azeri-speaking audiences in Azerbaijan and Turkey. This network included 17 Facebook accounts, 22 Pages, and 21 Instagram accounts. The operators created fake profiles, often posing as female journalists and pro-Palestine activists, to post content, manage Pages, and comment in Groups in an effort to artificially inflate the network's popularity. They latched onto popular hashtags such as #palestine, #gaza, #starbucks, and #instagram to insert themselves into existing public discussions. The content, posted in Azeri, covered topics such as the Paris Olympics, Israel's 2024 attacks, boycotts of American brands, and criticism of the U.S., President Biden, and Israel's actions in Gaza. Meta attributed the activity to Storm-2035, a known threat actor group that Microsoft identified in August 2024 as targeting U.S. voter groups with polarizing messaging on the presidential candidates, LGBTQ rights, and the Israel-Hamas conflict. OpenAI has also reported that Storm-2035 attempted to misuse ChatGPT to generate content for dissemination on social media.
Myanmar, Taiwan, and Japan Influence Operation
The third operation, originating from China, targeted audiences in Myanmar, Taiwan, and Japan. Meta removed 157 Facebook accounts, 19 Pages, one Group, and 17 Instagram accounts associated with this campaign. The operators used artificial intelligence to create profile photos and ran an account farm to spin up new fake accounts. The campaign consisted of three clusters, one focused on each target country, which reposted content in English, Burmese, Mandarin, and Japanese about news and current events in those countries. In Myanmar, the content advocated ending the ongoing conflict, criticized civil resistance movements, and supported the military junta. In Japan, the campaign criticized the Japanese government and its policies. In Taiwan, the content sought to undermine the government and promote pro-China narratives.
Meta’s Response and Broader Implications
Meta’s proactive detection and removal of these networks underscore the persistent challenge posed by state-sponsored disinformation campaigns. By leveraging fake personas and AI-generated content, such operations seek to manipulate public opinion and disrupt democratic processes. Meta’s actions highlight the need for continued vigilance and collaboration among tech companies, governments, and civil society to curb the spread of misinformation and protect the integrity of information ecosystems.