France on alert as foreign disinformation targets 2026 local elections
Deepfakes, fake local news sites and AI-generated scandals are no longer fringe threats for Paris, but strategic concerns.
As France heads toward key municipal elections in 2026, the government has quietly rolled out a new long-term doctrine: treating foreign disinformation as a genuine security threat, on a par with more traditional forms of interference.
French authorities say the country is facing a sustained campaign of information manipulation, driven mainly by pro-Russian networks and amplified by cheap, accessible artificial intelligence tools.
A sprawling influence operation dubbed Storm‑1516 has already set up around 150 fake French-language “local news” sites, mimicking regional outlets such as “Flash Bourgogne-Franche-Comté” or “Île-de-France Actu”. Since early 2025, these lookalike sites have pumped out nearly 14,000 biased articles.
The content plays on highly sensitive themes: rising prices, local unemployment, anger over the war in Ukraine and climate frustration. The aim is not just to mislead, but to polarise voters and erode trust in institutions ahead of the March 2026 municipal vote.
Watchdogs like Afnic, which manages France’s .fr domain, and Reporters Without Borders have flagged a wave of deepfakes and AI “hyper-edits”. These include fake speeches, doctored photos and lurid headlines designed to go viral on social media and messaging apps.
Foreign campaigns are trying to convince French voters that nothing and no one can be trusted – not the media, not the courts, not the ballot box.
According to French officials, a Kremlin-linked influence effort known as “Matrioshka” now pushes roughly 15 recurring anti-French narratives every month, targeting everything from defence policy to sanctions on Russia.
Faced with this landscape, the government has released a national strategy against foreign information manipulation. It is built around four pillars: citizen resilience, regulation of platforms and AI, detection and response to foreign interference, and an international front with allies.
First pillar: turning citizens into tougher targets
The first part of the strategy focuses less on technology and more on people. Paris wants to make every citizen harder to fool.
A new “Academy for the fight against information manipulation” will be set up under the office of the Secretary-General for Defence and National Security (SGDSN), the body coordinating French security policy.
The Academy’s mission is not to tell people what to think, but to sharpen their critical skills so they can spot manipulation when they see it. Its planned activities include:
- Special training resources for elected officials and local representatives
- Modules integrated into school curricula through the Education Ministry
- Dedicated training for teachers and school staff
- Support for university research on disinformation and influence operations
- Simple, public-facing tools to help citizens check what they read and share
Officials often describe this as a kind of “cognitive vaccination”. Once you understand how viral falsehoods work, you are less likely to share them blindly.
Most disinformation campaigns follow a similar pattern: grab attention with anger, fear or outrage, then steer people toward a specific political or social reaction.
A persuasive fake story hooks your emotions first – the facts, or lack of them, only come later.
The strategy also backs the creation of full academic tracks dedicated to information manipulation, mixing political science, psychology, data science and media studies. The aim is to build a new generation of specialists who can track and explain influence operations over the long term.
Second pillar: reining in platforms and generative AI
The second pillar targets the digital infrastructure through which disinformation spreads: social networks, search engines and AI providers.
French officials argue that algorithms act as “megaphones”: they amplify content that drives reactions, clicks and comments. Since manipulative content is designed to spark emotional responses, it naturally rides these systems straight to the top of people’s feeds.
On top of that, generative AI can now create convincing text, audio and video in minutes. A fake press conference, a bogus leak or a fabricated sex scandal can be produced almost instantly and pushed across multiple platforms.
Within the European Union, France plans to use the Digital Services Act as a base and push for tighter obligations on large platforms, especially during elections and major crises. Tougher penalties for non-compliance are on the table.
Paris is also reinforcing an interdisciplinary operational team that watches how AI and platform dynamics affect public debate. This group blends technical experts, behavioural scientists, diplomats and legal specialists. Its remit covers:
- Monitoring viral falsehoods and deepfakes in real time
- Assessing the potential impact on voter behaviour and public order
- Flagging coordinated inauthentic activity (fake accounts, botnets, content farms)
- Working with regulators to demand swift action from platforms
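To make "coordinated inauthentic activity" concrete: one common signal analysts look for is distinct accounts repeatedly publishing identical content within seconds of each other. The sketch below is an illustrative heuristic, not any official French tooling; the account names, thresholds and data shape are invented for the example.

```python
from collections import defaultdict
from itertools import combinations

def flag_coordinated_accounts(posts, window_seconds=30, min_shared=3):
    """Flag account pairs that repeatedly post identical content
    within a short time window -- a common sign of automation.

    posts: list of (account_id, unix_timestamp, content_hash) tuples.
    Returns the set of account pairs that share at least `min_shared`
    near-simultaneous identical posts.
    """
    by_content = defaultdict(list)
    for account, ts, content in posts:
        by_content[content].append((account, ts))

    pair_hits = defaultdict(int)
    for hits in by_content.values():
        for (a1, t1), (a2, t2) in combinations(hits, 2):
            if a1 != a2 and abs(t1 - t2) <= window_seconds:
                pair_hits[tuple(sorted((a1, a2)))] += 1

    return {pair for pair, n in pair_hits.items() if n >= min_shared}

# Toy data: two accounts echoing each other, plus one organic account
posts = [
    ("bot_a", 100, "msg1"), ("bot_b", 105, "msg1"),
    ("bot_a", 200, "msg2"), ("bot_b", 202, "msg2"),
    ("bot_a", 300, "msg3"), ("bot_b", 301, "msg3"),
    ("user_c", 5000, "msg4"),
]
print(flag_coordinated_accounts(posts))  # {('bot_a', 'bot_b')}
```

Real detection systems combine many such signals (shared links, creation dates, device fingerprints), but the timing-correlation idea above is one of the simplest and most widely used.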
Another sensitive topic is money. Disinformation at scale costs something: content production, website hosting, advertising campaigns, even paid influencers. The strategy explicitly calls for tracing and, where possible, disrupting the financial pipelines behind foreign manipulation campaigns.
Third pillar: hunting foreign interference in the open
The third pillar is about detection and deterrence. Under the SGDSN and the Foreign Ministry, France’s Operational Committee for the fight against information manipulation (COLMI) is being upgraded into a more powerful coordination hub.
This revamped platform will pull together several levers:
- Technical tools for tracking online operations
- Diplomatic measures such as formal protests or coordinated statements
- Judicial responses including prosecutions and takedowns
- Economic sanctions or restrictions targeting individuals or entities
- Public communication campaigns to expose manipulation attempts
The idea is straightforward: once a foreign interference campaign is identified, the response will not stop at a press release. It may include court cases, asset freezes or diplomatic retaliation.
Paris wants hostile actors to understand that meddling in the French information space carries a clear and measurable cost.
This approach is often described as “information deterrence”: protect public trust, impose consequences on aggressors and reduce the odds they try again.
The plan also calls for training regional courts and prosecutors so they can handle cases involving digital manipulation and cross-border evidence. Local officials are often the first targets, especially during municipal elections, so justice at the territorial level needs to catch up quickly.
France also wants a strong domestic capability in OSINT – open-source intelligence. OSINT analysts study publicly available material: social media posts, satellite imagery, domain records, forums, news archives and more.
In practice, an OSINT team might compare domain registration data, track content timing across platforms, match imagery to real locations and link accounts across networks to reveal a coordinated operation.
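One such cross-referencing step can be sketched in a few lines: grouping suspect domains by a shared infrastructure "fingerprint" (registrar, name server, creation date). All domain names and registration fields below are invented for illustration; a real analysis would pull this data from WHOIS/RDAP records.

```python
from collections import defaultdict

def cluster_domains(records):
    """Group domains that share an infrastructure fingerprint.

    Several supposedly independent local-news sites sharing one
    registrar + name server + creation date suggests a single operator.

    records: list of dicts with keys: domain, registrar, nameserver, created.
    Returns only fingerprints shared by more than one domain.
    """
    clusters = defaultdict(list)
    for r in records:
        fingerprint = (r["registrar"], r["nameserver"], r["created"])
        clusters[fingerprint].append(r["domain"])
    return {fp: doms for fp, doms in clusters.items() if len(doms) > 1}

# Invented example records (not real registration data)
records = [
    {"domain": "flash-bourgogne.example", "registrar": "RegX",
     "nameserver": "ns1.bulk-host.example", "created": "2025-01-12"},
    {"domain": "idf-actu.example", "registrar": "RegX",
     "nameserver": "ns1.bulk-host.example", "created": "2025-01-12"},
    {"domain": "mairie-info.example", "registrar": "RegY",
     "nameserver": "ns.other.example", "created": "2023-06-01"},
]
for fp, doms in cluster_domains(records).items():
    print(fp, "->", sorted(doms))
```

Here the first two "local news" sites would surface as a cluster, while the unrelated third domain stays out, which is exactly the kind of lead an OSINT analyst would then investigate by hand.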
Key elements of France’s OSINT approach
| Focus area | Goal |
|---|---|
| Social media analysis | Detect bot activity, fake personas and coordinated posting patterns |
| Website forensics | Identify networks of fake news sites and their technical backers |
| Geolocation & imagery checks | Verify photos and videos used to stir outrage or fear |
| Financial traces | Follow the money behind advertising and influence contracts |
Fourth pillar: building an international shield
Foreign manipulation campaigns rarely stop at one border. Content seeded in one language is quickly translated, repackaged and reused elsewhere. France’s fourth pillar is about working with partners at every level.
Within the EU, Paris is pushing for a community of practice on disinformation. This would involve regular sharing of techniques, joint monitoring during elections and coordinated pressure on platforms.
At G7 level, France supports closer alignment of national measures, building on the “Code of Practice on Disinformation” and newer initiatives on information integrity. The goal is to avoid gaps that foreign actors can exploit by shifting operations from one jurisdiction to another.
Inside NATO, France argues that information manipulation should be treated explicitly as part of the Alliance’s security doctrine. Hostile influence operations can soften up public opinion long before any conventional crisis, making a collective response harder.
At the United Nations, Paris takes part in talks around a “Global Digital Compact”, which aims to set broad principles for a safer, more trustworthy digital environment worldwide.
French officials see the information space as a new shared domain, like air or sea, where rules and cooperation still lag far behind capabilities.
What this means for ordinary users
While the strategy is written in technocratic language, it lands very close to home for anyone scrolling on their phone.
French authorities warn that spectacular, highly emotional stories around local candidates are a key vulnerability, especially those spread via anonymous channels. Some recent examples include fake allegations about candidates secretly living abroad or fabricated pornographic profiles using their names and photos.
Officials advise a simple reflex: when a piece of content pushes you to share instantly, ask who benefits if you believe it.
- Check the source: is the site or account known and traceable?
- Look for other coverage: do established outlets report the same claim?
- Beware of urgency: “share fast before it is deleted” is a common hook.
- Watch for emotional overload: extreme anger or disgust is often a warning sign.
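The four checks above can be sketched as a simple scoring heuristic. The field names and the two-flag threshold are illustrative assumptions, not an official tool; the point is only to show how the checklist combines into a verdict.

```python
def risk_score(post):
    """Score a piece of content against the four red flags.

    post: dict of booleans mirroring the checklist; missing keys
    default to the safer interpretation being absent.
    Returns (score, verdict); a higher score warrants more caution.
    """
    flags = {
        "unknown_source": not post.get("source_traceable", False),
        "no_corroboration": not post.get("other_coverage", False),
        "urgency_hook": post.get("urges_instant_share", False),
        "emotional_overload": post.get("extreme_emotion", False),
    }
    score = sum(flags.values())
    verdict = "verify before sharing" if score >= 2 else "lower risk"
    return score, verdict

# A post that trips all four flags
example = {
    "source_traceable": False,
    "other_coverage": False,
    "urges_instant_share": True,
    "extreme_emotion": True,
}
print(risk_score(example))  # (4, 'verify before sharing')
```

No single flag proves a story is false; it is the accumulation of red flags, exactly as in the official advice, that should trigger a pause before sharing.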
From the government’s perspective, every citizen who pauses to verify before sharing is part of the national defence effort. Trust in democratic processes, they argue, is built gesture by gesture.
Why “information manipulation” is treated like a weapon
In security circles, the phrase “manipulation of information of foreign origin” covers a broad spectrum of activities. It is not just fake news or rumours. It also includes selective leaks, out-of-context footage, impersonation of local media and abuse of recommendation algorithms.
The point is not to criminalise legitimate criticism or satire. What worries officials is coordinated behaviour by foreign actors whose goal is to weaken another state’s ability to make decisions, run elections or maintain social cohesion.
Imagine a hypothetical scenario in a small French town ahead of the 2026 vote. A fake local news site publishes a story claiming the mayoral candidate plans to close a major factory. A deepfake video appears on social media “confirming” the story. Anonymous accounts buy targeted ads so that every worker in that factory receives the clip on their phone. Local anger rises, threats follow, the candidate suspends public meetings. Even if the story is debunked days later, the damage to trust is done.
This is the kind of chain reaction French planners want to anticipate and blunt. The four-pillar strategy is designed to catch such operations early, respond visibly and make them costlier for those who launch them.
For readers outside France, the shift is also a sign of where many democracies are heading: towards treating disinformation less as an online nuisance and more as a long-term strategic challenge, where education, technology, law and diplomacy need to move in step.
