How allies of AI are ramping up their political donations for the midterms

Tech leaders, investors and AI evangelists are pouring tens of millions into political battles, trying to shape who writes the rules for a technology they say could redefine the economy – and, critics warn, democracy itself.

Ads attacking tech, funded by tech

The split-screen nature of this fight is on full display in New York City. A recent TV spot targets congressional candidate Alex Bores, a Democrat and former Palantir engineer, tying him to the company’s controversial contracts with US Immigration and Customs Enforcement (ICE).

Over shots of his LinkedIn profile, the narrator accuses him of helping power ICE operations that have enraged immigrant-rights advocates. “ICE is powered by Bores’ tech. Manhattan is smarter than that,” the ad declares.

What viewers are not told: the attack is bankrolled by a super PAC linked to Leading the Future (LTF), a new industry-aligned political committee created to protect artificial intelligence from what its founders see as a dangerous regulatory backlash.

Leading the Future was set up by major tech figures, including Palantir co-founder Joe Lonsdale, with more than $100 million already committed to promoting AI-friendly candidates.

Bores, now seen as one of the more aggressive AI regulators at state level, calls it rank hypocrisy. He says he left Palantir over its ICE work, sacrificing a lucrative career, while the same ecosystem that profited from those contracts is now funding attack ads against him.

Why AI giants are rushing into politics

The timing is not accidental. Congress and state legislatures are finally moving from talk to action on AI regulation, and public patience appears to be thinning.

Polling by Gallup shows eight in ten Americans want government to insist on AI safety and data rules, even if that slows development. Concerns range from mass layoffs to rising electricity bills as energy-hungry data centres proliferate.

Hamid Ekbia, who leads the Academic Alliance for AI Policy at Syracuse University, says firms like Palantir and OpenAI see the elections as a risk management exercise.

AI companies view heavy political spending as a pre-emptive hedge against hostile laws, especially given mounting anger over surveillance and immigration enforcement contracts.

Few companies illustrate the stakes as clearly as Palantir. The defence and data analytics firm has ridden the AI boom to a market valuation more than ten times higher than in 2022, pitching its software as a critical edge for governments and corporations.

Chief executive Alex Karp has framed AI as a geopolitical race. In his telling, either the US leads, or China does – and civil liberties hang in the balance. That binary framing is now bleeding into campaign finance.

The money map: who is giving, and to whom

Federal Election Commission filings show a dense web of donations from AI-aligned elites and their allies:

  • Leading the Future raised over $50 million in just the second half of 2025, including $25 million from OpenAI president Greg Brockman and his wife, and $25 million from venture capital powerhouse Andreessen Horowitz (a16z).
  • Palantir figures Peter Thiel and Alex Karp each wrote six‑figure cheques to committees aligned with Republican congressional leaders last year.
  • Karp gave $360,000 to Joe Biden and Kamala Harris’ joint fundraising operation in 2023, then $1 million to MAGA Inc. a year later, reflecting a pragmatic, cross‑party strategy to stay close to whoever holds power.

The goal for much of this money is not just to pick winners, but to shape the regulatory architecture itself. LTF and other industry voices are pushing hard for a single federal AI framework that would override tougher state laws.

Donald Trump has already signed an executive order seeking to block new state-level AI rules, and Texas Senator Ted Cruz has proposed legislation to reinforce that moratorium. LTF strategist Jesse Hunt argues that a patchwork of state regimes would choke innovation and hand an advantage to foreign competitors.

The counterweight: ‘Public First’ pushes back

The tech industry’s effort is not going unanswered. A new group called Public First, led by former Democratic congressman Brad Carson and former Republican congressman Chris Stewart, is trying to rally funding behind candidates who favour tighter guardrails.

Public First is seeking $50 million to back politicians who want “responsible tech policies” that reduce harm and guard against AI’s worst risks.

Carson says their mission is less anti-AI than pro-legitimacy: without credible safeguards, he predicts a public revolt that could slam the brakes on the entire sector. In his words, “people are already sharpening their pitchforks.”

Public First has highlighted candidates like Bores, who authored New York’s Responsible AI Safety and Education (RAISE) Act. Due to take effect in March, the law forces AI firms to adopt safety protocols designed to prevent “critical harm”, such as assisting in chemical or nuclear attacks or enabling serious crimes, and to disclose serious incidents like breaches or dangerous malfunctions.

Midterm battlegrounds: from Albany to Austin

The clash between pro-growth AI donors and regulation-focused groups is set to play out across several key states this cycle: New York, California, Texas, Illinois and Ohio are all on the target list for LTF and its opponents.

In New York, Governor Kathy Hochul has positioned herself as both tech‑friendly and firm on safeguards, warning in her State of the State address that the state will not let emerging technology “undermine our democracy.”

In Florida, Republican Governor Ron DeSantis has proposed a “Citizen Bill of Rights for AI”, aimed at bolstering privacy, limiting certain AI uses on national security grounds, and placing rules on data centre construction. He has mocked Silicon Valley utopianism, warning that deepfake music and videos will not magically bring prosperity.

At the same time, a loose coalition ranging from DeSantis and former Trump strategist Steve Bannon to independent Senator Bernie Sanders is resisting attempts to strip states of their power to write their own AI rules.

Palantir money becomes a campaign issue

Palantir’s status as a lightning rod is giving opponents a tangible target. In Illinois, Senate hopeful Raja Krishnamoorthi faced criticism on stage for accepting more than $29,000 in donations over the years from Palantir chief technology officer Shyam Sankar.

Krishnamoorthi says that once the connection was highlighted, he redirected the money to migrant-rights groups. Other Democrats have followed a similar path.

  • Raja Krishnamoorthi (Illinois): donated contributions from Palantir's chief technology officer to migrant-rights groups.
  • John Hickenlooper (Colorado): redirecting tens of thousands of dollars from Palantir employees to immigrant-rights organisations.
  • Jason Crow (Colorado): also giving away Palantir-linked donations following media scrutiny.

Campaigns are under growing pressure from activists and online trackers that publish lists of the top recipients of Palantir-linked money. That public scrutiny is turning relatively small cheques into potential liabilities on the trail.

Voters’ unease: jobs, bills and broken promises

For many candidates, AI is no longer a futuristic talking point but a doorstep issue. Reed Showalter, a Democratic antitrust lawyer running in Illinois’ 7th District, says the promised medical miracles have yet to arrive.

He argues residents are instead seeing higher electricity and water bills as large data centres strain local grids, alongside fears that automation will depress wages.

Chris Stewart, the former Republican congressman now with Public First, notes that in more than a decade in Congress, he rarely heard constituents bring up energy prices. Now, as AI data infrastructure expands, he expects candidates will be quizzed on it directly and will need clear policy answers ready.

Key terms voters will keep hearing

The AI money fight is littered with jargon that can easily obscure what’s actually at stake. A few phrases matter more than others:

  • Federal pre-emption: Tech firms want one national AI law that cancels out stricter state-level rules. Supporters say this simplifies compliance; opponents say it strips communities of control.
  • Safety protocols: These are the internal rules AI firms must follow to stop their systems being used for serious harm, such as helping design biological weapons or plan cyberattacks.
  • Critical infrastructure risk: As AI data centres suck up power and water, regulators worry about local grids, drought-prone areas and resilience during heatwaves.

Possible scenarios for the next few years

Several paths are now in play. If AI-aligned donors succeed in locking in a light-touch federal regime that pre‑empts state action, innovation may accelerate, but the political backlash Carson warns about could grow as people link job losses and higher bills to a technology they never voted for.

If, instead, groups like Public First help elect lawmakers who insist on strong safety and transparency rules, companies may face slower rollouts and higher costs. Yet that could buy public trust and make catastrophic misuse less likely, especially around security-sensitive applications.

A third, more chaotic outcome sits in between: a messy patchwork of state laws, lawsuits and executive orders where companies shop for the friendliest jurisdictions, while regulators struggle to keep pace. In that landscape, political money becomes an even sharper tool for shaping who writes, and rewrites, the rules.

For voters trying to make sense of the midterms, one practical step is to follow not just candidates’ rhetoric on AI, but their donor lists, bill sponsorships and positions on issues like data‑centre development or worker retraining. Those details reveal whether they side more with AI’s allies, its sceptics, or the shrinking number trying to sit somewhere in the middle.

Originally posted 2026-03-05 15:50:02.
