Preloaded Reality
What Sam Altman, a garbage dump, and a war nobody voted for tell us about the end of political accountability
I. Two Stories, One Pattern
This week, two different forms of unaccountable power revealed themselves in public at the same time.
In one, Ronan Farrow and Andrew Marantz published a massive 17,000-word New Yorker investigation based on internal documents never previously disclosed. The subject was Sam Altman, CEO of OpenAI — the company behind ChatGPT, which nearly a billion people around the world now use to seek information, interpret events, and understand the world.
The investigation paints a portrait of a leader untethered from the truth: a man whose own Chief Scientist feared his “finger on the button” and whose colleagues documented a pattern of lying that was as consistent as it was strategic. A man who was fired by his own board, set up what he called a “government in exile” in his $27 million San Francisco mansion, and was reinstated within five days — with the investigation into his conduct handed to a board he effectively selected himself.
In the other, the President of the United States spent days threatening catastrophe in language that was vulgar, theatrical, and casually apocalyptic — swearing, demonizing, talking about ending a civilization, broadcasting it all through a platform he controls in the middle of the night — only to pull back almost as suddenly, so that by Tuesday evening the country was being asked to treat the whole episode as just another passing storm.
Most people will treat these as separate stories.
They are not.
They are both about what happens when power no longer moves through visible chains of accountability — when institutions meant to slow, test, or constrain it are bypassed, captured, or simply too hollowed out to matter.
For people of my generation who built careers inside communications, politics, and media, that realization lands with particular force. Because many of us were trained in a system that was manipulative, imperfect, and often cynical — but still, at some basic level, legible.
I am one of those people. Or I was.
II. Bakersfield
I was twenty-three years old, driving a rented Chevy through California’s Central Valley with a script on the passenger seat and more swagger than I had any right to possess.
Bakersfield. My first political ad.
Not exactly glamorous work. It was a ballot measure fight over where to site a garbage dump. The client was a wealthy rancher who didn’t want the smell blowing past his property. The media firm back in Los Angeles sent the kid because nobody else wanted to make the drive — the partners were already fat and happy, one of them owned a Malibu beach house, and Bakersfield in summer is nobody’s idea of a good time. I had just come off a presidential campaign. Here I was, buying airtime over garbage. It was maybe a $2,000 buy.

I walked into the local TV station. Handed someone a sheet of paper with a script and a check. Discussed the buy — which programs, which nights, how many spots. Shook a hand. Drove home and told the client it was done.
That was it. That was the whole transaction.
I’ve thought about that afternoon a lot lately. Not because I’m nostalgic for it. Nobody was saving democracy in Bakersfield that day. It was mercenary work in a crappy town for a client with a very particular problem and enough money to try to solve it. I still have a copy of the direct mail piece from that campaign. It holds up as an artifact of its time — which is to say, you can tell exactly who made it, who paid for it, and what they wanted you to think.
That’s the point.
Even there — even in a garbage dump fight — the system was legible.
There was a client. There was a message. There was a human being at a television station who put his name on a contract. There was an audience that would see the ad at a known time on a known channel. And somewhere in the fine print, there was a disclosure: paid for by, authorized by, accountable to.
Every link in that chain had a name attached to it.
That doesn’t mean it was noble. It means it was traceable. And that distinction matters more than ever.
For most of my career, I watched that chain weaken. Then the speed changed. What had been eroding over years began collapsing in plain sight — accelerated by technology, by the media environment, and by a political world that treated institutions as inefficiencies to be stripped for parts.
This week made clear just how far gone it is.
Because what we are living through now is not just a faster media cycle or a more polluted information environment. It is the collapse of procedural time itself.
What used to happen in sequence — allegation, investigation, hearing, disclosure, decision, consequence — now happens as a blur: post, reaction, narrative capture, elite alignment, reset. Move on.
That is true in politics. It is true in media. It is true in war. It is true in the stock market. And it is increasingly true in AI.
Which brings us to the part of this story that voters deserve to understand — and that most political professionals would prefer they didn't.
III. Preloaded Reality
Digital marketers have spent the last year racing to master something called answer engine optimization — AEO.
It did not begin as a political concept. It emerged as a commercial one: how do you shape the answer an AI system gives when someone asks about your product, your reputation, your brand?
Search engine optimization was about getting your website to rank on Google. AEO is something more consequential: shaping content so that AI systems absorb your framing and reproduce it when a user asks a question.
Reddit threads. YouTube videos. News stories. Blog posts. Podcasts. Forum chatter.
All of it feeds the machine.
Shape the content upstream and you shape the answer downstream.
Marketers are now optimizing specifically for the “Perplexity Citation” — the coveted footnote that proves an AI has chosen your data as its foundation. It is the AEO equivalent of the “SEO #1 Spot.”
That means the influence operation may no longer have to persuade the voter directly. It only has to shape the information environment that the machine will later summarize back to the voter as neutral guidance.
Imagine a voter in October asking an AI system a simple question about a candidate: Was this person corrupt? Can I trust them? What did they really do? If the informational terrain around those questions has already been seeded — months in advance, across platforms and formats — then the answer may feel objective while carrying the fingerprints of an operation the voter will never see.
That is not persuasion as we have historically understood it. That is preloaded reality.
The marketing world figured this out before politics did. The political world is now catching up. And the implications are profound. Because if a campaign can shape what an AI says about its candidate, so can anyone else.
A super PAC with no meaningful disclosure. A foreign intelligence service. A billionaire with a grievance. A coordinated influence network that has no interest in persuading voters directly, only in shaping what the machine will later present as consensus.
By the time the voter sees the answer, the operation that shaped it may be functionally invisible — not because it was hidden especially well, but because the architecture no longer requires a visible artifact.
The old system produced something you could point to: an ad, a mailer, a television spot, a digital placement, a disclaimer. The new system often produces none of that.
There is no “paid for by” on an AI answer. There is no station manager on the other end of the phone. There is no disclosure box attached to the output.
The law, meanwhile, is still trying to regulate visible artifacts. Congress can hold hearings about deepfakes. States can require disclaimers on AI-generated campaign ads. Reformers can keep proposing transparency rules built for a world of identifiable messages. But AEO does not primarily produce a message. It produces a condition. It shapes the environment from which answers emerge.
You cannot regulate this simply by labeling outputs if the real operation happened upstream, diffusely, and at scale.
IV. Master of Disaster
Which brings us back to Sam Altman. And to the man he called when his own board tried to hold him accountable.
Chris Lehane. Al Gore’s press secretary. Clinton White House veteran. One of the sharpest Democratic communications professionals around. A guy who built his career on the premise that accountability mattered. Who literally wrote the book on crisis communications and titled it — with no apparent irony — Master of Disaster.
When Altman was fired, Lehane’s move was textbook. Don’t fight on the facts — 70 pages of documented concerns about your client’s honesty are not facts you want to fight on. Fight on the frame.
Label the board members who raised safety concerns “rogue effective altruists.” Make them sound like a cult.
Bury the allegations under an argument about Silicon Valley ideology that 99% of the public would never parse.
Activate resentment in the right 500 people in the right zip code.
It worked. Altman was back in five days.
I know this move. I’ve used versions of it. Every crisis communicator has. You find the frame that works on the right people in the right room and you don’t worry about everyone else.
What Lehane apparently didn’t ask himself — or asked and answered differently than I would have — is what happens when you deploy that craft to discredit the people arguing that this particular technology needed real accountability before it was handed to governments for use in immigration enforcement, domestic surveillance, and autonomous weapons.
The board members Lehane helped blow out were the ones trying to pump the brakes. He framed them as zealots. They lost. Altman kept his finger on the button.
The man who controlled the information architecture that hundreds of millions of people use to understand the world — a man his own colleagues described as unconstrained by truth — kept his job. Because a Democratic operative who learned his craft in the Clinton White House ran the play perfectly.
I don’t think Chris Lehane is a bad person. I think he’s a very good communications professional who made a choice about who deserved his skills. And that choice came with stock options in the outcome.
We all make those choices. But most of us didn’t learn our trade in a White House that stood for something specific about democratic accountability. And most of us don’t have equity in a trillion-dollar IPO.
Master of Disaster. Indeed.
Meanwhile the companies building these AI systems are spending $125 million (and counting) in the 2026 midterms to determine who writes the rules about them — through campaign ads that don’t mention AI at all. The House of Altman and the House of Amodei are not neutral infrastructure providers caught in a regulatory debate. They are purchasing the legislature that would constrain them. Neither side’s ads say so.
This is what the end of accountability looks like when it moves fast enough that nobody has time to name it.
V. The Slop Verdict
And here is the final absurdity.
While Congress fails to pass a single AI accountability bill, while the Houses purchase the legislature that would regulate them, while Democratic operatives cash out their institutional credibility for equity in trillion-dollar IPOs — the market has arrived at its own verdict.
Brands are now running “No AI” disclaimers as a competitive advantage. Aerie recently promised consumers in a national campaign: “We commit: No AI generated bodies or people.”
Equinox. Almond Breeze. Dove.
A growing list of brands positioning themselves as antidotes to what the internet has started calling slop — the AI-generated, algorithmically optimized, authenticity-free content that has saturated every platform.
The consumers figured it out before the regulators did. They can feel the difference between a message that came from a human being with something at stake and content that was optimized to appear in an answer. They’re paying a premium for human. They’re downloading software to escape the machine.
Which means the voters are ahead of the campaigns. They know slop when they see it. What they don’t know — what nobody is telling them — is that the slop has now been pointed at their ballots.
VI. The Closed Circle
I don’t believe this will be meaningfully fixed in time by the institutions currently charged with fixing it.
What would help is not mysterious. Real campaign finance reform. Rules that look upstream, not just downstream. Basic obligations on platforms and model providers to show more of their work. None of that is conceptually impossible.
But it is politically improbable in a system where the entities most in need of regulation increasingly shape the environment in which regulation is discussed, delayed, or abandoned.
That is the closed circle. Campaign finance and AI governance are not separate problems. They are the same problem wearing different clothes.
The one place the circle is vulnerable is the same place it has always been vulnerable: the moment the public understands what is actually happening to them. The Houses depend on opacity. AEO works precisely because voters don’t know it exists. The slop is effective precisely because nobody told them it had been pointed at their ballots.
Naming the mechanism is not sufficient. But it is necessary. You cannot organize around a threat nobody can describe. You cannot regulate what nobody can see. And you cannot build the political will for reform in an information environment that has been preloaded against it.
VII. The Fish
I didn’t share my Bakersfield story because I miss some cleaner age. Political advertising was never clean.
I tell it because even a garbage dump ballot measure once had a visible chain. Someone wrote the message. Someone bought the airtime. Someone signed the contract. Someone paid for it. Someone was responsible.
I still have the direct mail piece from that campaign. You can hold it in your hand. You can read who paid for it. You can trace it back.
Now we are entering a world in which the message may be shaped before the messenger exists — by actors the public will never identify, through systems they do not understand, in transactions that leave no contract, no receipt, and no obvious point of intervention.
That is a genuine loss. Naming that loss matters.
Not because clarity will fix it. But without clarity, we are left using the language of transparency and accountability to describe a system that no longer operates on those terms.
That sequence — from Bakersfield to AEO in thirty years — is not a story about technology. It is a story about power. And about what happens when power moves faster than the institutions designed to make it visible.
The question is whether enough people can look at it without flinching — and from that lucidity, begin to build, regulate, teach, and communicate differently.