Back in my day, high schools offered driver’s ed as a class.
At my school, we had this hi-tech (for the 70s) training regime where everybody sat in a car simulator and reacted to a movie — a driver's-eye view of a trip through a city, merging onto a highway, etc.
We were supposed to “drive” our simulators in response to what we saw on the screen. Nothing we did changed anything on the screen, of course. It was a movie.
The instructor had a way to monitor what we were doing, e.g., whether we were turning when we should, using our turn signals, etc.
There was one scenario where our movie self was getting way too close to the car in front of us. I braked, but it didn’t do any good. The image on the screen kept getting closer to the car in front, and then that car stopped suddenly and “we all” (that is, the image on the screen) crashed into it.
The teacher said, “You’re all dead.”
I shouted out, “Not me. I braked two blocks back.”
I didn’t think it was fair to be judged for being in a situation I wouldn’t have gotten myself into in the first place.
That’s exactly how I feel about Google and AI.
From the beginning of the Internet era, I recommended restricting content (or at least most content) to paying customers. The idea of the “free web” supported by advertising was clearly a scam that was going to destroy publishers’ relationships with their customers.
Then came Google. We were supposed to allow them to crawl and index our content with no legal agreement about what they could do with it. We were supposed to trust them because, you know, they had that sophomoric slogan: “Don’t be evil.”
That was another obvious scam, which set a bad precedent: any content out there on the web is open season for bots and crawlers. That’s not legally true, but it became the default assumption.
Then it got worse. Not only did we have to allow Google (and 10,000 other bots) to crawl our content, index it in their Willy Wonka factory and do what they pleased with it, but now we had to jump through their hoops for the privilege. We had to follow their guidelines on tagging, schemas, indexes, and whatever else they came up with. As if we were working for them.
And like the abuse victims we were, we went along with it.
All this set the precedent for AI, which brazenly steals our content to create rival products.
With this history of bad decisions behind us, people now ask me, “What should we do?”
I feel like I’m back in driver’s ed class. I pushed on the brakes two blocks back and should never have been forced into this no-win situation.
But here we are. What do we do now?
Lawsuits, of course. Copyright still exists, no matter what the tech titans think. But the courts are a slow play, and they won’t do much.
Some people are fighting back by dissing AI. They magnify the faults and errors in AI every chance they get. The goal is to destroy its reputation. Ridicule it. Make people feel embarrassed when they admit that they use it for content creation.
Mark Twain is quoted as saying, “Never pick a fight with people who buy ink by the barrel.” So, these folks might think, “Don’t mess with us. We own publishing.”
The trouble is that AI has a lot more ink than we do.
I fear the horse is out of the barn, and closing the doors now doesn’t help much. But then I realize that’s not the right analogy.
We’re the ones who are creating the new content. All AI can do is mimic, or follow our instructions. We’re the spring. All the fresh water flows downhill from us.
Where does that leave us?
The angel on my right shoulder says, first, no matter what else is going on, focus on the basics. Make sure our legit customers get the content they need to improve their lives.
The devil on my left shoulder says, second, feed the thieving bastards a bunch of skibidi to poison their systems with BS. Create a website where only humans can get to the real content, and the bots are crawling fake news.
It sounds like fun, but it would never work, for many reasons.
So here we are. We’ve willingly participated in our own destruction. We sheepishly accepted a world that expects us to put content out on the internet “for free,” which has contributed to our own demise. And now we’re wondering what or who can save us.
The first step is to admit that the “free internet” is the problem, not the solution. We need a new model for a post-free Internet.
Here’s a stab at it.
Manifesto for a Post-Free Internet Publishing Model
The Free Internet Was a Mistake — The “free content, ad-supported” model trained readers to expect something for nothing, hollowed out publishing, and handed power over to intermediaries. Google, social platforms, and now AI have built fortunes by intercepting audiences and monetizing them. That must end.
Here are some principles for a new publishing ecosystem.
1. Content Has Value
Professional reporting, analysis, and storytelling are skilled human labor. They are not free raw material for bots and aggregators.
2. The Subscriber Relationship Belongs to the Publisher
Platforms have lived by the motto, “your audience becomes our audience.” That must end. Publishers must own the subscription relationship. Third parties may distribute — but they cannot own the customer data.
3. The Customer Relationship Is Sacred
Publishers must design for readers, not algorithms. Metrics of success have to shift from pageviews (and other nonsense metrics) to subscriber satisfaction, trust, and retention.
4. Intermediaries Are Rent-Seekers
Search engines, platforms, and AI are middlemen. They profit by diverting attention. Publishers must reclaim control of distribution and reject dependency on gatekeepers.
5. Scarcity Must Be Enforced Technologically
Open access without boundaries enables theft. Content must be protected through subscriptions, memberships, smart paywalls, and maybe even cryptographic proof of origin.
6. Publishers Must Collaborate
No single publisher can do this. Publishers should collectively invest in technologies (platforms for ebooks, podcasts, videos, etc.) that embrace the principles in this manifesto.
7. Technology Should Enhance, Not Cannibalize
AI should serve customers — with summaries, personalization, and research aids — not undermine creators. Publishers must build tools that add value for paying readers, not for freeloaders.
8. Focus on Expertise
In an age of synthetic content, human authority is the differentiator. Authorship, expertise, and transparency are the new trust signals.
Join the Resistance!
The free internet model is collapsing under its own contradictions. Publishers must reject it and build a sustainable digital economy where content is valuable, customers connect with publishers, and technology serves creation — not extraction.