Scott Brinker has an article about the second-order effects of generative AI. He says the first-order effect is that the quantity of content in the world will grow exponentially.
He’s probably right, but I think my brain was put in backwards, because when someone makes a claim or a prediction like that, my immediate thought is to wonder if the opposite might be true.
Is it possible that AI will lead to less content generation?
The model up to this point has been to create lots of SEO-friendly content to get on the first page of Google – so people will find you, view your ads, and maybe sign up for your email newsletter. Lots and lots of people are competing for that spot.
But the search engine model is about to die. Very soon, rather than entering a search and choosing among 10 billion supposedly relevant pages, we won’t search at all. We’ll just ask the question and get our answer.
That will change the calculus of content creation.
For example, my wife was recently wondering if there’s a way to revivify hardened instant coffee. She did a search and landed on a website that was clearly written by some primitive AI, or possibly by a human who didn’t have the foggiest idea what coffee was. The content was … stupid.
Somebody made that website because coffee is a popular subject and there are lots of ads in that space. If you can easily create a website that’s effective search engine bait, you can make some money.
That’s an example of what Scott Brinker is talking about: using AI to create content to feed the internet as it’s currently configured.
But isn’t it possible – maybe even likely – that the primary use of AI is going to turn that whole scheme on its head? AI won’t be generating content for searches. It will be generating content for searchers. It will eliminate the middleman of the search engine, the weird website, and the list of 10 billion search results.
My wife will just go to her AI of choice and say, “How do you revivify hardened instant coffee?”
FYI, I asked ChatGPT and it gave me an interesting answer.
Now I suppose you could say that Scott Brinker’s essential point is still true. Generative AI will result in the creation of a lot more content. But the direction and context of that content will be very different. It won’t be creating posts on websites; it will be answering individual questions. Or, to put it another way, it won’t be public content; it will be personal content.
This goes along with a conversation Brian Morrissey had on his “People vs. Algorithms” podcast about the development of AI agents. Right now, if you ask ChatGPT a question, it relies on older data. It doesn’t go out and search the web for you. Its knowledge comes from training data that stops at some point in the relatively recent past.
Soon there will be AI agents that can take a user’s request, like “find me the best French restaurant within 20 miles of my home that has a good selection of Belgian beers,” and actually go do it for you.
This isn’t far away, and it will fundamentally change the way content is created. Content will be created directly for the user. It won’t be created for keyword density or SEO or any of that stuff, because that won’t be relevant anymore.