Bo Sacks recently distributed an article titled, “A tech sector dedicated to boiling things down has raised temperatures in some quarters of the publishing world.”
“Boiling things down” in this case means making summaries.
Blinkist, Bookey, getAbstract, and other services make it easy to get the gist of the latest hot business book. You don’t have to read the 500-page monstrosity.
It sounds like a great thing. My own experience is that after reading a long business book, I often say to myself, “this would have been twice as good at one tenth the length.” There always seems to be too much fluff, and I want the author to get to the point.
That’s easy with AI. AI can summarize an article or a book for you, so why do you need to read the article or the book?
The article I mentioned above raises the issue of AI hallucinations, which is a fair criticism. Sometimes AI makes things up, although I suspect they’ll fix that problem before too long.
Also, humans don’t seem to have any trouble introducing their own biases and stupidities. Or “hallucinations,” if you prefer. It wasn’t AI that created the mess recently reported at NPR, and it wasn’t AI that fell for the Russian collusion story.
Leaving all that aside, I think there’s a larger problem to address, which is the value of summaries. Do you really understand something when you’ve read ten bullet points about it? Does a summary, or even a description, do justice to the original?
My own bias, I confess, has always run toward the summary. I was thrilled in high school when I realized that I didn’t have to read Moby Dick to write my paper. I bought the CliffsNotes at the local bookstore, and I didn’t even read all of that. I based my paper on a skim of a summary. I got an A, by the way.
So I tried Blinkist for a while, because I love the idea of cutting out the fluff and getting to the meat. But a gnawing doubt grew as I read and listened to these summaries. I found myself worrying that I would think I understood something without really understanding it. Is the summary really hitting the major points? Is there some texture or depth to the work that simply can’t be summarized like that?
In an earlier podcast I asked whether you could understand the story of Little Red Riding Hood if it were reduced to five bullet points. I think not.
Can you understand the Exodus if it’s summarized in two paragraphs?
The article I cited is more concerned with copyright and market share and such, and those are perfectly reasonable concerns.
My question is whether the whole desire to reduce things to summaries and bullet points and takeaways is a sign that our left brains are trying to seize the reins from our right brains, where Dr. Iain McGilchrist says executive function belongs.
In a way, the whole AI revolution seems to be a left brain revolt. In an effort to grasp and simplify and “get to the point,” we lose the story and the drama and the poetry of life.
Don’t get me wrong. I have no doubt that AI will soon be able to create amazing stories and drama and poetry. Not yet, but soon. Still, it’s important that we don’t allow summaries and bullet points and takeaways to replace story and art and the richness and depth of things.