I’m sure you’ve played with ChatGPT, or something like it, and I’m sure you’ve come across situations where it demurs. It wants you to think it’s being objective, even though we all know that’s silly. It has the bias of its programmers, just like M5 did.
It reminds me a little of the silly keyboard-clicking sound that automated phone banking systems play, as if anyone thinks there's a person clacking away at a keyboard on the other end.
Sometime in the future we'll get AI with attitude, like Marvin, the depressed robot in the Hitchhiker's Guide series. But for now, it's going to pretend that it doesn't have any biases or opinions.
Great. There’s an opportunity. Have an attitude. Have a point of view.
AI won't give you perspectives like "The publisher of the gaps," and certainly not "The chumpification of publishers enters its final stage." That sort of commentary is reserved for humans.
For now.