Technology without appropriate training may ruin your business

[Image: Noah with a laptop]
Summary: New technology (like AI) won’t help your business unless your team understands some basic concepts. Tools require more than procedures; they often require new mental models to avoid dumb mistakes.

It’s fun to believe in silver-bullet solutions, but they often fail because they ignore the complexity of the situation. For example: “Why didn’t the Council of Elrond just give the Ring to the eagles and have them fly it to Mordor?”

New technologies are often seen as silver bullets, but implementing a new technology without proper training is like trying to help Noah by giving him a laptop.

Think of all the concepts you have to understand to use a laptop. Those ideas would be completely foreign to somebody from the Bronze Age.

  • Programs and files
  • The desktop metaphor (e.g., dragging something to the trash)
  • The display is not the thing itself
  • User accounts and permissions
  • The fragility of the device (e.g., that it shouldn’t get wet)
  • That it runs on electricity and needs to be charged
  • Input devices, like the keyboard and the mouse

That’s hardly scratching the surface. You could give Noah the greatest design and project management software in the world, but it would take a very long time to teach him how to use it.

The gap between a modern worker and AI isn’t quite that big, but the general concept still applies.

Tools vs. Mental Models

When you try to implement a new solution, it’s not just about training people to use software. You also have to reshape how they think. New tools often require new mental models. If you don’t upgrade the way your team understands the new software, it’s likely they’ll misuse it.

Here are three examples that illustrate the problem.

  • Think of the manager who uses advanced Excel formulas without understanding the underlying math. He’ll generate reports that don’t mean what he thinks they mean.
  • When I first started working in publishing, I was amazed that not one person on the staff understood the concept of significant figures (reporting “33.333%” from a three-person survey implies precision that isn’t there). I guess they don’t talk about such things in journalism school, but they’re pretty important in science.
  • Imagine a student who wants to calculate 4.7% of 350. He hasn’t done enough math “in his head” to have a sense for what kind of answer to expect, so when he fat-fingers the calculator and gets “1,645,” he doesn’t realize that can’t be right. The answer is 16.45; the decimal slipped two places. (A quick sanity check appears below.)
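
One habit that guards against the calculator mistake is estimating before you compute. Here’s a quick Python sketch of that kind of sanity check, using the numbers from the example above:

    # Estimate first, then compare the computed result against the estimate.
    exact = 0.047 * 350       # 16.45, the correct answer
    estimate = 0.05 * 350     # 17.5, a rough "5% of 350" done in your head

    typed_result = 1645       # what the fat-fingered calculator displayed

    # A result more than 10x away from the estimate deserves a second look.
    if not (estimate / 10 <= typed_result <= estimate * 10):
        print(f"Suspicious: got {typed_result}, expected roughly {estimate}")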

The point is that tools can’t replace having a basic understanding of how things work.

What This Means for AI

We’re in danger of doing something similar with AI. Businesses are layering AI into workflows without giving employees the intellectual foundation to use it responsibly.

ChatGPT can quickly summarize a report, but it might not address the details that are important for a specific use or in a specific setting. Or a worker might upload a document that exceeds the model’s context window and assume the model read the whole thing, even though it only processed the part that fit.
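
The context-window failure, at least, is checkable before it happens. Here’s a minimal Python sketch of a pre-flight token count using the tiktoken library (the 8,000-token limit is a placeholder, not any particular model’s real limit):

    import tiktoken

    CONTEXT_LIMIT = 8000  # placeholder; substitute your model's actual context window

    def fits_in_context(text: str) -> bool:
        """Warn if a document is too long for the model to read in full."""
        enc = tiktoken.get_encoding("cl100k_base")  # a common OpenAI tokenizer
        n_tokens = len(enc.encode(text))
        if n_tokens > CONTEXT_LIMIT:
            print(f"Warning: {n_tokens:,} tokens exceeds the {CONTEXT_LIMIT:,}-token limit. "
                  "The model will not see the whole document.")
            return False
        return True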

Business leaders need to think beyond operations and procedures. By all means train your staff to use the tools, but make sure they also have a conceptual understanding of what the tool is, what it does, how it works, and (most importantly) the ways it might fail.

AI training has to go beyond procedures and methods. If your team doesn’t understand the limitations and blind spots of AI, they’ll trust it in situations where they shouldn’t. That can cause a lot of problems.

Want Help?

If you’re rolling out AI tools in your business and need help preparing your team both intellectually and tactically, give me a call.

I have created a training guide that gives you and your employees the tools to use AI effectively, without getting anyone into trouble. If you’d like a copy, email me.
