CG News: Fear the (AI) Hype?

May 16, 2023

Hey everyone,

A longtime CG member (who runs an AI company) and I have been having an ongoing 10+ year conversation about what we call “Practical AI.”

She is working on a guest essay that we’ll publish here soon. 

We both agree that we are entering another AI hype/fear cycle where both sides of the debate share the same wrong assumptions. 

1. Hype side
The hype side believes that AI is the dawn of a tech nirvana – where our benign generally intelligent machines will make possible the dream of a life of leisure for all laid-off humans who sip margaritas while collecting their universal basic income. In other words, utopia is on the way.

2. Fear side
The “fear” side believes that AI will turn into Skynet, which will take over and enslave all humans.

Both sides believe the tech is all-powerful and close to general intelligence – and both believe many jobs will be destroyed in the short term.

Both sides are poppycock. 

AI is really a set of impressive statistical models with real limitations and some interesting possibilities, but no path to general intelligence. 

And, sorry, but no path to either utopia or Skynet.

And fortunately, millions of jobs will NOT be lost in the next few years (at least not due to AI).

We forget this is *not* the first time that massive job losses have been predicted.

In 2013, an Oxford University study predicted that *50%* of American jobs would be replaced by AI within 10 years (i.e., by now).

Uhh. Hmm. That didn’t quite prove correct, but it did generate a ton of short-term press for Oxford and led to a number of fear-mongering books and articles. 

And in 2016, Ford promised to have fully autonomous vehicles on the road by 2021.

Uhh. Hmm. That prediction proved to be poppycock too. 

Even worse, it put Ford behind in electric vehicles. 

Today, Tesla’s EV market share is roughly 10x Ford’s (note: Musk is the master media manipulator – promising fully autonomous vehicles while *focusing* on electric vehicles. In fact, a cynical person might argue that Musk’s AV pronouncements were designed to confuse the marketplace and nudge Ford to do the wrong thing).

Ford’s wrong-way bet did, however, generate a ton of press for Ford *at the time.*

So, here’s the deal: if you want attention in the press (or inside your company), then pound the hype or fear drum. 

These are short-term moves, however, that don’t bring value to customers and can damage your career and/or company in the *long* run (the Oxford study is now widely discredited, and Ford was badly damaged by that big investment in autonomous vehicles). 

What should we do instead?

Drumroll, please.

Include the customer.

Or practice what we call “Practical AI.” 

Focus on practical applications of AI that bring value to customers, colleagues, and shareholders.

The only problem with practical AI is that it doesn’t generate attention in the short run. And if you are facing internal colleagues beating the fear/hype drum, it can be hard to rise above the noise and make the case for a long-term, customer-oriented, rational, and practical strategy.

Side note: Perhaps consulting and research firms have a long-term case for making fear-based bad predictions (see below for A Tale of Two AI Reports).

For the rest of us, though, tell me if you are making practical, incremental strides with AI-fueled products that are delivering real customer and business value…

or…

…if you are facing AI-driven hype/fear in your company that makes it hard to implement practical AI.

Best,

Phyl

———- 
A Tale of Two AI Reports
———- 

In 2017, a top strategy consulting firm, whose name you know, put out a report about the future of AI.

This was a surprisingly good report – one of the few that made the case for practical AI. It laid out how slowly new technologies get adopted, broke jobs down into hundreds of tasks, and showed that it would take many decades for AI to fully displace humans in any role – and that, along the way, *new* jobs would be created.

Amidst a barrage of Skynet-type reports, this research stood out so much that I invited the Partner who wrote it to speak to the Councils in Chicago in 2017. Along the way, we became friends.

Here’s the thing, however: a year later the same firm issued *another* report on AI. This time it abandoned practical AI and predicted that millions of jobs would go away quickly and that company business models were in jeopardy. It also was NOT written by my new friend. He and the old report were no longer available as public representatives of the firm. Gone. Poof. 

I called my friend and we had an interesting conversation.

He couldn’t tell me what happened.

So I told him I’d share my hypothesis and get his reaction. 

I told him my guess: the first report did not scare clients enough to drive new business, so the firm issued a new report that fear-mongered clients into paying for big consulting projects.

Further, I said that no one inside the firm really believed the report, but most everyone was behind it because they wanted the $$$. 

“Was I in the right ballpark with that hypothesis?” I asked.

Yes, he said. 

And then we talked about his career and what jobs might be available for him outside the firm. 

So, there you have it. A tale of two AI reports. 

I checked recently. Both reports are now gone, displaced by a still newer one focused on ChatGPT and how, guess what, it’s going to drive a truck through your business model *today.*

Moral of the story: beware of reports from consulting firms. Or, as Warren Buffett likes to say, “Don’t ask the barber for advice on whether you should get a haircut.”

Phyl.org

About the Author

Phyl Terry

Phyl Terry, Founder and CEO of Collaborative Gain, Inc., launched the company’s flagship leadership program – The Councils – in 2002 with a group of fellow Internet pioneers from Amazon, Google, and others. Thousands of leaders from the Internet world have come together in the last 15 years to learn the art of asking for help and to support each other to build better, more customer-centric products, services, and companies.
