
I recently re-watched Francis Ford Coppola’s 1974 movie, “The Conversation,” starring Gene Hackman. At the time it came out, I was a screenwriter in Hollywood, and I thought it was terrific. But memory fades. I was young and stupid. So I decided to go back and read the New York Times review written by Vincent Canby.
“Have you considered the possibility that everything that’s ever been said in this world might still be echoing somewhere, maybe rattling around within the interior of a stone, a tree trunk or at the bottom of the sea? Though becoming increasingly faint, the sounds shall never disappear entirely and one day, perhaps, there will be equipment sensitive enough to retrieve and record mankind’s oral history.”
Does that sound a bit like a premonition, a thought a little too close for comfort given today’s OpenAI and all its fellow travelers?
We all understand the power of AI to “communicate” in written and verbal form: answers, recommendations and salutations (“Hello, I’m Willow, do you want new car sales or repair?”). As consumers, we might find this helpful, but as employees, are we inclined to engage? Do we “listen” to it?
That is what Harvard professor Prithwiraj Choudhury studied. Technically, he says, we can create a bot that sounds as if we wrote it in our own personal style and voice. But when an employee perceives that the CEO’s response came from AI, and not from the CEO himself, “they find it less than helpful.” In other words, the employee exhibits “algorithm aversion”: you can’t fool me, that’s not really you, I don’t buy it.
Now the research here is a twist on the classic Turing Test developed by British computer scientist Alan Turing to judge whether machines could exhibit “intelligence.” During World War II, Turing created “the Bombe” that could detect and decipher the encryption settings of the German code system, ENIGMA, and this helped the Allies win the war.
(In full disclosure, my current little company is called AskTuring.ai.)
A leader can address the team sounding like Shakespeare or Jack Nicholson, but if he tries to sound like himself, does the employee really listen? Are they engaged or are they suspicious? And when the truth is revealed, do they feel cheated for having fallen for the bot? In other words, is it real or is it Memorex?
In his study (a real, but unnamed company), Choudhury took all the words, messages, external and internal, from the CEO himself, and created a bot. Then he asked a random sample of current employees if they could identify which were the words and thoughts of the real CEO or the bot.
Only 50% got it right.
When those who guessed wrong were told the truth, as you can imagine, they felt tricked and became even more suspicious of future interactions. In this case, the CEO used AI to be more efficient and save time in his communications, but instead it created a two-pronged backfire.
If the communication is an obvious bot (like Willow), the employee tunes out. If it appears real and is later found to be AI, that pushes the employee into feeling manipulated and inclined to trust all communications less.
Choudhury says, “Overcoming aversion is the billion-dollar question in front of the AI industry.” In any interaction with a company bot, will the consumer find satisfaction with the experience, or will they (I confess fully to this default) reject the bot and ask for a human interaction?
Case in point: I am in a tussle with The New Yorker magazine, stuck in the classic telephone-tree bot, going in circles with no human voice in sight.
Here are some potential takeaways. When a bot answers your inquiry and cites the source for its answer, you are more satisfied and appreciative. That is AI at its best.
But when the consumer needs an interaction, not just an “answer,” that is where AI is most challenged. The Turing Test poses the problem: can a machine appear “intelligent enough” to fool you into thinking it is a particular human, and, by extension, are you then willing to accept the answer?
“Who are you going to believe, me or your lying eyes?”— Groucho Marx
Rule No. 832: Guess who?
Senturia is a serial entrepreneur who invests in startups. Please email ideas to [email protected].