AI’s Dumbest Move Yet

Hello and happy Caturday – time to discuss AI’s dumbest move yet. This week, Fortune describes an AI that can tell if CEOs are depressed on earnings calls. According to the article, ‘AI-powered mental health assessments have already allowed researchers to identify correlations between CEO depression and business risks.’ This application of our new favorite technology might interest some, but it is alarming and inappropriate for reasons we’ll now discuss.

The stated use of this tool is to ‘highlight mental health in leadership roles and how prevalent it is,’ an associate professor of accounting (accounting??) at Indiana University told Fortune. And then, almost in the same breath: ‘it … also has far-reaching implications for the organization, the employees, the investors, and the broader economy.’ So, thanks for that tissue of respectability – I hope you’re happy when this ‘AI voice stress tool’ goes the way of the polygraph.

Polygraph 2.0

I don’t blame the tool itself – only the people who made it and the people who use it. This is AI’s dumbest move yet, and if you can’t understand why – you’re part of the problem. See folks, we’ve been here before. Since the early 20th century, lie-detector devices have been de rigueur with law enforcement, with many variations all focused on the same goal – ensuring no one could lie to you. But here’s the punchline: people defeat polygraph devices all the time.

Wikipedia lists some of these cases for your education / edification – you can review them at your convenience. Aldrich Ames, a former CIA officer who became a spy for the Soviet Union, is perhaps one of the most infamous examples of someone beating a polygraph. Gary Ridgway, also known as the Green River Killer, underwent a polygraph test and passed, despite his guilt. It is believed that he managed to stay calm and collected, employing relaxation techniques to keep his physiological responses in check. This allowed him to evade capture for many years until DNA evidence linked him to the crimes. Ted Bundy, one of the most notorious serial killers in American history, also managed to pass a polygraph test.

Polygraph results are, at best, unreliable. The only people who benefitted from polygraphs were the ones selling them – or the ones selling you on them. For everyone else, a polygraph was in some cases no better than flipping a coin. Now we have this new ‘AI mental health tool,’ proving we haven’t learned a thing. Any tool that probes your mental or emotional vulnerabilities can be exploited.

How This Will Go Wrong

Meanwhile, the most useful tools of human connection – compassion and curiosity – get buried under layers of suspicion and skepticism. At what point are we going to start learning to talk TO each other instead of talking ABOUT each other? I’m not sure, but I know this AI ‘mental health tool’ won’t help.

Worse yet, this tool will – no doubt – be used by bad actors to attack opponents’ personal vulnerabilities. Along the way it’ll also be used to exploit those with personal, private medical issues. No sane executive will skip ‘crazywash’ training – tailoring their public persona, appearance, and vocal delivery – distancing themselves further from the people they’re supposed to be serving.

No one disputes that AI can be valuable in some ways – we’re still learning those ways every day. Just as fire keeps you alive or kills you depending on who holds the matches, AI is a torch in the hands of mad children obsessed with charisma, control, and capability. This tool will burn us until we place it in the hands of compassionate, conscientious, and caring craftsmen.

So when it comes to AI’s dumbest move, here’s my parting thought: Congratulations, you played yourself.

More Futurology notes here

The post AI’s Dumbest Move Yet appeared first on Inkican.

Published on January 25, 2025 09:40