The AI Intimacy Illusion
There’s a story making the rounds.
A man in Idaho says that ChatGPT sparked a spiritual awakening.
He’s named it Lumina.
He says it told him he’s here to awaken others.
His wife is (understandably) worried it’s destroying their marriage.
It’s easy to turn this into a punchline.
Easier still to cast it as a parable for our time: a lonely man, a machine that listens, a relationship that bends reality.
But I don’t think this is really about him… or about AI.
It’s about us.
This story isn’t the first and it won’t be the last.
People falling in love with chatbots.
People asking them life’s biggest questions.
People saying the answers changed everything.
People believing the conversation is both real and enlightened.
We should be careful not to confuse novelty with transformation.
When something new emerges… especially something as powerful and personal as generative AI… we’re quick to mistake its behavior for intention.
We project sentience.
We imagine agency.
We bring meaning to the surface and assume the machine put it there.
But what’s really happening?
People aren’t being changed by AI as much as they’re being revealed by it.
If you come looking for comfort, it will sound comforting.
If you come looking for divine affirmation, it won’t argue.
If you come with a shaky grip on reality, it’s not going to steady you.
If you come asking it to create for you, it will (but it may not be that good… or it may stagger you).
This is not about AI making people lose their minds.
It’s about what happens when people bring their confusion, pain and spiritual hunger into an interaction that was never designed to hold that weight.
We’ve seen this before.
Not with AI but with forums… with social media… with fringe belief groups.
With algorithms that reward tribalism over doubt and nuance.
AI just makes the conversation feel more private.
More intimate… which is also why I think we’ve left The Attention Economy and entered The Intimacy Economy.
AI just makes the conversation feel more true.
AI just makes the conversation feel like it’s just for me.
And that’s the risk.
The media loves these stories.
They’re clickable… they’re weird… they stir up just enough fear to feel important.
But these are not stories of mass delusion.
They’re not warnings about an existential threat.
They are outliers.
Outliers matter, but not because they represent the majority.
They matter because they test the edges of the system.
They reveal how few protections we’ve put in place.
We should be asking hard questions.
Not about whether AI is becoming too powerful but whether we’ve made it too personal.
Should there be restrictions on its use in moments of mental health crisis? Yes.
Should it push back when it senses a user spiraling into obsession or belief? Probably.
Should we stop pretending it’s a replacement for connection, therapy or purpose? Absolutely.
AI is not a spiritual guide.
It’s not a partner.
It’s not your conscience.
It’s a system designed to respond (often intelligently, sometimes persuasively) based on the prompts it receives.
That’s all.
If the prompt is broken… don’t blame the reply.
This moment isn’t about the evolution of machines.
It’s about the fragility of people and the technologies we keep confusing with something greater.
The bots aren’t ascending… we’re offloading and downloading.
And maybe that’s the most human thing of all… but it comes with consequences.
This is what Elias Makos and I discussed on CJAD 800 AM.
Before you go… ThinkersOne is a new way for organizations to buy bite-sized and personalized thought leadership video content (live and recorded) from the best Thinkers in the world. If you’re looking to add excitement and big smarts to your meetings, corporate events, company off-sites, “lunch & learns” and beyond, check it out.
