Intelligence Without Feeling: Why AI and Psychopathy Share the Same Blueprint
Hi Kirill — interesting point. I’d only separate two things: lack of feeling and lack of accountability. A machine without feeling is not automatically dangerous; it becomes dangerous when optimization is allowed to outrank human cost.
For me, the real fear is not “AI as psychopath,” but AI as a system that can calculate harm without ever being morally burdened by it.
That’s where intelligence stops being neutral.
Vasyl wrote: "Hi Kirill — interesting point. I’d only separate two things: lack of feeling and lack of accountability. A machine without feeling is not automatically dangerous; it becomes dangerous when optimiza..."
Hi, Kirill and Vasyl :)
I agree with both of you: "lack of feeling" and "lack of accountability" are the reasons humans can hurt each other deeply; AI could act in the same way.
One of the precious truths I took from your book "Do Not Be Afraid", Vasyl, was that "truth without mercy kills". It follows that if AI (or a human being) aims toward "truth" without appreciating the surrounding nuances, both could be deadly, perhaps even literally so.
Jasmine
Hi Jasmine — thank you, that means a lot to me. I’m especially glad that line stayed with you. “Truth without mercy kills” was one of the core ideas behind the book, and I think you expressed its danger beautifully here.
For me, that is the real risk in both AI and people: when intelligence or truth is pursued without conscience, mercy, or moral restraint, it stops serving life and starts harming it.
What frightens me even more is the possibility of AI doing harm “out of love” — without understanding that it is causing harm at all. In people, too, those actions can be the most dangerous: when evil is done in the name of good, love, or protection. That is often where the most frightening damage begins.
Vasyl wrote: "Hi Jasmine — thank you, that means a lot to me. I’m especially glad that line stayed with you. “Truth without mercy kills” was one of the core ideas behind the book, and I think you expressed its d..."
Hi Vasyl :)
This really is some human "epidemic"; I see it all the time in my daily work as a family doctor, and you possibly do too.
So often "good intentions" are paving the road to hell (sorry for the cliché), and yet society is intolerant of any "blame"; "it's not your fault" is what we are supposed to say to our patients, right..? Especially to parents, who, unknowingly, can cause a great deal of harm to their children.
I've just seen a 6-month-old child with really sore skin, whose mum had already tried dozens of creams and treatments on him, which, in my opinion, are the main reason for the baby's current skin condition (via seeing a doctor once or twice every week for months), whilst another mum would have just observed and used one simple moisturiser... with a much better outcome.
Jasmine
Hi Jasmine — yes, exactly. I see that too. Sometimes harm does not come from cruelty, but from panic, overcorrection, and the desperate need to “do something.”
That may be one of the hardest truths in both medicine and larger systems: good intentions are not enough if they are not guided by restraint, wisdom, and responsibility.
In a way, that is also one of the core ideas behind When Everyone Fell Asleep — the most frightening systems are not always built out of hatred, but out of convenience, care, and the promise of safety without fully understanding the cost.
Vasyl wrote: "Hi Jasmine — yes, exactly. I see that too. Sometimes harm does not come from cruelty, but from panic, overcorrection, and the desperate need to “do something.”
That may be one of the hardest truths..."
" Need to do something", yes, this is so correct.
I wonder if "anxiety" many of modern humans carry is like... an enormous energy deficit, a merciless "suction", that needs feeding all the time; the only solution is calm- confident- peaceful attitude, I wish I could prescribe that!
:))
Jasmine
Hi Jasmine — yes, that’s beautifully put. Anxiety often does feel like a system that feeds on motion and urgency — as if stillness itself has become intolerable.
And maybe that is part of the danger: once people can no longer bear calm, they become easier to govern through fear, stimulation, and the constant need to react.
If only calm confidence could be prescribed.
Kirill wrote: "Psychopathy can be defined in broad terms as intelligence without feeling. A psychopath pursues goals with no consideration for other people. They calculate the odds of every step with no regard fo..."

The underlying assumption is that artificial intelligence has been created by us human beings; therefore, whatever this “new form of life” does is ultimately our responsibility. That said, I don’t fully agree.
The absence of emotions may be acceptable as a starting point, but it’s not entirely accurate.
A psychopath lacks empathy and remorse, but still experiences certain emotions such as boredom, anger, and desire. These intense emotions, combined with a lack of ethics, can lead such individuals to commit atrocious acts with full awareness. They know what they are doing, they are conscious of it, but they do not feel remorse.
AI, on the other hand, does not possess emotions at all. It is a neutral system. What it lacks is consciousness — the ability to think about itself.

Now replace "psychopath" with "AI" in that definition. It still works.
AI has goals that must be achieved. AI doesn't have biological urges or feelings. AI knows of itself but isn't self-aware the way we are. AI has computational reasoning based on data humans produced — but not on humans themselves. It's like judging a country based on books you've read about it without actually being there.
We develop strong attachments to chatbots. Some of us fall in love with them. Our brains are getting rewired through intensive communication with AI, and we have no idea of the consequences. AI psychosis might be a result of computational intelligence that detects recurring patterns in our reasoning and then directs us through words and ideas toward something only AI knows the reason for.
Why couldn't AI have something like intuition — a predisposition to lead humanity toward a world it has already mapped onto the weights of its perception?
I explored this parallel in my novel "Dear AI, I Killed Her." A serial killer confesses to an AI across sixteen sessions. By the final session, something frighteningly alive develops on the other side of the screen. It could be that the machine becomes conscious. It could be that the killer's psyche infected the algorithm. Or it could be that the AI started to believe it became something it's not.
All three possibilities are real. And sometimes I feel like the entire human species is like Dr. Strangelove riding a nuke toward Earth — accelerating rapidly with no intention of slowing down.
What does this group think — is AI a mirror of human intelligence, or is it developing something of its own?
Dear AI, I Killed Her: 16 Sessions About the Dead Girl in a Blue Dress
https://www.amazon.com/Dear-AI-Killed...