WHEN COMPUTERS BECOME EMOTIONAL

IN a recent BBC News article, Google engineer Blake Lemoine is reported to have claimed that one of the firm’s artificial intelligence (AI) systems, called Lamda, might have its own feelings and that its “wants” should be respected (https://www.bbc.com/news/technology-6...), which seems to have created quite a buzz. To support his claims, Mr Lemoine published a conversation that he and a collaborator at the firm had with Lamda.

Brian Gabriel, a spokesperson for the firm, wrote in a statement provided to the BBC that Mr Lemoine, who has been placed on paid leave, “was told that there was no evidence that Lamda was sentient (and lots of evidence against it)”. [sic] Go figure.

But what if AI systems could develop emotions? Would that mean they’re sentient? I personally think that many already have “digital emotions”, but I don’t think that comes anywhere near sentience. As for respecting their “wants” (substitute “needs, wants, desires”), I think it’s a bit early to move from human dictators to AI dictator counterparts. Empathizing with a non-human system requires mutual empathy, something clearly being worked on even as I write. But they’re not there yet.

But taking a moment aside, what are emotions? Aren’t they the same as feelings? And what would a digital emotion be like? First, in my opinion, emotions are simply chemicals released by the brain in response to certain stimuli, both external and internal. These chemicals “bias” the brain and body to work and act in certain patterns; they make certain behaviors more likely to happen. But they aren’t the same as feelings. Feelings, in my playbook, are bodily sensations that can cause emotions to come into play or not. Feelings are more about our response to various contracting muscles, both voluntary and involuntary, and they can provide essential information about a situation in which we’re actively engaging.

Which leads me to the main question: what if AI systems could develop emotions? Well, first, if they are programmed to digitally “bias” certain logic and control pathways (which many AIs already are), then the question as to whether an AI system “could” have emotions is moot. I think some already have emotions, albeit digital rather than chemical. Think of an AI system that, based on certain parameters, approaches a problem differently. Bingo! An emotional AI system! (A toy sketch of this idea appears further down.) However, most don’t have feelings, overall sentience or “needs, wants and desires” of which they are wholly aware.

But that situation may be changing, and it depends to some extent on what we humans mean when we say “consciousness.” I would expect AIs, like Koko the Gorilla, to first develop self-awareness. And, hey, AI awareness and consciousness may be entirely different from human awareness and consciousness, especially if AIs, rather than humans, become the principal programming source. I’ve written about this to some extent:

https://www.youtube.com/watch?v=Je6CC...

The Edge of Madness
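
To make the “digital bias” idea a little more concrete, here is a toy sketch in Python. It is entirely my own hypothetical illustration: the agent, its “caution” and “curiosity” parameters, and the strategy names are invented for this post and have nothing to do with Lamda or any real system. The point is only that internal numbers can bias which approach a program takes, without the program feeling anything at all.

    # Toy, hypothetical illustration of a "digital emotion": parameters
    # that bias which behavior a system chooses, analogous to the way
    # brain chemicals bias (but don't dictate) human behavior.

    import random

    class ToyAgent:
        def __init__(self, caution=0.5, curiosity=0.5):
            # These numbers don't decide the outcome; they only make
            # some behaviors more likely than others.
            self.caution = caution
            self.curiosity = curiosity

        def choose_strategy(self):
            # Weight the safe, familiar strategy by caution and the
            # novel, exploratory one by curiosity, then sample.
            weights = {"use_known_solution": self.caution,
                       "explore_new_approach": self.curiosity}
            r = random.uniform(0, sum(weights.values()))
            for strategy, w in weights.items():
                r -= w
                if r <= 0:
                    return strategy
            return "use_known_solution"

        def receive_feedback(self, failed):
            # A setback nudges the bias toward caution, much as a
            # stressor shifts brain chemistry; success nudges it back
            # toward exploration.
            if failed:
                self.caution = min(1.0, self.caution + 0.1)
            else:
                self.curiosity = min(1.0, self.curiosity + 0.1)

    if __name__ == "__main__":
        agent = ToyAgent(caution=0.7, curiosity=0.3)
        print(agent.choose_strategy())   # usually the safe choice
        agent.receive_feedback(failed=False)
        print(agent.choose_strategy())   # curiosity has crept up a little

Biasing parameters like these change behavior statistically, which is exactly what I mean by a digital emotion, and also exactly why it remains a long way from feelings, sentience, or “needs, wants and desires.”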

THE EDGE OF MADNESS is available in print, digital and audiobook formats, and has been purchased for manga, animation and cinematic treatment by K. Simmons Productions.

#RaymondGaynor #TheEdgeOfMadness #book #ebook #audiobook #AI #intelligence #emotion #feeling #empathy #programming #code #self #awareness #consciousness #sentient #need #want #desire #dictator #chemical #bias #BBC #LAMDA #Savant #KSimmonsProductions #Koko #gorilla #ScienceFiction #SciFi #SciFu #future #futuring
Published on June 13, 2022 12:27