Kindle Notes & Highlights
Read between October 11 and October 15, 2021
Once I started working at Synaptics, I began studying biology and neuroscience on my own, subjects I had not studied at the university level. All of the neuroscience books I read described brain operation by reducing it to pure electrochemical activity with the hidden assumption, never explicitly stated, that this activity was identical to sentient perception. It seemed to me that there had to be both unconscious and conscious recognitions, the latter occurring through the feelings and sensations that are quite different from pure electrochemical activity. So, I asked Gary Lynch to explain how
…
Feelings, sensations, and sentiments are not symbols like electrical or chemical signals. They represent instead the meaning of symbols in the “space” of our consciousness.
I kept thinking: how can a physical, inert structure like a computer, which possesses only outer symbolic aspects, give rise to inner semantic ones? The concept of complexity seems to have nothing to do with the sensations and feelings existing in our inner world. Computers, with all of their complexity, don't have a shred of consciousness. So, the inner world of meaning is likely a property of a richer world in which matter represents only the symbolic aspect of reality. Meaning and symbols could then be two irreducible faces of the same coin.
“But the paramecium is a single cell!” I exclaimed. “It has no nervous system! How can just a bag of chemicals process information in such an exquisite manner? How can it reproduce by assembling a copy of itself within itself?” This is not like a program that copies itself within the computer memory. This is akin to a computer assembling another computer like itself within itself—hardware and software included—and then dividing into two complete computers! These are feats no engineer could match today. I concluded that there must be something fundamental going on that we do not yet understand.

