How do we hear?
We hear through our ears, right? It's not quite that simple; the brain has to be involved, too. The flappy piece of skin and cartilage on the outside (some people can wiggle theirs) is called the pinna or auricle. Many animals can localize sound by moving their pinnae. Ours are somewhat useful in helping us localize sound as well, but not as useful as a dog's.
As sound travels through the ear canal, it strikes the eardrum, which is a membrane. Attached to the other side of that membrane are the tiniest bones in the body, called the ossicles. You may have heard of the hammer, anvil and stirrup or, more properly, the malleus, incus and stapes. Together, these bones transmit sound to the oval window, which is in contact with the fluid filling the cochlea, our primary hearing organ.
From there it gets weird. Hearing is a temporal phenomenon, but we need a way to get all the frequencies of a sound up to the brain simultaneously, so the cochlea is a spatial organ designed to break a sound down into its individual frequency components. Each inner hair cell is best at picking up a certain frequency. Along the basilar membrane lining the cochlea, the highest frequency sounds register closest to the oval window, at the base, while low frequency sounds travel the farthest, all the way to the apex. That's why we lose our ability to hear high frequency sounds soonest: the base of the cochlea takes the most pounding.
Anyway, the cochlea and its inner hair cells take the sounds we hear and segregate them by their best frequencies. It's almost like we have a bunch of tiny microphones, each specializing in a certain frequency, and each fiber of the eighth cranial nerve carries those impulses to the cochlear nucleus. In essence, we convert analog sounds into a whole bunch of digital signals. This is why it will be so difficult to build a true prosthetic cochlea. Most cochlear implant designs have anywhere from 4 to 40 electrodes designed to stimulate the eighth nerve. To truly emulate what the cochlea does, you would need more like 20,000. That'll never happen.
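That place coding is easy to caricature in software. Here is a minimal sketch (the function name, the eight bands, and the log-spaced band edges are all my illustrative choices, not biology): treat each frequency band as one of those tiny microphones and measure how much energy a signal deposits in it.

```python
import numpy as np

def cochlea_filter_bank(signal, sample_rate, n_bands=8):
    """Crude stand-in for cochlear place coding: report the energy a
    signal deposits in each of n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Log-spaced band edges from 50 Hz up to the Nyquist frequency,
    # mimicking the roughly logarithmic map along the basilar membrane.
    edges = np.logspace(np.log10(50), np.log10(sample_rate / 2), n_bands + 1)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

# One second of a pure 1 kHz tone at a 16 kHz sampling rate.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1000 * t)
energies = cochlea_filter_bank(tone, sr)
```

Feed it a pure 1 kHz tone and only the band containing 1 kHz lights up, which is roughly what place coding buys the brain: who fired tells you what frequency arrived.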
The cochlear nucleus shapes and aggregates the sounds, then passes them along toward the inferior colliculus by way of the superior olivary complex, where a sense of sound location begins to form. From the inferior colliculus, the signals go on to the medial geniculate body and from there to the higher portion of the brain called the auditory cortex.
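One of the main cues the brainstem uses for localization is the interaural time difference: a sound off to one side reaches the nearer ear a fraction of a millisecond sooner. A toy sketch of that comparison (the cross-correlation approach and all the signal parameters here are my illustrative choices):

```python
import numpy as np

def estimate_itd(left, right, sample_rate):
    """Estimate the interaural time difference (in seconds) by
    cross-correlating the two ear signals; positive means the
    sound reached the left ear first."""
    corr = np.correlate(left, right, mode="full")
    lag = (len(right) - 1) - int(np.argmax(corr))
    return lag / sample_rate

# A decaying 2 kHz click that arrives at the right ear 10 samples late.
sr = 48000
t = np.arange(480) / sr
click = np.exp(-1000 * t) * np.sin(2 * np.pi * 2000 * t)
delay = 10
left = np.concatenate([click, np.zeros(delay)])
right = np.concatenate([np.zeros(delay), click])
itd = estimate_itd(left, right, sr)  # about 0.2 milliseconds
```

A 0.2 millisecond lead is well within the range the brainstem actually resolves, which gives a feel for how fine the timing machinery has to be.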
What am I getting at here? We don't just hear. Hearing is the end result of an incredibly complex sequence of structures and processing that allows our brains to take a temporally and spectrally diverse set of sounds and render them intelligible. I only bring this up because computers don't hear like we do. Siri doesn't answer questions using pinnae, ossicles, a cochlea and brain matter. Instead, she creates outlines of sounds and uses brute force to match those outlines against pre-recorded templates, converting sounds directly into words.
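The outline-and-template idea can be boiled down to a toy. This sketch is nowhere near what a real speech recognizer does; the function names, the coarse loudness "outline," and the made-up signals standing in for recorded words are all my illustrative inventions:

```python
import numpy as np

def outline(signal, n_slices=8):
    """Reduce a signal to a coarse 'outline': the RMS loudness in
    each of n_slices equal time slices."""
    slices = np.array_split(np.asarray(signal, dtype=float), n_slices)
    return np.array([np.sqrt(np.mean(s ** 2)) for s in slices])

def match_word(signal, templates):
    """Brute force: return the name of the pre-recorded template
    whose outline is closest to the signal's outline."""
    sig = outline(signal)
    return min(templates,
               key=lambda w: np.linalg.norm(outline(templates[w]) - sig))

# Hypothetical 'recordings': two words with different loudness shapes.
t = np.linspace(0, 1, 1000)
templates = {
    "hello": np.sin(40 * t) * t,         # loudness ramps up
    "stop":  np.sin(40 * t) * (1 - t),   # loudness ramps down
}
noisy = templates["stop"] + 0.05 * np.random.default_rng(0).normal(size=1000)
word = match_word(noisy, templates)
```

Even with noise added, the outline still matches the right template, and that is the whole trick: no cochlea, no cortex, just distances between outlines.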
Tomorrow, you will see that the visual system is actually a lot simpler, even though, as Americans, we rely on it as our primary input sense.

Published on August 10, 2016 05:52
Tales of the Vuduri
Tidbits and insights into the 35th century world of the Vuduri.