HISTORY130 - Augmenting our Senses - Past and Future
I have previously written about
human senses, delving into the science of how they work, and comparing them to
animal senses. See https://bobringreflections.blogspot.com/2024/02/science18-human-and-animal-senses.html.
Sometimes our five basic senses - sight, hearing, smell, taste, and touch - don't work as well as they should,
e.g., poor eyesight or hearing. After a
short introduction, I will explore the history of corrective measures to bring
our senses up to snuff, and what we can expect in the future.
As usual, I will list my
principal sources at the end.
Introduction
The five basic human senses are our primary ways of perceiving the world. Specialized organs (eyes, ears, nose, tongue, skin) collect environmental information and send signals to the brain, shaping how we understand and interact with our surroundings - crucial for survival, learning, and daily function:
· Sight: Detected by the eyes, this sense allows us to see light, shapes, colors, and motion, helping us navigate and identify objects.
· Hearing: Our ears detect sound waves, enabling us to hear music, voices, and warnings, crucial for communication and awareness.
· Smell: The nose identifies airborne chemicals, letting us smell scents from a distance, which also influences our sense of taste.
· Taste: The tongue detects chemicals in food, differentiating between sweet, sour, salty, bitter, and umami (a pleasant, rich, savory taste found in foods like aged cheese, mushrooms, and soy sauce). Taste is heavily linked with smell.
· Touch: The skin perceives pressure, texture, temperature, and pain, providing vital information about our body's boundaries and surroundings.
Each sense organ contains specialized
cells that convert physical stimuli (like light or sound) into electrical
signals. These signals travel through
nerves to the brain, which processes and interprets them, creating our unified
perception of the world.
While these five senses are fundamental, humans have other senses as well, including balance, body position, and pain.
These five basic senses are essential
"gatekeepers" that gather information, allowing us (historically) to
find food, avoid danger, learn, reproduce, and effectively interact with our
environment.
In the past, when one or more of our senses was deficient in some respect, our ancestors applied first their ingenuity, and later their science, to augment the deficient sense and improve its performance.
Losing one sense can cause the brain
to rewire itself, reallocating resources to enhance remaining senses,
particularly if the loss occurs early in life or if the individual trains to
use other senses more intensely, e.g., blind individuals often having a keener
sense of hearing or smell.
Sight
Over 8 million Americans experience some form of vision impairment or
loss, with around 1 million being completely blind. Vision loss can result from common
age-related conditions like cataracts, a clouding of the eye's lens that causes blurry or foggy vision and is the leading cause of blindness worldwide; macular degeneration, which destroys sharp, central vision by damaging the center of the retina; glaucoma, a group of diseases that damage the optic nerve, often due to high pressure inside the eye; and diabetic retinopathy, a complication of diabetes in which high blood sugar damages blood vessels in the retina. Vision impairment can also be caused by refractive errors, including nearsightedness, farsightedness, astigmatism, and presbyopia (a decline in focusing ability), as well as by sudden medical emergencies.
The history of vision
improvement spans from ancient magnifying aids, to eyeglasses, to contact
lenses, to surgical options.
In the 11th-13th centuries in Europe,
monks used “reading stones,” segments of glass or quartz spheres placed
directly on top of text to magnify the letters.
The first eyeglasses appeared in Italy
around 1285, simple convex lenses (for farsightedness) set into frames of
bone, wood, or leather that clamped onto the nose, helping medieval monks and scholars
read. Concave lenses were developed in
the early 1400s to correct nearsightedness, making objects appear smaller but
clearer. Frames evolved from simple
nose-pinchers to incorporating arms (temples) that rested over the ears,
resembling modern designs by the 16th century. In 1784, Benjamin Franklin invented bifocals
by combining distance and reading lenses in one frame. Cylindrical lenses
for astigmatism were developed in the 1820s.
Polarized sunglasses were introduced in 1936 to reduce glare. Lightweight and durable plastic lenses
emerged in the 1980s, replacing heavier glass.
Eyeglasses transformed from mere medical devices into fashion
accessories, with modern advancements including progressive lenses, coatings,
and advanced materials.

The earliest pictorial evidence for the use of eyeglasses is Tommaso da Modena's 1352 portrait of Cardinal Hugh of Provence reading in a scriptorium.
Diagnosis of vision problems also progressed. The ophthalmoscope, invented in 1851, allowed doctors to see inside the eye to identify vision issues. Snellen charts (see figure
below), invented in 1862, standardized visual acuity measurement. The
establishment of the National Eye Institute (1968) spurred research, leading to
breakthroughs and expanding the scope of optometric practice.

The Snellen chart standardized visual acuity measurement.
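For readers who like numbers: a Snellen fraction such as 20/40 converts directly to the logMAR scale used in modern vision research. Here is a minimal Python sketch of that conversion (the function name is my own):

```python
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """Convert a Snellen fraction (e.g., 20/40) to the logMAR scale:
    logMAR = log10(denominator / numerator). 20/20 maps to 0.0;
    larger values mean worse acuity."""
    return math.log10(denominator / numerator)

for denom in (20, 40, 80, 200):
    print(f"20/{denom} -> logMAR {snellen_to_logmar(20, denom):.2f}")
```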
The first contact lenses, large and rigid, were fitted in 1888, but it wasn't until much later that plastic contact lenses (1938) and silicone hydrogel lenses (2002) made contacts more comfortable.
In the late 20th century, vision correction
surgeries like LASIK allowed reshaping of the cornea to fix nearsightedness,
farsightedness, and astigmatism, while Refractive Lens Exchange surgery allowed
replacement of the eye's natural lens for significant vision
changes. Cataract surgery replaced
cloudy eye lenses and often resulted in sight improvement. These revolutionary procedures offered
a surgical alternative to glasses and contacts, allowing for permanent vision
correction.
The future of vision correction involves personalized
laser surgeries enhanced by artificial intelligence for precision, smart
contact lenses monitoring health (glucose, pressure), novel implantable lenses,
and gene therapies to correct genetic mutations causing sight issues, alongside
emerging technology like robotic surgery.
Also under development is a retinal prosthesis, a
"bionic eye" that restores some vision to people with severe
blindness by bypassing damaged photoreceptors using a camera, processor, and an
implanted electrode array that stimulates the retina with electrical pulses,
allowing users to perceive light and shapes.
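To make the idea concrete, here is a toy Python sketch of the kind of processing such a device performs - downsampling a camera frame to a coarse grid of electrode intensities. The grid size and intensity levels are illustrative, not the specifications of any real implant:

```python
import numpy as np

def image_to_electrode_levels(image: np.ndarray, grid=(6, 10), levels=8) -> np.ndarray:
    """Downsample a grayscale camera frame (2-D array, values 0-255)
    to a coarse grid of stimulation intensities, one per electrode.
    Grid size and intensity levels are illustrative only."""
    rows, cols = grid
    h, w = image.shape
    out = np.zeros(grid, dtype=int)
    for r in range(rows):
        for c in range(cols):
            patch = image[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols]
            # Average brightness of the patch, quantized to a few levels
            out[r, c] = int(patch.mean() / 256 * levels)
    return out

frame = np.random.randint(0, 256, (480, 640))  # stand-in camera frame
print(image_to_electrode_levels(frame))
```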
The goal is safer, faster, more customized, and
even permanent vision solutions.
Hearing
Over 50 million Americans experience some
degree of hearing loss, affecting about 1 in 7 people, with rates increasing
significantly with age. Hearing loss is
primarily caused by aging, exposure to loud noises, infections,
earwax blockage, injury, or genetic factors.
The history of hearing
improvement began with ear trumpets, then evolved to bone conduction
devices, bulky electronic hearing aids, miniaturization and digital processing,
Bluetooth connectivity, and medical breakthroughs like cochlear implants for
severe hearing loss.
Starting in the 17th
century, conical, passive devices (ear trumpets) made from animal horns or
metal were the primary hearing aids, collecting and funneling sound waves to
the eardrum. Collapsible trumpets (late
1600s) and headband-attached versions (1800s) made them more portable and
discreet.

The first hearing aids were ear trumpets, funneling sound waves to the eardrum.
By the late 1800s, the first commercial bone
conduction devices emerged, using teeth or the mastoid bone to transmit sound,
helping the deaf hear music and speech.
In the late 19th century,
the invention of the telephone and carbon transmitters paved the way for the
first electric (though bulky) hearing aids, which could transmit and amplify
sound as an electrical signal. By the
1920s, vacuum tubes were used to provide significant sound amplification. In the 1940s-1950s, transistors replaced
vacuum tubes, making hearing aids smaller and lighter - pocket-sized. In the mid-20th century, in-the-ear and behind-the-ear hearing aids were developed.

The world’s first commercial vacuum-tube hearing aid with a single earphone on a headband - large, fragile, and not easily transported.
The beginning of the digital age
(1980s-1990s), with microprocessors, enabled digital hearing aids, allowing
better sound quality, noise reduction, and programmability tailored to a user's
specific hearing loss.
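To illustrate the core idea of programmability, here is a simplified Python sketch of frequency-dependent amplification - boosting the bands where a hypothetical user has hearing loss. The gain values are illustrative, not a clinical fitting formula:

```python
import numpy as np

def apply_band_gains(signal: np.ndarray, rate: int, gains_db: dict) -> np.ndarray:
    """Apply frequency-dependent gain to a mono signal via the FFT.
    gains_db maps (low_hz, high_hz) band edges to gain in dB, so a user
    whose audiogram shows high-frequency loss gets more boost up top."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / rate)
    gain = np.ones_like(freqs)
    for (lo, hi), db in gains_db.items():
        gain[(freqs >= lo) & (freqs < hi)] = 10 ** (db / 20)  # dB -> linear
    return np.fft.irfft(spectrum * gain, len(signal))

rate = 16000
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 500 * t) + 0.2 * np.sin(2 * np.pi * 4000 * t)
# Boost 1-8 kHz by 20 dB for a hypothetical high-frequency loss:
louder = apply_band_gains(tone, rate, {(1000, 8000): 20})
```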
In the 2000s, miniaturization led to
receiver-in-ear canal hearing aids, Bluetooth connectivity for streaming, and
artificial-intelligence (AI)-powered features to analyze environments in
real-time, suppress background noise, focus on speech, and even offer language
translation.
In the late 20th century,
surgically implanted electronic devices (cochlear implants) emerged to provide
a sense of sound to people with severe to profound hearing loss by bypassing
damaged parts of the ear and directly stimulating the auditory nerve, allowing the
brain to interpret these electrical signals as sound, with benefits including
improved speech understanding and music appreciation.
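Conceptually, a cochlear implant's processor splits incoming sound into frequency bands and uses each band's energy envelope to drive current pulses on the corresponding electrode. The following rough Python sketch computes such envelopes; all parameters are illustrative:

```python
import numpy as np

def band_envelopes(signal: np.ndarray, rate: int, n_bands: int = 8,
                   frame: int = 256) -> np.ndarray:
    """Split a sound into frequency bands and track each band's energy
    over time. In a real implant, each envelope would modulate current
    pulses on one electrode; here we just return the envelope matrix
    (n_bands x n_frames)."""
    edges = np.logspace(np.log10(200), np.log10(7000), n_bands + 1)
    freqs = np.fft.rfftfreq(frame, 1 / rate)
    n_frames = len(signal) // frame
    env = np.zeros((n_bands, n_frames))
    for i in range(n_frames):
        mag = np.abs(np.fft.rfft(signal[i * frame:(i + 1) * frame]))
        for b in range(n_bands):
            in_band = (freqs >= edges[b]) & (freqs < edges[b + 1])
            env[b, i] = mag[in_band].mean() if in_band.any() else 0.0
    return env

rate = 16000
t = np.arange(rate) / rate
print(band_envelopes(np.sin(2 * np.pi * 1000 * t), rate).shape)  # (8, 62)
```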
The future of hearing aids includes smarter, smaller devices using AI for personalized sound; seamless Bluetooth
connectivity for direct streaming from TVs, smartphones, and smart home systems;
and incorporation of biometric sensors to track heart rate, physical activity,
sleep, and stress - becoming comprehensive wellness trackers.
In essence, hearing aids are evolving
into powerful personal devices that improve hearing, monitor health, and
connect us to the digital world, moving beyond the traditional role as simple
hearing amplifiers.
Finally, research is advancing toward cell
therapy treatments that could potentially reverse deafness, not just manage
hearing loss.
Smell and Taste
Approximately 20% of
American adults (about 1 in 5) are affected by some form
of smell or taste disorder. Loss of taste and smell is commonly
caused by viral infections (like colds, flu, COVID-19),
nasal/sinus issues (polyps, sinusitis, allergies), head injury, aging, certain
medications, smoking, and dry mouth, but can also signal neurological
conditions, tumors, or chemical exposure, often due to inflammation blocking
scent signals or damaging nerve pathways.
Smell and taste are deeply linked,
with smell providing most of what we perceive as "flavor" (the
complex experience of food) while taste detects basic sensations (sweet, sour,
salty, bitter, umami) on the tongue, but odor molecules traveling from the
mouth to the nose during chewing create the rich, nuanced experience of food,
which is why food seems bland with a stuffy nose.
Our tongue's taste buds pick up basic
tastes from chemicals dissolved in saliva. As we chew, volatile aroma compounds
from food travel up the back of our throat to receptors in our nasal
cavity. Our brain merges these basic
tastes with the complex smells from our nose, creating the perception of
flavor.
What we call taste is mostly flavor, a
combination of smell, taste, texture, and temperature. When our nose is blocked, we lose most of
this olfactory input, and food becomes bland, highlighting that smell contributes
roughly 80% to flavor perception. Smell also
strongly connects to memory and emotion, influencing food enjoyment and
decisions.
Smell. Ancient
civilizations viewed smell as a gateway to physical and spiritual health, and
employed intensive therapies to clear nasal "blockages." In c. 3000 BC, in ancient India, medical oils
were administered through the nasal passage to treat loss of sensation.
Other therapies included herbal smoking and forced vomiting to purge
excess phlegm believed to hinder smell.
In c. 200 BC, traditional Chinese Medicine (TCM) used acupuncture and
heat therapy at specific points on the face for nasal/sinus relief and opening the
face/mind. TCM practitioners also used nasal irrigation
and "stuffing" herbs into the nostrils to clear obstructions.
In the 2nd century AD, Greek physician Claudius Galen linked the five senses to the brain, and treatments to
improve smell focused on balancing basic body substances (humors) through diet
and purging to ensure the "vital spirit" could reach the brain.
As modern anatomy emerged, in the
1890s-1990s, the focus on smell moved toward understanding and eventually
replicating the olfactory system. In 1954, microelectrodes were first used to measure responses to aromas, followed in 1982 by the first "intelligent" artificial nose
using a sensor array to identify up to 20 distinct odorants. In 1991 American neuroscientists Linda Buck
and Richard Axel identified the olfactory receptor gene family, which provided
the roadmap for modern smell restorative research.

American neuroscientists Linda Buck and Richard Axel received the Nobel Prize for their work on smell research.
The 21st century introduced
methods to physically retrain or regrow the olfactory system. Retraining involves sniffing at least four
distinct scents - typically rose, lemon, clove, and eucalyptus - twice daily
for 15-30 seconds each. Studies show
this can significantly improve the sense of smell over 3-14 months. In 2012, scientists restored the sense
of smell in mice using gene therapy to regrow smell-detecting nasal
hairs. In 2019, researchers successfully
used intranasal stem cell droplets to replace damaged neurons in mice,
restoring their ability to detect unpleasant odors.
Today, improving a problematic sense
of smell involves a combination of established therapeutic practices, emerging
medical treatments, and innovative "smell-aid" technologies. For smell loss caused by inflammation or
nasal polyps, oral or topical steroids remain a primary treatment. Active vitamin D delivered via nasal spray
may effectively treat inflammation-related smell loss where oral supplements
failed. Omega-3 supplements can be
effective in supporting olfactory recovery.
Prototype devices now exist that use an "electronic nose" (e-nose) to capture odors and translate them into electrical pulses. These pulses are delivered to the wearer’s nose, allowing the brain to "feel" and eventually identify different scents through touch-like sensations. Recent 2025 studies have successfully used non-invasive radiofrequency stimulation to target olfactory nerves in the brain, improving scent detection for over a week after a single treatment.
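At its core, an e-nose applies pattern recognition to readings from a chemical sensor array. Here is a deliberately tiny Python illustration using made-up data and a nearest-centroid match; real systems use far larger arrays and far more sophisticated models:

```python
import numpy as np

# Hypothetical training data: each row is one sniff from a 4-sensor
# e-nose array; values and odor classes are invented for illustration.
training = {
    "coffee": np.array([[0.9, 0.2, 0.4, 0.1], [0.8, 0.3, 0.5, 0.1]]),
    "lemon":  np.array([[0.1, 0.8, 0.2, 0.6], [0.2, 0.9, 0.3, 0.5]]),
}
centroids = {name: rows.mean(axis=0) for name, rows in training.items()}

def classify(reading: np.ndarray) -> str:
    """Nearest-centroid match: the simplest version of the pattern
    recognition an e-nose applies to its sensor-array readings."""
    return min(centroids, key=lambda n: np.linalg.norm(reading - centroids[n]))

print(classify(np.array([0.85, 0.25, 0.45, 0.1])))  # -> "coffee"
```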
Research into olfactory implants and gene therapy is underway to help those with smell loss, with potential for future bionic noses.
Taste. The history of trying to restore/improve the human sense of taste parallels that of smell through the 1900s - because of the close link between smell and taste.
From the
1900s-1990s, researchers focused on identifying the biological limits of taste
and the genetic differences between individuals. In 1908, Japanese chemist Kikunae Ikeda
identified glutamate as the source of the fifth taste,
"umami," and later developed Monosodium Glutamate as the first mass-produced
flavor enhancer. In 1991, American psychologist and pioneering taste and smell
researcher, Linda Bartoshuk, identified "supertasters,"
individuals with a higher density of taste buds, sparking research into
how to modulate flavor for those with lower sensitivities.

American psychologist and pioneering taste and smell researcher, Linda Bartoshuk.
Since 2000, taste improvement technology
has focused on "digital seasoning" and biological regeneration of
taste receptors.
In 2014, Maine researcher Nimesha Ranasinghe developed an interactive, electronically enhanced set of eating utensils
using light and electrical pulses to digitally simulate sour, salty, and bitter
sensations. In 2024-2025, there was commercial
release of devices that use weak electrical currents to enhance the perceived
saltiness of food, allowing people on low-sodium diets to "taste"
salt without consuming it. In 2025, researchers at Ohio State University
developed a wearable oral device that uses gel-filled chambers and tiny pumps
to deliver flavor profiles directly to the tongue, intended for use in Virtual
Reality or for patients with taste disorders.
Active clinical trials in late
2025 explored using human taste bud-derived stem cells to regrow damaged taste receptors
in patients who lost their sense of taste due to chemotherapy or neurological
conditions. Today, AI-powered
specialized devices mimic human neural pathways to help patients with
neurological damage "feel" tastes again by converting chemical data
into electrical signals that the brain can process.
Practical taste improvement steps we can take today include mindful eating (chewing slowly, mentally connecting aromas to tastes, serving hot and cold foods together, trying different textures and colors, and keeping a taste journal to identify and recall specific flavors and smells), boosting flavors with herbs/spices, improving oral
hygiene, reducing processed foods, quitting smoking, and
exploring new foods, all while paying attention to smell, as it heavily
influences flavor. If problems persist,
consult a doctor to rule out underlying issues like infections.
The future of taste improvement
centers on enhancing flavor perception through neuro-technological devices,
AI-driven nutritional design, and pharmacological interventions, aiming to
heighten sensory experiences and aid medical diagnostics. Key developments include electric taste
stimulation, personalized nutrition, and enhanced culinary experiences.
Touch
While specific, comprehensive data on "touch disorders" is limited, surveys indicate that 38% of older adults report a fair sense of touch and 32% a poor one. Loss
of the sense of touch is primarily caused by peripheral nerve damage,
spinal cord issues, or brain injuries that disrupt neural pathways. Common causes include diabetes-related
neuropathy, physical trauma (crushing, cutting, or burning nerves), infections,
stroke, or compression, such as carpal tunnel syndrome.
Frostbite, chemical burns, or even migraines with aura can
cause temporary or permanent loss of touch.
Historically, efforts to restore or increase the human sense of touch have progressed from philosophical theories and social practices to advanced neurological and technological interventions.
In ancient Greece, in the 4th century BC, Aristotle identified touch as one of the five primary senses, arguing it was the fundamental sense distinguishing animals from plants, and humans from other animals. Shaking hands began in the 5th century BC to signal peace and trust, using touch to confirm the absence of weapons.
In the 19th century, anatomists used microscopes to identify touch neurons (sense receptors located throughout the body), and early sensory history noted people's desire to touch artifacts for learning.
In the early 20th century, scientists began recording electrical impulses from single nerve fibers, cataloging responses to different stimuli (pressure, temperature) and revealing diverse touch receptors.
American psychologist Harry Harlow
conducted a series of landmark experiments at the University of Wisconsin-Madison from
the 1950s through the 1970s, using rhesus monkeys to challenge prevailing
behaviorist theories which claimed infants only bonded with mothers because
they provided food. Harlow's experiments showed that touch (contact comfort) was crucial for development, contrasting with earlier behaviorist ideas discouraging physical affection.
In the 1970s, researchers
like American neuroscientist Paul Bach-y-Rita developed tactile substitution
systems to help the blind "see" through tactile feedback,
stimulating the skin (often on the back or tongue) to convey visual information
to the brain. His core philosophy was that "we see with the brain, not
the eyes," suggesting that the brain can reorganize itself to process
information from an alternative sensory source if the primary one is damaged.

American neuroscientist Paul Bach-y-Rita developed tactile substitution systems to help the blind "see" through tactile feedback.
Since the 1990s, there has been an explosion
in touch research, focusing on multisensory integration, haptics (simulating
touch through vibrations, motions, or forces), and the brain's representation of
touch.
Molecular genetics enabled precise
mapping of touch neurons. It was discovered that electrodes stimulating touch-related brain areas can evoke touch sensations (tingling), even after the peripheral sense has been lost.
Researchers have developed ways to create artificial touch, giving
prosthetic users sensations of pressure and texture through nerve
connections. Wearable devices now
simulate the "feel" of virtual objects, allowing users to experience
resistance, temperature, and texture. Current
research focuses on engineering tactile sensors for prostheses and surgical
robots to give them human-level competence in physical exploration.
The future of restoring and increasing the sense of touch is rapidly advancing through Brain-Computer Interfaces (BCIs), nerve sensors implanted in the body, and artificial skin that can incorporate sensors - enabling neurologically impaired and paralyzed individuals and amputees to feel sensations like pressure, texture, and temperature. BCIs are systems that establish a direct communication pathway between brain electrical activity and external devices, allowing users to control computers or robotic limbs with their thoughts. The approach is to convert touch into electrical signals using self-powered sensors and then stimulate the brain's sensory cortex, with the goal of creating intuitive, natural, and permanent sensory feedback.
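As a toy illustration of the sensor-to-stimulator mapping described above, the following Python sketch converts a pressure reading into hypothetical stimulation-pulse parameters. Every number here is invented for illustration, not drawn from any real device:

```python
def pressure_to_pulse(pressure_kpa: float,
                      max_pressure: float = 50.0,
                      min_ua: float = 10.0,
                      max_ua: float = 80.0) -> dict:
    """Map a fingertip pressure reading to stimulation-pulse parameters.
    A toy version of the sensor-to-stimulator mapping; all values
    are illustrative, not from any real implant."""
    level = max(0.0, min(pressure_kpa / max_pressure, 1.0))  # normalize 0-1
    return {
        "amplitude_ua": min_ua + level * (max_ua - min_ua),  # pulse current
        "frequency_hz": 50 + level * 250,                    # pulse rate
    }

for p in (0, 10, 25, 50):
    print(p, pressure_to_pulse(p))
```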
Future systems may use AI to compute the best stimulation patterns, adapting to the user's brain activity for more natural, nuanced, and realistic sensations.
Conclusions
Based on recent research and
technological advancements, efforts to restore or improve the five senses
(vision, hearing, smell, taste, touch) are transitioning from simple corrective
aids to sophisticated, high-tech solutions aimed at regenerating sensory function.
Vision advancements include retinal implants and AI-powered glasses that
can read text or identify faces for the blind. Hearing devices are becoming
"intelligent" systems, with cochlear implants allowing the deaf to
hear by bypassing damaged ears. Efforts
to restore or improve the senses of smell and taste have reached a
significant inflection point, transitioning from basic rehabilitation to
high-tech regenerative and electronic interventions. Emerging neuro-prosthetics (like
"electronic skin") and wearable sensors are successfully simulating
the sense of touch by translating pressure into nerve signals.
While millions still suffer from
sensory impairment, advancements in gene therapy and regenerative medicine are
beginning to offer hope for repairing sensory cell loss, rather than just
masking the symptoms.
"When all is said and done, we exist only in relation to the world, and our senses evolved as scouts who bridge that divide and provide volumes of information, warnings, and rewards." - Diane Ackerman
Sources
My principal sources include: "From Reading Stones to Smart Lenses: The Journey of Vision Correction," uoosd.com; "A brief history of hearing aids," fallsofsound.com; "VSP Vision Explores the Future of Senses in New Report Spotlighting the Innovations Redefining How We See, Hear, Smell, Taste, and Touch," vspvision.com; "The Future of Making Sense of the World," medicalfuturist.com; plus numerous other online sources, including answers to many queries using Google in AI-Mode.

