The Sensation Gap
What will it be like to perceive the world through artificial means?
Emerging from the sliding doors, the world explodes into technicolor sound. It’s almost too much.
Minute movements of wind vibrating the leaves.
A harmony of robotic birds each occupying a different frequency.
Every crisp consonant uttered in passing conversations. Almost too crisp, like they’ve passed through an amplifier.
So many sounds that had been absent when I entered the hospital.
So many sounds that they bring me to tears.
Pulling myself together, I notice the tinny sound of traffic, hop on my bike, put my route into my phone and marvel as the little beans behind my ears tell me exactly where to go.
Hearing aids have come a long way in a very short space of time. The ear trumpets of years gone by have been replaced with tiny pieces of complex kit that integrate with your phone, giving you a fully customised piece of augmented reality. Just a few short years ago, wearers would grumble at the inability of aids to pick out speech from a sea of city noise. In its ability to filter unwanted sensation and focus exclusively on the salient parts, the brain is unparalleled. But AI integration and deep learning models are nibbling away at the gap between “real” and “artificial” sensation. Instead of distinct programmes clunkily tailored to one of five preset situations, AI hearing aids learn about the environments their wearers inhabit and predict the optimum settings, given what you usually do and the speech patterns of the people you normally spend time with.
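To make that concrete, here is a toy sketch of the scene-classification idea in Python. Everything in it is an assumption for illustration: the presets, the two crude acoustic features and the hand-placed environment profiles all stand in for what a real aid would learn from hours of a wearer’s everyday audio with a trained model.

```python
import numpy as np

# Toy sketch of scene-adaptive hearing aid settings. The presets, features
# and profiles below are invented for illustration; a real aid would learn
# them from the wearer's everyday audio with a trained model.

PRESETS = {
    "quiet":   {"gain_db": 10, "noise_reduction": 0.1, "directionality": 0.0},
    "speech":  {"gain_db": 15, "noise_reduction": 0.3, "directionality": 0.7},
    "traffic": {"gain_db": 5,  "noise_reduction": 0.8, "directionality": 0.9},
}

def features(frame: np.ndarray, sample_rate: int = 16_000) -> np.ndarray:
    """Two crude descriptors of a mono audio frame: loudness and brightness."""
    rms = np.sqrt(np.mean(frame ** 2))                       # overall energy
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1 / sample_rate)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
    return np.array([rms, centroid / 1000.0])                # centroid in kHz

# Hand-placed stand-ins for environment profiles a model would learn.
PROFILES = {
    "quiet":   np.array([0.01, 0.5]),
    "speech":  np.array([0.10, 1.5]),
    "traffic": np.array([0.30, 4.0]),
}

def choose_settings(frame: np.ndarray) -> dict:
    """Match the current soundscape to the nearest known scene."""
    f = features(frame)
    scene = min(PROFILES, key=lambda s: np.linalg.norm(f - PROFILES[s]))
    return PRESETS[scene]

rng = np.random.default_rng(0)
noisy_street = 0.3 * rng.standard_normal(1024)   # loud, broadband noise
print(choose_settings(noisy_street))             # -> the "traffic" preset
```

The shape of the loop is what matters: summarise the current soundscape, match it to a familiar environment, apply that environment’s settings. The learning described above would replace the hand-placed profiles with ones inferred from the wearer’s actual days.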
But as one gap closes, another opens: abilities that go beyond natural human capacities are starting to be built into sensory aids. Think Babel-fish translation hearing aids, artificial eyes that see in the infrared range, fake skin that can withstand far more extreme wear and tear. The future is integrated, and those with a sense loss are the proving ground for trialling sensory enhancement. The first legally recognised cyborg is a man with a rare form of colour blindness called achromatopsia; essentially, he sees only in greyscale. But with the help of a surgically implanted antenna, which detects different wavelengths of light and relays the information through bone-conduction vibrations, he is able to hear colour.
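At its core, the antenna performs a mapping from light to pitch. The sketch below shows one plausible version, a log-spaced mapping from the visible spectrum to an arbitrary audible band; the real device uses its own calibrated scale, so treat the ranges and formula here purely as illustrative assumptions.

```python
# Toy sketch of a wavelength-to-pitch mapping. The real antenna's scale is
# its own; the ranges and log spacing here are assumptions chosen only to
# illustrate the idea of hearing colour.

VISIBLE = (380.0, 750.0)   # visible light, in nanometres
AUDIBLE = (200.0, 800.0)   # an arbitrary comfortable band, in hertz

def colour_to_pitch(wavelength_nm: float) -> float:
    """Map a visible wavelength onto an audible frequency, log-spaced like pitch."""
    lo, hi = VISIBLE
    t = (wavelength_nm - lo) / (hi - lo)     # 0..1 across the spectrum
    f_lo, f_hi = AUDIBLE
    return f_lo * (f_hi / f_lo) ** t

for name, nm in [("blue", 470), ("green", 530), ("red", 700)]:
    print(f"{name}: {colour_to_pitch(nm):.0f} Hz")
```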
As fantastical as this sounds, the theory isn’t actually new. So I want you to try something: place your thumb at the corner of your mouth, your three middle fingers across the jawline and your pinky finger on your throat. Now read the rest of this paragraph out loud. What do you feel? Concentrate on every detail your fingers are able to pick up: small movements of the mouth, subtle vibrations in the skin. This is the Tadoma technique, one of the earliest examples of sensory substitution, a method used to communicate and teach language to people who are deaf and blind. By recognising the patterns of mouth movement and vibration associated with different words and sentences, practitioners begin to understand the haptic information as language.
This technique was developed in the 1930s and shows how astonishingly well our brains can circumvent the loss of one or more senses, with a little help. Now this help has expanded, offering people with sensory disabilities substitutions and enhancements that were once found only in the minds of fiction writers.
One of the frontrunners in this race to expand the sensory worlds of those with a diminished sense is writer and neuroscientist David Eagleman. According to Eagleman, the trick with sensory replacement is to capitalise on one of the remaining senses (it seems not to matter which, although touch is often the default, being the cheapest and easiest to stimulate). Take taste as an example: with a special device worn in the mouth, it is possible to stimulate the taste buds electrically with very high precision. If the wearer is visually impaired, scientists pair known items with particular patterns of taste-bud stimulation. By the end of three days of training, our wearer can link particular tastes to particular items. But by the end of six months, something incredible starts to happen: the wearer begins to perceive the taste patterns as visual information. As unlikely as this sounds, it stems from the brain’s ability to recruit computational power wherever it sits unused. In this case it turns to the relatively dormant visual areas of the brain, enlisting them in an adapted ability to sense spatial objects through taste.
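The encoding half of such a device can be surprisingly simple; it is the brain that does the remarkable part. Below is a hedged sketch of that first step under invented assumptions: a camera image is pooled down to a small electrode grid, and brightness becomes stimulation intensity. The grid size and intensity range are placeholders, not any real product’s specification.

```python
import numpy as np

# Toy sketch of the encoding step in vision-to-mouth substitution: pool a
# camera image down to a small electrode grid and turn brightness into
# stimulation intensity. The grid size and intensity range are invented;
# real mouth-worn displays use their own calibrated encodings.

GRID = (20, 20)          # hypothetical electrode array in the mouthpiece
MAX_INTENSITY = 255      # hypothetical per-electrode stimulation level

def encode(image: np.ndarray) -> np.ndarray:
    """Map a grayscale image (values 0..1) onto electrode intensities."""
    rows = np.array_split(image, GRID[0], axis=0)
    cells = [np.array_split(r, GRID[1], axis=1) for r in rows]
    pooled = np.array([[cell.mean() for cell in row] for row in cells])
    return (pooled * MAX_INTENSITY).astype(np.uint8)

# A bright object against a dark scene becomes a cluster of strong pulses.
scene = np.zeros((240, 240))
scene[80:160, 80:160] = 1.0            # "object" in the centre of view
pattern = encode(scene)
print(pattern[8:12, 8:12])             # strongest stimulation mid-grid
```

Swap the camera for an infrared sensor or a magnetometer and the same pipeline feeds the new horizons described next.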
This doesn’t just work for information to which most humans have ready access. It can be used to open up entirely new sensory horizons: hearing infrared to detect the temperature of objects at a distance, smelling emotions for those with autism, feeling magnetic fields through the skin.
Other scientists take a different approach, building aids that let people with sensory disabilities enhance what they already use. There’s a robotic hand in the making that translates spoken language into sign language for deaf people. And a lot of dogs are going to be out of a job when robotic guide dogs take on the role of assisting sight-impaired people.
These technologies inevitably throw up a lot of questions about what it means to be human. Where does the body end and the device begin? What does bodily autonomy mean when a corporation owns a bit of you, a bit that you rely on more the more you use it? These are questions fiction writers and filmmakers have been asking for a while. In some ways it is an age-old question: what are humans without their tools? But there are certainly new dimensions to it. Privacy concerns are definitely warranted, particularly when those tools have the potential to manipulate their users in new and unsettling ways. Indeed, companies are clamouring to invade consumers’ other senses, so immune have we become to visual and auditory advertising. One particularly alarming article I found touted the huge marketing opportunities afforded by a switch to haptic advertising.
And in the worst-case scenario, with fully privatised healthcare for example, the fantastical uses to which these pieces of kit can be put could price out the very people whose ability to exist in an able-bodied world they would most enhance.
We started this article with what it’s currently like to get a sensory aid. But delving into the realms of speculation, let’s imagine what it could be like in the future.
The batwoman crouches low in the eaves of a ruined house. She is feeling for bats. She can't say yet whether it will be a successful survey. Just that a colleague flagged this place up as a possible location. It is dark and dank and she is naked but for a thin skin suit covered in electrodes.
Nothing yet.
She thinks of her training over the past six months, as she gradually learnt the language of bats. The shape of different species on her skin. How young bats felt, how the colony’s movements tracked across her body. It was no longer just touch; she had somehow gained a sense, a perception of how a community worked. This vest far outshone the intrusive old surveying methods, which often disturbed the bats and gave shoddy estimates of the size and longevity of a population.
She is jolted back to the house as the tapping begins. Immersing herself in sensation, she can hear them, despite the fact that she hasn’t truly heard a thing since she was a child. She had refused a cochlear implant as an adult, not wanting to be alienated from her Deaf community. But when the vest came out, with all its promises of communing with animals, she wanted one.
The tapping intensifies, flitting around her torso as she pictures them swarming and hears wings beating frantically at the air. A good day at the office.