With University of Michigan research fellow Philip Vu looking on, Oldham uses his own nerves and muscles—plus signals from implanted electrodes—to move the thumb and fingers of a prosthetic hand. “I hope that somehow, because I’m allowed to participate in this, that might make it better for somebody in the future,” he says.
VIDEO BY LYNN JOHNSON
Anatomy taught us this, not culture; we humans are wrapped, as I once heard another scientist say, in “an incredibly complex sheet covered with sensors.” The skin, that is, our largest organ. Its layers contain hundreds of thousands of receptor cells, unevenly distributed around the body’s surface, specialized for various jobs. Some shoot the brain signals about temperature or the harmful disruption we perceive as pain. Some seem specialized to soothe; neuroscientist Francis McGlone is part of an international group of scientists studying receptors, densest in the hairy skin of the arms and back, that produce a pleasant feeling when the skin that contains them is brushed or stroked.
And some receptors send the brain the kind of informational detail that helps tell us, all day long, what we’re touching and doing and using. Mechanoreceptors, these are called; conveniently, evolutionarily, their density is especially high along the palmside skin of the fingertips and hand. They’re working for you—again, if you have the use of at least one hand—at this very moment. You’re reading this story on some sort of device, presumably: a computer or tablet or phone. Try closing your eyes and running a finger around the edges. Let your fingertip find metal, plastic, corners, ridges.
Done? OK. Just now, from your hand to your brain, so much was happening. The pressure against your fingertip pads, the distortion to your skin, the vibrations you didn’t notice as you slid your finger over surfaces—each of these tiny alterations to your own sensor-covered sheet was stimulating its mechanoreceptors. Four varieties of such touch receptors have been identified, each with a subspecialty of its own; your vibration-sensing mechanoreceptors, for example, were firing away as your fingertips moved across textures of machine and cloth. Nerves carry those signals from the skin up to the brain, which instantly sorts and labels: Smooth! Different kind of smooth! Denim! Corduroy!
Today’s most advanced experimental prosthetics are designed not only to move with precision but also to feel. This robotic arm, created by researchers at Sweden’s Chalmers University of Technology, is surgically attached to bone and interacts with nerves in the arm to relay touch to the brain.
THE BODY’S NATURAL SYSTEM
Muscles receive signals from the brain to interact with the physical world. In response, sensors in human skin called mechanoreceptors, which detect touch, pressure, vibration, and stretching of the skin, send information back to the brain via the main nerves of the arm.
The brain sends signals to the hand and arm.
Meissner corpuscles: found mostly in the fingertips; sensitive to light touch, vibration, and texture
Merkel cells: sense pressure, shape, edges, and rough textures
Ruffini endings: sense pressure, vibration, and skin stretching
Pacinian corpuscles: found lower in the skin; sensitive to vibration and deep pressure
The median nerve links to the thumb, forefinger, and middle finger.
The radial nerve stimulates the triceps, back of the hand and thumb, and back of the middle and ring fingers.
The ulnar nerve connects to the ring and little fingers, part of the palm, and the biceps.
A motorized prosthetic, controlled by machine learning, can also return tactile information from sensors on the hand to the brain.
Signals are received by electrodes implanted along upper-arm muscles.
The processing unit translates signals from the brain and responses from the upper arm into action.
Sensors record tactile information and relay it back to the processing unit.
Sensory signals are collected by the processing unit and sent to electrodes connected to the median nerve, which then relays the sense of touch to the brain.
Traditional prostheses fit over damaged limbs. This removable robotic arm is connected to the humerus bone by a titanium anchor, which leads to better control and nerve interaction—making the prosthesis feel more like part of the body. Natural mechanoreceptors in the bone provide additional sensations.
A computerized motor uses machine learning to analyze patterns of electrical activity related to an intended motion in a patient’s nerves. Some patients practice using the arm through virtual reality before being fitted with it, which trains both the person and the unit.
Fingertips are covered with electrical sensors that mimic the natural mechanoreceptors in human skin. They can record information on factors such as pressure, texture, and vibration.
Jason Treat, NGM Staff. Illustration: Sinelab.
None of this takes place in isolation, of course. Context—smells, sounds, memory, situational input—affects everything.
I know that’s corduroy because I learned long ago what corduroy feels like. It’s why the touch of another’s hand can please in one context and repel in another. “The entirety of our perception is built against the lifetime of our experience,” said Case Western Reserve University biomedical engineer Dustin Tyler. “The system we’re working with”—the interplay of receptors, nerves, and brain, he means—“is always taking information in, filing it, associating it, connecting it, and creating our us. There is no beginning and end to it. We’re trying to tap into that.”
Tyler leads the multispecialty team working with Brandon Prestwood and eight other patients, all of whom—because of amputation or in one case paralysis—have lost at least one limb’s natural capacity to feel touch. A conversation with Tyler can ricochet between metaphysics and plainspoken exuberance; I once asked him how he’d found his way from a college engineering major to sensory restoration experiments, and his thoughtful reply included “Holy crap!” and “Awesome.” Electrical engineering, awesome. Neural networks, same. Neural networks run on the body’s internal electricity, after all; it’s electrical pulses that carry signals up and down the nerves. “I was fascinated by the brain,” Tyler told me. “I’m still amazed every day at how the machine we ride around in works.”
This Microdocodon gracilis, a creature tinier than a mouse, died about 166 million years ago in what is now Inner Mongolia. The white outline around its skeleton is a halo of fossilized fur, which suggests that the animal might already have developed what University of Chicago evolutionary biologist Zhe-Xi Luo calls “the tactile sensation associated with the hairs of modern mammals.” Luo says the first appearance of fur coincided with the evolutionary enlargement of brains in early mammals, including the expanded cortical area where touch sensation is processed. Although there’s no proof, he says, he and colleagues hypothesize that the two phenomena are linked. “The mammalian tactile sense of perception is one of the fundamental drivers for these early mammals to have developed larger brains,” he says.
PHOTOGRAPH BY JUSTIN JIN
PALEONTOLOGICAL MUSEUM OF LIAONING
Each center is experimenting with its own combination of implants and prostheses; the graphic, created with the guidance of engineer Max Ortiz Catalán at Sweden’s Chalmers University of Technology, shows the arrangement Chalmers scientists have developed. Here’s the central idea: A post-amputation patient—a man like Prestwood, say, who’s lost his whole forearm—has truncated nerves within the part that remains. Those nerves are still able to send signals the brain perceives as coming from the missing limb; this can be one of the causes of phantom limb sensation.
So the trick is restoring the signaling. The sensors being built into these experimental prostheses can convert contact with a surface—a prosthetic finger touching a tabletop, for example—into electric signals. Those signals go to a computer, which determines which nerves must be stimulated to make the brain perceive the touch in the appropriate place. (Index finger? Thumb? Second knuckle on the ring finger?) The computer sends pulses down the patient’s implanted wiring to an electrode, which stimulates the indicated nerve, sending biological electric pulses up the nerve.
Voilà: sensory information, ideally the right information, en route to the brain.
When it’s working correctly, all this should happen nearly instantaneously, from the brain’s perspective, like the neural signaling we’re born with. But no two bodies are exactly alike, and for the participants who volunteer, so far about two dozen in U.S. and European research hospitals, the process demands forbearance: serious surgery followed by many hours in research labs, answering questions while tethered to a computer. “Where do you seem to be feeling that?” “How about now?” Even so, Prestwood and other participants told me, they signed on mostly for the chance to help scientists learn how this will play out—whether injured veterans and other amputees may someday be able to wear a near-natural limb that actually feels that way.
Waving a broom to catch her daughter’s interest, Chloe Nunez joins two New York University researchers in encouraging 16-month-old Campbell to try walking down the slope before her. (At home, sweeping together is a favorite game.) Mother and child are volunteer participants at NYU’s Infant Action Lab, where a team led by psychologist Karen Adolph studies the way healthy infants suss out challenges such as unfamiliar terrain. “One way they figure it out is through touch,” Adolph says. Natural sensors in those small feet and toes send Campbell’s brain vital data about how—or whether—to proceed.
Nerve fibers called CT afferents, clustered in the arms and back, can make people feel pleasant—and warmly connected to others—when those areas are brushed or stroked. University of Virginia neuroscientists Meghan Puglia and Kevin Pelphrey, exploring possible links between unusual CT response and autism or other developmental differences, are recording the brain activity of typically developing babies such as nine-month-old Ian Boardman, here being brushed by Puglia.
“I just wanted to see if I could pay it forward,” said Keven Walgamott, a Utah real estate agent who lost parts of his right arm and foot two decades ago after a power line sparked while he was lifting a pump out of a well outside his home. Starting in 2016, Walgamott spent more than a year as a research volunteer at the University of Utah, where he was temporarily implanted with electrodes, including some developed by scientists there. Inside their lab, wired into a computer, Walgamott would put on one of the new sensorized prostheses—this one named the LUKE, for Life Under Kinetic Evolution but also for Luke Skywalker, the Star Wars Jedi who loses his hand in a light-saber fight with Darth Vader. By the end of The Empire Strikes Back, Luke has a prosthetic that can apparently do everything, including feel. If you enter “Walgamott eggs” or “Walgamott grapes” into a search engine, you’ll see him in a Utah lab with the LUKE: Concentrating, his face sober, he’s performing the kind of simple tasks that are almost impossible for hands that can’t feel.
He lifts a raw egg in its shell, with just the right delicacy, and sets it gently into a bowl. He holds a grape cluster with his actual hand, closes a prosthetic thumb and finger around a single grape, and pulls it off without squashing it. Video clips from other research centers show similar small triumphs: at Case Western Reserve, a blindfolded patient using sensorized prosthetic fingers to pinch and pull off the stems of cherries; in Sweden, a Chalmers patient inside his own garage, using tools with both his natural hand and his prosthetic one.
But what many of the study volunteers most wanted to feel—what they longed for, they told Tyler and other scientists—was the touch of human skin. “I was amazed at how many of them just wanted to connect with somebody,” Tyler said. “It wasn’t functional. It was just: ‘I want to hold my wife’s hand.’ ”
Easing Elvy Kaik through her final weeks of life in April 2020, hospice nurse Janine Hurn does her best to offer the balm of touch from inside pandemic protective gear. Now retired, Hurn was nursing on Washington State’s Whidbey Island when COVID-19 hit. “I think we were just made to respond to human touch,” she says. “The gloved hand doesn’t feel the same on a person’s body. There were times when I’d take the gloves off at the end of the visit, knowing I had the hand sanitizer. We both needed it—to just have that warm human hand.”
Once I asked Prestwood, after apologizing for the boorish question, why it mattered so much to perceive Amy’s fingers around his missing left hand when his intact right hand had been there all along. He didn’t take offense. He said it was hard to put into words. Finally he said: It made him feel whole. “Because it’s something I lost,” he said. “For six years I had not held my wife’s hand with my left hand, and now I was. It’s the emotion that goes with any kind of touch. It is … it’s being complete.”
Tyler found this at once moving, profound, and provocative. What does it mean to feel the joy of a loved one’s touch when the sensation is like the tip of a sewing needle? And if the right circumstances can make a certain kind of zap to the cortex register as the squeeze of human fingers, what might that imply for individuals separated by distance? “Like, holy cow. What could we do? ” Tyler said. “This is way beyond prosthetics.”
Which brings us to Veronica Santos, and her Los Angeles lab full of robots. “Biomechatronics” essentially means what it sounds like, the blending of biological and mechanical science, and Santos specializes in developing sensors for robot hands. Much of her work is meant to make robots more useful in medical settings and in places that are dangerous for humans, like the depths of the sea. But three years ago she began collaborating with Tyler on a series of experiments in … well, the nomenclature is still unsettled. “Remote touch.” “Distributed touch.” Just picture this: One person in Los Angeles, one person in Cleveland. Across 2,000 miles—the distance from UCLA to Case Western Reserve—they’re trying to shake hands.
Margaret Malarney was a 14-year-old athlete and live wire before undergoing lymphoma treatment in 2020. She suffered internal bleeding that seemed at first to have devastated her brain. Her parents, John and Kate Malarney, braced for bereavement—until, as Kate lay holding her, Margaret spoke her own name. Now Margaret progresses in special classes, with a barrage of loving rehab that includes abundant touch. “It gave us an entry to her,” says Kate, joining hands with Margaret at home in Chagrin Falls, Ohio. The teen arches her spine while movement educator Polly Manke supports her shoulders. “It was the first way we were reaching her.”
A robot is involved, and I’m about to explain how; Santos and Tyler decided to hook me up for a go as the Cleveland end in one of their experiments. Scientists and science fiction writers have for many decades considered how this might play out, a person in one place making what feels like physical contact with a person or an object somewhere else. If you’ve ever felt a cell phone vibrate, you’re part of the endeavor: That’s a wireless signal, from another locale, firing a minuscule motor that stimulates mechanoreceptors in your skin.
The engineering term of art is “haptics,” from the Greek haptikos: relating to the sense of touch. Any technology designed to set off touch sensations is haptic—those restaurant pagers that buzz in your hand when your order’s ready, for example. You can buy virtual reality gloves now, to be worn with virtual reality goggles and wired to make your actual fingers and palms feel something like contact as your virtual hands touch virtual things. (You see a wall in the virtual room your goggles are displaying; lifting your actual hand puts your virtual hand against the wall, and a force in the gloves pushes back to create the illusion that you can’t bust through. Or your virtual fingers touch a virtual tractor on virtual farmland, and your actual fingers feel the engine throb.) Gamers are currently the biggest consumer market for such gloves; they’re also being used to make VR training devices, such as flight simulators, feel more realistic.
Compared to the symphony that is natural human touch, though, the technology has a long way to go. That’s not my metaphor, the symphony; I heard it from three different scientists trying to help me appreciate the orchestral coordination behind sensations we take for granted. “I’m making do with these amazing engineered materials, and they’re still our kludgy way of trying to re-create what my little nephew was just born with, nine months ago,” Santos told me. “I’m still humbled by that.”
Susie Reinish, 79, relies on touch to communicate attention, protection, and love to her 56-year-old son, Jerry, a former private chef who suffered brain damage after two diabetic comas. He’s lost his ability to speak. She can’t tell how much he understands when she and her husband, Rob, talk to him. And he sometimes bites uncontrollably; the mitts on his hands protect them from his teeth. “He doesn’t bite his hands as much when he’s being touched,” says Susie, here with Jerry in the kitchen of their home in Las Vegas, where she and Rob look after him. “Touch is the only thing that calms him.”
Cassandra Amaya’s younger son, 13-year-old Jonathan, has autism and for many years could not bear the touch of others. Scientists are working to understand why people on the spectrum often have unusual reactions to touch. One hypothesis: possible differences in the nerve fibers and brain processing that for most neurotypical people make gentle touch produce feelings of comfort and social connection. Amaya, who looks after Jonathan at their home in the California city of Banning, has learned that he now loves the robust touch of the “tickle monster” game—which has become their happiest physical connection.
The day I set out to feel her fingers from eight states away, Santos was wearing a T-shirt, blue jeans, and a pandemic face mask. I caught a wobbly glimpse of her, livestreamed and in 3D, through the VR goggles two Case Western Reserve researchers had strapped onto my head. Then she tipped abruptly sideways, vanishing from view, and what was I seeing now? Floor tiles. A desk leg, two shod feet—Oh. Santos’s feet. I raised my goggled eyes. “Hi,” Santos said.
What she was greeting was a wheeled robot, which after bumbling around the UCLA lab furniture had finally stopped to point its video camera face at hers. To use the researchers’ parlance, I was “embodying” that robot, seeing through its eyes, hearing through its microphone, and lurching like a drunk because of the human incompetently navigating from Cleveland. Nothing so remarkable about that, in the era of drones; the novel part was my own right hand, which was—here’s that word again—embodying the metal and plastic hand of the rolling Los Angeles robot. Taped to my gloved palm and my index finger were two metal disks. Wires connected the disks to a lab computer, which was internet connected to the robot, which had tactile sensors on its own robot fingers. Whenever the robot touched a surface, the sensors shot pulses to its robot brain—its computer. Those pulses zipped across the country, down the lab wiring onto my hand disks, through my skin, and then up my nerves into my somatosensory cortex.
Buzzing, Prestwood had said, but fainter. A needle’s tip. Those were good words for this—plus a pressure against my fingers when I, meaning the robot, closed my hand around the plastic wineglass on a table beside Santos. The experiment was designed to suggest two separated people celebrating a business deal with a glasses-clinking toast and a handshake. I failed the toasting part; my robot self kept dropping the glass. But the researcher whose place I had temporarily taken, a Case Western Reserve graduate student named Luis Mesias, was much more adept by now with long-distance touch. He’d learned how to maneuver his gloved hand expertly enough to pick up the glass in Los Angeles by the stem and tap it against a second glass his counterpart had raised. Feeling the tap, in Cleveland.
Aimee’s Farm Animal Sanctuary, in Arizona, opened as a respite center for animals. Then families with children on the autism spectrum spread the word: Touching the gentle animals calmed their kids. Now, says Aimee Takaha, here with a Holstein named Sam, all kinds of clients reserve hour-long animal-cuddling sessions.
Mesias, embodying the Santos lab robot, has long-distance peeled a banana. He has long-distance squeezed a toothpaste tube with the gentle precision of a person preparing to brush his teeth. Give the research enough time and you can conjure a future in which touch is transmitted, as vividly as sight and sound are now, into tele-everything: work, travel, shopping, family gatherings. Consolation. Sexual intimacy. Medical care, the kind that requires a practitioner’s touch. Maybe in the metaverse, that not-yet-realized virtual gathering place that has leaped from sci-fi imagining into corporate business models, something we put on our actual bodies—gloves, a suit, whatever—will convince our brains that we truly are feeling virtual people, virtual animals, virtual things.
Maybe. If I hadn’t been looking right at Santos’s face as she went in for the handshake … if I hadn’t once shaken her actual hand and walked beside her in Los Angeles learning the timbre of her voice … in another context, I mean, the sudden buzzing-and-needles jolt to my skin would have felt nothing like the clasp of human fingers. It made me suck in my breath, though. I could see her face, as she laid her bare hand over the robot’s, and for a long time afterward I thought about Brandon and Amy Prestwood, and the solidity of my daughter’s embracing arms beneath that barrier of plastic, and how fully the mind can meld story and setting with the pulses running along human nerves.
Two years ago, in the early weeks of pandemic lockdown, a pastor told me about his first Sunday services on Zoom. What his congregants missed most acutely, he said, was the passing of the peace—the murmuring of “peace of Christ be with you” and the quick clasp of hands, there in the pews, one person to another. It didn’t occur to either of us to wonder just then about the biology of that touch, a two-second deformation of skin cells making humans feel connected to each other and to their God. The neural diagrams now taped to my office walls include many explanatory labels, Receptor Sites and Impulse Conduction and so forth, and when I asked Tyler just how much of this might eventually be replicated by bioengineering—how much of the symphony, via body electrodes and computers?—he corrected me before my question was finished. “ ‘Replicate’ is dangerous,” he said. “We struggle a lot with this. We don’t have the fidelity to replicate the natural scheme. The general term we use is ‘restore.’ ”
From my Merriam-Webster, the red clothbound edition my grandmother gave me a long time ago: Restore: to renew, to rebuild, to give back.

Newer dictionaries inhabit my cell phone, but I keep that volume in reach because my palm on its worn cover sends my brain a story it understands. I’ve watched Brandon Prestwood speak before audiences of scientists; it still makes him nervous, he told me, but he’s learned simply to tell them what happened to him and watch them sit up straighter when he reaches the part about feeling Amy’s hand.
“In one of the speeches I gave, I talked about the military guy that’s been stationed in Afghanistan or whatever for a year,” Prestwood told me one of the last times we spoke. This was a hypothetical military guy, you understand, and Prestwood was riffing, imagining where the experimentation might lead. “And before he left, his wife got pregnant, he’s never seen his daughter, but he’s able to, you know, reach out and touch her via this system somehow. Or the businessman who hasn’t been home in six months. The National Geographic photographer headed off to the Ivory Coast.”
He meant Lynn Johnson, whose images accompany this article and who spent time with the Prestwoods at their home in Hickory, North Carolina. She had mentioned impending work in Africa, Prestwood said, and he was envisioning Johnson, her luggage of the future containing some over-the-counter version of nerve-stimulating electrodes and tactile sensors, with a matching setup at her widowed father’s home in Arizona. “Just to be able to give, and receive, the reassuring touch,” he said.
Cynthia Gorney is a contributing writer. Her essay on the year in pictures appeared in the January National Geographic. Lynn Johnson is a longtime contributor who photographed women around the world working to change their communities for the November 2019 issue of the magazine.
The National Geographic Society, committed to illuminating and protecting the wonder of our world, has funded Explorer Lynn Johnson’s storytelling about the human condition since 2014. Learn more about the Society’s support of Explorers.
This story appears in the June 2022 issue of National Geographic magazine.