
Keanu Reeves moves his wife and daughter's minds to new bodies in Replicas. Is it actually possible?

Jan 8, 2019

In Replicas, out this Friday, Keanu Reeves plays a neurologist on the verge of transferring human consciousness to a computer when his wife and daughter die in a car accident. Distraught and desperate, he sets about cloning his beloved family's bodies and transferring digital copies of their conscious minds into them, determined to cheat death just as any self-respecting movie scientist has since the 1920s.

Mind transfer to new bodies (or machines) isn't a new idea in pop culture or literature. One early example is Frederik Pohl's 1955 short story The Tunnel Under the World, and in just the last few years we've had Self/less, Criminal, Transcendence, and many more, not to mention the theme of the biggest film of all time, Avatar.

Given our current understanding of neurology and computing power undreamed of even a decade ago, could we ever do it for real?

Back in the '50s, when computers were capturing the public imagination, some neurologists noted that the on/off state for neurons in the brain resembled the 1/0 state of a binary computer bit. In both cases, huge numbers of simple parts combined to create emergent complexity.
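To see the analogy they had in mind, here's a minimal, purely illustrative sketch of a neuron reduced to an on/off switch, in the spirit of the McCulloch-Pitts model from the 1940s. The weights and threshold below are invented for the example, not taken from any real neuroscience:

```python
# A toy, McCulloch-Pitts-style binary "neuron": every input is 0 or 1,
# and the cell "fires" (outputs 1) only if the weighted sum of its
# inputs clears a threshold. Weights and threshold are made up here.

def binary_neuron(inputs, weights, threshold):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Wired one way, a single switch acts like an AND gate; chain enough of
# these simple parts together and, in principle, you can build a computer.
print(binary_neuron([1, 1], [1.0, 1.0], threshold=2.0))  # fires -> 1
print(binary_neuron([1, 0], [1.0, 1.0], threshold=2.0))  # silent -> 0
```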

Like artificial intelligence, moving your consciousness from a sick or dying body to a healthy young one seemed right around the corner ...

HOW IT MIGHT WORK

First, don't confuse the mind with the brain. The endgame is to move the ghost in the machine — the expression of the neural activity — to another medium, not move the organic sac of pulp that contains it.

In case you're wondering, by the way, a full head transplant has actually been done. A University of Cincinnati scientist successfully attached the head of a mouse onto a new body, and a Russian man has already put his name down to be the world's first human head transplant subject, even though the planned date has passed.

Transferring consciousness would avoid messy complications like the body's tendency to reject transplanted organs, and it also means that, as long as a mind could be expressed computationally, we could keep it on a hard drive indefinitely, downloading it to whatever new substrate the technology allows.

That might be a healthy new body. Or it could be a clone of an existing body (as in Replicas), maybe with whatever disease afflicted the original removed. Maybe copies can be sent to countless new bodies, just as you can send the same email attachment to 20 people.

Maybe it will live in the networks and subsystems of the internet, absorbing the combined knowledge of humanity and perfecting its evil laugh for the day it launches nukes against us all. Maybe Kathleen Turner's mind in the body of a gorilla, as imagined in The Man With Two Brains, isn't so far-fetched?

If some sort of deep-level scanner can take a capture of the state of every neuron and synapse in an instant and copy it into a computer or another brain, might the "I" you can feel living inside your body right now shake its new head, blink its new eyes, and say "I think, therefore I am" somewhere else?

BABY STEPS

If that's a bit too advanced for today's technology, maybe we can just send bits of consciousness instead of an entire conscious mind.

Philosophers use the term "qualia" for the feelings, sensations, and other raw ingredients of experience in your head: the "mind stuff" that makes up your conscious life. Maybe we can just isolate and transport the emotional content of a dream, the feeling of falling in love with your spouse, or your fear of spiders, and start there?

Not quite. The first problem is that "qualia" is a philosophical and not a neurological term. When you have a dream, scream in delighted terror on a roller coaster, or learn a new language, the experience is not contained in a neat little package in one spot that you can just scan and rewrite somewhere else.

The human brain is a distributed and parallel processing system. Everything you experience is processed across so many brain areas and with different levels of intensity over time that it'll be difficult to even identify an individual thought or emotion in such a synaptic firestorm, let alone extract it.

[Image: brain. Credit: Public Domain]

In fact, that distributed design is about the only way a brain is even remotely like the computers those scientists in the '50s had in mind. The icon on your desktop makes a file look like a single entity, but the data behind it is scattered all over the hard drive, shifting and rearranging itself every time you access, move, or resave something.

But suppose we could fence off the neuronal impulses that make up a single quale. The next problem is that experience and memory in biological systems like ours are annoyingly mutable and squishy.

Where computers retain perfect copies of information, human memory evolved to help us navigate our environment effectively right now, not to give us perfect recall of that holiday on Martha's Vineyard when we were 7.

The next time you remember the trip, you might recall the boat on the horizon as blue (it was red) or your auntie being there (it was actually your grandmother).

The contents of your consciousness are nested in, or mixed up with, other stuff that's changing constantly along with your ongoing awareness of self. There simply might not be a definitive enough representation of "you" from one millisecond to the next for us to take a copy of.

THE FUNCTION KEY

So it seems the easiest way to transfer a consciousness, and end up with it not just intact but thinking and feeling as you do now, is to take everything at once, all in one hit, before it changes too much (which it does constantly).

If we're successful and the "I" wakes up in another body, a robot or a gorilla, we've entered the philosophical realm of functionalism.

Going right back to the time of Aristotle, functionalism says that everything we think, feel, remember, and know arises from the mechanical architecture of the brain: it's all a function of the machinery.

[Image: Aristotle. Credit: Public Domain]

According to the theory, if we had the technology to replace a neuron with a chip or vacuum tube that did the same thing, then gradually made that replacement for every other neuron until all the biological matter had been replaced with bits of plastic or silicon, the continuity of your conscious awareness would survive.

Most of us would think that was ridiculous – plenty of neurologists do – but you'd be surprised how much we tend to ascribe functionalism to the rest of the body. If you lost a leg in a motorcycle accident or a shark attack, you wouldn't feel any less "you" than you did before, right?

In the same way, functionalism says that if we take our snapshot of your mind-state and move it to a brain grown in a lab or a computer (or a gorilla), it will go on to feel, emote, and experience life the same way without interruption.

ONE PROBLEM AFTER ANOTHER

But there's still a lot more to it, and it can get pretty depressing. If functionalism is right, some thinkers (like legendary AI researcher Marvin Minsky) say it's proof there's no soul, that we're just automata responding to instinct and the environment, our consciousness simply tricking us into thinking we have free will.

And arguments against functionalism are no less gloomy. If we did slowly replace all the neurons with electronics, conscious awareness might slowly die — like HAL singing "Daisy" in an ever-slowing drawl — because that instance of consciousness is an expression of that particular biological brain as it tries to prompt that particular human to reproduce, survive, etc.

[Image: HAL 9000. Credit: Metro-Goldwyn-Mayer]

Also, not only are the technical hurdles insurmountable today, but getting over them seems even more difficult the more we learn about the brain. Those early thinkers who got excited about how similar brains and computers seemed were dead wrong: they're nothing alike.

If neurons were as simple as bits (on or off, one or zero), transferring consciousness might only be a problem of degree, but they actually have far more features than just firing or not firing. Dendrites – part of the neuronal structure – can effect changes in nearby brain cells without the neuron as a whole firing to connect across a synapse.

That means our hypothetical brain state capture is getting bigger and more complicated all the time. Instead of just scanning the 100 billion neurons to see if they're firing or silent, we'd have to account for every possible chemical reaction and behavior in the brain, no matter how small.
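For a sense of scale, here's a rough, hypothetical back-of-envelope sketch. The neuron count, synapse count, and bytes-per-synapse figures are loose assumptions, not measurements from any real scanning project; the point is only how fast the numbers grow once a neuron is more than a single bit:

```python
# Back-of-envelope only: how fast a "brain snapshot" grows once neurons
# stop being simple on/off bits. All figures are rough assumptions.

NEURONS = 100e9              # ~100 billion neurons (the figure used above)
SYNAPSES_PER_NEURON = 7_000  # a commonly quoted rough average

def to_gigabytes(num_bytes):
    return num_bytes / 1e9

# Naive model: one bit per neuron (firing or silent).
naive_bytes = NEURONS / 8
print(f"1 bit per neuron:     ~{to_gigabytes(naive_bytes):,.0f} GB")

# Slightly richer model: a handful of values per synapse (strength,
# timing, local chemical state), say 32 bytes each (pure guesswork).
BYTES_PER_SYNAPSE = 32
rich_bytes = NEURONS * SYNAPSES_PER_NEURON * BYTES_PER_SYNAPSE
print(f"32 bytes per synapse: ~{to_gigabytes(rich_bytes):,.0f} GB")
```

Even the crude richer model lands in the tens of millions of gigabytes for a single frozen instant, and it still ignores everything happening below the level of the synapse.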

As with the continuing discoveries about the atom throughout the 20th century, we might keep uncovering ever-smaller and ever-simpler components for decades before we reach the bedrock of brain biology, all of which would have to go into our digital mind file as is.


And if we conquer the data processing and neuroscientific challenges above, there's still more.

No conscious mind exists in isolation – signals, commands, and information are streaming between your brain and the rest of your body in a flood of hormones and enzymes. If you're tired or horny at a given moment, that's part of who you are, no different from your longtime fear of heights or your views on gay marriage.

Capture only brain activity and put it in a computer and you'll only get half the story. Put it in another body that's completely different and it might collapse under the strain of trying to enmesh with a totally new environment.

And not just on a microcellular level, either – your conscious sense of self right now is linked to and informed by everything from your upbringing to the foibles of your immune system, all of it a roiling cauldron of instinct and intent particular to the way your brain and body have grown together. If you put the mind of a 60-year-old male Nobel laureate in the body of an 11-year-old girl, who knows what kind of psychosis might result?

Keanu Reeves can deal with a bus that will explode if it goes below 50 miles per hour, free mankind in the far future from the simulation that forms its prison, and take down a whole family of Russian mobsters with his signature Heckler & Koch P30L after one of them is dumb enough to kill his dog.

But transferring a mind to a robot, a new body or a gorilla? Don't expect it any time soon ...
