
The science fiction that became science fact in 2019

Jan 1, 2020, 1:33 PM EST

I know 2019 wasn’t the best. It was rough going for a lot of us, the capstone on a decade that started punching a few years back and never let up. But it’s over now. It’s done. The ball dropped like a guillotine and sent it to its eternal rest. You can breathe. It can’t hurt you anymore.

Today we embark on a new revolution around the Sun. It’s an opportunity to look forward with hope, to examine the world and our lives, and try something new. It’s also a chance for us to look back one last time, like a Final Girl glancing in the rearview mirror at the burning rubble fading into the distance, and remember that it wasn’t all bad.

While most of us were stumbling through the haunted wreckage of 2019, just trying to survive, scientists were in their labs imagining a better world, sifting through the annals of science fiction, and willing it into existence.

Here are a few science fiction ideas that became science fact in 2019.

Imagine a Black Hole

First considered in the 18th century, black holes were long thought to be a purely mathematical idea. For nearly two centuries, scientists and mathematicians bandied about their ideas without coming to a singular conclusion. In fact, the name "black hole" didn’t come into popular parlance until the 1960s, though its provenance is a matter of some dispute.

What isn’t up for debate is the way black holes captured the public imagination. Their use in fiction, by that name or another, dates back at least to the 1950s, and they remain a popular plot device today.

J.J. Abrams’ 2009 Star Trek film centered on black holes artificially created via the use of “red matter.” The 1997 sci-fi/horror classic Event Horizon likewise employs artificial black holes, to terrifying effect. Christopher Nolan showed us his view of a black hole in Interstellar (2014), using it as a way to slingshot across vast spatial distances in an attempt to find a new home for humanity.

Using a black hole in film, an inherently visual medium, is a bold move. By their very definition, black holes cannot be seen. Their nature, as incredibly dense singularities, ensures they are invisible: so great is their gravitational influence that not even light can escape them.

As a result, any attempt at visualizing a black hole relied on trickery and a certain amount of visual shorthand.

That all changed this year, when a team of international astronomers captured the first real-life images of a distant black hole. The object at the center of Messier 87, an elliptical galaxy 55 million light-years away, comes in at 6.5 billion times the mass of the Sun.

Capturing the image required the collaboration of more than 200 scientists over the course of 20 years. Even more impressive, the Event Horizon Telescope project utilized eight telescopes around the world, working in concert to create a virtual looking glass the size of the planet. The images were taken over the course of two week-long periods in 2017 and 2018. But just looking wasn’t enough.

Each night of observations resulted in a petabyte of data. That’s equal to roughly a million gigabytes. That amount of data can’t easily be transmitted via the internet, so the hard drives had to be ferried the old-fashioned way. By hand. From there, the data had to be painstakingly collated. The process eventually resulted in four images, each of them different, owing to the techniques used. Despite their variances, one thing was clear: they’d photographed a black hole. Research into M87 continues, but we have, collectively, stared into the abyss. And it was beautiful.
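To see why shipping drives beats uploading, here's a back-of-the-envelope sketch. The petabyte-per-night figure comes from the text; the 1 Gbps link speed is an assumption, and a generous one for a remote observatory:

```python
# Back-of-the-envelope: moving one night of observations (~1 petabyte)
# over an assumed 1 Gbps internet connection.

PETABYTE_BYTES = 10**15          # 1 PB, decimal definition
LINK_BITS_PER_SECOND = 10**9     # assumed 1 Gbps link

bits_to_move = PETABYTE_BYTES * 8
seconds = bits_to_move / LINK_BITS_PER_SECOND   # 8,000,000 seconds
days = seconds / 86_400                         # seconds per day

print(f"Uploading 1 PB at 1 Gbps: {days:.1f} days")  # ~92.6 days
```

At roughly three months of upload time per night of observations, flying the hard drives to the processing centers really was the only practical option.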

Quantum Supremacy

In the ‘80s, Richard Feynman imagined the creation of quantum computers, machines capable of simulating the world on the quantum level. It would be a massive leap forward in our ability to calculate what’s going on around us.

Since then, quantum computers have been used in fiction, much in the same way quantum mechanics is often irresponsibly used in real life, to hand-wave phenomena we can’t otherwise explain.

There is nothing intrinsically impossible about a quantum computer. Should we create a working device, capable of practical application, it likely would change not only the face of computing but the way we see, interact with, and understand the world.

2019 did not see the realization of practical quantum computing, but it did see an important step along the road. Google announced, by way of the prestigious scientific journal Nature, that it had achieved quantum supremacy using its Sycamore chip.

Quantum supremacy is a term coined by John Preskill in 2012 and refers to the ability of a quantum computer to complete a calculation that would be impossible to duplicate in any reasonable amount of time, even using the world’s most advanced supercomputers. Google’s claim hinges on a calculation, performed using Sycamore, that would reportedly take humanity’s best supercomputer 10,000 years to reproduce.

Conventional computers rely on bits, each existing as either a one or a zero, to complete calculations. Quantum computers, by contrast, rely on qubits, which can exist in superposition, effectively both one and zero at once, until they’re measured. String together multiple qubits and they can interact with one another, forming entangled states. This all gets a little fuzzy, but the upshot is that quantum computers can perform certain very complex calculations very quickly.

With only 53 qubits, the number present in Google’s Sycamore chip, you end up with roughly nine quadrillion possible variations.
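That nine-quadrillion figure is just 2 raised to the 53rd power: every added qubit doubles the number of basis states a classical simulator has to track. A quick sanity check:

```python
# The state space of an n-qubit register has 2**n basis states:
# each additional qubit doubles it.

n_qubits = 53                      # qubits on Google's Sycamore chip
states = 2 ** n_qubits

print(f"{states:,}")               # 9,007,199,254,740,992 (~9 quadrillion)

# A full classical simulation stores one complex amplitude per basis
# state. At 16 bytes per amplitude (two 64-bit floats), a naive
# simulation of 53 qubits would need on the order of:
bytes_needed = states * 16
print(f"{bytes_needed / 10**15:.0f} petabytes of memory")  # ~144 PB
```

That exponential blow-up in memory is why each extra qubit makes classical simulation so dramatically harder.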

There is, admittedly, some contention about Google’s 10,000-year claim. IBM has stated it could duplicate the results in two and a half days using Summit, the Oak Ridge supercomputer, a machine the size of two basketball courts.

Even so, that’s a far cry from the near-instantaneous results achieved by Sycamore. According to Scott Aaronson, the Google result amounts to an impressive step forward, no matter how you slice it. By his estimation, if Google’s computer had just seven more qubits, matching it would require a supercomputer 30 times as large.

Practical quantum computing will likely require some technological advance we’ve not yet conceived, similar in Aaronson’s mind to the shift from vacuum tubes to transistors. What Google showed is that the science is ready, waiting for a breakthrough in engineering to catch up.

Infrared Vision

Who among us hasn’t wanted to be the Predator or one of those little dog babies from the second Tremors movie? There’s something magical about being able to see in infrared. It allows you to see things when others can’t, almost like seeing through walls. A superhuman ability confined to the unforgiving walls of fiction. Until now.

A study published February 28 in the journal Cell describes a procedure that used nanotechnology to bestow infrared vision on mice for up to 10 weeks. Our brains, like the brains of all mammals, determine what we see by interpreting the light around us via the rods and cones in our eyes.

We are, at current, limited to light in the visible spectrum. That is, after all, why it’s called the visible spectrum. But infrared light exists all around us. We can measure it with our technology, we know it exists, we just can’t see it.

Infrared wavelengths are too long to be absorbed by the rods and cones in our eyes, so they pass through entirely undetected. The experiment worked by implanting nanoparticles capable of absorbing infrared light and converting it to visible wavelengths, which the rods and cones in the eyes of the experimental mice could then absorb.
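The conversion isn't free: a visible photon carries more energy than an infrared one, so this kind of "upconversion" nanoparticle has to pool the energy of more than one absorbed photon. A rough sketch, using the near-infrared (980 nm) and green (535 nm) wavelengths commonly reported for the study (treat the exact numbers as assumptions):

```python
# Photon energy in electron-volts: E = hc / wavelength.
# hc ≈ 1239.84 eV·nm is a standard shortcut for wavelengths in nm.

HC_EV_NM = 1239.84

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon of the given wavelength, in eV."""
    return HC_EV_NM / wavelength_nm

infrared = photon_energy_ev(980)   # ~1.27 eV, invisible to rods and cones
green = photon_energy_ev(535)      # ~2.32 eV, well inside the visible band

print(f"IR photon:    {infrared:.2f} eV")
print(f"Green photon: {green:.2f} eV")
# One green photon outweighs one IR photon, so the nanoparticles must
# combine energy from at least two absorbed IR photons per emitted one.
print(f"IR photons needed per green photon: {green / infrared:.1f}")
```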

To verify that the mice were able not only to see the light but to make sense of it, the researchers set up a series of tasks that the mice could accomplish only with the aid of light in the infrared spectrum. Experiments showed the effects wore off after several weeks, but researchers are confident the process would work in humans, offering temporary visual enhancement.

Suspended Animation

There are apparently two ways to accomplish deep space travel. The first is the use of some heretofore-undeveloped engine capable of traveling at, or in excess of, the speed of light. The second is suspended animation.

Our best engines would take generations to get to the nearest star, and for those of us not wanting to live out our lives on a traveling spacecraft, suspended animation could be the solution for getting to another world and still being young enough to enjoy it.

The possibility of pressing pause on our biological functions has additional imagined potential in stopping death until such time as disease or traumatic injury can be corrected. Countless books, movies, and television shows have employed the process, but scientists right here on Earth have recently gotten in on the action.

As part of a medical trial in the United States, scientists are utilizing Emergency Preservation and Resuscitation (EPR) to extend the window in which sufferers of acute trauma can be treated.

The process involves cooling the body to between 10 and 15 degrees Celsius and replacing all of the patient’s blood with cooled saline solution.

In cases of serious trauma, such as stab or gunshot wounds in which the patient has gone into cardiac arrest, it can extend the window for intervention from a couple of minutes to a couple of hours.

The trial, approved by the FDA, hopes to publish results by the end of 2020.

Samuel Tisherman, of the University of Maryland School of Medicine, stated they’d seen promising results in tests with pigs and felt it was time to move into human trials. A notice was sent to the local community, with the option to opt out. The nature of the study requires the assumption of consent as the types of injuries involved are most likely fatal and with no other options for treatment.

Tisherman went on to clarify there was no intent to use this technology to “send people off to Saturn.”

Hopefully it’s just a matter of time.