The first photograph of a black hole represents a groundbreaking scientific achievement of our lifetimes. This one astounding image, so simple and yet utterly entrancing, could revolutionize our understanding of the universe and its endless mysteries. It took the Event Horizon Telescope collaboration, a team of more than 200 scientists and engineers, years of work to make it happen. One woman who came to represent that exceptional achievement was Dr. Katie Bouman. A Ph.D. student in computer science and artificial intelligence at MIT, Bouman helped create an algorithm that made the image possible in the first place. A photo of her reacting to the black hole image, shock and pride etched onto her face, went viral. For many, she was representative of centuries of women in STEM, the bright young hope for the future, and a reminder of how often the crucial work of women is overlooked or flat-out erased from history.
Bouman was humbled by the response but was quick to thank the rest of the team, saying, "No one algorithm or person made this image, it required the amazing talent of a team of scientists from around the globe and years of hard work to develop the instrument, data processing, imaging methods, and analysis techniques that were necessary to pull off this seemingly impossible feat." In a TED Talk she gave in 2016, Dr. Bouman told the audience, "I’d like to encourage all of you to go out and help push the boundaries of science, even if it may at first seem as mysterious to you as a black hole."
She helped to achieve the impossible, but it says a lot about the world we live in that the misogynistic backlash against her surprised nobody. Bouman faced a level of online vitriol that felt all too familiar to most women who inhabit space on the internet. Fake social media accounts were created impersonating her. Her work was dissected and dismissed, and she was accused of hogging all the credit from her white male colleagues. Every basement dweller who took one computer class in high school insisted they had evidence that Bouman did none of the work she was credited with and that the real genius behind the operation was her male colleague (he had to publicly call out this nonsense for the sexist bile that it was).
But nowhere was the anger more evident than on YouTube. The first search results for Katie Bouman were videos dedicated to "proving" she was a liar. The top result was a video entitled "Woman Does 6% of the Work but Gets 100% of the Credit," pushing the debunked claim that Andrew Chael was being written out of the story in favor of an evil feminist conspiracy. This MRA-driven bile was given priority by YouTube’s algorithm over Bouman’s words and achievements. Even Flat Earther conspiracies about the black hole image ranked higher than Bouman’s TED Talk. YouTube claimed this was a minor error and that the problem had been rectified, but plenty of people on social media said that searching Bouman’s name still led to the same results.
This isn’t unique to Bouman, of course. Before Captain Marvel’s release, anyone wanting to look up Brie Larson interviews faced the same problem, as the algorithm prioritized men screaming about her “reverse sexism” and political agenda over the actress herself. Kelly Marie Tran and Kathleen Kennedy faced the wrath of the misogynistic internet in the same manner, as did the women of the Ghostbusters reboot. To this day, searching for videos of Anita Sarkeesian will bring up a hell of a lot of sexist, lie-ridden, and outright hostile videos by men desperate to “refute” her feminist work.
YouTube's algorithm is one of the worst things about the internet. It is its own black hole, only without the wonder. Leave autoplay on for long enough and soon your relatively innocuous day of watching cute actor interviews or cat videos will descend into a hellish display of sexism, racism, homophobia, transphobia, and anti-Semitism. Many parents encountered this problem when they discovered that the algorithm created a disturbing pattern for children's programming, wherein an innocuous Peppa Pig marathon could descend into videos of abuse and traumatic images. New York Magazine noted that the algorithm favors conspiracy videos and leans heavily right-wing. The Daily Beast said that YouTube had "built a radicalization machine for the far-right." A report from Data & Society found that YouTube provided a "breeding ground" for far-right radicalization. The site has also been blamed for the rise in Flat Earth conspiracy theorists, as such videos appear with distressing frequency thanks to the algorithm. If you’ve ever looked up something vaguely feminist or female-centered on YouTube, chances are it didn’t take very long for your recommendations tab to become chock-full of videos of men telling you that feminism is bad.
This may seem like a frivolous issue for some — just skip the video if you don’t want to watch it, right? — but YouTube’s power and influence over our society is immeasurable and deserves far greater scrutiny than it receives. There are 1.5 billion YouTube users in the world, which, as The Guardian notes, is more than the number of households that own TVs. This one site has ingrained itself into our daily lives — as I write this, I have YouTube playing on my own TV — and shapes how we get our information and conduct our discourse. When the site prioritizes mistruth over facts, bigotry over reason, and hostility and violence over everything else, then that cannot help but change the way we talk about such issues.
Women aren’t the only ones affected by this problem. Violent rhetoric festers throughout YouTube, and the algorithm plays into that across all intersections of gender, race, sexuality, and so on. But there’s something particularly insidious about the way YouTube’s basic design has made the site hostile to half the world’s population. The simplest things, from an all-female reboot of a movie to a woman’s scientific achievements, become the starting point for a descent into the rabbit hole of conspiratorial hatred. Given how young the site’s target demographics skew, and how many parents use the site as a distraction for kids in the same way TV used to be, ignoring this problem or letting it continue is simply not an option. YouTube knows this too, as they have pledged to fix the problem, but progress has been slow. Dr. Katie Bouman’s struggle was but one of countless versions of the same story, and if YouTube doesn’t take more forceful action to rectify this egregious issue, she won’t be the last to face it.
The views and opinions expressed in this article are the author's, and do not necessarily reflect those of SYFY WIRE, SYFY, or NBC Universal.