About a month ago, I posted a video on my channel titled, The Most Profound Moment in Gaming History. The moment in question was the final codec conversation from Metal Gear Solid 2. Naturally, because I facetiously declared something subjective to be the greatest, people correctly disagreed with me. Hundreds of commenters began to list other games that, in their opinion, topped Metal Gear Solid 2's final conversation. I figured I would run down the list of examples provided to me and do analysis videos on them to determine their merit.
Considering Metal Gear Solid 2's final conversation involved the ethics of AI controlling human behavior, I figured it would be appropriate to start with another game that explores the same themes: Soma. Up until a week ago, I had never played Soma, but I was familiar with the name as well as the company that made it.
Encouraged by the game's numerous positive reviews, I gave it a try. As far as the story goes, I was giddy to discover a game that continued the discussion about artificial intelligence and how it will affect humanity in the coming century. What impressed me most, however, is that the game did not rely solely on the common science fiction narrative that placed AI and humans in violent conflict.
Instead of the terror of war, Soma birthed a new form of existential terror by illustrating the bleakest, bloodiest, and most brutal consequences that may arise from fusing humans with machines, where the utmost havoc is wreaked on body and mind. Worst of all, Soma presents a scenario where machines advance too fast to be regulated and subject humans to their random whims. The result is arguably a fate worse than death. Again, I am making an objective claim about a subjective question, a fate worse than death. Yet, as I continue, I would be shocked to find somebody who argues that Soma's reality is not a fate worse than death.
Needless to say, there are spoilers ahead. Some of my fellow YouTubers have distilled Soma's themes into one central question. What does it mean to be human? While this is an accurate distillation, it's too broad a topic for a single video. I think a more succinct question for this video would be, what does it mean to have consciousness?
I think it's fair to say that consciousness, and by extension sentience, is what conveys value to the human being. It's the reason why human beings condemn the murder of other members of our species, but almost never condemn the killing of non-sentient creatures lower on the food chain. It is what differentiates us from our fellow man, via our unique personalities. In some religious communities, to have consciousness means to be made in the image of God. Simply put, consciousness is what supposedly makes humans special.
But Soma seeks to challenge this notion. Before we entertain this and other philosophical questions, here is a brief synopsis of the game for those who have never played it. The story centers around a man named Simon Jarrett.
Simon had his brain scanned by an experimental machine and, effectively, had his consciousness copied. Upon his death in 2015, a copy of his consciousness wakes up in the 22nd century inside a robotic avatar. The game would initially have you believe that the same Simon woke up 100 years later in a decrepit underwater research station known as Pathos II. In reality, this was a form of manipulation on the part of the game's developers.
The original human Simon died in 2015. It is only revealed midway into the game that the character the player is controlling is merely a machine merged with a human mind: a copy of Simon's consciousness was planted within it. Later in the game, you find another copied consciousness, that of a former Pathos II employee named Catherine Chun. She explains that the Earth's surface was obliterated by a meteor strike, leaving only the inhabitants of Pathos II alive.
Matters were complicated by the facility's non-sentient artificial intelligence, a HAL 9000-esque AI named the WAU. The WAU's prime directive was to preserve human life, but that goal was overzealously interpreted following the meteor strike. The result was human beings kept alive artificially, whether as physical bodies fused to machinery or as consciousnesses downloaded into robots. Around this time, Catherine explains, she began a project called the Ark, a Matrix-like world where one could upload a copy of their consciousness into a simulated paradise and live on for millennia. The end goal of the game is to reach the Ark and shoot it into space.
When Simon realizes he's not the original Simon but a copy, the game forces the player to contemplate this question: does a sentient human have the same, greater, or lesser value than a sentient machine? Throughout the game, this question arises whenever Simon encounters a robot programmed with the personality of a former Pathos II employee. Some might instinctively say that they are of equal value.
However, I'm not so certain that the same answer applies in regards to Soma. Sentience implies the ability to feel, the ability to evolve, the ability to die. Do the robots in Soma feel, evolve, and die exactly like humans?
Does consciousness evolve differently inside a machine than inside a human body? Questions like these are left wonderfully ambiguous. But what is not so ambiguous is whether being conscious of a simulated existence is tolerable to the copied consciousness.
The answer is a flat no. Throughout the game, we see various examples of these copies realizing their state of affairs. All react in different ways, but all ultimately descend into distress or insanity. The only way to avoid this tragic end is to have one's consciousness altered to prevent self-reflection, as in the case of Carl or the worker drones at the bottom of the ocean. The prime example of this inevitable insanity is the case of Brandon Wan.
In order for the story to advance, this dead man's psyche must be revived multiple times via simulation to gain important information. The simulation repeats multiple times because the mere suggestion that he might not be inhabiting a human body is too much for his "psyche" to handle. Worst of all, the constant resetting of this simulation forces the player to wonder whether they are effectively murdering another sentient life form, and whether that murder is equivalent to the murder of a human.
What complicates this matter further is the concept of an afterlife. For human beings, an afterlife is an unprovable but convenient idea. Believing that one's consciousness carries on beyond the body's death gives meaning to life's tragedy. For these robots, however, one can definitively prove that there is no afterlife.
Without a link to the natural world, there is no possibility that their life goes on. Their existence is permanently erased, like junk mail deleted from your digital inbox.
Is this cruel? Did that digital consciousness have the same value as human consciousness? It's a good question. But there is a more pressing question at hand. Soma implies that human consciousness can be replicated by computers.
All one needs to do is collect enough data, subject it to the correct mathematical formulas, and artificial intelligence is born. But if this is the case, how similar is that to our organic selves? If life can be replicated and simulated by computers as mere collections of digital data, are human beings just collections of organic data? Are we special creatures, capable of free will and of consciousness after death?
Or are we deterministic creatures, whose sense of self vanishes in death? Soma exploits this uncertainty to the fullest possible extent, eliciting an existential horror that no jump scare ever could. One thing is certain, however: even if an afterlife does not exist, something similar to it can be replicated in our reality. In the reality of Soma, we receive an almost pitch-perfect depiction of what hell might look like. Think about it.
What is the most common description of hell? Well, it's a place filled with physical and mental anguish that you can never leave, presided over by demons and monsters, and decorated with blood and destruction. Now what is the environment of Soma like?
Well, the environment is littered with dead bodies, blood, oozing structure gel, and above all, demonic creatures bent on causing you terror. The sentient robots have been lobotomized so they never self-reflect and merely perform meaningless labor. Worst of all, the remaining humans crave death but cannot die, on account of the onboard AI keeping them alive.
Now let's entertain for a second that our sense of self vanishes in death. Between death and Soma's reality, which would you choose? With this, Soma presents a fate that is truly worse than death, for it brings our mythological conception of hell to life.
Thankfully, fictional stories like Soma provide an unintentional service. They allow bad ideas to die before they become manifest in our reality. Though AI will inevitably come to be, we can educate each other about its ethical use so that it works to our advancement rather than our detriment.
We can foresee the potential path to hell that arises from our good intentions, so that a scenario such as the one created by the WAU never occurs. Unfortunately, various existential questions still linger. For the sake of time, I'll return to the one I just brought up. If a digital personality is just a collection of data that can be erased, does that mean human personality is just a bunch of organic data that can be erased?
Is there nothing transcendent about us? At least in the reality of Soma, human consciousness is devalued on account of this replication. If there can be another version of us, how special are we really? There is danger in these types of thoughts, because they can breed all kinds of extreme behaviors.
For example, if there is nothing transcendent about human consciousness, then why put up with the tragedy of life? One can infer, based on supplementary material, that this is a question many employees of Pathos II asked themselves. Out of fear of existential angst, many sought out a hopeful narrative, and an employee named Mark Sarang proposed just that: the narrative of the coin flip.
Sarang believed that at the moment your brain is scanned, the original and the copy are identical, but they soon become two distinct entities due to diverging experiences. The proposed method to counter this divergence is to end your original existence during, or quickly after, the brain scan. Supposedly, upon doing this, your original consciousness becomes the copy. Then your copy could live for thousands of years in the virtual paradise known as the Ark.
Now, if human consciousness is not transcendent, and is erased upon death with no heaven or hell awaiting you, the mere chance that one could be saved from the misery of human existence must have seemed tempting to the Pathos II employees, hence so many of them committed suicide upon hearing the theory of the coin flip. The promise of a virtual paradise overwhelmed the rational capacities of not just the Pathos II employees, but of our main character Simon. It did not matter that the scan did not transfer the core consciousness to a new body, but only copied it. The fiction was much preferable to the reality.
Even Catherine knew this. Though she told the truth to Simon, she didn't force the truth on him when he misunderstood the concept of the coin flip. Suddenly, the idea of human consciousness being special does not sound so illogical.
Suddenly, the concepts of free will and of transcendence beyond death have utility, even if you ultimately disagree with both. The Pathos II employees needed some form, any form, of hope to sustain themselves. Otherwise, the alternative would be as pessimistic and horrifying as Soma's final scene.
Catherine? Please don't leave me alone. Catherine? Catherine?