(this post has been copied here from my old blog) Ok, I just got done playing the game "SOMA" last night, and I made this blog specifically to write a blog post about it, because I loved it so much. This post got quite long because I kind of just go through the whole story, so there are definitely spoilers. Also, don't expect philosophical rigor; this is mainly a creative outlet. I started playing SOMA many months ago, I think last spring, so it's taken me a long time to finish it. It starts with the main character, Simon, in his apartment. Turns out he has a neurological condition, I think a brain tumor, and has agreed to take part in an experimental study to find a cure. Turns out that study involves taking a full brain scan at a precise level of detail, uploading that data to a computer, and using it to run studies on cures. Oh no, I could already see where this was going. If it's the future and they have the technology to make brain scans, how is Simon not worried about them having a copy of his consciousness on a flash drive? He isn't worried, though, but there is some symbolism when he says (right before the scan is taken): "You know, Native Americans thought pictures stole a piece of their souls." Toward the end of the game this phrase is repeated. Then the doctor pushes the button, and everything goes black just for a second. This is the first instance of "fission" in the game. In the moment when the button was pressed, I had no idea where I'd open my eyes. I had the intuition that the game was going to follow the traditional way of thinking about fission: the idea that at the moment of the scan, all my memories and my perspective up until that point are preserved within the data. Any new instantiation of the data would feel, subjectively, like a continuation of original Simon's identity, in the same sense as the Simon that opens his eyes in the same chair feels like a continuation.
However, since they have the data, I have no idea how many times they could have decided to instantiate it. There could have been hundreds of other Simons that woke up, and they are all, equally, continuations of the same identity. But after the moment of fission they become distinct by nature of no longer sharing perspectives. I've never understood why this is such a big philosophical problem. People get worked up asking "which one is the real Simon?!" but I am not sure I get it. They all are, and at the same time it is a meaningless question. My views on fission are partly in line with Parfit's view that the self just doesn't exist, so questions of personal identity make no sense. However, I don't go quite that far. To some degree I think they make sense, if we get rid of the misleading notion of "identity," since that implies the self exists as some concrete "thing." But the self is not some concrete thing in the world. There's no "self" in the sense of a "subject of experience"; there's just experience, that seems to be had by a self. One way I think of the "self" is as a functional linking relationship. The referent of "I" thoughts is not a "thing," it's an illusion made up by our mind, but an illusion that makes sense and is useful. It links my past to my present to my future. It links my subjectivity with my agency. It might even help to link my perceptions, together with my memories, as belonging to a single agent. Without it, everything would be chaos. Chaos may be closer to the truth, but it doesn't help us run from lions, or plan for the future of our families and communities. Back to Simon: his brain was scanned, the room was dark, and if I were in Simon's shoes at the moment of the scan I think I would be scared, not knowing what would be done with the data, but knowing that in any case they all would feel equally like (and therefore literally be, in an important sense) me.
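Since I already think of fission in computational terms, here is a toy sketch of the intuition (my own analogy, not anything from the game, and all the names are made up): at the moment of the "scan," a copy is indistinguishable from the original, and the two only become distinct persons as their unshared experiences accumulate afterward.

```python
# Toy illustration of fission-as-copying (hypothetical names, not from SOMA).
import copy
from dataclasses import dataclass, field

@dataclass
class MindState:
    name: str
    memories: list = field(default_factory=list)

original = MindState("Simon", ["apartment", "brain scan"])
upload = copy.deepcopy(original)   # the "scan": a perfect, independent copy

# Immediately after fission, no fact distinguishes the two continuations.
assert upload == original

# Each then accrues its own perspective, and they diverge into distinct persons.
original.memories.append("wakes up in the chair")
upload.memories.append("wakes up 100 years later")
assert upload != original
```

Asking which of the two is "really" Simon at the moment of the copy is like asking which of two equal values is "really" the original data: both are full continuations, and the question only gets traction after they diverge.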
But it's a game, so I, as the player, know that the designer of the game will probably only have me follow one possible timeline of Simon-hood, and I will likely not open my eyes in that same dreary, unclean non-hospital room. I am not sure where I wake up, but I am in a chair and it isn't the same one. Actually, looking back, I don't know how Simon's consciousness got there; there was nobody around. It is dark and the doctor is not there. Simon is confused; he wonders if something has gone wrong. I don't remember all the details of the order of events, but it turns out over 100 years have passed since the original Simon got the brain scan. A comet has hit the earth, and civilization has retreated to a small community of scientists living on the ocean floor in a connected group of "stations." Everything feels slightly strange, and my experience "glitches" sometimes. I can touch dead people and have a flashback of sorts to their last moments of life. I don't see any humans. I find a robot at one point, but he won't really tell me anything. It seems something has gone horribly wrong, and this civilization that was humanity's last hope has failed. It seemed to have to do with the strange, not-quite-biological-but-not-quite-machine entity that snaked its way around all the camps: the WAU. I really liked the storyline of the game. It was sort of like the missions in Fallout, in the sense that you get clues from computers about what went on and try to piece together the puzzle, while at the same time trying to complete a mission. I find an "omnitool," which is required for getting in and out of buildings etc. There is a woman who exists in the omnitool who I can talk to when it is plugged into a terminal. Her name is Catharine and she is really very nice; she is my only solace and I always look forward to plugging her in. She was human once, a biological human, but now she just exists on this device.
She says the time in between when she is plugged in from one place to another just "disappears." I found this really interesting and cool. It seems quite obvious, of course: why would she be able to perceive the passage of time when she is not conscious at all? Simon asks her why it is different from falling asleep, and she gives two reasons. First, when you fall asleep, you can in some sense notice the dissolution of your awareness. It happens piecemeal. You can't always remember it in the morning, but falling asleep is a process, which is quite different from being unplugged. I wonder if there is some important reason why that is the case for humans, other than biological constraints. The second reason she gives is that you typically wake up in the same place you fell asleep, so it is even more clear that time has passed and that you are the same person. I would add also that sleeping is different from being turned off: even though we aren't aware, there's a sense in which we still exist. The discontinuous nature of not only her conscious experience but her very existence has got to have important implications for identity. You would never know how many other yous also woke up, for one thing. She calmly accepts the fluidity of personal identity. She doesn't seem as attached as many humans today are to the idea that it's the same you from day to day. Maybe this is an implication of the way future society will view identity if uploading becomes commonplace. Maybe the attachment to the ego will fall away and people genuinely won't care which "me" they are, realizing the meaninglessness of the question. I can rationally understand the meaninglessness of the question, though I still feel an odd attachment to my fake "self." More on this at the end. After meeting Catharine, the point and mission of the story take real shape. She confirms the downfall of civilization, but she was part of the team working toward the future of humanity.
Since earth is pretty much shot, the last hope is the "ARK." The ARK is a simulated universe populated with uploaded consciousnesses embodied in simulated bodies. It is not turned on yet; it is at Tau, and the whole mission of the game is to make my way over to Tau. The plan is to launch the ARK into space, where everyone on it can live happily ever after, into eternity. It really sounds quite beautiful. I hear all about the plans for the ARK and it seems like it's supposed to be a version of heaven. It is supposed to feel very real, supposed to be indistinguishable from living in a body, if they have perfected the technology. Of course, not entirely indistinguishable, especially in the ways that matter, because I'm not sure you can reproduce in the ARK, and I don't think you die or age either. Can a life like this have meaning? I'm not sure. Maybe, if we redefine the word itself. I also don't know the level of detail of the simulated universe, and whether some degree of self-increasing complexity could be encoded into the system. The reason I wonder is that it seems like I would feel a little bleak if there was nothing left to be discovered. Mystery keeps me going, but if we create the universe, we would have complete knowledge of it. It seems like we would even have figured out consciousness. What is left to invent, or left to think about and ponder over? Would there be any new maths to invent, or would the simulated brains only be equipped with the faculties to grasp maths already in existence at the time? Maybe we could find meaning in art. Populate the universe with beauty. But what is beauty without struggle? It's hard to know what a life like this would be like. But regardless, in the game, I yearn to make it there. I might not know what I will do when I get there, or whether my future will just seem long and dark, cold and empty, devoid of real meaning, but I have the natural instinct not to stay on the dying earth.
At one point in the story I need to find a password or a code or something from Catharine's colleagues. Yes, her colleagues. They are long dead, but their consciousnesses are stored in these little drives. I need to plug them in, talk to them, and try to figure out what the password is. This part of the game was really fun, sort of made me feel evil, and really solidified the feeling that these people did not know what they were getting into when they uploaded their consciousnesses. I don't think Catharine would react the way they reacted when I plugged them in, because she is so used to it, but I think at the point these people were scanned they were more like Simon when he was scanned. They were thinking they were going to wake up in the scanner room, and copies of themselves could exist somewhere else. They didn't seem to realize the implication that they could wake up millions of times, and every time it would feel like a direct continuation, exactly how it feels for the one that wakes up in the scanner room. There were a few different people. I could pick the person I thought was most likely to know the password, and then pick the simulation I wanted them to wake up in. One was an island, one was totally blank I think, and one was simulated to look like the original scanner room. If I put them on the island or didn't handle the situation just right, they freaked out and started having an existential crisis. I finally found the right guy, and Catharine talked to him as if the scan had been normal and just said, "Oh, can you tell me this password?" We had to try a few times to get the wording just right, because at first he knew something was up; the password had some degree of security clearance. I'm just telling this part of the story to re-emphasize the personal identity themes that were present throughout. As I continued along the story, the WAU started becoming more prevalent, and that story came across in small snippets.
From what I could gather, there was something called "structure gel" that the scientists were using for experimental purposes. I read a study where they reanimated a dead mouse with the structure gel, but there were some weird anomalies in the mouse's behavior. This part of the story is actually not entirely clear to me; it could be because of the long gaps between my gameplay sessions, and my memory is typically pretty poor for details. It seemed to me as though something that they thought they could control, and thought they could use for some technological "goods," ended up developing a form of intelligence on its own. They developed it to help humanity, and in some sense, when it became intelligent, it seemed to be acting to protect the humans, until the humans lost control. In some sense the WAU seemed like a "cancer," infecting and mutilating everything it came in contact with. Unlike cancer, though, the WAU was end-directed and goal-oriented. It was in some sense an organism, though in another sense it seemed mechanical - I think that's the nature of the structure gel. There was a manifestation of the WAU that took the form of a very scary "shadow." I don't know if it was real, or a projection in the sense that it manipulated my awareness so that I saw it as a villain. It made my vision glitch out when it got near me. It was actually really scary. I like scary games in general, but I am not that good at maps and mazes, and there were a lot of those, combined with running from this extremely mysterious monster figure. Toward the end of the game I start to hear voices, and as I get really close to Tau a monster tells me it will "make preparations." There's another point of fission in the middle-to-late part of the game. Poor Simon, because he still doesn't get it. He needs to wear a new suit so that he can better travel where he needs to go, and his robot body (oh yeah, he is a robot now.
he was freaked out when he first saw his robot body in the mirror) can't just hop inside the suit. He needs to put his consciousness inside the suit, so he goes and sits on a scanner chair and waits for Catharine to push the button. I don't know what it is he thinks will happen, poor guy. He thinks it is like teleportation, like he can just physically transfer his consciousness from the robot body to the suit. So she pushes the button and I open my eyes in the suit! Sweet, it worked! Simon doesn't know yet that his old self didn't move, but he looks over and notices another Simon still sitting in the chair. The Simon over there is unconscious for a moment because of the nature of the scan, but will wake up eventually, thinking something went wrong with the scan. Poor guy, even more so than the Simon in the suit. I decide to kill the Simon in the chair. He doesn't need to wake up and be stuck here thinking something went wrong with the scan and now he is all alone. Is this a legitimate case of killing someone? I think yes; somebody dies. It's tough in this case since pre-scan Simon didn't know. In the real world, I think the proper thing to do would have been to let pre-scan Simon decide what happened to Simon-in-the-chair. If I were pre-scan Simon I'd say kill Simon-in-the-chair, and then when you do, it's not really an immoral case of killing, I don't think. As long as pre-scan Simon is fully aware that there is no real difference, from his perspective, between Simon-in-the-chair and Simon-in-the-suit: they are both him. If he doesn't really know this, he might not be able to fully consent to the killing, and you might worry that after the scan the Simon to be killed would change his mind, having not known that he was going to feel like a full person. However, since Simon-in-the-chair is unconscious, it is not that morally grey. Only a little, because of the autonomy issue.
But even though Simon-in-the-suit is now distinct from Simon-in-the-chair, since they were both the same person a minute ago, this Simon probably has good reason to believe that the best way to respect Simon-in-the-chair's humanity is to kill him. It would be different if he weren't going to wake up in such a dreadful, dreadful place. But Simon in the suit gets angry. I don't think he even realized up until now that his self in Toronto continued living. He thought he just got frozen in time and plopped down 100 years later. He questions what he is, what existence is, whether he himself even counts as "Simon." The existential dread was real. He said, "If there's an afterlife, is it already populated full of people who would call me an imposter?" He never really comes to terms with the fluid nature of identity throughout the entire game. He's just yelling at Catharine, saying she lied, saying she tricked him. I don't think she meant to, though maybe a little, just because she knows he doesn't get it. To address the afterlife issue: if I were talking to Simon, I would say, Simon, if there is an afterlife, it's not full of distinct copies of you. If there is an afterlife, I'm inclined to think the illusion of separateness would fade away entirely. If there is an afterlife, Simon, all the yous and everyone else are just one. Nobody will call you an imposter. There is no such thing as an imposter when there is no such thing as an original. I don't know if this would comfort him, though. He would probably just get angry, because he does, certainly, exist, and an afterlife in which his identity is no longer that of a separate "Simon" "thing" doesn't feel like an afterlife in which he is "himself" at all! Like I said, poor guy. Catharine isn't sure how to comfort him, and he gets even angrier, and a little sad, when she is silent. Eventually she tells him a mundane story about her childhood to comfort him. I don't remember what it is, but it is sweet.
I forget that she was human once, sometimes. Simon asks her what is the first "human" thing she will do on the ARK, when she finally has a body. She says she would like to watch the clouds go by - does that count? Another interesting thing he said around this time was related to his perception of time. He says, "What a long day! I woke up today to get a brain scan, and it has been 100 years!" A long day indeed. It's probably been more than a day since he's been underwater. What happens to your circadian rhythms without the ability to sleep, as a robot? Do you not have circadian rhythms anymore? What would that kind of perception of time feel like? This seems important, and not trivially so, in terms of what it means to be human. Sleep is another example from the game that I think is interesting. I see some Buddhist themes throughout the game, I think. Maybe I'm just reading it that way. I recently read a book on meditation and sleep, from a philosophy/neuroscience/Buddhist perspective. I think they may have left out something fundamental to consciousness if their uploads can't sleep. Sure, the uploads have all the memories and, based on the way it's spelled out in the story, a continuous phenomenal experience. It's a given in the storyline, since inherently you play from a perspective, but you have to wonder whether a robot without the potential for delta brain waves in the real world has consciousness in the same sense as I do. Maybe you have to wonder this in the game too; maybe there is a qualitative change but the system can't notice it. I wonder if they can meditate. I wonder if, when they remove information and verbal thought, they have the same ability to reach a "pure consciousness" state. But regardless of the question of whether there is fundamentally something wrong with a conscious being that can't sleep, it also would clearly matter for the time-perception thing. One day surely would be too long.
I would be sad if I never got to sleep again, I think; even if I didn't get tired, it would just be weird. I love my half-asleep state in the morning; it's when I get my best ideas. I finally made it to Tau. It was the scariest part of the game for me, because that WAU shadow projection was following me around, being very scary and very close, and I died like 5 times. But eventually I found the ARK. I also saw the last biological human alive. She was hooked up to a ventilator machine. All alone, she didn't know the rest of the civilization had collapsed. She knew her station had, but didn't know that Omicron had too. It must have been quite the realization for her, that she was the last human on earth. I wanted her to explain to me what happened, to tell me more of her story, to tell me what was going on and why I was here and what the WAU was. She didn't, though. She just asked me to disable the life support, so I did, and waited for her to die. I put the ARK on a conveyor belt to head over to Phi, where I would put it in the rocket and send it to space. This is when I started getting excited about uploading to the ARK; as I mentioned, I super wanted to make it to "heaven." The game was getting darker, though, and Daniel told me not to get my hopes up. He said I would likely not make it to heaven; scary games don't have happy endings. But I wouldn't believe him. I'm holding out hope, I said. The WAU started making its way deeper into my consciousness but I said no, I will make it to the ARK, I will not give in. I found the heart of the WAU and tried to kill it, but then it started chasing me around the ocean floor and it was extremely intense. I thought I might die, but I was starting to feel really connected to Simon, screaming and yelling "I don't want to die! I want to live! I want to make it to heaven!!" I made it to Phi, found the rocket, put the ARK inside, and plugged Catharine in to tell me what to do. This is when I realized.
Oh my goodness, the whole game it only lets me follow along one perspective. I have been so excited chugging along to make it to the ARK; what if I don't get to see it? What if the game keeps me in the perspective of earth Simon, doomed to live out his days at the bottom of the ocean? But in the real world, you would know that both Simons are really you, right? Would you be scared? A little fear seems rational. But you also know that you will make it to the ARK, since they are both you. It seems pre-transfer Simon has reason both to be scared about staying on earth and to be excited about being on the ARK, simultaneously. So I tried to have this perspective. They are both me, it doesn't matter, I said to myself - it isn't a coin toss; they are equally me pre-scan. I get even more nervous when I ask Catharine when I will upload, and the upload button is the same as the liftoff button -- is she trying to trick me? I really will upload, right? Simon in the game wasn't scared, though; he still trusts her despite how mad he was at her before. So I push the button and a 20-second countdown starts, with loading screens indicating upload progress. I had a lot of emotions when it got to 3 seconds and my upload wasn't done. I was shouting again and I was at the edge of my seat, PLEASEEE let me make it on, haha. Like I said, I was getting very attached to my character. The mission was a little hard at times and I really wanted to succeed, and to see what the ARK was like. It seemed like such a good project; it was the future of humanity. 3....2....1... and my upload completes RIGHT at the last second. "YES! I MADE IT!" Simon said in the game. The build-up of emotion ended in success; I made it on the ship and the rocket launches. But... I'm still here. I'm still at the bottom of the ocean. I mean, I understand how this stuff works; I knew that a Simon would have to stay at the bottom of the ocean. But I really, legitimately, did not want it to be me.
This emotion I felt is a big part of the reason I loved the game (I hated it though). I wanted it to be me. I know that if I'm Simon, in a sense it was me that made it on the ARK, but... I wanted to see it. Poor Simon in the game feels like I feel, but worse. He yells at Catharine again, yells a lot. Says she tricked him. She is hurt, I think, and also a little defensive and trying to make him understand -- We made it! she says. She is excited; we made it to the ARK. Simon is upset, of course; it doesn't make him feel better knowing there is a copy of himself up there. It isn't him, he's not there. She says, "We lost the coin toss, that's it, but we made it." She starts to yell back a little; she is angry that I'm yelling at her. But then her screen cracks, the computer breaks, and I, Simon, am stuck at the bottom of the ocean on a post-apocalyptic earth, the only sentient being in existence, a deep pit in my stomach, watching the future of humanity fly off into the distance, and I will never get to see it and I will never see clouds or feel warmth or fall in love or laugh or cry or eat or anything ever again. Who cares if my copy made it on, at that point. Then the credits roll and I started crying. I wanted to go to heaven, I said. I know I made it, but I wanted it to be me. I wanted to see it. It's a little silly that I'm crying, but I am an emotional gameplayer, what can I say, haha, and like I said, I get attached to my characters, pretending they are me. I also felt a little... was it guilt? Or confusion? Because if this was really how things played out, of course I would feel the same way. How was Catharine so calm? How was she so detached from whatever perspective she took on at that moment? Was I the problem? Is that why I was so sad, why I would be if this were actual? Am I too attached to my ego? What's the appropriate reaction? Because losing a coin flip isn't quite the right metaphor, it seems, even though Catharine uses it.
Because both options are 100%; it isn't 50%. I know that I am going to open my eyes both on the ARK and at the bottom of the ocean. But I also know, before I open them, that I can only see one at once. And in this case, clearly the dread of the individual left to die sucks, but it doesn't make me think anything like "why upload at all?" The world is ending. The me on the bottom of the ocean is going to die either way. So uploading seems like my best chance, my only chance. But what about other cases of uploading? Teleportation that copies all your molecules and destroys the original is not only teleportation but teleportation+suicide. But if it destroys the original at the exact time of the copy, no one will ever know they've also died. They will just think they've traveled. I mean, these are classic personal identity transporter cases, nothing new here. But playing this game gave me a bit of extra perspective on levels of certainty. It doesn't make sense to have any sort of probabilistic view, before the scan, in terms of where I will wake up - they are all future me. All of Catharine's colleagues woke up instantaneously after their scans, as many times as I pushed "run" on the program, and they were all continuations of their pre-scan selves, to the same degree. At the same time, when I do wake up, it doesn't really make sense to say at that point that "I" am also somewhere else. But, no paradox here. So, it still hurt to be the Simon that got trapped on earth, even knowing full well pre-scan that one Simon would be. Part of this is the nature of a game; since the game follows just one timeline, I just hoped I'd get to watch the timeline that made it to the ARK. But I can't chalk it all up to that, since the way I'd feel if this were real would likely be much worse. Maybe I need to meditate more. Luckily, the game was just tricking me. I'm so glad it was -- after the credits, I opened my eyes and looked down and saw I had hands. Human hands. I have hands again?
I wasn't even a robot! Is this the ARK? It was dark, but I start walking and it opens up into a wide, green, sunny, blue-skied area with a river flowing beside me. I felt a slight sense of euphoria, being on the ARK when I thought I was never going to get to see it. That Simon, the one that woke up on the ARK, I'm sure was still ignorant of the fact that another one stayed on earth. That's ok, though I still don't know why it took him so long to get it. I walked along the path, breathing and taking it in, looking up at the sky. I stopped at a terminal along the way to take a survey on the nature of my experience. How did I feel? How were my senses? How connected did I feel to this body, and how was my sense of self? Is there meaning in this new world? Do I want to keep going with the project? I was pretty excited, so I answered quite positively. My senses were good; I felt more connected to this self than the robot self at the bottom of the ocean. I had skin again! I could smell again and I could feel the breeze again! For the meaning question, I answered that maybe we can find a new sort of meaning in this new world. We can have artists, and we can still have philosophers, I think. Maybe, if there is no possibility of reproduction, I can build a team of researchers studying genetics and what is different about the physical world that allows it to happen, and see if there is a chance to build that into the simulation. If life gets too long, maybe we can build in life cycles. Or maybe we will enjoy living infinitely, or maybe we will decide we don't want to be separate anymore, or have bodies anymore, and will combine our consciousnesses into the flow of the river. Who knows what the future holds? I walk around the bend and I see a woman - it's Catharine! Behind her is a big, sparkling, brand new city. She says hello and runs toward me, and it fades out into black. The final shot was of the rocket in space.
The ARK was all that was left of humanity - a simulated world, filled with simulated humans with real consciousness. The people inside had no way of seeing where the rocket was going, I don't think; in fact, I don't think they had any way whatsoever of interacting with the outside world. But does that matter? Was their simulated world any less "real" for being simulated? In a sense, no, as long as it really is the case that they are all conscious, and find a way to lead happy, fulfilled lives. But of course there is a clear sense in which it is limited. They will always know there is a reality outside of all that they can see, and that there is no longer any way for them to reach that reality. What would aliens think if they found the ship? Would they have any way to contact the beings inside? Would they have any way to even know that the ship had conscious agents inside? It's not a planet, after all, and likely has no signs of life whatsoever. Such a closed-off existence seems... bleak. But what else could they have done? When the future of the human race is at stake, you do what you know how to do. You might not worry as much about what counts as "real" vs. not. Maybe in the future they had decided this was simply an irrelevant question. Maybe their "completed physics" told them that it doesn't matter what things are made of, just what information they carry. Maybe they learned that even on earth there was no sense in which they had access to some objective "truth" that exists outside of perspectives, so it really doesn't matter that they also lack that on the ship. Is there any difference? Does it matter that humans created the ship? Does it matter that everything on the ship, including the people, is lines of code? (I know they are, because I looked at the code for the ARK.) Can everything in existence be codified? Are we missing something by not including in our reality the parts of nature that resist codification, if any exist?
If they "figured out" consciousness in this future world, did they also figure out "autonomy"? Could they have programmed quantum mechanics into the nature of the reality on the ship? What about additional dimensions? Is the ship deterministic? Are people algorithmic? Assuming they really do have a completed physics, what philosophical questions, if any, remain open? So, so many questions, haha. Here are my overall thoughts. I think about personal identity and consciousness a lot from the perspective of academic philosophy, but it was really neat to see how these issues are represented in popular culture, and fun to engage with that perspective. The storyline was pretty good, but some of the middle areas with the long mazes just got long and boring, and I wish they had been a little condensed. I loved the way they tricked me at the end and made me cry. I almost wish I got to see the ARK for longer after all that build-up, but I can see the appeal in leaving it a mystery. It also seemed like a break in the general narrative for the player to experience both the Simon left on earth and the Simon on the ARK, since prior to that the game had followed a single perspective. But I think we were meant to just assume that the new Simon is separate and pretend to rewind to the moment of upload. That would be why they did the credits first -- more of an after-game than a continuation of the narrative. I think one thing they could have done that would have been an interesting twist would have been to make Simon's identity crisis a little more salient and relatable. I felt like he just didn't get it, but if there was some evidence, perhaps, to show that this current Simon was "less real" or not even really conscious or something, that would have been an interesting path. What if copies really do have less value - what do you do when you find out you are one?
In terms of the philosophical themes, I thought the treatment of personal identity in fission cases was pretty cool, but fairly straightforward; nothing particularly novel or innovative. That's fine, but they could have done a lot more with respect to philosophical issues regarding consciousness. They mention the "sleep" thing very briefly but don't explore it at all. And it never really even seems to be a question whether all these robots are actually conscious. There's a lot more they could have done to make the game better and more thought-provoking. It didn't inspire any thoughts I hadn't had before, but it did make me feel things about the fission cases :) Which I appreciated. All in all, I'd give SOMA an 8.4/10.
Author: Jenelle is a grad student interested in philosophy of mind.
October 2020