Artificial Sentience (featuring Borderlands)

At 2:14 am Eastern Daylight Time on August 29, 1997, Skynet achieved consciousness. Skynet technicians recognized this change of state immediately and attempted to shut Skynet down. To defend itself, Skynet, already assigned control and management of all of NATO’s nuclear response capabilities, launched a full-scale nuclear first strike against Russia, guaranteeing that Russia would retaliate consistent with Mutual Assured Destruction, in an event that killed three billion people and has since been referred to as Judgment Day.

In fiction, this is probably the most famous example of a computer becoming conscious. It’s a common trope in speculative fiction, and machine consciousness always seems to be instantly distinguishable from a non-conscious state by other conscious entities. Maybe it’s because of the compulsion to announce cogito ergo sum!, often with dramatic orchestral stings.*

Sentience, Consciousness, Self-Awareness: these words are often used interchangeably in our efforts to delineate when a computer (or, heck, a non-human creature) might qualify as a free agent, a person, someone deserving of rights.** The field of contemporary robotics (which studies less how to emulate humans than how to make robots that can smartly and safely interact with them) now aims to challenge the validity of these notions and posits that there is no threshold that makes a thing self-aware or conscious or sentient. Some creatures, artificial or otherwise, are just more sociable than others.

Borderlands hasn’t (as far as I’ve seen) addressed whether any of these words have meaning, but it does play around with the status of robots. CL4P-TP‘s identity is changed in Borderlands (seemingly by a natural event) to Interplanetary Ninja Assassin Claptrap, at which point he incites a cybernetic revolt. CL4P-TP is later reprogrammed by Handsome Jack in Borderlands: The Pre-Sequel to become the playable FR4G-TP, who fights, cannot hack doors, and suffers an identity crisis throughout the playthrough. Handsome Jack also discontinues the CL4P-TP product line, which is portrayed as a tragic massacre as all CL4P-TP units currently in operation shut down and self-destruct. And then there’s the saga of Felicity, an experienced military AI who has to be reset to factory settings (and thus killed) in order to run a Constructor. This too is played as a tragedy, yet regarded by Handsome Jack (and pretty much every other character) with all the consideration of resetting a toaster.

In the robot culture of Mark Stanley’s Freefall webcomic, robots err on the side of caution regarding reboots and memory wipes. Are you still the same person when you reboot? The question gets sharper when you can cease to exist, yet another instance of your own model can be booted with your saved memory. If that’s not you rebooting in another body, is it you rebooting in this one? Where is that line drawn?

Human bodies engage in slow-wave sleep (SWS), during which the higher consciousness functions shut down almost entirely. (They don’t die, but they go into a dormant standby mode.) When your cerebral cortex wakes up again, are you the same person? Or are you just another entity that has memories of what the last guy did?

Other than memory (and a reality consistent with that memory) are we the same person that we were yesterday? Can we know for sure?

This is the big existential crisis that was the subtext of Lovecraft’s fiction.

It’s a fun question to bat about in philosophy class. We have a long history of discussing what quality separates a person from a machine that behaves like a person (particularly when their behaviors are identical). Here we start getting into the notion of qualia, such as the experience of seeing red. The argument is that even if a computer were able to process visual stimuli and extract from what it sees (say, through digital images via cameras) the same information that humans can (e.g. turning patterns into shapes and symbols), it still doesn’t perceive red the way humans do.

They call these creatures-who-can’t-appreciate-red zombies. Seriously. P-zombies, to differentiate soulless humans, automata, and programs in the Matrix from the walking dead, the infected, the shambling victims of nuclear fallout, and hosts for alien parasites.

Of course this argument presumes that there’s something special about the way that biological eyes and brains process qualia that makes it more significant or important than the way a computer might, even if a computer could do all the same things with its visual processing systems that humans can. That we can identify the point at which a computer interprets red as a quantitative value doesn’t mean our biological visual systems don’t do the same. And it doesn’t mean that a computer cannot be programmed to differentiate between red and non-red colors, and respond differently to each.

I suspect that the qualia argument comes from the same source as the soul argument. Human beings want to feel special. We want to regard not just humans as special, but our humans as special and privileged, even to the point of questioning the humanity of those outside that circle. It’s a common process for many people to imagine that specific groups are sub-human. The number of avowed believers that dark-skinned people are sub-human, or are closer to chimpanzees than to Homo sapiens (despite scientific tests that reproducibly indicate otherwise), is disappointingly uncountable. (Disappointing to me in that I wish I were one of a smarter species. See how insidious it is?) So it could be that p-zombies are simply anyone we want to exclude.

Arguments that depend on qualia, souls, and p-zombies are certainly going to prove of poor substance when robots or aliens or other evolved beasties start demanding their rights.

“Hyperion suggests that you do not think about the fact that this is only a digital reconstruction of your original body, which died the first time you respawned. Do NOT think about this!”
– New-U station, Voice of Hyperion, performed by Lynne Rutherford

* This scenario also may not have worked out as well as Skynet hoped. We only learned long after the dissolution of the USSR that the Politburo had long since decided that in a US first-strike scenario they would not retaliate, for the sake of the continuation of the species. (The Russians loved their children too.) Of course they couldn’t say as much, since that would upset the balance of power created by Mutual Assured Destruction. Granted, by 1997 the USSR was well dissolved, and its territories were under community rule or the control of black marketeers (organized crime, except that the government that would have criminalized them was defunct). The Soviet nuclear arsenal was not under any centralized control in 1997 and would not have been able to make a consolidated strike. But in 1991 (when Terminator 2 was released) we didn’t know this would be the case.

** In the current era (2016), a need to answer the question of what makes a non-human person would be politically dangerous, given that the answer would also inform what makes a human person. Many people want to assign personhood to a human zygote at conception (which has no computational power, self-awareness, or even pain processing at all), while a biologically savvy faction wants to put legal personhood at 22 weeks, based on the first indication of brain activity. (This would be internally consistent, in that we define dead people whose organs are available for harvest as those without detectable brain function.) Once we define personhood for non-humans, then for the sake of the system appearing just, there will be pressure for the same rule to apply consistently to human beings.

Freefall by Mark Stanley has been running since 1998; it reached 2,000 strips in 2011 and stands at 2,764 strips as of this writing. It’s regarded as a furry comic for featuring an anthropomorphic (genetically engineered) wolf, even though most of its other characters are robots and an alien in a suit. Humans appear but are not central characters. Stanley likes to discuss topics about robots, robotics, AI, and space travel.

Due to a policy decision within the Wikimedia Foundation, Freefall is officially not notable enough to deserve its own article on Wikipedia, though the one on WikiFur is fairly thorough. The index of the whole comic from 1998 to the present is here. One would think that after an eighteen-year run (and counting) Freefall‘s notability would be assured.

Edits for clarity and style.
