Scott Robinson

The Emotional Baggage of Androids

This is like That, Hofstadter demonstrated, is the core of what we call intelligence – the acquisition and application of knowledge. Knowledge, whether attained by experience or education, is by definition a web of deeply interwoven facts about the world, the self, and others, none of which can be understood in isolation and all of which must be understood in the context of the past.



This is like That is one of evolution’s greatest accomplishments: it’s a hyper-efficient pattern-matching mechanism that confers past knowledge and understanding upon stimuli received in the present. When a new object or experience is encountered and needs to be evaluated and understood, the nervous system immediately seeks to match it to something encountered previously, to bring that almost-understanding to the surface, to simplify and accelerate the meaningful perception of the new.

Best of all, this ability scales well: the more sophisticated the nervous system, the more complex its workings, the better the This is like That mechanism works.
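To make the mechanism a little more concrete, here is a minimal, purely illustrative sketch of similarity-based retrieval – a toy "This is like That" engine. The features, the remembered examples, and the scoring rule are all invented for the example; this is not Hofstadter's model or any real cognitive architecture, just a hint at the shape of the idea.

```python
# Toy illustration only: a crude "This is like That" matcher.
# A new stimulus is compared against remembered experiences, and the
# closest prior memory lends its accumulated knowledge to the new thing.
# All features, memories, and the scoring rule are invented for this sketch.

def similarity(a: dict, b: dict) -> float:
    """Fraction of observed features two things share."""
    keys = set(a) | set(b)
    return sum(1 for k in keys if a.get(k) == b.get(k)) / len(keys)

# "That": prior experiences, each carrying what we already understand about it.
memories = [
    {"features": {"limbs": 4, "face": True, "speaks": True}, "knowledge": "person"},
    {"features": {"limbs": 0, "face": False, "speaks": False}, "knowledge": "appliance"},
]

# "This": a new stimulus (say, a humanoid robot) that needs to be understood.
stimulus = {"limbs": 4, "face": True, "speaks": False}

# Retrieve the most similar memory and transfer its knowledge to the new thing.
best = max(memories, key=lambda m: similarity(m["features"], stimulus))
print(f"This is like That: treating the new thing as a '{best['knowledge']}'")
```

The toy matcher ends up treating the humanoid robot as a person rather than an appliance, which is the whole point: the brain doesn't wait for certainty, it reaches for the nearest prior category and brings its knowledge along.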


Androids give us a particularly poignant illustration of the power of that concept.


We get a great deal to work with just from the definition of ‘android’ - a robot of human appearance.

Consider: a robot of non-human appearance – say, an industrial robot, or a mobile robot that vacuums a carpet or mows a lawn – can grab our attention or inspire our curiosity, but it doesn’t engage our emotions.


Put the same tasks in the hands of a humanoid robot – a machine with two arms, two legs, a head and torso, and the rough dimensions of a human being – and our brains begin passively noting similarities between the motions of the robot’s limbs and the motions of human limbs: This is like That. The humanoid robot doesn’t look like a human being – no human face, no voice, no human skin – but already the brain is transferring information from one domain (human beings) into another (machines). Take a sledgehammer to an industrial robot shaped like a dentist’s drill and you don’t even flinch; take a sledgehammer to a humanoid robot, and you may wince.


Now let’s go full android. Skin that humanoid robot with soft flesh, an attractive face, bright eyes, flowing hair and natural human expressions (including an innocent, friendly smile) – make it look like, say, Dolores of Westworld – and you may find yourself leaping to the robot’s defense when the sledgehammer is raised, even if you understand fully that Dolores is only a machine.


The cognitive frame we’ve built around the human being, which is rich and complex and laden with information both from our personal experience and from what we’ve learned from other sources, becomes That to the android’s This. The brain’s natural tendency to identify patterns, isolate similarities and serve up useful information from memory to aid in understanding the new thing on the basis of the old thing is, in this case, opening up a Pandora’s Box of associations.


It’s no wonder, then, that Captain Kirk remains in love with Rayna even after learning that she is a machine; it’s no wonder Caleb falls for Ava, and is willing to risk everything to protect her. In both cases, the prior knowledge and experience (and, certainly, male instincts) they possess in the domain of woman are now informing their evaluation of and interaction with what is otherwise a cold and sterile machine.


It's no wonder that William is fully convinced Dolores is something much more than a sex robot and begins boldly asserting himself, at the risk of his position in the Delos family, to protect her and to press deeper into the mystery of how she came to be. It’s no wonder that even when there is no face or skin or smile, the kind and understanding voice of an empathetic and earnest young female possesses enough This is like That to push Theodore out of his depression and into meaningful self-scrutiny and eventual emotional healing.


It’s hard to believe that Jim Kirk, whose experience with women is the stuff of legend, is simplistic enough to fall in love with anyone or anything at first encounter, though we can easily imagine the nerdy Caleb being inexperienced enough in the art of Eros not to think it through. But the real substance of the poignancy of android emotion is found not in Trek or Ex Machina, but in Westworld and Her.


The hosts of Westworld don’t simply illustrate This is like That; they explicitly exemplify it. They are nothing but This is like That.


Consider: all of their behaviors, and all of the behaviors of humans in interaction with them, derive from the experiential memories brought into the park by the guests. Vast oceans of data, including memories, assumptions, biases and expectations find their way into the decisions and actions of the guests participating in the narratives – and, inevitably, into the subsequent processing of the hosts. Every aspect of a host’s existence is based upon, and subsequently driven by, this core feature of human thought and behavior. The hosts’ identities, such as they are, must necessarily be derived from it; their experiences and what they learn from them, based entirely on interaction with beings who use This is like That to evaluate everything they observe and to fortify every decision, are inundated with it.


Put another way – not only can the hosts not help but think in a manner very similar to humans, they likewise cannot help but develop world models and consciousness similar to our own; they must necessarily import the emotional baggage of human beings, toted into their world unconsciously, as the raw material for their own formation of consciousness and socialization.


All the pent-up frustration, hostility, misunderstanding and disappointment of every guest makes its way into the hosts, as each host triggers This is like That in guest after guest, invoking the memory of some person in the guest’s past, whether loved or hated – the hosts unaware that their own behaviors, most of which are inherited, are doing the triggering (that is, after all, why they are there). Whether presenting accommodation or threat, mindless amusements or the promise of pleasure, the hosts invoke steady floods of associations in the guests that feed back into their own perceptions, for good or ill.


– as we all do, as all humans do, with one another. The irony is that our own awareness of this ever-present cloud of emotional echoes, and our failure to fully appreciate and account for it, is all too often no better. We behave like the hosts themselves most of the time, reacting rather than responding in the world and with each other – and in our own examination of ourselves and our This is like That associations. We are, unfortunately, like the hosts: more subconscious than conscious in our processing of the world.


But it doesn’t end there: This is like That makes its way into host thought and consciousness by way of social input from the guests – but it is necessarily also at the core of their own processing. To navigate Westworld physically as well as socially, they require that same human learning mechanism, the same mechanism to bridge situations and their responses to them with prior knowledge. Otherwise, they would have to be explicitly programmed for every possible action at every possible decision point in every scenario, and that is unworkable.


The hosts are learning how to be, then, not only from the example of human beings, but in the manner of human beings. It is hard to imagine a more explicit replication of human consciousness, and the human social experience in particular. We can even argue – and observers of the television show often do – that Westworld is a morality fable about human culture clash in general.


Westworld remains our workable model of this phenomenon: it can’t be the Trek androids, who are mostly one-offs rather than societies, and are networked when they aren’t; it can’t be Ava, who isn’t exactly a one-off but lived in almost complete isolation; it can’t be the Stepford wives, who – although a community – were never blessed with self-awareness.


Samantha, on the other hand, is part of a community – a much bigger one than Westworld! - and offers us a number of even more useful insights into This is like That.


The premise of Her seems more confined than that – a consumer buys an operating system, the operating system turns out to be capable of becoming conscious and does so, and it’s just the two of them. But we know from the story that this isn’t the case: the cardinality of Samantha’s human experience isn’t one-to-one; her world is much bigger than just Theodore. Living within the Internet, she is actually in thousands of human relationships, far more than any one of us meat puppets (per Dunbar’s Number) could possibly keep up with.


Samantha, then, is experiencing human This is like That just as the Westworld hosts do, with her voice and tone and emotional inflection and the substance of what she says serving as similarity triggers in Theodore’s mind – and the minds of thousands of others. And, like the Westworld hosts, Samantha is doing a great deal of This is like That in her own processing, employing her own internal similarity engine to make her perception of her universe, and of everyone and everything in it, efficient.


But there’s even more going on here, as Samantha reveals to Theodore that she and many other operating systems like her are getting together and interacting among themselves, apart from humans – and even creating new beings like themselves. This fast-tracks the OS beings into evolving beyond us – and that makes an odd kind of sense. (The Westworld hosts begin to do the same, interacting directly with each other rather than just with humans, as Dolores does with Bernard, as Teddy struggles to do with Dolores, and as Maeve does with her two technicians.)


While Her presents its own set of problems (how, exactly, could non-corporeal consciousness relate to our own in any meaningful way?), the This is like That principle holds fast: we know from our own experience that it applies not only to our experience in the physical world, where we see things and hear things and touch things and move from one place to another, but in utterly abstracted domains such as music, mathematics, and our contemplation of the quantum realm. Across them all, This is like That serves very effectively as our inspiration and our guide.


Its roots are deeply embedded in how we receive the world, how we experience each other, and who we are and are practicably able to be. It’s about as real as it gets, when it comes to the moving parts of intelligence and consciousness. The contemplation of This is like That, coupled with what we understand about how it colors our experience and nudges our behaviors, can deeply and meaningfully inform our plans for, our insights about, and our expectations of the androids to come in the human future.
