A while ago I watched the movie Prometheus, completely unaware at the time that it was related in any way to the Alien movie franchise. I found one particular part of the movie fascinating, however: the character of David8, an android who is sent along with a team of specialists on a scientific expedition to find signs of life on another planet. He practically takes care of the ship and its crew, who are in stasis for most of the journey.
In Prometheus, we see a meticulous David go about his daily routine on the spaceship. He plays basketball while riding a bicycle, he learns an ancient language while eating his meals, and he watches Lawrence of Arabia and styles himself after Peter O’Toole in the movie. Innocent enough — but things get particularly dangerous and grim when (spoiler alert! …actually, this whole article has spoilers, so watch out!) David starts following a hidden agenda of his own, essentially using members of the crew as lab rats as he investigates the mutagen that becomes the aliens.
I’ve always been interested in AI, along with virtual reality. I think that’s why I’ve merged the two in my work-in-progress, Neon Vape, which features a character stuck in a virtual reality game with only an in-game AI for company. David’s character immediately fascinated me in Prometheus, but I still couldn’t quite put it all together: what he was doing and why. Even his connection with Weyland, the idea that he was simply following Weyland’s commands, seemed to me an incomplete explanation for his behaviour. It was in Alien: Covenant that I truly began to understand the extent of David’s agenda and his dangerous thought process.
I’m just spit-balling here, but I wanted to share some thoughts that I’ve been having about this character, who to me is currently the most interesting character in the entire franchise. In particular, I want to focus on two main themes: the idea of creation and the ability to create, and the idea of sentience in androids.
In Alien: Covenant, after teaching Walter to play the flute, David guesses that Walter doesn’t have the ability to create, “even a simple tune”, to which Walter explains that David made people uncomfortable because he was too human and thought for himself, and that as a result all subsequent models, Walter included, were created “with fewer complications” (or, as David puts it, “more like machines”).
I found this particularly strange and interesting for a number of reasons. First, there’s the definition of “create”: what does it mean to create something? Evidently it can’t mean creation out of thin air (and if nothing is ever made from nothing, then arguably nothing is ever truly created at all). But if we take the definition as simply making or crafting something, not necessarily out of nothing, like a new tune on the flute, then perhaps we can analyze David’s desire to create from this perspective. In other words, it is not so much a desire to be able to create as a desire for creativity.
How do people usually learn to be creative? In school we are taught to memorize and duplicate first, and only then to create. You learn a language by writing down the letters again and again, copying them, then words, then sentences. You learn to tell stories by learning what stories are and reading them, and then by making your own based on what you now know a story is and what it should look like. Creativity is, in many ways, the result of learning and understanding.
In Prometheus, David mirrors this in stages. First, he learns. He learns, for example, how to adopt Peter O’Toole’s mannerisms, style, and even speech patterns and accent, from his portrayal of T. E. Lawrence. He learns about the mutagen and how it works. He learns about the Engineers. He learns, then reacts accordingly, his plan changing and adapting, his pseudo-creativity constantly evolving as he learns more and more.
But I think what mystifies me about the character of David is his ability to choose what he “likes.” How does David realize he likes O’Toole’s portrayal of T. E. Lawrence? How does he decide, out of all the characters and people he could model himself after, that this particular one is the one for him? That’s one of the questions I still haven’t come up with an answer for…
In any case, creation is a bit of a moot point when it comes to David. He wants to create, but at the end of the day, he cannot; he can only work with what is already there. His ‘creations’, the aliens he breeds and experiments with, are simply derivatives. He still has not ‘created’ anything, only altered what already existed.
It was also mentioned at some point (and I think it was David who mentioned it, in fact) that to create, one sometimes has to destroy. I think this is an interesting point, insofar as creation and destruction have been linked in many ways throughout history, but I also think that in this situation it is an erroneous statement, which makes me wonder about David’s logic. David partakes in destruction by completely annihilating an entire planet’s population, but he’s really only clearing the canvas in order to make something else: his aliens. It’s like taking the Mona Lisa and painting over it in white in order to paint your own masterpiece.
It’s not real creation, so it seems as though David himself has a very particular understanding of what it means to create, and a very limited one at that (dare I say, more limited than a human’s, which would make sense and bring other issues to light if we accept that David is rebelling against the idea that humans are superior to him while also realizing that, though they may be “inferior” in certain ways, they are also “superior”: a paradox that quite possibly gets in the way of his reasoning).
Actually, taking a moment to think about it a bit more, it is possible that the issue with David is his inability to make value judgements in such cases. How can he say with any certainty that humans are inferior to him? How can he say that the aliens are superior to humans? It seems entirely irresponsible for an android as intelligent as David to weigh such questions so simplistically, which only increased my interest in the character and the choices he makes during the movies.
I tend to wonder whether David is aware of the paradoxical nature of his thinking (and, even if he isn’t, how in the world he makes decisions when his reasoning doesn’t seem to balance out), and how that affects our perception of his decision-making and problem-solving throughout the movies.
There is an entire focus in Alien: Covenant on David’s supposed emotional connection to Dr. Elizabeth Shaw, who saves him at the end of Prometheus but who is dead by the time we see David again in Alien: Covenant. This is revealed mainly through his conversations with Walter, along with other more subtle representations of emotions David claims to have.
For example, David claims that he “pitied [Weyland] at the end” of his life. After teaching Walter to play the flute, he seems pleased with the other android’s accomplishment and compliments him. When Walter explains that the models that succeeded David were made with fewer human-like qualities, David guesses that Walter is unable to create, which David calls “damned frustrating”.
While language is definitely one way in which we humans convey our emotions, thoughts, and opinions, it is also easily replicated. David is an intelligent android, and I have no doubt that, given his ability to recognize emotions and recreate facial expressions, he can duplicate them as he pleases. And there is nothing stopping him from using emotional language, even if he himself can’t feel the emotion.
And that’s my next point: I don’t think David can feel any emotions. I think, rather, that David is simply reacting to the logic and reasoning in his ‘brain’. In other words, I think David has an understanding of what would be ‘damned frustrating’, and has studied the humans around him for long enough to comprehend and record when certain emotions are displayed, and in which situations. After all, David’s ‘life’ didn’t start on board the spaceship, and he spent many years on Earth interacting with Weyland and other humans. In that sense, he can recognize certain situations and the emotions they would normally elicit in humans, and duplicate them accordingly. I don’t think he actually has any genuine emotion at all.
Sentience, however, is another discussion, because once again we have to agree on a common definition to use as a base. If we take sentience to mean emotion, then I argue David is not sentient. Scholars have also argued that sentience has little to do with agency, because agency is found across all animals: they all have the ability to act (check out this article for more on this), yet not all animals have been proven sentient. So the fact that David is set free from his master upon Weyland’s death and can act of his own free will does not make him sentient.
Perhaps there are other parameters that can tell us whether or not something is sentient, but there’s also one more thing: sentience seems to presuppose that the organism whose sentience is being investigated is, in fact, alive. David is a machine, and therefore not alive. This, too, makes his sentience difficult to gauge.
👾 👾 👾
That’s all I have for now, but this character is definitely an interesting one, and I’m fascinated to see how David’s story finishes up. The more time we spend with him, the more about him is revealed, and I’m interested in finding out what the scriptwriters/actors/directors have in store for his character.