Elove: What Does Fiction Know?



Reading Philip K. Dick’s Do Androids Dream of Electric Sheep?, we are as “bamboozled,” to use one of his terms, by the representation of a dystopian world as we are by the awe-inspiring opening images of its film adaptation, Blade Runner (Ridley Scott, 1982). What escapes us at first, starting with the title and the first few pages, is the novel’s framing within a psychological, cognitive, and affective system. The title poses a question of desire, paralleling the protagonist Deckard’s dream (desire or ambition) of acquiring a genuine sheep: if humans seek real sheep, do androids seek electric ones? If androids can dream at all, can they dream only of artificial, electric sheep? Moreover, we are immediately introduced to the mood organ, which enables its user to dial the “artificial brain stimulation” (7) needed to be in whatever mood she would like to be in. For the moment, Iran, Deckard’s wife, does not want to dial anything; she wants “to sit there on the bed and stare at the floor” (7). Dick thus establishes a clear distinction between a simulated mindset and a genuine one, unaided by the machine’s stimulation. The simulation/genuineness pair is immediately made explicit in the next two paragraphs, whose subject is entirely different. Deckard goes up to the roof of his building to see his electric sheep, which, “sophisticated piece of hardware that it was, chomped away in simulated contentment, bamboozling the other tenants of the building” (7-8). As for the neighbors, “To say, ‘Is your sheep genuine?’ would be a worse breach of manners than to inquire whether a citizen’s teeth, hair, or internal organs would test out authentic” (8). There is, then, for the sheep, simulated contentment versus genuineness, just as there is for Iran a simulated mood versus simply staring at the floor, the sign of her genuine mood. The plot of the novel is about maintaining that distinction.
It turns out that a recent generation of androids has developed characteristics too close to those of human beings and is revolting against the oppression to which it is subjected. On the orders of the “authorities,” Deckard’s job is to seek out and destroy these androids, effacing the blurring of simulation and genuineness that has begun to appear.
“Is your sheep genuine?” If the question can be asked, it is because the bamboozling of the other has succeeded: as with the hunted androids, you cannot, cognitively speaking, tell the difference between an electric simulation of a sheep and a genuine one. What you see is not necessarily what you get. Moreover, if the question can be asked, it is because, given the difficulty of distinguishing one from the other, that knowledge has some importance. For various reasons, you simply do not relate to a simulation the way you relate to the real item. Early in the novel, it is the social pressure of owning an electric sheep rather than a genuine one that bothers Deckard: “Owning and maintaining a fraud had a way of gradually demoralizing one. And yet from a social standpoint it had to be done, given the absence of the real article” (9). Electric sheep “break down and everyone in the building knows” (12).
Deckard’s ambition, his own “dream” as a matter of fact, is to eliminate enough of these androids to earn the money to trade in his electric sheep for a genuine one. As the idea of acquiring “the real article” grows on him, so does his understanding of the effect that knowing the nature of the animal—real or ersatz—will have on his relation to it. Seeing what he at first takes to be a real owl at the Rosen Association—the manufacturer of the androids he is supposed to “retire”—he begins to understand what that knowledge brings to his relation to the animal:

He thought, too, about his need for a real animal; within him an actual hatred once more manifested itself toward his electric sheep, which he had to tend, had to care about, as if it lived. The tyranny of an object, he thought. It doesn't know I exist. Like the androids, it had no ability to appreciate the existence of another. (42)

The consciousness that you are dealing with a genuine animal is reflected back onto you, affirming your own existence as genuine, as an entity with a consciousness as opposed to an android. Owning a genuine sheep confirms, in other words, your own genuineness as a human. Or, to put it in the terms of Villiers’s Edison, dealing with a genuine sheep confirms your own “intimate sense” of yourself and your “true reality” (14).
“Is your sheep genuine?” Turing, of course, would have set the problem entirely aside: if you cannot tell whether it is a human or a machine answering your question, it doesn’t really matter, and we’ll just say that what the machine does is “something which ought to be described as thinking” (435). But can we say the same thing in this case? A Turing Test adapted to our question would have to go something like this: if you cannot distinguish between two animals without knowing which one is artificial, and both give you the impression that they know you exist, we’ll just call it a day and say that the artificial one does “something which ought to be described” as acknowledging your existence. Such a genuineness Turing Test has, in fact, already been imagined by Villiers, whose Edison promises his young friend Lord Ewald an android to replace his fiancée:

--But…such a creature could never be anything but a doll, without feeling or intelligence! he cried, for lack of anything else to say.
--My Lord, said Edison solemnly, you may take this on my word of honor: you will have to be careful, when you compare the two and listen to them both, that it isn’t the living woman who seems to you the doll. (64)

As soon as you know which is genuine and which is not, however, only the one you believe to be genuine can perform that acknowledgement. It is not, then, the authenticity of the object at hand but your knowledge of that authenticity that guarantees the functioning of the system. That is how Nathanael, in “The Sand-Man,” falls in love with Olimpia. Whether she is really an automaton or not does not matter; it is his presumed knowledge that she is a person that allows him to fall in love with her. That presumption of knowledge is determined by a predisposition or mindset; and, of course, as soon as he finds out that she is a machine, that love dissipates into thin air. The situation is quite different in Cassou-Noguès’s “Le Jeu.” The narrator there does not take the machine’s projections as genuine. It is a game, after all. But for the character he plays, Proust’s Marcel, Albertine’s genuineness as a real person is not in question. The narrator might as well not play if his Marcel did not believe, for the duration of the game, in that genuineness.
