Empathy—Or Lack Thereof?

In his novel Do Androids Dream of Electric Sheep?, Philip K. Dick invites readers into a post-apocalyptic version of Earth that most humans have abandoned in favor of colonizing Mars and other planets. On this desolate planet, bounty hunters like Rick Deckard make a living catching escaped androids that have made their way back to Earth. To test whether these realistically built androids are human, Rick administers the “Voigt-Kampff” scale, which measures the subject’s (human or cyborg) reactions to provocative questions and statements, testing for the empathy that supposedly only a human is capable of possessing.

This scale is put to the test when Rick meets with members of the android-developing Rosen Association: Eldon Rosen and his niece, Rachael Rosen. The duo tries to prove to Rick that his scale is inaccurate by having him use it on Rachael, whose reactions produce unusual readings: “[t]he gauges, however, did not respond. Formally, a correct response. But simulated” (Dick 50). While Rachael’s results classify her as an android, the Rosens insist to Rick that she is a human whose unusual upbringing stunted her empathetic reactions. Rick, however, tries one last test, and the results give him his desired outcome: “He saw the two dial indicators gyrate frantically. But only after a pause. The reaction had come, but too late” (Dick 59). Rachael is confirmed to be an android and not, in fact, an empathy-lacking human.

This encounter poses a hypothetical question: would it be morally wrong for someone from our society today, a human, to fail this specific empathy test? While in Dick’s world the Voigt-Kampff screens for androids, would failing this test in our android-free world make someone less than human?

Provocation

In chapter one, I found a very interesting part of the reading to be when Rick talks to his neighbor Barbour about his electric sheep. At the end of the conversation, trying to convince Barbour to let him buy his foal, Rick reveals that his sheep is not real. Barbour assures him he will not tell anyone, at which point “something of the despair that Iran had been talking about tapped him on the shoulder…” (Dick 13). Rick, aggravated, tells Barbour that if he can retire enough andys (androids) in one month, he could buy a real animal. Barbour jokes that Rick should buy a domestic animal instead, or a cricket, which sends Rick into a silent rage. Still aggravated by the remark, Rick tells Barbour that his horse could die, and that Barbour could find her lying on her back like a cricket. This part of the reading really stood out to me because of all the different emotions Rick goes through. At the beginning of the chapter, he and his wife Iran talk about having to program their emotions for the day, rather like robots. Yet when Barbour tells Rick he will not tell anyone about the sheep, Rick feels despair without having to program it into his system, and he seems to feel anger just as spontaneously. My question is: so far in the reading, what do you think Dick is trying to make us believe Iran and Rick are? Are they human? Are they androids? Are they somehow a mixture of both?

Animals and Empathy

In the world that Dick created, there is a belief system known as “Mercerism.” Nearly everyone seems to own an animal, and those who cannot afford a live animal own a realistic mechanical replica of one instead. For Rick, and surely for many others in this universe, however, “owning and maintaining a fraud had a way of gradually demoralizing one” (Dick 9). Those who do not own an animal are looked down upon, because “from a social standpoint it had to be done… He had therefore no choice except to continue” (Dick 9). Rick’s neighbor, to whom he reveals his mechanical sheep, explains that people “consider it immoral and anti-empathetic” not to care for an animal (Dick 13).

What do owning an animal and having empathy have in common? Why would it be so looked down upon not to take care of an animal in this world?

Penfield, Good or Bad?

In the first chapter of Do Androids Dream of Electric Sheep? by Philip K. Dick, the reader learns that in the future individuals are able to control their emotions using a device called the Penfield mood organ. The Penfield is essentially a machine on which the user “sets” an emotion to a specific degree; for example, if users want to feel joyful, they turn the knob to the setting that will let them feel that emotion. “‘My schedule for today lists a six-hour self-accusatory depression,’ Iran said… It defeated the whole purpose of the mood organ” (Dick 5). In this chapter, Iran has set her mood to depression, her reasoning being: “I realized how unhealthy it was, sensing the absence of life, not just in this building but everywhere… they called it ‘absence of appropriate affect’” (Dick 5). To her, this machine does not seem normal; everyone should have a natural reaction to every situation, not a planned one.

If the device were to exist in reality, in my opinion the Penfield would be used for evil instead of good. Instead of helping people with severe depression, it could cause individuals to lose their sense of empathy, which would lead to bad decision-making. If there is a chance that the device would be used for bad, should the Penfield be strictly reserved for individuals with severe mental illnesses and no one else? Or should the mood organ be available to everyone, just at a higher price, and if so, does that then dehumanize everyone?