Humanity & Self-Awareness

In chapter 11, it is revealed that Resch, like Garland himself, is an android, though Resch's case is different: "We all came here on the same ship from Mars. Not Resch: he stayed behind another week, receiving the synthetic memory system" (Dick 121). Garland explains that Resch does not know he is an android because this synthetic memory system convinces him he is human. The big question is what Resch will do when he finds out, since he is unlikely to believe it. When Resch returns, Garland aims his laser beam at him and Resch kills him, proclaiming that it is his job to anticipate what androids will do. When questioned about their conversation, Rick does not tell Resch that Garland identified him as an android too, only that Garland admitted to being one and that there were many others. With this knowledge, Resch and Rick come up with a plan to get out of the building. Shocked that he never suspected Garland, Resch begins to question himself and wants to take the test to see whether his memory has somehow been tampered with, though Rick subtly hints that he will not like his results. Resch chooses not to believe him, insisting that he has a squirrel, an animal he cares for and loves, like any other human would.

Why do you think Rick chose not to tell Resch? Does owning an animal really make Resch human in the eyes of Mercerism, or is it just the effect of the false memories on him?

Retire All Androids?

In chapter 12 of Philip K. Dick's novel Do Androids Dream of Electric Sheep?, Rick struggles with his personal view of androids. After retiring Luba Luft, Rick is suddenly upset over her death and debates whether he should continue being a bounty hunter. When Phil Resch reminds Rick that bounty hunters are necessary, Rick questions the need for killing androids. He argues that Luba was not a threat to humanity, that "she was a wonderful singer" and "the planet could have used her" (Dick 136). He wonders to himself, "how can talent like that be a liability to our society" (Dick 137).

The question I have about this section is this: is it necessary to retire all androids? Are they all a threat to society, or can some of them live peacefully on Earth as long as they do not harm humans?

Machines vs. Religion?

Isidore is on his way back to the false-animal vet to deliver a mechanical cat. On the way, he thinks about how Mercerism helps spread empathy throughout the world; he believes that Buster Friendly could be jealous of that (Dick 75). Only Buster had a problem with Mercerism; everyone else was fine with it, and even the U.N. secretary has said on numerous occasions, "Mankind needs more empathy" (Dick 75). Isidore arrives at the vet and hands the cat over to his boss, Mr. Sloat. It appears from Isidore and Sloat's conversation that Buster Friendly and Mercerism are fighting for control over their "psychic souls" (Dick 76). They discuss how both sides are immortal, and how Buster must be some kind of superior life form.

From what I can tell, Mercerism is a type of religion based around empathy. If Buster Friendly is always talking badly about it, could that mean the TV show is hosted by androids, since they are also immortal?

Is Emotion a Luxury?

In chapter 1 of Philip K. Dick's Do Androids Dream of Electric Sheep?, he describes Iran and Rick's use of the Penfield mood organ. Upon waking up, Rick notices that Iran is still not awake. He tells her, "You set your Penfield too weak…I'll reset it and you'll be awake" (Dick 3). Her response is unpleasant in nature, as she does not want to wake up: her dial schedule is set for a "six hour self-accusatory depression" (Dick 4). Iran scheduled this because her mood was too good; it did not feel healthy to feel only good emotions, so she put despair on her schedule twice a month (Dick 5). Iran also mentions, if only for a moment, how grateful she is that they could afford the mood organ.

I find it peculiar that Iran would want to feel these emotions when she has the ability to feel joy all the time. She mentions that the way she feels, even with the mood organ, "used to be considered a sign of mental illness" and does not seem normal (Dick 5). Why would Iran choose to set her mood organ to an emotion that mirrors mental illness? Also, is the Penfield mood organ something that everyone owns, or is it a luxury that only a few people own and use daily?

The Right To Choose

A deeply questionable moment occurs in the first chapter of Do Androids Dream of Electric Sheep? by Philip K. Dick.

The protagonist, Rick Deckard, is having an argument with his wife, who doesn't want to use the Penfield machine, which controls empathy and human emotions. He has also learned that twice a month she dials her machine to a despair setting. He sees this as dangerous, telling her that "Despair like that… is self-perpetuating" (Dick 5). She, on the other hand, doesn't feel right about not despairing when despair might ordinarily be called for, such as when she senses all of the empty apartments in their building. The argument ends with Deckard declaring, "I'll dial for both of us," and setting her dial to "pleased acknowledgement of husband's superior wisdom in all matters" (Dick 7). He then proceeds to go about his day, more or less as normal.

What I would like to field to the group is this: Is what has just transpired, a husband essentially choosing the way his wife is going to feel, morally dubious? Does the fact that his wife conceded the argument make this act acceptable, or is this a case of emotional manipulation, or possibly even emotional abuse?


Empathy—Or Lack Thereof?

In his novel Do Androids Dream of Electric Sheep?, Philip K. Dick invites readers into a post-apocalyptic version of Earth where very few humans remain, most having left to colonize Mars and other planets. On this desolate planet, bounty hunters like Rick Deckard make a living catching escaped androids that have made their way back. To test whether these realistically built androids are human, Rick administers a test called the "Voigt-Kampff" scale, which measures the person's, or android's, reaction to provocative questions and statements, testing for the empathy that only a human is supposedly capable of possessing.

This scale is put to the test when Rick meets with members of the android-developing Rosen Association: Eldon Rosen and his niece, Rachael Rosen. The two try to prove to Rick that his scale is inaccurate by having him use it on Rachael, whose reactions produce unusual readings: "[t]he gauges, however, did not respond. Formally, a correct response. But simulated" (Dick 50). While Rachael's results classify her as an android, the Rosens insist to Rick that she is a human whose unusual circumstances growing up negatively altered her empathetic reactions. Rick, however, tries one last test, and the results give him his desired outcome: "He saw the two dial indicators gyrate frantically. But only after a pause. The reaction had come, but too late" (Dick 59). Rachael is confirmed to be an android and not, in fact, an empathy-lacking human.

This encounter poses the hypothetical question: Would it be morally wrong for someone—a human—from our society today to fail this specific empathy test? While in Dick’s world the Voigt-Kampff tests for androids, would failing this test in our android-free world make someone less than human?

Animals and Empathy

In the world that Dick created, there is a belief system known as "Mercerism." Most people seem to own animals, and those who cannot afford live animals own realistic mechanical replicas instead. However, for Rick and surely many others in this universe, "owning and maintaining a fraud had a way of gradually demoralizing one" (Dick 9). Those who do not own an animal are looked down upon, because "from a social standpoint it had to be done… He had therefore no choice except to continue" (Dick 9). Rick's neighbor, to whom he reveals his mechanical sheep, explains that people "consider it immoral and anti-empathetic" not to care for an animal (Dick 13).

What do owning an animal and empathy have in common? Why is it so looked down upon in this world not to take care of an animal?

Penfield, Good or Bad?


In the first chapter of Do Androids Dream of Electric Sheep? by Philip K. Dick, the reader learns that in the future individuals are able to control their emotions using a device called the Penfield mood organ. Essentially, the Penfield is a machine on which the user "sets" their emotion to a specific degree. For example, if users want to feel joyful, they turn the knob to the mode that will allow them to feel that emotion. "'My schedule for today lists a six-hour self-accusatory depression,' Iran said… It defeated the whole purpose of the mood organ" (Dick 5). In this chapter Iran has set her mood to depression, her reasoning being, "I realized how unhealthy it was, sensing the absence of life, not just in this building but everywhere… they call it 'absence of appropriate affect'" (Dick 5). To her this machine does not seem normal; everyone should have a natural reaction to every situation, not a planned one.

If the device existed in reality, in my opinion the Penfield program would be used for evil instead of good. Instead of helping people with severe depression, there is a possibility that individuals would lose their sense of empathy, which would lead to bad decision-making. If there is a chance the device would be used for bad, should the Penfield be strictly reserved for individuals with severe mental illnesses and no one else? Or should the mood organ be available to everyone, just at a higher price, and if so, does this then dehumanize everyone?

Provocation Assignment

Throughout the semester you will notice “provocation” assignments built into our syllabus. They are often broken into groups and correspond to longer works of fiction. Provocations are meant to provide context and support for your student-led discussions in class. In order to complete these assignments you must:

  • Read the assigned text very closely and annotate it thoroughly.
  • Choose one section of the text you found most interesting/problematic/controversial/stimulating and summarize it in 5-7 sentences.
  • You should use at least one direct quote in this summary with an MLA citation.
  • Construct a complex question for your classmates to answer about that section of the text that will spark a lively debate.

On the blog, you will post your provocations BEFORE class time as indicated on the syllabus. ONLY post when your group is listed on the syllabus. Use the category and tag provided by your professor for each post.

These posts will be graded on a 5 point scale:

  • 5 points = An engaging, thought-provoking post that shows attention to detail and comprehension of the text. Grammar and mechanics must be practically perfect (edit carefully!). Direct evidence from the text with a citation must be included.
  • 4 points = An accurate summary and well-composed question that may contain a small, but not catastrophic, misreading or errors in grammar and mechanics.
  • 3 points = A sloppy post that shows little effort and does not include the elements listed above.
  • 2 points = A post that is a day late, or difficult to read, or phrased in a way that students would be unable to respond.
  • 1 point = A post that is a week late, contains numerous errors, and does not contribute to the conversation.
  • 0 points = The post does not exist.

Please create these summaries and questions yourself: DO NOT STEAL OTHER PEOPLE'S WORK. If I find you have plagiarized these posts, you will be reported. If you are struggling, please come see me or email me with questions.