In his novel Do Androids Dream of Electric Sheep?, Philip K. Dick invites readers into a post-apocalyptic version of Earth where few humans remain, most having left to colonize Mars and other planets. On this desolate planet, bounty hunters like Rick Deckard make a living catching escaped androids that have made their way back to Earth. To test whether these realistically built androids are human, Rick administers a test called the "Voigt-Kampff" scale, which measures the subject's (human or cyborg's) reaction to provocative questions and statements, probing for the empathy that only a human is thought capable of possessing.
This scale is put to the test when Rick meets with members of the android-developing Rosen Association: Eldon Rosen and his niece, Rachael Rosen. The duo tries to prove to Rick that his scale is inaccurate by having him use it on Rachael, whose reactions produce unusual results: "[t]he gauges, however, did not respond. Formally, a correct response. But simulated" (Dick 50). While Rachael's results classify her as an android, the Rosens insist to Rick that she is a human whose unique circumstances growing up negatively altered her empathetic reactions. Rick, however, tries one last test, and the results give him his desired outcome: "He saw the two dial indicators gyrate frantically. But only after a pause. The reaction had come, but too late" (Dick 59). Rachael is confirmed to be an android, not an empathy-lacking human.
This encounter poses a hypothetical question: would it be morally wrong for someone, a human from our society today, to fail this specific empathy test? While in Dick's world the Voigt-Kampff tests for androids, would failing this test in our android-free world make someone less than human?