The future is sci-fi: On Philip K Dick's Do Androids Dream of Electric Sheep, AI and artificial empathy
As we embark on a new decade, how do visions of the 2020s — imagined in books like Do Androids Dream of Electric Sheep?, films like Soylent Green, or even manga like Ghost in the Shell — match up against our reality? In this series, we look at seven pop culture artefacts from the past that foretold the future, providing a prophetic glimpse of the decade we’re now entering.
Just as the dreams of yesterday's science fiction writers became our nightmares of today, the dreams of artificial intelligence today could become our nightmares of tomorrow. If AI begins to look, think, feel or exhibit consciousness like us, it may eventually come to replace us. This fear that we may lose control over our creations is what Isaac Asimov called the “Frankenstein Complex”. Fiction is littered with such instances. In R.U.R., Karel Čapek not only introduces us to the word robot, but posits a rebellion of enslaved androids and the massacre of their human masters. Westworld imagines a similar fate. In The Terminator, a cyborg assassin is sent back in time from 2029 to kill a woman whose unborn son is the key to humanity's future salvation.
If androids become virtually indistinguishable from their organic counterparts, how do we tell them apart? We're way past CAPTCHA here, and Ex Machina exposed the limits of the Turing test: a narrow conversation alone cannot determine whether an android can truly think independently. In Do Androids Dream of Electric Sheep?, Philip K Dick proposed an alternative criterion: empathy.
Set in a near-abandoned San Francisco after World War Terminus (WWT), the novel tells the story of Rick Deckard, a bounty hunter who tracks down runaway androids that have escaped from the off-world colonies where they served as slaves. These “andys”, as they are called, not only resemble humans but have evolved intellectually to the point where intelligence tests are no longer of any use. They have been replaced by the Voight-Kampff test, which sniffs out androids with a series of questions intended to provoke empathetic responses, measured via changes in heart rate, breathing and pupil dilation.
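The mechanics of such a test are easy to caricature in code. As a minimal sketch, and not anything specified in the novel, suppose each question yields measured deltas in heart rate, breathing and pupil dilation; a crude scorer might flag a subject whose averaged involuntary responses fall below a human baseline (all field names, weights and thresholds here are hypothetical):

```python
from dataclasses import dataclass

# Hypothetical readings of involuntary responses to one test question.
# The fields, weights and baseline are illustrative, not from the novel.
@dataclass
class Response:
    heart_rate_delta: float      # change in beats per minute
    breathing_delta: float       # change in breaths per minute
    pupil_dilation_delta: float  # change in pupil diameter, mm

def empathy_score(r: Response) -> float:
    """Collapse the three physiological deltas into a single score."""
    return (0.5 * r.heart_rate_delta
            + 0.3 * r.breathing_delta
            + 0.2 * r.pupil_dilation_delta)

def is_android(responses: list[Response], human_baseline: float = 1.0) -> bool:
    """Flag the subject if the average response is weaker than a human's."""
    avg = sum(empathy_score(r) for r in responses) / len(responses)
    return avg < human_baseline

# A strong, human-like reaction followed by a flat, android-like one.
questions = [Response(4.0, 2.0, 1.2), Response(0.1, 0.0, 0.05)]
print(is_android(questions))  # → False (the average still clears the baseline)
```

The sketch also shows why Dick's test is philosophically fragile: it reduces empathy to a weighted sum of bodily signals, which is exactly the kind of reduction the rest of the novel undermines.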
Dick believes empathy to be the defining human trait: what separates us from the androids and possibly every other living being. But he also seems to suggest our lack of empathy is an equally defining trait. In his novel, the humans left behind on Earth must adhere to a hierarchical social order determined by an IQ test. The high-IQ “regulars” are allowed to procreate and migrate to off-world colonies, whereas the low-IQ “specials” (or “chickenheads”), affected by the radioactive fallout of WWT, possess neither right. Dick then sets up a “special” named John Isidore as Deckard's foil. Despite being treated as subhuman, Isidore displays more empathy than the “regulars” by aiding the “andys” in their escape. In the process, he proves himself more human than Deckard, who effectively seems like an unempathetic android programmed to kill.

So, is it empathy or a lack of it that allows bounty hunters like Deckard to commit genocidal violence against androids demanding the right to self-determination? Is it empathy or a lack of it that allows the dehumanisation of the marginalised like Isidore, the large-scale deportation of immigrant families and children, the refusal of countries to host migrants fleeing wars they orchestrated? Is it empathy or a lack of it at the root of all the crime, warfare and oppression in the world?
Until we can empathise with our fellow humans regardless of their skin colour, race, religion or even opinions, we can't equate empathy with humanity, nor use it to separate artificial intelligence from us. Furthermore, even as artificial intelligence becomes an evolving reality, artificial empathy has not. Empathy cannot be written in a programming language or encoded as information. At least not yet. To evoke empathy or any emotion, AI must first be able to identify the emotion and decode what it means. "We don't understand all that much about emotions to begin with, and we're very far from having computers that really understand that. I think we're even farther away from achieving artificial empathy," said Bill Mark, whose AI project at SRI International evolved into Siri. "Some people cry when they're happy, a lot of people smile when they're frustrated. So, very simplistic approaches, like thinking that if somebody is smiling they're happy, are not going to work." Mark's team has developed software that tackles the identification problem, but not empathy itself, because emotions are interpreted and felt differently by different people. Even if AI learns these unique emotional responses through sensory inputs gained from human interaction, it can only mimic them, not register an original response of its own.
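Mark's caveat, that "if smiling then happy" mappings fail, is easy to make concrete. Here is a deliberately toy rule-based classifier (the rules and labels are hypothetical, invented for illustration) of exactly the kind he warns against; it breaks the moment the same outward signal means different things for different people:

```python
# A deliberately simplistic rule-based emotion classifier of the kind
# Bill Mark warns against. All rules and labels are illustrative.
NAIVE_RULES = {
    "smiling": "happy",
    "crying": "sad",
    "frowning": "angry",
}

def naive_emotion(observed_expression: str) -> str:
    """Map an observed expression straight to an emotion label."""
    return NAIVE_RULES.get(observed_expression, "unknown")

# The mapping misfires as soon as context matters: some people cry when
# they are happy, and many smile when they are frustrated.
print(naive_emotion("crying"))  # → sad (even if the subject is at a wedding)
```

The rule table can only ever echo its inputs back; there is no route from this kind of lookup to the interpretive, person-specific response the article calls empathy.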
But in Dick's vision of the future, human beings themselves are not above mimicry and artifice. They use a device called the "Penfield Mood Organ" to program emotional states and feel emotions of any kind on demand. In a world where empathy has become key to survival, humans resort to a fitting techno-theological solution in a religion called Mercerism, which uses "empathy boxes" to spiritually link a prophet named Wilbur Mercer with his followers. Humanity is united in this shared experience, which makes them feel close to one another and staves off loneliness and alienation. Ironically, in the desire to be more empathetic and thus more human, they are instead becoming more machine-like.
Do Androids Dream of Electric Sheep?, written in 1968, was originally set in 1992; later editions moved the setting to 2021, partly to bring it closer to Blade Runner's and partly because the original date had long since passed. Of course, sentient andys, mood organs and the world Dick imagined will surely not arrive by 2021. Currently, the most advanced humanoid robots can do backflips and parkour, and go on awkward dates with Jimmy Fallon, but we're still far from creating one in our true image. Yet if AI is expected to perform half of all productive functions in the workplace by 2025, we must consider establishing ethical rights for robots, as CNBC suggests. It's not just an exercise in empathy, but a deterrent against a potential AI uprising in the future. There are enough dystopias of our own making already; we don't want to add to them.
Updated Date: Jan 06, 2020 16:29:56 IST