The future according to Yuval Noah Harari: The historian on the 21st century's biggest challenges, and how to face them

Imagine that you're the driver of an out-of-control trolley car. The only mechanism that still works is a lever that switches the trolley from one track to another. Ahead on the tracks, you see five people tied down. You're hurtling right towards them, and running them over seems inevitable, except that another track branches off to the left. Then you notice that on this track too, one person has been tied down. What should you do? Continue and cause the deaths of five people, or switch tracks and kill only one?

"The Trolley Problem" has long been used as a thought experiment in Ethics/Moral Philosophy.

On Sunday, 16 December, noted Israeli historian Yuval Noah Harari used it to explain the problems that lie ahead for makers of self-driving cars.

Harari, the author of Sapiens, Homo Deus and 21 Lessons for the 21st Century, was in Mumbai to deliver the Penguin India Annual Lecture 2018. His talk touched upon 'Challenges for the 21st Century', and by the session's end, it would have been a surprise if many people in the audience were still considering having children.

Not that Harari's talk was filled with doomsday prophecies. Nor did he paint a picture of the future so dystopian that it would rival the portrait of humanity presented by that other pessimistic philosopher, Thomas Hobbes (who described the life of man in its natural state as "solitary, poor, nasty, brutish, and short").

No.

But what he did say was sufficiently alarming.

For instance: "Human beings are the gods on earth, but we're very irresponsible gods... Throughout history, human beings have been manipulating the world around them long before they've understood it."

If that wasn't enough to send you into a funk about the wretchedness of the species we belong to, there were the challenges Harari dwelt on: Nuclear war, climate change, and technology. He stressed that none of these could be solved without global cooperation.

Technology, especially, was going to change the world even more than it already had, Harari warned, and we needed to brace for it.

"The last time we had a big technological revolution — the Industrial Revolution — it resulted in the creation of the working class," he said. But the next technological revolution would create a "non-working class".

"Artificial Intelligence (AI) and robotics will lead to the disappearance of many jobs by 2050, but the bigger issue will be retraining people for new types of jobs. You'd need to learn a lot of new skills and in some cases, change your personality," Harari said, adding, "And this will not be a long-term solution. Because one of the most important things to remember about the automation revolution is, it will not (unfold) as one watershed happening. We will have a bigger revolution by 2025, then by 2035, then 2045. So to stay relevant, people will have to reinvent themselves every few years."

In a scenario like this, governments would need to step in and help people through these difficult transition phases. But even this, Harari said, would not be enough to deal with the psychological effects of the need for constant reinvention. "Will you have the mental resilience? Reinventing yourself at several phases through your life would be simply too difficult for most people. And as life expectancy increases, the age of retirement will be pushed back. People will have to fight against uselessness," Harari stressed.

The biggest struggle for people in the 21st century, he concluded, would be against irrelevance.

Yuval Noah Harari. Image via Facebook/@Prof.Yuval.Noah.Harari

The question that naturally arose was what the education system might do to prepare citizens for the world they would inhabit, or "what should we teach children in school now that will be useful for them over the next 50 years", as Harari framed it. "No one knows what the job market will be like in 30 years, and that's unprecedented in history," he pointed out. "Teachers and children won't know what is needed for anyone to be a fruitful member of society." The answer, then, was to focus on building emotional intelligence and mental resilience.

This technology-driven future would mean changes outside of our work lives too. Religion, for instance, would have to adapt; some faiths would survive, others would go extinct. "Religion has been around for a very long time; it will merely change its face. Religions keep adapting all the time, and then they say, 'We didn't change anything!' They say they're retaining the original purity, but it's a way of repackaging change," Harari said.

The nature of governments too would change. Harari posited the rise of digital dictatorships: "New technologies might tempt governments across the world to monitor and control people all the time. New totalitarian regimes will have technologies that weren't available to a Hitler or a Mao; they'll have the ability to 'hack' human beings."

'Hacking' a human being required a good understanding of biology (especially brain science), a lot of data (especially personal data), and a lot of computing power. "These things weren't around in the times of Hitler or Stalin, so even if the Gestapo or the KGB followed you around 24/7, they wouldn't be able to tell what you were thinking or feeling, or predict what you would do," Harari said, adding that these were precisely the resources that the new generation of leaders would have access to.

As for the leaders themselves, Harari said citizens needed to ask politicians running for office how they intended to deal with the spectre of nuclear war, climate change and technological disruptions. "The problem (now) is no one can come up with a meaningful vision for the future. Populist regimes with their fantasies can sustain themselves for a few years. But unless we can find some meaningful vision for the future, we're headed for disaster. If a politician has nothing meaningful to say about the future, don't vote for that politician," he asserted.

Those espousing isolationist politics, too, had no place in Harari's conception of the future.

"Leaders have become far more reckless in their behaviour and far more isolationist," he said. "This is exceedingly irresponsible because in the 21st century, the idea of an independent nation is a fantasy."

"Whenever a leader says 'My country first' or 'Only my country is important' we need to ask them, how their country will by themselves deal with these three issues," Harari said. "Nationalism does have an important role in the 21st century, but to be a good nationalist — which means protecting your country's interests, not hating foreigners — you must also be a globalist. So if you take away from this talk a single lesson, it is that our global problems demand global solutions."

Lest he give the audience the impression that all was bleak, Harari clarified, "(What I've outlined in my talk) is neither inevitable nor a prophecy; these are possibilities. Technology is not deterministic. It can be used for a variety of purposes. Technology gives us the choice, but ultimately its use is in the hands of humans. We can use AI and biotechnology to create different types of society."

The need of the hour was ethical training for software engineers. "They need it even more than lawyers and judges. They shape the world. For instance, more and more today, the question of discrimination turns out to be a question of design. We need to look at how software is designed. We need to ensure that algorithms do not discriminate against people," Harari said.

It was at this point that Harari brought up his version of the Trolley Problem.

"I've focused on the dangers of these technologies, but they have huge advantages. The vast majority of traffic accidents for instance, are caused by human error. If we replace humans at the wheel with driverless cars, we could save a million people a year," he said.

Makers of driverless cars, though, would have to think about ethics.

Harari asked the audience to imagine a driverless car going down a road, its owner asleep in the back seat, when two children suddenly darted out in front of it. Should the car continue straight ahead (and kill the children), or veer away from them into the path of an oncoming truck, a collision that was sure to cause the death of the owner?

"Computers don't have a subconscious. They do whatever you programme them to do," he pointed out. So you could have the best philosophers in the world in a room and give them a year to decide [what the driverless car should do], and that's what the car would be programmed to do.

If the decision were left to the makers, Harari said, they'd come up with two models of cars, "Tesla Altruist and Tesla Egoist", and leave it up to the buyer to choose, because the "customer is always right".
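Harari's point, that the car "does whatever you programme it to do", can be made concrete with a small sketch. Everything below is hypothetical and purely illustrative, invented for this piece rather than drawn from Harari's lecture or any manufacturer's actual software; it only shows how an 'altruist' or 'egoist' ethic would end up as an explicit, pre-programmed rule rather than a split-second human reflex.

```python
# Illustrative sketch only: hypothetical names and numbers, not real
# self-driving software. The ethical choice is a literal line of code.
from dataclasses import dataclass
from enum import Enum

class Policy(Enum):
    ALTRUIST = "altruist"  # hypothetical: minimise total deaths, even the owner's
    EGOIST = "egoist"      # hypothetical: protect the owner at all costs

@dataclass
class Dilemma:
    deaths_if_straight: int    # e.g. the two children ahead
    deaths_if_swerve: int      # e.g. the owner, in a collision with the truck
    owner_dies_if_swerve: bool

def decide(policy: Policy, d: Dilemma) -> str:
    """Return 'straight' or 'swerve' according to the pre-programmed policy."""
    if policy is Policy.EGOIST and d.owner_dies_if_swerve:
        return "straight"  # the owner's safety overrides everything else
    # Otherwise: minimise the total number of deaths.
    return "swerve" if d.deaths_if_swerve < d.deaths_if_straight else "straight"

# Harari's scenario: two children ahead, a fatal truck collision if the car swerves.
scenario = Dilemma(deaths_if_straight=2, deaths_if_swerve=1, owner_dies_if_swerve=True)
print(decide(Policy.ALTRUIST, scenario))  # swerve
print(decide(Policy.EGOIST, scenario))    # straight
```

Whatever the philosophers (or the marketing department) settle on, the outcome is the same in kind: a fixed rule, chosen in advance, that the machine will apply without hesitation.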

"These are questions that philosophers have been debating thousands and thousands of years, but they had very little practical import. These questions of ethics are now related to practical questions of engineering," Harari said. "We've always needed philosophers and spiritual guides, but we need them now more than ever before because we're more powerful now than ever before."

(The matter of Yuval Noah Harari's lecture has been condensed and rearranged in this piece.)

 
