Modern science didn’t appear until the 17th century. What took so long?

Nobel Prize-winning physicist Richard Feynman once recalled a friend, an artist, who would say that he could properly appreciate the beauty of a flower, while a scientist like Feynman always insisted on taking the flower apart and making it dull. Of course, Feynman disagreed. “I can imagine the cells inside, which also have a beauty,” Feynman wrote, calling his friend’s prejudice “nutty.” “There are all kinds of interesting questions that come from a knowledge of science, which only adds to the excitement and mystery and awe of a flower.”

I thought of Feynman’s good-natured defence while reading “The Knowledge Machine,” a provocative and fascinating book by philosopher Michael Strevens that mostly enthralled me, even as a couple of parts set my teeth on edge. But that’s just the nature of opinion and disputation, something that Strevens would surely understand, given his argument that opinion and disputation play an essential role in the scientific world.

While human civilization has existed for millenniums, modern science has only been around for a few hundred years. Image credit: Pixabay

While modern science is built on the primacy of empirical data — appealing to the objectivity of facts — actual progress requires determined partisans to move it along.

Science has produced some extraordinary elements of modern life that we take for granted: imaging devices that can peer inside the body without so much as a cut; planes that hurtle through the air at hundreds of miles an hour. But human civilization has existed for millenniums, and modern science — as distinct from ancient and medieval science, or so-called natural philosophy — has only been around for a few hundred years.

What took so long? “Why wasn’t it the ancient Babylonians putting zero-gravity observatories into orbit around the Earth,” Strevens asks, “the ancient Greeks engineering flu vaccines and transplanting hearts?”

The Scientific Revolution of the 17th century yielded the figure of the modern scientist, single-mindedly dedicated to collecting empirical evidence and testing hypotheses against it. Strevens, who studied mathematics and computer science before turning to philosophy, says that transforming ordinary thinking humans into modern scientists entails “a morally and intellectually violent process.” So much scientific research takes place under conditions of “intellectual confinement” — painstaking, often tedious work that requires attention to minute details, accounting for fractions of an inch and slivers of a degree.

Strevens gives the example of a biologist couple who spent every summer on the Galápagos beginning in 1973, measuring finches; it took four decades of data before they could conclude that they had observed a new species of finch.

This kind of obsessiveness has made modern science enormously productive, but Strevens says there is something fundamentally irrational and even “inhuman” about it. He points out that focusing so narrowly, for so long, on tedious work that may not come to anything is inherently unappealing for most people. Rich and learned cultures across the world pursued all kinds of erudition and scholarly traditions but didn’t develop this “knowledge machine” until relatively recently, Strevens says, for precisely that reason.

The same goes for brilliant, intellectually curious individuals like Aristotle, who generated his own theory about physics but never proposed anything like the scientific method.

According to “The Knowledge Machine,” it took a cataclysm to disrupt the long-standing way of looking at the world in terms of an integrated whole. The Thirty Years’ War in Europe — which started over religion and ended, after killing millions, with a system of nation-states — made compartmentalization look good. Religious identity would be private; political identity would be public. Not that this partition was complete in the 17th century, but Strevens says it opened up the previously unfathomable possibility of sequestering science.

The timing also happened to coincide with the life of Isaac Newton, who became known for his groundbreaking work in mathematics and physics. Even though Newton was an ardent alchemist with a side interest in biblical prophecy, he supported his scientific findings with empirical inquiry; he was, Strevens argues, “a natural intellectual compartmentalizer” who arrived at a fortuitous time.

So modern science began, accruing its enormous power through what Strevens calls “the iron rule of explanation,” requiring scientists to settle arguments by empirical testing, imposing on them a common language “regardless of their intellectual predilections, cultural biases or narrow ambitions.” Individual scientists can believe whatever they want to believe, and their individual modes of reasoning can be creative and even wild, but in order to communicate with one another, in scientific journals, they have to abide by this rule. The motto of England’s Royal Society, founded in 1660, is “Nullius in verba”: “Take nobody’s word for it.”

Strevens’ book contains a number of surprises, including an elegant section on quantum mechanics that coolly demonstrates why it’s such an effective theory, deployed in computer chips and medical imaging, even if physicists who have made ample use of it (like Feynman) have said that nobody, themselves included, truly understands it.

Strevens also has some pretty uncharitable things to say about the majority of working scientists, painting them as mostly uncreative drones purged of all nonscientific curiosity by a “program of moralizing and miseducation.” The great scientists were exceptions because they escaped the “deadening effects” of this inculcation; the rest are just “the standard product of this system”: “an empiricist all the way down.”

He may well be right, but from a book about the history of science, I wanted more proof. Then again, “The Knowledge Machine” is ultimately a work of philosophy and should be considered an ambitious thought experiment. Strevens builds on the work of philosophers like Karl Popper and Thomas Kuhn to come up with his own original hypothesis about the advent of modern science and its formidable consequences. The machine in Strevens’ title has scientists pursuing their work relentlessly while also abiding by certain rules of the game, allowing even the most vehement partisans to talk with one another.

And Strevens doesn’t even leave it at that. Climate change, pandemics — he comes up to the present day, ending on a grim but resolute note, hopeful that scientists will adapt and find a better way to communicate with a suspicious public. “We’ve pampered and praised the knowledge machine, given it the autonomy it has needed to grow,” he writes. “Now we desperately need its advice.”

Jennifer Szalai c.2020 The New York Times Company
