darwin Archives - The Freethinker
The magazine of freethought, open enquiry and irreverence

Linnaeus, Buffon, and the battle for biology

Review of Every Living Thing: The Great and Deadly Race to Know All Life* by Jason Roberts, Riverrun, 2024.

‘God Himself guided him’, it was said of the famous Swedish taxonomist, Carl Linnaeus (1707-1778). ‘God has given him the greatest insight into natural history, greater than anyone else has enjoyed. God has been with him, wherever he has gone…and made him a great name, as great as those of the greatest men on earth.’ ‘Nobody has been a greater biologist or zoologist’, gushed a contemporary admirer. And when Linnaeus wrote a medical treatise, a reviewer observed that ‘We may justifiably assert that no one who has studied medicine, pharmacy or surgery can do without it; indeed that it cannot be but of use and pleasure to the most learned medical men.’

Linnaeus himself was the anonymous author of these and many other plaudits. He was, simply, an appalling person. Confident that he would never be contradicted, he embellished his field notes, making his journeying sound far more epic than it was. Scrambling up the greasy pole of academic preferment, he lied about his academic collaborations and surrounded himself with sycophants. He lent apparent scientific credibility to racism, declaring that there were different races of Homo sapiens, with fixed attributes. Homo sapiens europaeus, he announced, was inherently ‘governed by laws’, unlike the African subspecies, Homo sapiens afer, which was ‘governed by whim’ and was ‘sly, slow [and] careless’. He was a chauvinist at home, believing his own daughters unworthy of education, and an unabashed nepotist who arranged for his 22-year-old son, who had no degree and no love of botany, to be appointed adjunct professor of botany (on his father’s death, he became a full professor).

He was also wrong: emphatically, repercussively, corrosively wrong about the natural world.

There were, he thought towards the end of his working life, 40,000 species: 20,000 vegetables, 3,000 worms, 12,000 insects, 200 amphibious animals, 2,600 fish, and 200 quadrupeds. It is now estimated that there might be up to a trillion species, and even lower estimates have the number in the millions or hundreds of millions.

For Linnaeus, species were fixed and had been since the time of the Biblical creation. It was sacrilegious to think otherwise. ‘We reckon the number of species as the number of different forms that were created in the beginning. . . . That new species can come to exist in vegetables is disproved by continued generations, propagation, daily observations, and the cotyledons. . .’

‘Odious, dishonest, bigoted, and mistaken.’ Portrait of Carl Linnaeus by Alexander Roslin, 1775.

Odious, dishonest, bigoted, and mistaken. Yet Linnaeus is the only taxonomist of whom most have heard. His method of denoting species, by reference first to the genus (such as Homo), is ubiquitous. And so is his assumption that nature can and should be corralled into synthetic conceptual structures.
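
For readers who want to see the bare bones of that convention, here is a minimal, purely illustrative sketch in Python of the genus-first binomial scheme sitting inside the higher Linnaean ranks. The class, field names, and example values are illustrative assumptions chosen for demonstration, not anything drawn from Roberts’ book.

# Purely illustrative sketch: a toy model of the Linnaean convention of naming
# a species by its genus plus a specific epithet, nested inside higher ranks.
# Ranks and example values are standard textbook ones, not taken from the book.
from dataclasses import dataclass


@dataclass
class Taxon:
    kingdom: str
    taxon_class: str  # 'class' is a reserved word in Python, hence the rename
    order: str
    genus: str
    epithet: str      # the specific (species) epithet

    def binomial(self) -> str:
        # Genus first, capitalised; specific epithet second, in lower case.
        return f"{self.genus.capitalize()} {self.epithet.lower()}"

    def lineage(self) -> str:
        # Most inclusive rank first, narrowing down to the binomial.
        return " > ".join([self.kingdom, self.taxon_class, self.order,
                           self.genus, self.binomial()])


human = Taxon(kingdom="Animalia", taxon_class="Mammalia",
              order="Primates", genus="Homo", epithet="sapiens")

print(human.binomial())  # Homo sapiens
print(human.lineage())   # Animalia > Mammalia > Primates > Homo > Homo sapiens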

He had an exact contemporary, an almost forgotten Frenchman, Georges-Louis Leclerc, Comte de Buffon (1707-1788), who shared Linnaeus’ project of producing a comprehensive account of life on earth but shared few of Linnaeus’ vanities, moral failings, or biological errors.

Buffon was contemptuous of self-seekers, calling contemporary glory a ‘vain and deceitful phantom’. He campaigned vigorously against dividing humans into races, let alone ascribing pejorative attributes to each race, and his closest friendship was with a woman he regarded as his muse and his intellectual superior. He was suspicious of systems of classification, acknowledging their usefulness, but realising that nature was in a state of constant change and that no artificial system could do justice to the dazzling complexity of the real wild world. He was robustly critical of Linnaeus. While all systematic approaches to nature were flawed, he wrote, ‘Linnaeus’ method is of all the least sensible and the most monstrous.’ ‘We think that we know more because we have increased the number of symbolic expressions and learned phrases. We pay hardly any attention to the fact that these skills are only the scaffolding of science, and not science itself.’

Since nature was mysterious, liquid, and vast (there were, he suspected, many more than 40,000 species) it could only properly be approached, thought Buffon, with humility and uncertainty. Human presumption must be shed at the laboratory door. He was a true Enlightenment sceptic; no questions were out of bounds. The Enlightenment did not stay that way for long, but while it did, it was glorious, and Buffon adorned it.

Jason Roberts’ scintillating account of these two lives is an overdue and important attempt to disinter Buffon from the obscurity in which he has long languished. He is not the first writer to try (the most systematic effort is Jacques Roger’s 1997 Buffon: A Life in Natural History), but he is far and away the most successful. Roberts is a sprightly storyteller who wears his considerable learning lightly. The result is a compellingly readable piece of intellectual history; a salutary account of enmeshed personality and ideas, and so of the way that science itself works.

Lionized Linnaeus is the archetype of many modern biologists. It started for him, as for them, with a childhood love of the natural world which soon curdled into ambition—an ambition not to understand, but to force the facts into a set of self-created and self-satisfied categories. It is a very modern story: rigour becomes fundamentalism; a search for the truth becomes a quest for new ways to affirm old orthodoxies; journeying becomes colonialism. ‘Objects are distinguished and known by classifying them methodically and giving them appropriate names’, wrote Linnaeus. ‘Therefore, classification and name-giving will be the foundation of our science.’ Self-reference and self-affirmation, in other words, are what science is all about. 

And the forgotten Buffon is the antitype of many modern biologists. His boyhood wonder never left him; never became sclerosed into a reverence for categories rather than real plants and animals. ‘The true and only science is the knowledge of facts’, he said. Theory, however elegant and revered, must always give way to reality.

Linnaeus was the fifth generation in a line of preachers and was expected to occupy the hereditary pulpit, but at the age of four, hearing his father declaim the incantatory Latin names of plants, he had an epiphany. It made him a botanical obsessive and set biology’s course for the next two hundred years.

For nine miserable years, he was marinated in Latin and Greek at school. He was known by teachers and pupils as the ‘Little Botanist’ (he was never more than five feet tall). He failed to make the grade for the Christian ministry. His teachers told his outraged father that Linnaeus should become a tailor or a shoemaker instead. It is perhaps unfortunate for science that he did not.

His father sought advice. What could be done with his hopeless son? A friend of the family suggested medicine and offered to coach Linnaeus for entry to medical school. It wasn’t as prestigious as preaching, but it was better than shoemaking. Linnaeus duly went (via a false start in Lund) to medical school in Uppsala.

There, in 1729, in a garden, there was a fateful meeting. Professor Olof Celsius, who was trying to write a book about the plants mentioned in the Bible, saw a tiny, ragged student drawing a flower badly. Celsius asked him what the flower was, and the student, who was Linnaeus, responded using a ludicrously difficult term forged by the French botanist Joseph Pitton de Tournefort. It showed that though Linnaeus could not draw, he had spent many hours learning Tournefort’s 698 categories. It was a formidable achievement. Celsius impulsively took Linnaeus under his wing, fostering Linnaeus’ passion for plants and recruiting him to work on the book about Biblical plants.

Celsius proved a loyal and powerful mentor. It was largely due to him that Linnaeus, in 1730, as a second-year student who had never taken a single class in botany, became de facto professor of botany. This astonishing appointment cemented Linnaeus’ confidence in his rapidly gestating system of plant classification, based on the characteristics of plant reproductive organs. It ignited Linnaeus’ belief in himself as a botanical messiah, and he began to see any challenge to him as blasphemous. When he was deposed from his post by a rival, Nils Rosen, and relegated to student status, the furious Linnaeus vowed to kill Rosen—and even carried a sword with him for the purpose.

Dispossessed, Linnaeus went on a collecting expedition to Lapland, collecting not only plants but self-glorifying yarns. ‘A divine could never describe a place of future punishment more horrible than this country, nor could the Styx of the poets exceed it. I may therefore boast of having visited the Stygian territories’, he wrote. Like his four autobiographies, much about Linnaeus was bogus or misconceived. The famous conical Lapp hat in which he is so often pictured was one worn by women. And he was far from the careful scientist of his self-portrait. He was, observes Roberts, temperamentally unsuited for field research: his methodology ‘swung wildly between minutiae and the cursory’. But his energy, though erratic, was real. He collected manically and worked on a scheme for classifying every species. For Linnaeus, writes Roberts, ‘[t]he Maker had long since put away his tools and closed up His workshop’, and of course there had to have been room in Noah’s Ark for all the species. The only problem in identifying the small and limited number of species was their geographical dispersion. Linnaeus was confident that he was up to the job.

He returned to Uppsala, and though his status there was better than it had been, his account of the expedition and the outline of his system of classification failed to impress the scientific establishment.

He stalked peevishly out of Uppsala and became a travelling biological entertainer, dressed in his Laplander costume, beating a shaman’s drum, telling his tall tales of swashbuckling travel, showing his collections of insects, and holding forth on his fast-gestating system of classification. He was plausible, at least in Germany, where the Hamburgische Berichte trumpeted that ‘All that this skilful man thinks and writes is methodical. . . . His diligence, patience and industriousness are extraordinary’. Linnaeus agreed, for he had written the lines himself.

He was shown the Hydra of Hamburg, one of the world’s most valuable zoological curiosities—a seven-headed, sharp-clawed monster which, eighty-seven years before, had mysteriously appeared on a church altar. Its authenticity was unquestioned until Linnaeus, after inspecting it, started to laugh. ‘O Great God’, he said, ‘who never set more than one clear thought in a body which Thou has shaped.’ It was a conclusion, like all his conclusions, driven by theology, and, like many of his conclusions, wrong. What would he have made of the fact that the cognition of cephalopods is partly outsourced to their semi-autonomous tentacles, or of the notion that all organisms are complex ecosystems—humans, for instance, being vats of bacteria, fungi, and viruses, all of which contribute crucially to the entities we call ourselves?

His wanderings took him to the Netherlands, where he was examined in medicine and finally obtained a medical degree. In 1735 he published his Systema Naturae, which gave to the five-fold hierarchy of Kingdoms, Classes, Orders, Genera, and Species the meanings used today. He hoped it would be tectonic. It was barely noticed.

Linnaeus returned to Stockholm. To earn a living, he stalked the coffee shops, looking for signs of syphilis and gonorrhoea in the customers, and offering to treat them. It made his fortune and established him in medicine, but he continued to work on his plants, duly became professor of botany, produced a second edition of the Systema and the Philosophia Botanica, a codification of his core tenets, and sent apostles across the world to continue, and hopefully to complete, his project of identifying all the species. The recognition he craved came in part with a Swedish knighthood in 1761. It was followed by a long and bitter decline. He suffered from autoscopy, characterised by visual hallucinations and the conviction that he shared his life with a second version of himself. He forgot his own name. By 1776 he was silent. He died in 1778.

Linnaeus’ great competitor, Buffon, had an undistinguished Burgundian childhood. Enriched by a legacy, he became a carouser and dueller at the University of Angers, finally fleeing the city after wounding an Englishman in a duel. This episode changed him. Reflecting in Dijon, he realised that he did not want the life of the idle, comfortable estate manager the fates seemed to have in store for him. But how could he escape? Help was at hand in the form of the young Duke of Kingston and his travelling companion, Nathan Hickman, a precocious naturalist. Buffon travelled with them in France and Italy for a year and a half. He would never be the same again. He read a treatise on Newton’s calculus, became obsessed with the man, and realised that he, Buffon, had himself worked out one of Newton’s theorems. The discovery transformed him—making him reassess his own ability—and shaped the course of his life.

It was an exciting time to be a thinker. Spinoza, Newton, and Leibniz, who did not slave in tiny impermeable siloes like modern academics, saw the business of science as discovering the truth about the world. The truth was such a majestic and elusive thing that the search had to engage every discipline, invent new disciplines, straddle and confound old categories, and mercilessly discard cherished but superannuated models.

Buffon, infected with this excitement, began to distance himself from his companions and returned to his birthplace, the village of Montbard in Burgundy. There, in the Parc Buffon, he began his life’s work: to understand life. This involved—but unlike Linnaeus’ conception, did not completely consist in—the classification of biological life.

A calling so high, in a temperamental hedonist, demanded a strenuously structured and rather ascetic life. Buffon’s valet woke him at 5 a.m. and was instructed to get him up however reluctant he was, even if, as was once necessary, he had to be doused in ice-cold water. Inward order meant outward order, and so Buffon dressed formally each day: for his work, not his public. After a hairdresser had curled and powdered his hair, Buffon walked to the park and to one of two cells devoted to a type of biological monasticism—each containing only a writing table, a fireplace, and a portrait of his idol, Isaac Newton. He worked from no texts or notes, just his own memory and his immediate thoughts, and took regular walks in the park to clear his head. At nine there was breakfast—a roll and two glasses of wine—and then it was back to work until lunch at two, followed by a nap, a lone walk in the garden, and a return to the cell until exactly seven. He handed his day’s writing to a secretary, who made a fair copy, grafted it into whatever manuscript was on the go, and burned the original pages. Guests typically arrived at seven for wine and conversation, but there was no supper for Buffon, who was in bed promptly at nine.  He kept up this routine for half a century.

1753 portrait of the Comte de Buffon by François-Hubert Drouais.

‘It is necessary to look at one’s subject for a long time’, he wrote. ‘Then little by little it just opens out and develops. . .’ And it did. In 1749 the first three volumes of Buffon’s Histoire Naturelle were published, containing a staggering 417,600 words and written in contemporary French with unusual simplicity and clarity. It was a runaway bestseller and sold out in six weeks. Buffon spent the rest of his life enlarging and refining it. At his death, there were thirty-five volumes—three introductory ones on general subjects, twelve on mammals, nine on birds, five on minerals, and six supplemental volumes on miscellaneous subjects. The book was no mere catalogue. It contained not only detailed anatomical descriptions but also accounts of ecological context and behaviour.

Any book sufficiently ambitious to be worth writing or reading will necessarily be a failure, and in many ways this was. Buffon had hoped to deal with ‘the whole extent of Nature, and the entire Kingdom of Creation’, but despite his gargantuan efforts and flagellant self-discipline he discovered, as do all mortals, that nature defeated him: the book did not deal properly with plants, amphibians, fish, molluscs, or insects.

Yet when he bowed out of life in 1788, his life seemed to many to have been a triumphant success. 20,000 mourners lined the Paris streets. The Gentleman’s Magazine in London described him as one of the ‘four bright lamps’ of France, alongside Montesquieu, Rousseau, and Voltaire.

Linnaeus, who had preceded him into the grave ten years earlier, had a quiet funeral. Most of the few who attended were university colleagues.

Linnaeus and Buffon had competed for decades. It looked as if Buffon had decisively won. But history is capricious. Within five years of his death, Buffon was reviled as a reactionary and an enemy of progress. A raucous, torch-bearing crowd tipped his corpse from the coffin and clamoured to install a plaster bust of Linnaeus in the royal garden Buffon had managed.

Linnaeus’ rigid categories are wholly antithetical both to Darwin’s notions of the fluidity of species and to ecological understandings of the nature of nature. Buffon had written that ‘it is possible to descend by almost imperceptible gradations from the most perfect of creatures to the most formless matter.’ It sounded presciently Darwinian. It was. When Darwin discovered Buffon, he wrote to Huxley: ‘I have read Buffon—whole pages are laughably like mine. It is surprising how candid it makes one to see one’s view in another man’s words. . .’ ‘To Linnaeus’ mind’, writes Roberts, ‘nature was a noun. . . . To Buffon, nature was a verb, a swirl of constant change.’ Buffon prefigured Darwin and understood the interconnectedness of things. Linnaeus would have denounced Darwin as a heretic and seen claims of ecological entanglement as an affront to the tidy architecture of the Creator.   

Yet Linnaeus is revered and Buffon forgotten. This is very strange. Why is it so?  

Roberts speculates intelligently and plausibly. As he says, the French Revolution is undoubtedly part of the story. Buffon, confident in Paris salons and the Versailles court, was never going to be a darling of the revolutionaries—though his politics were far more egalitarian than Linnaeus’ and his relative secularism should have been more palatable than Linnaeus’ religiosity. Linnaeus’ rigid scheme of classification played well in Great Britain, devoted to its class hierarchies. Imposing an artificial regime onto the world complemented and complimented colonial notions of conquest and control.

Roberts’ explanations, though elegant and ingenious, are insufficient. An anomaly so striking cries out for a more fundamental justification. This can be found, I suggest, in the work of Iain McGilchrist, who in his two gigantic books The Master and His Emissary: The Divided Brain and the Making of the Western World (2009) and The Matter with Things: Our Brains, our Delusions, and the Unmaking of the World (2021) views the history of the last few thousand years through the lens of the functional asymmetry of the cerebral hemispheres.

To survive and thrive (his thesis goes), we need two wholly different types of attention. One is a narrow, focused attention, contributed to humans by the left hemisphere. The other (which is supposed to be in overall charge) is a wider, more holistic type of attention, based in the right hemisphere. Paying attention properly to the world demands a dialogue between these hemispheres.

The left hemisphere is meant to be the executive, acting on the orders of the right. It is the primary locus of language (which is dangerous, because it can advocate its own views), and in right-handed humans governs the right hand, which seizes and manipulates.

The left hemisphere deals in polarities. It loves black-and-white judgments. It builds and curates pigeon-holes and gets petulant at any suggestion that there is anything inadequate about its filing system. It is highly conservative and hates change.

The right hemisphere knows that nothing can be described except in terms of the nexus of relationships in which it exists, that opposites are often complementary, and that meaning is generally to be read between the lines. It does not confuse the process of understanding with the process of assembling a complete set of data, and it sees that knowledge and wisdom are very different.  

McGilchrist suggests that much of our intellectual, social, and political malaise results from the arrogation by the left hemisphere of the captaincy of the right. The nerdish secretary makes declarations about the web and weave of the cosmos and drafts policy—yet it is dismally unqualified to do so.

This is a perfect explanation for the posthumous fates of Linnaeus and Buffon. Buffon’s work represents a respectful conversation between the hemispheres. He grabbed facts in those long days of intense left hemispherical focus, and the facts were duly passed to the right hemisphere which placed them into a holistic vision of the whole natural world—a world of relationality and flux.

Linnaeus seems never to have moved out of his left hemisphere. He was, and his successors are, pathologically attached to their categories. For him, to categorise was to understand. The names spawned in the left hemisphere were the truth.

The left hemisphere’s conservatism is shown by the desperate and doomed efforts to reconcile Linnean taxonomy with biological realities. Linnaeus’ five taxonomic categories were expanded to twenty-one, and the enlarged scheme is audibly creaking. Viruses, for instance, simply don’t fit. If a model needs to be revised so radically, isn’t it time to trash it and start from the beginning? Darwin showed that the notion of immutable species is nonsense, yet taxonomists still cling to it pathetically.

This 1942 book delineated the modern synthesis of evolution, often referred to as ‘neo-Darwinism’.

There is another important twist in the hemispherical story within modern biology. You couldn’t make it up. Neo-Darwinism itself, plainly at odds with traditional taxonomy, has not dealt a death blow to taxonomy. Why? Surely because left hemispheres stick together in diabolical and incoherent solidarity against the right. Neo-Darwinism has become a new, non-negotiable category. A model that is all about fluidity has become itself a mandate for stasis. All biological observations (unless you’re in a taxonomy department) have to be squeezed into it, however uncomfortably. Neo-Darwinian orthodoxy has become as canonical as the canons of taxonomy. Biological science, far from being (as the Enlightenment anticipated it would be) a workshop in which paradigms are gleefully smashed, has become a temple in which paradigms are uncritically worshipped. 

There’s a battle on for biology, a battle raging in the laboratories and lecture rooms of the world: a battle that is really between the left and right hemispheres of the world. It’s a battle for reality against dogma; for freedom against colonialism; for the untameable, mysterious, tangled wild against human vanity and self-reverential theory. It is a battle exemplified well by the epic contest between Linnaeus and Buffon.


*Note that, when you use this link to purchase the book, we earn from qualifying purchases as an Amazon Associate.

Related reading

The Highbrow Caveman: Why ‘high’ culture is atavistic, by Charles Foster

‘An animal is a description of ancient worlds’: interview with Richard Dawkins, by Emma Park

Consciousness, free will and meaning in a Darwinian universe: interview with Daniel C. Dennett, by Daniel James Sharp

‘We are at a threshold right now’: Lawrence Krauss on science, atheism, religion, and the crisis of ‘wokeism’ in science, by Daniel James Sharp

The need for a new Enlightenment

Christopher Hitchens on the need for a new Enlightenment.

Editorial introduction

Below is reproduced, with permission from the Estate of Christopher Hitchens (to whom I express my gratitude), the final chapter of Hitchens’s classic freethinking text god Is Not Great: How Religion Poisons Everything.*

Today, as much as when that book was published in 2007, there is a need for a new Enlightenment. Two of this chapter’s themes—the danger and instability of Iranian theocracy and the threat posed to free speech by Islamic fanatics—remain very obviously and very unfortunately relevant. But the real power of the below, I think, is to be found in these words: ‘[I]t is better and healthier for the mind to “choose” the path of skepticism and inquiry in any case, because only by continual exercise of these faculties can we hope to achieve anything.’ Yes, we remain stuck in prehistory, all right. But if anything can help us to transcend our primitivism, it is the work of Christopher Hitchens. And now from his company I shall delay you no longer.

~ Daniel James Sharp, Editor of the Freethinker


“The true value of a man is not determined by his possession, supposed or real, of Truth, but rather by his sincere exertion to get to the Truth. It is not possession of the Truth, but rather the pursuit of Truth by which he extends his powers and in which his ever-growing perfectibility is to be found. Possession makes one passive, indolent, and proud. If God were to hold all Truth concealed in his right hand, and in his left only the steady and diligent drive for Truth, albeit with the proviso that I would always and forever err in the process, and to offer me the choice, I would with all humility take the left hand.” – GOTTHOLD LESSING, ANTI-GOEZE (1778)

“The Messiah Is Not Coming—and He’s Not Even Going to Call!” – ISRAELI HIT TUNE IN 2001

The great Lessing put it very mildly in the course of his exchange of polemics with the fundamentalist preacher Goeze. And his becoming modesty made it seem as if he had, or could have, a choice in the matter. In point of fact, we do not have the option of “choosing” absolute truth, or faith. We only have the right to say, of those who do claim to know the truth of revelation, that they are deceiving themselves and attempting to deceive—or to intimidate—others. Of course, it is better and healthier for the mind to “choose” the path of skepticism and inquiry in any case, because only by continual exercise of these faculties can we hope to achieve anything. Whereas religions, wittily defined by Simon Blackburn in his study of Plato’s Republic, are merely “fossilized philosophies,” or philosophy with the questions left out. To “choose” dogma and faith over doubt and experiment is to throw out the ripening vintage and to reach greedily for the Kool-Aid.

Thomas Aquinas once wrote a document on the Trinity and, modestly regarding it as one of his more finely polished efforts, laid it on the altar at Notre Dame so that god himself could scrutinize the work and perhaps favor “the Angelic doctor” with an opinion. (Aquinas here committed the same mistake as those who made nuns in convents cover their baths with canvas during ablutions: it was felt that god’s gaze would be deflected from the undraped female forms by such a modest device, but forgotten that he could supposedly “see” anything, anywhere, at any time by virtue of his omniscience and omnipresence, and further forgotten that he could undoubtedly “see” through the walls and ceilings of the nunnery before being baffled by the canvas shield. One supposes that the nuns were actually being prevented from peering at their own bodies, or rather at one another’s.)

However that may be, Aquinas later found that god indeed had given his treatise a good review—he being the only author ever to have claimed this distinction—and was discovered by awed monks and novices to be blissfully levitating around the interior of the cathedral. Rest assured that we have eyewitnesses for this event.

On a certain day in the spring of 2006, President Ahmadinejad of Iran, accompanied by his cabinet, made a procession to the site of a well between the capital city of Tehran and the holy city of Qum. This is said to be the cistern where the Twelfth or “occulted” or “hidden” Imam took refuge in the year 873, at the age of five, never to be seen again until his long-awaited and beseeched reappearance will astonish and redeem the world. On arrival, Ahmadinejad took a scroll of paper and thrust it down the aperture, so as to update the occulted one on Iran’s progress in thermonuclear fission and the enrichment of uranium. One might have thought that the imam could keep abreast of these developments wherever he was, but it had in some way to be the well that acted as his dead-letter box. One might add that President Ahmadinejad had recently returned from the United Nations, where he had given a speech that was much covered on both radio and television as well as viewed by a large “live” audience. On his return to Iran, however, he told his supporters that he had been suffused with a clear green light—green being the preferred color of Islam—all throughout his remarks, and that the emanations of this divine light had kept everybody in the General Assembly quite silent and still. Private to him as this phenomenon was—it appears to have been felt by him alone—he took it as a further sign of the imminent return of the Twelfth Imam, not to say a further endorsement of his ambition to see the Islamic Republic of Iran, sunk as it was in beggary and repression and stagnation and corruption, as nonetheless a nuclear power. But like Aquinas, he did not trust the Twelfth or “hidden” Imam to be able to scan a document unless it was put, as it were, right in front of him.

Having often watched Shia ceremonies and processions, I was not surprised to learn that they are partly borrowed, in their form and liturgy, from Catholicism. Twelve imams, one of them now “in occultation” and awaiting reappearance or reawakening. A frenzied cult of martyrdom, especially over the agonizing death of Hussein, who was forsaken and betrayed on the arid and bitter plains of Karbala. Processions of flagellants and self-mortifiers, awash in grief and guilt at the way in which their sacrificed leader had been abandoned. The masochistic Shia holiday of Ashura bears the strongest resemblances to the sort of Semana Santa, or “Holy Week,” in which the cowls and crosses and hoods and torches are borne through the streets of Spain. Yet again it is demonstrated that monotheistic religion is a plagiarism of a plagiarism of a hearsay of a hearsay, of an illusion of an illusion, extending all the way back to a fabrication of a few nonevents.

Another way of putting this is to say that, as I write, a version of the Inquisition is about to lay hands on a nuclear weapon. Under the stultified rule of religion, the great and inventive and sophisticated civilization of Persia has been steadily losing its pulse. Its writers and artists and intellectuals are mainly in exile or stifled by censorship; its women are chattel and sexual prey; its young people are mostly half-educated and without employment. After a quarter century of theocracy, Iran still exports the very things it exported when the theocrats took over—pistachio nuts and rugs. Modernity and technology have passed it by, save for the one achievement of nuclearization.

This puts the confrontation between faith and civilization on a whole new footing. Until relatively recently, those who adopted the clerical path had to pay a heavy price for it. Their societies would decay, their economies would contract, their best minds would go to waste or take themselves elsewhere, and they would consistently be outdone by societies that had learned to tame and sequester the religious impulse. A country like Afghanistan would simply rot. Bad enough as this was, it became worse on September 11, 2001, when from Afghanistan the holy order was given to annex two famous achievements of modernism—the high-rise building and the jet aircraft—and use them for immolation and human sacrifice. The succeeding stage, very plainly announced in hysterical sermons, was to be the moment when apocalyptic nihilists coincided with Armageddon weaponry. Faith-based fanatics could not design anything as useful or beautiful as a skyscraper or a passenger aircraft. But, continuing their long history of plagiarism, they could borrow and steal these things and use them as a negation.

This book has been about the oldest argument in human history, but almost every week that I was engaged in writing it, I was forced to break off and take part in the argument as it was actually continuing. These arguments tended to take ugly forms: I was not so often leaving my desk to go and debate with some skillful old Jesuit at Georgetown, but rather hurrying out to show solidarity at the embassy of Denmark, a small democratic country in northern Europe whose other embassies were going up in smoke because of the appearance of a few caricatures in a newspaper in Copenhagen. This last confrontation was an especially depressing one. Islamic mobs were violating diplomatic immunity and issuing death threats against civilians, yet the response from His Holiness the Pope and the archbishop of Canterbury was to condemn—the cartoons! In my own profession, there was a rush to see who could capitulate the fastest, by reporting on the disputed images without actually showing them. And this at a time when the mass media has become almost exclusively picture-driven. Euphemistic noises were made about the need to show “respect,” but I know quite a number of the editors concerned and can say for a certainty that the chief motive for “restraint” was simple fear. In other words, a handful of religious bullies and bigmouths could, so to speak, outvote the tradition of free expression in its Western heartland. And in the year 2006, at that! To the ignoble motive of fear one must add the morally lazy practice of relativism: no group of nonreligious people threatening and practicing violence would have been granted such an easy victory, or had their excuses—not that they offered any of their own—made for them.

Then again, on another day, one might open the newspaper to read that the largest study of prayer ever undertaken had discovered yet again that there was no correlation of any kind between “intercessory” prayer and the recovery of patients. (Well, perhaps some correlation: patients who knew that prayers were being said for them had more post-operative complications than those who did not, though I would not argue that this proved anything.) Elsewhere, a group of dedicated and patient scientists had located, in a remote part of the Canadian Arctic, several skeletons of a large fish that, 375 million years ago, exhibited the precursor features of digits, proto-wrists, elbows, and shoulders. The Tiktaalik, named at the suggestion of the local Nunavut people, joins the Archaeopteryx, a transitional form between dinosaurs and birds, as one of the long-sought so-called missing links that are helping us to enlighten ourselves about our true nature. Meanwhile, the hoarse proponents of “intelligent design” would be laying siege to yet another school board, demanding that tripe be taught to children. In my mind, these contrasting events began to take on the characteristics of a race: a tiny step forward by scholarship and reason; a huge menacing lurch forward by the forces of barbarism—the people who know they are right and who wish to instate, as Robert Lowell once phrased it in another context, “a reign of piety and iron.”

Religion even boasts a special branch of itself, devoted to the study of the end. It calls itself “eschatology,” and broods incessantly on the passing away of all earthly things. This death cult refuses to abate, even though we have every reason to think that “earthly things” are all that we have, or are ever going to have. Yet in our hands and within our view is a whole universe of discovery and clarification, which is a pleasure to study in itself, gives the average person access to insights that not even Darwin or Einstein possessed, and offers the promise of near-miraculous advances in healing, in energy, and in peaceful exchange between different cultures. Yet millions of people in all societies still prefer the myths of the cave and the tribe and the blood sacrifice. The late Stephen Jay Gould generously wrote that science and religion belong to “non-overlapping magisteria.” They most certainly do not overlap, but this does not mean that they are not antagonistic.

Religion has run out of justifications. Thanks to the telescope and the microscope, it no longer offers an explanation of anything important. Where once it used to be able, by its total command of a world-view, to prevent the emergence of rivals, it can now only impede and retard—or try to turn back—the measurable advances that we have made. Sometimes, true, it will artfully concede them. But this is to offer itself the choice between irrelevance and obstruction, impotence or outright reaction, and, given this choice, it is programmed to select the worse of the two. Meanwhile, confronted with undreamed-of vistas inside our own evolving cortex, in the farthest reaches of the known universe, and in the proteins and acids which constitute our nature, religion offers either annihilation in the name of god, or else the false promise that if we take a knife to our foreskins, or pray in the right direction, or ingest pieces of wafer, we shall be “saved.” It is as if someone, offered a delicious and fragrant out-of-season fruit, matured in a painstakingly and lovingly designed hothouse, should throw away the flesh and the pulp and gnaw moodily on the pit.

Above all, we are in need of a renewed Enlightenment, which will base itself on the proposition that the proper study of mankind is man, and woman. This Enlightenment will not need to depend, like its predecessors, on the heroic breakthroughs of a few gifted and exceptionally courageous people. It is within the compass of the average person. The study of literature and poetry, both for its own sake and for the eternal ethical questions with which it deals, can now easily depose the scrutiny of sacred texts that have been found to be corrupt and confected. The pursuit of unfettered scientific inquiry, and the availability of new findings to masses of people by easy electronic means, will revolutionize our concepts of research and development. Very importantly, the divorce between the sexual life and fear, and the sexual life and disease, and the sexual life and tyranny, can now at last be attempted, on the sole condition that we banish all religions from the discourse. And all this and more is, for the first time in our history, within the reach if not the grasp of everyone.

However, only the most naive utopian can believe that this new humane civilization will develop, like some dream of “progress,” in a straight line. We have first to transcend our prehistory, and escape the gnarled hands which reach out to drag us back to the catacombs and the reeking altars and the guilty pleasures of subjection and abjection. “Know yourself,” said the Greeks, gently suggesting the consolations of philosophy. To clear the mind for this project, it has become necessary to know the enemy, and to prepare to fight it.


*Note that, when you use this link to purchase the book, we earn from qualifying purchases as an Amazon Associate.


Further reading

Christopher Hitchens and the long afterlife of Thomas Paine, by Daniel James Sharp

Christopher Hitchens and the value of heterodoxy, by Matt Johnson

What has Christianity to do with Western values? by Nick Cohen

Against the ‘New Theism’, by Daniel James Sharp

Atheism, secularism, humanism, by Anthony Grayling

The ‘Women’s Revolution’: from two activists in Iran, by Rastine Mortad and Sadaf Sepiddasht

‘Words are the only victors’ – Salman Rushdie’s ‘Victory City’, reviewed, by Daniel James Sharp

The Satanic Verses; free speech in the Freethinker, by Emma Park

The Enlightenment and the making of modernity, by Piers Benn

Secularism and the struggle for free speech, by Stephen Evans

Do we need God to defend civilisation? by Adam Wakeling

The rhythm of Tom Paine’s bones, by Eoin Carter

Books From Bob’s Library #1: Introduction and Thomas Paine’s ‘The Age of Reason’, by Bob Forder

New Atheism, New Theism, and a defence of cultural Christianity, by Jack Stacey

‘An animal is a description of ancient worlds’: interview with Richard Dawkins, by Emma Park

‘We are at a threshold right now’: Lawrence Krauss on science, atheism, religion, and the crisis of ‘wokeism’ in science, interview by Daniel James Sharp

Consciousness, free will and meaning in a Darwinian universe: interview with Daniel C. Dennett, by Daniel James Sharp

‘Nature is super enough, thank you very much!’: interview with Frank Turner, by Daniel James Sharp

How three media revolutions transformed the history of atheism, by Nathan Alexander

Quebec’s French-style secularism: history and enduring value, by Mathew Giagnorio

How laïcité can save secularism, by Kunwar Khuldune Shahid

The case of Richard Dawkins: cultural affiliation with a religious community does not contradict atheism, by Kunwar Khuldune Shahid

Religion and the Arab-Israeli conflict, by Kunwar Khuldune Shahid

The need to rekindle irreverence for Islam in Muslim thought, by Kunwar Khuldune Shahid

Britain’s blasphemy heritage, by David Nash

Secularism is a feminist issue, by Megan Manson

The hijab is the wrong symbol to represent women, by Khadija Khan

Consciousness, free will and meaning in a Darwinian universe: interview with Daniel C. Dennett

The American philosopher talks about life, consciousness and meaning in a godless, Darwinian universe.

Daniel Dennett in 2012. Image credit: Dmitry Rozhkov. Image used under the Creative Commons Attribution-Share Alike 3.0 Unported license.

Introduction 

Daniel C. Dennett is Professor of Philosophy and Director of the Center for Cognitive Studies at Tufts University, Massachusetts. One of the world’s best-known philosophers, his work ranges from the nature of consciousness and free will to the evolutionary origins of religion. He is also known as one of the ‘Four Horsemen of New Atheism’, alongside Richard Dawkins, Sam Harris, and Christopher Hitchens.  

His many books include Consciousness Explained (1992), Darwin’s Dangerous Idea: Evolution and the Meanings of Life (1995), Freedom Evolves (2003), Breaking the Spell: Religion as a Natural Phenomenon (2006), Intuition Pumps and Other Tools for Thinking (2013), and From Bacteria to Bach and Back: The Evolution of Minds (2017).  

I recently spoke with Dennett over Zoom to discuss his life, work, and new memoir I’ve Been Thinking, published by Penguin: Allen Lane in October 2023. Below is an edited transcript of the interview along with some audio extracts from our conversation. Where some of the discussion becomes quite technical, links to explanatory resources have been included for reference. 

Interview 

Freethinker: Why did you decide to write a memoir? 

Daniel C. Dennett: In the book, I explain that I have quite a lot to say about how I think and why I think that it is a better way to think than traditional philosophical ways. I have also helped a lot of students along the way, and I have tried to help a larger audience. I have also managed to get the attention of a lot of wonderful thinkers who have helped me and I would like to share the wealth.  

As a philosopher who has made contributions to science, what do you think philosophy can offer science? Especially as there are some scientists who are dismissive of philosophy.

I think some scientists are dismissive towards philosophy because they are scared of it. But a lot of really good scientists take philosophy seriously and they recognise that you cannot do philosophy-free science. The question is whether you examine your underlying assumptions. The good scientists typically do so and discover that these are not easy questions. The scientists who do not take philosophy seriously generally do pretty well, but they are missing a whole dimension of their life’s work if they do not realise the role that philosophy plays in filling out a larger picture of what reality is and what life is all about. 

In your memoir, you say that it is important to know the history of philosophy because it is the history of very—and still—tempting mistakes. Do you mean, in other words, that philosophy can help us to avoid falling into traps? 

Exactly. I love to point out philosophical mistakes made by those scientists who think philosophy is a throwaway. In the areas of science that I am interested in—the nature of consciousness, the nature of reality, the nature of explanation—they often fall into the old traps that philosophers have learned about by falling into those traps themselves. There is no learning without making mistakes, but then you have to learn from your mistakes. 

What do you think is the biggest and most influential philosophical mistake that has ever been made? 

I think I would give the prize to Descartes, and not so much for his [mind-body] dualism as for his rationalism, his idea that he could get his clear and distinct ideas so clear and distinct that it would be like arithmetic or geometry and that he could then do all of science just from first principles in his head and get it right.  

The amazing thing is that Descartes produced, in a prodigious effort, an astonishingly detailed philosophical system in his book Le Monde [first published in full in 1677]—and it is almost all wrong, as we know today! But, my golly, it was a brilliant rational extrapolation from his first principles. It is a mistake without which Newton is hard to imagine. Newton’s Principia (1687) was largely his attempt to undo Descartes’ mistakes. He jumped on Descartes and saw further. I think Descartes failed to appreciate how science is a group activity and how the responsibility for getting it right is distributed. 

In your memoir, you lay out your philosophical ideas quite concisely, and you compare them to Descartes’s system in their coherence—albeit believing that yours are right, unlike his! How would you describe the core of your view? 

As I said in my book Darwin’s Dangerous Idea, if I had to give a prize for the single best idea anybody ever had, I would give it to Darwin because evolution by natural selection ties everything together. It ties life and physics and cosmology; it ties time and causation and intentionality. All of these things get tied together when you understand how evolution works. And if you do not take evolution seriously and really get into the details, you end up with a factually impoverished perspective on consciousness, on the mind, on epistemology, on the nature of explanation, on physics. It is the great unifying idea. 

I was lucky to realise this when I was a graduate student and I have been turning that crank ever since with gratifying results. 

How does consciousness come about in a Darwinian universe? 

First of all, you have to recognize that consciousness is not a single pearl of wonderfulness. It is a huge amalgam of different talents and powers which are differently shared among life forms. Trees are responsive to many types of information. Are they conscious? It is difficult to tell. What about bacteria, frogs, flies, bees? But the idea that there is just one thing where the light is on or that consciousness sunders the universe into two categories—that is just wrong. And evolution shows why it is wrong.  

In the same way, there are lots of penumbral or edge cases of life. Motor proteins are not alive. Ribosomes are not alive. But life could not exist without them. Once you understand Darwinian gradualism and get away from Cartesian essentialism, then you can begin to see how the pieces fit together without absolutes. There is no absolute distinction between conscious things and non-conscious things, just as there is no absolute distinction between living things and non-living things. We have gradualism in both cases.  

We just have to realise that the Cartesian dream of ‘Euclidifying’, as I have put it, all of science—making it all deductive and rational with necessary and sufficient conditions and bright lines everywhere—does not work for anything else apart from geometry. 

Why are non-naturalistic accounts of consciousness—‘mysterian’ accounts as you call them—still so appealing? 

I have been acquainted with the field for over half a century, but I am still often astonished by the depth of the passion with which people resist a naturalistic view of consciousness. They think it is sort of a moral issue—gosh, if we are just very, very fancy machines made out of machines made out of machines, then life has no meaning! That is a very ill-composed argument, but it scares people. People do not even want you to look at the idea. These essentially dualistic ideas have a sort of religious aura to them—it is the idea of a soul. [See the Stanford Encyclopedia of Philosophy entry on consciousness for an overview of the debate over the centuries.] 

I love the headline of my interview with the late, great Italian philosopher of science and journalist Giulio Giorello: ‘Sì, abbiamo un’anima. Ma è fatta di tanti piccoli robot’ – ‘Yes, we have a soul, but it’s made of lots of tiny robots’ [this interview appeared in a 1997 edition of the Corriere della Sera]. And that’s it! If that makes you almost nauseated, then you have a mindset that resists sensible, scientific, naturalistic theories of consciousness.  

Do you think that the naturalistic view of consciousness propounded by you and others has ‘won’ the war of ideas? 

No, we have not won, but the tide is well turned, I think. But then we have these backlashes.

The one that is currently raging is over whether Giulio Tononi’s integrated information theory (IIT) of consciousness is pseudo-science [see the entry for IIT in the Internet Encyclopedia of Philosophy for an overview]. I recently signed an open letter alongside a number of researchers, including a lot of the world’s very best on the neuroscience of consciousness, deploring the press’s treatment of IIT as a ‘leading’ theory of consciousness. We said IIT was pseudo-science. That caused a lot of dismay, but I was happy to sign the letter. The philosopher Felipe de Brigard, another signatory, has written a wonderful piece that explains the context of the whole debate. [See also the neuroscientist Anil Seth’s sympathetic view of IIT here.]

One of the interesting things to me, though, is that some scientists resist IIT for what I think are the wrong reasons. They say that it leads to panpsychism [‘the view that mentality is fundamental and ubiquitous in the natural world’ – Stanford Encyclopedia of Philosophy.] because it says that even machines can be a little bit conscious. But I say that machines can be a little bit conscious! That is not panpsychism, it is just saying that consciousness is not that magical pearl. Bacteria are conscious. Stones are not conscious, not even a little bit, so panpsychism is false. It is not even false, it is an empty slogan. But the idea that a very simple reactive thing could have one of the key ingredients of consciousness is not false. It is true. 

It seems that antipathy towards naturalistic theories of consciousness is linked to antipathy towards Darwinism. What do you make of the spate of claims in recent years that Darwinism, or the modern evolutionary synthesis of which Darwinism is the core, is past its sell-by date? 

This is a pendulum swing which has had many, many iterations since Darwin. I think everybody in biology realises that natural selection is key. But many people would like to be revolutionaries. They do not want to just add to the establishment. They want to make some bold stroke that overturns something that has been accepted.  

I understand the desire to be the rebel, to be the pioneer who brings down the establishment. So, we have had wave after wave of people declaring one aspect of Darwinism or another to be overthrown, and, in fact, one aspect of Darwin after another has been replaced by better versions, but still with natural selection at their cores. Adaptationism still reigns.  

Even famous biologists like Stephen Jay Gould and Richard Lewontin mounted their own ill-considered attack on mainstream Darwinism and pleased many Darwin dreaders in doing so.  But that has all faded, and rightly so. More recently, we have had the rise of epigenetics, and the parts of epigenetics that make good sense and are well-attested have been readily adapted and accepted as extensions of familiar ideas in evolutionary theory. There is nothing revolutionary there.  

Image: Penguin/Allen Lane, 2023.

The Darwinian skeleton is still there, unbroken. It just keeps getting new wrinkles added as they are discovered.  

The claims that the evolutionary establishment needs to be overthrown remind me of—in fact, they are quite closely related to—the enduring hatred of some people for Richard Dawkins’s 1976 book ‘The Selfish Gene’.

Yes, some people do. But I think that it is one of the best books I have ever read and that it holds up very well. The chapter on memes is one of the most hated parts of it, but the idea of memes is gathering adherents now even if a lot of people do not want to use the word ‘meme’. The idea of cultural evolution as consisting of the natural selection of cultural items that have their own evolutionary fitness, independent of the fitness of their vectors or users—that has finally got a really good foothold, I think. And it is growing. 

As one of the foremost champions of memetics as a field of study, you must be pleased that it is making a comeback, even if under a different name, given that earlier attempts to formalise it never really took off. 

Well, the cutting edge of science is jagged and full of controversy—and full of big egos. There is a lot of pre-emptive misrepresentation and caricature. It takes a while for things to calm down and for people to take a deep breath and let the fog of war dispel. And then they can see that the idea was pretty good, after all.  

You mentioned Stephen Jay Gould. In your memoir, Gould and several others get a ‘rogues’ gallery’ sort of chapter to themselves. How have the people you have disagreed with over the years influenced you?

Well, notice that some of my rogues are also some of the people that I have learned the most from, because they have been wrong in provocative ways, and it has been my attempts to show what is wrong with their views that have been my springboard in many cases. Take the philosopher Jerry Fodor, for example. As I once said, if I can see farther than others, it is because I have been jumping on Jerry like he is a human trampoline!  

If Jerry had not made his mistakes as vividly as he did, I would not have learned as much. It is the same with John Searle. They both bit a lot of bullets. They are both wrong for very important reasons, but where would I be without them? I would have to invent them! But I do not need to worry about beating a dead horse or a straw man because they have boldly put forward their views with great vigour and, in some cases, even anger. I have tried to respond not with anger but with rebuttal and refutation, which is, in the end, more constructive. 

And what about some of the friends you mention in the book? People like the cognitive scientist Douglas Hofstadter and the neuropsychologist Nicholas Humphrey?

People like Doug Hofstadter, Nick Humphrey, and Richard Dawkins—three of the smartest people alive! It has been my great privilege and honour to have had them as close friends and people that I can always count on to give me good, tough, serious reactions to whatever I do. I have learned a lot from all of them.  

Nick Humphrey, for example, came to work with me in the mid-1980s and we have been really close friends ever since. I could not count the hours that we have spent debating and discussing our differences. If you look at the history of his work, you will see that he has adjusted his view again and again to get closer to mine, and I have adjusted my view to get closer to his. I accepted a lot of his points. That is how progress happens.

How do you differentiate between philosophy and science? In your afterword to the 1999 edition of Dawkins’s 1982 book ‘The Extended Phenotype’, for example, you say that that work is both scientific and philosophical. And in your own career, of course, you have mixed science and philosophy quite freely. 

I think the dividing line is administrative at best. Philosophers who do not know any science have both hands tied behind their backs. They are ill-equipped because there is just too much counter-intuitive knowledge that we have gathered in science. That is one of the big differences between philosophy and science. In science, a counter-intuitive result is a wonderful thing. It is a gem, a treasure. If you get a counter-intuitive result and it holds up, you have made a major discovery.  

In philosophy, if something is counter-intuitive, that counts against it, because too many philosophers think that what they are doing is exposing the counter-intuitivity of various views. They think that if something is counter-intuitive, it cannot be right. Well, hang on to your hats, because a lot of counter-intuitive things turn out to be true!   

What you can imagine depends on what you know. If you do not know the science (or what passes for the science of the day, because some of that will turn out to be wrong), your philosophy will be impoverished. It is the interaction between bold new ideas and the utterly conservative, established scientific claims that produces progress. That is where the action is. Intuition is not a good guide here.

We all take for granted now that the earth goes around the sun. That was deeply counter-intuitive at one point. A geocentric universe and a flat world were intuitive once upon a time. 

Darwinism, the idea that such complexity as living, conscious organisms can arise from blind forces, is counter-intuitive, too.  

Yes. My favourite quote about Darwinism comes from one of Darwin’s 19th-century critics, who described it as a ‘strange inversion of reasoning’. Yes, it is a strange inversion of reasoning, but it is the best one ever.

It strikes me that some of the essential differences between your view and the views of others hark back in some way to Plato and Aristotle—the focus on pure reason and the immaterial and the absolute versus the focus on an empirical examination of the material world. 

Yes, that is true. It is interesting that when I was an undergraduate, I paid much more attention to Plato than to Aristotle. Again, I think that was probably because I thought Plato was more interestingly wrong. It was easier to see what he was wrong about. Philosophers love to find flaws in other philosophers’ work! 

That brings to mind another aspect of your memoir and your way of thinking more generally. You think in very physical, practical terms—thinking tools, intuition pumps, and so on. And you have a long history of farming and sailing and fixing things. How important has this aspect been to your thinking over the years? 

It has been very important. Since I was a little boy, I have been a maker of things and a fixer of things. I have been a would-be inventor, a would-be designer or engineer. If I had not been raised in a family of humanists with a historian father and an English teacher mother, I would probably have become an engineer. And who knows? I might not have been a very good one. But I just love engineering. I always have. I love to make things and fix things and figure out how things work.  

I think that some of the deepest scientific advances of the last 150 years have come from engineers—computers, understanding electricity, and, for that matter, steam engines and printing presses. A lot of the ideas about degrees of freedom and control theory—this is all engineering. 

Since you mention degrees of freedom, whence free will? You are known as a compatibilist, so how do you understand free will in a naturalistic, Darwinian universe? 

I think there is a short answer, which is that the people who think free will cannot exist in a causally deterministic world are confusing causation and control. These are two different things. The past does not control you. It causes you, but it does not control you. There is no feedback between you and the past. If you fire a gun, once the bullet leaves the muzzle, it is no longer in your control. Once your parents have launched you, you are no longer in their control.  

Yes, many of your attitudes, habits, and dispositions are ones you owe to your upbringing and your genes, but you are no longer under their control. You are a self-controller. There is all the difference in the world between a thing that is a self-controller and a thing that is not. A boulder rolling down a mountainside is caused deterministically to end up where it ends up, but it is not being controlled by anything, while a skier skiing down the slalom trail is also determined in where she ends up, yet she is in control. That is a huge and obvious difference.

What we want is to be self-controllers. That is what free will is: the autonomy of self-control. If you can be a competent self-controller, you have all the free will that is worth wanting, and that is perfectly compatible with determinism. The distinction between things that are in control and things that are out of control never mentions determinism. In fact, deterministic worlds make control easier. If you have to worry about unpredictable quantum interference with your path, you have a bigger control problem.  
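[Editorial aside: the short Python sketch below is not part of the interview; it is a minimal illustration of the causation/control distinction Dennett draws above. Both routines are fully deterministic, but only the ‘skier’ senses its own state and corrects course. The gate positions, drift, and gain values are arbitrary assumptions chosen for illustration.]

def boulder(steps=10, x=0.0, drift=1.3):
    """Open-loop: the past fully causes the path; nothing monitors or corrects it."""
    path = []
    for _ in range(steps):
        x += drift          # the same push every step, with no feedback
        path.append(round(x, 2))
    return path

def skier(gates, x=0.0, gain=0.5):
    """Closed-loop: still deterministic, but it senses where it is and steers toward each gate."""
    path = []
    for gate in gates:
        error = gate - x    # feedback: compare current position with the target
        x += gain * error   # deterministic correction toward the gate
        path.append(round(x, 2))
    return path

if __name__ == "__main__":
    print("boulder:", boulder())                 # ends up wherever the drift takes it
    print("skier:  ", skier([1, -1, 1, -1, 0]))  # steered toward each gate in turn
    # Re-running gives identical output every time: determinism is untouched.
    # The difference is that only the skier is a self-controller.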

I know that you have a long and ongoing dispute with, among others, the biologist and free-will sceptic Jerry Coyne on this.

Yes. I have done my best and spent hours trying to show Jerry the light! 

Alongside Richard Dawkins, Christopher Hitchens, and Sam Harris, you were one of the ‘Four Horsemen of New Atheism’. In your memoir, you say that you were impelled to write your book on religion, ‘Breaking the Spell’, because you were worried about the influence of religious fundamentalism in America—and you say that your worries have been borne out today. In your view, then, we are seeing a resurgence of dangerous fundamentalism? 

Dennett with two of his fellow ‘horsemen’, Christopher Hitchens (left) and Sam Harris (centre), at the Ciudad de las Ideas conference, 2009. Image credit: Werther mx. Image used under the Creative Commons Attribution-Share Alike 3.0 Unported license.

We are, yes, and we are seeing it across the world and across religions. I think that we have to recognise that a major part of the cause of this is the anxiety, not to say the terror, of the believers who see their world evaporating in front of their eyes. I warned about that in Breaking the Spell, and I said, ‘Look. We have to be calm. We have to be patient. We have to recognise that people are faced with a terrifying prospect, of their religious traditions evaporating, being abandoned by their children, being swept aside.’ No wonder that many of them are anxious, even to the point of violence.

In Breaking the Spell, I designed a little thought experiment to help those of us who are freethinkers, who are atheists, appreciate what that is like. Imagine if aliens came to America. Not to conquer us—imagine they were nice. They were just learning about us, teaching us about their ways. And then we found that our children were flocking to them and were abandoning musical instruments and poetry and abandoning football and baseball and basketball because these aliens had other pastimes that were more appealing to them. I deliberately chose secular aspects of our country for this experiment. 

Imagine seeing all of these just evaporate. What?! No more football, no more baseball, no more country music, no more rock and roll?! Help, help! It is a terrifying prospect, a world without music—not if I can help it! 

If you can sympathise with this, if you can feel the gut-wrenching anxiety that that would cause in you, then recognise that that is the way many religious people feel, and for good reason. And so we should respect the sorrow and the anger, the sense of loss, that they are going through. It is hard to grow up and shed religion. It has been our nursemaid for millennia. But we can do it. We can grow up. 

Is there a need for another ‘New Atheist’ type of moment, then, given the resurgence of religious fundamentalism and violence in the world? 

I am not sure that we need it. I am not going to give the New Atheists credit for this—though we played our role—but recent work has shown that the number of those with no religion at all has increased massively worldwide. Let’s just calm down and take a deep breath. Comfort those who need comforting. Try to forestall the more violent and radical responses to this and just help ease the world into a more benign kind of religion.  

And religions are doing that, too. Many religions are recognising this comforting role and are downplaying dogma and creed and emphasising community and cooperation and brotherhood and sisterhood. Let’s encourage that. I sometimes find it amusing to tease Richard Dawkins and say to him, think about this evolutionarily: we do not so much want to extinguish religion as get it to evolve into something benign. And it can.  

We need the communities of care, the places where people can go and find love and feel welcome. Don’t count on the state to do that. And don’t count on any institution that is not in some ways like good old-fashioned religion for that, either. The hard thing to figure out is how we can have that form of religion without the deliberate irrationality of most religious doctrine. 

And that is a difference between you and Dawkins. In ‘Breaking the Spell’, you did not expend much energy on the arguments for and against the existence of a deity, whereas Dawkins in ‘The God Delusion’ (2006) was much more focused on that question. 

Yes, but Richard and his foundation also played a major role in creating The Clergy Project, which I helped to found and which is designed to provide counsel and comfort and community for closeted atheist clergy. There are now thousands of clergy in that organisation and Richard and his foundation played a big role in setting it up. Without them, it would not have happened. So, Richard understands what I am saying about the need to provide help and comfort and the role of religion in doing so. 

You mentioned music earlier, which you clearly love as you devoted a long chapter in your memoir to it. So, what for you is the meaning of life without God and without a Cartesian homunculus?

Well, life is flippin’ wonderful! Here we are talking to each other, you in England [Scotland, actually, but it didn’t seem the moment to quibble!] and me in the United States, and we are having a meaningful, constructive conversation about the deepest issues there are. And you are made of trillions—trillions!—of moving parts, and so am I, and we are getting to understand how those trillions of parts work. Poor Descartes could never have imagined a machine with a trillion moving parts. But we can, in some detail now, thanks to computers, thanks to microscopes, thanks to science, thanks to neuroscience and cognitive science and psychophysics and all the rest. We are understanding more and more every year about how all this wonderfulness works and about how it evolved and why it evolved. To me, that is awe-inspiring.  

My theory of meaning is a bubble-up theory, not a trickle-down theory. We start with a meaningless universe with just matter, or just physics, if you like. And with just physics and time and chance (in the form of pseudo-randomness, at least), we get evolution and we get life and this amazingly wonderful blossoming happens, and it does not need to have been bestowed from on high by an even more super-duper thing. It is the super-duper thing. Life: it’s wonderful. 

I completely agree. I have never understood the appeal of religion and mysticism and ‘spooky stuff’ when it comes to meaning and purpose and fulfilment, but there we are. In your memoir, you discuss the thinking tools you have picked up over the years. Which one would you most recommend?

It might be Rapoport’s rules. The game theorist Anatol Rapoport formulated the rules for how you should conduct any debate. These are the rules to follow if you want constructive disagreement. Each of them is important.

The first thing you should do is to try to state your opponent’s position so vividly and clearly and fairly that your opponent says they wish they had thought of putting it that way. Now, you may not be able to improve on your opponent, but you should strive for that. You should make it clear by showing, not saying, that you understand where your opponent is coming from.  

Second, mention anything that you have learned from your opponent—anything you have been convinced of, something you had underestimated in their case.

Third, mention anything that you and your opponent agree on that a lot of people do not. 

Only after you have done those three things should you say a word of criticism. If you follow these rules precisely, your opponent will know that you really understand them. You have shown that you are smart enough to have learned something from them, or to agree with them about something.

What Rapoport’s rules do is counteract what might almost be called the philosopher’s blight: refutation by caricature. Reductio ad absurdum is one of our chief tools, but it encourages people to be unsympathetic nitpickers and to give arguably unfair readings of their opponents. That just starts pointless pissing contests. It should be avoided. 

I know the answer to this question, but have you ever been unfairly read? 

Oh yes! It is an occupational hazard. And the funny thing is that I have gone out of my way to prevent certain misunderstandings, but not far enough, it seems. I devoted a whole chapter of Consciousness Explained to discussing all the different real phenomena of consciousness. And then people say that I am saying that consciousness is not real! No, I say it is perfectly real. It just is not what you think it is. I get tired of saying it but a whole lot of otherwise very intelligent people continue to say, ‘Oh, no, no, no! He is saying that consciousness isn’t real!’  

Well, given what they mean by consciousness—something magical—that is true. I am saying that there is no ‘real magic’. It is all conjuring tricks. I am saying that magic that is real is not magic. Consciousness is real, it is just not magic. 

Do you have any future projects in the works? 

I do have some ideas. I have a lot of writing about free will that has accumulated over the last decade or so and I am thinking of putting it all together in one package. But whether I will publish it as a book or just put it online, unified with introductions, I am not yet sure. Putting it online as a usable anthology in the public domain is a project I would like to do.

Further reading:

Darwinism, evolution, and memes

‘An animal is a description of ancient worlds’ – interview with Richard Dawkins, by Emma Park

Science, religion, and the ‘New Atheists’

Atheism, secularism, humanism, by A.C. Grayling

How three media revolutions transformed the history of atheism, by Nathan Alexander

Secular conservatives? If only…, by Jacques Berlinerblau

Can science threaten religious belief? by Stephen Law

Christopher Hitchens and the long afterlife of Thomas Paine, by Daniel James Sharp

Christopher Hitchens and the value of heterodoxy, by Matt Johnson

Meaning and morality without religion

What I believe – interview with Andrew Copson, by Emma Park

Morality without religion: the story of humanism, by Madeleine Goodall

‘The real beauty comes from contemplating the universe’ – interview on humanism with Sarah Bakewell
