Thursday, January 29, 2015

Islam and science have problems with their relationship

In August 2013, Richard Dawkins elicited one of his periodic bouts of controversy by tweeting that Trinity College, Cambridge had produced many more Nobel Laureates than the entire Muslim world.  While no one could deny that his tweet was objectively correct, any serious point he might have been making was drowned out by the condemnation of his quasi-racist language.  This is a shame because, unfortunately, science in much of the Muslim world really is in crisis.  

Nidhal Guessoum (who is associated with this website), a professor of Physics and Astronomy at the American University of Sharjah in the United Arab Emirates, likes to ask his students and colleagues about their scientific beliefs.  Despite living just down the road from the multinational entrepôt of Dubai, he has found that only about ten per cent of Muslims at the university accept that human beings evolved from other animals.  We should bear in mind that Guessoum is asking only undergraduates and members of the university faculty.  The population at large is likely to be even more dismissive of Darwin.  In comparison, Gallup polls of Americans have consistently found that half of those surveyed accept humans are descended from apes. We tend to think of the United States as a haven for creationists, but it has nothing on the UAE.  If the problem of science among Muslims were confined to rejecting Darwin, we would at least be confronting an opponent familiar from debates with Christian creationists.  But, as Professor Guessoum explains in Islam’s Quantum Question, science in the Islamic world has further problems.  He’s written the book to help explain Muslim attitudes towards science and discuss ways that the situation can be improved.  Unfortunately, besides creationism, there are several other serious threats to a harmonious relationship between Islam and modern science.  
The first threat is the claim that science is an imperialist cultural artefact with no objective claim to truth.  This leaves people in Islamic countries free to reject science as a colonial imposition.  The situation is made worse by left-wing professors in the West.  Muslim intellectuals in the United States and United Kingdom, who have drunk deep of the beguiling draught of postmodernism, have attempted to build an Islamic science better suited to their co-religionists.  Thinkers like Ziauddin Sardar and Seyyed Hossein Nasr are not household names, but their rejection of western science as incompatible with Islam has become conventional wisdom in many Muslim countries. Of course, their attempts to create an alternative natural philosophy have been an abject failure and they have been reduced to bickering among themselves.  In Islam’s Quantum Question, Guessoum is always impeccably polite, but it is clear he despairs of views like these.  His ridicule of Sardar, Nasr and their fellow travellers is all the more devastating for being so gently expressed.  Guessoum knows perfectly well that science is universal.  Its truths are the same everywhere. The idea of a specifically Islamic science makes as little sense as a Christian or atheist one.  This is why Abdus Salam, who won the Nobel Prize for Physics in 1979 for his work on the weak nuclear force, is one of Guessoum’s heroes.  Salam saw himself as an ambassador for science to the developing world.  It goes without saying that he found no conflict between his own Muslim devotion and his epochal work in nuclear physics.
A second threat is the low status of science in Muslim-majority countries.  For example, at the beginning of his book, Guessoum mentions two competitions for the pupils at his son’s school in the UAE.  One was for a science project that he was asked to judge.  It is fair to say he was less than impressed by the quality of many of the entries, but attendance was so sparse that few people were there to notice.  Three days later, the school held its annual Koranic memorisation competition.  Hundreds of parents, the local media and various guests of honour crammed into the school hall to witness prizes totalling $20,000 being handed out to the pupils.  With educational priorities like these, it is hardly surprising that the Muslim world has to import scientists and engineers from the West or send its own sons and daughters to be trained there.
The third threat is even more insidious.  The school of I’jaz teaches that the findings of modern science have been miraculously present in the Koran all along.  The verse “the Originator of the heavens and the earth! When he decrees a thing, he says only: ‘Be!’ And it is.” (Q 2:117) is taken as a reference to the Big Bang.  But proponents of I’jaz make even more surprising claims.  They claim to derive the speed of light, roughly 300,000 kilometres per second, from the verse “He directeth the ordinance from the heaven unto the earth; then it ascendeth unto Him in a Day, whereof the measure is a thousand years of that ye reckon.” (Q 32:5).  Guessoum devotes one of the appendices of his book to refuting this “calculation” of the speed of light.  But there are many other examples taken extremely seriously and he is clearly angered that I’jaz is so influential.  Unfortunately, despite having much in common with the Bible Code craze of a few years back, I’jaz is fast becoming mainstream in Muslim countries.  Guessoum found that 80% of the Muslim faculty and students at his university in the UAE believed that the Koran contains explicit statements now known to be scientific facts.  
Guessoum himself suffers from none of these misapprehensions.  He is a believing Muslim but his scientific views are similar to those of most of his Western colleagues: he wholeheartedly accepts Darwin’s theory (rejecting intelligent design) and sees science as universal rather than local.  He rejects I’jaz but, like many Christians, does see merit in the fine-tuning argument and theistic evolution.

Guessoum’s experience shows that reconciling Islam and Science is a problem that has already been solved.  Islam’s awesome thirteen hundred years of scholarship has already furnished the answers in this debate.  All that is needed is to retool the arguments developed centuries ago to make them fit for the modern era.  The original debate between Islamic and foreign sciences took place in the ninth to twelfth centuries when ancient Greek natural philosophy and mathematics were first translated into Arabic.  Admittedly, back then, in a high-scoring game, the mystics eventually prevailed with a late winner from Al-Ghazzali (d. 1111).  Warning Muslims against the work of Euclid and Ptolemy, Al-Ghazzali said they are “the preliminary to the sciences of the ancients, which contain wrong and harmful creeds.”  He was probably talking specifically about astrology but as his influence has waxed, the sciences in the Islamic World have waned.  

That is not to say that the traditional picture of Al-Ghazzali snuffing out Golden Age science is accurate.  The astronomical work of Nasir al-Tusi (d. 1274) and Ibn al-Shatir (d. 1375) alone refutes that theory.  The mathematical models of both these scholars were used, unacknowledged, by Nicolaus Copernicus (d. 1543) in his Revolutions of the Heavenly Spheres.  Nidhal Guessoum’s own favourite Islamic thinker is Ibn Rushd, known as Averroes in the West.  He took on the challenge of Al-Ghazzali and has been a pariah among conservative Muslims ever since.  Only in the West is Averroes hailed as one of the most important thinkers in history.  

So obviously, science in the Islamic world did not break the mould in the way that it did in the West.  But, as George Saliba notes in his Islamic Science and the Making of the European Renaissance, the question of why modern science didn’t arise in the Muslim world is the wrong one to ask.  It didn’t arise in all sorts of advanced civilisations including China or India; ancient Greece and Rome; or Sassanid Persia and its great antagonist Byzantium.  Instead, we should be wondering why a recognisably modern science had arisen in the West by the end of the nineteenth century.  That this didn’t happen elsewhere isn’t because of the deficiencies of other societies.  It’s just that there was a unique conjunction of historical contingencies in one place and time.  Exactly what those contingencies were remains a matter of much debate.

What, then, is the solution to Islam’s quantum question?

There are some clues in Guessoum’s book.  One element is the need to ensure that any discussion of science is grounded in the Koran.  The esteem in which this book is held among Muslims is well known.  Since it is full of injunctions to observe and understand nature, there is strong support for science to be found within its pages.  It also supports a philosophy of the unity and predictability of nature which adheres well to the axioms of modern science.  

Obviously, Koranic literalism can be unhelpful.  Luckily, there is a history of interpretation that allows the Koran to be read in a figurative rather than literal way where necessary.  A passage that looks like a straightforward statement of fact is likely to also have a range of metaphorical and religious interpretations.  Guessoum warns of some pitfalls in this approach.  For instance, the Arabic word commonly translated as science today, ‘ilm, has the wider meaning of “knowledge” in classical Arabic.  Nonetheless, the essential lesson is that revering the Koran as the word of God does not also mean having to treat it as a scientific textbook.

To a great extent, the relationship between science and Christianity is of academic interest only.  Readers of this blog might find the subject fascinating but it only rarely impinges on public life.  When it does, the issue in question is almost always creationism which most scholars in the field regard as one of the subject’s least interesting manifestations.  The situation among Muslims is different.  For them, the question of how to reconcile science to Islam is of epochal importance.  The best-case scenario could well see them in a better place than the West – a science that recognises its ethical boundaries and rejects the naïve utilitarianism of so many western scientists.  But for the present, the story is much less encouraging.  Nidhal Guessoum is in no doubt that the relationship between science and Islam is highly problematic and that this is holding back the development of Muslim societies.  Sadly, there is little that western Christians can do about this.  

Given the importance of its subject-matter, it is unfortunate that Islam’s Quantum Question is such a poorly organised and written book.  Even the title is a misnomer – Guessoum tells us early on he’s got hardly anything to say about quantum mechanics.  The book was originally written in French and, as far as I can tell, Professor Guessoum translated it into English himself.  The result is difficult to read and even harder to follow.  For most readers, the amount of new material is more than can be easily swallowed.  Muslim thinkers come thick and fast, sometimes referred to by their surnames and sometimes by their given names.  Keeping track of who is who and what they all think becomes a serious challenge.  It’s not even clear who the book is for.  There is lots of material which looks like it is aimed at an audience of western non-Muslims.  But Guessoum also spends a great deal of space elucidating the basic philosophy of science and presenting evidence that evolution is true.  

Deep within his book there is an essential text fighting to get out.  There is no doubting the significance or the urgency of the issues it raises.  Thus, despite its faults as a piece of writing, it is something that everyone interested in the interface between science and religion should read.

This article is a much expanded version of a review originally published in Science and Christian Belief 26(2) 2014.

Discuss this post at the Quodlibeta Forum

Friday, January 09, 2015

We know less about the ancient world than we think we do

On 15 June 763BC, a near total eclipse of the sun was visible over a swathe of the Near East.  As luck would have it, the event was noted in the official list of Assyrian high officials.  This record provides the earliest absolute and uncontroversial date in ancient history.  Using lists of kings and the chronicles of events, historians have counted the years back from this date to construct the chronology of ancient history.  
Radiocarbon analysis (which measures the decay of carbon 14, an unstable isotope) and the predictable styles of pottery found in digs both provide corroborating evidence.  Dating the layers of archaeological remains from the artefacts found within them relies on stratigraphy, the study of those layers, and can yield quite precise results.  The vast number of potsherds that have been unearthed allows archaeologists to use statistical methods to screen out random noise and anomalous samples that have found their way into the wrong strata.  Of course, pottery and radiocarbon methods need to be calibrated to produce absolute dates.  This has been done using samples of wood whose age can be determined by matching patterns of tree rings, a technique called dendrochronology.  We can count back sequences of tree rings from the present day, all the way to 2000BC.  By carbon dating the oldest samples of wood, we can tie the tree ring record to the results from carbon 14 decay.
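For the curious, the arithmetic behind an uncalibrated radiocarbon date is simple enough to sketch in a few lines of code.  This is my own illustration, not anything from the sources discussed here; by convention, raw radiocarbon ages are quoted using the Libby half-life of 5,568 years.

```python
import math

# Illustrative sketch only: the basic (uncalibrated) radiocarbon age formula.
# Conventional ages are quoted using the Libby half-life of 5,568 years.
LIBBY_HALF_LIFE = 5568.0

def radiocarbon_age(fraction_remaining: float) -> float:
    """Uncalibrated age in years from the fraction of carbon-14 left."""
    mean_life = LIBBY_HALF_LIFE / math.log(2)   # ~8,033 years
    return -mean_life * math.log(fraction_remaining)

# A sample retaining about 61% of its original carbon-14 comes out at
# roughly 4,000 radiocarbon years -- which is why tree-ring sequences
# reaching back to 2000BC can anchor and calibrate the scale.
print(round(radiocarbon_age(0.61)))
```

It is the calibration step, tying results like this to the counted tree rings, that turns these raw figures into calendar dates.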
By 1990, all these clues had yielded a multi-dimensional jigsaw which fitted together to almost everyone’s satisfaction.  There were a few heretics like Peter James, who suggested in his book Centuries of Darkness that the conventional chronology included two hundred additional years around 1000BC.  Thus remains that were conventionally dated to 1050BC actually occurred in 850BC.  Although James’s book is an excellent read, it fails to convince.  
Nonetheless, it has now turned out that the conventional chronology was not as secure as everybody else thought.  While James was convinced ancient history was two centuries too long, new evidence has begun to pile up in the opposite direction: it now looks like the conventional chronology is up to 150 years too short.  To put it another way, a cataclysm that everyone thought occurred in 1500BC actually happened before 1620BC.  The event in question was the massive eruption of the island of Thera in the Aegean Sea.  
Conventional chronology dated the end of the Minoan age in Crete to 1450BC.  Archaeologists assumed that the Thera eruption (on the modern island of Santorini) and its resulting tsunami had destroyed the Minoan fleet leaving them vulnerable to raiders from the mainland.  Certainly, the havoc wrought by the volcano can clearly be seen across the Eastern Mediterranean.  When Thera exploded, it blasted 60 cubic kilometres of rock into the atmosphere which settled over Asia Minor.  The resulting layer of ash and pumice is used to date the sites where it is observed.  And the eruption had other effects.  Sulphur dioxide released by the volcano spread across the northern hemisphere and fell to earth as acid rain, or more significantly as acid snow.  At the poles, not all of that snow has yet melted and, from the 1990s, it provides a new strand of evidence to date the eruption.  
Ice cores, drilled from the icecap of central Greenland, record the depth of each annual snowfall.  The ice holds within it information on the constitution of the atmosphere going back tens of thousands of years.  Like tree rings, each layer can be counted so as to give an absolute rather than relative date.  Big volcanic eruptions show up as spikes in the sulphur-content of the annual fall of snow: Krakatau in 1883; Tambora in 1815; Vesuvius in AD79.  Despite the presence of literate civilisations in Egypt, the Levant and Babylon, no written record of the Thera eruption exists, but the ice cores should overcome that deficiency and provide an absolute date for the cataclysm.  
Actually, the fact that the Thera event went unrecorded is less surprising than it seems.  Mankind has been remarkably unobservant of enormous volcanic eruptions.  An event in 1257AD, less than 800 years ago, is indelibly imprinted into both the Greenland and Antarctic ice cores.  It was greater in size even than Tambora and thus the largest eruption in the last ten thousand years.  But remarkably, no one knows where it happened.  Only in 2012 did Mt Rinjani in Indonesia emerge as a likely candidate.  Another big eruption, as recent as 1809, remains unidentified.
By 2000, the Greenland ice cores had revealed that Thera could not have happened when everyone thought it had.  The most likely anomaly in the ice dated from 1640BC, but this turned out to be from a volcano in Alaska.  At the same time, carbon dating an olive tree buried in the Aegean eruption yielded a date of around 1620BC.  Sulphur traces in the ice have been found that correspond to this date, although they are not as strong as might be expected.  Now, the dendrochronologists have piled in.  The Thera eruption would have caused unusually cold weather which stunted plant growth across the globe.  Evidence from bristlecone pines in the western United States, oak trees in Ireland and Swedish pines all point to a cold snap in 1627BC.  This is consistent with what we’d expect from a big volcano blowing its top in the Mediterranean.  Evidence from the Antarctic ice cores should be in shortly, but for a northern hemisphere volcano, this is unlikely to be conclusive.
The lack of a definitive date for the Thera disaster is frustrating, but we can now be reasonably sure it occurred 120 years earlier than thought.  The implications of this for ancient history are immense.  The chronology of the New Kingdom of Egypt was thought to be rock solid.  The discovery that they need to find room for a dozen more decades has so far been too disconcerting for Egyptologists to tackle.  There is a good chance that the extra years belong in a period after the well-documented New Kingdom called the Third Intermediate Period.
For historians of Babylonia, the crisis has been less existential.  Absolute dates for the second half of the second millennium are based on ancient observations of the planet Venus.  We know from modern calculations that a particular configuration of Venus recorded during the eighth year of the reign of a certain King Ammisaduqa must have occurred in 1702BC, 1646BC, 1582BC or 1550BC.  Other events in Babylonian history, such as the reign of King Hammurabi (famous for his law code) and the sack of Babylon by the Hittites are arranged around whichever absolute date is most convenient.  That some of these possible Venusian dates differ by 120 years, about the same length of time that the Thera eruption has been moved back, is highly suggestive to say the least.
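As a quick sanity check (my own toy calculation, not something from the historians), the gaps between those candidate Venus dates are all whole multiples of the planet’s eight-year cycle, which is exactly why the tablet alone cannot decide between them:

```python
# Toy illustration: the candidate years BC for the Venus observation in
# Ammisaduqa's eighth year, as quoted above.  The same configuration of
# Venus recurs at multiples of its eight-year cycle, so the gaps between
# all the candidates are divisible by eight.
candidates = [1702, 1646, 1582, 1550]

pairwise = [(a, b, a - b) for i, a in enumerate(candidates)
            for b in candidates[i + 1:]]
for a, b, diff in pairwise:
    assert diff % 8 == 0   # every gap is a whole number of Venus cycles
    print(f"{a}BC vs {b}BC: {diff} years apart")
```

Note the 120-year gap between 1702BC and 1582BC: precisely the sort of interval that redating Thera has put into play.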

So, where does all this leave biblical chronology?  That remains very unclear.  But the redating of Thera shows that we know a lot less about when things happened in the ancient world than we thought we did.
Discuss this post at the Quodlibeta Forum

Friday, January 02, 2015

The British Medical Association thinks it is bad for doctors to work weekends. Why do we still treat the medical profession as special?

In this age of public cynicism, few professions remain in high public esteem. No one ever liked journalists, politicians or estate agents. But in recent years, bankers and lawyers have become much less trusted: for good reason, you might say. The teaching unions continue with their long campaign to undermine the regard which much of the public still have for their members. But doctors have bucked this trend. Even recent scandals in the National Health Service (conveniently blamed on managers) haven’t really dented the way the public see physicians, or how physicians see themselves.

That isn’t too surprising. Doctors can do amazing things to heal us. They save lives habitually. In some circles, it is sacrilege even to criticise them. But, remarkably, doctors enjoyed a healthy professional reputation even back in the days when they couldn’t really help us at all. The miracle of modern medicine is much more recent than we realise. It was only from the mid-nineteenth century that doctors were more likely to cure than kill. Our expectation that we won’t die of infectious disease dates from after the Second World War.

It’s impossible to overstate just how useless pre-modern medicine was. If you fell ill, there was nothing, and I mean absolutely nothing, that a doctor could do to cure you. Granted, he had plenty of treatments and his learning was considerable. But bleeding, purgatives and the like would do you more harm than good. In essence, doctors were charging fat fees to hasten patients towards the grave.
Actually, I was slightly exaggerating when I said doctors could do nothing. There were some drugs available, like opium, to lessen pain. But you didn’t need a doctor to access these drugs and, although opium could reduce discomfort, you wouldn’t be cured. It was palliative only. Luckily for them, doctors did have another trick up their sleeves, although they did not know it. It’s called the placebo effect.

It’s well known that when you give a patient a sugar pill, something with no active ingredients, it can have marked beneficial effects. The mere fact that the patient thinks that they are being treated with an effective medicine makes them better able to heal themselves. And this effect is even more marked if the doctor himself thinks he is doing some good. That’s why new drugs are tested using the double-blind method. Patients are divided into two groups. One group is given the drug under test and the other is given a placebo. It’s called double-blind testing because the researchers giving the drug don’t know which is which any more than the subjects do. Only a second lot of researchers, who never actually come into contact with the patients, know who has received the real drug and who has received the fake.
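For those who like things concrete, here is a toy sketch of that allocation scheme (entirely my own illustration, with invented patient labels, not a real trial protocol): the coordinator holds the secret code, while the treating researchers and the patients see only indistinguishable, coded pills.

```python
import random

# Toy sketch of double-blind allocation -- purely illustrative.
random.seed(42)
patients = [f"patient-{i}" for i in range(8)]

# The coordinator, who never meets the patients, holds the secret code.
allocation = {p: random.choice(["drug", "placebo"]) for p in patients}

# The treating researchers see only coded, identical-looking pills, so
# neither they nor the patients know who is in which group.
blinded_view = {p: "coded pill" for p in patients}

# Only when the results are in does the coordinator unblind them.
drug_arm = [p for p, arm in allocation.items() if arm == "drug"]
print(f"{len(drug_arm)} of {len(patients)} patients received the real drug")
```

Real trials balance the two arms rather than tossing a coin per patient, but the essential point stands: the people measuring the outcome cannot leak their expectations to the patients.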

So, a doctor in the eighteenth century, with his training and aura of competence, could help his patients cure themselves merely because all parties thought that he could. This might even offset the damage that the doctor was doing by administering dangerous drugs or ordering bleeding. Clearly, the doctors who could best help their patients were the ones who didn’t do anything besides having a reassuring bedside manner and giving out harmless placebos. That’s generally what village healers and cunning folk did. Their magical cures were less likely to hurt you than the treatments of professional doctors. Most effective of all was praying at a saint’s shrine. If you believed in it, prayer would do as much good as a visit to the doctor, and it was unlikely to do you any harm at all. Physicians made their living by cloaking themselves in learning, jargon and professional qualifications. But it was all an illusion. No matter how many long years they studied Galen and Avicenna, they couldn’t help their patients one jot.

Incidentally, that’s how homeopathy got going. It was founded by Samuel Hahnemann in 1796 while doctors were still more likely to be licensed killers than saviours. Now, I hope I won’t offend anyone when I say that homeopathic medicines do precisely nothing. They rely entirely on the power of suggestion – in other words the placebo effect. But when homeopathy was founded, doing nothing could be a huge improvement on conventional treatments. So, it appeared to work better. This meant that homeopaths gained a respected place in British medicine that they have never really relinquished. Homeopathy is still available on the National Health Service.

All this raises a slightly disconcerting question. If doctors could maintain a professional reputation back when they couldn’t help their patients, is some of the reverence in which we hold them today really just a function of good public relations? That’s not to say that today’s medical professionals don’t deserve a large measure of respect. But placing them on a pedestal doesn’t do us or them any good at all. So when the British Medical Association say that doctors are too important to work at weekends, we should treat the suggestion with the scorn it deserves.

Discuss this post at the Quodlibeta Forum

Thursday, December 11, 2014

The Perils of Earthquake Prediction and Political Hubris

Six seismologists were finally acquitted last month after they made rash comments before the L’Aquila earthquake of 2009.  They were victims of unreasonable expectations, not of scientific ignorance. 
At 3am on 6 April 2009, an earthquake measuring 6.2 on the Richter scale devastated the medieval city of L’Aquila in the Apennine Mountains of central Italy.  Over three hundred people died and the city’s cultural treasures were left in a parlous state.  But it is events that unfolded shortly before the quake that have continued to attract worldwide attention. 

Six days before the disaster, a government committee of six seismologists and a public official tried to dampen down fears that an earthquake was imminent.  In particular, the one member of the committee who was not a scientist, Bernardo De Bernardinis, stated that there was “no danger”.  In 2012, a local court convicted the committee members of involuntary manslaughter.  When they were first charged, numerous professors, decorated with the weightiest of credentials, wrote letters attacking the prosecutors.  Putting these men on trial was an affront to the dignity of science, they cried.  When the seven were found guilty, the cacophony of outrage doubled in volume.  The L’Aquila seven joined Galileo as paradigms of scientific martyrdom. 
The wheels of Italian justice turn extremely slowly and only now have appeals against the decision of the local court been handed down.  The six scientists have had their convictions quashed, but that of De Bernardinis, who said there was no danger, was upheld.  Further appeals are still possible.

So what was really going on?  The world’s media misreported the 2012 trial with an even greater level of ineptitude than usual.  No prosecutor had alleged that failing to predict the earthquake was a criminal offence.  This was because predicting an earthquake is impossible.  The record of failure is long and inglorious.  Indeed, we’ve only recently found out why earthquakes happen at all.  Aristotle thought they were a result of vapours escaping from the soil.  In the eighteenth century, some theorists blamed lightning strikes.  The theory of continental drift, first proposed in the early twentieth century, underpins our modern understanding of what causes the ground to shake.  But mainstream geologists long derided it, and only from the 1950s did it win widespread acceptance in the form of plate tectonics.
Prediction remains a pipedream.  Studies of animals have found that, while they can act strangely before a major quake, plenty of more innocent occasions set off the same behaviour.  A retired engineer claimed that an earthquake was looming at L’Aquila because he was picking up higher readings of a radioactive gas.  But again, this also happens when no earthquake is due.  Foreshocks, such as those felt at L’Aquila, occur before about half of large quakes.  In contrast, large quakes only follow foreshocks about one occasion in fifty.  Thus, major seismic events do give some warning signs.  It’s just that those warnings don’t usually presage a serious earthquake.  In the jargon, “false positives” are far more common than true predictors.  Just imagine if scientists demanded the evacuation of Los Angeles, promising the big one was around the corner, and then nothing happened.  That is the most likely outcome given the current state of knowledge.
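A toy calculation makes the lopsidedness plain (the one-in-fifty figure is the one quoted above; the number of foreshock sequences is invented for illustration):

```python
# Toy illustration of the base-rate problem with foreshocks.  Half of
# large quakes are preceded by foreshocks, yet only about one foreshock
# sequence in fifty is followed by a large quake.
foreshock_sequences = 1000          # hypothetical observed foreshock swarms
p_quake_given_foreshock = 1 / 50    # the figure quoted in the post

large_quakes_after = foreshock_sequences * p_quake_given_foreshock
false_alarms = foreshock_sequences - large_quakes_after

print(f"{large_quakes_after:.0f} quakes follow; {false_alarms:.0f} false alarms")
```

On these numbers, an evacuation ordered after every foreshock swarm would be a false alarm 98 times out of 100: which is why no responsible seismologist issues such orders.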

So, the seven Italians were not convicted of failing to predict the disaster in L’Aquila.  Rather they were accused of going about their duties negligently.  And negligence that causes death is often characterised as manslaughter.  Given that De Bernardinis assured the public they were completely safe, the failure of his appeal seems fair.  But the seismologists were in the impossible position of not knowing what the risk was, just that it probably wasn’t very great.

One question the case raises is the extent to which scientists should be held accountable.  The implication of many of the L’Aquila seven’s defenders is that scientists should be given carte blanche to say what they like.  Anything else would obstruct free enquiry.  But that can’t be right.  A scientist who carried out their work without due care or made off-the-cuff pronouncements would surely be culpable.  Given that we cannot predict earthquakes, a confident statement that no earthquake was due would be as bad as saying that it was imminent.  As in many other fields, an honest mistake is a defence, but negligence is not.  Scientists enjoy the status of latter-day sages.  Many imagine that their methods provide the only road to truth, not only in physics and biology, but in the social sciences as well.  So perhaps the message from L’Aquila is that in making those claims, scientists unintentionally erect expectations that they cannot possibly meet.
Still, there is another way of looking at the case of L’Aquila.  The government set up a committee to advise on earthquakes and the people of the city felt betrayed when it failed to protect them.  No matter that the state could no more control the ground than Canute could the tide.  Like so many westerners, the citizens of L’Aquila thought that their government was an indomitable Leviathan.  The media fuels this mood with its constant refrain that “something must be done” even when there is patently nothing that can be.  So grandiose has the rhetoric of the state become, that people imagine the reality should match the words.  Even in a country like Italy where the government is so self-evidently incompetent, it is still expected to be in control of events that are intrinsically beyond control.  Extending the scope of government to scientifically impossible tasks such as earthquake prediction is only a small step from expecting it to achieve the economically impossible by “kick-starting the economy” or the mathematically illogical task of preserving generous entitlements without raising taxes or cutting other spending. 

We have come to expect too much from our politicians and our scientists.  What is needed is for both professions to become more humble.  Otherwise, they can expect to be severely punished when they can’t live up to their rhetoric.
Discuss this post at the Quodlibeta Forum

Saturday, July 26, 2014

Text wanted

I'm looking for an introduction to philosophy text that meets the following criteria:

1. It's arranged topically, not historically.
2. It nevertheless deals with these topics by going over how various thinkers throughout history have addressed them -- although not exhaustively of course.
3. It's actually introductory, for people who haven't taken philosophy before.
4. It's inexpensive.

I had used Does the Center Hold? but its chapter on metaphysics was exclusively on philosophy of mind (not really dealing with space and time, causality, etc.), and the chapter on philosophy of religion was just terrible. Anyone have any ideas?

Discuss this post at the Quodlibeta Forum

Sunday, June 22, 2014

The Trace of God by Joe Hinman

“Are mystical experiences real?” asks Joe Hinman in his new book, The Trace of God: A Rational Warrant for Belief. It should not be too much of a spoiler to reveal that he concludes that they are. Joe has distilled the research on mysticism since William James to determine the commonalities of these experiences and to ask how much they can tell us about aspects of reality not readily accessible to everyday experience.

There are plenty of problems with studying mystical or “peak” experiences. For a start, they are highly subjective and extremely difficult to describe. Francis Spufford gives it a go in his excellent book Unapologetic, but reading about someone else’s experience is always second best. I am a decidedly non-mystical person and so I find the subject alien, if quite fascinating. Joe introduces us to Ralph Hood Jr’s “M scale” which attempts to provide a measure of mystical experiences so that they can be compared and validated. In fact, it turns out that there has been a great deal more work in this area than you might imagine. This has revealed uniformities that mean we can certainly group these experiences into a single category.

But are they real? The standard rationalist response is to dismiss mysticism as something that goes on inside our heads, often aided by illicit chemicals. It has no external cause and so it is of interest to neurologists and hippies only. It certainly can’t tell us anything about God. As Joe explains, the problem with this dismissal is that all our experiences are ultimately subjective. We never enjoy unmediated access to reality, but only the most radical solipsist would claim this means that reality doesn’t exist. We know when we are awake and quite often know when we are dreaming. Mysticism isn’t like that – it feels more real than everyday life.

Still, the strength of science, says the rationalist, is that it overcomes subjectivism by insisting on repeatability. Joe marshals Thomas Kuhn and other sociologists of science to argue that scientists are just as prone to herd mentality as the rest of us. I’m not sure this goes far enough to mean a mystical experience can claim parity of subjectivity with a laboratory experiment. But Joe doesn’t want to take things that far. He just argues that the mere fact that mystical life is subjective does not rule it out of court as a valid experience from which we can extrapolate knowledge. His basic argument is that we are justified in accepting religious truths on the ground of our own experiences (what is called the “religious a priori”). Thus, mysticism can provide us with a rational warrant for religious belief.

The bulk of The Trace of God is taken up by detailed rebuttals of sceptical arguments against mysticism: that it is just emotions and feelings, caused by drugs or brain chemistry. Joe blunts these arguments, but he would be the first to admit that he has not proved that mystic experiences are not purely internal. However, by showing that rationalists cannot invalidate mysticism, he leaves the road open to his own argument: these experiences are evidence of God in the way that a footprint is evidence of a wild animal. Mysticism provides a trace that gives us a rational warrant to postulate the existence of the being that gave rise to the spoor. We’re not dealing in proof here. In that respect, Joe’s project is similar to Alvin Plantinga’s work on warrant. But whereas Plantinga floats his justifications on rarefied philosophical air, Joe builds on the solid ground of widely experienced phenomena. No one, as far as I am aware, believes in God because of philosophy. Plenty of people base their religious faith on mystical experience.

This leaves us with two difficult questions: is belief in God warranted by someone else’s mystical experience? And where does religious doctrine fit into experiences that can be wildly inconsistent? Joe doesn’t really deal with the first of these. He concludes that the evidence of mysticism provides him with warrant for knowledge about God that he has anyway. It doesn’t seem to provide much evidence for the non-believer unless that non-believer is willing to invest in a religious interpretation of these experiences. As far as apologetics goes, this is a “come on in, the water’s lovely” argument. 

And what about doctrine? Joe, like me, is an orthodox Christian of liberal persuasion. One senses that, for him, the universal aspects of mysticism are an advantage not a problem. A Christianity that damned the rest of humanity (or worse a Christian sect that damned most Christians into the bargain) is not one that Joe or I would be comfortable with. If mystical experiences provide warrant for believing in God, they also provide evidence of God’s interest in all of humanity. Joe distinguishes between knowing God “face to face” and the knowledge of doctrine. He finds evidence for the distinction in the writings of Paul: the man who had the most famous mystical experience in history. 

Overall, as a first book that breaks new ground in the philosophy of religion, this book represents a considerable achievement. Joe’s publishers, Grand Viaduct, also deserve credit for helping him overcome the disadvantage of dyslexia to communicate his ideas in a format such that they might achieve the recognition they deserve.

Discuss this post at the Quodlibeta Forum

Wednesday, March 19, 2014


Ed Feser had an online debate with Keith Parsons. Considering that they had had some minor hostility issues with each other previously, it's remarkable how respectful and cordial they are, although it started with some less-than-polite forays. Here are the elements, and the comments are worth reading too:

Before the debate:
Keith Parsons: Can the arguments of the "new atheists" be made stronger?
Ed Feser: Four questions for Keith Parsons

The debate:
Parsons: Answering Prof. Feser
Feser: An exchange with Keith Parsons, part 1
Parsons: Reply to Prof. Feser's second question
Feser: An exchange with Keith Parsons, part 2
Parsons: Reply to Prof. Feser's third question
Feser: An exchange with Keith Parsons, part 3
Parsons: Reply to Prof. Feser's fourth question
Feser: An exchange with Keith Parsons, part 4

After the debate:
Feser: Can you explain something by appealing to "brute fact"?
Parsons: Response to Prof. Feser's response, part 1 (I'm not sure why this was just posted a couple days ago as it seems to form a part of the actual debate, but so be it)

Discuss this post at the Quodlibeta Forum

Wednesday, March 05, 2014

Arctic views

With Russia's sort-of invasion of Ukraine (some say it is definitely an invasion, some say it isn't), pundits on the political right are pointing out how, during the 2012 American presidential campaign, Mitt Romney was mocked by President Obama, John Kerry, and the media for suggesting that Russia was America's number one geopolitical foe. Some have looked further back to the 2008 campaign when Sarah Palin was not taken seriously when she suggested that Obama's stance towards Russia's invasion of Georgia would only encourage Putin to invade Ukraine.

This latter case is bringing up another issue involving Sarah Palin and Russia that I've never understood. During the 2008 campaign, an interviewer asked her for her thoughts on Russia, given its proximity to Alaska, the state Palin was the governor of at the time. She responded that Russia and Alaska are neighbors, and that in fact you can see Russian territory from Alaskan territory, specifically an island in the Bering Strait.

When I first heard this, I nodded my head. I thought it was common knowledge. There are two islands about two and a half miles apart in the Bering Strait: Little Diomede Island is Alaskan and Big Diomede Island is Russian. The Alaskan island has a town facing the Russian island, and the Russian island has a military base on it. The international date line goes right between the two islands, and the space between them (actually the whole Bering Strait) was known as the Ice Curtain during the Cold War. Monty Python's Michael Palin began one of his travelogues on Little Diomede Island, and tried to finish it there as well, but couldn't quite make it. I remember in the 1980s the comic strip Bloom County had a sequence about how some ignorant hicks heard that the USSR had moved within two and a half miles of American territory and were panicking about it. Lynne Cox, an American swimmer, swam between the two islands to "ease international tensions." Etc. Again, I thought this had permeated American culture and that everyone knew it -- not necessarily the names of the islands (which I didn't know), but just that there was an American island and a Russian island a couple miles apart in the Bering Strait.

In fact, the Diomede Islands aren't the only place where Alaska and Russia are within sight of each other: "To the Russian mainland from St. Lawrence Island, a bleak ice-bound expanse the size of Long Island out in the middle of the Bering Sea, the distance is 37 miles. From high ground there or from the Air Force facility at Tin City atop Cape Prince of Wales, the westernmost edge of mainland North America, on a clear day you can see Siberia with the naked eye." St. Lawrence Island has two towns on it. And Tin City, as noted above, is part of the North American continent, not an island. It is the mainland, and you can see Siberia from it: "The station chief at Tin City confirms that, for roughly half the year, you can see Siberian mountain ranges from the highest part of the facility."

Yet when Palin said you can see Russian territory from an Alaskan island, everyone went crazy about how stupid she was. On Saturday Night Live, Tina Fey, portraying Palin, said "I can see Russia from my house," which, incredibly, has entered the public consciousness as something Palin supposedly said. I guess if people were ignorant of the Diomede Islands -- and if they didn't realize that the proximity of mainland Russia to St. Lawrence Island and the westernmost part of the North American continent allowed an observer to see one from the other (which I was ignorant of and surprised by) -- I could understand them being skeptical of Palin's actual statement. But even if you think she's unintelligent and says foolish things in general, once you found out about these islands, why in the world wouldn't you respond by saying, "Oops, my bad, Palin was right." I mean, it's no big deal. You didn't know about a couple of islands in the Bering Sea. It's not a personality flaw. Yet Palin's statement is still held up as an example of stupidity on her part. I don't get it. Maybe it's because she supposedly used this to tout her international cred. But, again, she was asked about Russia's proximity to Alaska, and she merely confirmed it by accurately stating you could see Russian territory from Alaskan territory.

I'm not defending Palin's politics at all here. My confusion about this has nothing to do with her politics or her overall intelligence or how well-informed she is. I just don't understand why an innocuous and correct statement she made in response to an interviewer specifically asking her about this subject would cause so many people to have such a strong reaction that she must be wrong. If you think she's unintelligent, fine. If you think she's wrong about politics, great. This isn't politics, it's geography. What's the source of this reaction? It's this absurd polarization, this staking out of claims, this willful blindness that makes me avoid politics as much as possible.

Update: Here are some pictures that I obviously cannot vouch for. First, to show their proximity, are some pictures of the Diomede Islands:

Second, a picture of the Russian mainland from St. Lawrence Island, with the Alaskan town of Gambell in the foreground:

Third, a picture taken from Tin City (presumably at the part that's over 2000 feet above sea level), which, to reiterate, is the westernmost point of the North American continent. The Diomede Islands are about three-fourths up from the bottom of the picture -- Little Diomede is right in front of Big Diomede, so it's hard to distinguish them. On the upper right part of the picture is Cape Dezhnev, the easternmost point of the Asian continent, and on the upper left part of the picture is more Russia.

Another update: Here's the beginning of Michael Palin's travelogue Full Circle, which begins on Little Diomede Island with Big Diomede in the background:

Discuss this post at the Quodlibeta Forum

Wednesday, February 26, 2014

Some recent reads

Brainstorms: Philosophical Essays on Mind and Psychology by Daniel Dennett. I just added Dennett to my "Favorite Books" list on my profile page. Not because I agree with him, but because he has staked out the issues that I tend to move towards as well. The difference is in our assessment of the issues, but we agree on the meta issue of which issues should be addressed. I'm currently reading The Intentional Stance (I'm stuck in the far-too-long chapter "Beyond Belief" which defends his almost-but-not-quite eliminativism), and will move on to his new book afterwards, probably.

Physicalism, or Something Near Enough by Jaegwon Kim. You should read everything Kim writes. He is one of the most important voices in philosophy of mind. I'd like to make a website devoted to Kim's writings, analogous to the websites Andrew Bailey has made for Alvin Plantinga, Peter van Inwagen, and others. This book, which I disagree with for the most part, is his attempt to solve the mind-body problem and mental causation, which he also addressed in his excellent Mind in a Physical World. He seems to think he is successful -- and Kim is emphatically not one of those overconfident philosophers who solve deep problems with superficial analyses -- but he says he is still left with qualia. That's why it's near enough to physicalism: he's solved the most important and difficult part, and the remainder is a difficult but far less significant issue.

Reason, Metaphysics, and Mind edited by Kelly James Clark and Michael Rea. These are the proceedings from Alvin Plantinga's retirement conference. The essays aren't on Plantinga's philosophy, but rather (and I approve of this wholeheartedly) on issues that Plantinga wrote on extensively. As with any collection there are some good and some not as good, but it's definitely worth it.

The Concept of Canonical Intertextuality and the Book of Daniel by Jordan Scheetz. Jordan is one of my best friends, so I'm completely biased towards this book. We went to the same school, and we both took a class on Aramaic. This is relevant because his book is a focused commentary on Daniel, one of two Old Testament books (the other being Ezra) with a significant portion written in Aramaic rather than Hebrew. This book is a short commentary, but Jordan has mastered the languages so completely, it's incredible. Some general editor compiling a new series of Bible commentaries better contact him, because if he wrote a comprehensive commentary on Daniel, it would be amazing.

Simply Jesus by N.T. Wright. Very good. I need to start buying his Christian Origins and the Question of God series, seeing as how he's just published a fourth volume on Paul.

Jesus and the Logic of History by Paul Barnett. I've had this book for 15 years and never read it. I finally took it off the shelf a couple months ago. It's outstanding. One very interesting point he makes is how scholars start studying Jesus with the gospels, and never move on. He suggests that we start with the New Testament epistles because the information they contain about Jesus is a) tangential to the points they're making to their audience, and b) was already accepted by the original audience -- that is, the statements were meant to remind the readers about something, or put it in a specific context they may not have thought of before. Barnett suggests this makes these statements immune to many of the methodologies used to evacuate the gospels of historical validity.

The Martian by Andy Weir. My wife brought me back a pre-publication copy of this novel from a conference she attended in Chicago last month (it was officially published this month). It's about one of the first manned explorations of Mars, and one astronaut being accidentally abandoned there and struggling to survive, the hope of rescue, etc. I read a lot of science-fiction, so I'm referencing this book in lieu of a long list. My reasons for singling out this one are that a) it's a particular subject that I love: near-future exploration of the solar system; b) it threads the needle of being a very pleasant read while being nice hard science-fiction: the guy really knows the science and the technology (or at least is able to convince a layman like myself that he does). This is made all the more impressive by the fact that c) it's the author's first work. For my fellow science-fiction fans, I recommend it.

Discuss this post at the Quodlibeta Forum

Saturday, January 25, 2014

On the Appearance of Age; or Putting the "omph" in omphalos

I just finished reading Darwinia by Robert Charles Wilson, whose short story "Utriusque Cosmi" made me a lifelong fan no matter what else he writes. The premise of Darwinia is that in 1912 Europe essentially disappears and is replaced by an alternate Europe with roughly the same coastlines, rivers, mountains, etc., but no sign of human civilization, and with plants and animals from a very different evolutionary history than our own. Wilson uses this to ask questions about one of the primary arguments young-earth advocates use in order to avoid the scientific evidence that the earth and universe are billions of years old: the claim that God created things with a false appearance of age. Wilson's main character speculates about this issue:

Certainly Europe had been remade in 1912; just as certainly, these very trees had appeared there in a night, eight years younger than he found them now. But they did not seem new-made. They generated seed (spores, more precisely, or germinae in the new taxonomy), which implied heritage, history, descent, perhaps even evolution. Cut one of these trees across the bole and you would find annular growth rings numbering far more than eight. The annular rings might be large or small, depending on seasonal temperatures and sunlight ... depending on seasons that had happened before these plants appeared on Earth.

Similarly, young earth creationists claim that God created trees with annual rings, polar ice sheets with annual layers, and coral atolls with daily band deposits for days, years, and millennia that never happened. One prominent way they do this is to suggest that when God created the stars, he also created beams of light in transit between those stars and the earth (and presumably everywhere else in the universe). Otherwise, light from stars that are more than a few thousand years away from us wouldn't have reached us yet, and so couldn't be observed.

The problem here is very much the same as with tree rings that indicate weather conditions from years that, ex hypothesi, never happened. As I wrote here, when we observe light from distant objects, we don't just observe objects, we observe events. For example, astronomers regularly observe supernovae in other galaxies, millions of light years away. Now say God created the beams of light from those galaxies in transit a few thousand years ago. In that case, the light that left those galaxies immediately upon their creation would still have a long way to go before it reaches us; what we observe is just the beam God created between these galaxies and us. So when did these supernovae take place? Are they taking place now, that is, when they are observed by us? But then in a few million years, we'll see them again when the light they produce reaches us. It seems that since the light showing a supernova taking place was created in transit, these supernovae never happened.
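The timing problem can be made concrete with some toy arithmetic (a hedged sketch: the ten-million-light-year distance and six-thousand-year age below are illustrative assumptions, not figures from the post):

```python
# Toy model of the "light created in transit" scenario.
# Assumed figures (illustrative only): a supernova host galaxy
# 10 million light-years away, and a creation 6,000 years ago.
GALAXY_DISTANCE_LY = 10_000_000
CREATION_AGE_YEARS = 6_000

# Light travels one light-year per year, so light actually emitted
# by the galaxy at creation has so far covered only this distance:
distance_covered_ly = CREATION_AGE_YEARS

# The light we see today must therefore be the created-in-transit beam;
# the genuinely emitted light still has this many years left to travel:
years_until_real_light_arrives = GALAXY_DISTANCE_LY - distance_covered_ly

print(years_until_real_light_arrives)  # prints 9994000
```

On these numbers, any supernova observed in that galaxy today is an image carried by the created beam, depicting an event from millions of years before the beam itself existed; that is the sense in which such supernovae "never happened."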

Now this scenario is extremely contrived or ad hoc. But that's not the problem I have with it: the problem I have is that it ascribes deception to God. God is painting scenes on the sky that never happened, he is manipulating the universe to make it appear differently than what it actually is. But the God of the Bible cannot lie. It's not merely that he does not (in that he's never had occasion to) or will not (in that he chooses not to) but he cannot. It is contrary to his nature.

In response, I've heard young earth advocates challenge this by suggesting that this puts God in a box. God can create any way he wants to: why should we assume that it's contrary to his inscrutable will to create, say, a car that looks rusted and dilapidated? Or take a Scriptural example: God had the Hebrews wander in a seemingly random manner in order to trick Pharaoh into thinking that they were confused and could be easily defeated (Exodus 14:1-4). So God can manipulate for purposes that will often be beyond our ken.

There are two answers to this. First, it seems to me that creating something that manifestly displays properties it doesn't really have would still qualify as deception (and thus as lying). By "manifestly" I do not mean "superficial"; I mean something that is not ad hoc or contrived. If you built a car but designed it to look like an old rustbucket when it actually is not, would you be trying to deceive people? Whatever reason Pharaoh had for thinking that the path the Hebrews were wandering in was random, he had a much stronger reason for thinking that God was guiding them: he had just had ten plagues visited on his nation which were explicitly revealed to be a punishment from God for his failure to let the Hebrews go. Once he let them go, they traveled in such a way as to look as if they were hemmed in by the desert, but Pharaoh could not have thought that meant they could be recaptured without ignoring the much more obvious, dramatic, and explicit events that had just taken place.

Perhaps I'm wrong about this though. Perhaps creating a car that looks old when it is not would not automatically count as a lie. But here's my second point: it would count as a lie if God told us the car was a reliable and trustworthy revelation from him. And this is exactly what God says of the natural world. He tells us that nature is true revelation (which is redundant) from God, which is clear and understandable to all people in all times and places -- including times and places that did not have access to the Bible or any other form of special revelation. God never told Pharaoh that he would reveal himself through the route the Hebrews would travel after their departure from Egypt, but he did tell him to let his people go. If God created a new car that looked rusted and dilapidated and then told us that this car could be trusted to reveal the truth, he would be lying, because it wouldn't reveal the truth. And God can't lie.

In response to this, young earth proponents will often give Scriptural examples of God creating things with a false appearance of age, and then suggest that this could be true of the universe as a whole (which, incidentally, commits the fallacy of composition). Here are the three examples I've encountered:

The creation of Adam and Eve. Many argue that when God created Adam and Eve, he didn't create them as zygotes which then slowly grew to infancy, childhood, and eventually adulthood -- he created them as adults. Since they were created "full grown" they bore the appearance of an age that they didn't actually have.

Now I will not argue here about how literally we are supposed to take the story in Genesis 2; I'll grant that it's literal for the sake of argument. Nor will I enter into an extensive analysis as to whether the biblical text really commits us to the claim that God created Adam and Eve as adults. I'll grant this too. Even with this, I think it is still enormously problematic to suggest that God created Adam and Eve with a false appearance of age.

This can be illustrated by asking whether Adam's and Eve's cells and organs had physical indicators that they had been alive for twenty (or so) years. For example, according to this scenario God presumably created Adam and Eve with adult-sized hearts. But it doesn't follow from this that these hearts bore the wear and tear of having been beating for twenty years -- he created them brand new, not with a false appearance of age. Let me reiterate that: they would have appeared adult-sized AND brand new. The claim that being created as adults means being created with an appearance of age presupposes that size and age are essentially the same thing. This is obviously false.

Second, if the fact that they were created as adults indicated a false appearance of age, then we have opened a door we definitely do not want to go through. If Adam's and Eve's bodies bore a false appearance of age, we have no grounds for denying that their minds may have as well. In other words, God may have created Adam and Eve with false memories of childhoods which never happened. And thus, there is nothing to prevent us from maintaining the same thing of our own memories. God, in other words, would be implanting false memories into our minds. I've never seen anyone suggest anything like this, and it seems so absurd, and so blatantly contrary to God's truthful character, that I doubt any Christian would seriously propose it. But it's unavoidable that this would be a possibility if we try to argue that God's creation of Adam and Eve as adults implies that he created them with a false appearance of age.

Finally, the bodies of Adam and Eve are not here for us to examine to see if they really do bear a false appearance of age. But the universe is here for us to examine. We should always try to understand the unclear in light of the clear, not the other way around. We can't employ what is, at best, a highly speculative interpretation of Scripture in order to deny the reality of the world around us.

Jesus changing water into wine. At the beginning of his ministry, Jesus changed water in several jars into wine (John 2:1-11). Wine is by its very nature an aged substance. It takes time to ferment. When Jesus made wine instantaneously out of water he either radically sped up the fermentation process, or he created the wine with the appearance of having experienced the fermentation process when it had not. In either case, the wine would have borne a false appearance of age.

However, it is not evident that the molecular structure of wine by itself indicates a particular age or appearance of age. The fact that alcohol is naturally produced by fermentation does not imply that if God supernaturally changes H2O molecules into alcohol molecules, he makes them with the appearance of having been produced by fermentation. Just as the previous argument equates size with age, so this argument equates molecular structure with age, which again is obviously false.

I think some people who argue that changing the water to wine indicates an appearance of age are thinking of a wonderful passage by C.S. Lewis in his book Miracles about Jesus' miracles of fertility. Lewis points out that in the water to wine and the multiplication of bread and fish (Mark 6:30-44; 8:1-13), God is doing in a different way what he usually does through nature. Bread is multiplied in that a single seed grows into a full plant; fish are multiplied by procreation; and water is changed to wine through the growth of grapes and fermentation. "Thus, in a certain sense, He constantly turns water into wine".

Of course, it all turns on the phrase, "in a certain sense". Water, after all, doesn't ferment. The point of these miracles, Lewis argues, is that it shows that God is the God of fertility, the God of the vine, "He is the reality behind the false god Bacchus". God usually accomplishes these things through the universe he made, but he can also do it directly, "short circuit[ing] the process". To suggest that in these acts God is creating something with a false appearance of age is to completely miss the point. The miracle of changing water to wine was a miracle of transformation, not one of aging: God supernaturally changed the molecular structure of the water in the cisterns into the molecular structure of wine. In other words, God created all the elements of wine other than water and then placed them in the water. This doesn't mean that God "sped up" the natural process of fermentation any more than when someone mixes water with dehydrated wine (yes there is such a thing). Moreover, as with the bodies of Adam and Eve, the wine Jesus made from water is not present for us to examine. We simply cannot conclude, therefore, that it bore a false appearance of age.

Some may think that if we deny the possibility of God creating with a false appearance of age, we are claiming that he can't speed up natural processes. But I don't claim this. God can speed up (or slow down, or change in any way he wants) the processes of nature at his discretion. My claim is merely that, if he does, the objects acted upon would bear witness to his divine intervention. Or perhaps these critics are thinking that any proposed first state can be given a naturalistic history. Thus, it would be impossible for God to create without some appearance of age. This seems to assume that God's miracles could actually occur by natural processes given enough time, just as wine, bread, and fish can be produced by natural processes. Then, when God performs a miracle, he speeds up these natural processes. I simply disagree: while some miracles may be something that could occur naturally (perhaps the miracle then being in their timing; the parting of the Sea of Reeds might be an example), this is not the case for all of them. There are some miracles that could never occur naturally, so they wouldn't represent a false appearance of age. Water in a jar will never turn into wine by itself no matter how much time you give it. Natural processes will not bring a dead man back to life with a glorified body no matter how long you wait.

The budding of Aaron's staff. In Numbers 17, we are told that the Israelites were jealous of the special position God had given Moses and Aaron, so God had Moses take the staffs from the leaders of each of the twelve tribes and place them in the tent of meeting. The following morning, Aaron's staff had sprouted and budded, producing blossoms and ripe almonds. However, the miracle here was not that God "sped up" a natural process, but that he brought a dead piece of wood back to life. All of the reasons why the bodies of Adam and Eve and Jesus' transformation of the water into wine don't imply a false appearance of age also apply here. And just like the other two examples, we don't have Aaron's staff to examine to see if it really does exhibit a false age. How do we know that, upon closer examination, the bodies of Adam and Eve, the wine made from water, and Aaron's staff wouldn't give evidence that they had been supernaturally altered? Wouldn't it be more reasonable to conclude that God wouldn't cover up or conceal such remarkable examples of his power by making them appear normal when they weren't?

None of the examples above constitute examples of God creating things with a false appearance of age, and hence we have no grounds for asserting that he may have done so with the universe as a whole. We know that creation can be trusted to reveal the truth about itself, since God has gone to such lengths to tell us that it is a revelation by which he makes himself known to humanity. If this revelation weren't trustworthy, it's inexplicable why God would tell us that it is, unless God himself is a deceiver. That is not an option for the Bible-believing Christian.

(cross-posted at Agent Intellect)

Discuss this post at the Quodlibeta Forum