Monday, March 31, 2008

Celestial delectation...and gifts of grace

In his journal, the poet Samuel Taylor Coleridge jotted down what he took to be the five stages of prayer:

First stage -- the pressure of immediate calamities without earthly aidance makes us cry out to the Invisible.

Second stage -- the dreariness of visible things to a mind beginning to be contemplative -- horrible Solitude.

Third stage -- Repentance & Regret -- & self-inquietude.

Fourth stage -- The celestial delectation that follows ardent prayer.

Fifth stage -- Self-annihilation -- the Soul enters the Holy of Holies.

I mentioned this once before in these posts, and translated Coleridge's stages like this:

First stage -- Help!

Second stage -- Here I am!

Third stage -- Oh my God I am heartily sorry for having offended Thee...

Fourth stage -- Gee! -- followed by -- Wow!

Fifth stage -- silent attention.

For someone like me, raised in a traditional religion, getting to the fourth stage required something of a journey, a shedding of layers, a letting go of God. The fifth stage comes as a kind of grace -- out of the blue, occasional and unexpected -- when in the midst of a Wow! one becomes aware of how infinitely inadequate is any word. In those transcendent moments only silence is a suitable response. The essayist Pico Iyer put it this way: "Silence is the tribute that we pay to holiness; we slip off words when we enter a sacred place, just as we slip off shoes."

(Click on image.)

Sunday, March 30, 2008

A mind of its own?

Viagra turned ten years old this past week, which prompts the reprise of a Globe column from eight years ago. See this week's Musing.

Click to enlarge Anne's Sunday illumination.

Saturday, March 29, 2008

Picky, picky

A few days ago, Geoff gave us a beautiful example of "imprinting" from the work of Peter and Rosemary Grant with the finches of the Galapagos. Here's another example that struck me as rather amazing.

Back in 1967, Neal Griffith Smith of the Smithsonian Institution reported in Scientific American on a series of experiments with gulls. The problem that attracted his attention was this: Some species of gulls that live together look very nearly alike, yet do not interbreed. How is it that gulls of one species recognize others of their own kind?

For example, at one site on the eastern coast of Baffin Island in the Canadian Arctic, four species of gulls inhabit the same breeding ground without interbreeding. The only visual differences between the species are the color of the eyes and the fleshy rings around the eyes, and slight variations in the shading of the back and wings. These subtle differences are easily recognized by ornithologists. Are they equally important to the gulls?

To test the significance of visual cues for species isolation, Smith captured hundreds of gulls by lacing bait with drugs. Once the birds had been rendered immobile, Smith painted their eye rings. Gulls with light-colored eye rings were painted dark, and vice versa. As a control, other groups of birds were drugged but not painted, or drugged and painted with their own color. Still others, of course, were neither drugged nor painted.

The result: Eye-ring color (or contrast) did make a difference in several interesting ways. For example, females chose mates that looked like themselves. Light-eyed females chose light-eyed males, and ignored birds of their own species that had their eye rings painted dark. In a word, gulls are gullible.

The next question, then, was how female gulls know the color of their own eyes. Smith's conclusion: Since mirrors are generally absent on the cliffs of Baffin Island, female gulls must "imprint" on their parents soon after hatching, and thereafter seek a mate with eyes like mom and dad.

This business of imprinting is rather wonderful. Even more wonderful, perhaps, is the idea of the intrepid zoologist traveling the Arctic by dog sled and kayak, with paint box and drug capsules, bringing the light of science to bear upon the sex lives of gulls.

Friday, March 28, 2008

The big C

The men in my family are prone to prostate cancer, so I keep close tabs on my own prostate. What am I looking for? Here is a photomicrograph of prostate cancer cells, from the Wellcome Trust's annual scientific image competition. Other images among this year's winners are also of cancer cells. Granted, the colors are computer enhanced, but one sees a kind of beauty in these micro landscapes of death. As we explore the world of the invisibly small, we discover a majesty and mystery as great as anything we find among the distant galaxies.

"Beauty is nothing but the beginning of terror," wrote the poet Rainer Maria Rilke. This is the pretty army of cells that escaped from my father's prostate at age 64 and infiltrated his entire body. He fought them with every instrument at his disposal. His journals from his final weeks are full of quantitative data, graphs, theories. He was a scientist to the end, but no match for the simple power of life itself. If you visit the Wellcome Trust image page, make sure you watch the computer animation of cellular machinery spinning out the fabric of life. Sometimes to our dismay the spinning gets out of control.

Meanwhile, Anne has given us a link to a wonderful animation of the molecular machinery of a cell in action. I have tried to describe this stuff before in words--
Now here comes the really astonishing part.

All that DNA -- those hundreds of trillions of wads of thread -- is not just sitting there, static. As you read this essay, a flurry of activity is going on in every cell of your body.

Tiny protein-based "motors" crawl along the strands of DNA, transcribing the code into single-strand RNA molecules, which in turn provide the templates for building the proteins that build and maintain our bodies. Other proteins help pack DNA neatly into the nuclei of cells and maintain the tidy chromosome structures. Still other protein-based "motors" are busily at work untying knots that form in DNA as it is unpacked in the nucleus and copied during cell division. Others are in charge of quality control, checking for accuracy and repairing errors.

Working, spinning, ceaselessly weaving, winding, unwinding, patching, repairing -- each cell like a bustling factory of a thousand workers. Ten trillion cells humming with the business of life.
--but the animation does it ten times better.

Thursday, March 27, 2008


Back to the decorations by the sculptor Gislebertus on the cathedral at Autun, France. As you can see here, they are really quite marvelous. Whimsical, gentle, with a keen eye to human nature. Even the horrific Last Judgment on the west tympanum -- which was clearly meant to scare the bejesus out of any poor soul who looked upon it -- has about it a fairytale quality. I love especially the sequence associated with the birth of Jesus on the interior capitals. None more delightful than the angel awakening the Magi (click to enlarge).

The kings are nestled all snug in their bed, their crowned heads resting on a single pillow as if it is the most natural thing in the world for such royal personages to sleep together. (Are they naked beneath the cover? We see one bare arm and shoulder.) The sweet-faced angel gently touches the little finger of one of the kings. His eyes pop open. Look! Look! says the angel. There in the sky. The star. The star.

The story was as familiar and endearing in the 12th century as it is today. All of us who were raised Christian know it by heart. The kings. The camels. The gold, frankincense and myrrh. The visit to Herod. The flight into Egypt. The slaughter of the innocents. It was part of the intellectual furniture of our youth. And a lovely story it was too, never more beautifully told than on the capitals at Autun.

And the star? Might it have been real? Astronomers, professional and amateur, never tire of proposing candidates for the Christmas star. Forget for the moment that the whole story is likely apocryphal. Forget that the actual year of Christ's birth was probably sometime between 7 B.C., when Augustus ordered a census of Judea, and 4 B.C., when Herod died. Forget that the season of birth may have been spring, when shepherds watched their newborn lambs by night. Let's pick the traditional place and time, Bethlehem in Judea, on the night of December 24-25 in the year 1 B.C. (as historians reckon). With my astronomy software I can reconstruct what the kings might have seen as they approached the stable where Mary has given birth. The night is bright. A waxing gibbous moon stands near the Pleiades in Taurus. Saturn is in Gemini, and Venus is the evening star. As the long night progresses, the moon moves with spangled winter constellations to its western setting several hours before sunrise. As the moon sinks below the horizon, Jupiter rises in the east with the stars of Virgo. The sky grows bright. Peace on Earth, good will to men. It is the first Christmas morning.

Wednesday, March 26, 2008

Right before our eyes

Some of us will remember the terrific 1994 book by Jonathan Weiner, The Beak of the Finch, about the 14 or so species of finches that inhabit the Galapagos Islands off the coast of Ecuador. They are known as "Darwin's finches" because of the role they might have played in suggesting to the great biologist certain key ideas of evolution by natural selection. The beaks of the finches are marvelously adapted to their food sources -- large beaks crunch big seeds more effectively, but large-beaked birds are at a disadvantage when seeds are small, and so on. The finches presumably derived from an initial population that arrived at the Galapagos from the mainland, then diversified and speciated on the various islands. Weiner's book was based largely on the work of Peter and Rosemary Grant, who at that time had been studying the finches for more than 16 years -- banding birds, measuring beaks and body parts, locating nests, recording songs, keeping track of climate, food sources, competition, mate choice, and so on.

Now the Grants have published their own account of more than 30 years of research, How and Why Species Multiply: The Radiation of Darwin's Finches, and I can't wait to get my hands on a copy. From Weiner's account of their work, we can expect a stunning compilation of data painstakingly gathered over three decades, as climate and vegetation changed in response to swings in El Niño, all of which (according to reviews) is consistent with the basic insights of an insatiably curious young man who visited the islands 173 years ago. The armchair anti-evolutionists who want to deprive our public school children of exposure to one of the great insights of human genius might profitably read the Grants' book to see how real science is done.

Tuesday, March 25, 2008


A remarkable carving, long a favorite of mine, usually called "The Temptation of Eve" (click to enlarge). It resides in the Musée Rolin near the 12th-century Cathedral of Saint Lazare at Autun, France, of which it was once a part. The sculptor did many of the wonderful decorations of the cathedral. On the west tympanum are carved the words Gislebertus hoc fecit, "Gislebertus made this," traditionally assumed to be the sculptor's claim to authorship at a time when religious art was almost universally anonymous.

Whatever the sculptor's name, there is something hauntingly original about his work -- especially Eve. Lithe and sensuous, she seemingly swims through the garden, delectably naked. She is about to pluck the forbidden fruit, and her hand is at her blushing cheek as if she knows she is doing something naughty. She could be any young woman about to embark upon her first misadventure, her very own original sin.

This Eve is a part of nature, her body as sinuous as the twining plants. The stem is about to snap. The luscious fruit will be eaten, and Eve -- lovely Eve -- will bear the burden of innocence lost. And look! Look at her expression. She doesn't know we are watching. But we are watching. And we recognize what's going on. Who has not shared this delicious moment, the first post-adolescent sin?

Science has long since rendered unliteral the story of Genesis. It has given us instead Mitochondrial Eve, the matrilineal most recent common ancestor, who apparently lived in East Africa about 140,000 years ago, and who contributed her mitochondrial DNA to every human now alive. She was not alone with a single partner in whatever passed for her garden. She was part of a population of other human ancestors, one twig of a family tree with a long ancestry of her own. Can we assume she already bore within her evolutionary heritage some mix of the emotions we see in Gislebertus' Eve -- the anxious stirrings of the flesh, the will to be wayward, the headstrong disobedience? And, yes, maybe guilt too.

The new story, like the old one, grounds much of human nature in an ancestral past. The difference is this: In the new story there is no prelapsarian Eden, no world without the pain of childbirth, without thorns and thistles, without the sweat of the brow. We are and always have been like Gislebertus' Eve entwined in a living web. What we are seeing in the Autun sculpture is the dawning of moral consciousness, a moment of singular significance for each of us individually and for our species.

Monday, March 24, 2008


In a review article in a recent issue of Science, archeologist Fiona Coward asks "How and why did humans learn to learn?"

We are not the only species that learns. Some behaviors of other animals -- from ants to birds to dolphins -- are learned by imitation of other members of the same species. We share with our primate cousins a so-called "mirror neuron system," which automatically maps the observed actions of others onto one's own motor system.

Humans, however, seem to be unique in going beyond imitation to intuiting another person's intentions and states of mind. Coward suggests this might be the prerequisite for cumulative cultural transmission of learned behaviors.

Which is to say, many animals learn, but only humans teach.

Teaching is perhaps the quintessential human characteristic, the thing that ensures the cumulative transmission of cultural knowledge and behaviors. Parents teach. Elders teach. Older siblings teach. And teachers teach. The transmission of culture gave rise to a professional occupation -- the teaching of children and young adults. It is, I like to think, a noble profession. No teacher goes into teaching with a view to making money or garnering glory.

America's great experiment with secular public education is one of our nation's glories, the glue that binds a diverse population together into a tolerant and mutually respectful whole. The respect given to that project and to the teachers in it is a good measure of our self-respect as a people.

Sunday, March 23, 2008

"Our aim is perfection"

My father's birthday is this week. He would have been ninety-eight. Born in the year of Halley's Comet, he did not live long enough to see its return. See this week's Musing.

Click to enlarge Anne's Easter blessing.

Saturday, March 22, 2008


Here is the cover of the current issue of Science. What, pray tell, are these myriad golden spheres? They are Staphylococcus aureus, one of the most successful human pathogens, here revealed in all their glory by the scanning electron microscope.

Each aureate orb is a living organism. Invisibly small, but intent on doing the same thing you and I do -- propagating its genes. How small? An S. aureus bacterium is about a micron in diameter, which means a thousand could line up across this letter o. Think of them! That little column of one thousand, that little line of golden pearls. This is where it all began, the primaeval seed of life on Earth, the microscopic golden egg that laid the macroscopic goose.
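The scale claim is easy to check with a line or two of arithmetic (the millimetre width of a printed letter "o" is my assumption, not a figure from the article):

```python
# How many 1-micron S. aureus cells span a printed letter "o"?
cell_diameter_m = 1e-6      # S. aureus is about a micron across
letter_o_width_m = 1e-3     # assume a printed "o" is roughly 1 mm wide
cells_across = round(letter_o_width_m / cell_diameter_m)
print(cells_across)         # 1000
```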

And, oh, they love us. Here's a pic I found on the web showing some of the ways they love us (click to enlarge). They love us with a fierce, enduring love. Sometimes they love us to death.

It's not a one-way street. We have used microbes for generations to make beer, bread and cheese. We are learning how to use them as medicines to fight other invisible organisms. We are splicing their genes into food plants to make the plants insect or insecticide resistant, salt or drought tolerant.

More is coming, a third major cultural revolution (after the agricultural revolution and the industrial revolution). A massive utilization of microbial life. Plants that fix their own nitrogen, making unnecessary the application of artificial fertilizer. Pollution-free bio-electricity generation. Water purification and sewage disposal systems. Don't be surprised if the long-term solutions to our energy problem and global warming involve harnessing bacteria.

There is nothing fundamentally new about any of this. Visible plants and animals would not exist if bacteria had not already spent billions of years transforming the Earth, making it habitable for the likes of you and me, adding oxygen to the atmosphere, for example, or creating soil. They weren't doing it for us. They were doing it for themselves. And now we will use them en masse for our continued success.

If they don't get us first.

Friday, March 21, 2008

Rekindling the fire

Yesterday at 5:48 UT the Sun crossed the sky's equator and moved into the northern celestial hemisphere. Today at 18:39 UT the Moon is full. By ancient convention, Easter is the first Sunday after the first full Moon after the vernal equinox. Easter seldom occurs earlier than it does this year. Seldom does the liturgical celebration of the Sun's return coincide so closely with the astronomical event.

I loved the Easter week liturgies as a child. The three-hour silence on Good Friday afternoon. The replacement of the bells by the wooden clackers during the time Christ spent in the tomb. The purple, black and white vestments. The lighting of the Paschal fire. The denuding of the altar. The teasing unveiling of the Cross (Ecce lignum Crucis). The glorious sunrise celebrations on Easter morning. The rabbits and eggs. It was all so magnificently pagan. Even the name "Easter" may derive from the name of a pagan goddess.

It was the genius of the early Church to incorporate so many pagan rites and customs into the new theology. And why not? What was nearer or dearer to those who lived in the northern hemisphere than the fulcrum of the season when the Earth's axis tilts again towards the source of heat and light? We have lost most of our connection with nature's intrinsic rhythms. Artificial heat, air conditioning, electric light, and the global transport of food have pretty much erased our consciousness of seasonal change. We have divorced ourselves from the sky and from the Myth of the Eternal Return. Our theologies are neolithic, but they have been ripped from their neolithic roots.

Thursday, March 20, 2008

Seduction of the innocent

OK, I'll admit it. I don't need a new laptop.

My 2-year-old MacBook works perfectly fine. It does everything I could ask it to do.

But Apple Computer has a way of exciting my itch for novelty. They consistently turn out products -- hardware and software -- that are irresistibly gorgeous, playfully ingenious. Like the new MacBook Air. Whisper thin and feather light. Some little voice at the back of my brain whispers, "You have to have it."

I've lived for 44 years in a century-old house with wood-rot in the gutters. I drive a car with squeaks and rattles. I wear clothes until they fall apart. But computers. It seems like every couple of years I have to have what's new. Three years is an eternity.

Yeah, I know. I'm doing just what Apple wants me to do. But, damn, those Cupertino folks are clever. Even their ads and packaging make the competition look like a bunch of bumblers.

It was Apple, of course, that brought computers to the masses. It was Apple that made computers user friendly -- point and click, windows, menus, icons, set-back keyboards, touch pads, the whole shebang. They didn't necessarily invent all of these things, but they were the first to put them in our hands. By rights, Apple should be running the world right now, not Microsoft. How Microsoft can find a market for a crap product like Entourage (Outlook for Macs) is beyond me. Vista? Puh-leeze!

Oh, well, who said the world was just? The reason Apple was trumped by Microsoft is the same reason we diehard Apple fans are diehard Apple fans. We don't have much of a head for business. Rather, we admire technical cleverness and beautiful design. We are artists, writers, musicians, or plain old computer geeks who just know class when we see it.

And that's why I want a MacBook Air, even though I don't need one. It would be my only concession to conspicuous consumption. Just look at the thing. It sighs, it coos, it whispers sweet nothings -- "Buy me. Buy me." This is consumerism at its best. Or worst. Oh, Lord, help me to resist.

Wednesday, March 19, 2008


Barack Obama is a Muslim.

OK, it's not true. Let me repeat: IT'S NOT TRUE. But I just helped make it true. Or at least I just helped make it "true" for those folks who want to believe it's true.

Tom will have to tell us if we get a spike for today's post, from people who google "Obama is a muslim." They know what they are looking for. If you can find it on the internet, it must be true.

Farhad Manjoo, who writes the technology column for Salon, has a new book called True Enough: Learning To Live in a Post-Fact Society. I haven't read the book, but I read bits of it in the New York Times Magazine (where he cites the "Obama is a muslim" phenomenon) and on Salon. One of his points is this: Repeating a claim, even if only to refute it, increases its apparent truthfulness. He cites psychological studies that show we often judge the veracity of a claim by society's assessment of it. "If something seems familiar, you must have heard it before, and if you've heard it before, it must be true," writes Manjoo.

The internet certainly facilitates and globalizes the spread of phony baloney, but what Manjoo says about credulity isn't new. Humans have always lived in a post-fact society -- or rather I should say a pre-fact society. From the dawn of time people have looked to familiarity and tribal consensus for the validation of their beliefs. The closest we have come to creating a fact society is that body of assertions about the world we call science. Yes, consensus is part of it. But so is systematized doubt. So are quantitative data, mathematical theories, double-blind experiments, reproducible observations, peer review, and communication that makes no reference to an investigator's ethnicity, politics, or religion. Francis Bacon said long ago that what a person wants to be true, he preferentially believes. The whole point of science is to minimize personal and tribal bias.

The challenge is not learning to live in a post-fact society, but rather learning to live in a fact society. Mr. Obama would have fewer problems if more of us looked for empirical evidence to verify our beliefs.

Tuesday, March 18, 2008

The audacity of hope

In the middle of the last century, Teilhard de Chardin wrote: "A great many internal and external portents (political and social upheaval, moral and religious unease) have caused us all to feel, more or less confusedly, that something tremendous is at present taking place in the world. But what is it?"

What, indeed?

There are two prevailing opinions:

1) We are going to hell in a hand basket. Global warming, ecological disaster, tribal genocide, overpopulation, economic collapse, nuclear annihilation, moral decay: the future looks bleak, if there is a future. The best we can hope for is that some of us -- the chosen few -- will be raptured out of the cataclysm.


2) The thrust of history is towards cooperation, altruism, individual freedom, and the technological amelioration of famine and disease. We stand at the summit of a long uphill slog away from superstition and internecine strife, and the best is yet to come.

Each of us leans towards one scenario or the other, perhaps depending upon an inborn tendency towards pessimism or optimism. Teilhard, of course, opted for the brighter outlook; his world was heading toward a perfected Omega.

Only time will tell who is right. I personally side with Teilhard, maybe because I came more or less out of the same Catholic tradition, maybe because I was born an optimist, or maybe because the circumstances of my own life have been singularly blessed. I would like to think that my lifelong study of the history of science confirms an optimistic outlook. Certainly, the exponential growth of a body of reliable public knowledge, independent of ethnic, religious, and political prejudice, strikes me as one of the most hopeful indications that the future will be brighter than the past.

Monday, March 17, 2008

Marvelous unity

"In the beginning, there was not coldness and darkness: There was the fire," wrote the Jesuit paleontologist Teilhard de Chardin in The Mass on the World. "The flame has lit up the whole world from within...from the inmost core of the tiniest atom to the mighty sweep of the most universal laws of being."

It's been almost half a century since I first read those words in the early 1960s as a young graduate student in physics, but I remember the tingle they sent up my spine, the exhilaration. A world lit up from within! Oh, sure, I knew "fire" was a metaphor, but here was a metaphor that fed my sense of mystery, a flickering effervescence, permanent and ephemeral all at once, so different from the dry and static world of the physics texts. Mass on the world. Mass on the world! Teilhard offered a cosmic vision that resonated with the sensual Catholicism of my youth -- bread, wine, wax, flame, chrism, water, and incense wedded to the adamantine laws of nature I was learning in the classroom. This was the theology I had been waiting for, a God that was indistinguishable from the creation, a God that invited one into the creation, a God that could be approached through the senses -- sight, sound, smell, taste, touch. A God who lit up my physics texts from within.

Then, just a few years later, in 1965, physicists discovered the cosmic background radiation, the whisper of the big bang, the electromagnetic signature of the primeval fire. And Teilhard seemed to have anticipated it. Introibo ad altare Dei, I will go unto the altar of God. That gentle Jesuit mystic offered his Church a vision of divinity that rested well with the unfolding cosmology of the physicists. He died in 1955, in exile, with much of his life's work officially censored by the Church he had served. Near the end of his life, he wrote: "How is it possible that I am so incapable of passing on to others...the vision of the marvelous unity in which I find myself immersed?"

Sunday, March 16, 2008

The wreck of worlds

I have previously touched on some of the content of this week's Musing. Let me put it all together in a neat package.

Click to enlarge Anne's Sunday illumination.

Saturday, March 15, 2008

A fishy tale

I had an e-mail request from M. for a column I wrote four or five years ago for the Boston Globe, about goldfish memory. Herewith a much abbreviated version (the complete column to M. by return e-mail):

As dumb as a goldfish.

I mean, what could be dumber than swimming around in murky water all day -- glassy-eyed, slimy-scaled, cold-blooded -- waiting for someone to sprinkle food flakes into the pond or tank. On a scale of smarts, goldfish would seem to fall somewhere between a limpet and a stone.

Conventional wisdom has it that fish have a memory span of about a second.

But conventional wisdom, it turns out, is wrong. Goldfish are not the dummies they are made out to be.

Scientists at Plymouth University in England have successfully trained goldfish to push a lever to get food, and -- get this -- to do it at the same hour every day. And the fish remember what they have been taught for months.

Not exactly the science story of the year, but it does cause one to reflect on the nature of memory. What's going on in those tiny ichthyic brains that lets the goldfish remember when and where to go for dinner?

Scientists now overwhelmingly believe that memories are stored as webs of connections between spider-shaped brain cells called neurons.

Each neuron is connected through electrochemical connections to thousands of others. According to the current view, experience fine-tunes the connections, strengthening some, weakening others, creating a different "trace" of interconnected cells for each memory.

As the goldfish were trained by the British scientists, a cobwebby tangle of neurons was established in their brains: "Over here, push the lever. It's supper time."

If there is something in the human body that can fairly be called a soul, it is surely that ineffable electrochemical web of connections that was partly bequeathed to us by our genes and partly records a lifetime of experience -- including, of course, the cultural preferences we absorbed from our parents and teachers.

Some folks are put off by the idea of an electrochemical soul, and prefer the older notion of a self that is independent of our physical bodies. As for myself, I love the notion of that effervescent cobweb of neuronal connections contrived of the ineluctable stuff of creation by 4 billion years of evolution.

And I love too the way the new idea of soul binds us in a seamless web to all other creatures, goldfish included, and to the fabric of the universe itself.

Friday, March 14, 2008

Asking questions

An intriguing article in the latest issue of Science provides a new estimate of the rate at which the Grand Canyon was incised into the Colorado Plateau. The authors dated cave deposits in the walls of the canyon using an improved method of radiometric dating. The caves indicate the position of the water table at the time they were formed. The conclusion: The Colorado River took about 20 million years to cut its way downward through the one-mile depth of the canyon.

The river, of course, has stayed pretty much at the same level as the land went up. The land can rise for two reasons: 1) As the regional surface erodes, the crust rises, for the same reason a ship rises in the water as its cargo is removed; and 2) tectonic shifts can force the crust upwards. A comparison of the cutting rate of the Grand Canyon with estimates of the average surface erosion of the region suggests that tectonic uplift contributed to the incising of the canyon.

Altogether, a lovely piece of work that hangs together neatly with the known geological history of the West.

When I used to conduct an introductory earth science course for liberal arts students, I took them to a place on our New England campus (see photo) where the granite crust was manifestly weathering. We knew exactly when this particular outcrop was exposed to the elements by removal of the overlying deposits when the college sold them off to the state for the construction of a nearby highway -- about 50 years previously. I asked the students to estimate the average rate at which weathering eroded the surface rock. Typically, they came up with rates of about 1 mm per hundred years. ("Oh, about this much per hundred years," they'd say, indicating a tiny gap between thumb and forefinger.) Interestingly, this is consistent with the rates of average surface denudation in the Grand Canyon study -- which suggests that even untutored 19-year-olds can do some pretty nifty science if asked the right questions.

And more. If the granite was emplaced, say, a kilometer deep in the crust (as the textbooks suggest) at the time when the New England region was tectonically active, the students could work out when that might have been -- 100 million years ago. A nice order of magnitude calculation by a happy band of scholars, engaging with the deep geological history of the Earth as we stood in a yellow wood on a fine fall day.
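The students' order-of-magnitude estimate is easy to reproduce; here is a minimal sketch, using the 1-mm-per-century weathering rate and the one-kilometer burial depth quoted above:

```python
# Back-of-the-envelope: how long to strip 1 km of rock at ~1 mm per century?
rate_mm_per_year = 1.0 / 100     # ~1 mm of granite lost per hundred years
depth_mm = 1_000 * 1_000         # 1 km of overlying rock, in millimetres
years = depth_mm / rate_mm_per_year
print(f"{years:.0e} years")      # 1e+08 -- about 100 million years
```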

Thursday, March 13, 2008

Knowers and gropers

Karl Popper, the eminent philosopher of science, wrote, "It is imperative that we give up the idea of ultimate sources of knowledge, and admit that all knowledge is human; that it is mixed with our errors, our prejudices, our dreams, and our hopes; that all we can do is to grope for truth even though it is beyond our reach."

Four centuries after Galileo, the world is still beset by those who claim access to an ultimate source of knowledge -- divine revelation through holy books or prophets. If there is a fundamental way to divide people in the world today it is into gnostics and agnostics, those who Know and those who grope. The former are still in the overwhelming majority, while the latter have patiently created the scientific and technological infrastructure of modernity.

It was my pleasure once to visit Thomas Jefferson's home at Monticello. Everywhere one saw books, inventions and the other paraphernalia of a curious mind. Jefferson was an architect, horticulturist, paleontologist, archeologist, author and musician. Among his correspondents and friends were such scientists of the time as Joseph Priestley, Georges-Louis Leclerc de Buffon, Edward Jenner, and, well, almost anyone else you'd care to name. He understood well the connection between science and democracy, as articulated by Jacob Bronowski: "The society of scientists must be a democracy. It can keep alive and grow only by a constant tension between dissent and respect, between independence from the views of others and tolerance for them."

With others of like mind, Jefferson laid the foundations of a new political paradigm, one not beholden to ultimate sources of knowledge or the "divine rights" of kings. It was a noble experiment, by and large successful, one that must be continuously defended against those who Know. As I toured the beautiful precincts of Monticello, it was clear I was in the company of a groper, a man by no means perfect, who knew that the perfection of truth was a noble human enterprise, not to be completed in his lifetime, if ever.

Wednesday, March 12, 2008

Bless me, Father, for I have sinned...

I was describing to my 17-year-old granddaughter last evening the theological circumstances of my own moral life when I was her age, the whole superincumbent calculus of sin and salvation -- states of grace, venial sin, mortal sin, examination of conscience, confession, absolution, penance, Act of Contrition, ejaculations (short prayers), indulgences, Heaven, Hell, Purgatory -- in all of which eternity hung in the balance, an everafter of horrible torment or happy repose with the Beatific Vision. Dying with a single mortal sin on one's soul was enough to risk everlasting damnation, and everyone knows how easy it was for a 17-year-old boy to commit a mortal sin.

My granddaughter was, as one might imagine, incredulous. There was lots of laughter as I told the stories, although if you were a 17-year-old Catholic in 1953 it wasn't funny. My granddaughter was brought up without any of that grim moral bookkeeping -- indeed without any formal religion -- and I must say that her ethics are remarkably sound, and for better reasons than my own at age 17. She does good and avoids evil not because she seeks to avoid the fires of Hell, but because she was raised to know that her own happiness depends upon the happiness of others. I dare say that part of her moral sense is innate, perhaps even shared with her primate cousins.

One doesn't hear much of that old moral theology any more; the Church has swept the sillier aspects under the rug. For myself, by the time I was a young man there would be no parsing of supernaturalist theology, buying this, eschewing that; it was all or nothing. But I saw no need to put behind me the resonant subtexts of my natal faith, especially the sacramental tradition -- a sense that every aspect of the natural world is alive with outward signs of inner grace. Coming of age in the Catholic Church of the 1950s was like living in a haunted house, a house haunted by powers or spirits of which one had only the vaguest perception. I have long since put supernaturalism aside, but I still live in that haunted house, a universe whose every particular evinces an unknown and perhaps unknowable animating force that is worthy of attention, celebration, thanksgiving, praise.

Tuesday, March 11, 2008

The Middle Way

While my daughter visited last weekend, she was reading a little book by the Dalai Lama, How To Practice: The Way To a Meaningful Life. This is my scientist daughter, and she finds in the teachings of the Dalai Lama much to admire.

I do not know much about Buddhism, but of the various world religions, it seems most compatible with scientific empiricism and what we here call religious naturalism.

The Buddhist Middle Way, for example, seeks a middle path between philosophical extremes. There is a modesty in the Middle Way, a suspicion of dogmas of every stripe. The Buddha is said to have remained silent when asked to pronounce on certain questions that have bedeviled Western philosophy, such as whether the world is eternal or non-eternal, finite or infinite. Like the Buddha, when faced with the Big Questions of Ultimate Meaning, the religious naturalist is prepared to say "I don't know."

The Middle Way, like religious naturalism, eschews dualities: natural/supernatural, body/soul, matter/spirit. Philosophical dualism has been an endemic affliction of Western philosophy, and remains at the heart of the enduring tension between science and faith.

From what I know of the Dalai Lama, he is not threatened by science, as are so many practitioners of faith-based religions. He writes (in How To Practice): "I believe that the practice of compassion and love -- a genuine sense of brotherhood and sisterhood -- is the universal religion. It does not matter whether you are a Buddhist or Christian, Moslem or Hindu, or whether you practice religion at all. What matters is your feeling of oneness with humankind." And, I might add, a feeling of oneness with all of the universe.

Monday, March 10, 2008

Eye in the sky

As I write (Saturday morning), 100 miles southeast of Nassau, there is not a cloud in the sky. My granddaughter is on the terrace topping off her tan. The sea gently laps the shore.

But look! I log onto AccuWeather to see what the day holds. And watch on satellite loop a narrow band of heavy weather racing across South Florida, scooting to the northeast. We'll catch the tail. The afternoon promises wind, rain, maybe thunder.

What a thing it is that the world is watched in real time by eyes in the sky, and I can lie here on the couch in the central Bahamas and watch a storm rumble across Miami on the screen of my laptop. We take all this for granted, forgetting how recent it is that folks had any idea what weather was coming. In the second year of my life -- 1938 -- a hurricane roared unannounced out of the Atlantic and devastated New England.

And so a nod to a fellow who is usually made the goat of science, Robert Fitzroy, captain of H. M. S. Beagle, ardent creationist and nemesis of Charles Darwin. Fitzroy might fairly be called the first modern weatherman. If you missed my previous homage to a man who was caught tragically between science and faith, here is the link. And here is another story about a scientist caught between faith and evidence.

Sunday, March 09, 2008


I had my fling with moviedom, and it almost killed me. So why am I imagining another round? See this week's Musing.

Click to enlarge Anne's Sunday illumination. Thanks, Anne, as always, for gracing our page.

Saturday, March 08, 2008

Blue tang reef

There was a little out-of-the-way beach here on island with a lovely coral reef just offshore where we liked to snorkel. We called it Blue Tang Beach because of the beautiful blue tang fish we often saw there.

That was before the ultra-expensive resort came to the island, with its marina for luxury yachts carved out of the shore, just a few hundred meters from Blue Tang Beach. The last time we snorkeled there the reef was as good as dead. Another of our favorite snorkel reefs, not far from the resort, shows signs of stress as massive earth-moving machinery prepares yet another site for multimillion-dollar holiday homes.

And so we watch as virtually unregulated development on this sweet little island destroys the very things that made it an attractive destination. Dunes are bulldozed, ridges leveled, the land poisoned with pesticides, reefs destroyed, dark sky obliterated, all for the occasional pleasure of very rich foreigners.

There is of course an economic benefit for the islanders -- but at what cost? Carefully-regulated, ecologically-sensitive development on a more modest scale could have the same economic advantages while preserving the best of the Bahamas for the children and grandchildren of Bahamians.

Friday, March 07, 2008

Bow, take a bow

Wednesday's gathering of Venus, Mercury and the Moon in the dawn sky was breathtaking. Now the Moon has slipped close to the Sun, but young Moon enthusiasts -- like me -- wait for the new Moon to appear in the evening sky. That just might be tomorrow evening, when hereabouts the new Moon will be about 32 hours old -- eyelash thin, but easily doable if the western horizon is clear. This is the time of the year when the Moon's path is tipped most steeply to the horizon, which brings the crescent higher above the horizon at a younger age. Any Moon less than 30 hours old is wonderfully thin and barely visible in the darkening sky. The record for seeing a new Moon with the naked eye is about 15 hours, but I have never done better than the low 20s.

And while we are pondering the young crescent Moon, consider these lines from Coleridge's Ancient Mariner:
We listened and looked sideways up!
Fear at my heart, as at a cup,
My lifeblood seemed to sip!
The stars were dim, and thick the night,
The steersman's face by his lamp gleamed white;
From the sails the dew did drip --
Till clomb above the eastern bar
The horned moon, with one bright star
Within the nether tip.
The last line suggests the sort of thing you see on the Pakistani flag, and the flags of some other Moslem countries -- a star within the dark of the Moon. But of course this is impossible, as is apparent with a very young Moon when you can see the dark part of its face lit by earthshine -- sunlight reflected from the Earth. The Turkish flag gets the star part right, but (like the Pakistani flag) gives the crescent too much of a grasp; the horns of the Moon's crescent do not extend beyond a diameter.

Thursday, March 06, 2008

We hold these truths...

In her book The First Salute -- on the European context of the American Revolution -- Barbara Tuchman has this to say about living conditions in a late-18th-century warship:
Preservation of food from rot may have had no alternative, but human filth was not incumbent. Given sweat, vomit, defecation and urination, sexual emission and the menstrual flow of women, the human body is not a clean machine, and when people are crowded together in an enclosed space, its effluents can create a degree of unpleasantness raised to the extreme. Means of improving hygiene and sanitation could have been devised if they had been wanted, for men can usually work out the technical means to obtain what is truly desired unless the refrain "it can't be done" becomes their guide.
If living conditions provided for a fighting sailor were so nauseating, imagine what obtained in the hold of the ships that carried African slaves across the Atlantic. Pecuniary profit trumped all, for the slave merchants, and for the long-suffering sailors who could expect a share -- pittance that it was -- of the prizes of war. The captain of a ship-of-the-line might fairly expect to become rich from the spoils of conquest, and some did. Greed, power and war held sway. The milk of human kindness had nothing to do with it.

How was it then that by the end of the following century it had become the acknowledged responsibility of governments to provide citizens -- on land or sea -- with clean water and sanitation? A sea-change had occurred in public expectations. "It can't be done" gave way to "can do." Even the poorest citizen in the developed countries can now expect to turn a tap and flush a toilet, receive medical care in a clean hospital, and flick a switch to obtain refrigeration, heat, and light. A sailor on a modern warship expects no less, and enjoys pretty much all the comforts of home, and maybe then some.

It was not religion that changed the picture. As Tuchman's book makes clear, religion managed to accommodate itself to any barbarism; dogma is the natural ally of the status quo. Rather, the "can do" spirit grew out of empiricism. Since the Scientific Revolution of the 16th and 17th centuries, and especially with the Enlightenment, a scent of progress was in the air that became in the 19th century the prevailing aroma. Darwin's notion of evolution was more than biology; it was a corollary of the time. The future beckoned, and fixed dogma conceded to an open-ended search for truth.

Mercantile greed, raw political power, and warfare are of course still with us, but they are no longer celebrated as defining dimensions of national identity, as they were in Europe at the time of the American Revolution. For that we can thank the philosophers and scientists of the Enlightenment who dared to question prevailing orthodoxies.

Wednesday, March 05, 2008

Free as a bird

I had a few words to say the other day about the Problem of Evil: If God is all-powerful, and all-good, why does evil exist? That is to say: Why do bad things happen to good people? The problem has a ready solution, which I need not elaborate again. In any case, as an eyelash moon rose orange this morning with Mercury and Venus in a star-spangled sky, the Problem of Evil seemed remote and far away.

But why was I up and on the terrace, instead of snug in bed? Which brings me to another of the Big Problems of philosophy: The Problem of Free Will. If the mind can be reduced to electrochemical states in the brain, as modern science suggests, and the laws of electrochemistry are causal, then in what sense can we be said to be free and responsible for our actions?

Possible solutions:

1) The mind cannot be reduced to electrochemistry. The mind is nonmaterial, supernatural, and potentially immortal, not subject to the laws of physics and chemistry.

2) The mind is reducible to physical states in the brain, but those states are subject to quantum indeterminacy -- a la Roger Penrose. An illusion of freedom arises from stochastic randomness.

3) The mind is reducible to physical states in the brain, but the brain is so complex as to be in practice indeterminate, only to be understood -- in so far as that is possible -- by chaos theory. Like the weather, the elements of the system are causal, but the causal connections are so multitudinous and slippery as to render the outcome of any sequence of mental events entirely unpredictable.

The first solution requires a leap of faith, unsupported by even a shred of empirical evidence. It is, of course, the solution of those who long for immortality.

The second solution also has an inadequate empirical basis. Somehow, for a "free" action to occur, the supposed quantum effects would have to cohere across large areas of the brain. But large-scale quantum coherence, as we presently understand it, only occurs at extremely low temperatures, near absolute zero. The brain would seem to be too warm for this to happen. And besides, is an illusion of freedom arising from randomness any more attractive than an illusion of freedom that is causally determined?

Which leaves us with the third possibility, an illusion of freedom that emerges from complexity, the astounding complexity of a brain in interaction with its environment. This is not what philosophers traditionally meant by free will, but -- get this -- it is indistinguishable from what philosophers traditionally meant by free will. If it walks like a duck, and quacks like a duck, it's a duck.

What then of responsibility? Is "my neurons made me do it" an adequate defense in a court of law? Responsibility is not a scientific question, but rather a social construct. Free will may be a walking, quacking duck, but humans have discovered that civil society requires that we keep our ducks in a row.

Tuesday, March 04, 2008

In place of belief

A lightning storm the other night, slipping down from Florida. For an hour we sat on the terrace and watched the northern sky blaze with pyrotechnics, the best we'd ever seen, almost continuous rivers of electricity streaming between the clouds, clouds heaped like mountains and lit from within. After some particularly sky-filling explosions we could not resist applause. From horizon to horizon, from Andros to San Salvador the streamers ran, trailing their rumbles of far-off thunder. So much energy! Energy entirely beyond the human capacity to control, beyond the human capacity to comprehend.

There is a delicacy in nature -- the hummingbird at the feeder, the gecko's flicking tongue, the textured sand after a shower. There is a power too -- awesome and ominous, booming and cascading across the night. I think of lines from a poem of Grace Schulman, a poem called In Place of Belief:
               ...I would eavesdrop, spy,
and keep watch on the chance, however slight,
that the unseen might dazzle into sight.

Monday, March 03, 2008

The problem of evil

It would take a better informed scholar than me to say who first articulated the problem of evil, which can be stated: If God is all-powerful, he can prevent evil; if he is all-good, he would not want evil; but evil manifestly exists.

Perhaps the first was Epicurus, who concluded that therefore God does not exist. It is a simple and elegant resolution of the paradox, but of course unsatisfactory for the theist.

And so theistic philosophers and theologians have been wrestling with the problem ever since, without resolution. A zillion words have been written on the subject, to no avail. The Manicheans, at least, had a reasonable solution: God is not all-powerful (he is opposed by an equally powerful force for evil). Most contemporary theists assume that God allows evil because human freedom is a greater good (no freedom without choice), which does not quite explain why bad things happen to good people. Others assume that God has his own inscrutable reasons for allowing evil, which begs the question. Which brings us back to Epicurus.

The 17th-century philosopher Pierre Bayle had reason enough from his own unhappy experience to appreciate the problem of evil. He was a French Protestant (by birth) caught up in the intolerance that followed the revocation of the Edict of Nantes. He wrote: "Man is wicked and unhappy; everywhere prisons, hospitals, gibbets and beggars; history, properly speaking, is nothing but a collection of crimes and misfortunes of mankind." He struggled mightily with the problem, and ended up, it would seem, somewhere close to the modern skeptic's doubt. Religious naturalists can claim him -- with Epicurus -- as a predecessor, even though his muddled and often contradictory views on religion and God have puzzled historians of philosophy. Maybe the muddle is why we like him. He certainly tried to understand all sides of every issue, and was (more-or-less) a champion of individual conscience and religious tolerance. Hume seems to have been influenced by Bayle, and did a better job at making skeptical naturalism respectable.

Religious naturalists, like all humans, are afflicted by evil and seek to ameliorate its effects by the conscious application of the Golden Rule. But we don't have a Problem of Evil to explain away. Nature is what it is, good and bad, order and chaos. Death and tribulation are a driving engine of complexification and, ironically, the evolution of moral consciousness. Most significantly, religious naturalists do not pretend to perfect knowledge, and certainly not to knowing enough of ultimate reality to suppose the existence of a personal God who is all-powerful and all-good. We are inclined to believe with Pierre Bayle and Epicurus that those popular religious beliefs that give rise to the Problem of Evil are based more on insufficiently critical credulity than on reason or reality.

Sunday, March 02, 2008

Unnatural wonders

Who dug the Grand Canyon? See this week's Musing.

Click to enlarge Anne's Sunday illumination.

Saturday, March 01, 2008

Seasoning the year

March comes in like a lion, goes out like a lamb. We all know what that means. Sometime during the month -- in the northern hemisphere -- winter slips away and we catch the first whisperings of spring; the month begins with a roar and ends with a bleat. Guy Ottewell, that inestimable connoisseur of stars, suggests another source for the phrase: In the evening sky of March, Aries sets in the west as Leo approaches the zenith.

There was a time, I suppose, when everyone carried in her head a mental map of the stars. Every human culture we know of had constellations -- arbitrary groupings of stars made recognizable by association with familiar objects, lions and lambs, for instance. Those of us who derive from European culture take our constellations mostly from the civilizations of the Middle East. When Europeans first ventured into southern seas they mapped the stars as well as continents, giving us such constellations as the Microscope, the Telescope, the Clock and the Furnace. Such a shame that they didn't bring home the constellations of the indigenous peoples of southern climes.

I learned my constellations from my father, who informed himself with A Primer for Star-Gazers by Henry Neely (the book is now long out of print, but my father's copy is still in my possession). I've tried to keep the tradition going with my own children and grandchildren, but it is probably doomed to extinction. Artificial light makes the stars increasingly unapproachable, and the proliferation of indoor electronic amusements makes learning constellations seem an oddly quaint pastime.

Still, whether we watch or no, the Lamb goes tripping westward, following winter into the wings, and the Lion -- with a roar -- takes center stage.