Wednesday, January 31, 2007

A few more words on silence

There is an irony in the fact that as I lose my hearing I am more drawn to silence. Perhaps it is because only in silence do sounds have meaning. That is to say, only in silence can we attend to sound.

Our ears are generally so immersed in cacophony that individual sounds -- the flitter of the bananaquit in the torchwood tree -- are lost in a sea of meaningless decibels. And what of all those folks I see who go about their business with earbuds pumping sound into their ears? Do they hear the song of the mockingbird or the slap of the waves on the shore? Music is a way of giving silence shape. When music never ceases it is not music but Muzak.

I wonder too about the sounds I don't hear because of the limitations of human hearing -- frequencies or volumes too high or too low to be audible. Like our other senses, our sense of hearing is a narrow window on the world. If we have access to such a narrow spectrum of potential sensations, imagine how little we understand of whatever is ultimate and eternal.

"All profound things and emotions of things are preceded and attended by Silence," wrote Herman Melville. He believed that silence is the only voice of God. When I was in Istanbul this past spring, I was rather put off by the loudspeakers that blared the voice of God four times a day from minarets all over the city, calling the faithful to prayer. I felt rather nearer to whatever is ultimate in the beautiful silent spaces of Hagia Sophia and the Blue Mosque. The God I seek hides in the creation and whispers sweet nothings, as lover's do -- the inaudible whirr of the hummingbird's wings, perhaps.

Tuesday, January 30, 2007


Richard Dawkins and Sam Harris are, to my mind, a healthy public counterpoint to the intolerant and self-certain stridency that increasingly characterizes Christian and Islamic fundamentalism. Of course, their militantly atheistic books also challenge traditional, so-called "mainstream," believers.

If I were a believer and wanted to provide a response to Dawkins and Harris, I wouldn't roll out Francis Collins, Owen Gingerich or other theistic scientists who wear their faith on their sleeve. As a good lapsed Catholic, I would recommend Mary Gordon, Walker Percy, Sigrid Undset, Georges Bernanos, Evelyn Waugh, Flannery O'Connor, Andre Dubus, Graham Greene, Shusaku Endo (Silence), and the many other believing writers who treat of matters of the spirit and who rank with the best artists of the modern age. Their God is seen through a glass darkly, and comes bearing gifts of doubt and ambiguity.

I haven't learned much about the spiritual quest by reading true believers or true disbelievers. Give me instead the theism or the atheism of the pilgrim who has found or lost her God by walking through the dark valley, who hardly dares to speak his name for fear that he will disappear at the sound of her voice. If I were going to have a God, I would want him (her? it?) to come in silence, to hide in shadows, and whisper his revelations in a voice too faint to be clearly heard.

It is appropriate, I think, to praise the creation, to make a joyful noise in thanksgiving for the sensate world. But praising the Creator is another thing altogether. When we make a big racket on his behalf we are more than likely addressing an idol in our own image.

The essayist Pico Iyer says, "Silence is the tribute that we pay to holiness; we slip off words when we enter a sacred place, just as we slip off shoes."

Monday, January 29, 2007

Incredible shrinking technology

In 1965, Gordon Moore, cofounder of chipmaker Intel, predicted that the number of transistors that can be fabricated on a chip at minimum cost would double every 24 months. Last week, Intel announced a chip with transistors 45 nanometers wide -- and so, more than 40 years later, Moore's Law remains intact.

Forty-five nanometers! A working electronic device 10,000 times smaller than the period at the end of this sentence. And getting cheaper all the time.
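As a rough check on the arithmetic: a doubling every 24 months from 1965 to 2007 amounts to about 21 doublings. Here is a minimal sketch of that compounding (the 1965 starting count is a hypothetical round number, not Intel's actual figure):

```python
# Moore's Law as simple arithmetic: one doubling every 24 months.
transistors_1965 = 64            # hypothetical starting count for a 1965 chip
years = 2007 - 1965              # span from Moore's prediction to the 45 nm chip
doublings = years * 12 // 24     # 21 doublings in 42 years
transistors_2007 = transistors_1965 * 2 ** doublings
print(f"{doublings} doublings -> a factor of {2 ** doublings:,}")
# 21 doublings -> a factor of 2,097,152
```

Twenty-one doublings is a factor of about two million in transistor count -- which is how electronics went from Coke-bottle tubes to 45-nanometer transistors within a single collector's lifetime.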

I once visited a guy in a suburb of Boston who collected radios. He had them all. A huge Stromberg-Carlson with a dozen electronic tubes the size of Coke bottles, glowing like bonfires. Zeniths as large as breadboxes (remember breadboxes?) with salt-shaker-sized tubes. From the 1950s, little Sears Silvertones like the one I got on my 14th birthday, with minitubes no bigger than my little finger. Then, along came transistors, and radios shrank to the size of a deck of cards. This guy had 'em all -- a big double garage full of shrinking electronics.

Integrated circuits appeared in the 60s, and if it weren't for the necessity of dials and speakers, radios could have become the size of this letter o. But forget radios; it was the dawn of the Age of Computers.

Centimeters. Millimeters. Micrometers. Nanometers.

Will nano be the ultimate in miniaturization? At some point we'll run up against the size of atoms themselves. But what's to keep chipmakers from moving into three dimensions? We'll see how long Moore's Law continues to hold.

By the way, yesterday, January 28, as I was writing this, I went to Wikipedia to check the date of Moore's prediction. The article on Moore's Law already included Intel's January 27 announcement of the 45-nanometer chip. How's that for an up-to-date encyclopedia? Eat your heart out, Britannica!

Sunday, January 28, 2007

Empiricism and superstition

This week's Musing is an excerpt from my just-published novel Valentine: A Love Story. In that book, I describe a time in the classical world when a scientific revolution was struggling to be born. It was too soon. The circumstances were not yet right. The forces of supernaturalism were still too strong. In another time (the 16th and 17th centuries), and another place (Europe), the seeds that were planted in Pergamon, Alexandria, and other centers of ancient learning would finally come to fruition.

Click on Anne's Sunday offering to enlarge. For those who have come recently to Musings, Anne is my sister, who communes with her own computer in a sweet sun-powered adobe house on a secret mesa in the Southwest.

Saturday, January 27, 2007


Which means, according to text messagers, "It could be worse (but) I don't think so."

It seems a Finn has published a 332-page novel written entirely in text messaging abbreviations. Leave it to the Finns, who essentially invented mobile phone culture.

Well, that's one novel I will never read. At my age, I might as well try to learn Finnish as the language of TXT MSG.

I've had a mobile (as they call them in Europe) for three years now, and I've yet to have an incoming call. Not surprising, since no one has my number, and even if they did the phone is never on. I had an instant message once, but it was in English, from another IM illiterate.

The mobile is a handy thing, which I take with me when I travel. I have a cheapo calling plan that gives me 100 minutes for 90 days. I've never used more than 10.

I know a cultural revolution is passing me by, and IMHO a rather significant one. It's like we are all being raptured out of our bodies into the ether. In the new dispensation, protoplasm is less important than pixels. Me, I'll stay in the world where we still go eye-to-eye, hold hands, mush lips, and make babies in double beds.

A hundred years ago, it was the telephone, which Scientific American magazine then saw as "nothing less than a new organization of society -- a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications, of needless goings to and fro." Another pundit of that time proclaimed an "epoch of neighborship without propinquity." Or, as I suppose we'd say now, ILY W/O F2F.

Propinquity survived the telephone. Propinquity will doubtless survive TXT MSG. There are some things we can't get via that tiny scrolling screen. A decent haircut? A dozen red roses? A coronary bypass?


Friday, January 26, 2007

A taste of madeleine

My wife and I sat at the dinner table last evening trolling through memories of almost 50 years together. We were astonished at some of the trivial stuff we remembered, but aware too of whole epochs of experience that have evaporated into forgetfulness.

"Too much life, not enough disk," said my wife. She suggested that the human brain evolved when an average lifetime was a only few decades. Now that we live two or three times longer, we just don't have the gigabytes to store it all.

There may be something to that, since remembering past experiences -- crocodiles in the river -- can clearly have survival value. But it's hard to see how natural selection would work to keep every little thing in the archives. Just before my mother died at age 92, she could still recite long poems by Longfellow, Whittier, Riley, Lowell, and the rest. Not much Darwinian advantage there, but it gave her considerable pleasure. Amazing that all those musty poems were somehow squirreled away in a tangle of her neuronal synapses.

The human brain contains 100 billion neurons, and each neuron is in contact with a thousand others, more or less. If we think of each connection as being "on" or "off" (a crude simplification), then we can say that the human brain stores roughly 5,000 gigabytes of information (the equivalent of 5,000 billion keyboard characters). I'm not sure I did the calculation right, but that's more than enough capacity to store every poem you ever learned plus Proust's Remembrance of Things Past.
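For what it's worth, the back-of-envelope sum can be redone under the same crude assumptions -- one "on/off" bit per connection, nothing here is neuroscience, just arithmetic:

```python
# Crude storage estimate: one bit per neuronal connection.
neurons = 100e9                  # 100 billion neurons
connections = 1_000              # each neuron in contact with ~1,000 others
bits = neurons * connections     # one "on/off" bit per connection
gigabytes = bits / 8 / 1e9       # 8 bits per byte, 10^9 bytes per gigabyte
print(f"about {gigabytes:,.0f} GB")  # about 12,500 GB
```

Counted this way the answer lands a bit north of 5,000 gigabytes, but the point survives either way: more than enough room for every poem you ever learned, plus all of Proust.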

Still, memories slip away. And, when the power goes off, they are lost forever. Which is why we resort to diaries, scrapbooks, photo albums, souvenir collections. More publicly, we have memoirs, autobiographies, homepages, blogs. The internet has become willy-nilly the collective memory of our species. How vulnerable are our wikiselves to evaporation? I have piles and piles of floppy disks around the house that will never be read again.

There was one more thing my wife said last evening that I wanted to add. Now let's see, what was it? It's just on the tip of my tongue...

Thursday, January 25, 2007

The man who loved islands

There is a story by D. H. Lawrence about a man who buys an island in order to escape the pandemonium of city life.

The island has a manor house, cottages, tenants, animals and gardens. Maintenance and improvements begin to deplete the man's fortune. Even so small a society makes emotional demands. At last, the man sells his island and moves to a smaller one with only a modest house and a few servants.

He soon becomes snared in the small pandemoniums of his reduced circumstances. Again, life becomes more complicated than he can bear. Seeking still greater simplicity, he moves to a craggy rock in the sea, with only a hut, a few sheep and a cat.

He sells the sheep. The cat wanders off. Winter comes and snow blankets the tiny island in featureless white. Alone, neither happy nor unhappy, the man at last achieves the perfect simplicity of -- death.

I came here to this Bahamian island a dozen years ago looking for a life with less hustle and bustle than New England. And, indeed, for a while I found it. Just me and the wife and a few geckoes. Sun and sea. And a creaky internet connection to the outside world.

But the island changes. A lot of noise next door as new houses go in where before there was only a derelict shack. So, thanks to the generosity of a friend, I have been taking my laptop in the mornings to a tiny hut on a tiny island reached only by a footbridge. A chair, a table, no electricity, no plumbing, just the lap of the sea at the shore. The computer's battery lasts for only a few hours, but, never mind, that's about as long as my brain lasts anyway.

Maybe a place like this. A hut, a few sheep, a cat. . .

No way. I need civilization to charge my batteries -- the computer's and my brain's. A flush toilet, yes, that too. My wife would refuse to forgo her internet connection to the New York Times. Still, as I sit in my borrowed writing hut I remember Lawrence's man who loved islands and just for a moment I think...

Wednesday, January 24, 2007

Probing the soul (a reprise)

More than three centuries ago, Pascal said: "Man considering himself is the great prodigy of nature. For he cannot conceive what his body is, even less what his spirit is, and least of all how body can be united with spirit."

Pascal lived at the dawn of the scientific era, but his words still ring true. We have sent spacecraft to the planets. We have listened to signals from the dawn of time. We have unraveled the mystery of starlight. We can even conceive what the body is. But the deeper human mystery remains: What is the spirit, and how is it united with body?

There is a sense among neuroscientists, psychologists and artificial intelligence researchers that the riddle is ripe for solution. Powerful new imaging technologies make it possible to probe the living brain -- watch the orchestra play, as it were, even as we listen to the music of thought. More powerful generations of computers provide analytical tools to model the complexity of neural circuits. Subtle refinements of molecular biology and chemistry let us fiddle with the machinery of the soul.

A philosopher colleague of mine worries about the experimental manipulation of consciousness. As we learn more about the brain's chemistry, he foresees increasing reliance upon drugs to control our mental lives -- a pill for this, a pill for that. "Increasingly, there's no room for us to talk to one another about our lives," he says. "No room for our histories, our stories, our art; no room for ourselves."

The self has become another object to be investigated, analyzed and manipulated, he says, nothing more than a flickering image on a brain scan monitor as electrochemical activity flares up, dies down, perhaps under chemical control. "Science is squeezing us to spiritual death," he groans, with the deflated spirit of an unreconstructed romantic.

Of course, all knowledge holds potential for abuse. But my colleague's pessimism is unwarranted. As Pascal said, "Man considering himself is the great prodigy of nature." The discovery that our spirits are inextricably linked to electrochemical processes in no way diminishes our true selves. We still have histories, tell stories, make art. We love, we cry, we respond with awe to the marvelous machinery of cognition. And, when necessary, we arm ourselves chemically against the devils of mental illness.

Many of us seem to believe that anything we can understand cannot be worth much, and therefore -- most especially -- we resist the scientific understanding of self. But the ability to know is the measure of our human uniqueness, the thing that distinguishes us from the other animals.

Understanding the machinery of the spirit does not mean that we will ever encompass with our science the rich detail of an individual human life, or the infinitude of ways by which a human brain interacts with the world. Science is a map of the world; it is not the world itself.

We can all agree with the Greek philosopher Heraclitus, who thousands of years ago wrote: "You could not discover the limits of soul, not even if you traveled down every road. Such is the depth of its form."

(This is a revision and extension of a post of several years ago.)

Tuesday, January 23, 2007

On being good

Several years ago, I attended a seminar on the foundations of ethical systems. The participants quoted Plato, Jesus, Heidegger, and a host of other authorities; they trotted out every philosophical and theological reason why we can or should be good. Of course, prominent among the arguments was that old canard: Without the promise of eternal salvation or the threat of damnation, we would all be scoundrels.

No one mentioned that we are first of all biological creatures with an evolutionary history, and that altruism, aggression, fidelity, promiscuity, nurturing and violence might be part of our animal natures.

I looked around the auditorium and saw folks of every religious and philosophical persuasion, and of many cultural and ethnic backgrounds, and I thought, "Gee, I'd trust any one of these folks not to take my wallet in a dark alley." Sure, humans are capable of great evil, but most of us are pretty good most of the time, and I suspect that it has more to do with where we have been as a biological species than with where we hope to be going in some airy-fairy afterlife.

We are animals who have evolved the capacity to cherish our fellow humans and to resist for the common good our innate tendencies to aggression and selfishness, not because we have been plucked out of our animal selves by some sky hook from above, but because we have been nudged into reflective consciousness by evolution. When it comes to living in a civilized way on a crowded planet, I choose to put my faith in the long leash of the genes rather than fear of hellfire or the chance to walk on streets of gold.

Monday, January 22, 2007


We hear much talk from environmentalists about sustainability. Generally what they mean is stasis. Keeping things the same.

But if we look at the 3.5 billion years of life on Earth we see process, not stasis. Change, not constancy. Sometimes change so dramatic as to wipe out whole ecosystems.

Human intelligence is a result of the one thing that has remained constant over the long haul: incremental adaptation. Natural selection. Prokaryotes moved over to make room for eukaryotes. Reptiles gave place to mammals. Marsupial mammals yielded center stage to placentals. Like it or not, technologically-equipped human beings now inherit the Earth.

Trying to keep things the same is a dead-end strategy. What we can take from the long roll of evolution is an organic metaphor to replace the industrial metaphor of conspicuous consumption. In the human cultural sense this means: Decentralization. The unimpeded exchange of ideas and information. Distributed processing. Individuals and communities plugged independently into the Sun.

The future is not going to look anything like the past. The anarchic internet is a more organic instrument for cultural evolution than were the turbine and the bulldozer. If I can be just a little bit flip -- maximize individual freedom, build in as many feedback loops as possible, then get out of the way.

Sunday, January 21, 2007

River out of Eden

Two million African children die each year of malaria. Malaria might be wiped from the Earth at a cost of a couple of dollars for each citizen of the rich nations. See this week's Musing.

Click to enlarge Anne's Sunday pic.

Saturday, January 20, 2007

Our every-other-month commercial

Valentine is available on Amazon, just in time for Valentine's Day, with a first rather lovely customer review. Do your sweetie a favor and order him/her a copy today.

Meanwhile, the Library Journal has selected Walking Zero: Discovering Cosmic Space and Time Along the Prime Meridian as one of the Best Sci-Tech books of 2006. Buy one for yourself so you'll have something to read while your valentine is reading Val.

The gap -- Part 2

Erwin Chargaff was one of the great biochemists of the 20th century. He is best known for his demonstration in the late 1940s that certain chemical components of DNA molecules always occur in constant ratios, a result that was crucial to the discovery of the DNA double helix by James Watson and Francis Crick. He was among the first to recognize that the chemical composition of DNA is species specific, another step on the way to elucidating the structure of the human genome.

He died in 2002 at age 97.

In his autobiography, Heraclitean Fire, Chargaff says of his life: "In the Sistine Chapel, where Michelangelo depicts the creation of man, God's finger and that of Adam are separated by a short space. That distance I called eternity; and there, I felt, I was sent to travel."

Chargaff spent his childhood in Austria, in what seemed to him the last golden rays of a more civilized era. He was watching the younger sons of Kaiser Wilhelm II play tennis when news came of the assassination of the Austrian Archduke Franz Ferdinand, an event that plunged all of Europe into darkness. He spent the years between the wars in Vienna, where he took his degrees. Torn between science and the study of literature, he drifted into chemistry, as later he drifted into biochemistry. He was forced to leave Europe by the rise of the Nazis. Again darkness descended. His mother was deported from Vienna into the oblivion of the death camps.

Chargaff was aware at every moment of his life of the immensity of the darkness. As a scientist, he helped make the darkness light. Still, near the end of his life, he was struck by how much we know and how little we understand, and fearful that science was coming dangerously close to bridging the gap between God's finger and the finger of man. "A balance that does not tremble cannot weigh," wrote Chargaff; "A man who does not tremble cannot live."

There is little reason to fear, I think, that science will ever bridge the gap between knowing and unknowing. Our knowledge is finite, and -- as Chargaff suggests -- the gap is eternal. Reason enough to tremble.

Friday, January 19, 2007

The gap -- Part 1

Some years ago, the influential journal Nature used on its cover the well-known detail from Michelangelo's Sistine Chapel frescoes: the almost touching fingertips of God and Adam. The cover story was the sequencing of the first human chromosome -- a complete transcription of the chemical units (nucleotides) making up the chromosomal DNA.

In a subsequent letter to the editor of Nature, two biologists took issue with the choice of illustration. "Does the elucidation of the human nucleotide sequence provide us with insights into the work of the Christian God at the creation event?" they asked. What do Christian religious symbols have to do with science?

The editor responded that the journal's staff had debated the use of the Michelangelo detail, but decided that the image "combined iconic symbolism with the science without implying that the Bible is true or that evolution is not the key to making sense of biology."

I mentioned this in a Globe column at the time, and expressed the opinion that the journal's use of Michelangelo's art was appropriate. Readers of Nature are not likely to take Michelangelo's iconic image literally, nor imagine that the editors are endorsing Genesis.

The image of Adam stretching out his arm to receive from God the spark of soul is one of the most recognizable and powerfully moving images from all of art. It would be a shame if we were to abandon our cultural heritage because parts of that heritage have been rendered un-literal by progress in science.

The image on the ceiling of the Sistine Chapel does not belong only to Christian theists; it belongs to all of us, and it retains its significance even in the 21st century. The art director at my publisher used the image on the jacket of Skeptics and True Believers. Except for the fact that the image is a cliche, I had no objection.

And besides, it is not Adam or God that is the attraction of Michelangelo's painting. It is the gap between their fingers. Michelangelo could have had God touching Adam's finger. He did not. And all these centuries later, it is the gap that draws us to the painting again and again, and compels our fascination. Although both Adam and his gray-bearded Creator have lost their literal significance, the gap between their fingers -- between the human mind and the unnamable, unknown agency that creates and sustains the universe -- remains as real and as important as ever, even to the most unmystical and atheistic scientist.

Tomorrow: Chargaff and the gap.

Thursday, January 18, 2007

Biophilia -- Part 2

The dark hours of the night here belong to the ants. Tiny sand-colored ants, no larger than a grain of fine sand. They spread out across the floors and countertops looking for crumbs of food or dead insects. Successful scouts carry the intelligence back to the nest and soon a horde of scavengers is marching to the quarry.

When I turned on the light in the kitchen this morning at six AM, one army of ants was trying to carry home the carcass of a roach. Not a big roach; one of those brownish things about a centimeter long common to the tropics. The problem was to lift the carcass up eight inches of smooth vertical tile to the crack between screen and window jamb where the ants had entered. Never mind that the roach would never fit through the crack; that bit of foresight is beyond their ken.

By the time I had finished my coffee the scurrying throng had risen and fallen back a dozen times, but still they endured. In all of that formicarian frenzy there was not a single mind sharp enough to say, "OK, fellows, this isn't going to work." A few millimeters upward. Fall back. A few millimeters upward. Fall back.

When I returned from my beach walk they were still at it, although with less focus and energy. An hour later, the flurry had subsided to an exhausted milling about. By midmorning the dead roach was abandoned.

I have written about this scenario before. I never tire of watching it. And the ants never learn from their experience. Their tiny brains are not without resources, but there is no such thing as cultural learning. Whatever increase in foresight they muster comes with the infinitely slow refinements of natural selection. In the meantime, they will go on repeating the same melancholy drama of bug and crack for as long as I am here to observe it.

And I am here, and I observe. I'm not averse to killing ants; sometimes I mop up whole armies with a damp sponge. But I never have the heart to interfere with their attempts to carry home some morsel that will never make it through the available exit. There is something sad, and brave, and hopeful about the drama, as if the ants are reenacting for my benefit one tiny chapter in the long saga of evolution that led in the fullness of time to -- me.

Wednesday, January 17, 2007

Cognitive dissonance

A paragraph from a recent article in the journal Science (January 5, 2007):
This then is our universe: On the whole, it is spatially flat and 13.7 billion years old, both of which are known to 1 percent precision; it is expanding at a rate of 70 plus/minus 2 km/sec per megaparsec, and the expansion is speeding up; and it is composed of 24 plus/minus 4 percent matter and 76 plus/minus 4 percent dark energy, with 4.2 plus/minus 0.5 percent of the matter in the form of atoms, between 0.1 and 1 percent in the form of neutrinos, and with the bulk of the matter dark and as yet unidentified. Stars...account for less than 1 percent of the total composition. The microwave background temperature has been measured to four significant figures, 2.725 plus/minus 0.001 K, and its tiny variations (about 0.001 percent) across the sky have been mapped with a resolution better than 0.1 degree.
What is most astounding about this paragraph is not the information that it contains, but that it can be written at all. Everything in science is subject to revision, but experimental cosmologists are writing a story of creation that is impressive in its quantitative precision, especially when you consider that we are talking about a universe whose breadth is measured in hundreds of billions of light-years, at least, and which contains hundreds of billions of galaxies that we can photograph. All of those mind-blowing images from the Hubble Space Telescope reveal just the tiny part of the universe that is luminous.
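One way to feel how the quoted numbers hang together: the expansion rate of 70 km/sec per megaparsec implies, by nothing more than unit conversion, a "Hubble time" of about 14 billion years -- right next to the stated age of 13.7. A sketch of that arithmetic:

```python
# Hubble time: 1/H0, converted from km/s per megaparsec into years.
km_per_mpc = 3.086e19            # kilometers in one megaparsec
h0 = 70.0                        # expansion rate from the quoted paragraph
hubble_time_s = km_per_mpc / h0  # extrapolated age at a constant expansion rate
seconds_per_year = 3.156e7
age_gyr = hubble_time_s / seconds_per_year / 1e9
print(f"1/H0 is about {age_gyr:.1f} billion years")
# 1/H0 is about 14.0 billion years
```

The close agreement is partly coincidence -- the expansion rate has not been constant over cosmic history -- but it shows the measurements are internally consistent.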

The new creation story is getting more detailed all the time. Increasingly sophisticated instruments peer back to the beginning of time with ever greater resolution and in every part of the spectrum. Later this year the most powerful particle accelerator on Earth, the Large Hadron Collider on the border between France and Switzerland, an underground colossus whose ring runs 17 miles around, will be up and running, taking physicists to energy levels that approach ever more closely the temperature of the big bang. (Check out the photographs of the Large Hadron Collider in last Sunday's New York Times Magazine.)

Yet, very few people on the planet have even the remotest idea what any of this means. Half of Americans, for example, profess to live in a universe that is coeval with human beings, less than 10,000 years old. For all practical purposes they live in the conceptual world of Dante Alighieri.

There are two typical responses to the universe described in the paragraph from Science. One is to whine "It all makes me feel so insignificant," and retreat into the anthropomorphic faith of our ancestors. The other is to embrace with pride what is surely one of the greatest flights of human intelligence and imagination.

Tuesday, January 16, 2007


I may have mentioned here before my Bahamian neighbor who told me once, "I don't like any creature that has more or less legs than me." She was talking about a boa we saw slithering through the brush.

Yesterday, she had a visitor with too many legs. Her voice on the phone was urgent: "Chet, come!"

And there in a corner of her living room was a rat. Just sitting there. I would swear it was smiling.

I managed to get it into a kitchen trash bin, covered with a framed picture of Jesus. But now what to do?

Harvard biologist Edward O. Wilson talks about something he calls biophilia, "love of life" -- an innate emotional entanglement of human beings with other living organisms. Our brains evolved during the 99 percent of human history during which our ancestors lived as hunter/gatherers in a biologically diverse environment, says Wilson, and a memory of that long experience is hardwired into our emotions. Which is why we like to visit zoos and live in parklike environments similar to the African savannas where our species had its infancy.

A love of life, huh? What about rats? Grinning brown rats?

I knew I had to kill the rat; my neighbor would never forgive me if I let it go, no matter how far from the house. My other neighbors wouldn't be too happy either. I here confess to murder, of a violent, gruesome sort.

Other species are our kin, both biologically and -- in the biophilia sense -- psychologically, says Wilson, and therefore worthy of our affection, respect and conservation. Allowing our violent tendencies to destroy a world in which the brain was assembled over millions of years is a risky step, he says.

Sorry, Ed. Sorry, rat.

Monday, January 15, 2007

Deus absconditus

I am in the wrong place to see Comet McNaught -- "McNaught the Magnificent," as Fred Schaaf calls it. Fred is an old e-mail pal who is one of the most devout skywatchers on the planet; you may know him from Sky & Telescope Magazine or Guy Ottewell's Astronomical Calendar. I get his comet alerts. Which feed my hopeless comet watch.

Here on the Tropic of Cancer the horizon tilts up to hide the comet. Still, every evening at sunset I'm standing on the stoop (with sundowner drink in hand) scanning the western horizon. What do I expect to see? Well, nothing, really. But I look anyway. Maybe a glimpse of the comet's tail stretching up and away from the western glow.

But no comet. The time is not wasted. Some spectacular sunsets. The high silver gleams of airliners making their way from South America to the States, or vice versa, dragging long contrails behind them and catching the rays of a Sun that is below my horizon -- artificial comets of a sort. These are Maxfield Parrish moments, when tropical nature spills out an unworldly palette of colors suffused with hints of cobalt blue and gold.

"What makes the desert beautiful is that somewhere it hides a well," says Antoine de Saint-Exupery's Little Prince to his pilot. For a week, Comet McNaughton has been my hidden well.

Sunday, January 14, 2007

Animal spirit

"You do not have to walk on your knees/ for a hundred miles through the desert, repenting./ You only have to let the soft animal of your body love what it loves," writes Mary Oliver in a poem called Wild Geese. See this week's Musing.

Anne's Sunday offering, which is oddly appropriate to today's Musing. Click to enlarge.

Saturday, January 13, 2007

An animal and proud of it

If there were bumper stickers with that message, some of us might stick them on our cars.

But not many, apparently. The evidence suggests that most humans are embarrassed by their animal natures.

Americans, especially, seem eager to affirm that we are more than the cousins of chimps. In growing numbers, we embrace religious and secular gurus who proffer escape from our animal destinies -- mind over matter, "inner selves," channeling, the Rapture.

In the most extreme manifestations of anti-animal sentiment, we have Branch Davidians and Heaven's Gate cultists waiting to be plucked by God or aliens from this world of flesh and blood into some higher, non-metabolizing existence.

"There is no death." That is the primary message of the preachers and gurus. All you need to do is tithe or buy their books and tapes.

Thousands of years ago, stones, brooks and trees were thought to have immortal spirits. By Renaissance times, in the Western tradition, the souls of non-human objects and creatures had been mostly dispensed with, but humans still clung tenaciously to their own imperishable spirits. The poet John Donne wrote, "I am a little world made cunningly/ of elements and an angelic sprite." His "elements" were admittedly temporary, but his "angelic sprite" would live forever.

The problem with Donne's formula is that four centuries of scientific investigation have revealed not the slightest hint of a sprite that can exist independently of our animal bodies - no vital spirits, no disembodied life force, no angelic souls. Everything scientists have learned about life and consciousness places Homo sapiens squarely and inextricably within the animal kingdom.

We are buds on a flourishing tree of life, sharing branches and trunk with our bestial cousins. We share most of our DNA with other primates, and a lot of our DNA with bugs and barnacles. "We are biological and our souls cannot fly free," writes Harvard biologist E. O. Wilson, summarizing what science has taught us about ourselves. He adds: "This is the essential first hypothesis for any consideration of the human condition."

We are animals, yes, but we are animals who have evolved the capacity to create music, art, poetry, science. We explore the universe, unravel the secrets of the DNA, and stand with awe before the majesty and mystery of creation.

An animal and proud of it.

Friday, January 12, 2007

...and our little life is rounded with a sleep

In the long history of humanity, no thought has been so common as personal immortality. At every time and in every place men and women have assumed they will live forever. Even Neanderthals, it seems, placed flowers in the graves of their dead, presumably to grace the afterlife.

It would be interesting to know who was the first person to accept that death is final. Certainly, empirical learning and awareness of mortality go hand in hand. None of the supposed evidence for an afterlife -- near death experiences, seances, channeling, hauntings, etc. -- holds up to experimental examination. No conclusion of science is more firmly established than that the soul is irretrievably embedded in the flesh and goes out like a light at the moment of death.

One-celled organisms are potentially immortal, in that death is not programmed into their existence. The inevitable death of multicellular organisms was -- from our point of view -- a marvelous breakthrough of natural selection, the mother of diversity and complexification. As the microbiologist Ursula Goodenough says: "It was the invention of death, the invention of germ/soma dichotomy, that made possible the existence of our brains."

We admire people who surrender their lives for a noble cause. What cause is more noble than the continuance of multicellular life in all of its diversity and grandeur? If we are to accept our personal mortality without despair, we need a worldview that emphasizes cosmic wholeness rather than the primacy of self. In Hymn of the Universe, the Jesuit scientist/mystic Teilhard de Chardin writes: "Man has every right to be anxious about his fate so long as he feels himself to be lost and lonely in the midst of the mass of created things. But let him once discover that his fate is bound up with the fate of nature itself, and immediately, joyously, he will begin again his forward march."

Thursday, January 11, 2007

"Mystery in broad daylight" -- Part 4

At the end of yesterday's post I posed the question: Do we choose to live completely and exuberantly in the commonplace, as the Romantics urge, or do we seek to strip away nature's veil, revealing by force (as it were) her hidden secrets?

The former course offers the ravishments of immediate sensation -- sight, hearing, taste, smell, touch -- the soul embodied in flesh. What was it Mary Oliver said in a poem? "You only have to let the soft animal of your body love what it loves."

The latter course is an endless quest for that one, perfect transcendent thing -- the Beatific Vision, the sorcerer's all-powerful incantation (Mickey, beware!), the fundamental laws of nature -- the age-old dream of sharing the knowledge and power of the gods, albeit at mortal risk.

Science began its uninterrupted advance when it found a way through experimentation and mathematical reasoning to harness a hidden source of power that had eluded religion and magic. There is no turning back or stopping that advance -- Romantic protests notwithstanding.

Still, each of us individually makes a choice: lift the goddess's veil, or leave her chastely cloaked. Esoteric knowledge with its attendant risks, or conservation and stasis. Technotopia, or the prelapsarian Garden of Eden.

In his essay The Conservative, Emerson wrote: "Conservatism makes no poetry, breathes no prayer, has no invention; it is all memory. Reform has no gratitude, no prudence, no husbandry." If any institution -- state or church -- is to prosper, it must find a way to balance conservatism and reform, past and future, wisdom and wit. "Each is a good half, but an impossible whole," says Emerson.

A balance of innovation and conservation is at the heart of organic evolution -- that much we have learned by lifting nature's veil. The genes conserve; mutation and selection drive life to ever greater diversity and complexity. Perhaps we can do no better than adopt the creative dynamic of evolution as our own sustaining myth.

Wednesday, January 10, 2007

"Mystery in broad daylight" -- Part 3

Science has proved amazingly adept at lifting the veil of Isis, the covered goddess who represents metaphorically the Heraclitean aphorism, Nature loves to hide. But should we try to lift the veil? Some ancient representations of the partially disrobed goddess show a monstrous apparition -- a female creature with multiple breasts. Goethe and other German Romantics, especially, cautioned against seeking Nature's naked truth, as Pierre Hadot has shown in his seminal book The Veil of Isis. "Is it wise to raise the veil/ Where terror, threatening, dwells?" asks Schiller in a poem. To tear away Nature's veil, especially by the crude violence of experiment, is to put poetry, beauty and happiness at risk.

The theme goes back to the Garden of Eden: Knowledge can be catastrophic. Pandora's box is best left unopened. The dream of reason brings forth monsters.

There can be no doubt that knowledge imposes responsibilities. Einstein's beautiful work on relativity revealed almost preternaturally the secret of starlight in that extraordinary equation E=mc², but it also made possible the nightmare of nuclear weapons. "Enough!" cry our modern Cassandras, such as environmentalist Bill McKibben. Like the German Romantics, they foresee "the end of nature," and in its place a mechanized and terrifying monster.

The only first-rank scientist I know of who has urged restraint in lifting the goddess's veil was the grand old man of DNA research, Erwin Chargaff, who in an essay in Nature (May 21, 1987) warned his colleagues to back away from human embryo experimentation. In words shuddering with indignation, he lashed out at fellow scientists who "stick their clumsy fingers into the incredibly fine web of human fate." "Scientific curiosity is not an unbounded good," he thundered. "Restraint in asking necessary questions is one of the sacrifices that even the scientist ought to be willing to make to human dignity."

Curiosity or restraint? Lift the veil, or shy away from the temple of Isis? Learn the secrets of the gods and share their power, or be content as humble acolytes? Has natural selection spent billions of years contriving human intelligence to say "Enough!," or is it our destiny to transform nature in ways we can't yet imagine?

That the myths of Promethean hubris and humble restraint are so ancient and enduring speaks of their profundity. Each of us approaches the goddess. She lifts her veil, revealing a glimpse of (what we suppose to be) her hidden beauty. Do we blush and turn away? Or do we accept the invitation to a grande passion, an amour fou that promises excitements of literary proportions but risks all that we hold near and dear?

Tuesday, January 09, 2007

"Mystery in broad daylight" -- Part 2

Johann Wolfgang Goethe was a polymath who counted himself a scientist as well as a poet. He developed a theory of colors, wrote on the geography of plants, and so on. But he was not part of the mainstream of 19th-century science. He never achieved the enduring influence of a Faraday or a Maxwell. His success was forestalled by his explicit rejection of the Heraclitean maxim: Nature loves to hide.

The goddess has no veils, said Goethe. What you see is what you get. Nothing is hidden. There is a mystery, yes, but it is not concealed behind immediate perception; it is here in "broad daylight," available to anyone with sufficiently acute intuition. "Nature has no mystery," he wrote, "that she does not place fully naked before the eyes of the attentive observer."

Goethe famously took Newton to task for his experiments with light, notably for passing light through a prism and separating it into its component colors. Whatever Newton found thereby, Goethe believed, was not the nature we should seek to know, but rather a broken, shattered thing. As Wordsworth said, "We murder to dissect."

History has passed judgment on Goethe's science. The experimental method of Newton and Faraday has given us the modern world. Nature does hide. A non-experimental observer could attend to nature forever and never discern the electromagnetic spectrum, the quantum periodicities of the elements, or the double-spiral of the DNA.

But neither should the experimental method distract us from the world of the commonplace in which we live our affective lives. Rather, it should add more layers of affective understanding. We properly admire Goethe and Wordsworth for the intensity of their engagement with the natural world. But do we really want to live without knowledge of the galaxies and the DNA?

It is one thing to discern mystery in a starry night or a child's grin. We also encounter mystery in the harmonics of the periodic table and the genomic code. Nature loves to hide. We peel back the goddess's veil and find -- yes, more of the same natural world that excited Goethe's unaided perceptions, but more too -- that coy, come-hither, beckoning tease -- of an apparently inexhaustible mystery that deserves our attention, thanksgiving, reverence, praise.

Monday, January 08, 2007

"Mystery in broad daylight" -- Part 1

Nature loves to hide, said Heraclitus, giving expression to an intuition that has been at the heart of the human condition since, presumably, long before Heraclitus. No one, that I know of, has satisfactorily explained why we have so long assumed that behind the world of immediate sense perceptions there is another ideal world, less various, more unitary, and able to be controlled by some appropriate exercise of prayer, magic, or -- in the modern scientific manifestation -- experiment.

Religion, magic, science: All assume a reality behind the commonplace that gives meaning and structure to the world. And thus we have offered incense and sacrifice to the gods, cast spells and incantations, or built, for example, giant magnetically-confined, super-hot plasmas to wrest from hydrogen here on Earth the energy source that nature hides at the Sun's core.

The gods have been dramatically nonforthcoming, given the vast amount of attention and resources that we have expended on their behalf; they smite us with the same afflictions whether we attend their altars or not, and not a shred of non-anecdotal evidence suggests otherwise. Magic was a preferred way of reaching behind nature's veil for countless generations, but it is now universally recognized as a sham, confined with a wink and a nod to the likes of David Copperfield. But the experimental method goes from success to success. We may not yet have cracked -- in a controlled way -- the energy source of the Sun's core, but we will.

There was a time when the world was universally thought to be full of spirits, auras, stellar influences, and intrusions of divine will. All of that sounds superstitious to modern ears, but is it any less astonishing that we believe (know!) that the very space in which we live our lives is resonant with thousands of immaterial vibrations bearing in their various frequencies music, news, telephone conversations, internet access? Who will deny that with the experimental discovery and manipulation of electromagnetic radiation science has tapped into and controlled something fundamental that nature was wont to hide?

But the experimental method has not gone unchallenged. We should pay particular attention to the critique that has gone by the name "romantic reaction," as represented, for example, by the poet Goethe, lest the hubris that comes with experimental success blind us to the blessings of the commonplace, on the one hand, and attentiveness to mystery, on the other.

More tomorrow.

Sunday, January 07, 2007

Expressing the inexpressible

Science is our most powerful tool for generating reliable knowledge of the world. Art lifts us into the realm of the inexpressible. "Just as the world needs both certainty and uncertainty, the world needs questions with answers and questions without answers," writes the physicist/novelist Alan Lightman. See this week's Musing.

You can click to enlarge Anne's weekly gift.

Saturday, January 06, 2007

Such wild love

I mentioned yesterday that the builders next door scraped the land flat and bare before construction, all seven acres. Dunes, ridges, plants and animals. Not a twig left standing.

An early site manager, Terry, was a plant lover, and he rescued some wild orchids which have found sanctuary on our side of the line. As the bulldozer did its dirty work, we briefly had a marvelous influx of refugees: geckoes, frogs, snakes, hummingbirds, bat moths. It was not a population that could sustain itself. They are all gone now.

Presumably when construction is finished, the new owners will spend several more millions of dollars on landscaping. But they will face the same problems we all do: bur grass and love vine, scorpions and spiders. Their solution, I'm guessing, will be massive applications of pesticides, as in other projects of this sort. And so it goes.

The time has passed when we could listen at night to the mournful cries of feral peacocks. The indigenous Bahamian boa is on the way to extinction -- you won't see one on our land again. Yes, I know, my own ecological footprint is larger than it needs to be. But we left our plot pretty much as we found it, and didn't use bulldozers and chemicals against the creatures. My most formidable garden tool is a rusty machete.

Friday, January 05, 2007

Sacramental starlight

In the year 1750, a baby boy was born in Gambia in West Africa. On the eighth day after the birth, as was the custom, the villagers paused from their normal activities to celebrate, with feasting, music and prayer, the naming of the child - Kunta. Kunta Kinte.

That night, the father took his infant son to the edge of the village and completed the naming ritual by holding the child up to face the heavens -- a crescent moon, a sky streaked with stars. The father whispered to the child, in the language of the Mandinka tribe: "Behold -- the only thing greater than yourself."

You may recognize this episode from the first chapter of Alex Haley's family saga, "Roots," a semi-fictional re-creation of seven generations of the author's African-American family.

What a marvelous celebration of the sacredness of a human self and the infinite mystery of the universe -- the microcosm (the infant) and the macrocosm (the spangled heavens). At my own naming ceremony, a few drops of water were sprinkled on my head; Kunta Kinte's head was sprinkled with stars.

I have mentioned here before that a big condominium project of million-dollar holiday houses is going in next door, a project completely out of scale with the modest single-family homes scattered along our beach. I have spoken to the builder suggesting that outdoor lighting be kept to a minimum, and that what light is used be environmentally sensitive. But having seen what they have done to the dunes, the ridges, the flora and fauna -- all scraped flat and bare -- I have little hope they will care about dark skies.

We have lived here happily and securely in a relatively isolated corner of the island for a dozen years without a watt of outdoor illumination, enjoying a view of the universe that all the money in the world couldn't buy. But as the island develops, the orange glow rises up on every side. One more refuge of darkness surrenders to the thickening shell of artificial light that wraps the planet and cuts us off from the universe. Behold, there is nothing greater than ourselves.

Thursday, January 04, 2007

What is

I don't discuss politics on this blog. But a series of graphs in last Sunday's New York Times broke my heart.

The graphs track world opinion of America over the past six years, as measured by the Pew Global Attitudes Project. They show favorable opinion sinking dramatically, particularly among our traditional allies. About the only country where respect has risen is Pakistan, presumably (according to Pew) because of American aid in the aftermath of the 2005 Kashmiri earthquake. (In this regard, I remember a segment on 60 Minutes about volunteers from the New York City emergency services who went to help quake victims. Ordinary Joes. America at its best.)

Anyone who loves America can't help but be heartbroken to see our nation so uniformly disrespected around the world, especially those of us who long (forlornly, to be sure) for the day when all of humanity will recognize itself as one family, brothers and sisters.

I think of what might have been done with the hundreds of billions of dollars we are currently exhausting on a counterproductive military adventure: disaster relief, schools, clinics, safe water, internet connectivity, research on infectious diseases, affordable pharmaceuticals, mosquito nets, collective UN-sanctioned responses to local genocides, and so on.

But most important, I think, is the attitude we project toward others. Americans tend to think those pale blue helmets of UN peacekeepers are for sissies, and the UN a trap where the God-blessed, red-white-and-blue colossus is pulled down and nibbled at by lesser beings. There is way too much feeling that our religion is right and all others are wrong, that our language was the one used by Adam and Eve in the Garden of Eden (which was probably somewhere in Kansas), that Jesus had white skin just like Britney Spears. All of this, I think, is a minority view, but it looms large because of the way politicians pander to our base self-interest. Where are the likes of George Marshall and JFK, who appealed to our generosity and sense of common humanity?

One bright spot is science, where people of all nations, religions, races and ethnicities work together with astonishingly little friction. When the focus is on "what is," not "what we want to be," it is amazing how quickly we learn that we are all one under the skin.

Wednesday, January 03, 2007

The soft animal of your body

Here is a prize-winning photograph by Kevin Raskoff of Monterey Peninsula College, from the December 21 issue of Nature -- a siphonophore, a kind of colonial jellyfish, like a Portuguese man-o'-war, about 30 centimeters long, from under Arctic ice. Those bulbs are propulsive "swimming bells." The flamelike mass at the bottom is a tangle of tentacles and reproductive organs. You can click to enlarge.

A photograph like this requires no comment. My only response is to say with Ursula Goodenough: Hosannah! Not in the highest, but right here, right now, this.

Gorillas missed

In the news (and in comments here) we read about the sad plight of animals under the threat of extinction, most wrenchingly, for me, the polar bear and the gorilla. It is inconceivable that a people who call themselves civilized could allow these magnificent animals or their habitats to disappear.

I grew up in a generation of American boys who idolized Frank "Bring 'Em Back Alive" Buck, the 20th century's most flamboyant live animal dealer. Buck's clients included many of America's zoos, and he supplied them with elephants, tigers, leopards, apes, monkeys, exotic birds, and every other type of beast imaginable. His base was Singapore, and his hunting grounds the tropical forests of Southeast Asia.

Buck's autobiography was in my father's library, and the great white hunter stared fearlessly out from the frontispiece photograph -- handsome, steely-eyed, macho and courageous. This guy could be charged by an enraged rhino or entwined by a 30-foot python and never flinch. A true boy's hero -- of yesteryear.

By the time my own children started reading, there were new zoologist heroes, Joy Adamson, Jane Goodall, Dian Fossey. And the book titles were different too: "Bring 'Em Back Alive" was replaced by "Living Free" and "Gorillas in the Mist." Clearly, a sea change had occurred in public attitudes about animals in captivity and in the wild.

The dashing adventurer of my boyhood was no longer a cultural icon, but a zoological imperialist who never doubted that the wild beasts of Southeast Asia had no better fate than a cage in a Western zoo. Nothing motivates animals more than fear, wrote Buck, and like the other assorted agents of empire who hung about the bar of Singapore's Raffles Hotel, he knew how to use fear to bring his quarry to heel.

But give Frank Buck this: While his friends entertained themselves by blasting away at wild beasts with elephant guns, he brought 'em back alive. Alas, Buck's animals were saved from the trophy room wall only to spend their lives confined in the grim cages of early-20th-century zoos.

The rampant destruction of habitat is a far greater threat to the survival of wild animals than Frank Buck ever was. It is too simple to blame the developing nations; they labor under economic pressures that are largely dictated in the West. As long as wealth generation and conspicuous consumption count for more than zoological diversity, polar bears, gorillas, leopards, pygmy hippos, and all the rest face a bleak future.

Tuesday, January 02, 2007

One more swing around the Sun

We watch aghast as people around the world blow up themselves and others for grand sophistries -- religious, political, racial -- idees fixes that have no more permanent substance than a sand castle but nevertheless so concentrate the mind as to allow the self-certain pulling of a trigger upon innocence and beauty.

Never trust a truth, I say, that can be shouted as a slogan. Never yield your moral sense to any man who thumps a holy book. Beware Capital Letters.

Trust instead in little truths -- this green gecko that as I write watches from the porch, its dewlap pulsing.

That old agnostic and social progressive Thomas Huxley, as he entered genteel retirement, became more convinced than ever that the little truths -- why the dewlap? why the pulsing? -- were the bricks that by patient accumulation yield a firm foundation for a life. Even as his physical and intellectual vigor declined, he affirmed to a friend that "the cosmos remains always beautiful and profoundly interesting in every corner." Attending to the minute particulars of nature's grandeur, he held, is the only acceptable worship.

At that late stage in his life, Huxley was widely considered an adornment of his age and nation, but not, it seems, by the family's new cook, who walked out upon learning that she was expected to serve in a godless household.

Monday, January 01, 2007

Happy New Year -- Here's to your health

I have just finished reading Ron Chernow's big biography of Alexander Hamilton, American Founding Father, known to the rest of the world, if for no other reason, as the face on the ten dollar bill.

Chernow makes no special mention of it, but one thing that impressed me as I read the book is just how much illness and death were part and parcel of late-18th-century life.

In Hamilton's time, if you didn't have a toothache or gastric distress of one sort or another you could count your lucky stars. Typhoid, yellow fever, smallpox, and a host of other infectious diseases were ever-present threats. Infant mortality and death in childbirth were commonplace. George Washington died of a throat infection contracted while riding in a snowstorm. Even a nick with a razor could be fatal.

The most famous physician in Hamilton's America was Benjamin Rush of Philadelphia. He treated victims of yellow fever with a horrendous regimen of bleedings, purgings and induced vomiting that probably caused more deaths than cures.

Today, we take good health for granted, and complain when we suffer heartburn or a sniffle. Early death is the exception rather than the rule, and debilitating illness is a matter for the resourcefulness of physicians.

Scientific medicine was pretty much an invention of the 19th century. Imagine a world without vaccines and anesthetics, without safe public water and sewage systems. Jump another century ahead and consider the boon of antibiotics.

A little history can be a bracing antidote for any nostalgic longing for "the good old days."