Saturday, September 30, 2006

365 Starry Nights

A long, long time ago, in the late-1970s, when my family was living close to the edge, I said to my spouse one evening, half in jest, "I'm not going to sleep until I figure out a way to make $10,000." At that time, $10,000 seemed a fabulous fortune.

Sometime after midnight, I dreamed up the idea of a book called 365 Starry Nights, which I would write and illustrate, with a little astronomy lesson for every day of the year.

I did up a month's worth of words and sketches -- January -- and sent it off to a dozen publishers. And waited. And waited. And waited.

More than a year passed with not a peep from anyone. Then out of the blue (or the black) an editor at Prentice-Hall said she wanted to publish the book (bless her for that and for much that followed, thank you, Mary). The book is still in print and made considerably more than $10,000.

It did not have an index, and over the years several people undertook to index the book for their own use, most recently Dan Schroeder, a physicist at Weber State University in Utah. With Dan's kind permission, readers of this blog who own the book can access his index here. Thanks, Dan.

365 Starry Nights had two younger siblings in the same illustrated format -- Crust of the Earth: An Armchair Traveler's Guide To the New Geology, and Biography of a Planet: Geology, Astronomy and the Evolution of Life on Earth -- which had, alas, a brief shelf life.

Friday, September 29, 2006

What is science?

I would emphasize consensus: Science is the attempt by skeptical and curious men and women -- let's call them scientists -- to gain consensus knowledge of the world, by trying as best they can to minimize cultural bias (tradition, religion, politics, ethnicity, gender, etc.) and let nature have its say.

To this end, they have devised a number of tools to ply their trade: quantitative observation, mathematical language, peer review, institutionalized doubt, the principle of parsimony, and -- above all -- the willingness to say "I don't know."

It is because scientific knowledge of the world is consensus knowledge that those of us who are scientists only in spirit embrace it with confidence. And, of course, we recognize that modern medicine, sanitation, technology, and wealth creation all stand as monuments to the effectiveness of the scientific way of knowing.

But it is precisely because we recognize the limits of science that we attend, too, to poets, artists, musicians, and all who give creative expression to our intuition -- amply confirmed by the history of science -- that there is more to the world than what presently meets the eye. "What is spiritual about the manifest is not the part that leaves tracks in the snow," says Mary Oliver in one of her aphorisms.

Thursday, September 28, 2006

One book or many?

I found myself the other day outside the Boston Public Library, admiring once again the splendid original building designed by Charles Follen McKim and opened in 1895.

On the Copley Square facade is inscribed in tall letters: "The public library of the city of Boston built by the people and dedicated to the advancement of learning." On the Boylston Street side is emblazoned: "The commonwealth requires the education of the people as the safeguard of order and liberty."

On both facades are engraved many dozens of names from the sciences, arts, statesmanship and religion -- a grand roll call of civilization. Greats from the sciences include Galileo, Newton, Lister, Pasteur, Helmholtz, Faraday, and almost everyone else you can think of. Yes, religion too, not to be neglected by the public-spirited men and women who built this monument to learning and enlightenment. In the neighborhood of the library are many churches distinguished by traditions of Emersonian liberality.

So hopeful, so liberal, was the generous spirit evinced by the embellishments of the building that I was almost moved to tears. How proud to be part of that tradition, even as a spectator.

Wednesday, September 27, 2006

Speaking of snowflakes

Why the six-pointed symmetry?

The first person to ask the question seriously was the astronomer Johannes Kepler in a delightful little book called The Six-Cornered Snowflake, published in 1611. All around him Kepler observed beautiful shapes in nature: six-pointed snowflakes, the hexagonal honeycombs of bees, the twelve-sided shape of pomegranate seeds. Why? he asks. Why does nature display such mathematical perfection?

We might add: Why does the stuff of the universe arrange itself into spiral galaxies, planetary ellipses, double-helix DNA, rhomboid crystals, five-petaled flowers, the rainbow's arc? Why the five-fingered, five-toed, bilateral symmetry of the newborn child? Why?

Kepler struggles with the problem, and along the way he does a pretty neat job explaining why pomegranate seeds have twelve flat sides (squeeze spheres into the smallest volume and that's what you get) and why the bee's honeycomb has six sides (because that's the way to make honey containers with the least amount of wax). His book is a tour-de-force of playful mathematics.
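
As a quick check of the least-wax argument -- my own back-of-the-envelope sketch, not anything in Kepler's book -- one can compare the perimeter of the three regular polygons that tile the plane, each enclosing the same unit area; the hexagon wins:

```python
import math

def perimeter_of_unit_area_ngon(n):
    """Perimeter of a regular n-gon whose area is 1.
    Area = n*s^2 / (4*tan(pi/n)), so s = sqrt(4*tan(pi/n)/n) and P = n*s."""
    side = math.sqrt(4 * math.tan(math.pi / n) / n)
    return n * side

# Only the triangle, square and hexagon tile the plane without gaps.
for n, name in [(3, "triangle"), (4, "square"), (6, "hexagon")]:
    print(f"{name:8s} perimeter for unit area: {perimeter_of_unit_area_ngon(n):.3f}")
```

Less perimeter per cell means less wax per unit of honey, which is the gist of Kepler's explanation. (The full "honeycomb conjecture" -- that no tiling of any shape whatever beats the hexagon -- was only proved by Thomas Hales in 1999.)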

In the end, Kepler admits defeat in understanding the snowflake's six points, but he thinks he knows what's behind it all, behind all of nature's beautiful forms: A universal spirit pervading and shaping everything that exists. He calls it nature's facultas formatrix, or "formative capacity."

We would be inclined to say that Kepler was just giving a fancy name to something he couldn't explain. To the modern mind, facultas formatrix sounds like empty words.

We can do rather better with snowflakes than Kepler. We explain the general hexagonal form by invoking the shape of water molecules, and we explain the shape of water molecules with the laws of quantum physics. But the perfect six-fold symmetry? As a snowflake grows, adding water molecules essentially at random, how does one point know what is going on at another point? On the scale of molecules, the faces of the growing crystal are light-years apart. You will find theories out there -- forced "tiling", exquisitely sensitive vibrations, that sort of thing -- but I haven't seen anything yet that is totally convincing.

And we are probably no closer than Kepler to answering the ultimate questions: What is the reason for the curious connection between nature and mathematics? Why are the mathematical laws of nature one thing rather than another? Why do natural forms exist at all?

Maybe facultas formatrix is as good a name as any to cover our ignorance.

Tuesday, September 26, 2006

Patience and understanding

In Comments recently there was some mention of Snowflake Bentley. Readers of Honey from Stone will know that Bentley has been a hero of mine since I came across the Dover collection of his snowflake microphotographs, hundreds of them, sometime back in the Sixties.

Wilson Bentley was born near Jericho, Vermont, in 1865. His mother was a schoolteacher, and from her he acquired a lively curiosity and a love for nature's minutiae -- drops of water, bits of stone, bird feathers, insects. By the time he was eight years old he had made a collection of every species of fern that grew in Vermont. On his fifteenth birthday, his mother gave him a microscope, and with it he looked at a snowflake. That was it! He spent the rest of his life perfecting the art of snowflake photography. By the time he died half-a-century later, he was known internationally as "the Snowflake Man." On the day after his death, the Burlington Free Press wrote: "He saw something in the snowflakes which other men failed to see, not because they could not see, but because they had not the patience and understanding to look."

If you have a very young child or grandchild, you might want to buy a copy of the Caldecott-Medal-winning Snowflake Bentley, by Jacqueline Briggs Martin -- illustrated by the always wonderful Mary Azarian. Full disclosure: This book was published by Houghton Mifflin, where my daughter Margaret presides over such things. She purchased from Mary -- as a present for me -- the original hand-colored cover illustration of Bentley at work, which now hangs in our dining room.

Monday, September 25, 2006

Faith, reason and the university

By now every cartoonist and columnist on the planet has taken note of Pope Benedict's ill-considered quote regarding Islamic violence -- and the violent Islamic reaction.

No one seems to have noticed that Benedict's remarks on Islam were just a few paragraphs of a long speech that mostly took Western science and philosophy to task for an excessive reliance on reason. Scientists must overcome their "self-imposed limitation of reason to the empirically verifiable," said the pope. It is clear he would hope to restore theology to its former place as Queen of the Sciences, reversing empiricism's hard-won independence from those who claim to possess absolute truths of faith. In particular, the pope rejected any attempt to ground moral behavior in the biological or human sciences. Secular humanism, he implied, is a bust.

The secular humanist response was notable for its lack of violence.

You would have thought that at least a few of us would have stood in St. Peter's Square waving our library cards in protest.

Sunday, September 24, 2006

Wired

A few days ago I was in Powell's bookstore in Portland, Oregon, one of the best, and biggest, independent bookstores in the country. I was reminded of a Globe column I wrote eleven years ago on the publication of computer guru Nicholas Negroponte's manifesto, Being Digital. The demise of p-books was greatly exaggerated. See this week's Musing.

And for your pleasure, Anne's weekly feast of pixels. She rather likes being digital. Click to enlarge.

Saturday, September 23, 2006

Squaring the circle

Here is a little story about the best and worst of Western civilization. It can be summarized in one word: square.

Yep, that's right. Square. The ninety degree corner, the plumb-bobbed line. Forget the squiggle, the wiggle, the curlicue, the arc. We're talking orthogonality here. We're talking perpendicularity.

I was flying across the country the other day at 37,000 feet. From the Mississippi to the Rockies, for a thousand miles, as far as the eye could see, the land was ruled into one-mile squares, as neat as the tiles on your kitchen floor. (As always, click to enlarge.)


It's the Western way, begun by Greek geographers and perfected by the philosopher-scientist Rene Descartes in the 17th century. Cartesian coordinates: that's what we call his method of plotting the world on a rectangular grid.

In 1785 the young Congress, at Thomas Jefferson's request, decreed that public lands west of the Appalachians would be surveyed and sold in squares. Six-mile squares called townships. Each township divided into 36 one-mile squares called sections (usually bordered by roads). Sections divided into four quarters. Bingo! The checkerboard grid I observed from 37,000 feet.

The 1785 survey plan ignored natural contours of the land, the sinuosity of shorelines and watercourses. Congress had one thing in mind: the efficient distribution of land. Esthetics didn't enter into it. The goal was not to embellish nature, but to subdue it.

What I saw from the airplane is a classic example of Cartesian efficiency. One million square miles of geometric utility. A geographic straitjacket of mathematical lines. The rule of the square. We are richer and more powerful for it, of course, richer and more powerful than those cultures that are still "bogged down" in squiggles and curlicues.

The squaring of the circle is the secret of our presumed success -- the best and the worst of Western civilization.

(Thanks, Tom, for doing my posting while I was away.)

Friday, September 22, 2006

On names and understanding

In a 1931 letter to his sister, the celebrated paleontologist George Gaylord Simpson was pondering the ultimate scientific question, how the universe began: "Call that great Unknowable by any name you wish, call it X, or Yahweh, or God, or say that God created it. Applying the letters 'g', 'o', and 'd' to it or what created it is no explanation and no consolation. It is a common failing, even more among scientists than among laymen, to think that naming a thing explains it, or that we know a thing because we can put a name to it. But to say that God created the universe means nothing whatever."

Faced with the (present) mystery of the Big Bang, the empirical naturalist will say "I don't know." Perhaps an explanation will come along, perhaps not, but to say "God did it" adds nothing to our understanding. "If a sign is useless, it is meaningless; that is the point of Ockham's razor," said Wittgenstein.

To name a perceptible thing has some advantage; it makes it possible to talk about it. There can be no theory of the electron, for example, until we have a word for the electron. But naming is not understanding. Before we say we understand a thing, we must weave it into the web of concepts that constitutes a theory. Only when the concept "electron" is enmeshed in a matrix of other ideas -- atoms, fields, valency, molecular bonds, etc. -- by taut, quantitative connections, do we have confidence that we know what an electron is.

Is there a circularity in scientific explanation? Of course. Every explanatory system refers back upon itself. It is the tautness of the web and the way the web makes empirical verification possible that give us confidence that we are doing something right. To say that "God" caused the Big Bang predicts absolutely nothing about what we should see when we turn our telescopes to the most distant universe. The thread of meaning that connects "God" to the earliest universe is infinitely slack.

Thursday, September 21, 2006

An axis of evil

It is only a few stone tools, shaped in a fashion that elsewhere is associated with Neanderthal bones, so take the new discovery with a grain of salt. But the tools appear to be the youngest Neanderthal artifacts yet found, a mere 28,000 years old, maybe even younger. Where? At the tip of the Iberian Peninsula, the Rock of Gibraltar.

For 200,000 years, Neanderthals had Europe pretty much to themselves. Then, about 35,000 years ago a new breed of humans, anatomically identical to ourselves, came sweeping out of Africa. They fanned across Europe, displacing Neanderthals. Did they interbreed? Did they live side by side in peace? Apparently not. Rather, the story of the Neanderthal/Cro-Magnon encounter seems to have been written in blood. It would appear that the last Neanderthals were pushed into caves at Gibraltar, their backs against a strait they had no way of crossing. There they made their last stand. And there they became extinct.

When I was a kid, we read the story as a triumph of modern humans over a grisly, sub-human race, a triumph of reason, imagination, and lofty moral vision over ugliness, stupidity, and amorality. Then, in his 1955 novel The Inheritors, William Golding turned the story of the Neanderthals and Cro-Magnons on its head. Golding's Neanderthals live in a state of childlike innocence, possessed of wonder and imagination. They do not willfully kill other animals. They are sexually restrained, and charmingly uninhibited about their nakedness. Into this Edenlike existence come the violent and cannibalistic Cro-Magnons. The new folk revere a witchdoctor with an antlered mask. They are adulterous and engage in orgies. The gentle Neanderthals are no match for the craftiness and cunning of the new arrivals. Except for a single child, Golding's happy band of Neanderthals are eliminated. The tougher, more adventuresome Cro-Magnons inherit the earth.

Needless to say, the new story is as much a fiction as the old, a sort of hippie anthropology. There is no reason to believe that Neanderthals were less violent than their adversaries. Was it mental capacity, language, and inventiveness that gave Cro-Magnons the advantage? Was it aggressiveness, rapacity, and a shrewd instinct for self-advantage? Or was it simply a more efficient technology of killing?

Wednesday, September 20, 2006

Les matins du monde

"Thirteen billion years ago it was morning in the universe." So began a story in the New York Times about new studies of the earliest galaxies. The Hubble Space Telescope and the giant Subaru and Keck Telescopes on Mauna Kea in Hawaii are turning back the curtain on the birthing room of the first stars -- colossal spheres of primeval hydrogen and helium, hundreds of times as massive as the Sun, that lived and died in violence, forging heavy elements, seeding creation with the stuff of planets and life.

Tous les matins du monde sont sans retour: The mornings of the world are without return. If the astronomers are right in their current calculations of the recession rate of the galaxies, the universe will expand forever, growing ever more dilute, ultimately expiring in cold and dark.

Some years ago we imagined that the universe might be cyclic -- expand, contract, expand, contract, an endless repetition of big bangs tethered by gravity, God's big bolo bat. For the time being, a cyclic universe does not seem to fit the data. Is the universe then a one-shot affair? Who knows? There's another possibility: that this universe is just one of many, perhaps an infinite number, bubbling into existence, blazing brightly, then collapsing upon themselves or stretching themselves infinitely thin.

Every people, at every time, has had creation stories. Our story is the first to be affirmed tentatively, the only one held to the refining fire of empirical observation. We take our story seriously, but we don't stake our lives on it. Unlike every people who lived before us -- and most who are alive today -- we take our meaning from the search, not from a conclusion. We define ourselves as explorers. We welcome mystery as a challenge. We embrace our ignorance as a vessel waiting to be filled.

Tuesday, September 19, 2006

Because they take a little nip/ From ev'ry flower that they sip

"How sweet it is," said Jackie Gleason. Indeed. As I write, I am sitting in a quiet corner of the college Commons munching on a breakfast roll slathered with sugary frosting. Yum!

Hey, I make no apologies. Sugar is a key ingredient of all life on Earth and has been since the beginning.

We don't know where the earliest living organisms came from, but we have a pretty good idea how they made their living.

They took sugar molecules from their environment and broke them apart, rearranging the atoms into smaller molecules of carbon dioxide and alcohol, a process called fermentation. Some of the energy stored in the sugar molecule was released, and that energy fueled the first life on Earth.
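
For the record -- this is the textbook equation, not something spelled out in the post -- the overall reaction is:

```latex
\mathrm{C_6H_{12}O_6} \;\longrightarrow\; 2\,\mathrm{C_2H_5OH} \;+\; 2\,\mathrm{CO_2} \;+\; \text{energy}
```

One molecule of glucose is split into two of ethanol and two of carbon dioxide, with a modest yield of usable energy -- far less than the later invention of respiration would provide, but enough to run a cell.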

The seas were sweet in those days, a kind of dilute Kool-Aid, and the first organisms fed on this sweet elixir.

Where did the sugar come from? Most likely, it was brewed up by plain old non-biological chemistry. The early Earth was crackling with electrical storms and bathed with ultraviolet light from the Sun, and apparently used this energy to synthesize its sweets.

As life exploded exponentially, it was inevitable that the sugar would run out, dooming fermentation. But before this happened, some microbes evolved the ability to make their own sugar, using sunlight. With photosynthesis, life freed itself from scrounging ready-made sugars from the environment.

But we never lost our sweet tooth. Tell me about it.

Monday, September 18, 2006

God of the gaps


I faithfully peruse Science and Nature each week, the two premier science journals. Many of the research reports are over my head, so the tendency is to skip quickly past "In situ structure of the complete Treponema primitia flagellar motor" (Nature, August 31, 2006).

Big mistake. Buried in the tech talk is a thing of astonishing interest: the first detailed representation of a bacterial flagellar motor, the nanomachine that spins the whiplike appendage that propels a bacterium through an aqueous medium.

The darn thing looks exactly like the electric motor that spins your washing machine: a stator and rotor, made of 25 different proteins. It is about a thousand times smaller than the period at the end of this sentence, and rotates at speeds up to 300 cycles per second. Not only that, it can rotate in either direction!

Michael Behe, the creationist author of Darwin's Black Box, used the flagellar motor as a premier example of intelligent design -- and he didn't know the half of it. Now that we have a picture of the gizmo, well, it sure looks designed, and the lazy explainer will jump to that conclusion. Of course, saying "a designer did it" says nothing; it's just a phony way of saying "I don't know." No one yet knows the detailed steps by which natural selection contrived so marvelous a device, but we know of no reason in principle why it could not have happened, and plausible scenarios have been offered by biologists. Now that we do know what a flagellar motor looks like, we can be sure that plucky young graduate students will be working out its antecedents -- a rather more exciting prospect than simply throwing up one's hands and invoking divinity.

In the famous Dover, Pennsylvania, trial challenging the teaching of intelligent design in public school science classes, Eric Rothschild, chief counsel for the plaintiffs, said in his summation: "Thankfully, there are scientists who do search for answers...By contrast, Professor Behe and the entire intelligent design movement are doing nothing to advance scientific or medical knowledge and are telling future generations of scientists, don't bother."

Sunday, September 17, 2006

Fecund ignorance

"To make light of philosophy is to be a true philosopher," wrote Blaise Pascal in one of his more perceptive pensees. One might add as a corrollary that one should always wear one's convictions lightly. Or at least with a sense of humor. See this week's Musing.

Anne's Sunday pic. Click to enlarge.

Saturday, September 16, 2006

Who knows upon what soil they fed their hungry thirsty roots?

There is something immensely satisfying about the cycle of the seasons, that wonderful mix of progress and recurrence that makes of nature a kind of rhymed verse. The stanzas take us forward, the refrain reassures. And here they are again, the Amanita muscarias in the pine grove. I dutifully blog them for the third year in a row. (1)(2)

Mushrooms are the grave robbers of the plant world, the night stalkers, and it is appropriate that they come in autumn's failing radiance to skulk with goblins, witches, incubi and succubi, dancing in fairy circles. There is something darkly sexual about the mushrooms. The phallic stinkhorn. The vulval earthstar. And those wicked little men of the woods, which I have never seen except in foreign handbooks, the crowned earthstars, Geastrum fornicatum, marching in lascivious gangs, with open mouths.

Our ancestors who lived in the dark forests of Northern Europe may have seen the mushrooms as spirits of the dead in macabre resurrection. Appearing overnight, in spooky garb, these Lords of the Flies evoked, somehow, mysteriously, thoughts of malevolence and lust. We have inherited from that time a roster of names -- destroying angel, fairy helmet, jack-o'-lantern, death cap, witch's butter -- that invest mushrooms with an aspect of evil rivaled only by that which we associate with snakes.

Friday, September 15, 2006

No tithing necessary, but buy the book

I have an advance reading copy of Richard Dawkins' latest: The God Delusion. The "world's most prominent atheist" (according to the jacket) has been scooped, of course, by his friends Sam Harris (The End of Faith) and Dan Dennett (Breaking the Spell). But not to worry, Dawkins can hold his own. His book is funnier, more mischievously disrespectful of religion than either of the other two. You know where he's going when right off the bat he quotes Robert Pirsig, the author of Zen and the Art of Motorcycle Maintenance: "When one person suffers from a delusion, it is called insanity. When many people suffer from a delusion it is called Religion." What follows is a rollicking reductio ad absurdum.

But I doubt if these books will dissuade believers from their beliefs. Faith and reason are pretty much antithetical, and religious faith in particular is inoculated against empirical evidence by centuries of uncompromising tradition.

But the very fact that the books can be published by major houses, and find a sufficient audience to make their publication commercially viable, suggests that something is going on -- a small but growing grassroots movement that eschews the supernatural, while reverencing the creation and striving, in a non-dualistic way, for something that might be called spirituality. The UUs, of course, have been at this a long time, and, as I have mentioned here before, I have of late made the acquaintance of communities of Catholic women religious (e.g.) who are more interested in the creation as disclosed by science than in the fine points of traditional dogma. In a sense, Harris, Dennett and Dawkins are Johnnies-come-lately to the project of demystifying the world, and they haven't quite caught the spirit of the broader movement, which is more attuned to quiet attention and celebration than to the secular equivalent of Bible-thumping.

Still, one can only welcome the appearance of these zestfully irreverent books as user manuals for further demystification and burs under the saddles of religious dogmatists of every stripe.

Thursday, September 14, 2006

The war of science and faith

We know the battle has been well and truly joined when the Religion section of Newsweek gives three pages to atheist/agnostics who happen to be scientists or science savvy -- Sam Harris, The End of Faith, Daniel Dennett, Breaking the Spell, and Richard Dawkins, The God Delusion (to be published next month). Scientists have generally been reluctant to make pronouncements on religion, but apparently too much is now at stake for silence. What Newsweek calls a "religious revival" is rather a global relapse into fundamentalism and righteous triumphalism. To Harris, Dennett and Dawkins it is simply mind-boggling that in the 21st century people are still ordering their lives -- and seeking to order their neighbors' lives, sometimes violently -- according to a clutch of mutually inconsistent (and self-inconsistent) books supposedly written by the creator of the universe. It is a simple fact -- no matter how much one invokes politics, economics, etc. -- that almost every instance of collective violence on Earth today is religiously inspired. The so-called "war on terror" is more accurately a war of opposing faiths.

For all of their God-bashing, Harris, Dennett and Dawkins take care to lay out a rational foundation for ethics, based on maximizing the happiness and minimizing the suffering of sentient beings. Their ethics is global, not tribal. They do not demonize actions without victims -- private homosexual acts between consenting adults, for example. They are aware of the ambiguities and complexities of many moral decisions -- collateral damage in a just war, for example -- but reject commandments based on thousand-year-old texts -- the honor killing of female rape victims, for example. All things considered, I would rather live in a society based on the ethical principles of these atheist/agnostics than have Mahmoud Ahmadinejad or Pat Robertson telling me what to do.

Theology, says Harris bluntly, is a branch of ignorance. Are we then left with a grim, heartless existence? His final paragraph: "Man is manifestly not the measure of all things. This universe is shot through with mystery. The very fact of its being, and of our own, is a mystery absolute, and the only miracle worthy of the name. The consciousness that animates us is itself central to this mystery and the ground for any experience we might wish to call 'spiritual.' No myths need be embraced for us to commune with the profundity of our circumstance. No personal God need be worshipped for us to live in awe at the beauty and immensity of creation. No tribal fictions need be rehearsed for us to realize, one fine day, that we do, in fact, love our neighbors, that our happiness is inextricable from their own, and that our interdependence demands that people everywhere be given the opportunity to flourish."

More on Dawkins tomorrow.

Wednesday, September 13, 2006

Impatiens

The jewelweed pods are ripe along the path. I look for the fat ones and cannot resist touching them lightly with my fingertip. Kapow! The pods curl back like slingshots, hurling their seeds. Amen, I say, amen. The tiny particles of sea-green hope are prayers of sorts. Explosive dehiscence it's called. A fine name for a surprising property of plants.

The jewelweed pods fling their contents faster than the eye can follow. Witch hazel is another of our New England plants that has learned to pitch its seeds. To walk through a witch hazel thicket in late fall is to enter a no-man's-land raked by machine-gun fire. Apparently there are lots of plants worldwide that dehiss with a bang (I made up that word). The Venus flytrap closes its voracious jaws in milliseconds.

One doesn't expect such quickness from plants. The whole point of roots, after all, is sitting tight. Darwin wrote a book on the movement of plants. I've visited his home in the English countryside and sat quietly in his greenhouse as he must have sat, with infinite patience, noting such things as "circumnutation," the slow revolvings of growing plants, and plotting their peregrinations on glass plates. The Power of Movement in Plants is a boring book unless you are keen on things like the "nyctitropic movement of petioles" (or the powers of mind of a man who has mastered the art of attention). I don't recall that Darwin touches upon explosive dehiscence, but then such brio somehow seems out of place in the unhurried rhythms of his country home. A walk to the nearby village of Downe was almost more excitement than the great naturalist could bear.

Tuesday, September 12, 2006

HUDF


The Hubble Ultra Deep Field Photo (click to enlarge) is the deepest we have ever seen into space. It images an area of the sky that you could cover with the intersection of two crossed straight pins held at arm's length, in which you can see nothing with the unaided eye or even with an ordinary telescope. Everything you see in the photo is a galaxy, except for a couple of foreground stars. Peering into the apparently empty darkness, the Hubble camera soaked up faint light for a million seconds (about eleven days), letting us see back to within a few hundred million years of the universe's beginning, when time began and space swelled from nothing. What is important to recognize is that this didn't happen somewhere, it happened everywhere. The big bang happened right where you are sitting at this moment -- and everywhere else.

In a universe with no center and no boundary, one place is like every other. The galaxies scatter like faces in an anonymous crowd. The stars burn briefly and then are snuffed out like so many candles. If the physicists are right, the whole shebang is bent on infinite dispersal, a long inexorable ballooning into cold and dark.

In such a universe we must fashion our own centers, out of what the poet Seamus Heaney calls "the words of coming to rest: birthplace, roof beam, whitewash, flagstone, hearth." One by one we put those precious bricks in place, using the syntax of love, the cement of affection. If we are lucky we can shape a place to bide a while, out of the gale, as the galaxies go rushing by.

Monday, September 11, 2006

Anniversary

The following thoughts were written five years ago as one of a collection of reflections on the events of 9/11 invited by Orion Magazine for their web site:

I slipped out of bed early on the morning of September 15, 2001, to see a conjunction of Venus and the Moon. The sky was clear, a crisp autumn tang in the air. The two celestial objects blazed in the east. The crescent Moon was eyelash thin, the rest of its orb more brightly lit by Earthshine than I had ever seen before.

I wondered what it would be like to be viewing Earth from the Moon at that same moment. Our planet's face would be almost fully lit by sunlight, a huge blue-white ornament in the Moon's sky. No sign of human strife or turmoil. A placid sphere wisped with water and air, afloat against the silent deeps of space.

In the presence of that morning's beauty, I almost forgot the terrible events four days earlier when terrorists smashed planes into the World Trade Center and Pentagon. I thought to myself: Why must human violence disturb nature's peace?

But, of course, I had it exactly backwards.

It is nature that is violent. Astronomers point out how few places in the universe are sufficiently calm for life to exist. Massive black holes at the centers of galaxies gobble gas and stars. In the arms of galaxies, suns explode with a violence that shatters surrounding worlds. Comets and asteroids smash into planets. Galaxies collide.

The TRACE satellite telescope has recently provided us with stunning photographs of our Sun; they are epics of fire and frenzy. The Chandra X-ray telescope shows us a universe of ferocious tumult. Paleontologists find fossil evidence of planetwide extinctions.

We now understand that violence and death are corollaries of life. To persist, living creatures must take matter and energy from their environment. As life proliferates, competition for resources becomes inevitable. Aggression is advantageous, even necessary. Genetic variations that confer a competitive advantage are favored in the struggle to survive. If nature were not cruel, conscious creatures such as ourselves would never have evolved.

It is as Loren Eiseley wrote: "Instability lies at the heart of the world." The criminals who wreaked havoc on New York and Washington were acting out an ancient biological script.

Yet there is ground for hope. Our brains are of sufficient complexity to give rise to that mysterious thing known as self-awareness. Our genes may predispose us to act in certain ways, good or bad, but they do not constrain us. We are effectively free to choose good over evil. Humans alone, of all the things we know about in the universe, can escape the bipolar logic of evolution.

To a cheering extent we have done so. As Margaret Mead pointed out, the circle of those whom we do not kill has steadily expanded throughout human history. The optimists among us imagine that the circle will ultimately embrace the entire planet.

From nature's point of view, there is no such thing as the Problem of Evil: order and disorder, life and death, cooperation and competition are the twin principles of nature's creative force. What humans uniquely face is the Problem of Good: How to create on this tiny planet an oasis of unalloyed peace.

Sunday, September 10, 2006

Sense or sentimentality?

"Up, noble soul!" exclaimed Meister Eckhart. "Put on your jumping shoes which are intellect and love." See this week's Musing.

Anne's Sunday pic. Click to enlarge.

Saturday, September 09, 2006

Going offline

We are a multicultural family. Two of our four children are Mac devotees, two are PC users. I've been with Mac since I bought a 128, the first Mac on the market, twenty-two years ago. Now, like my two Mac kids (including Tom, of course), I own one of the new MacBooks...

...which happens to have a tiny built-in video camera at the top of the display.

With iChat software, this means we can have a video chat from anywhere in the world, assuming access to broadband. Every now and then my computer rings, and there is Tom or Mo, full screen, looking me in the eye.

It's unnerving.

I'm not quite ready for the always-connected life. It is rare these days to see someone without a cell phone at her ear. Text messaging, chat rooms, Facebook, MySpace: I barely know what these things are. I've had a cell phone for two years -- for emergencies -- but I have never had an incoming call. No one has my number.

And now I have a "Buddy List" for iChat. Well, two buddies, at least, Tom and Mo, two people who can make my computer ring and see my mug. That's it. The list is closed.

Whatever happened to solitude? A walk in the woods without a connection to the world? Library stacks without ring tones? My space that was real space? Private space?

Whoops. Hang on for a minute. My computer is ringing.

Friday, September 08, 2006

The parsimonious life

The postings on this blog have often referred to Ockham's Razor, the philosophical principle that can be variously stated as "Do not needlessly multiply hypotheses," or "Do not use a more complicated explanation when a simpler explanation will suffice," or "When two hypotheses account for the same facts, prefer the simpler." The Razor is a fundamental principle of science, although certainly not the only reason for our confidence in scientific knowledge.

There is of course no "proof" of the Razor. It is grounded in intuition, perhaps esthetics. Whatever proof it has is in the pudding; science has been fabulously successful at providing practical, reproducible, consensus knowledge of the world.

William of Ockham, a 14th-century English Franciscan scholar, gets the credit, but he hardly invented the Razor. It has precedents going back at least to Aristotle, although the Razor was always at odds with supernaturalists, including the institutional Church and Divine-Right governments, who stood (and still stand) only to gain by a multiplication of powers that require for their management a priestly or privileged caste. Indeed, we know of William of Ockham primarily because of his tussles with the Pope.

Newton built the Razor into the philosophical foundations of physics. "We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearance," he insists at the very beginning of the Principia. His theory of universal gravitation -- a single elegantly simple formula describes the motions of planets, moons, tides, and the fall of the apple -- is the Razor at its best. William Rowan Hamilton's Principle of Least Action fashioned the Razor into a mathematical axiom of mechanics.
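
For readers who want to see the axiom itself -- my gloss, not part of the original post -- Hamilton's principle says that of all the conceivable paths a mechanical system might take between two instants, the actual path is the one for which the action is stationary:

```latex
\delta S \;=\; \delta \int_{t_1}^{t_2} L\,dt \;=\; 0, \qquad L = T - V
```

Here L, the Lagrangian, is kinetic energy minus potential energy. From that single economical statement the equations of motion follow -- the Razor rendered as mathematics.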

Most of us -- myself included -- live in a mire of the superfluous, ideas and concepts that comfort and compose: traditional religion, for example, what the poet Philip Larkin called "that vast, moth-eaten musical brocade," or all the other pseudo-religions -- astrology, parapsychology, xenophobia, etc. -- that we use to fill the hours, but which merely distract us from the crystalline simplicity of life pared to the bone. How I long for the clean purity of a sunrise over the water meadow, the razor slant of light, the memory, in that sufficient radiance, of a touch, an hour earlier, in the darkness of the bedroom, a brief and precious talisman of love.

Thursday, September 07, 2006

Staying focused

It's called the focusing illusion, and social scientists who base their research on subject interviews keep it in mind.

For example, ask college students 1) How happy are you with your life in general? then 2) How many dates have you had in the last month? and the correlation between the questions is effectively zero. Conclusion: general happiness doesn't depend on popularity with the opposite sex (or same sex, as the case may be). But reverse the order of the questions, and the correlation rises dramatically, leading to exactly the opposite conclusion.

According to researchers, asking the dating question first "focuses" the attention of the respondents on that aspect of their lives and unduly influences their view of their general happiness.

A recent study reported in Science took the focusing effect into account when concluding that being rich doesn't make one happier than one's less wealthy neighbors. But ask the wealth question first and you get a skewed answer.
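
Here is a purely illustrative sketch -- invented numbers, not the data from either study -- of how the order effect shows up. The simulated students' true happiness has nothing to do with how often they date, but when the dating question comes first, their happiness report is partly anchored on the answer they have just given, and the measured correlation jumps:

```python
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
n = 1000

# Hypothetical students: true happiness and dating frequency are unrelated.
happiness = [random.gauss(5, 1) for _ in range(n)]
dates = [random.randint(0, 10) for _ in range(n)]

# Happiness question asked first: the report reflects true happiness (plus noise).
reported_first = [h + random.gauss(0, 0.5) for h in happiness]

# Dating question asked first: the report is partly anchored on the
# just-recalled dating count -- the "focusing" effect.
reported_second = [0.6 * h + 0.2 * d + random.gauss(0, 0.5)
                   for h, d in zip(happiness, dates)]

print("happiness asked first: r =", round(pearson(dates, reported_first), 2))
print("dating asked first:    r =", round(pearson(dates, reported_second), 2))
```

The point is not the particular figures, which are made up, but that the same population can yield two very different "findings" depending solely on the order of the questions.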

The focusing illusion is worth pondering quite apart from any particular study. We are all focused in our opinions by any number of factors: the circumstances of our birth, our upbringing, our experiences (especially traumatic experiences), perhaps even our genes. It is well known, for instance, that our fidelity to a particular religion correlates most closely with the religion we were born into. One recent study even suggests a genetic component to political persuasion.

Focus is generally considered a good thing, but it can be constraining too. My life's work as a writer has certainly been focused by the experiences of my youth; there is no way it could be otherwise. But I struggle to stay open to other ideas and cultures. Reading widely helps. Staying in touch with multicultural science helps. Most of all, I keep reminding myself of the focusing illusion as a hedge against zealotry.

Wednesday, September 06, 2006

Soaring aspirations


This recently released update of a previous Hubble image is the remnant of a supernova -- a star that blew itself apart -- in the constellation Cassiopeia. (Click to enlarge, please!) It is known as Cassiopeia A and just happens to be the brightest radio source in the sky outside of the solar system. The progenitor event was the most recent supernova explosion in our part of the Milky Way Galaxy. It occurred in about the year 1667, although apparently it was not observed on Earth, possibly for reasons you can read about here. (Curiously, another supernova in the same part of the sky and at about the same distance away was famously observed a century earlier by Tycho Brahe, the great Danish astronomer.)

What we are looking at is a bubble, not a ring; it is the thick sides of the bubble that show up best in the photograph. Here is a dying star spewing into space oxygen, nitrogen, silicon, iron, the stuff of butterflies and brooks. The material is still racing outward to a possible destiny as the building blocks of future planets.

A dozen years ago I wrote in the Boston Globe that the Hubble Space Telescope was "too big, too expensive and too late."

I still wonder if astronomy might have been better served if those billions had been spent on innovative ground-based technologies: segmented mirrors, light detectors, digital stabilization, image enhancement, and so on. Nor am I sure that the sum total of Hubble science has been worth the price.

Still, I'd hate to forego the pics. They may be the most expensive photographs ever made, but they touch the soul as deeply as they inform the intellect. What the great Gothic cathedrals were to the Middle Ages, the Hubble Space Telescope is to our own time: an extravagant assertion of our abiding faith that there is more to this universe than life on Earth. Abbot Suger of Saint-Denis, one of the first great Gothic builders, hoped that his cathedral would reveal the divine harmony that reconciles all discord, and that it would inspire in those who beheld it a desire to establish that same harmony within the moral order. If the Hubble images could do the same they would be worth every penny.

Tuesday, September 05, 2006

Flyways

In the first chapter of Skeptics and True Believers I told the story of the red knot, a bird that may hold the record for long-distance migration, from the Canadian Arctic to Tierra del Fuego -- and back. Juvenile birds make the trek without adults to guide them and without ever having made the journey before.

A recent issue of Science tells of the migratory habits of the northern wheatear, a little ground bird that lives in northern Eurasia, Iceland, Greenland and eastern Canada and migrates to the open savannas south of the Sahara. I got to know the wheatear in Ireland. Its name, by the way, derives from "white-arse," which is perfectly descriptive.

A German research team collected wheatear hatchlings from nests in Norway and Iceland and raised them in the lab. As migration time approached, the birds bulked up with food for the journey, and -- here is the kicker -- they ate in proportion to the distance they would have had to fly had they been left in their native environment.

The urge to migrate, the preparatory feeding, the destination and the navigational tools to get there are all in the genes, written in a four-letter code on the birds' DNA. Each bird's life begins as a single, microscopic fertilized cell, and each cell contains the equivalent of a set of charts, a compass, a sextant and maybe even something akin to a satellite navigation system. The wheatear is able to learn from experience, to rewire its brain as necessary (to evade a storm, for example), but the bird's brain comes ready wired for its ancient migration.

And if that doesn't make your head spin, then nothing will. As the British cartographer and author Tim Robinson said: "Miracles are explainable, it's the explanations that are miraculous."

Monday, September 04, 2006

Remember the shmoo?

The shmoo was invented by Al Capp in the comic strip Li'l Abner: a wobbly tenpin-with-legs sort of creature with the misfortune (or good fortune?) of being almost totally consumable. Broiled shmoo tasted like steak; fried, like chicken. Shmoos gave eggs, butter and Grade A milk. The skin was a versatile fabric, the eyes made perfect buttons, and even the whiskers served as toothpicks. Most importantly, shmoos reproduced in prodigious numbers and delivered themselves willingly to human appetites. If you looked hungrily at a shmoo it dropped dead of happiness.

I can think of another creature that reproduces in prodigious numbers and would make a fine meal. The zoologist Mark Ridley has written: "Just as we consume resources, so we are ourselves a resource to be consumed. So far, we merely happen to be extraordinarily underexploited...There is no ecological opportunity on the Earth to compare with the gigacaloric potential of human flesh."

Who's going to eat us? The answer is obvious: viruses and bacteria. The microbes.

With every bite of food we eat, we convert more of the available planetary resources into human flesh. Increasingly, we must look like shmoos to the microbes: plump, available, irresistible. Ridley draws attention to the Darwinian pressure on microbes to make their diet out of us. So far, they have made only limited evolutionary progress towards overwhelming our defenses, but the dynamic of evolution is on their side.

As the human population explosion increasingly turns the biomass of the planet into human flesh, the ancient balance between ourselves and the microbes is put at risk. A showdown may be in the offing. The microbes have the advantage of short reproduction cycles, a million times faster than our own: In any race to evolve defenses against the enemy, we haven't a hope of competing. Our own natural defense mechanisms against bacteria are the products of millions of years of evolution. Bacteria can evolve resistance against antibiotics within months or years.

We are sitting ducks, an irresistible potential feast, victims of our own success.

Good luck to the human shmoo. (He said, as he took his doxycycline.)

Sunday, September 03, 2006

Demystification

It is no accident that the last executions in Europe for heresy and witchcraft coincided with the Scientific Revolution, just as it is no accident that the last visitation of the Black Death to Europe coincided with young Isaac Newton's annus mirabilis. See this week's Musing.

Anne offers a variation on last week's pic. (Click to enlarge)

Saturday, September 02, 2006

Facing backwards

Perhaps the greatest problem facing the world today is the disparity of power between rich and poor. Technology (satellite television, the internet, easy international travel) keeps the disparity front of mind for the oppressed. They turn first of all to religion as an anodyne. After all, what is more empowering than to have God on one's side, and what is more consoling than the promise of paradise while the oppressor burns in hell? There is no shortage of mullahs, preachers, and gurus with agendas of their own ready to whip believers into a frenzy of self-righteousness and violence.

I have written here before about Meera Nanda's book Prophets Facing Backwards: Postmodern Critiques of Science and Hindu Nationalism in India. A demanding, but richly rewarding read. It is addressed primarily to the clash between secularism and traditional religion in India, but has relevance to the world at large, including America.

Nanda does not dismiss the need for the sacred in everyday life, but she makes a compelling case for the universality of the Enlightenment project. Her last paragraph: "One can safely conclude that the only option for the friends of the oppressed in the postcolonial world is for them to recognize that the interest of the oppressed in secularism and demystification of traditional ideologies is best served by the naturalism and skepticism of modern science. It would be fair to say that modern science is the standpoint of the oppressed."

Which is almost certainly true. But when the choice for a powerless, poorly-educated, impoverished and resentful person is science or the promise of everlasting bliss (perhaps with seventy virgins to boot), there's not much chance the choice will be science. Which is why the hundreds of billions of dollars we are spending on a foreign policy of militaristically imposed values would be better spent on a quiet amelioration of the sources of resentment.

A few more thoughts on dragonflies

Since I posted yesterday, I found the name of the dragonfly photographer: Charmaine K. I hope she will not mind me using her wonderful photos here.

Glittering blues and greens. Opal. Blood red. Ultramarine. No wonder dragonflies are talismans of summer, one of the few insects we welcome unreservedly to the season of exposed skin.

Greybacks, clubtails, darners, biddies, and skimmers. Their names are poetry. Popular names are even more evocative. Water maidens. Demoiselles. Horse stingers. Mosquito hawks. Devil's darning needles. Snake doctors.

Forwards, backwards, straight up or down. Zip. Spin. Stop on a dime. The center of gravity lies just below the base of the wings, with helicopter balance. Opposite wings are connected by strong flight muscles, as far as I know an exclusive among insects. The two pairs of wings operate independently. A big dragonfly can reach an air speed of 60 miles per hour.

Silverfish are the most ancient insects that survive more or less unchanged into the present. Cockroaches and dragonflies are almost as old. Of these living fossils only the dragonflies are an unmitigated boon to humans. They use their netted legs like shopping carts and can gather up a hundred mosquitoes at a time. Iridescent exterminators.

Friday, September 01, 2006

Heart throb

Anyone who has watched a dragonfly scout a summer pond has seen one of the wonders of evolution.

A cross between a traffic-watch chopper and an F-16. A flawless match of form and function. A flying machine optimized for snapping up insects on the wing. And for sex. But more of that in a minute.

An old friend kindly sent me the new Stokes Guide to Dragonflies. What struck me first were the names: River Jewelwing, Smoky Rubyspot, Aurora Damsel, Vesper Bluet, Powdered Dancer, Fragile Forktail, Sedge Sprite, Fawn Darner, Dragonhunter, Wandering Glider, Elfin Skimmer, to name just a few. If, as they say, Adam named all the creatures in Eden, he excelled himself with the dragonflies.

Every now and then evolution throws up a creature so perfectly adapted to its way of life that improvement seems impossible. Such species are rewarded by longevity. They survive for eons with little change, becoming what evolutionary biologists call "living fossils." The dragonfly is a living fossil, one of the oldest orders in the animal kingdom.

Sit by the summer pond and watch. One might as well be in a time warp. Glance up and see Triceratops grazing nearby. Or Tyrannosaurus rex. An asteroid smashes into the Earth and the reptilian giants become extinct; the dragonfly survives.

I have a special place for watching them, a plank bridge across a sluggish stream along my path. The males take up territories near the banks, perching on reeds or stones, chasing off intruding males, patrolling. It's a dominance sort of thing. The alpha male gets the chance to mate.

But there's a bit of business to take care of first. The male's genital opening is near the tip of his tail. The penis, however, is just behind the legs. So before he mates, he must transfer sperm from the tip of the tail to the penis up front.

Then he grasps the female behind her head with the tip of his tail. She curls her abdomen around and under until she brings her genital organ -- at the tip of her tail -- to his penis. Now their bodies are engaged in a heart-shaped valentine, one of nature's more engagingly semiotic acts of copulation.

(Click to enlarge.)