Tuesday, April 30, 2013
Someone, child or grandchild, left a book at the house in Exuma that I looked at while I was there: The 4-Hour Workweek: Escape 9-5, Live Anywhere, and Join the New Rich, by Timothy Ferriss. I wrote the musing below at the time, but didn't post it because it seemed too self-serving. It still does, but I yield it now because of the subject of yesterday's post: What is the good life?
"How Tim went from $40,000 per year and 80-hours per week to $40,000 per MONTH and 4 hours per week."
"How to outsource your life to overseas virtual assistants for $5 per hour and do whatever you want."
And so on.
Apparently, the book is a bestseller.
Well, it's too late for me, even if I were interested. I wish Mr. Ferriss every success, but I did flip through the book and at a glance it seemed like a lot of advice on how to market a successful self-image, while taking advantage of whatever dodge you can and outsourcing the tedious bits.
I spent my life teaching. Easily 80 hours per week. And loved every minute of it. Never a morning I didn't want to jump out of bed and go to work. Started at $7000 per year and ended forty years later at ten times that (plus some writing income). Had a few neat adventures along the way. Now in retirement I have three agreeable homes in three beautiful places for a total outlay of $200,000. And, believe it or not -- even with a few air tickets -- a relatively small carbon footprint.
Maybe I should write a book: The 80-Hour Workweek: Escape 9-5, Have a Helluva Good Time, and Join the Human Race.
Monday, April 29, 2013
While we are with the May 9 issue of NYRB and women of a certain age, let me take note of a review/essay by Marcia Angell.
Angell is a physician, a former editor of the New England Journal of Medicine, and another regular contributor to NYRB, often in the role of scourge to Big Pharm. She is also almost my age (mid-70s), which I mention only because of the subject of her current essay: What is the "good life" as observed from its final chapters?
The book under review is George Vaillant's Triumphs of Experience: The Men of the Harvard Grant Study. Vaillant was for more than three decades director of a study begun at Harvard in 1938, to follow the lives of 268 Harvard sophomores of classes 1939 through 1944, selected from the best and brightest, to determine what early traits best predict a successful life. The surviving subjects are now in their nineties.
As Angell points out, the initial selection of subjects ("best and brightest" Harvard men) undermines the relevance of the study for the rest of us. Also, there is the thorny problem of what constitutes "success" or "the good life." In this regard, Vaillant uses ten indicators, including being listed in Who's Who, above-average income, and happy marriage.
Angell picks apart the study on methodological grounds, but she is clearly interested in the question of what is "the good life" and what early factors are most likely to help us get there. What everyone seems to agree on is that being affluent and well-educated are the best possible guarantors of good health and longevity.
I was both amused and depressed by Angell's throwaway speculation that "old age takes many men almost by surprise, it sneaks up on them, and is all the more disturbing for that…[whereas] women are all too aware of aging, starting with their first gray hair or wrinkle."
I didn't learn much of universal relevance from the Harvard study, as recounted by Angell (love is good, alcohol is bad). I did enjoy her assessment of her own sources of contentment and anxiety at age 74. She doesn't like getting old (who does?), but finds an offsetting advantage in "a sharper sense of what is important in life." She says she is less interested in maintaining her professional profile, and looking forward to learning Italian and taking a course in astronomy. I have no doubt that if she keeps her health, like Alison Lurie, she'll still be writing for the NYRB at age 86.
Sunday, April 28, 2013
Saturday, April 27, 2013
Greystone Books publishes a series of "Literary Companions" to natural environments -- mountains, rivers and lakes, deserts, gardens, and the sea, so far. Now they come to my environment -- night -- and have been kind enough to include a chapter from The Soul of the Night, the chapter called "The Shape of Night." I am in lovely company, admired companions of several generations -- Diane Ackerman, Timothy Ferris, Annie Dillard, Henry Beston, Loren Eiseley, Louise Erdrich, Pico Iyer, and Gretel Ehrlich, to name but a few -- all connoisseurs of darkness.
Our earliest mammalian ancestors were presumably nocturnal -- to escape the predations of dinosaurs -- but for most of human history we have been afraid of the dark, huddling in caves around stuttering fires, curled together in darkness like mice in a burrow. Night belonged to animals with big, dark-adapted eyes and sharp teeth, to footpads and graverobbers, to werewolves and vampires. Ironically, it was with the coming of electric illumination that it became reasonably safe to go out and about at night, even as the illumination erased the best reason to do so.
William Blake called day Earth's "blue mundane shell...a hard coating of matter that separates us from Eternity." At night we peer into infinity, awash in a myriad of stars. We creep to the door of the cave and look up into the Milky Way and catch a glimpse of divinity -- everlasting, all-embracing, utterly unknowable. Night -- that cone of shadow, that wizard's cap of spells and omens -- is the chink in Earth's shell through which we court Ultimate Mystery the way Pyramus courted Thisbe.
Which is why, I suppose, that whenever I think of "the porch" of people who visit here, I imagine Carolina rockers on a southern summer verandah, far from city lights, Vega, Deneb and Altair swimming in the Milky Way, fireflies flickering on the lawn. At some point the conversation ceases and we simply sit, rock, and listen to the sounds of the night -- the whippoorwill, the bullfrog, the cricket and the owl -- and let starlight fall upon our heads like a sprinkling of holy water.
(This post originally appeared in April 2009.)
Friday, April 26, 2013
Remember Alison Lurie? The novelist? She hasn't published a novel in half-a-dozen years, but she regularly appears in the New York Review of Books, writing on everything from John Updike to Peter Pan (which may not be such a broad spectrum, after all). She is still very much with us, at age 86, and a personal rebuke to my ten-years-younger fading productivity.
I read her first in 1974, a novel called The War Between the Tates, about a married college professor who has an affair with a student. Then in 1984 she won the Pulitzer Prize for Foreign Affairs, a novel about, well yes, foreign (love) affairs. She was the female Updike, exploring the territory we ex-altar boys more timidly and less conclusively found ourselves traversing.
Anyway, here she is again in the NYRB, reviewing a novel by Claire Messud. Before getting to the novel, she takes the time to develop a theory of the "celebrity complex."
She draws our attention to celebrity culture, "occurring spontaneously in the so-called advanced democratic societies," that separates us into a privileged minority who are recognized as fully and triumphantly human, and the rest.
Basically, we're talking about the people who are featured in People magazine, and the people who buy the magazine.
This can lead, says Lurie, to an affliction she calls the celebrity complex:
When we get together we tend to gossip not about our own relatives and friends and neighbors and coworkers, but about film and TV and sports stars and members of the British royal family. Individuals who we know only as flimsy two-dimensional paper shadows, or fleeting electronic impulses on a screen, interest us more than three-dimensional human beings. In advanced cases of celebrity complex, the afflicted persons feel that fame is necessary to self-esteem; if they cannot achieve it themselves, they may define and value themselves most importantly as fans.

Lurie wants to celebrate those folks who are neither in the magazine nor among those who buy it -- schoolteachers, carpenters, doctors, homemakers, or civil servants, for example, who are locally recognized and honored by friends and families for doing what they do conscientiously and well. Lurie probably won't get in People magazine for writing NYRB essays/reviews at age 86, but she has my jealous admiration. I would even go so far as calling myself a fan.
Thursday, April 25, 2013
Who knew that Alan Turing was into biology?
Turing was of course the great computer theorist who helped crack the German Enigma code during the Second World War. In 1952, two years before his unfortunate and unnecessary death, he published a paper called "The Chemical Basis of Morphogenesis" that described how "two interacting chemicals diffusing through space could form interacting wave patterns that produce spots like a leopard's or stripes like a zebra's." (I quote from Science, 14 December.) After a long semi-eclipse, Turing's mathematical paper is now finding application in a number of areas of developmental biology, including the development of mouse paws.
I was struck by this illustration from Science of mouse paws formed by removing Hox genes from a developing mouse limb lacking the gene Gli3. The more Hox genes removed, the more digits form, all apparently in conformity with Turing's equations. Paws to paddles.
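Turing's idea can be seen in action with a toy simulation. The sketch below uses the Gray-Scott model, a standard textbook stand-in for a two-chemical reaction-diffusion system -- not the specific equations from the Science paper, and the parameters are purely illustrative. A uniform field of "substrate," seeded with a speck of "activator," spontaneously organizes itself into spots and stripes.

```python
# A minimal sketch of Turing's two-chemical mechanism, using the
# Gray-Scott reaction-diffusion model in one dimension. This is a
# generic illustration, not the model from the Science paper;
# parameter values are conventional textbook choices.
import numpy as np

def gray_scott_1d(n=256, steps=8000, Du=0.16, Dv=0.08,
                  F=0.035, k=0.060, dt=1.0, seed=0):
    """Integrate substrate u and activator v on a ring of n cells."""
    rng = np.random.default_rng(seed)
    u = np.ones(n)             # substrate starts full everywhere
    v = np.zeros(n)            # activator absent except for a seed
    mid = slice(n // 2 - 10, n // 2 + 10)
    u[mid], v[mid] = 0.50, 0.25
    u += 0.02 * rng.random(n)  # a little noise to break symmetry
    for _ in range(steps):
        # discrete Laplacian with periodic (ring) boundaries
        lap_u = np.roll(u, 1) - 2 * u + np.roll(u, -1)
        lap_v = np.roll(v, 1) - 2 * v + np.roll(v, -1)
        uvv = u * v * v        # the autocatalytic reaction term
        u += dt * (Du * lap_u - uvv + F * (1.0 - u))
        v += dt * (Dv * lap_v + uvv - (F + k) * v)
    return u, v

u, v = gray_scott_1d()
# Crude ASCII plot: '#' marks cells where the activator is concentrated.
print("".join("#" if x > 0.2 else "." for x in v[::4]))
```

The two chemicals diffuse at different rates, and that mismatch is what turns a nearly uniform soup into a stable pattern -- the essential point of Turing's 1952 paper.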
When you come to think about it, cracking the German code may not be all that far removed from cracking the genetic code, which determines not only the number of mammalian digits but also (it seems) sexual orientation. I've told Turing's story here before, of how after the war he was found out to be a practicing homosexual and brought to trial. Convicted, he was offered the choice of imprisonment or chemical castration. He chose the latter, so that he might continue his theoretical work. He committed suicide at age 41.
Wednesday, April 24, 2013
The photo above, made in 2008, shows the famous Horsehead Nebula, a vast star-spawning cloud of gas and dust in the constellation Orion. The nebula is about 1500 light years from Earth, in our own spiral arm of the Milky Way Galaxy. It is a wispy projection of an even larger cloud known as the Orion Molecular Cloud Complex. I once calculated that 10,000 Solar Systems, imagined as spheres, would fit comfortably in the horse's snout.
When I was teaching astronomy, a similar photo, projected on a big screen at the front of the classroom, was always a pleaser. Why? Because of the "horse," of course. The universe is almost unimaginably vast. We love to see something familiar among the gassy chaos. It cuts the light-years down to size, helps us feel at home, reinforces a fading sense that we may be important after all. Giddy-ap! Hi-ho Silver!
We need all the reassurance we can get.
And now comes this, a new photo of the Horsehead, made in infrared light by the Hubble Space Telescope (click to enlarge). I've been looking at photographs of the Horsehead for fifty years, and this one took my breath away.
In The Soul of the Night I wrote: "The Greeks believed that the eye has a double role in vision. They believed that a pale light went out from the eye to the world and returned again to the eye as a traveler returns bearing gifts. For the Greeks, the eye was both illuminator and receiver. Modern science has rejected the Greek theory of vision. We are told that the eye is a passive agent, a mere collector of whatever light comes its way. Seeing, in the new dispensation, means turning toward the light and nothing more."
Nothing more! The Soul of the Night was itself a refutation of "nothing more." Seeing may be all on the receiving end, but it is a cerebral activity of interpretation and assimilation. We look into the Horsehead and try to understand ourselves. We want to believe -- we were taught to believe -- that the universe was made for us, that we are the reason for it all, for those myriad stars, those billowing clouds. We stumble, like the blind giant Orion himself, into a darkness so vast, so rich and deep, as to make us tremble. And like Orion, we will learn to see again only by bravely confronting the light.
O, what a mighty work is vision!
Tuesday, April 23, 2013
They should have taught us birds and trees
in school, they should have taught us beauty
and weaving bees and had a class
on listening and standing alone --
The first few lines of a poem by the Kentuckian Maurice Manning, who might be called the poet laureate of Appalachia. You can find the entire poem here. It is also contained in Manning's newly published volume, The Gone and the Going Away, which is where I found it. It's a lovely poem, and espouses a philosophy of education that cultivates awareness. If a child is aware, everything else will follow.
Which reminds me of a wonderful book that was published in America in 1964, as I began my own teaching career: Elwyn Richardson's In the Early World. Richardson taught in a country school on the North Island of New Zealand, with mixed classes of Maori and Caucasian children. The book describes and illustrates the way of teaching that he developed, which focused on awareness and the unique creative potential of each child. Language skills, mathematics, social studies, science: All followed from a creative engagement with the world.
Awareness. "The children should have studied light/ reflected from a spider web," says Manning. From that small observation might spring questions of mathematics and physics, chemistry and natural history, poetry and art.
As a young teacher, I was blown away by Richardson's book, by the richness of the children's arts and crafts. Teaching science at the college level was of course something very different, but the book certainly influenced the direction of my career. Its spirit can, I trust, be sensed in my posts here.
I owned several copies of the book over the years, but gave them all away to students who were going into primary teaching. If you know a young person about to begin a teaching career, track down a copy and give it as a gift. They will not, of course, have the freedom in any American school system to follow Richardson's path, but they will have a glancing awareness of awareness.
The title of the book, by the way, comes from a poem by one of Richardson's students, Irene:
The blue heron stands in the early world
Looking like a freezing blue cloud in the morning.
Monday, April 22, 2013
When I wrote The Path: A One-mile Walk Through the Universe, I had walked the path between my home and college almost daily for 37 years. In the Introduction to the book, I wrote: "For all of its familiarity, there has never been a day I have walked the path without seeing something noteworthy."
Well, yesterday I came across something unprecedented, which I record here for the sake of historical record, and because those of you with a sense of whimsy might enjoy it. Along the path in the deep woods there is a tree with a hollow cavity at its base and an open hole in the side of the hollow. I have often imagined it as a tidy home for some tiny creatures. And here they are. Two gnomes have taken up residence. They have even fenced off the yard and put up a lace curtain at the "window."
I have no idea who did this. Or of what age. Maybe the gnomes themselves, in some magical midnight animation.
There was a time, of course, and not so long ago, when our ancestors imagined the whole of nature inhabited by wee creatures. When we first lived in western Ireland 41 years ago, there were still a few old people in the village who believed in fairies. To their consternation, we built our cottage on "the Fairies' Road," and though we would have welcomed the little people as visitors, they did not appear.
These gnomes are the best we are going to get. They gave me a smile. And a mental note of appreciation to the fairy-spirited person who helps maintain an enchanted landscape.
Sunday, April 21, 2013
Saturday, April 20, 2013
...it helps to have neighbors with meadows, hedgerows, fruit trees, organic gardens, and nesting boxes designed especially for bluebirds.
I have such neighbors. My walk to college each day takes me through conservation land administered by the Natural Resources Trust of Easton. It would be hard to imagine a habitat more perfectly suited to a bluebird's needs, and my friend Bluebird Bob sees to it that the birds are well housed.
Electric blue, robin-red-breasted, plump, round-shouldered insect snatchers flitting through the branches of the crabapple tree: Who can see a bluebird and not be happy? The naturalist John Burroughs heard its song as "pur-i-ty, pur-i-ty." Others hear "tru-al-ly, tru-al-ly." No one with an ounce of sentimentality in their soul doubts that bluebirds are both pure and true.
It's great to see them back after decades of absence. When I came to New England in 1964, bluebirds were few and far between. As I recall, that was the same year I read Rachel Carson's Silent Spring. The book exposed the massive, indiscriminate use of pesticides, especially DDT, and gloomily assessed the consequences for the environment. So successful was Carson's call to action that within months of the book's publication many states and foreign countries banned DDT. In 1957 the U. S. Department of Agriculture sprayed 4.9 million acres with the poison; in 1968 the figure had dropped to zero.
DDT is no longer an issue in the U. S., but huge amounts of other pesticides are still dumped into the environment. Everyone wants picture-perfect lawns, healthy trees, and supermarkets stuffed with abundant, flawless produce. I've got weeds between the bricks in my backyard patio that I would love to douse with killer. The chemical industry urges us on with "yes, yes, yes."
"Pur-i-ty, pur-i-ty," the bluebirds call. And those of us who are pleased to see these delightful birds return to our neighborhoods can only answer "Tru-al-ly, tru-al-ly."
(This post originally appeared in April 2007.)
Friday, April 19, 2013
I'm not through with this painting. Or rather, it will not let go of me. Thanks, James Holland, whoever you are.
It's not because it reminds me of my own house. And it's not because it rises to great art, although it is masterful in its own way. It's because it lets me see something we seldom take note of -- the ubiquity of night.
Night. The default state of the universe. Night for most of us is that time from dusk to dawn when the body of the Earth gets in the way of the Sun. But night lurks in every shadow, all day long. Each blade of grass casts its nocturnal shape upon the ground. Each tree spreads a shrubbery of darkness. With each step along the sidewalk I crush a bit of daylight.
All day long night scatters and gathers. Skittering behind kittens. Scooting with the thrown ball across the ground. Then, in afternoon, night comes marching from the east, the full black army, the orc-fell dark. Benign at first, daubing the side of the farmhouse with filigreed lights, the white clapboards with their pale hue of our yellow star. Night gathers under the eaves, pools in the dark grass.
And this, I submit, is the most perfect time of day. Pale sun, pale shadows. Nature's most subtle palette. The silent golden moment when the pink and purple shadows promise a perfect marriage of light and dark, of solar photons and their extinction.
Thursday, April 18, 2013
As I was sipping my coffee and nibbling my croissant this morning in the college Commons, I was browsing an issue of Artscope someone had left on the table, a New England magazine of the visual arts. The title of one article caught my eye: "Living In a Material World."
Well, of course. Of course we live in a material world. Clap your hands. Scratch your head. Stamp your foot. Materiality. Stuff. A world made of stuff. Solid. Enduring.
Except matter has a nasty reputation. To be a "materialist" is to miss the essence of life. God fashioned man out of matter -- humans out of humus -- then breathed soul into the mix. And ever since we have had our attention fixed on soul, that ineffable something that is not matter, not stuff. The spritely essence of angels.
Except whenever we go looking for the essence of angels, it has a way of eluding our grasp. Which is the best thing about matter. Its graspability. One can touch it. Taste it. Smell it. See it. Hear it. Slosh around in it. Rub it on our skin. Pet it. Snuggle up against it. It whistles at the door jambs, patters on the roof. Coffee and croissant.
Living in a material world. I love it.
And then as if to confirm the thought that was swirling in my head, I turned a few pages and came across this painting by the artist James Holland, about whom I know nothing, called "Farmhouse Shadows." (Click to enlarge.) It reminds me a bit of my own New England house, from the same era, but it reminds me too what I like about matter. Its malleability. Its habitability. Its go-with-the-flow. The way it transmogrifies with an organic liquidity in the late afternoon sun. The way nuclear fusion at the heart of the Sun strikes a Cupidic arrow across the blue door.
I might be napping by that downstairs window when the shadow passes and feel the cool, sweet brush of materiality against my cheek.
Wednesday, April 17, 2013
A splendid day of spring in the wake of yesterday's double bombing of the Boston Marathon (I'm writing on Tuesday). But the Earth doesn't stop in its annual circuit of the Sun. And it would take a bigger bang than those on Boylston Street to tip the planet on its axis. So let's take what gifts nature gives us here in New England, and lean with the planet toward the Sun.
The trees are budding. The daffs and narcissi are in bloom. The frogs are singing in the water meadow. Fiddlehead ferns raise their crosiers by the brook. This afternoon I'll be sure to see the first snake come out to bask on the path. The newspapers are full of trauma, carnage, heartrending tragedy, and the globe sails on in its year-long lopsided circumnavigation of our steady star.
Every second, at its hot core, the Sun converts 657 million tons of hydrogen into 653 million tons of helium, by a process known as nuclear fusion. The missing 4 million tons of mass are converted into energy (how much? 4 million tons times the speed of light squared). The energy makes its way to the Sun's surface where it is hurled into space as heat and light. The Earth intercepts about one two-billionth of this energy, or about four pounds worth of the vanished matter. The Sun never misses so slight a fraction of its huge bulk, but for the Earth it is the difference between day and night. And winter and spring.
Now, in April, those of us in the northern hemisphere lean toward the Sun. Its rays hit the surface more directly. In summer, about a millionth of an ounce of the Sun's mass falls onto our college campus; in winter less than half as much. A fraction of a millionth of an ounce of the Sun's depleted mass is all it takes to tip the season toward spring. We grab it eagerly. It lifts our spirits. And, by a strange metaphorical magic, it reassures us that good will triumph over evil.
(I made the above calculations more than thirty years ago when I was writing 365 Starry Nights. No one's questioned them in the meantime, so I'll roll them out again.)
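The figures do hang together, and a few lines of arithmetic confirm them -- with the caveat that I'm assuming "ton" means a metric tonne (1000 kg), which the post doesn't specify:

```python
# Back-of-envelope check of the Sun's mass-to-energy budget (E = m c^2),
# using the post's figures: 4 million tons of mass vanish per second,
# and Earth intercepts about one two-billionth of the output.
# Assumption: "ton" = metric tonne (1000 kg).

C = 2.998e8          # speed of light, m/s
TONNE = 1000.0       # kg per metric tonne
LB_PER_KG = 2.205    # pounds per kilogram

mass_lost_per_s = 4e6 * TONNE        # kg of mass converted each second
power = mass_lost_per_s * C**2       # watts (joules per second)

earth_share_kg = mass_lost_per_s / 2e9   # Earth's one two-billionth, kg/s
earth_share_lb = earth_share_kg * LB_PER_KG

print(f"Implied solar luminosity: {power:.2e} W")
print(f"Earth's share of vanished mass: {earth_share_lb:.1f} lb per second")
```

The implied luminosity comes out near 3.6 x 10^26 watts, close to the measured value of about 3.8 x 10^26, and Earth's share of the vanished mass is indeed about four pounds per second.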
Tuesday, April 16, 2013
If you have seen my movie Frankie Starlight, you will surely remember this scene, where Bernadette (Anne Parillaud) reads Yeats to Terry (Matt Dillon). It is as sweetly sexy a scene as you are likely to find in a movie (click to enlarge).

Had I the heavens' embroidered cloths,
Enwrought with golden and silver light,
The blue and the dim and the dark cloths
Of night and light and the half-light,
I would spread the cloths under your feet.
But I, being poor, have only my dreams;
I have spread my dreams under your feet;
Tread softly because you tread on my dreams.
Bernadette's socks. The rumpled bed. The dirt poor room. Lovestruck Terry. And Anne Parillaud reading Yeats' incomparable words in her heart-stopping Gallic accent.
I like to think of the scene as a kind of culmination of my life as a writer, a coming together of threads -- the night sky, poetry, beauty, love, sex.
Every book I've written has had the cloths of night embroidered into its fabric. Every book -- fiction and non-fiction -- has tried to evoke the sensual beauty of the blue and the dim and the dark, the light and the half light, a beauty that is frankly erotic even in the inanimate star-spangled sky, an eroticism that fuels the fiery nebulae and drives the flower through the green fuse.
The tangled bank that ends in tangled sheets. I lay them at your feet.
Monday, April 15, 2013
In the current issue of Poetry, I came across this quote from the poet Frank O'Hara: "Don't be bored, don't be lazy, don't be trivial, and don't be proud. The slightest loss of attention leads to death." It gave me something to muse on as I was walking home.
O'Hara was something of a jack of all trades, including booze and sex. He might have been a painter or a musician, but we know him as a poet, for a more or less lifelong stream of consciousness. He is not a poet to my taste, but then the New York art scene of the 1950s and 1960s is about as remote from my experience as anything is likely to get. I read O'Hara for the same reason I read, say, Patti Smith's Just Kids (which is, by the way, a great little book).
But back to the quote, which struck a late-life note.
Am I bored? After 40 years of keeping half-a-dozen balls in the air, eighteen hours a day, I now seem to be spending an inordinate amount of time napping on the couch.
Blogging is a trivial activity, at least by comparison with my former activities. It is certainly self-indulgent, and leaves lots of time for napping.
Proud? If by proud O'Hara means "self-satisfied," then I'm less proud than I used to be. The itch of ego is apparently napping too.
Three feet in the grave?
Now we get to the gist of it. The first commandment of art and life: Pay attention. The slightest loss leads to death, says O'Hara. Or at least to ever longer naps.
"It is not easy to live in that continuous awareness of things which alone is true living," wrote the naturalist Joseph Wood Krutch, anticipating O'Hara. But I try. To pay attention. To see. To listen. To walk, not drive. To keep the ear buds out of my ears. To keep my glasses clean. To notice and to make note. Which is what I've been doing all of my life. Which is what I'm doing here.
O'Hara, by the way, died young, aged 40, when he was run over by a dune buggy while sleeping on the beach at Fire Island.
Sunday, April 14, 2013
Saturday, April 13, 2013
A few lines from a poem of Linda Gregerson. Never mind the context; the image is arresting. Beautiful Thisbe is confined by her parents to her high-walled house in Babylon, with only a crack in the wall through which to communicate with her forbidden lover. And, of course -- as so many parents discover -- the restriction makes her passion all the more intense.

...Thisby knows
so little of the world
as yet: the bit
she can see through the
chink in the wall
has made her heart beat
faster in its cage...
We look out at the universe through a metaphorical chink in the wall. We are prisoners of our limited sensory apparatus, our finite brains. Slowly we have widened the chink -- just think of the Hubble photographs compared to what Ovid, say, knew of the world. But the wider chink has only made us more aware of the limits of our knowing, heightened our curiosity, excited our passion -- made our hearts beat faster in their cages.
We put our lips to the chink, we whisper prayers, not knowing to whom or what we pray, imagining a lover whose remembered image grows ever more indistinct even as our passion grows.
If it were possible, would we want to have the walls down, to have full access to what the physicist Stephen Hawking whimsically called "the mind of God" -- a full and complete knowledge of everything that is? Not me. Woo prolonged is woo sustained. Remember what happened to Thisbe and Pyramus, and for that matter to Eve and Adam when they ate of the Tree of Knowledge. The ancient myths tell a great truth: the tease is more exciting than the consummation.
Friday, April 12, 2013
I had a few words to say yesterday about Oxford anthropologist Harvey Whitehouse, and his thoughts on ritual as "the glue that holds social groups together." I was drawing on a story in the 24 January issue of Nature.
He is quoted further: "Emotionally intense rituals have bound us together and pitted us against our enemies throughout the history of our species. It was only when nomadic foragers began to settle down did we discover the possibilities for establishing much larger societies based on frequently repeated creeds and rituals."
The big question, according to Whitehouse, is whether the "glue" of ritual can be extended to humanity at large.
At first glance, the internet might seem to offer possibilities. Does Facebook count as a ritual? Are the social media universal enough and adhesive enough to bind together California Valley Girls and Afghan warlords? Is any imaginable global ritual cohesive enough to overcome already deeply entrenched alliances? Will Muslims and Christians ever sing kumbaya together?
If not the internet, what about consumerism? Consumerism seems to be doing a pretty good job overcoming ancient animosities between Asia and the West. Can the shopping mall be our new temple/cathedral?
Or maybe Dancing Matt?
I'm not optimistic. The key might be in the Whitehouse quote above: "bound us together and pitted us against." Whatever ancient cultural, perhaps innately behavioral, influences incline us to ritual, they were undoubtedly forged in a dynamic of "us" versus "them."
Rituals can divide as forcefully as they unite.
Thursday, April 11, 2013
A month or two ago, I was describing here how my daughter can't grasp why I "cling to religion," specifically the Catholicism of my youth. I tried, apparently unsuccessfully, to make her understand that I don't cling, that I have put the theology firmly behind me, that I am as robust a practicing agnostic as she is herself.
But, yes, an aura of Catholicism clings to me, like a scent of incense or a smudge of chrism. And, frankly, I'm perfectly comfortable to have it there. Smoke and oil: why not? A sprinkle of water: asperges me.
The rituals of my youth had a purpose, not necessarily the conscious purpose of the celebrant and participants. The Oxford anthropologist Harvey Whitehouse calls rituals "the glue that holds social groups together." In my case, they were the glue that made me part of a universal church, ancient and hoary with certitude, the true religion.
Whitehouse describes two kinds of rituals: 1) "Doctrinal" rituals that bind together large groups that need not meet face-to-face; these rituals are easily taught to children, and can be as various as religious rites, Saint Patrick's Day parades, and reciting the Pledge of Allegiance to the flag; and 2) "imagistic" rituals, often secret and traumatic, that forge tight, small, mutually-dependent communities, such as cults, military platoons, fraternities, sports teams, or terrorist cells.
If I can play with Whitehouse's "glue" metaphor: doctrinal rituals are paste, and imagistic rituals are epoxy.
The rituals I was brought up with were clearly of the former sort. They pasted me into a global, indeed eternal, collage. The paste was not adhesive enough to keep me attached to the doctrine, but the rituals themselves adhere. Show me an ex-Catholic anywhere on Earth of a certain age who can listen to the Tantum ergo, say, and not feel the tug of ritualization.
(Article on Whitehouse and colleagues in the 24 January Nature.)
Wednesday, April 10, 2013
Tradition calls it the first poem written in Ireland, by a Milesian prince named Amergin, who, with his brothers Evir, Ir and Eremon, colonized the island many hundreds of years before the time of Christ. I used my own version of the poem (made with the help of Irish-speaking friends, and drawing on several published translations) in Honey From Stone:
I am the wind on the sea,
I am the ocean wave,
I am the sound of the billows,
I am the seven-horned stag,
I am the hawk on the cliff,
I am the dewdrop in sunlight,
I am the fairest of flowers,
I am the raging boar,
I am the salmon in the deep pool,
I am the lake on the plain,
I am the meaning of the poem,
I am the point of the spear,
I am the god that makes fire in the head.
Who levels the mountains?
Who announces the age of the moon?
Who has been where the sun sleeps?
Who, if not I?
Let it be said at once that any "translation" from so ancient an oral source is problematic, and even so there have been many interpretations of the poem. Still, to a boy who grew up on Catholic litanies, the form is clear. The poem is a kind of prayer, a prayer of acknowledgment and praise to whatever it is that infuses the world with meaning.
Douglas Hyde, the famous scholar of the Irish language (and first president of Ireland), made the canonical translation, and who am I to dispute him? But his version is less than poetic, and has a few quirks that strike me as wrong. Where I chose "I am the meaning of the poem," Hyde has "I am a word of science," although the very notion of science, as we understand it, would not come along for millennia. And he capitalizes the word "god," which seems to me another anachronism in the context of a polytheistic culture. Likewise, he has "God who" where I chose "god that," which seems more in keeping with the rest of the "I am"s.
The poem strikes me as pure pantheism. God or gods in and of the world -- good and bad, peace and strife, beauty and ugliness. It is part of the ancient Celtic consciousness that there is a mysterious power afoot in the landscape, sometimes called neart, that can be used for good or bad, and the gods were simply a way of giving an anthropomorphic face to a force that was otherwise beyond human knowing or control.
In Celtic thought, neart is everywhere -- in sky, Sun, Moon, earth, sea, animal, plant, stone. Even the gods were caught up in the web of this mysterious power. Neart was not so much something one thought about as felt -- sensed as one sometimes senses a presence in a dark room at night. In certain places and at certain times the felt presence is especially strong, in forest glades, perhaps, or by deep clear mountain pools.
I am willing to sing with Amergin in praise of neart. Immanent, yet mysterious. Not diminished by knowledge, but broadened. Addressed, if at all, by a kind of inarticulate awe. It is not enlightenment one feels in the presence of neart; rather, one is reminded of one's ignorance. Most of all, one feels caught up in something that reaches into (or out of) every part of one's being, not just the reason, or the will, or self-awareness, but the senses, the viscera, the lusts and longings of the human heart.
Tuesday, April 09, 2013
Last week the New York Times had a full two-page ad for Game of Thrones, an HBO television series now in its third season. The blurbs were fantastic. Sounded like the greatest thing since sliced bread.
To tell the truth, I had barely heard of the series, which is based on a Tolkienesque series of fantasy-fiction books by George R. R. Martin. We only have television half of the year, and HBO not at all. But I noticed the college library has Season One on DVD and thought I would check it out.
The first five minutes offered two decapitations and a spread of hacked up body parts. That was it for me. I have no stomach for violence.
Is it nature or nurture? Am I missing the slasher gene? Or was I brought up with a wussy aversion to gore?
When I was writing Valentine I became deeply interested in how the so-called "civilized" Romans, masters of literature, law, architecture, and engineering, could rely for their public entertainment on the butcheries of the arena. I explored this topic in the novel, but found no satisfying resolution of the paradox. If Valentine were made into a movie, I wouldn't be able to watch it.
With the rise of Christianity, the gladiatorial gore declined, but not, apparently, our taste for butchery. Public executions of heretics and criminals continue in many parts of the world. Even in the enlightened democracies slasher movies and blood-spattered video games are hugely popular. Cable television seems intent on pushing the boundaries of violence as far as they can go. The Hollywood moguls seem to know that deep down in some reptilian part of our brain we love the spilling of blood and guts.
With computer-assisted graphics it is now possible to render violence as vividly on screen as in real life. When it's impossible to tell the difference between the real thing and the simulation, is there a moral equivalence in watching? Are we really more advanced than the Romans in our taste for entertainment?
(I know my last two sentences are problematic, but I toss them out there to stir the pot.)
Monday, April 08, 2013
My ignorance grows by leaps and bounds.
I try to keep up. I read Science and Nature every week. I check the new-book shelves in the college library for anything of interest. I do a modest amount of surfing science sources on the internet. Yet I fall farther and farther behind. The store of reliable human knowledge accumulates exponentially faster than my ability to assimilate.
So I mostly tend my bailiwick. My own little garden of verses. The wispy bits of universe that cling to my life like an aura of places been, things seen, syllables heard. Monarch butterflies, for instance.
It was just a few weeks ago I wrote about the monarch migrations and new threats to their existence. The chopping down of their overwintering refuge. The pesticides. The vanishing plants on which they feed and breed.
And the mystery -- of how a slip of tissue can fly 2500 miles to a place it has never been before.
Now this. They migrate by a Sun compass (no big surprise) mediated by circadian clocks located in their antennae. Off they go, each autumn, with map, compass and clock encoded in their genes. No guide to show them the way, except what they were born with. ATTCTGCCATGC…
That the compass and clock evolved together, leading butterflies from all over eastern North America to the same place in central Mexico, seems a miracle beyond imagining. But I try to imagine it. Because, for the moment, there is no alternative.
And this. The Sun compass that guided them south must be reset for the northward journey. It turns out that it is temperature that resets the compass. The chilly temperatures of the Mexican winter swing the needle around to point away from the Sun.
Back to my New England meadows.
Now the monarchs have something else to worry about. Global warming.
Saturday, April 06, 2013
(This post originally appeared in December 2006.)
My mother, bless her, often quoted the poets she memorized as a young woman studying English literature at the University of Chattanooga. Even until the week she died, earlier this year at age 92, the words were fresh in her memory and on her tongue, such as these lines from James Russell Lowell, written sometime in the mid-19th century:
New occasions teach new duties,
Time makes ancient good uncouth.
They must upward still and onwards
Who would keep abreast of truth.
Lowell was speaking of slavery, and his lines became part of a popular Protestant hymn, no doubt Unitarian. The slave trade that I spoke of here these past few days was, of course, initiated and carried on by good Christian men. When the American Founding Fathers met in Philadelphia to hammer out a constitution for the new republic, slavery was the elephant in the room, recognized by many as an abomination, but generally ignored in the deliberations by tacit agreement that no federal arrangement was otherwise possible. (The representative from Georgia insisted that the Bible placed its benediction on the institution of slavery.) Madison, Jefferson, Washington, and many others we take to be paragons of virtue were slave holders. Seventy years would pass, and a horrendous civil war be fought, before the "ancient good" became at last by law uncouth.
Another century later, when I was growing up in Chattanooga, it was thought good and appropriate by my white neighbors that blacks and whites be kept strictly separate in schools, churches, and public facilities. And heaven forbid that blacks might exercise their right to vote. Thanks to the civil rights movement of the 1960s that "good" too became uncouth. And so it goes, onwards and upward, not so much keeping abreast of truth as making up truth as we go.
The very idea of the progress of truth is associated with the Scientific Revolution of the 17th century. Until that epic transformation of human culture, truth was defined by the authority of the past, as embodied in ancestors, holy books, or divinely appointed prelates and kings. Which is not to say that liberal spirits had not always been with us, but not until the time of Bacon and Galileo did it become common to suppose that ancient truth might be amendable. The Earth-centered universe was a venerable truth that Copernicus, Kepler and Galileo showed to be untenable. And if the Earth could be moved from its central position in the cosmos, then everything else was up for grabs -- the divine right of kings, the power of the Church to burn heretics, slavery.
By its commitment to an open-ended search for truth, science has been the great engine and friend of political and religious liberalism. It is no coincidence that the present foes of science in American public schools are the same unyielding adherents to ancient authority who would deny, for example, the civil rights of gays or a living wage for the poorest of the poor.
Friday, April 05, 2013
If you want a book to put in opposition to Thomas Nagel's Mind and Cosmos, which we discussed a week or so ago, you could do no better than Gerald Edelman's Wider Than the Sky. Edelman is a Nobel Prize-winning neuroscientist who has written widely on the physiological origins of consciousness. Wider Than the Sky is a short book, like Nagel's, and intended for the same popular audience. It is not, however, as easy a read, for the reason that doing real science takes more effort than does philosophizing off the top of one's head.
Edelman does not prove that consciousness can be reduced to Darwinian materialism -- it's too early for that -- but he offers a good case for not supposing otherwise.
But right now I want to consider his curious title: Wider Than the Sky. Where does it come from? What does it mean?
From an 1863 poem of Emily Dickinson:
The Brain--is wider than the Sky--
For--put them side by side--
The one the other will contain
With ease--and you--beside--
The Brain is deeper than the sea--
For--hold them--Blue to Blue--
The one the other will absorb--
As Sponges--Buckets--do--
The Brain is just the weight of God--
For--Heft them--Pound for pound--
And they will differ--if they do--
As Syllable from Sound--
It is sometimes argued that we will never have a full scientific understanding of consciousness because the brain is only as complicated as the thing it is trying to understand, and with much more to do besides. That is to say, consciousness involves more of the brain's circuitry than the part of the brain that might contrive and contain a physiological description of consciousness. Like a sponge trying to mop up a spill larger than the sponge.
But this misunderstands the nature of scientific understanding, as Edelman makes clear. We will never be able to predict the weather in every particular -- every gust and raindrop -- but we have a pretty good understanding of the physical principles that control the weather, and no reason to suppose that any gust or raindrop is not in principle reducible to laws of temperature, pressure, condensation, and so on.
The world and the brain contain each other, as Dickinson supposes. The brain evolved in the world to contain as much of the world as possible within the physical limitations of the organism. How it works is yet to be discovered, but I for one applaud President Obama's brain mapping initiative. The tools are ready. Theoretical structures are in place. The buckets and sponges wait to be filled.
Thursday, April 04, 2013
Started to toss out this 7 December issue of Science and took another long look at the cover. What is this thing? Whatever it is, it is beautiful.
End-on view of the atomic model of the bacterial actinlike ParM protein double-helical filament, generated from an electron microscopic reconstruction. A bipolar spindle of antiparallel ParM filaments pushes plasmids to the cell poles, constituting the simplest known apparatus for the segregation of genetic information. The loops on the outside of the 8- to 9-nanometer-thick filaments are involved in spindle formation.
So we are looking at a computer-generated, artificially-colored, schematic representation of molecular machinery that assures genetic information will be properly copied and allotted to daughter cells when a cell divides by binary fission. The cell, in this case, is E. coli, a common bacterium of the human gut that is generally benign.
How big? One-hundred thousand of what you see above could line up across the period at the end of this sentence.
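That scale claim checks out on the back of an envelope. A minimal sketch, assuming the filament's stated 8- to 9-nanometer thickness and a printed period roughly half a millimeter to a millimeter across:

```python
# Back-of-the-envelope check: 100,000 ParM filaments side by side.
filament_width_m = 8.5e-9      # midpoint of the 8-9 nm thickness in the caption
count = 100_000
total_width_mm = filament_width_m * count * 1000  # meters -> millimeters
print(f"{total_width_mm:.2f} mm")  # 0.85 mm, about the width of a printed period
```

So a line of a hundred thousand filaments spans a bit under a millimeter, comfortably within the footprint of a period on the page.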
Is it fair to call this stuff "machinery"? Why not? The authors of the article in Science do. Nature invented machinery before we did. Are we ourselves just big, hugely complex molecular machines? Are we seeing in the image above a smidgen of the human soul? That's for you to decide, but as for me, I have no problem at all. The protein filament pictured above inspires in me a lot more awe than the images of angelic spirits that decorated my youth.
But you protest: Isn't it demeaning of human nature to suppose that we are just windstorms of molecules spinning and weaving, atoms coalescing and disengaging, ParM filaments pushing plasmids?
And I say: Wow! Fabulous! Especially the more we learn about it.
Molecular machines who think and love and make art and science. Molecular machines who delight in beer and pizza, have sex, sing and dance. Molecular machines who wonder about the meaning of it all, and find stunning beauty in an end-on view of a ParM filament. Molecular machines who can figure out what's going on in cells smaller than the point of a pin.
If it is true that I am a fabulously complex molecular machine, it doesn't so much diminish my humanity as it elevates the mechanical metaphor.
Wednesday, April 03, 2013
There he is on the cover -- that messy hayloft of hair, the impish grin, like a boy who just pulled a prank on his teacher. We know at once who we are looking at: Richard Feynman, the clown prince of physics, now sadly departed.
I'm holding in my hand The Pleasure of Finding Things Out, a collection of Feynman's shorter writings, published in 1999. That seems an eternity ago now, and, as I read, Feynman seems like a throwback to a vanished era.
He was, of course, a brilliant physicist. As someone who as a graduate student sat in on a few of his classes, and later used his texts in my teaching, I can vouch for the fact that he was an incomparable teacher. But what I'm reading now is his essay on science and religion, a sometime topic of this blog. His physics may be unexcelled, but when it comes to the big philosophical problems of life, on which he was sometimes impelled to make pronouncements (one of his popular books is called The Meaning of It All), he is just a schmuck like the rest of us.
And yet, and yet…
As I read his essay on science and religion, I'm struck by two qualities so often absent from the recent debates.
First, an unrelenting emphasis on our ignorance, which Feynman insists is where science begins. Uncertainty is vital to discovery, he says, and it is only natural that for a scientist this should carry over into matters of faith. It also means that Feynman treats believers with a gentle respect. Hey, even our most confidently held knowledge is tentative and evolving.
Second, the wit. One never gets the impression that Feynman takes himself too seriously. No smirky condescension, no "take-no-prisoners" atheism. If you're not having fun, he seems to suggest, you should find something else to do.
Humility and humor generally go together. One can be confident without being dogmatic, and one can be serious with an impish grin.
Tuesday, April 02, 2013
That's what everyone in East Tennessee wanted to know during the Second World War. A huge industrial complex and a good-sized city had been built in the middle of nowhere. Tens of thousands of people were employed. And no one seemed to know what was going on. I remember my father, an engineer, puzzling over the mystery.
People hired on without a clue what they would be doing, and didn't know what they were doing while they did it.
"Ever'thang's goin' in, and nuthin's comin' out."
As an old East Tennessee boy I've been reading with interest Denise Kiernan's new book, The Girls of Atomic City, the story of the women who lived and worked at Oak Ridge. Except it wasn't known as "Atomic City" at the time. The word "atomic" wasn't mentioned.
Then, sixteen hours after the first atomic bomb obliterated Hiroshima, President Truman informed the nation of the biggest secret of the war. At last the folks at Oak Ridge, and the other atomic sites, knew what they had been doing.
According to Kiernan, Elizabeth Edwards, the librarian for Oak Ridge, recruited by the government from the New York Public Library system, went immediately to the encyclopedia and took down the volume for "U". It fell open to the page for "uranium," the spine bent and broken from being opened so many times to the same page. Clearly, a substantial number of chemistry-savvy people had made a shrewd guess about what was going on.
Elizabeth Edwards! Could this be the same Elizabeth Edwards who in the early 1950s, as head of the Chattanooga Public Library, gave me my first job, as a sixteen-year-old stack boy? An encyclopedia wouldn't help answer that question, but a few seconds with Google -- "Elizabeth Edwards"+librarian+Chattanooga -- gave the answer. (I should say that although Miss Edwards gave me the job, the fact that my aunt was on the library's staff was not irrelevant.)
No more secret trips to the library to suss out government secrets. Whatever you want to know is probably on the internet and you can be sure someone is watching if you go snooping.
Monday, April 01, 2013
Over the past few years, I have developed a tremor in my right hand. Only my right hand. And I'm right-handed.
This means it takes me at least twice as long to write -- the usual time to type, then again as much time or more to correct the typos. Touch-screen devices: forget it. Type an A, say, and I get AAAA.
Diagnosis: non-Parkinsonian essential tremor. My docs have tried three drugs; none worked. So I shake. And no one knows how to stop it. Do you?
But now Bill Clinton has the same problem. Bill Clinton! My tremor is almost stylish.
Well, not exactly. It's still a pain in the butt. But it has one redeeming quality: Since my hand started shaking, I've lost fifteen pounds. I can eat and drink all I want without worrying about gaining weight. My hand burns up the calories. Or at least that's what I choose to believe.
I'm thinking about writing a book. Shake Yourself Thin.
Whisking eggs. Sprinkling salt. Wobbling the wok. Mixing dressing. Dicing veggies.
And the drinks? Shaken, not stirred.
I'm hoping Bill will give me a blurb.