Wednesday, April 06, 2011

"Experimental Philosophy and the Problem of Free Will"

That was the title of an article in the March 18 issue of Science. What, I wondered, is experimental philosophy? If it's experimental -- that is, based on reproducible empirical data -- then it's science. And what new, pray, might philosophy -- experimental or otherwise -- have to say about free will?

I read eagerly.

The author begins by saying that the most central philosophical problems concerning free will, morality and consciousness are notorious for their resilience, many of them stretching back to the earliest days of philosophy. In this he is certainly correct. In more than two thousand years, philosophy has contributed precisely nothing to the problem of free will, except to state the problem: Are our actions free or determined, and is freedom necessary for moral culpability?

So what might this new discipline -- experimental philosophy -- contribute?

I quote at random: "According to one hypothesis, the internal motoric signals that cause behavior also generate a prediction about imminent bodily movement, and this prediction is compared to the actual sensory information of bodily motion. If the predicted movement conforms to the sensory information, then one gets the feeling of agency; otherwise the movement is likely to feel involuntary."

Or: If I feel like an action was free, then I think it was free.

At least, I think that's what it means.
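Read as a mechanism rather than as philosophy, the quoted hypothesis is just a comparator: the brain predicts the movement its own motor command should produce, compares that prediction with what the senses report, and tags the action as "mine" when the two agree. A minimal sketch in Python -- the signal values, the mismatch measure, and the threshold are all invented for illustration, not taken from the article:

# Illustrative comparator model of the "feeling of agency" described above.
# The numbers, the mismatch measure, and the threshold are made up for this sketch.
def feels_voluntary(predicted_movement, sensed_movement, threshold=0.1):
    """Return True when predicted and sensed movements roughly agree."""
    mismatch = abs(predicted_movement - sensed_movement)
    return mismatch <= threshold

print(feels_voluntary(1.0, 0.95))  # close match -> feeling of agency (True)
print(feels_voluntary(1.0, 0.30))  # large mismatch -> feels involuntary (False)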

In general, this rather long article says virtually nothing about free will. Rather, it compiles data -- using the methods of the social sciences -- on what people think about freedom and moral responsibility. Whether you call this "experimental philosophy" or "experimental psychology" probably depends on which academic department you're employed by.

Anyway, back to the "problem." If I choose at this moment to kick the cat, is that action intrinsically free, or is it determined by some cumulative chain of cause and effect -- including prior mental states -- over which some hypothesized autonomous "self" has no control? And, if the latter, am I morally responsible for my action?

No one knows the answer to the first question. Whatever concatenations of causality may determine my conscious actions are far too complex to be amenable -- at this point in time -- to experimental analysis. An outside observer cannot predict with certainty whether or not I will kick the cat, even if that action is in fact entirely determined. There are simply too many unknown variables. Massively complex causal determination is not what philosophers traditionally meant by free will, but it is indistinguishable from what philosophers traditionally meant by free will. If it walks like a duck and quacks like a duck, then -- for all practical purposes -- it's a duck.

And the second question? Moral responsibility is a social construct, not a scientific hypothesis. Humans discovered long ago that living peaceably in groups required a notion of individual responsibility. Responsibility implies freedom, real or effective. Society negotiates responsibility.

If there is such a thing as "experimental philosophy," the problems of free will, consciousness and morality are presently beyond its reach. Lots more groundwork will need to be done -- in neurobiology, artificial intelligence, and so on -- before these perennial problems become experimentally tractable.