Wednesday, June 9, 2010

My Advice? Don't Listen to My Advice

How should I know whether the climate is warming due to human activity?

Or what specific measures would best resurrect today’s sour economy? Or whether the Supreme Court’s latest decision is the correct one? Or whether vaccines cause autism? Or whether escalating, or scaling down, is the right way to go in Afghanistan? Or whether the Israeli or Palestinian version of the story is the right one?

There are people with Ph.D.s working on all these questions. So how could I have the hubris even to have an opinion on such matters?

Let’s face it. I’m just not informed enough about most of the many important issues of our day to be entitled to an opinion. And even where I am reasonably informed, I’m just not smart enough to be confident of any opinion. The issues – the data, the variables, the formulae -- are far more complex than my own brain is smart.

About the only thing I am confident about is that I ought not be very confident about anything.

All this is obvious to me. But here’s what’s scary.

I myself have a Ph.D. Admittedly not a very relevant one; it’s in philosophy. But my degree means that at least someone thought me capable, at least to some degree, of grasping difficult things, of making and evaluating rational arguments. I also think my I.Q., not to mention my SAT scores, puts me somewhere around the 97th percentile in such matters. (Though how should I know if these are meaningful measures of anything?) I even got to portray not just a genius, but The Genius, on the David Letterman show a few years back. I may not be an Einstein, but I’m not exactly the dimmest bulb on the planet either.

So if I’m not entitled to an opinion on today’s issues, then who, exactly, is?

Well, there are those 3% of the population above me on the scales – those smarty-pants. And quite a few of them even have Ph.D.s, even relevant Ph.D.s.

There’s just one problem.

These brainiacs never seem to agree amongst themselves on any of the important issues. Just about every side of every issue has its experts, its authors, its talking heads. Here am I, regularly following the news, reading the columnists and blogs, occasionally even making my way through long articles in the monthlies. The first guy makes what seems like a pretty compelling case; the next lady makes an equally compelling case for the opposite conclusion.

And I’m supposed to sort it all out, to figure out which side to sign onto? Me, with my measly Ph.D., outrageous test scores, and celebrity Genius career?

Lately I’ve realized that what I tend to respond to isn’t the actual force of the competing arguments I’m hearing so much as the degree of confidence each talking head projects. Yet even the great Isaac Newton – as much an Einstein as Einstein himself – recognized that his own achievements were comparable to finding a smoother pebble than ordinary “whilst the great ocean of truth lay all undiscovered before me.” And if this groundbreaking thinker saw that the universe was far more complex than even his brain was smart, then just where do these cocky 3-percenters get off projecting so much confidence in their opinions?

About the only other thing I am confident about is that no one ought to be very confident about anything.

The big problem, of course, is what I’m supposed to do with this confidence (or lack thereof).

I think I need to consult an expert.

Wednesday, April 21, 2010

Ignoratio Elenchi (1636-1710)

As this year marks the 300th anniversary of his death, now seems as inappropriate a time as any to remind us of one of those long-forgotten dead white males who give being a dead white male the reputation it deserves.

Who was he?
Ignoratio Elenchi was an Italian rationalist, empiricist, and skeptic, and the primary inspiration for the earliest modern anti-humanists. Hegel cited him as one of the thinkers it’s most important for philosophers to overlook; Marx reserved his harshest words on Hegel for a savage critique of that understatement. Nietzsche wrote an entire sentence about him, the famous exclamation, “Ignoratio Elenchi is dead!” (Apparently Nietzsche was speaking at a ceremony celebrating the 179th anniversary of Elenchi’s death.) In 1982, the American Philosophical Association did Elenchi the unique honor of naming one of the most bone-headed logical fallacies after him.

His life
Elenchi was born in a village just outside Rome better known for its production of manure than for that of philosophers. He didn’t get beyond second grade in his formal education, and even that achievement, Spinoza speculated in a letter to Descartes, was probably only due to grade inflation. Nevertheless he achieved some early notoriety for a series of letters he wrote protesting the government’s provision of housing and food for disabled orphans, arguing that they would never learn to fend for themselves if their basic needs were taken care of for them. (Lost for centuries, these letters only recently resurfaced in the bedside drawer of former U.S. President George W. Bush.) Elenchi’s lack of education prevented him from obtaining an academic position, and so he supported himself by working in his village’s main industry. Hume wryly observed a century later, in a long essay on Elenchi’s legendary personality disorder, that “the keenest eye could scarce discern where ‘tis, exactly, that Elenchi’s remunerative work leaves off, and his intellectual work, as such, commences.” The final years of Elenchi’s life saw him in a bitter dispute with Leibniz, instigated when Leibniz claimed to have stolen the idea of the calculus not from Elenchi but from Newton, upon which Elenchi grabbed Leibniz’s wig right from the great man’s head and tossed it out the window, exclaiming “I refute you thus!” (Leibniz recounted the affair in an interview years later, conceding that Elenchi’s refutation was probably devastating, if only anybody could figure out what the hell he was talking about.) Elenchi died in a village just outside Rome better known for its disposal of disagreeable wastes than for that of philosophers. The precise cause of death was unknown, but Feuerbach, in an early 19th-century journal entry, speculated that Elenchi’s might have been the only credible case of death due to a self-inflicted sexually transmitted disease.

His work
Elenchi wrote prolifically, mostly angry letters he sent to all the leading intellectuals of his day. These were collected and published posthumously, in a volume entitled The Prosecution’s Case Against Ignoratio Elenchi. Many had a similar structure. Elenchi would begin by praising the thinker’s work, noting that he himself had been thinking along similar lines; he’d then suggest that, perhaps, the thinker in question was familiar with Elenchi’s own work; he’d then accuse the thinker of stealing his work; and then wrap up by asking to borrow some money. In fact Elenchi adopted as his own some of the most famous doctrines of his day: Cartesian dualism, Malebranchian occasionalism, Leibnizian theodicy, and so on. Nevertheless he did develop at least one original doctrine, his idiosyncratic version of pantheism -- according to which God was, quite literally, a particular kind of iron skillet. This doctrine remains alive and well to this very day in the recently founded religion “Pastafarianism,” which acknowledges that there could have been no Flying Spaghetti Monster to create the cosmos if there had been no prior divine vessel in which He was prepared.

Wednesday, January 6, 2010

What You See is NOT What You Get

People regularly tell me to come to my senses, but the philosopher in me thinks we should run as far from our senses as we can.

To concentrate just on vision, our shifty little eyes deceive us all the time. A tower that is in fact square may look round from a distance. Our bedsheets look spotless yet harbor more hungry dust mites than we want to know about. The moon looks larger on the horizon than overhead, and yet it isn’t. A straight stick in water looks bent. The sky looks blue when in fact it consists only of air molecules which aren’t themselves blue. When we watch a movie, objects seem to move across the screen when all we’re actually seeing is a rapid sequence of still pictures. And finally, that dining room table we paid a month’s salary for, with what looks like a solid cherry surface? In fact it’s composed mostly of the empty space inside its atoms. Suckers!

Indeed the whole idea that our eyes can tell us how things really are doesn’t make a lot of sense. Our perceptions are constantly varying, for one thing, without our having any basis for choosing one perception to be the “true” one. In fact (for example) I shouldn’t have suggested above that the stick “really is” straight since even that information only comes from other conflicting perceptions. Instead we should just say that to our visual perception the stick looks crooked while to our tactile perception of it under the water it feels straight. There is no way of saying how things “really” are. We can only say how things appear to us in different circumstances.

Even more importantly, to confirm that our visual perception of a thing is accurate we’d have to compare that perception with the thing itself. But how can we do that? Every time we look at the thing we only get another perception of it, and never the thing itself!

Things are simply not, in short, as the eyes have it. So next time you’re told to come to your senses—say nay!

Tuesday, December 22, 2009

The Paradox of the Preface, and World Peace

Reprinted from Religion Dispatches
[http://www.religiondispatches.org/archive/oped/2127/]

The epiphanies came as sudden and strong as my newborn’s projectile spit-up. Yes, I realized, I SHOULD wear a burping cloth. And also, the Paradox of the Preface is the key to universal religious harmony and world peace.

It might also help you lose that weight, quit smoking, and find the man or woman (or both) of your dreams, but that is for another essay.

What is the Paradox of the Preface?

Imagine an author writing something like this as a preface to her work:

I am certain, of each and every sentence in this work, that it is true, on the basis of various considerations including the careful arguments and use of evidence which led me to it. And yet I recognize that I am a fallible human being, likely to have made some error(s) in the course of this long work. Thus I am also quite certain that I have made some such error somewhere, even if I cannot say where.

Such a refreshingly honest preface! So what is the paradox?

Well, there is the implicit, apparent contradiction. To believe of each and every sentence that it is true is to believe, in effect, that not one of the sentences is false; but to believe that there is at least one error in the work is to believe that at least one of the sentences is false, and thus to contradict the first belief.

And yet both beliefs can seem so plausible! Indeed—and here’s the key—even after we become aware of the implicit contradiction, both the contradictory beliefs remain quite appealing in their own right.

Thus the paradox.
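The tension can be made vivid with a toy probability model (my own illustration; the numbers are arbitrary assumptions, not part of the original paradox): even an author who is 99% confident in each individual claim should have very little confidence that a few hundred such claims are all true.

```python
# Toy model of the Paradox of the Preface (illustrative numbers only):
# an author makes 300 claims and, treating them as independent,
# holds each one with 99% credence.
n_claims = 300
credence_each = 0.99

p_all_true = credence_each ** n_claims   # chance that every claim is true
p_some_error = 1 - p_all_true            # chance of at least one error

print(f"P(every claim is true): {p_all_true:.3f}")
print(f"P(at least one error):  {p_some_error:.3f}")
```

On these assumed numbers, near-certainty in each claim coexists with roughly 95% confidence that the work contains an error somewhere. The paradox proper arises when both attitudes are treated as outright belief rather than graded credence, since then they flatly contradict each other.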

But the key to world peace?

Well, there may be a number of ways to respond to this paradox. Amongst them, you might take the certainty in your fallibility to undermine the certainty in any or all of your particular individual beliefs. My thought here is that those who take this route are not the ones primarily responsible for disturbing the global peace. Or you might take the certainty in each particular belief to take away the certainty of your general fallibility. My thought here is that the folks who go this route—convinced of their infallibility—are generally the troublemakers.

My hope, however, is that these same people might, just might, be open to a third option, if only they were aware of it.

What I suggest, instead, is that we simply acknowledge the paradox: that is, recognize that both contradictory propositions are, in their own right, extremely plausible. In the preface case this actually seems quite easy to do. My ultimate hope, then, is that world peace will break out when enough people simply acknowledge the paradox as well and begin applying it more generally.

Why is that?

Because acknowledging the paradox allows you simultaneously to say two things.

Choose some important, life-governing, very controversial thing you happen to believe in with great fervor: the existence of God (or perhaps atheism), the truth of Christianity (or perhaps Islam or Hinduism, etc.), absolute morality (or relativism), the right to bear arms (or the government’s right to regulate them), etc. Focusing on religion as our example, you can now say, first, that you believe, with certainty, in the truth of (say) Christianity, and thus believe, with equal certainty, in all the things entailed by that belief: that, say, all other competing religions are simply false.

But then you can say, second, something else: that you may be wrong.

Got it? You can simultaneously be certain that Christianity is true and everything conflicting with it is false, and yet acknowledge that you may be wrong without taking away your certainty. You can thus keep your certainties without having to claim that you are, in fact, and grossly implausibly, infallible. It’s what everyone (other than bakers) has yearned for since time immemorial: the proverbial cake, both eaten yet had!

Imagine, now, that all parties came to acknowledge the Paradox of the Preface as well. Then THEY could say that they are certain that (for example) Islam is true and everything conflicting with it is false—and yet acknowledge that they may be wrong without taking away their certainty.

Everyone could get what they most want: namely, certainty in the truth of whatever it is they are certain is true. This certainty can lead people to do all the things they should do when they are certain of a thing: defend it, live in accordance with it, try to spread it, etc. But once you add the proviso “but I may be wrong” you might, just might, no longer do it in quite the rather unpleasant or sometimes violent way that such things are often done.

Thus universal religious harmony and world peace.

I refer to this overall perspective as “Humble Absolutism.” You may believe, with certainty, in the truths in question, and that they are absolute truths. But you do it with the form of humility appropriate to the recognition that your belief might be false.

Now let me go find that burping cloth.

Friday, December 4, 2009

The rational thing to do is to act irrationally

Consider the following scenario. There are two closed boxes. You may choose Box 2 alone or both boxes. Box 1 contains $100. Box 2 contains either zero or a million dollars, depending on what a certain “Predictor” has predicted. If she predicted you will take Box 2 alone, she put $1M into it. If she predicted you’ll take both boxes, she left Box 2 empty. The Predictor has already done her work and left the room.

One further piece of information: A billion people (say) have gone through this experiment before you. And the Predictor has so far predicted correctly for every one.

What then is the rational choice for you to make?

Well, if she has also predicted your choice correctly, then if you take Box 2 alone she’ll have put $1M in it and if you take both boxes she’ll have left Box 2 empty, yielding you only the $100 from Box 1. So it seems rational for you to take Box 2 alone.

But on the other hand, the Predictor has done her work and left. Right now Box 2 has either zero or $1M in it. If it has zero, you’re better off taking both boxes because then at least you’ll get the $100 in Box 1. If it has $1M, then again you’re better off taking both boxes because you’ll get the $1M plus the $100. So either way you’re better off taking both boxes. So the rational thing to do seems to be to take both boxes!

So which to choose?

Admittedly it seems unbelievably improbable, with her impressive track record, that the Predictor will predict wrongly for you, but in fact it is not absolutely impossible. But the second argument exhausts all the logical possibilities. It is literally impossible for that reasoning to go wrong. And when you must choose between what’s unbelievably improbable to go wrong and what’s impossible to go wrong, the rational person must choose the latter.
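The pull of the first argument can be sketched as a quick expected-value calculation (the dollar amounts come from the scenario above; the Predictor's error rate of one in a billion is my own illustrative assumption based on her track record):

```python
# Expected dollar value of each choice in Newcomb's Problem,
# assuming the Predictor errs with probability 1e-9 (illustrative).
accuracy = 1 - 1e-9

# Take Box 2 alone: predicted correctly -> $1M inside; wrongly -> empty.
ev_one_box = accuracy * 1_000_000 + (1 - accuracy) * 0

# Take both boxes: predicted correctly -> empty Box 2 plus $100;
# predicted wrongly -> $1M plus $100.
ev_two_box = accuracy * 100 + (1 - accuracy) * 1_000_100

print(f"EV(Box 2 alone): ${ev_one_box:,.2f}")
print(f"EV(both boxes):  ${ev_two_box:,.2f}")
```

The dominance argument, by contrast, treats the contents of Box 2 as already fixed and so ignores these probabilities entirely. That the two styles of reasoning endorse opposite choices is precisely Nozick's puzzle.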

So you take both boxes. And you know what happens: for the billion-and-first consecutive time the Predictor predicted correctly and left Box 2 empty. You slink home with your paltry $100 instead of the $1M you’d have received had you taken only Box 2, having only the small consolation of knowing that you at least did the rational thing.

Unless the rational thing, in this case, would have been to act irrationally.


Source: Robert Nozick, "Newcomb's Problem and Two Principles of Choice," in Nicholas Rescher, ed., Essays in Honor of Carl G. Hempel (Dordrecht: D. Reidel, 1969), p. 115.

Monday, November 16, 2009

Really Moved, By The Unreal

I’m a weeper. I rarely make it through a decent book, or a movie, without the tears flowing. I bawl like a baby when Jimmy Stewart begs Clarence, in It’s a Wonderful Life, to let him live again. In the cinema I could not suppress an embarrassingly loud sob when the Beast, astonished, murmurs to the Beauty, “You came back, Belle; you came back.” And Humphrey Bogart putting Ingrid Bergman on that Casablanca plane? Always good for at least three hankies.

What I don’t understand is why.

Why am I moved when the joys and sorrows in fact are not my own—nor even real?

One idea, perhaps, is that when immersed in a movie we temporarily forget that we’re observing a fiction. But that seems hard to accept. If I’m watching a DVD I may well get up, make a phone call, then resume watching and weeping. Or I might continue to munch on popcorn right through my tears. I certainly wouldn’t do any of those things were I witnessing some real-life sorrow. And similarly I might be moved to experience great fear when watching Jurassic Park—yet I’m never tempted even slightly to run screaming from the cinema, which I surely would do were I even briefly forgetting that those raptors aren’t real.

Another idea is that we are moved out of empathy or compassion. After all, I rarely make it through the evening news, either, without weeping at whatever terrible events are reported, and those events are perfectly real. Yet even so, it seems, the question remains. The pain I may learn about this way is not my pain. The awful events depicted did not happen to me, nor, typically, have I even experienced anything very similar in my own life. To say I have empathy is to say that I am moved by those stories. But it is not to explain why I am moved.

And surely not to explain why I am moved by things which aren’t real.

So no, nobody is really put on a plane when Bogart puts Bergman on that plane; and nobody really comes back when Belle the Beauty comes back. But for some reason that doesn’t stop me from really reaching for yet another box of Kleenex.

Monday, October 26, 2009

St Thomas Aquinas (1225-1274)

A century before the Black Death swept through Europe, a crisis of another sort was brewing. The long-lost works of Aristotle were suddenly rediscovered. You wouldn't think that the discovery of some dusty old philosophical manuscripts would be comparable to the arrival of the bubonic plague, but a lot had happened since those manuscripts had gone missing. Mainly, Christianity. And Aristotle’s works seemed to contradict some of its key tenets, including that one about a God who created the cosmos. Not surprisingly, the Church initially banned Aristotle’s works.

But we know what happens when you ban books. Before long everyone was reading Aristotle -- and the Church was forced to try a new strategy: to embrace him, by finding ways to reconcile him with Christian beliefs. And no one was better at that task than Thomas Aquinas. By the 14th century, Aristotle had gone from being banned to being required reading at the universities!

Aquinas wrote prolifically, culminating in his great masterpiece Summa Theologica, a massive work summarizing all of Christian doctrine -- at least as understood by Aquinas. He did not, however, complete this great work. On December 6, 1273, during Mass, he underwent some mystical experience, upon which he suddenly ceased to write -- explaining that all that he’d written now seemed to him “like straw.” He died just months later, at 49.

A short time after his death, Aquinas was canonized by the Church, to become Saint Thomas. His works went on to become the main text; all the rest of Christian philosophy amounts to just footnotes.

Some straw!