Tuesday, December 22, 2009

The Paradox of the Preface, and World Peace

Reprinted from Religion Dispatches
[http://www.religiondispatches.org/archive/oped/2127/]

The epiphanies came as sudden and strong as my newborn’s projectile spit-up. Yes, I realized, I SHOULD wear a burping cloth. And also, the Paradox of the Preface is the key to universal religious harmony and world peace.

It might also help you lose that weight, quit smoking, and find the man or woman (or both) of your dreams, but that is for another essay.

What is the Paradox of the Preface?

Imagine an author writing something like this as a preface to her work:

I am certain, of each and every sentence in this work, that it is true, on the basis of various considerations including the careful arguments and use of evidence which led me to it. And yet I recognize that I am a fallible human being, likely to have made some error(s) in the course of this long work. Thus I am also quite certain that I have made some such error somewhere, even if I cannot say where.

Such a refreshingly honest preface! So what is the paradox?

Well, there is the implicit, apparent contradiction. To believe of each and every sentence that it is true is to believe, in effect, that not one of the sentences is false; but to believe that there is at least one error in the work is to believe that at least one of the sentences is false, and thus to contradict the first belief.

And yet both beliefs can seem so plausible! Indeed—and here’s the key—even after we become aware of the implicit contradiction, both the contradictory beliefs remain quite appealing in their own right.

Thus the paradox.
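The tension can be given a quick arithmetic gloss. Here is a minimal sketch, under the purely illustrative assumption that the book consists of many independent claims, each believed to a fixed high degree (the figures n = 500 and p = 0.99 are invented, not from the essay):

```python
# Illustrative sketch: suppose the work makes n independent claims,
# and the author rates each as true with probability p.
# The numbers below are made up for illustration.

def prob_all_true(n: int, p: float) -> float:
    """Probability that every one of n independent claims is true."""
    return p ** n

n, p = 500, 0.99
all_true = prob_all_true(n, p)

print(f"P(every claim true)   = {all_true:.4f}")      # vanishingly small
print(f"P(at least one error) = {1 - all_true:.4f}")  # nearly certain
```

Each claim is near-certain on its own, yet it is almost certain that at least one of them is false. That is the preface author’s predicament, in numbers.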

But the key to world peace?

Well, there may be a number of ways to respond to this paradox. Amongst them, you might take the certainty in your fallibility to undermine the certainty in any or all of your particular individual beliefs. My thought here is that those who take this route are not the ones primarily responsible for disturbing the global peace. Or you might take the certainty in each particular belief to take away the certainty of your general fallibility. My thought here is that the folks who go this route—convinced of their infallibility—are generally the troublemakers.

My hope, however, is that these same people might, just might, be open to a third option, if only they were aware of it.

What I suggest, instead, is that we simply acknowledge the paradox: that is, recognize that both contradictory propositions are, in their own right, extremely plausible. In the preface case this actually seems quite easy to do. My ultimate hope, then, is that world peace will break out when enough people simply acknowledge the paradox as well and begin applying it more generally.

Why is that?

Because acknowledging the paradox allows you simultaneously to say two things.

Choose some important, life-governing, very controversial thing you happen to believe in with great fervor: the existence of God (or perhaps atheism), the truth of Christianity (or perhaps Islam or Hinduism, etc.), absolute morality (or relativism), the right to bear arms (or the government’s right to regulate them), etc. Focusing on religion as our example, you can now say, first, that you believe, with certainty, in the truth of (say) Christianity, and thus believe, with equal certainty, in all the things entailed by that belief: that, say, all other competing religions are simply false.

But then you can say, second, something else: that you may be wrong.

Got it? You can simultaneously be certain that Christianity is true and everything conflicting with it is false, and yet acknowledge that you may be wrong without taking away your certainty. You can thus keep your certainties without having to claim that you are, in fact, and grossly implausibly, infallible. It’s what everyone (other than bakers) has yearned for since time immemorial: the proverbial cake, both eaten yet had!

Imagine, now, that all parties came to acknowledge the Paradox of the Preface as well. Then THEY could say that they are certain that (for example) Islam is true and everything conflicting with it is false—and yet acknowledge that they may be wrong without taking away their certainty.

Everyone could get what they most want: namely, certainty in the truth of whatever it is they are certain is true. This certainty can lead people to do all the things they should do when they are certain of a thing: defend it, live in accordance with it, try to spread it, etc. But once you add the proviso “but I may be wrong” you might, just might, no longer do it in quite the unpleasant or sometimes violent way that such things are often done.

Thus universal religious harmony and world peace.

I refer to this overall perspective as “Humble Absolutism.” You may believe, with certainty, in the truths in question, and that they are absolute truths. But you do it with the form of humility appropriate to the recognition that your belief might be false.

Now let me go find that burping cloth.

Friday, December 4, 2009

The rational thing to do is to act irrationally

Consider the following scenario. There are two closed boxes. You may choose Box 2 alone or both boxes. Box 1 contains $100. Box 2 contains either zero or a million dollars, depending on what a certain “Predictor” has predicted. If she predicted you will take Box 2 alone she put $1M into it. If she predicted you’ll take both boxes she left Box 2 empty. The Predictor has already done her work and left the room.

One further piece of information: A billion people (say) have gone through this experiment before you. And the Predictor has so far predicted correctly for every one.

What then is the rational choice for you to make?

Well, if she has also predicted your choice correctly, then if you take Box 2 alone she’ll have put $1M in it and if you take both boxes she’ll have left Box 2 empty, yielding you only the $100 from Box 1. So it seems rational for you to take Box 2 alone.

But on the other hand, the Predictor has done her work and left. Right now Box 2 has either zero or $1M in it. If it has zero you’re better off taking both boxes because then at least you’ll get the $100 in Box 1. If it has $1M then again you’re better off taking both boxes because you’ll get the $1M plus the $100. So either way you’re better off taking both boxes. So the rational thing to do seems to be to take both boxes!

So which to choose?

Admittedly, given her impressive track record, it seems unbelievably improbable that the Predictor will predict wrongly for you; but it is not absolutely impossible. The second argument, by contrast, exhausts all the logical possibilities: it is literally impossible for that reasoning to go wrong. And when you must choose between reasoning that is merely unbelievably unlikely to fail and reasoning that cannot fail, the rational person must choose the latter.
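The two competing calculations can be laid side by side. In this sketch the $100 and $1M come from the scenario; modeling the track record as a probability q of correct prediction is my own gloss, not part of the original puzzle:

```python
# Expected payoffs in the two-box scenario, assuming (my gloss)
# that the Predictor is correct with probability q.

PRIZE = 1_000_000  # possible contents of Box 2
BONUS = 100        # guaranteed contents of Box 1

def ev_one_box(q: float) -> float:
    # If she predicted one-boxing (probability q), Box 2 holds the prize.
    return q * PRIZE

def ev_two_box(q: float) -> float:
    # If she predicted two-boxing (probability q), Box 2 is empty.
    return q * BONUS + (1 - q) * (PRIZE + BONUS)

q = 0.999999  # a billion correct predictions suggests q is very near 1
print(f"one box:    ${ev_one_box(q):,.2f}")
print(f"both boxes: ${ev_two_box(q):,.2f}")
```

By expected value, taking Box 2 alone wins overwhelmingly; yet whatever Box 2 actually contains, taking both boxes yields exactly $100 more. That is the clash between the two arguments.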

So you take both boxes. And you know what happens: for the billion-and-first consecutive time the Predictor predicted correctly and left Box 2 empty. You slink home with your paltry $100 instead of the $1M you’d have received had you taken only Box 2, having only the small consolation of knowing that you at least did the rational thing.

Unless the rational thing, in this case, would have been to act irrationally.


Source: Robert Nozick, "Newcomb's Problem and Two Principles of Choice," in Nicholas Rescher, ed., Essays in Honor of Carl G. Hempel (Dordrecht, the Netherlands: D. Reidel, 1969), p. 115.

Monday, November 16, 2009

Really Moved, By The Unreal

I’m a weeper. I rarely make it through a decent book, or a movie, without the tears flowing. I bawl like a baby when Jimmy Stewart begs Clarence, in It’s a Wonderful Life, to let him live again. In the cinema I could not suppress an embarrassingly loud sob when the Beast, astonished, murmurs to the Beauty, “You came back, Belle; you came back.” And Humphrey Bogart putting Ingrid Bergman on that Casablanca plane? Always good for at least three hankies.

What I don’t understand is why.

Why am I moved when the joys and sorrows in fact are not my own—nor even real?

One idea, perhaps, is that when immersed in a movie we temporarily forget that we’re observing a fiction. But that seems hard to accept. If I’m watching a DVD I may well get up, make a phone call, then resume watching and weeping. Or I might continue to munch on popcorn right through my tears. I certainly wouldn’t do any of those things were I witnessing some real-life sorrow. And similarly I might be moved to experience great fear when watching Jurassic Park—yet I’m never tempted even slightly to run screaming from the cinema, which I surely would do were I even briefly forgetting that those raptors aren’t real.

Another idea is that we are moved out of empathy or compassion. After all, I rarely make it through the evening news, either, without weeping at whatever terrible events are reported, and those events are perfectly real. Yet even so, it seems, the question remains. The pain I may learn about this way is not my pain. The awful events depicted did not happen to me, nor, typically, have I even experienced anything very similar in my own life. To say I have empathy is to say that I am moved by those stories. But it is not to explain why I am moved.

And surely not to explain why I am moved by things which aren’t real.

So no, nobody is really put on a plane when Bogart puts Bergman on that plane; and nobody really comes back when Belle the Beauty comes back. But for some reason that doesn’t stop me from really reaching for yet another box of Kleenex.

Monday, October 26, 2009

St Thomas Aquinas (1225-1274)

A century before the Black Death swept through Europe, a crisis of another sort was brewing. The long-lost works of Aristotle were suddenly rediscovered. You wouldn't think that the discovery of some dusty old philosophical manuscripts would be comparable to the arrival of the bubonic plague, but a lot had happened since those manuscripts had gone missing. Mainly, Christianity. And Aristotle’s works seemed to contradict some of its key tenets, including that one about a God who created the cosmos. Not surprisingly, the Church initially banned Aristotle’s works.

But we know what happens when you ban books. Before long everyone was reading Aristotle -- and the Church was forced to try a new strategy: to embrace him -- by finding ways to reconcile him with Christian beliefs. And no one was better at that task than Thomas Aquinas. By the 14th century, Aristotle had gone from being banned to becoming required reading at the universities!

Aquinas wrote prolifically, culminating in his great masterpiece Summa Theologica, a massive work summarizing all of Christian doctrine -- at least as understood by Aquinas. He did not, however, complete this great work. On December 6, 1273, during Mass, he underwent some mystical experience, upon which he suddenly ceased to write -- explaining that all that he’d written now seemed to him “like straw.” He died just months later, at 49.

A short time after his death, Aquinas was canonized by the Church, to become Saint Thomas. His works went on to become the main text to which all the rest of Christian philosophy is just footnotes.

Some straw!

Tuesday, September 29, 2009

A Twitter Tour of Western Philosophy

[Reprinted from the Christian Science Monitor: http://www.csmonitor.com/2009/0928/p09s02-coop.html]

When both neocons and lit profs are all atwitter about the same thing, you know it's got to be big.

It seems that two young undergrads at the University of Chicago, Emmett Rensin and Alex Aciman, recently landed a deal with Penguin Books to publish their book "Twitterature: The World's Greatest Books, Now Presented in Twenty Tweets or Less."

Why plod through 3,000 pages of Proust, after all, when you can just get the gist while listening to podcasts on the subject with your iPhone? Just think how much more time you would have been able to waste in college if you hadn't had to splash through the stormy seas of "Moby Dick"!

In fact, though, I think Mr. Rensin and Mr. Aciman could do better: Who has the time, these days, for a leisurely meander through 20 whole tweets about anything?

So here, for those of you seeking, between tweets, to plug some gaps in your education, is a brief tour of Western philosophy up to the 20th century, a very manageable one tweet at a time.

Socrates: Drinking hemlock; toes tingling; legs getting numb. Maybe unexamined life worth living? Guard!

Plato: Symposium 2nite 7pm, @ The Cave. Open mike, open bar. Under 21 admitted free.

Aristotle: 2 say of what is, that it is, is true; 2 say of what is, that it is not, is false. And this is what is, and thus true; and there4 not false.

(Early) Augustine: In Carthage w/the Smashers 2nite total caldron of lust here XXX!

(Later) Augustine: John 3:16.

Anselm: God must exist, for otherwise that than which none gr8ter can b conceived would b 1 than which a gr8ter CAN b conceived, fool.

Aquinas: How many angels can dance on a pin? I answer that it depends on whether the tango, fandango, or the Mexican hat dance is in question.

Descartes: Check out the new Facebook fan page 4 my fav starchy tuber! I link there4 a yam.

Spinoza: Mind is God. Matter is God. I m God. Only thing not God is God.

Locke: Our minds @ birth are like blank slates, except 4 all the ideas, dispositions, and powers we are innately born with.

Leibniz: Optimist says this is the best possible world. Pessimist agrees.

Hume: No sense can b made of anything, nothing can b known, crud just happens. I'll b @ the pub.

Kant: The thing as it appears is white, creamy, and delicious; we cannot know whether it is, in itself, just mayonnaise.

Hegel: God's long path toward realizing, in His highest form, Himself, all of history is, namely German bureaucracy.

Schopenhauer: All is empty, pointless. Deep, dark despair. Could use snack.

Marx: Hegel wrong. It's not spirit that moves bodies. It's coffee.

Kierkegaard: OMG heartsick again today. OMG Regina, luv of my life, still hasn't called. OMG.

Nietzsche: Restraints too tight, barely wriggle fingers 2 type. Renfield brought some flies to munch, a real übermensch. But what, no dip?

For any acquisitions editors at Penguin who may be reading – follow me @60SecondPhilosopher.

Friday, September 11, 2009

“It Depends On What The Meaning Of The Word ‘Is’ Is”

Philosophers, lawyers, spin doctors—and the former U.S. President who infamously uttered the title sentence to a grand jury—all suffer from a bad reputation: they play games with words. That may well be true, but we shouldn’t blame the philosopher in a person for those offenses. We should blame the English language for making those offenses possible in the first place.

For English, like other languages, is a mess: it’s vague, ambiguous, and inconsistent. And it is most notoriously unclear with respect to one of its most basic words: “is.” Sometimes (for example) “is” indicates the present tense: “Fred is eating now.” But other times it indicates the future: “Fred is coming later.” And other times it is used timelessly, as in “The number 3 is odd,” or “‘Is’, simply, is a mess.”

And even if we restrict ourselves to the present tense, “is” is no better. For consider the following sentences:

Fred is red
Fred is lead
Fred is Ted
Fred is

To say that Fred is red is to say that redness is one of his properties. (Maybe he’s blushing.)

But to say that Fred is lead is to say that he is composed of lead—maybe “Fred” is the name of a statue—in a way we’d never say that blushing Fred is “composed of” redness.

When we say that Fred is Ted we’re identifying Fred with Ted: Fred and Ted are one and the same person. (Perhaps he’s been two-timing some women by using different names). But we don’t say that Fred the statue is “identical” to lead. After all there’s plenty of lead in the world that’s not affiliated with Fred.

Finally, when we say “Fred is,” we’re not saying anything about his properties, what he’s composed of, or what he’s identical to. We’re merely saying that he exists.

So “is” is a very difficult word. So many possible meanings packed into so few letters! And the language only gets messier for more complex words. So don’t blame the philosophers, the lawyers, the spin doctors, or the former U.S. President (who may be all of the above)—it’s English itself which deserves to be impeached.

Monday, August 17, 2009

An Inconvenient Tooth

There’s something about movie popcorn. My sweet tooth I can satisfy anywhere but only movies can satisfy my popcorn tooth. I also firmly believe that you should try to do some good in this world.

And that precisely is the problem.

Think about the roughly 15 dollars you spend whenever you go to the movies. Then think about those commercials you’ve seen on television: weepy, wide-eyed, hungry children staring at you while you’re reminded that just pennies a day could keep that very child from starving to death. You are moved, you resolve—and then a moment later you are chuckling over Joey's latest antics in the Friends rerun you are watching for the 11th time.

You are spending 15 dollars munching popcorn while children are literally dying.

It’s easy to rationalize your behavior. “What could my $15 do against all the world’s problems?” Answer: It could save a child’s life. “Hey I do plenty of good, I give to charity, donate my time. Can’t I just go to the movies?” Answer: You could always do more. Is your evening at the movies worth a child’s life? “How can I be sure my $15 will actually do any good?” Answer: Stop going to movies and get involved in the relevant organizations.

In fact it’s very hard to justify going to the movies. Or going out to dinner. Or buying new clothes. Or pretty much anything we do. If all of us just cut back a little on our luxuries and redirected our resources we could do an awful lot of good in this world. Take global warming, for example. If everyone who saw Al Gore’s An Inconvenient Truth had just applied their popcorn money directly towards the problem in some way, perhaps the movie wouldn’t have been necessary.

You are a terrible person for going to the movies.

Oh wait—Ross is about to propose to Rachel!

Monday, August 10, 2009

Rain Rain Go Away -- Not!

Rain rain go away
Come again some other day
Little children want to play….


We’re an odd culture.

Our nursery rhymes – those sweet little ditties we sing to our children – are filled with terrible, terrible things. Rock-a-bye Baby is a beautiful, gentle melody with which we lull our babes to sleep, at least until we terrorize them with the part about babies tumbling from trees. Ring Around the Rosie has whiled away many a pleasant afternoon with laughter, twirling and falling down – falling down dead, that is, from the rosy rashes of the bubonic plague. Humpty Dumpty, of course, shares the deadly fate of those rocking babies, irreparably smashing himself into pieces.

And worst of all is that contemptible little classic sung by children everywhere – and by most of us New England adults during the dreary first part of this summer – Rain Rain Go Away.

Now your first instinct might be to disagree.

Falling babies and bubonic plague is pretty heavy stuff, admittedly. And what’s the harm in wishing, now and again, that the rain go away? For rain can be awfully problematic: it causes cancellations and delays, outages and accidents, and it interferes with everything from baptisms and bar-mitzvahs to baseball games. Rain can be, in short, so darn irritating, especially when it dominates a season it’s not supposed to. So why shouldn’t we wish it away?

Especially if little children – or we big ones – want to play?

But maybe, just maybe, the problem isn’t the rain, but us.

In fact, the “problem of evil” has long tormented philosophers inclined to believe in the existence of a deity. How could an all-perfect, all-knowing and all-good being have created a world which has so much evil in it? The great 12th century Jewish thinker Maimonides cautioned, however, that our judgment of what qualifies as “evil” is often remarkably self-centered. If something does not fit our personal desires or interests, we immediately condemn it as evil, as if the whole world is all about us. But individual people, and even all humanity, he thought, are but the tiniest components in this immensely vast world – a world which is not made worse because some people enjoy less goodness than others, but rather more beautiful by the tremendous variety of circumstances it contains.

We may not like it, but the world just might be better off, as a whole, in other words, if we personally happen to be enjoying less goodness than we might. Who are we to declare that the world is only good if things go well for us in particular?

Rock-a-bye Baby, Ring Around the Rosie and Humpty Dumpty at least have the integrity to face the facts: boughs break, plagues break out and odd, egg-shaped creatures fall from walls. Even better, they teach us to have a positive attitude toward unpleasant facts and realities generally out of our control. If most of medieval Europe is going to be decimated by plague, after all, we might as well twirl in a circle and fall down laughing hysterically. This, it seems to me, is the attitude we should have toward much of what happens in our lives.

But Rain Rain Go Away?

The other rhymes encourage positive attitudes toward the inconvenient facts of reality. This one teaches us to demand that reality itself change, that reality adapt to us – rather than teach that we, ourselves, ought sometimes adapt to reality.

Must the rain go away simply because we want to play? Is the world, overall, going to be a better place if little Johnny gets to play kickball this afternoon? It’s not all about you, or me, Maimonides reminds us. Maybe the world will be a better place as a whole if Johnny stays indoors today and works on the cure for cancer instead.

Now, if what I have said has taken the sheen off a beloved refrain from childhood, please know I am deeply sorry to have rained on your parade. But it seems to me that popular culture, in its little ditties and funny phrases, often contains some pretty deep and important lessons – lessons worth some reflection, at least during yet another dreary, drizzly afternoon.

Now if I could only make things right as rain, believe me, I would.

Unless they already are.

Monday, August 3, 2009

It's All English to Me

I recently learned that the expression “It’s all Greek to me” derives from medieval, Latin-speaking philosophers bemoaning their inability to read ancient Greek texts. That made me wonder what the Greeks say; which, it turns out, is “It’s all Chinese to me.” Before investigating what the Chinese say, however, I realized I’d have a deeper problem with whatever resource I might consult: it would be all English to me. And I don’t understand what understanding English amounts to.

To see why, a brief detour.

It’s easy to treat creatures around us as if they had minds like our own. Every pet owner believes that his beloved Fluffy has thoughts, desires, and feelings; we say things like “that ant hopes to get that bread” or “those weeds want to kill the lawn.” The hard question is whether such talk is literal truth or merely metaphor. It’s especially hard with respect to computers programmed to produce very human-like behaviors. Computers have gotten so sophisticated these days that it’s very easy to think a properly programmed computer could cross the line from merely seeming to have a mind to actually having one.

Here’s a reason to think it wouldn’t—and, at the same time, to question our own understanding of English.

Imagine a man locked in a room. Pieces of paper with strange marks come through a slot in the door; the man studies them, consults a rule book he has (in English), and then from some boxes assembles some new marks to return out the slot. The process repeats. He doesn’t understand these marks; he’s just mechanically following rules matching input marks with outputs. But unbeknownst to him the marks are actual Chinese characters. The people on the outside are native Chinese speakers who believe they are conversing, through writing, with another native speaker within.

Well, computers are like the man in the room: they’re purely mechanical devices which operate on electrical inputs to produce electrical outputs, all according to a program they follow mechanically. Just as the man with his rule book can perfectly simulate an ordinary conversation to outside observers, so too could a properly programmed computer. But just as the man does not actually understand any Chinese, neither does the computer understand what it is doing. Thus computers at best simulate mentality and cannot literally possess it.
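The man’s procedure amounts to rote lookup: marks in, marks out, with no grasp of meaning anywhere in between. Here is a toy sketch of that idea (the phrases and canned replies are invented for illustration; they are not from Searle’s paper):

```python
# A toy "room": replies are produced by purely mechanical lookup.
# Nothing here understands Chinese; it only matches marks to marks.

RULE_BOOK = {
    "你好": "你好！",            # a greeting, answered with a greeting
    "你会说中文吗": "会一点。",   # "Do you speak Chinese?" -> "A little."
}

def room(message: str) -> str:
    """Return whatever marks the rule book pairs with the input marks."""
    return RULE_BOOK.get(message, "请再说一遍。")  # fallback: "Say that again?"

print(room("你好"))
```

To the outside observer the replies look fluent; inside, there is only the lookup. The hard question the essay raises is whether our own “understanding” is anything more than a vastly bigger rule book.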

This argument points to a crucial difference between computers and people, and thus gives us a reason to deny minds to computers while granting them to other people—but it also raises a difficult question. It assumes that there is more to “understanding” a language than simply being able to produce appropriate outputs given various inputs. After all, the man and computer both can do the latter but only the man allegedly displays the former. But what else is there? When you hear certain English sounds you know what other sounds are appropriate to produce in reply. You “genuinely understand” English. So what exactly is there to “understanding” beyond the ability to utter the appropriate responses?

That’s what is all Urdu to me.


Source: John Searle, “Minds, Brains, and Programs,” Behavioral and Brain Sciences, 3, 1980, 417-458. Reprinted in John Perry and Michael Bratman, eds., Introduction to Philosophy: Classical and Contemporary Readings, 3rd Edition (Oxford, UK: Oxford University Press, 1999).

Friday, July 24, 2009

My Right to Complain

My eyes are going. Just last month I found myself lifting my glasses to read something, the way, you know, old people do. And my knees hurt. And the only impressive thing about the mediocre town we live in is the number of ways it is unimpressive. And I think our water heater is broken. This morning I went to relight the pilot light but got intimidated by all the, basically, WARNING: EXPLOSIONS signs pasted over it. A quick search online suggests a newly installed water heater will set us back a thousand dollars. One day you have hot water, the next day there will be no Hanukkah this year.

Oh, and then there’s Iraq, that whole Middle East thing, those families without health insurance, AIDS, e. coli, avian and porcine flu, world poverty, global warming, and all those comets in space hurtling our way.

My friend the brain surgeon can tell you more miserable and heartbreaking stories than anyone I know. The most recent time I updated him on my concerns–about twenty minutes ago–he said to me: “I just informed a pregnant woman, with a toddler in her arms, that her 32-year-old husband has incurable brain cancer. So I wouldn’t sweat the water heater.”

Oh great, so now I can’t even complain any more.

But, now, why exactly not?

Of course there are many people with far greater misfortunes than I. But that doesn’t make me feel any better, it makes me feel worse–for not only do I have my troubles, but I live in a world surrounded by people with even greater troubles. Or perhaps my friend means for me to compare how things are for me with how they could be, for me. But I take those heartbreaking stories to illustrate how things probably will go for me, in one form or another. There but for the grace of God go I–but God’s continued grace, and our having hot water any time soon, are two things I’m not betting on.

In any case, if the misfortunes of others mean I should feel better about myself, why wouldn’t the good fortunes of different others mean I should feel worse? Those rich people, those beautiful people, those celebrity people–I want what they got, and, frankly, it stinks.

Some years ago a friend said she’d read an interview with Jack Nicholson in which he reported that he preferred to live alone because on his really dark days he doesn’t like to be around other people. “Really dark days?” my friend asked. “What would Jack Nicholson have to feel dark about? Money, fame, accolades, sex, and success–what more could he want?”

You see, I thought, this is human nature. It doesn’t matter how well off we are, things could always be better. And that thought, in fact, made me feel better, at least temporarily. For I realized that, when it comes down to it, I’m really no different from Jack Nicholson. Just minus the money, fame, accolades, sex, and success.

Now if only he’d let me take a shower at his house.


(Reprinted from This I Believe: http://thisibelieve.org/essay/56552/.)

Friday, July 17, 2009

The 60-Second Philosopher in Asia

Korean rights to the new book, The 60-Second Philosopher, have just been sold. Next time you're in Seoul, look for my face on book covers everywhere.

Monday, July 13, 2009

Diary of a Small-Time Philosopher

I spy her ten rows away, coming down the plane aisle. There’s no place for me to escape: it’s clear that the empty seat next to me will shortly be filled by that woman. Not that there’s anything wrong with her, at least to the innocent observer.

Except that, to my expert eye, she is obviously friendly.

“Hi!” she says exuberantly as she slips her carry-on overhead.

I will not capitulate. Nowhere is it written that one must return an unsolicited greeting, especially one that is so irritatingly warm. That’s what the book I’m reading is for, after all—something to briefly look up from, then instantly return to.

“Hi,” I reply pleasantly, miserable.

Now it’s only a matter of time. This woman is already super enthusiastic about getting to know her new friend.

“So what do you do, Andy?” She asks this, I swear, before her rear end has even touched the seat. (And when did she extract my name from me?)

But there it is. My first instinct is to break the window and dive out of the fuselage. My next is to initiate my usual fake coronary, but that’ll no doubt lead my seatmate immediately to begin mouth-to-mouth. Should I pretend I’m deaf? Start speaking in tongues?

“I’m a philosopher,” I mumble, defeated.

That’s it. It’s out. Like ripping off a band-aid, plunging into the ice-cold pool. Not that I’m comfortable, stating it so boldly like that. It seems sort of pompous to call oneself a philosopher. Like handing someone your card announcing you are a prophet. I imagine Jesus, at a party, extending his hand and saying, “Hi. Jesus. Messiah.”

But I am what I am.

I ought to be used to it by now. For so many years, my undergraduate days as a philosophy major, my years in graduate school and as a young professor, I’ve been telling people, at dinners, on airplanes, wherever, and watching as their eyes glaze over or they instantly change the subject. How many women have moved away from me, at bars, just moments after the dreaded subject came up? How many parents have clutched their children more tightly?

Then there are those who go on the offensive. “What can you do with philosophy,” they ask skeptically, “other than just make more philosophers?” “Well,” another might concede, “I suppose someone has to teach college students.” And some have simply hilarious senses of humor. “What did the philosopher say to the executive?” they ask, chuckling. “What?” I reply with a sigh. “Would you like fries with that?! Ha ha!” We both guffaw, except for me.

But almost worse are the few people who actually find my profession interesting. I imagine that psychotherapists, financial advisors, plumbers have similar experiences: here it comes, the request for advice, for a hot tip, for help unclogging the toilet. Only in my case it’s usually a request for wisdom, or for the meaning of life; or, God forbid, help with some ethical dilemma. Once at a wedding a tablemate lit up and said, “Oh! So tell me, which is more intelligent: the East or the West?” I didn’t even know where to begin with that question, other than to ask what he had against the North and the South. Another time at a bar-mitzvah somebody’s third cousin wanted to know whether it was morally worse to beat someone up on the Sabbath or on a weekday. That conversation actually had some potential, until I realized he was talking about beating up his teenaged son, who was sitting at the table. And of course there was that woman I met at a gallery opening during graduate school, who perked up and asked me what my philosophy of life was. “Three words,” I said, tragically encouraged by her enthusiasm, “Sex, drugs, and rock’n’roll. Or I suppose that’s four words, strictly speaking. Or maybe six, if ‘rock’n’roll’ is taken as the full expression ‘rock and roll.’” By the time I’d finished clarifying she was nowhere to be found.

The most pressing question for me now, however, is just which category my new best friend Brenda will fall into. (When did she tell me her name by the way?)

“A philosopher!” she exclaims, her eyes twinkling. Okay, first two categories ruled out; trapped as we are in these seats, this is not auspicious. “So tell me: what are some of your sayings?”

My sayings?

With great regret, on all those previous occasions, I have had to explain to people that, in fact, contrary to public perception, actual professional philosophers do not have any particular expertise in wisdom or the meaning of life, and are probably the last people you want to consult about your ethical dilemmas. But I have never met anyone who thought that actual professional philosophers would have, you know, sayings.

But now, why shouldn’t we?

We write all those articles, we speak at all those conferences, we teach all those students. Mustn’t we have something to say, if we are doing all this? And if we actually have something to say, shouldn’t we have sayings?

In my head I run through a few possibilities. “Never listen to a philosopher!” comes immediately to mind. No; too annoyingly clever. “Every choice we make presents two options,” comes next, “the one we choose, and the one we instantly regret not having chosen.” Too Woody Allen. I go for some profundity: “God exists or He doesn’t, and either way it’s a staggering thought.” Whoa, too heavy. “Whereof one cannot speak, thereof one must be silent”--perfect, if only Wittgenstein hadn’t already said it. Tear down the wall: Pink Floyd and Ronald Reagan. I have a dream. All you need is love. Happiness is a warm puppy. All taken! “What if,” I ask myself, in a final desperate shot, “the Hokey-Pokey is what it’s all about?”

I got nothing.

I find myself experiencing a new sensation: I’m actually speechless.

Not that Brenda seems to mind. Apparently we’re getting together on Friday with her husband’s family to celebrate somebody’s ruby (or somebody named Ruby’s) wedding anniversary. She’s so very excited to have met a genuine bona fide philosopher, and it seems I’ve agreed to speak at the occasion.

I’ve got three days to think of something to say.

Thursday, June 25, 2009

Some Ado About Nothing

Seinfeld famously billed itself as a show about nothing. But all that meant was that it was about nothing “out of the ordinary”: getting up, having breakfast, going to work. Strictly speaking, it wasn’t so much about nothing as about nearly everything. But it does make the philosopher in me wonder what a show truly about nothing would be like. Would it just consist of thirty minutes (say) of a dark screen? But then what would be the difference between a TV tuned to nothing and a TV turned off altogether?

Nothing is quite as hard to think about as nothing itself. In fact, it may be impossible to think about nothing, since, as with the TV example, thinking about nothing seems equivalent simply to not thinking at all. Sure, we can think about the word “nothing,” and that’s perhaps what you were doing when I raised the issue of thinking about nothing; but the word “nothing” is something, a word, and not nothing, so thinking about “nothing” is thinking about something.

Indeed, nothing itself does seem like something. We have that word for it, after all, which is a noun to boot -- and don’t words, especially nouns, have meanings by standing for things? “Nothing” definitely seems meaningful; but if what it stands for is nothing -- no thing at all -- then it isn’t meaningful after all. Since it surely is meaningful, what it stands for must be something. So nothing must be something.

Nothing also seems to have lots of properties. We can say, for example, how much nothing there is in various places: maybe there’s forty light-years of absolute nothing between adjacent galaxies. We can say how long it lasts: that dead air on the radio, in which nothing happened, or that painful silence following your proposal of marriage, each lasted seven seconds, even if the latter felt like an eternity. We can be moved emotionally by nothing: when the medical report comes back with the news that there’s nothing in our abdomen after all, we are relieved, and when our boss neglects to promote us -- she does nothing instead -- we are distressed. Nothing even has causal powers. The passerby who did nothing, instead of alerting you to the oncoming bicycle, was clearly a cause of the collision, as the posted sentry who did nothing to alert his troops to the oncoming attack was a cause of the consequences. But if nothing can have all these properties -- a size, a duration, even causal powers -- mustn’t it be something?

And there are so many different kinds of nothings! Look in the corner: there’s no lizard there (I hope), but also no plutonium, no Franklin Roosevelt, no Prince Hamlet, no space aliens, and, happily, nothing to fear (but also, sadly, nothing not to fear either). Space is a nothing: it’s the absence of anything. And darkness is a nothing; it’s the absence of light. And coldness is a nothing; it’s the absence of heat. But how could there be all these different kinds of nothing, unless they were each something? That dark cold space over there, in that corner, may look like nothing but in fact it’s awfully crowded!

Admittedly, this is a lot of ado about nothing. But thinking about nothing is a lot more complicated than one might think. And that is not nothing. It is the absence of nothing, which is really something. Or is that everything?

Which, in the end, is what Seinfeld was all about.

Friday, June 19, 2009

I'll See (a Picture of) You in my Dreams

It’s all in your mind, man.

The philosopher in me is used to hearing this, usually expressed with either concern for my well-being or a desire for me to leave the room. My response, typically, is to utter “exactly!” as the door closes behind me. Most people accept that at least some things are just in the mind: subjective sensations such as feelings of warmth and coolness, or how things taste, and even colors. But in fact, I think, it is all in the mind.

Consider a typical dream. You’re on an island, say, in the Caribbean, the sun is shining, the ocean is a gorgeous blue, you’re sipping a cool pina colada, under a coconut tree, with (literally) the man or woman of your dreams, or maybe both … And then you wake up. And you’re in your bed, at night, in winter, in New Jersey, with no sunshine, no ocean, no pina colada, and desperately, desperately alone. We’re all familiar with this phenomenon: how things appear in the dream is just not how things really are. But we’re less familiar with the implications.

In the dream, at one moment, you gazed at the coconut tree. But what, exactly, were you seeing there?

It was not a real -- that is, physical -- tree, because there is no physical coconut tree in your lonely New Jersey room. Indeed it could not have been a physical tree because, while dreaming, your eyes were closed: you weren’t physically seeing anything at all. You must have been seeing something else: a mental image of a tree, a mental tree. The same goes for everything else in a dream. What we see in dreams are mental images.

Could we say, perhaps, that your dream was of some physical tree you have seen, which your memory is now recalling? After all, even if a dream is essentially fictional it is based in reality: you have seen trees, and oceans, and islands, and your mind and memory are capable of storing and reordering all the components in new ways.

That may be true -- but still. When you “store” a memory, so to speak, just what are you storing? The real physical tree? But how can any physical object literally be “stored” in a mind? Whatever goes on in your mind must ultimately be grounded in your brain. But the real physical tree surely is not literally stored anywhere in your brain. Similarly, when your mind calls up a memory, just what is it calling up exactly? Again, the real physical tree? But that tree is far far away; it may even by now be long out of existence. It makes more sense to say this: what gets stored in memory, and “recalled,” is not the physical object itself, but some mental image of it. Mental images can be stored in minds, and can exist long after the physical object of which they’re an image is gone.

Now you are awake. If you are lucky, you’re reading this book right now on an island, in the Caribbean, the sun is shining, the ocean is blue, you’re sipping a cool pina colada, under a coconut tree, with the men and women of your dreams … Look at the tree. Your visual experience is in every way exactly like your dreamed visual experience of that tree. But in a dream, what you see are only mental images of objects. So what you see, when, while awake, you look at a tree -- is not a real physical tree.

It is all in your mind, man.

Wednesday, June 10, 2009

True Colors

I’m a terrible dresser. But I’m a great blame-shifter, and my dressing problem is not entirely my fault. My pants and shirt today matched perfectly at home, in my walk-in closet; but then in front of my class earlier they didn’t match at all. I could solve the practical problem, of course, by simply holding my class in my walk-in closet. But that wouldn’t solve the philosophical problem.

What color is this shirt hanging in my closet, anyhow? I’ll say blue, which is about the best I can do with my very limited color vocabulary. I’ll still call it blue when I’m standing outside, at noon, on a sunny day, in Connecticut, in spring, even though even I can see that its color looks slightly different here than it did in the closet. And I’ll still call it blue under the fluorescent lights of my classroom, though it now looks nothing like the pants that matched its color in my closet. My limited color vocabulary can’t mask the fact, however, that this damn shirt keeps changing colors on me.

Or does it? Nothing about the shirt has changed, after all; it’s the same shirt. How can it have changed colors, when it hasn’t changed at all? Maybe I should just say that it appears different colors to me, in these different viewing contexts. The shirt isn’t changing, in other words; it only looks like it is.

But now if it looks like the shirt is changing colors, when it isn’t, then some of my perceptions must be wrong. The shirt looked different in three different contexts, above, so at least two of those perceptions must be wrong. And since it would, in fact, look different in many other contexts, maybe all three of those perceptions were wrong. Maybe, in fact, I’ve never even seen the true color of the shirt!

But wait--why believe the shirt even has a true color? To believe that it does is to believe that one of the many viewing contexts is correct while all the others are wrong. But which one is the “true context”? We’re naturally inclined to say that my dimly lit closet is not it, but why, exactly, should we privilege, say, the sunlight over the closet? What about the fact that how the shirt looks at noon, in spring, in Connecticut, on a sunny day, might be very different from how it looks at 4 pm, in the fall, on a hazier day, or in Alaska? Or why not say that fluorescent light is an improvement on sunlight, and that it lets us see the true color?

It seems to me we should give up the idea that my shirt--or any other physical object--has a “true” color. In fact we should give up the idea that objects have any colors at all. Think about it: bodies are made up of atoms, which in turn are made up of little particles like electrons. But nobody thinks that electrons have any colors! And how can something have a color if everything it is composed of does not?

To the contrary, we should say that colors are not in objects but only in the minds of perceivers. That way we don’t have to decide which single viewing context gives us the “true” color, because there is none. Rather, we can say, in effect, that objects have every color they appear to have, in their different contexts. My shirt does not have a true color--but only true colors.

Now let’s get out of this closet.

Monday, June 1, 2009

The Philosopher Within You

There’s the legend of the fish who swam around asking every sea creature he’d meet, “Where is this great ocean I keep hearing about?” A pretty small legend, true—but one with a pretty big message.

We are very much like that fish.

For consider: it’s hard to look at a newborn baby without thinking, what an incredible miracle. But when was the last time you looked at an adult and had the same thought? Why not? Every adult was a little baby; if the latter is a miracle, then so is the former. But it never occurs to us to think this way, for one simple reason: we’re so used to seeing people that we stop reflecting on them.

Or you drop something, a spoon, and it falls to the floor. But why? Couldn’t it, in theory, have remained floating in air or moved upwards? And how exactly does it fall to the floor, by “gravity”? There are no strings connecting the earth to the spoon. How can the earth pull on something from a distance, that it’s not even attached to? Why don’t we pause every time something drops and say: what an incredible miracle!

The most ordinary things contain a whole lifetime of questions, if only we are reminded to start asking them.

Children already know to ask these questions. Every answer you provide to one of their “Why?” questions just generates the next question. But we were all children once. What we need to do now is to let the child still within us—the philosopher within us—re-emerge. What we need now are a few seconds out of our ordinary conceptual habits. We need to take a cold wet plunge into the great deep ocean of thought.

It’s time to start thinking.