Monday, November 16, 2009

Really Moved, By The Unreal

I’m a weeper. I rarely make it through a decent book, or a movie, without the tears flowing. I bawl like a baby when Jimmy Stewart begs Clarence, in It’s a Wonderful Life, to let him live again. In the cinema I could not suppress an embarrassingly loud sob when the Beast, astonished, murmurs to the Beauty, “You came back, Belle; you came back.” And Humphrey Bogart putting Ingrid Bergman on that Casablanca plane? Always good for at least three hankies.

What I don’t understand is why.

Why am I moved when the joys and sorrows in fact are not my own—nor even real?

One idea, perhaps, is that when immersed in a movie we temporarily forget that we’re observing a fiction. But that seems hard to accept. If I’m watching a DVD I may well get up, make a phone call, then resume watching and weeping. Or I might continue to munch on popcorn right through my tears. I certainly wouldn’t do any of those things were I witnessing some real-life sorrow. And similarly I might be moved to experience great fear when watching Jurassic Park—yet I’m never tempted even slightly to run screaming from the cinema, which I surely would do were I even briefly forgetting that those raptors aren’t real.

Another idea is that we are moved out of empathy or compassion. After all, I rarely make it through the evening news, either, without weeping at whatever terrible events are reported, and those events are perfectly real. Yet even so, it seems, the question remains. The pain I may learn about this way is not my pain. The awful events depicted did not happen to me, nor, typically, have I even experienced anything very similar in my own life. To say I have empathy is to say that I am moved by those stories. But it is not to explain why I am moved.

And surely not to explain why I am moved by things which aren’t real.

So no, nobody is really put on a plane when Bogart puts Bergman on that plane; and nobody really comes back when Belle the Beauty comes back. But for some reason that doesn’t stop me from really reaching for yet another box of Kleenex.

Monday, October 26, 2009

St Thomas Aquinas (1225-1274)

A century before the Black Death swept through Europe, a crisis of another sort was brewing. The long-lost works of Aristotle were suddenly rediscovered. You wouldn't think that the discovery of some dusty old philosophical manuscripts would be comparable to the arrival of the bubonic plague, but a lot had happened since those manuscripts had gone missing. Mainly, Christianity. And Aristotle’s works seemed to contradict some of its key tenets, including that one about a God who created the cosmos. Not surprisingly, the Church initially banned Aristotle’s works.

But we know what happens when you ban books. Before long everyone was reading Aristotle -- and the Church was forced to try a new strategy: to embrace him -- by finding ways to reconcile him with Christian beliefs. And no one was better at that task than Thomas Aquinas. By the 14th-century, Aristotle had gone from being banned to becoming required reading at the universities!

Aquinas wrote prolifically, culminating in his great masterpiece Summa Theologica, a massive work summarizing all of Christian doctrine -- at least as understood by Aquinas. He did not, however, complete this great work. On December 6, 1273, during Mass, he underwent some mystical experience, upon which he suddenly ceased to write -- explaining that all that he’d written now seemed to him “like straw.” He died just months later, at 49.

Some fifty years after his death, Aquinas was canonized by the Church, becoming Saint Thomas. His works went on to become the central texts of Christian philosophy, the ones to which all the rest can seem like mere footnotes.

Some straw!

Tuesday, September 29, 2009

A Twitter Tour of Western Philosophy

[Reprinted from the Christian Science Monitor: http://www.csmonitor.com/2009/0928/p09s02-coop.html]

When both neocons and lit profs are all atwitter about the same thing, you know it's got to be big.

It seems that two young undergrads at the University of Chicago, Emmett Rensin and Alex Aciman, recently landed a deal with Penguin Books to publish their book "Twitterature: The World's Greatest Books, Now Presented in Twenty Tweets or Less."

Why plod through 3,000 pages of Proust, after all, when you can just get the gist while listening to podcasts on the subject with your iPhone? Just think how much more time you would have been able to waste in college if you hadn't had to splash through the stormy seas of "Moby Dick"!

In fact, though, I think Mr. Rensin and Mr. Aciman could do better: Who has the time, these days, for a leisurely meander through 20 whole tweets about anything?

So here, for those of you seeking, between tweets, to plug some gaps in your education, is a brief tour of Western philosophy up to the 20th century, a very manageable one tweet at a time.

Socrates: Drinking hemlock; toes tingling; legs getting numb. Maybe unexamined life worth living? Guard!

Plato: Symposium 2nite 7pm, @ The Cave. Open mike, open bar. Under 21 admitted free.

Aristotle: 2 say of what is, that it is, is true; 2 say that it is not true, is false. And this is what is, and thus true; and there4 not false.

(Early) Augustine: In Carthage w/the Smashers 2nite total caldron of lust here XXX!

(Later) Augustine: John 3:16.

Anselm: God must exist, for otherwise that than which none gr8ter can b conceived would b 1 than which a gr8ter CAN b conceived, fool.

Aquinas: How many angels can dance on a pin? I answer that it depends on whether the tango, fandango, or the Mexican hat dance is in question.

Descartes: Check out the new Facebook fan page 4 my fav starchy tuber! I link there4 a yam.

Spinoza: Mind is God. Matter is God. I m God. Only thing not God is God.

Locke: Our minds @ birth are like blank slates, except 4 all the ideas, dispositions, and powers we are innately born with.

Leibniz: Optimist says this is the best possible world. Pessimist agrees.

Hume: No sense can b made of anything, nothing can b known, crud just happens. I'll b @ the pub.

Kant: The thing as it appears is white, creamy, and delicious; we cannot know whether it is, in itself, just mayonnaise.

Hegel: God's long path toward realizing, in His highest form, Himself, all of history is, namely German bureaucracy.

Schopenhauer: All is empty, pointless. Deep, dark despair. Could use snack.

Marx: Hegel wrong. It's not spirit that moves bodies. It's coffee.

Kierkegaard: OMG heartsick again today. OMG Regina, luv of my life, still hasn't called. OMG.

Nietzsche: Restraints too tight, barely wriggle fingers 2 type. Renfield brought some flies to munch, a real übermensch. But what, no dip?

For any acquisitions editors at Penguin who may be reading – follow me @60SecondPhilosopher.

Friday, September 11, 2009

“It Depends On What The Meaning Of The Word ‘Is’ Is”

Philosophers, lawyers, spin doctors—and the former U. S. President who infamously uttered the title sentence to a grand jury—all suffer from a bad reputation: they play games with words. That may well be true, but we shouldn’t blame the philosopher in a person for those offenses. We should blame the English language for making those offenses possible in the first place.

For English, like other languages, is a mess: it’s vague, ambiguous, and inconsistent. And it is most notoriously unclear with respect to one of its most basic words: “is.” Sometimes (for example) “is” indicates the present tense: “Fred is eating now.” But other times it indicates the future: “Fred is coming later.” And other times it is used timelessly, as in “The number 3 is odd,” or “‘Is’, simply, is a mess.”

And even if we restrict ourselves to the present tense, “is” is no better. For consider the following sentences:

Fred is red
Fred is lead
Fred is Ted
Fred is

To say that Fred is red is to say that redness is one of his properties. (Maybe he’s blushing.)

But to say that Fred is lead is to say that he is composed of lead—maybe “Fred” is the name of a statue—in a way we’d never say that blushing Fred is “composed of” redness.

When we say that Fred is Ted we’re identifying Fred with Ted: Fred and Ted are one and the same person. (Perhaps he’s been two-timing some women by using different names.) But we don’t say that Fred the statue is “identical” to lead. After all, there’s plenty of lead in the world that’s not affiliated with Fred.

Finally, when we say “Fred is,” we’re not saying anything about his properties, what he’s composed of, or what he’s identical to. We’re merely saying that he exists.

So “is” is a very difficult word. So many possible meanings packed into so few letters! And the language only gets messier for more complex words. So don’t blame the philosophers, the lawyers, the spin doctors, or the former U. S. President (who may be all of the above)—it’s English itself which deserves to be impeached.

Monday, August 17, 2009

An Inconvenient Tooth

There’s something about movie popcorn. My sweet tooth I can satisfy anywhere, but only movies can satisfy my popcorn tooth. I also firmly believe that you should try to do some good in this world.

And that precisely is the problem.

Think about the roughly 15 dollars you spend whenever you go to the movies. Then think about those commercials you’ve seen on television: weepy, wide-eyed, hungry children staring at you while you’re reminded that just pennies a day could keep that very child from starving to death. You are moved, you resolve—and then a moment later you are chuckling over Joey's latest antics in the Friends rerun you are watching for the 11th time.

You are spending 15 dollars munching popcorn while children are literally dying.

It’s easy to rationalize your behavior. “What could my $15 do against all the world’s problems?” Answer: It could save a child’s life. “Hey I do plenty of good, I give to charity, donate my time. Can’t I just go to the movies?” Answer: You could always do more. Is your evening at the movies worth a child’s life? “How can I be sure my $15 will actually do any good?” Answer: Stop going to movies and get involved in the relevant organizations.

In fact it’s very hard to justify going to the movies. Or going out to dinner. Or buying new clothes. Or pretty much anything we do. If all of us just cut back a little on our luxuries and redirected our resources we could do an awful lot of good in this world. Take global warming, for example. If everyone who saw Al Gore’s An Inconvenient Truth had just applied their popcorn money directly towards the problem in some way, perhaps the movie wouldn’t have been necessary.

You are a terrible person for going to the movies.

Oh wait—Ross is about to propose to Rachel!

Monday, August 10, 2009

Rain Rain Go Away -- Not!

Rain rain go away
Come again some other day
Little children want to play….


We’re an odd culture.

Our nursery rhymes – those sweet little ditties we sing to our children – are filled with terrible, terrible things. Rock-a-bye Baby is a beautiful, gentle melody with which we lull our babes to sleep, at least until we terrorize them with the part about babies tumbling from trees. Ring Around the Rosie has whiled away many a pleasant afternoon with laughter, twirling and falling down – falling down dead, that is, from the rosy rashes of the bubonic plague. Humpty Dumpty, of course, shares the deadly fate of those rocking babies, irreparably smashing himself into pieces.

And worst of all is that contemptible little classic sung by children everywhere – and by most of us New England adults during the dreary first part of this summer – Rain Rain Go Away.

Now your first instinct might be to disagree.

Falling babies and bubonic plague is pretty heavy stuff, admittedly. And what’s the harm in wishing, now and again, that the rain go away? For rain can be awfully problematic: it causes cancellations and delays, outages and accidents, and it interferes with everything from baptisms and bar-mitzvahs to baseball games. Rain can be, in short, so darn irritating, especially when it dominates a season it’s not supposed to. So why shouldn’t we wish it away?

Especially if little children – or we big ones – want to play?

But maybe, just maybe, the problem isn’t the rain, but us.

In fact, the “problem of evil” has long tormented philosophers inclined to believe in the existence of a deity. How could an all-perfect, all-knowing and all-good being have created a world which has so much evil in it? The great 12th century Jewish thinker Maimonides cautioned, however, that our judgment of what qualifies as “evil” is often remarkably self-centered. If something does not fit our personal desires or interests, we immediately condemn it as evil, as if the whole world is all about us. But individual people, and even all humanity, he thought, are but the tiniest components in this immensely vast world – a world which is not made worse because some people enjoy less goodness than others, but rather more beautiful by the tremendous variety of circumstances it contains.

We may not like it, but the world just might be better off, as a whole, in other words, if we personally happen to be enjoying less goodness than we might. Who are we to declare that the world is only good if things go well for us in particular?

Rock-a-bye Baby, Ring Around the Rosie and Humpty Dumpty at least have the integrity to face the facts: boughs break, plagues break out and odd, egg-shaped creatures fall from walls. Even better, they teach us to have a positive attitude toward unpleasant facts and realities generally out of our control. If most of medieval Europe is going to be decimated by plague, after all, we might as well twirl in a circle and fall down laughing hysterically. This, it seems to me, is the attitude we should have toward much of what happens in our lives.

But Rain Rain Go Away?

The other rhymes encourage positive attitudes toward the inconvenient facts of reality. This one teaches us to demand that reality itself change, that reality adapt to us – rather than teach that we, ourselves, ought sometimes to adapt to reality.

Must the rain go away simply because we want to play? Is the world, overall, going to be a better place if little Johnny gets to play kickball this afternoon? It’s not all about you, or me, Maimonides reminds us. Maybe the world will be a better place as a whole if Johnny stays indoors today and works on the cure for cancer instead.

Now, if what I have said has taken the sheen off a beloved refrain from childhood, please know I am deeply sorry to have rained on your parade. But it seems to me that popular culture, in its little ditties and funny phrases, often contains some pretty deep and important lessons – lessons worth some reflection, at least during yet another dreary, drizzly afternoon.

Now if I could only make things right as rain, believe me, I would.

Unless they already are.

Monday, August 3, 2009

It's All English to Me

I recently learned that the expression “It’s all Greek to me” derives from medieval, Latin-speaking philosophers bemoaning their inability to read ancient Greek texts. That made me wonder what the Greeks say; which, it turns out, is “It’s all Chinese to me.” Before investigating what the Chinese say, however, I realized I’d have a deeper problem with whatever resource I might consult: it would be all English to me. And I don’t understand what understanding English amounts to.

To see why, a brief detour.

It’s easy to treat creatures around us as if they had minds like our own. Every pet owner believes that his beloved Fluffy has thoughts, desires, and feelings; we say things like “that ant hopes to get that bread” or “those weeds want to kill the lawn.” The hard question is whether such talk is literal truth or merely metaphor. It’s especially hard with respect to computers programmed to produce very human-like behaviors. Computers have gotten so sophisticated these days that it’s very easy to think a properly programmed computer could cross the line from merely seeming to have a mind to actually having one.

Here’s a reason to think it wouldn’t—and, at the same time, to question our own understanding of English.

Imagine a man locked in a room. Pieces of paper with strange marks come through a slot in the door; the man studies them, consults a rule book he has (in English), and then from some boxes assembles some new marks to return out the slot. The process repeats. He doesn’t understand these marks; he’s just mechanically following rules matching input marks with outputs. But unbeknownst to him the marks are actual Chinese characters. The people on the outside are native Chinese speakers who believe they are conversing, through writing, with another native speaker within.

Well, computers are like the man in the room: they’re purely mechanical devices which operate on electrical inputs to produce electrical outputs, all according to a program they follow mechanically. Just as the man with his rule book can perfectly simulate an ordinary conversation to outside observers, so too could a properly programmed computer. But just as the man does not actually understand any Chinese, neither does the computer understand what it is doing. Thus computers at best simulate mentality and cannot literally possess it.
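In fact the man's "rule book" can be caricatured in a few lines of code: a lookup table that matches incoming marks to outgoing marks, with no grasp of what any of them mean. (This is only a toy sketch of the thought experiment; the table entries are invented placeholders, not a real conversation, and Searle's actual argument concerns any program whatsoever, however sophisticated.)

```python
# A caricature of the Chinese Room: the "rule book" is a lookup table
# pairing input symbol strings with output symbol strings. The program
# manipulates the marks purely by shape-matching; nothing in it
# "understands" that the marks happen to be Chinese.
RULE_BOOK = {
    "你好吗": "我很好",          # invented entry: "How are you?" -> "I'm fine"
    "今天天气如何": "天气很好",   # invented entry: "How's the weather?" -> "It's nice"
}

def room(incoming: str) -> str:
    """Mechanically match the incoming marks and return the listed reply."""
    # Unrecognized input gets a stock reply ("Please say that again").
    return RULE_BOOK.get(incoming, "请再说一遍")

print(room("你好吗"))
```

To the native speakers outside, the replies may look like understanding; inside, there is only the table and the matching rule. The question the argument leaves open, as the next paragraph presses, is what more our own "genuine understanding" consists in.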

This argument points to a crucial difference between computers and people, and thus gives us a reason to deny minds to computers while granting them to other people—but it also raises a difficult question. It assumes that there is more to “understanding” a language than simply being able to produce appropriate outputs given various inputs. After all, the man and computer both can do the latter but only the man allegedly displays the former. But what else is there? When you hear certain English sounds you know what other sounds are appropriate to produce in reply. You “genuinely understand” English. So what exactly is there to “understanding” beyond the ability to utter the appropriate responses?

That’s what is all Urdu to me.


Source: John Searle, “Minds, Brains, and Programs,” Behavioral and Brain Sciences, 3, 1980, 417-458. Reprinted in John Perry and Michael Bratman, eds., Introduction to Philosophy: Classical and Contemporary Readings, 3rd Edition (Oxford, UK: Oxford University Press, 1999).