This Dismal Cairo

On a trip back to Kentucky last week I got to take a two day road trip with my aunt and uncle, primarily to visit Cairo, Illinois. People who know Cairo may find that last clause surprising. It’s not exactly a tourist Mecca. At least there’s no border crossing to get there. One time years ago I stayed with a friend in Duluth, Minnesota, and when he had to work for a couple of days I took a side trip up to Thunder Bay, Ontario for no better reason than its name: Thunder Bay! It sounds like such a fun place, but the name is a lie. As far as I could tell, the city was just a series of strip malls loosely stapled to a two block downtown whose most striking feature was “the world’s largest building designed by a Ukrainian architect.” (I’m relying on memory for that last detail, but if it’s wrong, I assure you that the real answer was comparably weird.) The place was so down and out that merely wanting to go there got me in trouble. Crossing over the border both ways I was asked why I was visiting Thunder Bay, and when I said, “Tourism” I received funny looks and had my car searched for drugs. The border patrol apparently had a policy that no one in their right mind would go to Thunder Bay without an ulterior and illicit motive.

They’d probably think the same thing about Cairo, but fortunately there were no guards on either bridge linking the bottom tip of Illinois to Kentucky over the Ohio River or to Missouri over the Mississippi. In fact, there were scarcely any people at all. I should have known this – I had read that Cairo’s population had dropped from a peak of just over 15,000 in the 1920s to under 3000 today, and I had seen the sad pictures of the decrepit buildings in the old downtown that looked like little more than habit was keeping them upright. What I didn’t realize was that for whole blocks the buildings that weren’t falling down were already fallen. The old avenues by the Ohio levee where saloons and mills bustled around the time of the Civil War hadn’t taken the ghost town turn I expected, but instead had just disintegrated. If you hadn’t known a city had been there, you wouldn’t have guessed it. Much of the scene more closely resembled a quarry than a downtown.

Cities come and cities go, like everything else. The visit saddened me, though, because Cairo was important once, and there’s scarcely any sign left to remember that. Later on the day we saw Cairo, we stopped at the Jefferson Davis monument, a 351-foot phallic symbol rising out of the flat lands of southwestern Kentucky, and a stunning reminder of the vast effort the losing side in the Civil War put into memorializing the conflict’s landscape. I can’t help thinking of the quip that we should have put Aaron Burr on the $10 bill instead of Alexander Hamilton, since after all, Burr won the duel. Cairo was the first seat of Union success in the war, and there’s virtually nothing there to make that known. The city has been thoroughly Burr-ed.

Check out a map. Cairo hangs there at the very bottom of Illinois, dangling like a stray piece of free soil that the mighty rivers swirling around it could break off at any moment and amalgamate with the slave-holding lands to the south, east, and west. The city is lower in latitude than the Confederate capital in Richmond, Virginia, and the attitudes of its inhabitants when Fort Sumter was fired upon were scarcely more favorable to a Lincoln-led Union than that geography would suggest. But the Union arrived there in the form of Grant and his army, and from that base they and Admiral Porter’s gunboats made the Mississippi, Tennessee, and Cumberland rivers grand avenues of invasion for subduing the rebel states in the west. While the fighting in Virginia amounted to nearly four years of stalemate, the Union steadily overran the west, and no site was more important to that effort than Cairo. Yet even the park marking the point just south of the city where Grant’s Fort Defiance once stood hardly deserves the name, being little more than one good rain from counting as a swamp.

It’s not as if the victors in the Civil War set up no monuments. Lincoln’s on the National Mall certainly counts. But still it seems that winning produced less of a desire for memorials than losing did. Perhaps the Union side’s comparative lack of enthusiasm for the war after it was over helps explain why Cairo slowly faded away. Or maybe what we saw last week is just Cairo’s natural state. I don’t wish that to be so, but there’s historical evidence to support the claim. From the earliest European exploration in the area, settlers took it for granted that some great city should rise at the confluence of the Mississippi and Ohio rivers, but repeated failures kept destroying that assumption. Only the Civil War finally produced the investment in infrastructure that made Cairo reasonably large and prosperous, and as the decades wore on and that infrastructure wore out, nobody renewed it, and the prosperity and population departed. Much as Wagner’s music is better than it sounds, Cairo seems a better site for a city than it actually is. Charles Dickens wasn’t fooled. Here’s his description from 1842:

 A dismal swamp, on which the half-built houses rot away: cleared here and there for the space of a few yards; and teeming, then, with rank unwholesome vegetation, in whose baleful shade the wretched wanderers who are tempted hither, droop, and die, and lay their bones; the hateful Mississippi circling and eddying before it, and turning off upon its southern course a slimy monster hideous to behold; a hotbed of disease, an ugly sepulchre, a grave uncheered by any gleam of promise: a place without one single quality, in earth or air or water, to commend it: such is this dismal Cairo.

That’s not going on the brochures. But that’s just as well. We couldn’t find any brochures anyway.

The True Tale of Grandma Rathbone: An Odd Coincidence at Lincoln’s Birth and Death

Hello valued bistro customers,

My apologies for not having a fresh dish prepared just for the bistro today. I’ve been cooking up a couple, but I had to delay them because I was so busy sorting out an odd story about Lincoln, which I posted at my own blog:

http://brandonclaycomb.com/2013/misc/the-true-tale-of-grandma-rathbone-an-odd-coincidence-at-lincolns-birth-and-death/

I’ll have more good mental roughage for you next Monday. Until then, keep chewing!

Brandon

Walter Was Right

Late in the Coen Brothers’ The Big Lebowski, the Dude (Jeff Bridges) corrects Walter (John Goodman), informing him that the three German men threatening to hurt them are nihilists, not Nazis. Walter replies with typical profanity,

Nihilists! **** me. I mean, say what you like about the tenets of National Socialism, Dude, at least it’s an ethos.

Walter was right, and there’s some interesting recent social science research backing him up. As this article in Slate explains, Nazism seems to have spread fastest in those German communities with the greatest concentration of voluntary civic organizations – choral groups, athletic teams, associations of animal breeders, and so on. The stronger the bonds between individuals in a community, the easier it was for one of them to convince others to come along and join the Nazi party. (My own experience playing on a softball team during my year in Germany serves as counter-evidence, but of course this was well into the post-Nazi era and we literally could not even turn a double play, so there was no danger of us ever annexing the Sudetenland. But I digress.)

The point I take from this analysis is simple: social power is not good in and of itself, any more than electricity is. Power is the ability to effect change in the world. The more that power is used to good ends, the better. The more it’s used to bad ends, the worse. The gasoline that fuels your car when you drive your child to the emergency room is power used well. You can also gas up with the express intent to run someone else’s child down. That’s power used poorly.

This point is so simple that I could understand someone questioning why it’s worth bringing up. The answer is that I think we often forget it when we make a standard critique of our contemporary world. As the Slate piece notes, political scientist Robert Putnam’s Bowling Alone (a 1995 essay that grew into a 2000 book) is only the most prominent of many attempts to argue that Americans have largely ceased to take part in voluntary civic organizations that build trust, form relationships bigger than circles of friends, and make it easier for people to assemble spontaneously to take on a common problem. That is a shame, if we really are getting worse at building these sorts of attachments; it sounds like a recipe for greater loneliness and isolation. But even if such relationships are good in themselves – and I think they are – they might be used for ill purposes as well as worthy ones.

So the next time you see someone who seems like a total loner – the kind of person who will never join a bowling league – it may seem counter-intuitive, yet it might be right to say, “Well, at least s/he probably won’t ever become a Nazi.” Nihilists aren’t suited for an ethos.

Brief Reflections on Turning 42

So I turned 42 today. It feels like I’m getting into the groove of this being over forty thing. Actually turning forty seemed like a novelty. Forty-one was just more of the same. But forty-two? I can see the pattern now.

There’s something so cocky about calling this “middle age,” as if the universe owed me a good forty more years. But they didn’t call it “The Thirty Years’ War” until the war was over. They didn’t finish year fifteen and say, “Well, we’re half done now.” Things end when they end and you don’t know when. One of our girls got her front teeth slightly loosened on a festival ride the other day and had to go get a late-night X-ray. (She’s fine.) When her twin found out that she had to go to bed while her sister got to stay up late having fun at the dentist’s office, she said, “That’s not fair.” I laughed and said, “I don’t think that’s what that word means.” But I guess I’d feel a similarly misplaced sense of injustice if I discovered that I was closer to the end of my stay here on earth than to the middle. Maybe that’s a bad attitude. Back when he was only mildly insufferable, way back when he co-hosted ESPN’s “The Big Show” with Dan Patrick, Keith Olbermann had a standard line when he had to report that a slightly injured baseball player was listed as “day-to-day”: “We’re all day-to-day.” So we are. Might as well face it.

I don’t honestly remember the event, but my first conscious experience of mortality must have been when my Great Grandfather Burt Thurman (“Mr. Burt”) decided he wanted to ride his horse one more time and got thrown. He lasted a little while in the hospital but the fall basically killed him. The whole family had been watching his ride, including me, just three years old. As they were getting ready to drive him to the hospital, I took it upon myself to cheer him up by saying, “Mr. Burt, a lot of people think it’s awfully funny, an old man like you falling off a horse.” They told me he laughed about it later, but he wasn’t very happy at the time. I like to think that I was just trying to give him a little perspective.

About three years later I received my second exposure to death. Mom and I came home from school on a hot early June day and found our sheep dog Morton had strangled himself trying to get to his water. Mom ran inside. At the time I thought she was upset, but now I’m pretty sure she was running to call for help. In any case, the screen door slammed behind her and stuck, so I couldn’t get in. I had to stand there for what seemed like a long time right beside my dead dog. I didn’t like that.

But while I’m sad that Morton died so painfully and that Mr. Burt got hurt falling off his horse, I appreciate a little better now why we all have to move on. There’s a time to bloom and grow and a time to decay and fade, to make room for the next season’s blooms. I do like to imagine that every day in every way I’m getting better and better, but there’s increasing evidence that it isn’t so. If I’m improving at anything, it’s accepting that the roller coaster doesn’t only go up, and that most of the fun is in the coming down. So, as much as my planning counts for – which might not be much – I plan to be around for a lot more of this ride. But I may have already passed the peak. And that’s alright. Do not rage against the dimming of the light. Go gently in, and then sleep tight.

Hitting off the Tee in the Game of Reasons

One staple of Philosophy is that we human beings inhabit (at least) two worlds: the one in which events are determined by physical causes and the other in which actions are governed by reasons.

Wilfrid Sellars

Most books in pictures of professors like this are cardboard fakes from IKEA. Colleges and universities hand them out when faculty are hired.

The 20th century American Philosopher Wilfrid Sellars invented the phrase “The Game of Giving and Asking for Reasons” to describe aspects of the second world, and I’m adapting it here.

No, no. Wrong game. In the Game of Reasons you don’t win or die – you’re just right or wrong.

As I’ve mentioned in some other dishes here at the Bistro, there’s ample evidence that our reasoning doesn’t always function in our lives the way we like to think it does. We’d like to believe that our reasons are the causes of our behavior rather than just its after-the-fact rationalizations and excuses, but the evidence indicates that this is true only in certain controlled circumstances. Nevertheless, learning to play the Game of Reasons remains essential to our humanity in two respects: first, if we ever want to overcome our worst impulses – whatever they are – good reasoning will play a necessary role; and second, if we don’t want to overcome those impulses, but also don’t want to be blamed for acting on them, reasons will come in handy then too.

That’s why I’m always pleased to see my step-daughters hitting the ball off the reasoning tee. Here’s one of my favorite examples. Last fall A was sick and had to stay home while her twin sister B went to school. Around noon I needed to run some errands and took A with me. We stopped to get some cash and A said, “Look, it’s a McDonald’s.” I agreed that the building next to us was, indeed, a McDonald’s, and A continued by saying, “I sure would love some McNuggets right now.”

But I knew that didn’t tell the whole story. McNuggets, as you likely know, are nasty enough by themselves. I can’t remember or find the exact account, so I hope I’m not making this up, but I seem to recall Anthony Bourdain – author of Kitchen Confidential and star of shows such as No Reservations – being asked in an interview what the strangest food he had ever eaten was and answering “unwashed warthog anus.” (I’ll give you a second to reread that last phrase. Now let’s move on.) Oddly, the interviewer followed up by asking what was the most disgusting food he’d ever eaten, and without pausing Bourdain responded, “A McNugget.” The girls’ mother and I agree in principle, but on the rare occasions when we do take them to McDonald’s as a special treat they kick the gross factor up a notch by ordering vanilla ice cream cones and dipping the McNuggets in the ice cream. Keep in mind that these are people who mix virtually no foods. If we put vegetables in their eggs, they react like LL Cool J’s character in Toys. But this combination works for them. My arteries seize up in sympathy just watching this meal.

Fortunately, I had an answer. “Honey,” I said, because I talk that way, “I need to go to The Grange to get some hay and pellets for Turbo.” (And yes, we have a guinea pig named Turbo. Don’t judge us.) “That’s near the Chocolate Factory and I thought we could go there and pick out some candy for you and your sister.”

You don’t need a golden ticket.

A pondered this proposal for a moment as I pulled out onto Gilman Boulevard and said, “Well, if you think about it, I still have candy left over from last year’s Halloween and this year’s is coming up again soon. I don’t really need any more candy. I’d rather have the McNuggets.”

Well argued, I thought to myself. But I had an ace up my sleeve. “That makes sense, honey. But I know you’ll want ice cream with your McNuggets, and I don’t know a good way to get ice cream home for your sister.”

Again A thought this over as I continued on toward The Grange and the Chocolate Factory. “Well,” she said, “we don’t have to tell her.” 

“That’s true,” I responded, doing my very best not to laugh, “but I’m not sure that’s a good idea. Think about if the situation were reversed and your sister got to go to McDonald’s and you didn’t, and we didn’t tell you. I don’t think you’d like that very much if you ever found out.”

We crossed the last major street before our destination and I thought I had won. Then A played her last card. “One time last summer J (our neighbor) gave us all candy bars, except for T (her son), and she told us not to tell him.” I looked in the rear view mirror. A looked back at me calmly. She didn’t yet know the phrase, “So there’s precedent,” but I could tell she got the concept. I turned around. We went to McDonald’s. It was as gross as ever, watching that deep-fried breading dip into the sugary white goo. And I knew that I was rewarding problematic behavior. No amount of explaining exactly why I turned the car around would convince the dopamine receptors in A’s brain – which at that very moment were marinating in all that oily, bready goo – that it was reasoning I meant to reward, and not duplicity. But I decided it didn’t matter. As usual, what was most important to me was that we were playing together – and one of my favorite games. We would work later on playing the Game of Reasons right. Right then, I was just so happy watching her take a smooth swing and make contact.

Photo credit: Cynthia Freeland, Bantam, Boehm’s.

War is Never Civil

It begins today: the 150th anniversary of the first day of the Battle of Gettysburg. My post on the 50 Greatest Names in the Civil War will appear at The Weeklings this Thursday, in honor of the conclusion of both the last major southern invasion of the North and the surrender of the Confederate garrison at Vicksburg. Come check it out.

Far from being civil, the war was distinctly cruel (though, as The Atlantic’s Ta-Nehisi Coates has pointed out numerous times, the preceding centuries had been as cruel or crueler to the slaves who did so much to build this country). One theory I’ve come across – and am sorry I can’t source – suggests that it was so unrelenting in large part because both sides were democracies. This goes against the grain of much contemporary political theory, which holds that democracies tend to work out their differences with each other by more peaceful means. There may be something to the latter idea, but the flip side is that when they do go to war, governments of, by, and for the people do so with a vengeance. This makes sense: if a government that accurately reflects the will of its people decides to fight, those people are probably willing to suffer great hardship to see the struggle to its successful conclusion. When absolute rulers went to war in the late medieval and early modern periods, they could typically only rely on the loyalty of their vassals and their hired mercenaries, which seldom lasted past a few successful battles and one failed one. That’s part of why partisans on both sides of the American Civil War assumed that their war would be over soon: after a few battles demonstrated which side had the upper hand, the other would give up. But this was a different sort of conflict. Majorities in both the North and the South valued the cause they were fighting for enough to risk their loved ones, their households, and their sanity. There are few if any clearer instances in the country’s history of people making sacrifices on behalf of something greater than themselves. Nor have we ever been exposed to such sustained horror.

I said something to Chef Robert once about not quite knowing why I was so interested in the Civil War, and he responded, “You’re from the South. Of course you’re interested in the Civil War.” And that’s fair. I’m not just from the South, I’m from Abraham Lincoln’s birthplace. And the county Lincoln was born in supposedly only gave him three votes in the 1860 election. (That may be an exaggeration, but he couldn’t have gotten many tallies.) Kentucky, like every border South state except for Virginia, stayed in the Union, but its allegiance was strained, as evidenced by the fact that it was Lincoln’s worst state in his 1864 reelection. By a margin of roughly 70% to 30%, Kentucky voters went for the Democratic candidate, former general George B. McClellan, who promised a return to a more limited war, restoring the Union without emancipating the slaves. They hadn’t given up the hope for a country that had ceased to exist and could never be again.

Trying to understand this perspective has led me down various paths through the endless woods that is the scholarship on the Civil War era. Recently I’ve been reading about the Whig party, and have run across a few surprising insights there. I had underestimated how early and how thoroughly slavery had begun to split the sections. I had thought that the Whigs started out fairly unified on economic issues – they were the “liberal” party, supporting higher taxes (in the form of tariffs), infrastructure projects (mainly roads and canals, and later railroads), and national institutions (particularly banks) – but fell apart in the 1850s over the expansion of slavery. But in fact, the Southern and Northern wings of the party were at odds from the beginning. At the party’s midpoint, the devastating election of 1844, the Whigs’ great champion Henry Clay struggled and failed to keep both portions happy, never finding a position on the annexation of Texas and the consequent expansion of slave territory that satisfied southerners without alienating increasingly abolitionist states such as New York. Had slavery not been an issue in that contest, the Whigs would likely have won, and many of the nationalizing projects that the Republicans ended up passing after so many southern representatives and senators departed Congress in the 1860s might have gone through much earlier, and American history might have progressed entirely differently. But it always had been an issue, and couldn’t help being one. What had brought the Whigs together was opposition to the allegedly monarchical tendencies of President Andrew Jackson. But this was a flimsy basis for a coalition, tying together as it did those who wanted a weaker executive and a stronger nation with those who wanted a weaker executive and more independent states – most of all, so that the United States would never be sufficiently powerful to abolish slavery. There’s a great early Simpsons episode in which Apu is applying for citizenship. When his interviewer asks him what caused the Civil War, Apu begins a long dissertation on the complex economic and social forces that had increasingly divided the two sections over decades, and the interviewer interrupts and tells him, “Just say ‘slavery.’” It’s funny, but maybe for the wrong reason. The deeper I dig on the Civil War, the more I find slavery at the bottom. It’s been called America’s original sin, and I’m inclined to agree.

The other finding that’s surprised me is how much the political parties of the era were explicitly founded in opposition to particular groups. The best example of this is the American party, also known as the Know-Nothings. The Whig party did fall apart in the early 1850s, largely because the southern wing thought the northern part was too abolitionist, even though abolitionist voters in the North thought the Whigs were too pro-slavery – because of their affiliation with the party’s southern wing. The Republicans – a distinctly anti-slavery and almost entirely Northern party – were the ultimate beneficiaries of this collapse, but that wasn’t inevitable. The Americans, their main rival, won many state-level elections in both the North and the South. Like the Whigs, they tried to avoid questions about slavery. But rather than stressing economic issues, they instead focused on opposing immigration and immigrants – particularly Catholics. The Democratic party was splitting on sectional lines at this point too. The electoral weight leaned toward Illinois’s Stephen A. Douglas and his principle of popular sovereignty – the right of the settlers of each territory to decide whether or not it would have slavery. This was almost surely the median position of the country’s voting population as a whole, but by this point it had become anathema not just to abolitionists but to the pro-slavery faction as well. When Douglas and his platform won out at the 1860 Democratic national convention, the bulk of the southern delegates walked out and held a separate convention to select their own candidate, which ultimately handed the Presidency to Lincoln. And so, during this period pretty much the only party with a claim to a truly national constituency was the Americans. Or to put this another way: the fight over slavery so divided the South and the North that the only thing a significant section of the electorate across both regions could agree on was how much they distrusted Catholic immigrants.

It’s a problem of collective action. For a nation to be more than a loose collection of individuals, we need a cause greater than our solitary self-interest. But as Machiavelli knew, fear is the easiest group motivator. Most white southerners were so scared of slave revolts that they tried to destroy the country over it. Northern Republicans were scared of the spread of slavery and the loss of the West for expansion by free-soil whites. Americans were scared of all the Irish and Germans they thought were taking their jobs and spreading crime. These fears were all strong enough to bring parties together. And I find that scarier than I find it inspiring.

And yet, and yet. Our ancestors did fight that war. And they did end slavery. It’s sad that that’s what it took. It’s sadder still that the further struggle to overcome the legacy of slavery lasted another century and more. But today I suggest that we celebrate even the country’s imperfect triumphs, along with its never-ending effort to deliver a “new birth of freedom.”

Our Blinkered, Biased Friends

How biased are you? What about your friends? One of these questions is simpler to answer than the other.

It’s easier to triangulate our friends’ judgments than our own. Imagine, for instance, sitting between two friends who are rooting for opposite teams in a game you’re not personally invested in. And let’s say that they are both emotionally invested enough in the outcome that they’re at least tempted to view each call in favor of their team as fair but each call against their team as a travesty. You’re in a tough position as a friend of both people but in a great position to test which of these people is more objective. You get to evaluate each call by the officials, and to hear both sides’ responses to those calls, without being swayed by your own elation or devastation as a fan. If you put your mind to it, you can probably give a fair judgment about which of these two friends is better able to set aside what she wants to be true and see accurately what actually is true. (Please be warned that this usually isn’t a service that people want from their friends unless they’ve asked for it. And sometimes not even then.)

It’s much tougher, of course, to triangulate your own judgments like this. You can try, but you have to put yourself in two positions at once: you have to be both the interested fan and the disinterested observer of the fan. Most of us don’t do this naturally, but there are some tricks that can help, as one of my favorite stories shows. Early in his career an Israeli psychology professor was put in charge of a special project. He headed an interdisciplinary committee tasked with running a major conference and then publishing the conference papers as a book. As it happens, the theme for this conference was objectivity.

At the committee’s third meeting the psychology professor said, “Let’s use some of these objectivity-developing techniques we’ll be talking about at the conference on ourselves. First, let’s check out our subjective intuitions. How long do we think that this whole project is going to take from start to finish?” They went around the room giving their best guesses. Their answers ranged from twelve months to two and a half years, with eighteen months as the median answer.

Then the professor said, “Now let’s ground our judgments more objectively. Dean, you’ve worked with two of these projects already, correct?”

The dean said he had.

“From what you’ve seen so far, would you say our group is performing better, worse, or on par with those other two groups?”

The dean answered, “Probably a bit worse, actually. We’re going a little slower.”

Then the professor asked, “And how long did those two earlier projects take from start to finish?”

The dean blanched. “They both took eight years.” He had had this evidence at his disposal when he gave his more subjective answer earlier, yet he had still guessed that the current project would take only two years.

It took nine.
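
If it helps to see the professor’s trick laid bare, here is a minimal sketch of that “outside view” move – my sketch, not his procedure. The base-rate numbers come straight from the anecdote; the 10% “we’re a bit worse” adjustment is just an illustrative assumption.

```python
# A minimal sketch of the "outside view" move in the story above.
# The base rates come from the anecdote; the 10% "we're a bit worse"
# slowdown is my own illustrative assumption, not anything the dean said.

past_project_years = [8, 8]      # how long the dean's two earlier projects took
relative_slowdown = 1.10         # "probably a bit worse" -> assume ~10% slower

base_rate = sum(past_project_years) / len(past_project_years)
outside_view_estimate = base_rate * relative_slowdown

inside_view_estimate = 1.5       # the committee's median gut answer, in years

print(f"Inside view:  {inside_view_estimate:.1f} years")
print(f"Outside view: {outside_view_estimate:.1f} years")  # about 8.8; it took nine
```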

So it is possible to get a more objective view of the biases that skew our own judgments. But there’s another challenge, which the article “Objectivity in the Eye of the Beholder” reveals. We all feel like we try to check how biased or objective we and our friends are. And when we see one of our friends being a typical unreasonable fan – the kind who sincerely believes that the referees are trying to make her team lose – we’re generally comfortable chalking her behavior up to her biases. But what about when we see our own biases, as the dean did? Are we as willing simply to say, “Well, I guess I’m biased”?

Not at all. And in a sense, there’s a good reason for this. After all, when I do recognize my own bias, I don’t usually just throw up my hands. I either try to make the appropriate adjustments or else convince myself that I didn’t really make a mistake in the first place. But note that both of these responses are ways of preserving my sense that my judgments are consistent with reality. In the one case I commit myself to making better judgments, whereas in the other I decide that reality really is in accord with the way I see it. Each is preferable to accepting that I’m just consistently off. That would be like beginning most sentences with, “I know this is wrong, but -” That sort of nonchalance is just terribly difficult to maintain.

Unfortunately, this makes us much more likely to recognize other people’s biases than our own. Since I thought this was an interesting finding I started sharing it at conferences. I found that people agreed wholeheartedly. I had never before seen whole rooms shake their heads so enthusiastically. Finally I realized this might be additional evidence for the same point, so I added a question as a test. The next time I presented the conclusion, “People have no trouble recognizing others’ biases but tend not to see their own,” and once again everyone nodded yes. So I asked, “Are you nodding because you think this is true about other people, or because it’s true about you too?” This time I received silence. Then about half the room laughed and the other half frowned.

So here’s the challenge: Can you sustain this contradiction, Whitman-style, and accept that your biggest bias might be that you cannot see your own biases? Or is this one of those Philosophical puzzles that is fun to play with but that we have to let go when it’s time to get on with our lives? (The classic example, of course, is Zeno’s paradox, which demonstrated convincingly that it’s impossible to ever get from point A to point B. At some point Zeno probably told a friend he was coming over that evening to share his great new paradox. I’m hoping the friend listened attentively and then asked, “Then what are you doing here?”)

Atoms and Anecdotes

I’ve been thinking lately about the reality of stories. I don’t have much of a penchant for doing science myself, but I respect its ability to uncover truth enough to consider the scientifically revealed universe as fundamental reality. That is to say, as much as I struggle to understand how atoms work, much less quarks, I believe that reality necessarily consists in the interactions of such particles. By contrast, I am significantly more invested in stories – hearing them, creating them, passing them on – even though I am much less certain about the extent to which I should consider them real. Now, you may say, “What difference does that make?” – but this is a Philosophy bistro, after all, and that’s really kind of a sour attitude you’re taking. Now here: have a refreshing cup of lime and mint-infused ice water as I get your appetizer order in. You’re welcome.

My stepdaughters know I’m full of it. We’ve lived in the same house together for a little over a year, and they’ve gotten in the habit of responding to almost anything I say by asking, “Really?” I always immediately confess if I’ve made up the thing I just said, and try to keep something close to a 50% truth ratio going. It’s particularly fun when I’ve found some fact that seems so absurd that it must be fiction, so that I get to swear “Yes” over and over to each repeated “Really?” At the same time, it’s as important to me as it is to their mother that we be truthful with each other when we need to. Lying about, say, whether you threw a toy at your sister is never okay. What’s more, the girls seem to have some predilection toward science, and especially because I know how easy it is for girls and women to get pushed out of those fields, I’m doing my best to encourage empirical observation and other basic science skill sets to help them stay on that track if they choose to. Yet I smile most when they tell stories too – when they engage in some ridiculous word play or spontaneously come up with an absurd explanation. As long as we still keep track of what the truth is for when we need it, I value that kind of play more than I value the truth.

So maybe it just doesn’t matter whether stories are real or not. Except that, as a matter of fact, I think they are. That reality isn’t, of course, the same as that enjoyed by atoms and quarks. But stories are an aspect of our shared environment. Saturday Night Live’s Jack Handey once quipped, “I bet one legend that keeps recurring throughout history, in every culture, is the story of Popeye,” which is funny because that’s actually pretty hard to imagine. Popeye doesn’t seem a likely candidate for a cultural universal. But he is widely known to Americans, so much so that your audience is likely to understand many references to him instantly. I’m somewhat embarrassed to admit that nearly every time I throw a little olive oil in the pan I hear his scraggly voice singing, “Oh Olive Oyl.” (As voices in your head go, this isn’t a bad one.) But even if most people wouldn’t remember that, they would know that spinach instantly makes you extremely strong and prone to getting into fistfights with your enormous forearms. This isn’t a tremendously rich store to draw on. At most it might liven up, say, your encouraging your daughter to eat her spinach, or to not get in fistfights, or to eschew ethnic profiling. (What ethnicity was Popeye, anyway? And has anyone ever actually talked like that?)

Other stories are much richer, though. Sacred texts like the Torah, the New Testament, and the Quran have been passed down perhaps most of all through individual stories like the parable of the good Samaritan, which people repeat in order to make sense of their own and others’ actions. (The Coen Brothers’ A Serious Man has fun with this by presenting a series of inscrutable parables, including “The Goy’s Teeth,” that it resolutely refuses to explicate for the audience.) Harriet Beecher Stowe’s Uncle Tom’s Cabin provided such a condemnatory picture of slavery that pro-slavery authors fought back with competing novels with titles like Uncle Robin, in His Cabin in Virginia, and Tom Without One in Boston. But Stowe’s narrative largely won out and her account of Eliza’s escape with her son over the icy Ohio River provided common inspiration for abolitionists all over the North in the decade leading up to the Civil War. Generations later, many African-Americans would so object to aspects of Stowe’s characterization of slaves that “Uncle Tom” would become a denigrating epithet for a black man seen as acting submissively toward whites.

All of these examples indicate some of the many ways that stories provide a shared means of interpreting our experiences, without which we would struggle much more to understand each other. Maybe if stories can have that much effect, it doesn’t matter whether we consider them “real” or not. “Made up” is a kind of real too.

Your Mind: A General or a Lawyer?

The Philosophy Bistro is a leader in the “slow thought” movement.

If you haven’t read Daniel Kahneman’s Thinking, Fast and Slow, let me give it a plug here: it’s a fun and educational read for anyone interested in how the human mind works. Kahneman’s main innovation (in collaboration with his research partner Amos Tversky) was to show that normal human beings are vastly more irrational than was previously thought. Into the 1970s, social scientists – and not just economists – still assumed that under standard conditions, most adults acted rationally most of the time. Through a series of clever experiments, Kahneman and Tversky demonstrated that this is far from the case. Here’s one example.

Imagine that Bruce Springsteen is coming to your town. (Odds are good that this is true.) And let’s say that you consider going. You think about how much you’re willing to pay for tickets, and decide that you’d spend $20 for a seat. This is simply a quantification of your preference. You’d rather listen to “Blinded by the Light” live than have that $20 still in your pocket. That is the current value of that experience to you.

But later that day you’re listening to the radio and the station’s having a contest and giving away a free ticket to tomorrow’s Springsteen concert to the winner. You pull over (do pull over, please), call in, and give the correct answer to the question, “What are the names of all of the US state capitals west of Los Angeles, California?” (Answer at bottom of this post.) Congratulations! You’re the proud new owner of a free ticket to see Bruce. You drive right over to the stadium to pick it up.

And then you lose it. That’s what you get for having lunch at Denny’s. Those scrambles take up the whole table. You shouldn’t have had the ticket out in the first place, but it was so fun to show off. (Next time, come to the Bistro instead.) But you still want to go to the concert. And you were thinking about buying a ticket anyway. So, you check the prices again. But they’ve gone up. They must be selling out, and the venue’s trying to maximize its take as the supply dwindles. Are you willing to pay more than $20 to get your lost ticket back?

If you were a purely rational being, your answer should be, “No.” The same ticket was worth $20 to you earlier in the day. It ought to still be worth $20 to you now. But this is where what Kahneman and Tversky termed “loss aversion” comes in. The threat of losing something we already have makes us value it higher. The typical “loss premium” in their experiments came out to about 50%. That is, if you’re like most people, you would pay up to $30 to get back the ticket you lost, even though you would have only been willing to pay $20 for it to begin with.
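
If you like seeing the arithmetic spelled out, here is a tiny sketch of that ticket example. The 50% figure is just the rough premium described above, and the function name and framing are mine, purely for illustration – not Kahneman and Tversky’s actual model.

```python
# A tiny illustration of the loss premium in the ticket example above.
# The 50% premium is the rough figure cited in the text; the helper is my
# own framing for illustration, not Kahneman and Tversky's formal model.

def max_repurchase_price(original_willingness_to_pay: float,
                         loss_premium: float = 0.5) -> float:
    """The most a loss-averse buyer would pay to recover something already 'owned'."""
    return original_willingness_to_pay * (1 + loss_premium)

print(max_repurchase_price(20.00))  # 30.0 -- the lost $20 ticket now feels worth $30
```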

This is a bug built into our systems, and it causes trouble. I used to see it working on space issues at colleges. Office shuffles could be maddeningly difficult because (a) professors had greater veto power than employees usually do, so you needed at least their tacit approval, and (b) that approval was hard to get because almost nobody was willing to give up anything they valued, at least not easily. You could show someone her new office, which was nicer than her current space in every way that mattered to her – including the nice windows and its location near her colleagues and the cafeteria, but just enough off the main drag to provide a little quiet. But if it was 20 square feet smaller than her old office, she likely wouldn’t want to make the trade. Time and again, these discussions would ignore all the large gains and focus almost entirely on the small loss. I occasionally became frustrated because I was sure that the person I was working with was having trouble making a decision that would make them more satisfied in the long run because it felt so bad to give anything up. Many of these situations worked out eventually. Some didn’t. That’s life with loss aversion. And though I find it easier to see in others than in myself, I know I suffer from it too.

Things are even worse than I’ve made them sound, though, according to Kahneman. I’ve just given one example of human irrationality. His book is full of dozens, which he deploys to argue that the mind can be thought of as two systems, which he terms simply “system 1” and “system 2.” Because the Bistro is a Philosophical venue, I’m going to ignore Kahneman’s cautions against reifying and romanticizing these systems (What else is Philosophy for?) and call them “the intuitive self” and “the reflective self.” The intuitive self is the one that acts instantly. It’s the you that reacts to an oncoming car by steering out of the way more quickly than your more deliberative decision-making process could function. The reflective self, meanwhile, is the one that thinks situations through in detail, that focuses intently, that calculates. So far, so good.

Here’s the thing: Your reflective self thinks it’s the one in charge. As the Civil War-era saying goes, “There’s nothing so like a god on earth as a general on a battlefield,” and that’s the way the reflective you thinks about itself. And why not? It’s so good at coming up with persuasive explanations for your behavior. But that’s part of the problem, as another Kahneman experiment demonstrated. The experimenters flashed participants a series of paired pictures, which they had them rate by comparative attractiveness. (Here’s one face. Here’s another. Which one was better looking?) The participants only got a glance, so this was their intuitive self at work. After they said “The first” or “The second,” the experimenter brought the preferred picture back out and asked why it was more attractive than the other one. With the added time, the reflective selves had no problem coming up with elaborate accounts of all the reasons for their choices. And at first glance this looks like a moment of harmony between the two systems: the intuitive self decides, the reflective self explains. It’s a perfect division of labor.

But it’s a sham. Or at least it can be. You see, half the time the experimenters brought back out the picture the participant hadn’t selected. (That is, the participant said, “The first picture is better looking,” and the experimenter pulled back out the second one.) In most instances the participant did not catch the switch. (If this seems unlikely, click here.) And in those instances they had no trouble explaining their “choice”. That’s the big finding: your reflective self can make a case for a decision you didn’t make just as easily as it can for one you did. That’s not reasoning, that’s rationalizing. The reflective you isn’t the general it thinks it is. It’s a lawyer.

That’s what lawyers do, after all. Within certain broad boundaries, they aren’t expected to pursue the truth of the matter; they’re expected to give you the best chance to win. That’s why they say such ridiculous things sometimes. A lawyer is someone who will say, “I didn’t do it; and if I did do it, I had a very good reason.” (This is what trips up Thomas More with the jury in Wolf Hall.) Most of us have trouble being quite so mercenary about our relationship with the truth, at least in public. We expect others to expect us to say whether we did it or not, full stop. But a good defense attorney doesn’t want to make things that easy for the prosecution. His goal isn’t to uncover what happened, it’s to keep his client out of jail. And with the least bit of cover that everyday life away from courtrooms tends to provide us, that’s the approach most of us take most of the time. Unless we put the extra effort in, our natural inclination is to explain events to ourselves and others in ways that put us in the best possible light.

At least that’s what Kahneman’s work suggests. And though I don’t mean to be a complete cynic, I am inclined to agree. We’re capable of better. Often we do better. But without deliberate and skilled effort, this is our fate: to live with half the mind of a lawyer. And that’s a bit of a shame, given that we generally don’t think that much of those who have the whole thing. (Note: I kid because I love. When your father, brother, and wife are all lawyers, you hear a lot of lawyer jokes.)

Let me close with a question: if your mind is neither a general nor a lawyer, then what is it?

Photo credit: Farrar, Straus and Giroux

Answer to the radio call-in question: Honolulu, Juneau, Olympia, Salem, Sacramento, and Carson City. (The California coast trends dramatically east around Santa Barbara, much more than your mind’s eye would probably lead you to believe. People who got the correct answer can redeem their Springsteen ticket here.)
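
And if you would rather check that answer than take my word for it, here is a quick sketch. The longitudes are approximate values keyed in from memory, so treat them as ballpark figures rather than survey data.

```python
# A quick check of the call-in answer: which state capitals lie west of
# Los Angeles? Longitudes (degrees west) are approximate, from memory.

LOS_ANGELES_W = 118.24

capital_longitudes_w = {
    "Honolulu": 157.86, "Juneau": 134.42, "Olympia": 122.90, "Salem": 123.04,
    "Sacramento": 121.49, "Carson City": 119.77, "Boise": 116.20,
    "Phoenix": 112.07, "Helena": 112.04, "Santa Fe": 105.94,
}

west_of_la = [city for city, lon in capital_longitudes_w.items() if lon > LOS_ANGELES_W]
print(west_of_la)
# ['Honolulu', 'Juneau', 'Olympia', 'Salem', 'Sacramento', 'Carson City']
```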