Stanley Milgram, the Phantom of the Laboratory

October 16th, 2014

Aeon Magazine has a brilliant piece arguing that the Milgram obedience experiments are better viewed as performance art than as science:

To view the Milgram experiments as a work of art is to include the haunted young doctor as a character, and to question his reliability as a narrator. As an artwork, the experiments can tell us about much more than obedience to authority; they speak to memory, trauma, repetition, the foundations of post-war social thought, and the role of science in modernity. There is no experiment that can prove who we are but, in its particulars, art can speak in universals. Long after his tests are considered invalid, Milgram’s story will live on.

According to author Malcolm Harris, a recent book has cast critical light on Milgram’s science:

In Behind the Shock Machine (2012), the Australian journalist and psychologist Gina Perry assailed the very validity of the Milgram experiments. Although she initially came to the study of Milgram with sympathy for the haunted doctor, Perry quickly found a more worthy object for her feelings: Milgram’s subjects. Reviewing transcripts from the experiments in the Yale archive, she found a lot of disobedience hidden in the obedience numbers, and a number of confounding variables. For example, Milgram made sure subjects knew the payment for participation was theirs even if they walked away, but in the transcripts this seems to have triggered reciprocity with the experimenters. One subject continues only after the experimenter tells him he can’t return the money. Another obedient subject remonstrates after she’s finished obeying, because she quickly understands what the experiment was really about and is disgusted. In the drive for quantitative results, the procedure ignored valuable qualitative information. ‘I would never be able to read Obedience to Authority again without a sense of all the material that Milgram had left out,’ Perry writes, ‘the stories he had edited, and the people he had depicted unfairly.’

Read the whole piece; it’s fascinating. Ever since I became aware of science theater as a thing–about nine years ago or so, when I joined the Underground Railway Theater board–I’ve been surprised that so few plays focus on the social sciences, particularly psychology. It’s all wonderfully wifty mathematical metaphors or inspiring laboratory breakthroughs or medical ethics–never Pavlov or Milgram or Zimbardo. There have been a few plays about H.M., whose surgery-induced amnesia gave psychologists the chance to discover much about the workings of memory, and there’s a popular one-set two-hander in which Freud and C.S. Lewis debate philosophy more than science. But that’s about it.

Psychology experiments are wonderfully theatrical. Even the most boring thing you can do–give undergraduate psych students a bunch of surveys and flog the results for correlations*–requires a set, a script, and carefully arranged props. As the experimenter, you are playing a role and must stay in character. And that’s just surveys. When you get into social psychology and experiments with deception and confederates, it’s explicitly dramaturgical.

Which makes me wonder, then, if that isn’t part of the reason psychology doesn’t find itself onstage much. When the experiment itself is a little playlet, maybe that’s hard to dramatize. Except backstage dramas, comedies, and musicals are terribly popular, and twice-baked potatoes are a delicious food, so why should thematic doubling be so problematic?

Maybe the problem is that the results of psychology experiments, certainly the most famous ones, aren’t inspiring. Science plays tend to have the human being as the subject of the science, an inherently agentic and frankly inspiring stance. We are the species who figured out our origin! We can reach the stars and cure disease! Occasionally, a play will focus on people as the objects of science or technology–patients struggling with the complexities of medical science and politics, workers displaced by machines. This, too, is agentic, and if not inspiring, it can at least be ennobling. We are complex and worthy! We will fight for our rights!

Psychological science, and any stories you can think of to tell about it, has humans as both its subject and its object. It’s all us. And when it’s all about us, there’s no privileged place to put the human perspective. We are the dark continent being explored, and we are the explorers. And the bottom line is that what we’ve found in many of those explorations is extremely complicated, qualified, but undeniable evidence that under many circumstances, humans suck. We conform needlessly yet ignore important information. We literally do not see what is in front of our eyes. We are suggestible, vain, overly influenced by inappropriate cues, and wildly mistaken about our own nature. We are tribal, mistrustful yet gullible.

Discovering this does not make us feel inspired, or ennobled. The Milgram experiments can be viewed as art, and might indeed have been better art than science, but psychological science in general doesn’t tell the kind of stories that audiences want to hear. Science is about increasing our knowledge and control of the world around us. Psychological science shows us over and over how little knowledge and control we have, even of our very selves.

Do I sound like a Victorian, saying that audiences want to be “inspired” or “ennobled” by tales of scientific derring-do? Perhaps. But they do, dammit. The only possible response to learning about the Milgram experiment for the first time is “Aw, fuck. Really?”

You just can’t leave an audience in that state of mind and expect word-of-mouth to sell out your show. You just can’t.

*Needless to say, this is that thing I swore I would never, ever do for my dissertation, and then wound up doing.

Art & empathy

October 13th, 2014

I’m trying to choose reading material for our upcoming vacation, and am debating whether or not to give the “Game of Thrones” books a whirl. I like political intrigue and family drama and dark, violent themes in my books–I also read fast, so a nice long read is a good thing, for a trip. (Even with a Kindle, it’s nice to stay in the same fictive world for a while, when your external environment changes every day.) On the other hand, I’m terrible at visualizing while reading, so extended battle sequences are a no-go, and also I can never follow espionage plots, so there’d better not be any of those.

I’ve asked my friends to chime in, but of course they aren’t going to take any of that into account; they’re simply going to tell me whether or not to read the books based on whether or not they liked them.

It’s a real cognitive load, apparently, to regenerate your memory of a particular experience and judge it against someone else’s criteria. I got interested in that idea a few years before I finished grad school–the idea that most people, even thoughtful and other-oriented people, find it very hard to recommend books (or movies, leisure activities, or restaurants) that will appeal to someone else. The mind defaults to a kind of distributive property of affection: If I like Friend, and I like Book, Friend will surely like Book!

Recently I was in D.C. and had dinner with a writer friend and his wife, a lovely and gracious couple, and mentioned to them after dinner that I was planning to visit the Smithsonian Museums the following day. I must, I was told immediately and enthusiastically by my friend, must see the Air & Space Museum. As he extolled its virtues I made eye contact with his wife, and in her dancing eyes I read the following: “You’re all about the First Lady dresses and the Great Hall of Mammals, aren’t you?” Yes, yes I am. Technology and machinery do not excite me, and I think my friend probably realized that on an abstract level, but his enthusiasm got the better of him. The kind of arts-empathy I was looking for really is difficult to generate and maintain.

There are at least two reasons for this. One is the unconscious assumption mentioned above, that everyone and everything I love must also love each other. The other is that memory doesn’t work like a video recorder. We remember what we encode, and we encode what is relevant to us. When my friends with kids ask me if a particular book or play would be appropriate for their child, I usually can’t answer, because I wasn’t watching or reading through that filter in the first place. So the stuff that I’m looking for, or looking to avoid, in an artistic experience might not even exist in your memory of that experience.

I think, though, that the basic reason people are terrible at predicting what other people would like is simply that they’re not trying hard enough. Almost all social reasoning–figuring out who to trust, what social cues to mimic, etc.–is done through rough, semi-aware heuristics. Social interaction is so much the water in which we fishies swim that half the time we’re not even fully conscious of it. But you can improve your reasoning through active effort.

I used to randomly shove books that I loved at people I loved, and sometimes it would “take” and sometimes it wouldn’t. In grad school, my dissertation was on mental models of literary genres–or how people think about different kinds of stories–and I started using my own research to better predict what I, and my good-read-seeking friends, might enjoy. It’s a fun exercise.

Here are some questions to ask yourself about what types of books (or plays, movies, television shows–stories are stories, to some extent) you enjoy.

Do you prefer

… stories about a complex individual or stories about a whole society?
… stories in which people compete with each other or in which they cooperate to solve a problem?
… stories that are universal in theme or stories that paint a vivid picture of a particular time and place?
… stories about extraordinary, unusual people and events, or stories about the ordinary and everyday?
… stories in which people are from very different walks of life, or stories about groups of equals?

It’s a different way of thinking than the usual “mystery,” “science fiction,” “romance” categories. See if it helps you make better recommendations!

Stephen King is making Ebola worse

October 7th, 2014

Salon addresses Ebola panic:

Ebola, at least from the American perspective, is something like the great white shark. It’s dangerous, all right, but the odds that it’s going to get you are vanishingly small. Fear of large predators and fear of the plague are deeply encoded in human experience and handed down from our ancestors. Maybe an instinctive response is invoked that we can’t resist. But in both cases, the self-refueling cycle of media panic is an epidemic that’s almost certainly more destructive than the original phenomenon itself — and the fear is not really about what we claim it’s about.

Author Andrew O’Hehir identifies the usual suspects for our collective overreaction: cognitive biases honed by evolution, fear-mongering by Fox News and its ilk, and the fact that the Ebola epidemic fits neatly, oh, far too neatly, into the kinds of stories we’ve already learned to tell and read:

Indeed, I’d suggest that Ebola-panic (like shark-panic) is shaped and informed by fictional thrillers — in this case, yarns about civilization-destroying plagues and the zombie apocalypse and so forth. It also taps into our cultural narcissism and xenophobia, into the paranoid imperial perception that American civilization is the center of the world and also that it’s precariously balanced, and constantly under attack from dangerous outsiders. All it takes is a handful of African visitors with cardboard suitcases and undiagnosed infections, and next thing you know the cable goes out at Mom’s house and we have to eat the neighbors.

Theater and science bump up against each other in all kinds of ways, and one of those ways is understanding the psychological science of storytelling. Humans are a narrative species: we put everything in story form. But reality is under no obligation to actually unwind itself like a well-told tale. In real life, events may occur that do not foretell, call back to, or symbolize anything at all. They just happen.

Storytelling can be crucial to good science, but one thing science does is to slap us out of that storifying instinct, and give us a way to demonstrate reality to other people besides telling stories about it. Artists tell. Scientists show.

I’m struggling now to have a rational response to the Ebola crisis. Practically every friend I have has posted the NPR “You’re Not Going to Get Ebola Already” graph:

… and I believe it, I really do.

But if there were going to be a zombie apocalypse … this is what the beginning of it would look like.

I’m a Stephen King fan going back years, see, and what people who think they don’t like Stephen King don’t realize is how utterly mundane and realistic his work is–until the werewolves show up. Until then, it’s ordinary people living ordinary lives. A New England couple, say, who are doing basically okay, although she’s a little bored in her career and he’s coming off a big project and feeling burned out and they’ve both got some eldercare worries hanging over their heads and are planning a vacation in the Southwest to recharge their relationship.

And as he’s digging out from a mountain of licensing agreements and P&L statements and she’s looking up dude ranches in Flagstaff, they see the headlines and video clips from Africa … and then the quieter news of one patient identified in Dallas … and an editorial in the nation’s paper of record about what “virologists are loath to discuss openly but are definitely discussing in private.”

This is exactly how Stephen King would write it.

And stories fit in my head better than statistics. I don’t have to behave irrationally, and I can despise the fearmongering and xenophobia that people are bringing to this situation, but I can’t respond to it as though I haven’t spent decades reading and watching stories that began exactly like this.

Art will always have unintended consequences. Stephen King is a great humanitarian, a good writer, and by all accounts one hell of a mensch. But he’s taught us how horror looks–not in a Transylvanian castle, but in a Somerville three-decker. He’s taught us to see the terror in the everyday, he’s pulled it out of the gothic tradition and pushed it into comedies of manners and coming-of-age tales. So that now, when we see some loose thread of worry, it’s so easy to imagine pulling it until the entire garment of our comfortable-if-annoying middle-class lives unravels.

Science informing theater: Autism-friendly “Lion King”

October 1st, 2014

A friend of mine posted this on Facebook* and I found it fascinating. “The Lion King,” playing in Boston through October 11, will be doing an “autism-friendly” performance on October 10. From Boston Magazine:

The show is still the same production that we all know and love, but with some slight tweaks in order to create a sensory-friendly and, most-importantly, judgment-free environment. Some of the unique elements include: a reduction of jarring sounds and overall intensity and volume level; the elimination of strobe lights focused on the audience; the addition of a “calming area” for audience members; and trained staff and volunteers to provide real-time support.

“They leave the house lights up so that people can come and go,” [director of state government affairs for Autism Speaks, Judith] Ursitti says. “That’s a big accommodation that they provide. Many times, people with autism need a sensory break and they need a place to go. The production itself, what you see on the stage, the changes are subtle. It’s mainly sound and lighting changes. The scene with the hyenas in the elephant graveyard where there’s a lot of little geysers shooting up and there’s lots of light and noise, they only do one little light, and special effects like that are reduced.”

Now that’s science theater! The idea that autistic kids might enjoy plays, but have a hard time coping with the sensory overload and the social rules of theatergoing, is frankly groundbreaking. Until recently–I mean until very recently–we were thinking of autism only in terms of deficits in social reasoning. And if autistic people didn’t understand the games people play and the motivations that led them to play those games, what on earth could they possibly get out of going to a show? Increasingly, though, researchers are looking at the autism spectrum in terms of sensory processing. This is clearly the model that the modified “Lion King” is using.

This article in Salon–an excerpt from Gregory Hickok’s book on neurology and cognition–is a heavy read, but does an outstanding job explaining the various controversies in the field. Here he is on the logic of the sensory-overload hypothesis:

This kind of effect—hyper-responsivity leading to avoidance— is observed regularly and uncontroversially in the sensory domain. Autistic individuals often cover their ears when even moderately loud sounds are present in the environment and exhibit other forms of avoidance behavior. As with the rock concert sound system example at the beginning of this chapter, if an autistic person failed to get information out of moderately loud sounds or simply left the room, we wouldn’t say that he or she had a diminished capacity to hear the sound. The response is more readily explained as an increased sensitivity to sensory stimulation. As autistic author Temple Grandin said in a radio interview, “How is a person going to socialize if their ears are so sensitive that just being at a restaurant is like being inside the speaker at a rock ‘n’ roll concert and it’s hurting their ears?” Good question.

One piece of evidence cited for autistics’ supposed lack of concern for other people’s mental states is that autistic people often do not look at faces, either in social situations or in lab experiments. However, what if faces contained too much information for them to focus on?

Also consistent with the alternative, emotional hyperreactivity hypothesis are statements from autistic individuals themselves. Here’s a sample gleaned from a paper covering face processing in autism:

It’s painful for me to look at other people’s faces. Other people’s eyes and mouths are especially hard for me to look at.

My lack of eye contact sometimes makes people, especially my teachers and professors, think that I’m not paying attention to them.

—Matthew Ward, student, University of Wisconsin

Eyes are very intense and show emotions. It can feel creepy to be searched with the eyes. Some autistic people don’t even look at the eyes of actors or news reporters on television.

—Jasmine Lee O’Neill, author

For all my life, my brothers and everyone up ’til very recently, have been trying to make me look at them straight in the face. And that is about the hardest thing that I, as an autistic person, can do, because it’s like hypnosis. And you’re looking at each other square in the eye, and it’s very draining.

—Lars Perner, professor, San Diego State University

These are revealing statements for two reasons. First, they provide a clear indication of an intact theory of mind in these individuals (“my lack of eye contact . . . makes people . . . think that . . .”). And second, active avoidance of eye contact provides just as much evidence for sensitivity to the information contained therein as does active engagement of eye contact. If you can’t recognize that there is information in the eyes, why avoid them?

In this piece from the New York Times, a father recounts how Disney movies have enabled him to connect with his autistic son. Owen Suskind’s extreme affinity for Disney movies gave him an emotional vocabulary, a set of images and metaphors and models for being that he could use to interact with the world around him. He learned to read by sussing out the credits.

Owen’s chosen affinity clearly opened a window to myth, fable and legend that Disney lifted and retooled, just as the Grimm Brothers did, from a vast repository of folklore. Countless cultures have told versions of “Beauty and the Beast,” which dates back 2,000 years to the Latin “Cupid and Psyche” and certainly beyond that. These are stories human beings have always told themselves to make their way in the world.

But what draws kids like Owen to these movies is something even more elemental. Walt Disney told his early animators that the characters and the scenes should be so vivid and clear that they could be understood with the sound turned off. Inadvertently, this creates a dream portal for those who struggle with auditory processing, especially, in recent decades, when the films can be rewound and replayed many times.

The latest research that Cornelia and I came across seems to show that a feature of autism is a lack of traditional habituation, or the way we become used to things. Typically, people sort various inputs, keep or discard them and then store those they keep. Our brains thus become accustomed to the familiar. After the third viewing of a good movie, or a 10th viewing of a real favorite, you’ve had your fill. Many autistic people, though, can watch that favorite a hundred times and seemingly feel the same sensations as the first time. While they are soothed by the repetition, they may also be looking for new details and patterns in each viewing, so-called hypersystemizing, a theory that asserts that the repetitive urge underlies special abilities for some of those on the spectrum.

Disney provided raw material, publicly available and ubiquitous, that Owen, with our help, built into a language and a tool kit. I’m sure, with enough creativity and energy, this can be done with any number of interests and disciplines. For some kids, their affinity is for train schedules; for others, it’s maps. While our household may not be typical, with a pair of writerly parents and a fixation on stories — all of which may have accentuated and amplified Owen’s native inclinations — we have no doubt that he shares a basic neurological architecture with people on the autism spectrum everywhere.

The challenge is how to make our example useful to other families and other kids, whatever their burning interest. That’s what Team Owen seems to be talking about. How does this work? Is there a methodology? Can it be translated from anecdote to analysis and be helpful to others in need?

Yes, parents of neurotypical kids, there are children who want to watch “Frozen” over and over again in a way that makes your daughter look like a quitter. Let it go!

From laboratory to stage to family rec room, scientists and artists and parents are using stories and theater to understand the human mind–and using our increasing knowledge of the human mind to tell stories in new ways. Ways that more of us can understand.

This kind of thing excites me, and fills me with great hope.

Yes, I am on Facebook! Also Twitter. Come see me!

Autobiographical memory and H.M.

September 24th, 2014

Do women remember life events better than men?

A better question might be, do little girls get taught to remember better than boys? According to Slate, this might be the case:

Researchers are finding some preliminary evidence that women are indeed better at recalling memories, especially autobiographical ones. Girls and women tend to recall these memories faster and with more specific details, and some studies have demonstrated that these memories tend to be more accurate, too, when compared to those of boys and men. And there’s an explanation for this: It could come down to the way parents talk to their daughters, as compared to their sons, when the children are developing memory skills.

To understand this apparent gender divide in recalling memories, it helps to start with early childhood—specifically, ages 2 to 6. Whether you knew it or not, during these years, you learned how to form memories, and researchers believe this happens mostly through conversations with others, primarily our parents. These conversations teach us how to tell our own stories, essentially; when a mother asks her child for more details about something that happened that day in school, for example, she is implicitly communicating that these extra details are essential parts to the story.

And these early experiments in storytelling assist in memory-making, research shows. One recent study tracked preschool-age kids whose mothers often asked them to elaborate when telling stories; later in their lives, these kids were able to recall earlier memories than their peers whose mothers hadn’t asked for those extra details.

But the way parents tend to talk to their sons is different from the way they talk to their daughters. Mothers tend to introduce more snippets of new information in conversations with their young daughters than they do with their young sons, research has shown. And moms tend to ask more questions about girls’ emotions; with boys, on the other hand, they spend more time talking about what they should do with those feelings.

A few years ago, I was leading a post-show talkback after a production of “Yesterday Happened,” a play at Central Square Theater about Henry Molaison, better known as “H.M.” H.M. was a man born in the 1920s who suffered from severe epilepsy, and the surgery used to cure it–removal of most of his hippocampus and amygdala–also prevented him from ever forming new memories. He lived in 10-minute increments, much like the man in “Memento.” Much of what we know about how human memory works comes from experiments performed on H.M.

Anyway, one of the audience members who stayed for the talkback pointed out that H.M. seemed to have a notable lack of memories from before his surgery as well, and what was up with that?

I said that I didn’t know, but that H.M. reminded me of my father in many respects–the same generation, general ethnic background and social class, IQ and intellectual ambitions, overall temperament–and he hadn’t had a lot of specific memories, either. I pointed out that memory isn’t an automatic recording of events, and that my father simply never bothered to encode a great deal about his own experiences, and you don’t remember what you don’t encode. He was taught to value facts, observable phenomena, and social expectations–not his own personal mythology. He didn’t make a big deal about his life story and the various chapters thereof, the way people do today.

After the talkback, an older man in the audience came up to me and said I was exactly right about the psychology of men of his generation and station in life.

There’s a new play about H.M. in town, if I’ve managed to pique your curiosity! The guy was important–they never write science plays about the subjects of experiments, for heaven’s sake, and H.M. has been the star of two, now! The new one is “The Forgetting Curve,” by Vanda, a Bridge Rep production playing at the Boston Center for the Arts. This weekend they’ve got some great memory experts to lead post-show conversations:

Wednesday, 9/24 – Dr. Howard Eichenbaum; Dr. Daniel L. Schacter
Thursday, 9/25 – Dr. Ayanna Thomas
Friday, 9/26 – Bob Linscott
Saturday, 9/27 – Dr. Bonnie Wong

“The Forgetting Curve” runs through this Saturday. Check it out!

“Cartoon dramas,” political & personal

September 23rd, 2014

The Globe published a good op-ed this weekend by Meta Wagner, a writing instructor at Emerson, about “cartoon dramas”:

But, now there’s a new, popular TV genre that somehow pulls me in while preventing me from becoming fully invested. I’ve come to think of it as the cartoon drama.

With cartoon dramas, the people, the storylines, and the situations are so unreal — or perhaps hyper-real — as to be laughable, which perfectly befits cartoons but not traditional dramas. These shows (their precursor is “24”) take the most frightening and horrifying political events of the day and present them in an over-the-top, unbelievable, outrageous fashion. It’s television for an age where we’re concerned and terrified yet simultaneously suffering from compassion fatigue: the age of ISIS, ISIL, the beheadings of two American journalists, war in Syria, a do-nothing Congress, the militarization of our police forces, the Ebola virus, etc.

And so viewers not only turn to sitcoms and reality TV to escape, we also turn to cartoon dramas to confront the ugliness of current events, but in a way that can leave us ultimately untouched. Murder, torture, corruption — none of it sticks.

She identifies “Scandal,” “Homeland,” and “House of Cards” as three of the biggest offenders, or perhaps I should say “delighters.” Ever since Bertolt Brecht, we’ve known that while drama inherently draws people in, there are also techniques it can use to push an audience away–not in the sense of disengaging, exactly, but in the sense of making people aware, suddenly or stubbornly, that they are watching a piece of staged entertainment. Brecht called it the “alienation effect.” If you’ve ever seen a show where you can see all the ropes and pulleys backstage, or where the stagehands move the furniture around in plain sight, not trying to be unobtrusive–that’s a little Brechtianism, right there.

Television can’t simply show you the wires and hired help, like theater can, but it has other ways of reminding the audience that this is just a show. (Besides the most obvious one, commercials–which to this day no one has employed to better Brechtian effect than Alfred Hitchcock on his 1950s program “Alfred Hitchcock Presents.”) Television can get the alienation effect by being over-the-top, or self-referential, or–and no stage director would dare try this–simply not very good.

I wrote an analysis similar to Ms. Wagner’s about “Law & Order: Special Victims Unit,” which I still consider, pace “Scandal,” to be the finest exemplar of the genre.

An aesthetic style that would continually shift audiences between sentimental empathy and critical awareness is called “epic theater.” It was a groundbreaking idea a hundred years ago, and the smartest theater artists in the world are still exploring this extraordinarily fertile concept today.

“L&O: SVU” achieves epic theater status by the simple expedient of not being very good.

Or, more precisely, being bad in very specific ways that keep the viewer from being overwhelmed by the horror of the actual stories portrayed in the show. Those stories, and the actors who perform them–those are often very good indeed.

In the episode “Disabled,” for example, the detectives watch a video recording of a caretaker beating a paralyzed woman with a bar of soap in a sock. The woman in the wheelchair has advanced multiple sclerosis; she can feel the beating, but not dodge or even scream beyond choked moans and grunts. The video goes on for several minutes, one woman mercilessly pounding another across the head, face, breasts. The detectives are repulsed–even Ice-T is visibly shaken. The video cuts out.

After a moment of silence, the forensic psychiatrist, Dr. Huang, speaks. “I think Janice deeply resents having to care for her sister.”

YOU THINK? Let me tell you, Bertolt Brecht is kicking himself in his grave, if such a thing is possible, for not putting Dr. Huang in “The Good Woman of Szechuan.”

This is how “L&O: SVU” works. It doesn’t distance the viewer with theatrical “breaking the fourth wall” tricks. It distances the viewer by providing such an excess of information–never recognized as such by the characters–that the “Duh” response of any normal person is triggered several times an episode. This makes it possible to actually enjoy tales of horror that would otherwise be far too disturbing.

Whether the fears are international terrorist threats or the psychopath next door, “cartoon drama” helps you put them in a box and cope. You can read the whole thing here.

Story Collider at Oberon tonight (and my own science story)

September 23rd, 2014

Story Collider has a show at the Oberon tonight at 8pm, on the theme of “Survival of the Species.” Story Collider is one of those “paratheatrical science events” I’ve talked about, and it’s a good one. From their website:

Science surrounds us. Even when we don’t notice it, science touches almost every part of our lives. At the Story Collider, we believe that everyone has a story about science—a story about how science made a difference, affected them, or changed them on a personal and emotional level. We find those stories and share them in live shows and on our podcast. Sometimes, it’s even funny.

Tickets to tonight’s show are only $12–$10 for standing room–and you can drink, and meet interesting people, and walk around Harvard Square before or afterward. What are you waiting for?

I did a Story Collider last year, and I hope it was funny. It was about two baby rabbits that I raised as a child, and what I learned from them. Here’s the podcast–it’s about nine minutes long. The theme of the show I was in was “It Takes Guts.”

And a transcript:

Nine years old, and the next-door neighbor comes over with a big cardboard box. In the box, hasty handfuls of freshly mown grass. In the grass, two baby rabbits, the size of mice.

Do I want them, he says. He found them in his lawn, and picked them up before he thought twice, and now feared the mother would reject them. Did I want to try my hand at raising them.

Do you not understand that I am a nine-year-old girl? Some part of me wondered, as the rest of me shrieked agreement in a pitch so high a dog began barking across the street. Of course I want the bunnies!

My mother was ready to sue the guy. A Depression-era baby from Queens, she was like some deeply religious primitive who looks at animals with no grasp of their differences in locomotion or dietary requirements, only of their ritual cleanliness or uncleanliness. There were the horses in Central Park, which were Nice, and there were all other animals, which were Not Nice. Intellectually, she was capable of recognizing differences in species, but emotionally, every animal was either Horse or Cockroach to her.

She was furious at the notion of Not Nice animals in her clean house, but the love of a nine-year-old-girl for small baby animals is a love that burns too bright to be denied. So she got her revenge another way, by explaining in graphic detail exactly why our neighbor had picked them up in the first place, and the connection between the unusually small size of the litter and the presence of that freshly mown grass in their box. The horror! Oh, the rabbinity!

Guts.

Which, on the metaphorical level, one of my little rabbits had to a far greater extent than the other. And this was where a psychologist was born. I would put my hand in the box—one would crawl in and explore, the other would race in panicky circles. One was tame and calm, the fat bunny Buddha of his cardboard world, happy to be petted, to eat bits of apple right off my fingertips. The other one treated me like I was a war criminal. My footsteps signaled terror. The day I took the box to a vacant lot and tipped it over, one dashed for cover—the other lingered, unwilling to leave.

Two rabbits. The same litter. The same rabbit upbringing, disrupted by the same nightmarish slaughter, the same miraculous rescue. And they were so tiny! Their little brains smaller than pencil erasers.

Somehow everything I had ever noticed about how the same song could make one person happy and another person sad, or how the kids in the Oklahoma school were nice to me but when we moved to Kansas I got bullied, or how Sunday School teachers could sometimes draw opposite conclusions from the same Bible story, crystallized around those rabbits and their impossible, irreducible difference.

Two years later, I read Watership Down, Richard Adams’ saga of a band of brave, bonny, British bunnies escaping existential threat for a better life. I cried for a week. (My mother was like, “Honey, it’s just a book. About cockroaches.”) I took to imagining my own foster rabbits’ adventures in Watership Down style, and it occurred to me that if those rabbits could tell their own stories, what very different versions they’d tell. Were their happy early days the source of a sustaining faith, or a childish illusion to be ripped away? Were the mysterious giants benevolent rescuers or only more subtle tormentors? Did the tipping over of that box into that field represent long-dreamed-of freedom, or expulsion into a savage and chaotic wilderness?

Personality is story. The story of a glass half empty or half full, if nothing else.

A friend of mine is a developmental biologist who works with all kinds of small lab-able animals, from mice to fruit flies. I asked her once how simple an organism could be and still have anything akin to personality.

She said she had worked with flatworms that can do one of two things with their lives: plank on the bottom of the beaker, or hug themselves against the side of the beaker. This is the big existential choice you face as a flatworm; being a career counselor for flatworms gets boring fast. Cut a flatworm in half, each half will regenerate into a whole, equally traumatized flatworm, identical to the original. And frequently, one half will be a side-hugger, the other, a bottom-planker. Planaria personality! Flatworm flair!

We once thought that humans were the only animals who used tools. No. Who made tools. Not that either. Who possessed language, an artistic instinct, morality—one by one, we are nudged from our exclusive pedestals. But still, still, we are the only species that tells stories. Homo narrativus. Who expresses that willful nubbin of self we call “personality” through planking plot, side-hugging symbolism. Tell me your stories, and I’ll tell you who you are.

My own stories have always been those of wanderers, of the ones born in the wrong place who must seek a new home. And the day came when, like Hazel and Bigwig and Fiver from Watership Down, I too began to sense that the place where I lived (Missouri) threatened my well-being. So like those brave and bonny bunnies, I too set out for a better place: Boston. Where I would become a psychologist who studied the science of stories. And that is my story of science.

If you were a meat puppet, could anyone tell?

September 9th, 2014

If an alien took over your body and controlled your speech and actions, how long would it be before anyone noticed?

That’s not exactly the research question that “cyranoids” are designed to answer, but they could. Neuroskeptic reports that a couple of British psychologists, Kevin Corti and Alex Gillespie, have replicated two “cyranoid” experiments originally done by Stanley Milgram, of obedience-experiment fame.

“Cyranoid” was Milgram’s coinage–from Cyrano de Bergerac–for a person who is not speaking for or as themselves, but merely repeating words that another person is giving them. Cyrano had to hide in bushes and whisper loudly enough to be heard by Christian–and the audience–but quietly enough to be unnoticed by the fortunately rather dim Roxanne. This is all much easier with modern technology, and Corti & Gillespie were able to set up a microphone-and-monitor system that allowed Person #3 (Cyrano) to listen in on a conversation between Persons #1 and #2 (Roxanne and Christian) and feed “lines,” appropriate or inappropriate, to Christian.


Different aesthetic, same idea.

People didn’t notice, not even when Christian was a 12-year-old boy with his conversation being supplied by a 37-year-old psychologist as Cyrano. Maybe Roxanne wasn’t so dim after all.

Neuroskeptic calls this “Milgram’s creepiest experiment” and writes:

If I started shadowing someone else’s speech, would my friends and family notice? I would like to think so. Most of us would like to think so. But how easy would it be? Do we really listen to each others’ words, after all, or do we just assume that because person X is speaking, they must be saying the kind of thing that person X likes to say? We’re getting into some uncomfortable territory here.

I’m not sure that much surprise is warranted, although I envy Neuroskeptic’s easy confidence that his loved ones are truly listening to him. We know people often attend more to the form than the content of other people’s speech–this is why Miss Conduct often recommends giving “placebic excuses” when ruffled feathers need to be soothed. And there’s a whole series of experiments showing that people don’t notice change in their environment. (I don’t mean “How could you not notice I changed the shelf liners, honey,” either–I mean like you’re talking to a whole ‘nother person than you started talking to, and you still don’t notice.)

More to the point, though, people aren’t going to twig to a cyranoid because cyranoids don’t exist. As Corti & Gillespie write,

It seems that when encountering an interlocutor face-to-face, people rarely question whether the “mind” and the “body” of a person are indeed unified–and for good reason, as social interaction would be undermined if we began to doubt whether each person we encountered was indeed the true author of the words they expressed.

The authors point out that people do often notice identity discrepancies “in artificial environments (e.g., Second Life and other virtual community games) wherein users can construct outer personae which starkly contrast with their real-world identities.” You don’t even need to go into immersive environments–even the comment threads on opinion blogs will tend to feature people accusing others of not really being a member of whatever group they’re attempting to speak for, or of adopting a sock-puppet identity, or the like. When we know that people’s words and being need not match up, we can be quite vigilant about clues.

I always figured that’s how Starfleet crew members managed to cotton on so quickly whenever their colleagues got possessed by the Aliens of the Week. Deanna Troi learned all the Signs of Alien Possession to watch out for when she was in psychology school, just like nowadays you learn the signs of addiction or suicide risk. I don’t even want to think how long it would take me to notice if my boss got assimilated by the Borg.

Corti & Gillespie write that people have always been fascinated by the idea of persons speaking through other persons, or different identities in the same body:

This well-known story [of Cyrano de Bergerac] is but one of the many examples of a fantasy that has appeared in the arts and mythology throughout history–that of the fusion of separate bodies and minds. Other illustrations include The Wonderful Wizard of Oz, in part the tale of a fraudster who is able to attain great power by presenting himself to the world through an intimidating artificial visage. The film Big entertains the folly that ensues when an adolescent boy awakens to find himself in the body of a middle-aged man. More recently, films such as Avatar and Surrogates have imagined hypothetical futures in which mind can be operationally detached from body, allowing individuals to operate outer personae constructed to suit their social goals. Fiction though they may be, these stories illuminate the power façade has over how we are perceived by ourselves and by others, and how we and others in turn behave in accordance with these perceptions.

Paul Bloom hates empathy (good on him)

September 5th, 2014

Yale psychologist Paul Bloom wrote an amazing takedown of empathy in the Boston Review. You don’t need to feel another person’s pain in order to be a good person–empathy might even impede morality:

Strong inclination toward empathy comes with costs. Individuals scoring high in unmitigated communion report asymmetrical relationships, where they support others but don’t get support themselves. They also are more prone to suffer depression and anxiety. Working from a different literature on “pathological altruism,” Barbara Oakley notes in Cold-Blooded Kindness (2011), “It’s surprising how many diseases and syndromes commonly seen in women seem to be related to women’s generally stronger empathy for and focus on others.”

The problems that arise here have to do with emotional empathy—feeling another’s pain. This leads to what psychologists call empathetic distress. We can contrast this with non-empathetic compassion—a more distanced love and kindness and concern for others. Such compassion is a psychological plus … It is worth expanding on the difference between empathy and compassion, because some of empathy’s biggest fans are confused on this point and think that the only force that can motivate kindness is empathetic arousal. But this is mistaken. Imagine that the child of a close friend has drowned. A highly empathetic response would be to feel what your friend feels, to experience, as much as you can, the terrible sorrow and pain. In contrast, compassion involves concern and love for your friend, and the desire and motivation to help, but it need not involve mirroring your friend’s anguish.

This mirrors my recent experience with my mother. People naturally feel empathy for their parents, especially their mothers–you’re literally connected to her body for the first few months of your life. You learn to be a person by imitating and being imitated by her. To feel what your parents feel is normal.

Except now, my mother is in a nursing home and deeply unhappy with her life for very good reasons, reasons that neither I nor anyone else can do anything about. I felt her pain for a long time, and it damaged my life and turned our phone calls into stomach-churning ordeals. And did nothing to better her quality of life.

A few months ago, somehow, I managed to pull the plug, emotionally, and stop feeling the sadness and frustration and anger I had been carrying about her condition. The emotions are still there; I just choose not to … visit them. I cultivate an attitude of chipper detachment that feels horribly fake and a complete betrayal of everything my relationship with my mother has ever been. And it’s saving both our lives. I’ll take the guilt of being a Stepford daughter over the anguish of feeling too much any day of the week. It’s what my mother would prefer, too.

I’d be very curious to hear what any actors who read this blog think of Bloom’s essay (read the whole thing, it’s complex and fascinating). My sense is that actors are, generally, pretty damn in favor of emotion for its own sake. Emotion to actors is like sweat to athletes, someone said. Acting is difficult for me because I have a lifelong, learned habit/skill of pulling myself out of emotional situations. It’s why I’m a good advice columnist–I don’t get swept up in emotion. I hold back, I look at the big picture, I examine my reactions. It’s terrible for acting.

Speaking of acting, did you know there is such a thing as medical acting? Now there’s science theater! From Bloom’s essay:

Leslie Jamison makes a similar point in her new essay collection The Empathy Exams. Jamison was at one time a medical actor—she would fake symptoms for medical students, who would diagnose her as part of their training. She also rated them on their skills. The most important entry on her checklist was number thirty-one: “Voiced empathy for my situation/problem.” But when she discusses her real experiences with doctors, her assessment of empathy is mixed. She met with one doctor who was cold and unsympathetic to her concerns, which caused her pain. But she is grateful to another who kept a reassuring distance and objectivity: “I didn’t need him to be my mother—even for a day—I only needed him to know what he was doing,” she writes. “His calmness didn’t make me feel abandoned, it made me feel secure. . . . I needed to look at him and see the opposite of my fear, not its echo.”

I picked up The Empathy Exams at the library yesterday, and look forward to reading it and sharing my thoughts on it with you!

You can’t fake faces or physics

September 4th, 2014

Wired is doing a fascinating series on science and cinema. I can tell already that one of the major themes of this blog will be, “Science makes explicit what art has always known.” The Wired series brings filmmakers and scientists together to see where their knowledge overlaps.

The first piece is about visual processing, which is much more interesting than I thought it would be when I started grad school. “Seeing” is not a passive experience:

While film makers intuitively understand things about visual perception and attention, scientists are trying to understand these things at a more mechanistic level, Smith said. He and other scientists want to know, for example, how the brain constructs a fluid perception of the visual world. “Visual perception feels like a continuous stream, but it’s not,” he said. What actually happens is that we look at one thing at a time, taking in a bit of information here, then moving our eyes to take in a bit of information over there. Then, somehow, amazingly, our brain stitches all those bits together to create a seamless experience.

In filmmaking terms, this means that your audience members aren’t mere receptacles but co-creators of the art, actively–if unconsciously–ignoring this stimulus and paying extra attention to that one to make sense of the flickering images before them:

“We’re constantly calculating where we think the audience’s eye is going to be, and how to attract it to that area and prioritize within a shot what you can fake,” Favreau said. “The best visual effects tool is the brains of the audience,” he said. “They will stitch things together so they make sense.”

What you can’t fake, Favreau said, are faces and physics. Favreau is working now on an adaptation of The Jungle Book, and he says almost everything is CGI except the faces. Faces are just too hard to fake convincingly, he said, even with sophisticated motion capture systems designed to capture every eye blink and facial twitch.

“It’s the same with physics,” Favreau said. In the Iron Man 2 scene, his special effects team created replicas of Formula 1 cars with the same weight and dimensions as the real thing and launched them with hydraulics or air ramps to create the flying, cartwheeling spectacle you see onscreen. “You get a tremendous amount of randomness in the way these things bounce and tumble and roll as they hit the ground and interact with each other, and that creates a sense of reality,” Favreau said.

I don’t know what “no CGI faces” really means. If Favreau is implying that Caesar and Rocket are anything other than wonderful to look at, he’s out of his mind. But the human brain does have a particular capacity to recognize faces–a face isn’t just any old arrangement of meat, it’s very special to us. And humans also have an innate grasp of physical realities. If I threw a ball high in the air for my dog, and he didn’t see where it landed, he would keep staring up in the sky for as long as I would let him. A dog’s brain doesn’t automatically know that what goes up will always come down. A human’s brain does know that, and if something onscreen behaves in an impossible fashion, it will pull our focus.

The second story looks at what happens in people’s brains when they watch movies. Certain types of movies can totally synch up the audience’s brains. Scans show that the same areas in almost everyone’s brains light up at pretty much the same time:

“They do look very similar, but it’d be more surprising if they didn’t,” said Handel, who earned a PhD in neuroscience at New York University before getting into movies. “If you’re watching a movie, that’s your entire sensorium and your feelings.” If people’s brains were out of synch during a movie, Handel suggested, that might be a bad sign that their minds were wandering. One person might be thinking about the call they need to make, while another contemplates making a popcorn run.

Think about that the next time you’re at a movie! You and all your fellow audience members sitting in isolated silence, while your brains ebb and flow like a team of Esther Williams swim-dancers.


Your brain on film.

Real relationships with fictional people

September 3rd, 2014

Who is your favorite literary or pop-culture character?
Why?
Do you ever think about that person to get you through hard times?

This is another one of those bits of human nature that art and culture have long realized, and the psychological sciences are slowly catching up with. Of course thinking about inspiring people can give you the courage or patience to handle your own ordeals. That’s why people say “What Would Jesus Do?” When I was an undergrad, if I couldn’t muster up motivation for a study or library research session, I would pretend I was a student at Starfleet Academy. Starfleet cadets never lacked for motivation.

I reviewed a new study about this for the British Psychological Society’s Research Digest earlier this month:

While there is a clear, bright line between real people and imaginary people (I exist, Hermione Granger does not), there is no such line dividing real and imaginary relationships. (As far as you are concerned, dear reader, both Ms. Granger and I are studious women who exist only on the page or screen.) Even in our most intimate personal relationships, we are often interacting with a mental model of our partner or parent, imagining their current state of mind, or how they would respond to whatever situation we find ourselves in. Although operationalised in this article as relationships with fictional characters, other researchers have included connections with real people whom we don’t personally know (artists, politicians, athletes) and historical figures in the spectrum of parasocial relationships.

Parasocial relationships enable us to explore emotional and social realities without the risks inherent in the real world. The authors dryly note: “Readers and viewers are protected from social rejection and the physical danger of threatening circumstances; thus, forming a relationship with an interesting but potentially dangerous character (e.g., Tony Soprano) does not present the same obstacles in the narrative world as it might in the physical world.”

The paper suggests that these parasocial relationships help us envision a bigger, better version of our selves, much as our real-life relationships can do. I credit Daniel Day-Lewis’s performance as Lincoln with giving me the political will to begin the process of getting my mother into assisted living. I just felt so decisive after seeing that movie!

I’ve written before about parasocial relationships (read the whole thing here):

So given how much even our relationships with real people can take place in the imagination, it’s no leap to have a strong relationship with a fictional character. Some people are more inclined to this than others–and, counter to the geeky fanboy/girl Comic Book Guy stereotypes, it’s the people who are overall highly social and relationship-oriented who are most likely to have strong parasocial relationships as well. I tend to be very prone to them, myself: I really was in tears, yesterday, of happiness that dogs I have never met are going to survive and be safe. Certain writers–Dorothy Parker, Anne Sexton–have always felt like sisters to me. When I read Torah, I have extremely vivid images of the Four Matriarchs–if I could draw, I could draw you exactly what they look like to me.

Bringing up the Four Matriarchs, and Jesus, is no accident. Religion has always encouraged parasocial relationships with people you don’t know in the flesh, and uses stories and images to encourage adherents to identify and model themselves after various ancestors, saints, or demigods.

Sunday synthesis (no column): Rewards and the golden rule

August 31st, 2014

Because of the holiday, there’s no column today, so I thought I’d take this opportunity to pull some ideas together from previous weeks. A number of you liked my deconstruction of the golden rule from a few weeks back, in which I pointed out

… the GR is a wonderful starting place for ethics, but it can’t take you all the way into etiquette and the finer points of social interaction. The more positive phrasing implies that the GR is the be-all and end-all, which … well, look. You can follow “Do unto others as you would have them do unto you” to the letter and still out your birthday-having colleague to the staff at Applebee’s, as long as you think it would be great fun to have servers singing and clapping at you while diners at other tables gawk openly or stick to their conversations with grim determination.

You shouldn’t always do unto others as you would have them do unto you, in other words, because they’re not you and their tastes may differ.

The week before that, I’d written about behaviorism and the importance of knowing the person (or animal) you are attempting to “train”:

The mistake that some people make about behaviorism (including some early behaviorists) is thinking of it as mechanistic, emotionless, impersonal. This isn’t true at all. If you want to redirect someone’s behavior, for example, you need to find an alternative task that will be equally engaging. This requires understanding the other person’s skills, and what they find rewarding and enjoyable. Behaviorism has to take the nature of the individual into account.

Operant conditioning (a behaviorist approach) is about reinforcing the behaviors that you want, and not reinforcing the ones you don’t want. In order to do it effectively, you have to find reinforcements–rewards–that really motivate. It’s all too easy to stick with rewards that are easy to administer–like kibble for dogs or money for people. It’s tidy, it’s quantifiable, it’s portable. What Milo truly wanted from life was to chase squirrels, but I couldn’t very well carry squirrels around in my pocket as a reward for good behavior, so he got liver treats, which offered an acceptable balance of convenience and desirability.

Another mistake is to over-apply the golden rule when creating rewards. Unless we make a conscious effort, we naturally tend to assume that other people will like, want, or think what we do. Even, disastrously, when we don’t actually know, ourselves, what it is we do like, want, or think.

One of the best tools I’ve found for thinking about these kinds of differences in what motivates people is C. Brooklyn Derr’s “career orientations,” which I wrote about earlier this year for the Harvard Business Review blog. Knowing your own career orientation can help you figure out what kinds of jobs and projects you should pursue. If you’re in charge of creating rewarding experiences for other people–employees, students, clients–Derr’s taxonomy can help you think about what all those people who aren’t you might find exciting and validating.

And of course, there’s a character on “Mad Men” who illustrates each of Derr’s orientations.

One of the most common career orientations is getting ahead: “People who are motivated by upward mobility focus on promotions, raises, making partner, and increasing their authority. They’re competitive and willing to put in long hours and negotiate office politics to win those rewards. This is the default career model in the U.S., which means that it’s easy for those who want to get ahead to explain themselves to bosses, colleagues, and family.” (All of the orientations are described in greater detail here.)

Lots of people start off with getting ahead as their orientation, but some people stay that way even after they’ve achieved a certain professional level, like Pete Campbell. The getting-ahead people are easy to manage because what they want is what the system is set up to deliver: good grades, money, promotions, whatever.

The elevator to success only has room for one of us, Bob.

People with the getting-secure orientation have a harder time in today’s economy and corporate culture:

Those who seek regularity and predictability in their work environment are motivated to fit in with others and uphold group norms. They avoid risk and are less concerned with advancement than with career control. If this description has you rolling your eyes, you’re not alone. It’s difficult for people to admit they want this kind of security, because it sounds like the life of a corporate drone, which no one wants to be. That’s especially true today, given the rise of the free agent in all industries. But people motivated by security are loyal and willing to put in extra effort when the situation requires it — not just when it will bring them glory.

Sounds boring? It never looked boring on Joan.

Joan Harris is the poster girl for the getting-secure orientation. She values efficiency, decorum, and accuracy and expects everyone to do their part. She also values long-term financial security, which may or may not be tied to her continued employment at Sterling Cooper.

The opposite of the getting-secure orientation is getting free: “Derr describes people with this orientation as ‘hard to work with, impossible to work for, slippery as eels to supervise and manage, and infinitely resourceful in getting their own way.’ People who value getting free want autonomy and self-direction.”

Don Draper, of course. Yes, he and Joan are both silver-tongued hotties preternaturally good at their jobs, but the similarity ends there. Joan wants the trains to run on time. Don wants to know he can always jump a boxcar and blow town. Their seventh-season conflicts shouldn’t have come as a surprise.

Office stoner Stan Rizzo is not going to be the poster boy for getting high, because that’s not what Derr is talking about:

These are people who care deeply about deploying their expertise, solving problems, creating new things, and feeling engaged. They are ambitious and sometimes idiosyncratic. Unlike professionals intent on getting ahead (who might take on boring but important assignments in order to win favor with clients or managers), those motivated mainly by getting high will gravitate toward work that provides greater stimulation, even if it’s low-profile or high-risk.


Peggy Olson, on the other hand, is motivated by the work itself. It’s why we love her. It’s why she seems so modern.

Speaking of modern, Ken Cosgrove is a quiet revolutionary who consistently values getting balanced:

People with this orientation want to enjoy objective career success, personal development, and close relationships, and they’ll strive to achieve all these goals over time. They are unwilling to sacrifice a personal life to career demands, but they’re also unlikely to coast in a job for which they are overqualified to free up their time at home. They want challenge and fulfillment, both on and off the job.

Ken decides no account is worth being shot in the face. Go-getter Pete happily takes over the account.

Ken has published short stories–and continued to write under a pen name after being told that ad work left no margin for extracurricular activities. He handed over the prestigious Chevy account to Pete when the toll on his health became unacceptable. More than the other men, he talks about his children in the workplace.

***

Derr’s taxonomy isn’t meant to be a consultant’s version of astrology. Most people are at least somewhat motivated by all five of his categories–status, security, freedom, excitement, and balance–and people’s motivations may change over time as their circumstances do. Still, in general, most of us are going to identify more strongly with one or two of the types above. And we’re going to wind up teaching, managing, or working with all the other types sooner or later. It’s useful to get a sense of how they see the world.

“Her Aching Heart” and the science of romance (novels)

July 21st, 2014

Last night I dreamed I went to Central Square Theater again …

… because “Her Aching Heart” was so darn funny the first time (Globe review here). Aimee Rose Ranger and Lynn Guerra play modern-day urbanistas cautiously falling in love with each other while reading a gothic romance about a tempestuous English lady and the innocent peasant girl who sparks her affections. Only the occasional phone call or song hints at the present moment–most of the show is dedicated to the two actresses playing all the parts in the lesbian bodice-ripper. (Aimee’s bluff, rapey Lord Rothermere and Lynn’s palsied Granny, full of incomprehensible forest wisdom and whole-body tics, were my favorites.) Yes, I know it sounds stupidly complicated, but it’s not, really. If you liked the movie parodies Carol Burnett used to do, you’ll like this.


(Lynn Guerra and Aimee Rose Ranger in “Her Aching Heart,” A.R. Sinclair photography)

The romance parody in “Her Aching Heart” inspired me to dig up my dissertation, which was on the psychology of literary genre. I was curious to know if people had expectations about stories that went beyond surface characteristics (e.g., if it’s set in the future, it’s science fiction; if there’s a murder, it’s a mystery). I asked participants to rate 10 different genres, including romance, classics, science fiction, and fantasy, across 16 different dimensions.

Here’s a graph showing how “romance” (in red, natch) differs in people’s imagination from ordinary fiction (in black):

People perceive romance as dumber, basically–I said that in a fancier way in the actual dissertation, of course, but I think my advisers knew what I meant. Romance is seen as more predictable, simpler, upbeat, emotional, and fantasy-based than regular fiction: It’s written for money and read for fun. No wonder it’s so delightfully easy to parody! We don’t even feel bad about making fun of romance novelists, because we assume as long as they’re making bank they don’t care about critical opinion.

I did my dissertation in 2002, and I wonder how “romance” would be defined in today’s imagination. That’s the tricky bit about trying to scientifically study a cultural phenomenon like literary genre–it keeps changing on you. In 2002, I would occasionally encounter people who didn’t know what “genre” meant, because it was still a lit-crit term, not yet the way iTunes and Amazon and Netflix preferred to organize your content and sell you more. Romance-wise, 2002 was before “Twilight” and “Fifty Shades of Grey” and “The Fault in Our Stars.” Would these dark offerings lead college students today to rate romance as a more pessimistic, complicated (if not intellectual) genre?

Hangout of the “Planet of the Apes”

July 17th, 2014

Earlier this week I did a video broadcast with PeaceBang and NYT religion reporter Michael Paulson about religious themes in “Dawn of the Planet of the Apes,” which Mr. Improbable and I saw this weekend. Boy, the reading glasses were a mistake! But I had never done a Google hangout before, and wanted to keep an eye on the proceedings. We do give away most of the plot–elements that aren’t implicitly contained in the title, that is–so watch with caution.

More discussion after the jump


The Bostonian personality

July 1st, 2014

Commented to a friend from Kansas this morning that she really ought to consider moving out here, as her personality would fit much better in New England, which led me to muse on one of my favorite muse-snacks, the Boston personality. Here’s what I see, almost 20 years after I made the move from the Midwest myself:

Bostonians value honesty over tact and would rather discuss their opinions than their emotions. We expect people to have some kind of clear identity, whether it’s ethnic, professional, religious, or whatever. We have an innate understanding of multiple intelligences. This moderates the intellectual snobbery people expect from the city, although it also means, in practice, that most of us are easily intimidated by each other: I’ve seen physicists scared of actresses, lawyers intimidated by chefs.

We have no ability to move through space in a coordinated and efficient fashion, whether on foot or by car or bike, in striking contrast to New Yorkers, who navigate their city like schools of fish. Despite our terrible street signage, Bostonians place a high value on information and think that giving people the full and accurate intel to make a decision is an important etiquette practice. (The homeless people have more informative signs in Boston than in any large city I’ve been to.)

We are somewhat antisocial, although to us it feels more like respecting other people’s privacy, and avoiding the awkwardness that we secretly believe is inherent in every social interaction. (It’s no coincidence that half the cast of “The Office” came from Newton.) Bostonians will ghost at a party because we don’t want to put the host through an awkward goodbye when he’s deep in a conversation about string theory or the Sox with another guest.

What do you think? Am I right? What would you add?