What I don’t know about dreams

August 6th, 2010

Following up on yesterday’s post about dreams and Paxil — I learned a fair amount about dreams during my work with Alan Hobson. One thing I still don’t know, however, is why certain dream plots are so common: performance-anxiety dreams like the Actor’s Nightmare, having one’s teeth fall out, missing a train. And I don’t know to what extent these “common” dreams are culturally determined. If anyone has good resources on this — empirical, not mystical — I’d be interested to know about them.

The Paxil thing, cont’d

August 5th, 2010

So as I mentioned a while back, I went on Paxil about six months ago as part of the whole mind-body thing. Clearly, my gut was not going to calm down until my brain told it to, no matter how much yogurt and bananas I ate. (Yes, after about a month of no substantive posting, I figured I’d jump right into the deep end. Come on, you’re with me, right?)

Going on the Paxil coincided with cutting way back on drinking, and the two together did a real number on my dreams. Drinking alcohol before bed — even a seemingly modest glass or two of wine, if it’s a regular habit — can suppress dream sleep, which means that when you quit, you may get a bounceback effect. Add to that the fact that SSRIs intensify dreams, and things got quite exciting for a while.

After graduate school, I worked for a while with Alan Hobson on the psychology of dreams. As I’ve written about before, one of Alan’s ideas is that we solve problems in our dreams much as we do in real life; we simply don’t question the bizarre. Alan also believed that Freud and psychoanalysis had led people to focus too much on the symbolism of dreams. When you stop trying to figure that out, and instead focus on the story and the emotions, what the dream “means” will usually become quite clear.

The power of a dream lies in its story, and in how that story affects you. The set and props are just whatever your unconscious mind could most quickly grab: images from the day’s business; random memories that floated up in response to this color or that smell; faces or places you watched on television before bed. This is why there’s no point to “dream dictionaries” that purport to tell you what the various symbols in your dreams mean. Dream symbols are at once universal (ever go through a computer training with co-workers, and discover afterward that many of you dreamed of the program you were learning that night?) and idiosyncratic (a cigar may be merely a cigar to Sigmund, but it might symbolize the Cuban embargo to Rosalita, or her father’s cancer to Dora, or even a penis to James).

Anyway, about a month or so after I’d been on the medication, I had a dream that nicely illustrated both the principles above and the effect that Paxil had had on my problem-solving style.

I’d been over to a friend’s house that night to catch up on some Tivo’ed episodes of “Big Love.” (It’s a fun show to watch in batches — when you watch several episodes back-to-back, you realize that every time someone smiles, something horrible happens within 10 seconds.) Unsurprisingly, that night, I had the classic Actor’s Nightmare: I’d been cast as Bill Henrickson’s fourth wife, but no one had bothered to give me a script.

Was I anxious or worried? Oh, heck no. I have a fair amount I’d like to say to those characters, so until the directors put a script in my hand, I was going to say what I thought. (I recall telling first wife Barb, “Listen to how Bill yells orders at you! My boss doesn’t talk to me that way, and he’s my boss! A person’s spouse certainly shouldn’t bark at them like that.”) And if the director or other actors didn’t like what I had to say, well, give me the script, already, and I’ll stop improvising and say what you want.

Have you ever had a dream that used to make you anxious, but doesn’t anymore? Or a kind of dream you stopped having once certain problems in your waking life got resolved? Or a dream that makes more sense to you now that I’ve talked about the “story, not symbolism” principle?

So, the Paxil thing

June 8th, 2010

As I mentioned last week, I went on Paxil in December. My digestive system had not been working well for a couple of years, and what had once been an occasional annoyance turned into a full-time debilitation by last fall. As it turned out, I have the trifecta: IBS, gastritis, and esophageal reflux disease. This was going to require not only medication, time, and rest, but significant changes to my diet and cutting out alcohol. And as my gastroenterologist is one of the good ones who realizes that I am not a bunch of interconnected malfunctioning tubes, but a person, she suggested I go to behavioral health and get myself on, as she put it, “something that will help you cope without dissolving your esophagus.”

Now, here’s the thing. Every time I had gone to this doctor before with a bout of gut misery, she would ask, “Are you under any particular stress?” And I would say, “No.”

And, given the incredible difference that Paxil has made in my life, I was obviously wrong. Why didn’t I answer the question correctly?

Because of how it was asked, that’s why. People are notoriously susceptible to answering a question based on how it’s worded: for example, a recent study showed that more people agreed that “gay men and lesbians” should be able to serve in the military than that “homosexuals” should be able to serve in the military. Same question, obviously, but “homosexuals” sounds clinical and perverted, while “gay men and lesbians” sounds like people you know.

The question “Are you under stress?” or “Are there particular stressors in your life?” is a question that leads me to look outward, away from my emotions and to the objective circumstances of my life. And every time I did, I simply couldn’t see anything that could be, almost literally, twisting my gut into knots. My husband and I get along well. We are both in relatively good health, physically and financially. Yes, sometimes it can be difficult to juggle multiple jobs and projects, but I’ve always preferred to have a lot going on (and in this economy, having multiple sources of income seems like a good thing). I have good friends to confide in. What did I have to be stressed about?

But if she’d said, “Do you feel anxious?” — oh, I would have given a very different answer to that. Because that’s a question that would lead me to look inside, to how I felt. And I am an anxious person. Not because of my life circumstances, but because of how my brain chemicals are mixed. My fight-or-flight response threshold is ridiculously low.

And it isn’t anymore. I don’t have the off-the-chain startle reflex that I used to. I find it easier to read e-mails criticizing my work, even when they’re completely hateful, without my heartbeat going into overdrive. To my great surprise, even Milo has picked up on this. Before, if he was sitting on my lap at night while I watched TV or movies, he’d leap out of the chair and run to the window barking at the slightest noise. Now, he’s more likely to lift his head, growl, and settle back down immediately when I say “It’s just the wind, little guy.”

I’m amazed that the way a question was worded kept me from getting the help I needed for several years. I study this kind of thing: I know about cognitive biases, and the power of language and framing, and even a fair bit about temperament and brain chemistry. It’s a good lesson in staying humble and always, always, remembering to look at a situation from more than one perspective.

Going on Paxil really did a job on my dream life, too, in some fairly amusing ways. But I’ll save that for another post.

Metaphor du jour

June 7th, 2010

One of the reasons I love writing an advice column is that I do my best thinking not in a solitary state of meditation, but in response to other people. And here’s a non-advice-related example. A friend of mine e-mailed me a week or so ago to ask about the common distinction between left brain/right brain, and how much of that is actually based in science and how much is simply shorthand for analytical v. artsy-fartsy.

I gave him some basic 411 about the different brain hemispheres, and the corpus callosum, and handedness, and neural plasticity. Then I summed it up with this:

“I mean, sure, they’re different, but it’s like Manhattan v. Brooklyn. You can’t really imagine one without the other, and everything is constantly commuting between one and the other. Real estate’s a little cheaper in the right hemisphere, and the left hemisphere is more influential in the world of ideas and commerce, but fundamentally, it’s two halves of a whole.”

I gotta say, I’m fairly proud of that one. And it sure beats the last metaphor I came up with for the difference between Manhattan and Brooklyn.

April, fools

April 1st, 2010

How can you do an April Fools’ Day prank when every day, reality surpasses satire? “Vice President Dubs Health Care Reform ‘A Big Fucking Deal.’” “Oscar Winner for Heartwarming Film about Inter-Racial Friendship Dumped for Neo-Nazi Mistress.” “Thousands of Americans Refuse to Answer ‘Invasive’ Census Questions While Posting Drunken Pictures of Selves on Facebook.” “Rod Blagojevich to Be Contestant on ‘The Apprentice.’”

As they say, you couldn’t make this stuff up. So I’m sympathetic, overall, to people who fall for hoaxes or rumors at first. (No sympathy for those who run to e-mail everyone they know about it without first checking on snopes.com.)

But today, I thought I’d share my favorite with you, and this, I promise, is not made up. I’m not messing with you.

Back in 2000, The Onion — a satirical newspaper — published an article entitled “Harry Potter Books Spark Rise In Satanism Among Children.”

Shortly after, Reader’s Digest published an article about J.K. Rowling. Reader response was positive, except for one woman who wrote:

“I am shocked that Reader’s Digest would put someone like J.K. Rowling on the cover without more investigation about what she really believes. Harry Potter is doing much to further the evil in this world through spells and incantations. It saddens me that parents prefer to look the other way when something is ‘popular.’”

This is where it gets awesome, though. Because a few months later, this same woman — Laurie Rice of Athens, Georgia — wrote back to Reader’s Digest with this gem:

I was angered you did not print my entire comments on Harry Potter (“You Said It”, February) and left important points out. I made these comments because I read an article from theonion.com quoting J.K. Rowling. These concerns need to be publicized. She is an admitted Satan worshipper. There has been an increase in 14 million children into the church of Satan as a result of these books.

The editors responded:

We hope you’ll be relieved to learn theonion.com is actually the website for a satirical newspaper, with a readership of five million. The article you read was a spoof — unfortunately passed along as a fact by countless people. Even Christianity Today calls the Harry Potter series “a Book of Virtues with a pre-adolescent funny bone,” containing “wonderful examples of compassion, loyalty, courage, friendship, and self-sacrifice.” — Eds.

I hope you agree with me that the editors’ response was a perfect blend of snark and politesse. Because you know perfectly well that Ms. Rice would not be relieved to learn this. It’s not as though you or I thought that our laptops were being recalled, and then found out that in fact, they weren’t. Ms. Rice wanted to believe that Harry Potter is evil, and I’m sure she was very, very disappointed to have her “evidence” debunked.

What do you think the odds are that she found some brand new “evidence” right quick-like to support that which she wanted to believe anyway?

Happy April! Fool the day!

In ur consushness, watchin ur brane

January 29th, 2010

On our post about the 00s (see, when you’re writing, you don’t have to figure out what to call them), Stupendousness, who had one heck of a decade, writes this from her 27-year-old perspective:

I feel like a very different person from my 17-year-old self. I believe part of that is clearly due to the continuing maturation of my brain, which is just biological. The way my brain works has changed enough to affect my personality to an extent, and some of that has been involuntary, but I’ve also consciously changed many of my thought-patterns. Or tried. I am much less cynical these days, for example.

Yes. The brain develops a lot between the teenage years and 25 or so. Some of the mistakes we make in our teens and early 20s are the result of lack of experience — but some are due to the simple fact that your brain works about as well as the beater cars most of us were driving at that age. It’ll get you where you want to go most of the time, but it’s not always reliable. Your capacity for executive functioning kind of fades in and out for a while like an AM oldies station two towns over.

I was aware of that myself, in my teens and 20s. I didn’t let myself have a credit card until I was 25. But I sure didn’t know the biology of it, and I bet, Stupendousness, that knowing it made your experience rather different. How did you learn about brain development? It makes me wonder what it would be like if this kind of thing were taught more in schools. It seems (and please, if anyone knows different, tell me) that schools offer a lot of coaching (or at least nagging) about good study and health habits, deferral of gratification, career planning, and the like, without ever explaining to students why it’s going to be hard for them to learn these skills, and why sometimes they’ll find themselves doing exactly the thing they know they shouldn’t. I don’t think students would take this as an excuse to skive off (“Dude! Give me a break! My brain’s not finished yet!”). I think the more dutiful ones — and most young folks do want to be responsible — would give themselves a much-needed break from time to time. And I think it would make the process of learning boring, unfun things more interesting, because you’d know you were actually programming your brain.

(Please don’t misunderstand me — I’m not saying that once you’re 25, you’ll never make a regrettable impulse buy, impolitic comment at work, or pitcher of grain-alcohol punch again. We all do things we know we shouldn’t, take short-term pleasures over long-term gain. But certain kinds of judgment really are, biologically, more difficult to sustain before the mid-20s.)

Age 30 transition

January 27th, 2010

Sounds like I’m not the only person for whom the 00s have been a big decade! Thank you for sharing your stories with me.

As I’d mentioned, my understanding of adult development is heavily influenced by the work of Daniel Levinson — you can get the CliffsNotes version of his theory here. (All the language refers to men, and his original study was on the male lifespan; he did wind up writing a second book about women, but if there were any major differences, I would have remembered them, and I don’t. Studying adult development is hard because the specifics of everyone’s life differ, but the people who have done it successfully, like Levinson and Dan McAdams, focus on general themes. Maybe “becoming a grownup” to you means running your own business, or having a baby, or buying your first real car, or doing your first jail stint, but everyone wants to do something in their late teens/early 20s to prove their adulthood, for example.)

And it sounds like a bunch of you all are coming up on the Age 30 Transition, or have recently gone through it. This is a really helpful concept to understand, especially if you’re within five years of 30.

One of the major things I loved about Sassy Curmudgeon’s “Ten Years of Twenties” post is her acknowledgment of the dark side of the 20s:

When I was 22, a 28 year-old friend of mine sat me down and gave it to me straight. “The next four to five years are going to suck,” she said. “But then it gets awesome.” I smiled and nodded and truly believed that life would not suck for me, because I was starry-eyed and ambitious and different, and she was fucking old anyway, so what did she know? She was right, of course. Being 22 through 27 just kind of blows. It’s not a constant state of blowing, though—it’s like a fine wine; the blow ripens over time until you get a nice, full-bodied suck.

This is why the Age 30 Transition needs to happen. The media give one’s 20s great play as a time of dating, urban adventures, maximum good looks and minimum responsibility, but the fact is, that’s not how most of us experience it. For most of us, it’s a hard time: a time of piecing together the scraps of adult life from whatever’s nearest, all the while not fully knowing yourself well enough to know what you really need from and can contribute to a relationship, a career, a community. It’s a mad scramble for jobs that aren’t too demeaning, dates that aren’t too depressing, used furniture that looks more “shabby chic” than “trailer park panache,” and trying to find something affordable at H&M that can get you through a job interview.

As you near your 30s, you’ve got a little ground under your feet and can start to make some decisions. Maybe that job you took right after graduation because you had to have a job isn’t the right one for you, and law programs have been looking surprisingly tempting. Maybe that job you took is turning out to be a real career, after all, and you’re thinking about moving away from your home town to go work at headquarters. You start realizing what works for you and what doesn’t, and you’ve begun to develop the experience, financial resources, and general life savvy to get what you want. (Among my group of friends, we referred to this time as “Everyone who’s married gets divorced, everyone who’s single gets married, everyone in grad school drops out, everyone in the workforce goes to grad school.”)

So for those of you still doing the patchwork-quilting of the 20s, hang in there. And those of you starting to lift up your heads and say, “Hey, wait, why am I working at/dating/living in X when I’m really a totally Y kind of person?” — fasten your seatbelts. It may be a bumpy ride for a year or two … but it’s worth it.

The Oughts

January 25th, 2010

… is that what we’ve decided the last decade should be called? If so, the Oughts were, for me, the Dids. During the past ten years, I

- Met and married Mr. Improbable
- Got my PhD
- Converted to Judaism
- Taught college for two years
- Started writing the Miss Conduct column, and eventually two blogs
- Wrote my first book

… along with various other life-transition experiences, like starting to travel overseas and getting a dog.

That list isn’t meant as “ooh, haven’t I accomplished an impressive lot,” but as evidence of what a huge decade of transition the 00s were for me. According to psychologists who study adult development, we spend about half our adult life in periods of transition. Sometimes it can be hard to know when you’re in one of those phases — maybe you don’t realize you’re in transition until you’ve already made the change.

What are you doing when you’re not in transition? Building on what you’ve got. Which is how I’m feeling at the moment: all the major pieces in my life are in place. Now it’s up to me to do something with them, to start husbanding and growing my resources.

I don’t ever recall a calendar decade matching so closely with a personal turning point before (which is probably why I got such a kick out of that post by the blogger who was born in a year ending in zero). Have you? How were the Oughts for you?

New Year’s resolutions

January 15th, 2010

Now that we’re halfway into the month, let’s talk New Year’s resolutions! I asked you all about yours a while back, and never really followed up on that.

I’ve always found the NYE resolution to be an interesting beast. On the one hand, there is something that seems very natural about a season of excess followed by a period of restraint and sacrifice: it’s a pattern you see in too many cultures and religions to ignore. On the other hand, the way so many people do NYE resolutions seems set up to guarantee failure: black-and-white absolutes, with no room for the inevitable backsliding. By the second week in February, you’ve already missed your goal of getting to the gym four times a week, so you just quit entirely.

I was pondering what my own 2010 resolutions and goals should be, and then more or less got handed a new set by my doctors: quit drinking, and change my entire eating pattern. Which was a little more ambitious than anything I was planning to carve out for myself, I tell you what. Here’s what’s helped:

1. Not having a choice.
I’ve never been a fan of the classic AA notion that one must “hit bottom” (is that still a going concern in AA, or have they more or less dropped that idea?) before making a change. Still, there’s something to be said for having one’s doctor say, “Yes, there is a real problem, and you can and must stop this problem now.” (Funny — on the other blog we are discussing why people write in to advice columns, and one thing a number of folks mentioned, which hadn’t really occurred to me, was that the columnist provides not only a reality check but also a sort of kick in the butt, just as my doctor did for me: someone to say not only, “Yes, you’re right, there is a problem,” but also, “And you need to do something about it now.”)

2. Quick feedback. I think this is something that scuttles a lot of NYE resolutions — people simply don’t see results fast enough, so they get discouraged and quit. I was lucky, because I felt markedly better after only a few days of getting on the right meds and knocking off the booze and spice. But let’s face it, a lot of good habits actually make you feel worse when you start. Sure, going to the gym will give you more energy and a better mood … after a few weeks. Before that, it will make you tired and cranky. So if the behavioral change itself won’t give you immediate, positive feedback, figure out a way to implement some little reward system, so you’ll know you’re getting somewhere.

3. Taking positive action. It’s always easier to do something than to not do something. (As you read the rest of this post, do not think of a white bear. See?) I’ve decided to look at my new diet as a chance to explore new cooking techniques and ingredients, rather than as simply giving up X, Y, and Z. WES alluded to a similar idea:

I think I have stumbled on an epiphany for my new year’s resolutions. In the past those pesky resolutions were things I knew I **should** do even if I didn’t want to do them. However this year I am making my goals shorter and more in tune with what I want to do. And if I finish them before the year is up great, I might do new ones in July!

So rather than my resolution to go on a diet my resolution is to crochet more and learn a new technique. It is a calming activity, allows me to be creative, and while still a sedentary activity it has the added bonus of you really cannot eat/munch while crocheting. And snacking is a big weakness of mine so really it should be a win win.

4. Communication and support. The research on the extent to which social networks affect behavior is impressive and grows more every day. We need our friends to support the kinds of things we do, the kind of person we want to be. It’s been immensely good for me to be able to write about my health issues here, and feel that by doing so, I’ve opened up a forum for other people to share their own experiences. It’s also been good to have a couple of weeks of minimal socializing, so I can get my new habits well under control before having to attend a cocktail party. And Mr. Improbable and I have had a number of conversations about how his life (since I do the cooking) will and won’t change.

Some further thoughts on your comments …

TJ wrote, “I’m not big on New Year’s resolutions (those always seem a little overwhelming), but I (along with my family) make resolutions with a more limited time frame.” I like that; I like that a lot. Make goals for a month or so, not for the entire year. I wonder if that isn’t what people do anyway, really … there’s the New Year’s Eve goals, and then spring cleaning and getting in shape for summer, and then back-to-school season.

Anne with an E wrote, “I resolve to stop waiting until the time is right/we have the dough to throw a huge shindig before inviting people over. Pizza and game night for six is just as fun as a BBQ for thirty (with a lot less cleanup.)” YES! I figured this out about four or five years ago and it was quite a revelation. And with six or eight people, everyone can really get to know each other. (Note for Bostonians — Redbones BBQ delivers, and they are very good. They also have enough good sides that any vegetarians will be taken care of. Highly recommended for informal parties.)

Military Mom wrote:

My first resolution is to stop agreeing to do or help with activities without REALLY stopping to assess if I have time or want to do it. Up until now I’ve volunteered when other people need help and have almost always regretted it afterwards. My second is to try to lower my stress level. This will require the rest of my family to step up and help, but I think they are recognizing my stress is affecting my health…and therefore their lives too…

Good luck with those two, obviously related, resolutions. I’m sure it’s something many, many of us can relate to.

How about the rest of you? How are your resolutions working out?

Things to know in your 20s

January 5th, 2010

Blogger Sassy Curmudgeon has a nifty chronological advantage for a writer — born in a year ending in “0,” she can write a decade-in-review piece that is also a review of her own life. She’s done so, rather hilariously, in “Ten Years of Twenties,” which everyone who is or has been in their twenties should read:

Unless you have a particularly rough childhood, your twenties are your birth into the real world, by which I mean a world that doesn’t involve trading “points” for meals or having a third party pay for your cell phone. They are painful and joyful, exciting and despondent, infantile and terribly grown up-seeming, drunken and sobering.

Career changes

December 4th, 2009

A big part of what I study at that Harvard Business School job of mine is what happens when people switch jobs. My boss and I are interested in what makes people successful — and in what they think makes them successful — and looking at what happens when people move from one organization to another is a good way of studying that. We’ve got an article coming out in the January (or possibly March) issue of Harvard Business Review on the top five mistakes people make when changing jobs.

So I was very interested to see this item in the British Psychological Society’s blog on cultural differences in how people explain why they changed jobs. Workers from the U.S., three European countries, and China were interviewed. The most interesting finding:

Workers in the United States didn’t ever attribute a career transition to an external cause, such as conflict with a boss. Not once. Instead they tended to mention internal factors, such as their desire for a fresh challenge. By contrast, workers in China almost exclusively stressed the role played by external factors. Meanwhile, workers in the European nations were more of a mix, attributing their career transitions to both internal and external factors.

The researchers said a lot of the transitions reported by the participants, especially in the USA and Europe, were positive. Generally-speaking, people are known to be biased towards attributing positive events to themselves, and so it’s perhaps little wonder that many workers attributed all these positive career transitions to internal causes. “In addition,” the researchers said, “in many cultures ‘being in charge’ of one’s life is positively valued. **Conversely, reconstructing crucial career transitions as purely triggered by external circumstances does not convey a great amount of competence.**”

I bolded that because I think it’s absolutely huge. I’m dying to read Barbara Ehrenreich’s latest, Bright-Sided, about how America’s near-pathological obsession with optimism (and, hence, the belief that we can or should control our fate) has warped our culture, our economy, our medical system.

We so want to be in control. We so want the narrative of our lives to be about “choosing our choice.” It’s very hard for an American to say, “This bad thing happened. No, it wasn’t a blessing in disguise. No, it wasn’t God closing a door and opening a window. No, it wasn’t a ‘challenge.’ It was a bad thing, and it sucked, and now I have less and can do less than I did before, and if I’m going to make any meaning out of it, well, that’s going to take a long damn time and frankly, there’s other and better things I would have liked to do.”

We can’t say that. Even if we could admit it to ourselves, we can rarely say it. You sure as hell can’t say it in a job interview. You can’t say it to strangers at a cocktail party, either. You can only say it to your closest friends, and even that not too often. It’s like the way we can or cannot talk about chronic illnesses. (Ms. Ehrenreich, not surprisingly, wrote the book after a bout with breast cancer.)

In the “children” chapter of Mind Over Manners, I write about the so-called “mommy wars” between working and stay-at-home mothers. I read up a lot on the issue, and came away more or less convinced that except for a privileged few, neither set is really, truly, choosing; both are making the best of a limited set of options:

The repeated talk of “choice” makes women feel entirely responsible for the situations they find themselves in. Is a mother who works full-time really making a “choice” if she dare not even ask for a reduction in hours if her husband is self-employed and she provides the family’s health insurance? Is a stay-at-home mother really making a “choice” if the public schools are so bad that they must be supplemented or replaced by homeschooling, or if child care would cost more than she can earn? If we label the decision to stay home versus to go off to work as a “choice,” it allows us, as a society, to maintain that any negative consequences are a problem for the individual to solve, and don’t require reform of our laws or workplace cultures. I’m not here to offer policy recommendations—only the politeness recommendation that both working and stay-at-home mothers recognize that the other side, like they themselves, are making decisions under severely difficult circumstances.

There’s a career transition in my past that I “brightside” a lot, too. I talk about how fun the job was, but how exhausting, and frame my leaving as a simple “when the contract was up,” and then I start talking about all the cool things I’m doing now. I don’t talk about the fact that “the contract was up” really means “they didn’t offer me the permanent job.” I don’t talk about how I wasn’t even given the courtesy of an interview for that permanent job. I don’t talk about how, on my last day of work, I went home early and cried until nightfall.

Because the contract was up! It was over! I was free to pursue other dreams, and now I’m Miss Conduct! I’m not a loser, I’m a winner!

Sometimes, it is very, very tiring to be an American. Always to be a survivor, never a victim. Always to craft that winning story. Always to feel in control, when science and religion and art and philosophy since time immemorial have converged on the simple fact that we are not.

Or to put it another way, because the language of LOL is true and good:

[Image: funny dog picture, “lemmunaid suk”]

I know, I know

November 27th, 2009

Chances are, if you’re spending this holiday with friends and/or family, you’ll turn to someone with a story you’re sure they’ll love only to find, embarrassingly, that they did love it — the first two times you told it to them.

Memory can break down in a lot of ways (and here’s a good book on it, if you’re interested). Two ways are to forget the source of information — where we learned something — and to forget the destination of information — who we told it to.

Source memory breakdown is common, and generally not a very bad thing unless 1) you’re a terrible critical thinker and an undiscriminating reader, in which case you’re going to wind up believing a lot of half-baked things*, or 2) you’re a writer and you’re trying to look something up. (I’ve been meaning to do a post for months now on something Alfred Adler used to ask his patients, except I can’t find the original source anywhere.) Socially, source-memory breakdown can be embarrassing if you’re telling someone a joke and, when you get to the punchline, they fill it in for you and say, “I told you that joke last Thanksgiving.” People tend to think that kind of faux pas is funny, though.

It’s a bit more annoying to be told the same story over and over, which is what happens when a person has a breakdown of destination memory. This is as common as source-memory breakdown, yet much less studied. In my wholly non-scientific appraisal, marriage is a great contributor to destination-memory breakdown, as the question “Did I tell you X, or did I only think it?” implies.

Some researchers — and this is what I’ve been getting to, if anyone’s curious — have started looking into destination-memory breakdown. Their initial finding suggests that part of the reason we forget who we told things to is that we are too focused on ourselves as the teller, and not enough on the listener. As a research psychologist, I have some methodological quibbles with their experiment, but as an etiquette columnist, I have to say, that’s a terrific lesson learned.

No matter where you learned it.

*Of course, if you’re not a critical thinker, good source memory can only help you so far. On my bulletin board, I have a letter from a woman to the editors of Reader’s Digest, which had published a laudatory article on J.K. Rowling. The letter-writer was angry that RD did not point out that Ms. Rowling is an acknowledged Satan worshipper, who has led some 14 million children to the Church of Satan through her best-selling series. The source of this information, the letter-writer proudly reported, was The Onion. (And the editors’ reply, I must say, was a freaking masterpiece of tact.) Now quick, go tell a friend that story while you still remember that you read it on Robin Abrahams’s blog.

Just who do I think I am?

October 13th, 2009

The answer to this question is not the title of the blog post below!

Rather, it’s the title of a talk I’ll be giving at a Boston University alumni luncheon this Friday. Shortly before I’d been asked to give the talk, I’d given another one in New Hampshire, at which someone asked me — as someone often does — what “makes [me] an expert”? As I mentioned to my Facebook friends afterward, I’m always tempted to answer “What makes you an expert?” with “The fact that people ask me questions,” but I fear that would sound sarcastic. I don’t mean it that way, though — I’m enough of a social constructionist to think there’s a good amount of truth in that reply.

It’s one thing to define expertise in a field that has boundaries. We know what karate is, so defining what it means to be an expert in karate — or 17th-century French drama, computer programming, veterinary medicine, architecture, even etiquette — is possible, if not necessarily simple. Of course, in all of these fields, levels of expertise can differ. More trickily, “expert” can mean either “having expert knowledge of,” as a critic would, or “being an expert practitioner of.”

I wrote about this question a couple of years ago on the other blog (back when that was a mishmash of whatever was preoccupying me, and wasn’t a straightforward question & response format as it is today). Check out the ideas I explored there. As applicable to the whole “Miss Conduct” venture, I wrote:

But if I am an expert, as Miss Conduct, what exactly am I an expert on? I’ve been writing the column nearly three years now and it moved beyond classic etiquette a long time ago. Social behavior, I suppose you could call it, but what isn’t? And in what way am I an expert? Whatever my column is about, there’s no degree in it, no professional organization, no standardized test to pass. I bring scholarly and work experience in theater, psychology, storytelling, comedy, project management, human resources, philosophy, and religion. And life experiences too odd and idiosyncratic to explain. But someone else could do just as good (well, almost as good …) a job as I do with a completely different skill set and experiences.

These are the ideas I want to explore Friday afternoon, I think, at the institution that granted me a Ph.D., which is where many people think my “expertise” comes from. I’m not so sure I agree with that. It’s an ingredient, sure, but if chili were only beef, we’d call it “steak.”

What am I an expert on?

What does it mean to be an “expert,” anyway?

What does it mean to be an authority? Does that differ from being an expert?

I would love to get your thoughts on this! And I will respond in comments, and share any insights that emerge from Friday’s lunch as well.

Are you a difficult person? (because it’s not like anyone would tell you if you are)

October 7th, 2009

My September 27 column, in which I answered a question from a woman whose mother makes her “feel as though I am always walking on eggshells around her, waiting for her to explode at something that I say,” got a lot of response from readers. If you have someone like this in your life, please know two things:

1) You can get help, and

2) You are not alone.

One person sent in a link to a very thought-provoking quiz to determine — are you one of those people around whom others walk on eggshells? (Goodness, that was syntactically awkward, but you know what I mean.)

She wrote:

I am not kidding when I say my mother could answer “yes” to 2/3 of the questions. It’s a nice impartial way to see if you are over-reacting or if the person in question is just having a bad day. But that’s not its only purpose. Shouldn’t we all determine whether we are being difficult without realizing how others feel about interacting with us? Most of these behaviors are rooted in bad manners. Treating others with dignity, respect, and kindness, and not embarrassing them, is the key to any relationship, whether you know them intimately or not.

Thank you for sharing! And do check out the quiz.

Sam Raimi and dream logic

October 5th, 2009

Mr. Improbable and I finally got around to seeing “Drag Me to Hell,” inappropriately enough the night after Rosh Hashanah, but hey, that’s the only time the Brattle was showing it. I am a huge fan of Sam Raimi’s “Evil Dead” series, especially “Evil Dead II.” “Drag Me to Hell” had been heralded as a return on Mr. Raimi’s part to his ultraviolent, comedic, schlock-horror roots, so of course I had to go. (Don’t you bash on my lowbrow tastes — Roger Ebert liked it.)

“Hell” is the story of Christine, a young loan officer (you can tell they really, really wanted Jenna Fischer from “The Office” for this role) who denies an ancient Gypsy woman, Sylvia Ganush, a mortgage extension, and is subsequently cursed. The curse is delivered after an extended fight scene between Christine and Mrs. Ganush in a parking garage. As I’ve mentioned, I’m a self-defense graduate, and I had to appreciate how Christine immediately went into action, fighting with total vigor, commitment, and ingenuity. If I’m ever attacked, I hope my training kicks in like that.

Which it might not, instantly, because in most situations, there’s a moment of disbelief. Most people without serious training — and I don’t mean the kind I got, I mean the kind soldiers get — have a moment of shock when another person aggresses, whether the form of that aggression is a racist joke, a subway grope, or a mugging. Then you sort of “come to” and start fighting, or running, or arguing, or (in the case of the subway grope) grabbing the guy’s hand, holding it up, and saying loudly “Who does this hand belong to? I found it on my butt.” In my life, I’ve only known one person who could go instantly from Suzy Creamcheese to wailing ninja banshee if she had to. Most of us get stopped in our tracks, at least for a crucial few seconds.

Especially — and this is where the “dream logic” part of the headline comes in — if you’re being attacked by a one-eyed Gypsy woman who looks about 110 years old. This is one of the things I love about Sam Raimi’s movies: his characters never pause to think about the sheer improbability of the situations they are in. They just cowboy up and do what needs to be done. Fine, my hand is possessed and trying to beat me to death. Chainsaw time! (The particularly awesome thing is that Ash is willing to accept not only that his hand is trying to kill him, but that it is laughing at him.)

This is how dream logic works. When I was a professor at Emmanuel College, I got to do some work with the distinguished, and very wonderful, J. Allan Hobson on dreams. One of his theories/discoveries is that when we are dreaming, we solve problems pretty much the same way we do when we are awake, with one exception: we don’t question the bizarre. In short, if I were dreaming that my hand were smashing dishes on my head while giggling hysterically, I wouldn’t say, “Hey, wait a minute, it’s not physiologically possible for a hand to manipulate itself in contradiction to the desires of my brain, nor, for that matter, does it have a mouth.” I would, instead, accept the situation as a given and use whatever problem-solving mechanisms come to me most naturally in everyday life.

Have you ever had this experience in a dream? I know I have, although I can’t always remember the details. I do remember a recent dream in which I offered someone Sudafed. I am never without Sudafed and aspirin in my purse, and will offer them at the drop of a hat to anyone who appears under the weather, so I thought it was really funny that I carry ’em with me into dreamland as well. Yes, even in the depths of my unconscious mind, I am still a hypochondriac yenta. Good to know.

Now, here’s the cool thing: if you can train your mind to recognize the bizarreness of a dream scenario without letting that recognition wake you up, you can take control and dream lucidly. I’ve managed this once or twice, and let me tell you, lucid dreaming is fun. You can fly, or do anything at all!

Have you ever lucid-dreamed? Have you ever solved a problem in a dream in exactly the same way you would have in real life? Have you ever been cursed by an ancient Gypsy woman? Discuss.