Saturday, December 6, 2008

What is Reality, Part II: Virtual Reality as “Real Life”?

Most of us would probably agree that task-oriented role-playing computer games, such as World of Warcraft and Everquest, aren’t real life. In these games, you play as a “character,” acquire objects, and go on quests. (These games are of course a real-life activity, but the “experiences” of your character aren’t real in the sense that what happens to your character doesn’t happen to you: Your character may go on a quest, but you don’t really feel that you did too.)


The line gets a little fuzzier with games that are less task-focused and involve activities that resemble real-life activities—computer role-playing games like The Sims and Second Life. In these games, users create avatars—their representations in the game—who then proceed to do things that resemble real life (with the exception of flying without a plane, and a few others). Avatars walk around, watch a sunset, travel, meet and interact with other avatars/people, fall in love, have sex, get into fights. In these games, an avatar is, in some sense, the alter ego of the user. Users spend significant amounts of time (and money) perfecting the appearance and style of their avatar. If you’ve never played these types of games, see below for a sample.




People who spend a lot of time in these virtual worlds (is "playing these games" the right phrase?) see their avatars as extensions of themselves, and players report the sense that what happens to their avatars happens to them. (Here's an example of firefighter training in Second Life; each person you see on the screen is controlled by an actual human at a keyboard, directing the avatar's actions.)




Research at Stanford’s Virtual Human Interaction Lab has found that among 30,000 virtual gamers who spend upwards of 20 hours/week in their virtual worlds, 40% of the males and 53% of the females report that their virtual friends are equal to or better than their real-life friends. And, as discussed in my previous post, it’s not only our minds that respond to virtual worlds as if they’re real—our bodies do too. If you’re at the edge of a virtual cliff, your heart rate will increase just as it would at the edge of a real cliff.

In an effort to provide us all with the next step in entertainment, it’s only a matter of time until companies create cost-effective technologies that make virtual worlds more authentic and interactive.

Nintendo’s Wii is a step in that direction. For those of you not familiar with the Wii, check out this ad:




(If you’re not familiar with the Wii: when people move the white remote, they are controlling what happens on the screen—playing a traditional “computer game,” virtually conducting an orchestra, or virtually playing tennis or baseball.) If you ask people using the Wii whether their experience is “real,” you might hear more ayes than nays. After all, if their bodies and minds feel that they’ve had the experience, does it matter whether the visual portion of the experience occurred on a screen or monitor?

Once Wii-type devices become even more technologically enhanced into true immersive virtual reality, the question “what is reality?” seems to me to cross a threshold into an existential realm (or a psychophysical realm), akin to the question “if a tree falls in the woods and no one hears it, did it make a sound?”

Friday, November 7, 2008

What Is Reality, Part I: Virtual Reality as Part of the Solution




Phobias are excessive fears that interfere with life. People with phobias typically try to avoid the feared stimulus because of the uncomfortable physical sensations that arise—often those of a panic attack: rapid heart rate, irregular breathing or the sensation of difficulty breathing, sweating, a sense of dread. If you avoid flying because of a fear of flying, it can limit your professional life and vacation activities. If you have a spider phobia, you may refuse to enter basements or wooded areas.


There's a very effective treatment for phobias: exposure. Exposure involves allowing yourself to confront the feared stimulus in a planned way for a sustained period of time (about 20-30 minutes) and noticing that nothing catastrophic happens. When you sustain attention to the feared stimulus, the uncomfortable physical sensations dissipate, and you confront head-on any irrational beliefs you may have about the feared stimulus. For instance, when someone with a spider phobia is exposed to a spider, he or she learns that, in fact, nothing awful happens beyond the physical discomfort of the anxiety symptoms, which pass with time. Ditto for fear of flying—beyond the physical discomfort of the anxiety, nothing catastrophic happens because of flying per se. (Of course, any method of transportation has risks: people get hit walking down the street, riding their bikes, in their cars.)


In learning that no harm befalls them other than their own physical anxiety, people can rethink their fears: "Hm…so I didn't go crazy from being around the spider, and it didn't bite me and turn me into some type of freakish creature…maybe I don't need to be so afraid of it" or "Hm…maybe my fear and worry weren't actually keeping the plane in the air…maybe I can relax a bit and not view the plane ride as a death sentence." With repeated exposure to the feared stimulus—more time around spiders, more plane flights—the anxiety and physical reactions lessen even more.


There are several different ways exposure can be undertaken:

  • imaginal exposure—using your imagination to form mental images of the feared stimulus
  • in vivo exposure—being exposed to the actual feared stimulus. A percentage of people seeking treatment for phobias hesitate to use this method because they feel it will be "too much" for them.
  • virtual exposure—being exposed to a "virtual" version of the feared stimulus, ranging from virtual reality (on the high-tech end) to photographs or recordings of sounds of the feared stimulus (on the low-tech end). For more about virtual exposure, click here.

Here's an example of virtual exposure to an airplane.


Click here for a video about virtual reality therapy.


Recent research suggests that as a treatment for some phobias, such as fear of flying and fear of heights, virtual exposure is as good as in vivo exposure. Even though the virtual world is a bit cartoonish, users are able to confront the feared stimulus and learn that their almost automatic fear reaction is out of proportion to the situation. People were able to allow the virtual reality to simulate reality to the point where there was no difference in their responses to the two types of reality.


To me, the fact that people respond to exposure with virtual reality as well as if it were "real" is remarkable. In the next blog posting, I'll talk more about virtual reality as it relates to gaming.

Wednesday, October 1, 2008

Civics and Citizenship

Continuing on the theme of civics, as of today, the United States Citizenship and Immigration Services is offering a revised naturalization test for people seeking citizenship.

Redesign Process
The major aim of the redesign process is to ensure that naturalization applicants have uniform, consistent testing experiences nationwide, and that the civics test can effectively assess whether applicants have a meaningful understanding of U.S. government and history. Following a basic U.S. history and civics curriculum, the redesigned test will serve as an important instrument to encourage civic learning and patriotism among prospective citizens.


You can see all 100 history and civics questions (and answers) here. In addition to the test's other components (related to the ability to write, speak, and read English), applicants must orally answer up to 10 questions randomly chosen from the pool of 100 questions. It would be interesting to know what percentage of US-born adults would be able to pass the naturalization test.
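Out of curiosity, here's a minimal sketch of an applicant's odds, assuming (as I understand the test's rules) that passing requires correctly answering 6 of the 10 questions drawn from the pool of 100. If you have solidly memorized the answers to some number of the 100 questions, the number of known questions you're asked follows a hypergeometric distribution:

    from scipy.stats import hypergeom

    POOL, ASKED, NEEDED = 100, 10, 6  # pool size; questions asked; correct answers needed (my assumption)

    for known in (40, 60, 80, 95):
        # hypergeom(M, n, N): population of M questions, n of which you know, N drawn at random
        p_pass = hypergeom(POOL, known, ASKED).sf(NEEDED - 1)  # P(at least 6 known questions drawn)
        print(f"know {known}/100 answers -> P(pass) = {p_pass:.2f}")

By this arithmetic, knowing 60 of the 100 answers still leaves a substantial chance of failing, which rather sharpens the question about US-born adults.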

Here are some of my favorites; bulleted points are acceptable answers:

What does the Constitution do?
▪ sets up the government
▪ defines the government
▪ protects basic rights of Americans

What did the Declaration of Independence do?

▪ announced our independence (from Great Britain)
▪ declared our independence (from Great Britain)
▪ said that the United States is free (from Great Britain)

What stops one branch of government from becoming too powerful?

▪ checks and balances
▪ separation of powers

What is one responsibility that is only for United States citizens?
 
▪ serve on a jury
▪ vote in a federal election

Name one right only for United States citizens.

▪ vote in a federal election
▪ run for federal office

What are two rights of everyone living in the United States?
▪ freedom of expression
▪ freedom of speech
▪ freedom of assembly
▪ freedom to petition the government
▪ freedom of worship
▪ the right to bear arms

Friday, September 19, 2008

Civic Education and Licensing Journalists



With the election coming up and the mainstream media focusing on trivialities and variants of "gotcha," my recurrent wish for a mandated national civics curriculum reared its head. Consider this: In 1998, 4th, 8th, and 12th graders around the country took a civics test at the behest of the U.S. Department of Education.

Students performed abysmally: A third of the high school seniors—who were or would soon be of voting age—didn't understand the basics of the American government. How many high school seniors demonstrated that they were proficient or better? Only about a quarter. And only 9% of students could give two reasons why it is important for citizens to be involved in a democratic society.

Here's what the test was designed to assess:
Students [should] show broad knowledge of the American constitutional system and of the workings of our civil society. They [should] demonstrate a range of intellectual skills—identifying and describing important information, explaining and analyzing it, and evaluating information and defending positions with appropriate evidence and careful reasoning. (From the National Assessment Governing Board, Civics Framework for the 1998 National Assessment of Educational Progress. U.S. Department of Education, 1998.)

Okay, live and learn. Fast forward eight years to 2006, when 4th, 8th, and 12th graders were again administered the civics test. How'd they do? As you can see below, the fourth graders did a bit better, but there was no difference between this cohort of 8th and 12th graders and those from eight years earlier.

Percentage of students at each achievement level on the NAEP Civics assessment, 1998 vs. 2006 (* significantly different from 2006):

Grade 4:  at or above Basic: 69* → 73;  at or above Proficient: 23 → 24;  at Advanced: 2 → 1
Grade 8:  at or above Basic: 70 → 70;  at or above Proficient: 22 → 22;  at Advanced: 2 → 2
Grade 12: at or above Basic: 65 → 66;  at or above Proficient: 26 → 27;  at Advanced: 4 → 5
[Source: http://nces.ed.gov/nationsreportcard/pubs/main2006/2007476.asp]

Why might this be? Shouldn't all high school students graduate with a firm knowledge of civics? Isn't there a common core set of values all Americans should share? It is important for high school students to enter adulthood (and voting age) with a firm sense of the rights and responsibilities of citizenship. It is in the country's interest for its citizens to look beyond 60-second sound bites or their "gut feelings" when deciding how to cast their vote on a referendum or for a candidate.

Here's what Center for Civic Education says about why civic education is important:

A free society must rely on the knowledge, skills, and virtue of its citizens and those they elect to public office. Civic education is the primary way our citizens acquire the knowledge and skills necessary for informed and engaged citizenship. While many institutions such as the family, the church and social organizations help forge a person’s civic character and propensity to participate, civic education in the schools is the one common experience American citizens share that helps them acquire and learn to use the skills, knowledge and attitudes that prepare them to be competent and responsible citizens throughout their lives. This is the historic civic mission of schools. A mission considered so important by those who established a free universal system of public education in the United States that they termed civic education as one of the central purposes of education. Unfortunately, as the indicators of civic engagement in our nation are dropping so too is the amount of time and attention devoted to civic education in our schools. 
And consider this:
The Policy Research Project on Civic Education Policies and Practices found that, on average, civics content in states' social studies standards overemphasizes the lower-order thinking of identifying and describing positions, stating that "civic statements requiring students to evaluate, take, and defend positions—the highest-order level of thinking—are the least prevalent in most state standards." (Policy Research Project on Civic Education Policies and Practices. The Civic Education of American Youth: From State Policies to School District Practices. Lyndon B. Johnson School of Public Affairs, Policy Research Project Report no. 133, 1999.)

Idea #1: Bring back mandatory national civic education for students. We could even have a national test on civics (not American History per se, but on civics—rights and responsibilities of being an American citizen). We could even have a national debate about whether students must pass such a test in order to graduate high school. (There is an organization, Center for Civic Education, which is devoted to promoting civic education in the schools). 

Idea #2: Consider making voting compulsory. Actually, the voting itself wouldn't be compulsory—just showing up at the polling place. This is how Australia does it (to see Eric Weiner's article about the Australian system, click here).  If people had to go into the voting booth, they might just be motivated to exert the mental effort to be informed about the issues.

Even if motivated, though, the skills necessary for true citizenship—"evaluating information and defending positions with appropriate evidence and careful reasoning"—are not skills that are much on display in public discussions of politics and governance. Rather, the "political discourse" consists of sound bites, simplistic arguments designed to tug at emotions rather than reason, and to smear an opposing candidate's position or character (often falsely or by stretching the truth significantly). Is the mainstream media helping further a civics education for Americans? No.

I'd like to make a proposal (idea #3, below). First, let me explain about professional ethics and licensing for psychotherapists, which I will then use as an analogy; this may seem like a digression, but it isn't.

There are many different types of psychotherapists. In fact, anyone can call him- or herself a psychotherapist and hang out a shingle (so to speak). What differentiates the generic "psychotherapist" from a specialized one? The simple answer is training, and then a license. It is illegal for people who evaluate or treat patients or clients to say they are psychologists unless they have a state license—otherwise it is "practicing without a license."

To obtain a license, a psychologist must demonstrate having received the appropriate training and experience (as determined by the state licensing board), and then take a national test plus a test on state legal and ethical matters. If the test score passes the cutoff, the psychologist is given a license. To maintain the license, the psychologist must continue learning—must obtain a specific number of continuing education credits within a specific number of years (the particular numbers vary from state to state) for each license renewal.

If a psychologist is found to have violated the state laws or ethical code that relate to professional conduct, the license can be taken away. (A license is necessary to receive health insurance payment for services, and required for certain jobs.) The licensing process is set up to protect citizens from fraud and from unethical or illegal behavior by the psychologist in his or her professional capacity.

Idea #3: License journalists. Just as psychologists are licensed, let's license journalists. For this discussion, I'm going to create a distinction between a correspondent (analogous to the generic psychotherapist—anyone can call themselves a correspondent) and a journalist (analogous to a psychologist—whose title conveys a deeper responsibility for analysis, commentary, accuracy, and integrity). The journalist is held to a higher standard and requires more training (and should, in theory, receive more compensation). If a journalist has engaged in irresponsible reporting or analysis—according to the decision of the licensing board—his or her license could be taken away, and if so, that individual cannot practice as a journalist. The fact that the licensing credential has been stripped becomes public knowledge. (We can use words other than correspondent and journalist—I'm using those terms to illustrate the concept of two different labels for people who convey news to the public.) When we see, hear, or read news through any media, we would thus be able to know whether or not the source was held to a high standard of accuracy and integrity.

What about "press passes"? Any organization giving out press passes can determine how many passes to give to correspondents versus journalists. What about freedom of the press? I'm not saying correspondents or journalists should be muzzled. I'm advocating that, as with psychologists, the public should be protected from fraud and a misuse of the professional positions of people in the press. Anyone can say or claim anything they want, but journalists—in their professional capacity—should behave with a sense of integrity and ethics.

Here's how we might develop such a system.

Step 1: Have the profession develop a national journalistic code of ethics (it has to be specific enough that in most cases it's relatively clear when someone has crossed the line). As with psychology, the goal is to ensure that people who obtain a journalist license understand their rights and their legal and ethical obligations; having such a code allows the public to understand those obligations and rights as well.

Step 2: Set minimum training and work experience standards for someone to be eligible to become a journalist.

Step 3: Once someone meets the minimum standards outlined in Step 2, he or she can take a test to demonstrate knowledge of the profession's rights and responsibilities (these will be based on Step 1). The cutoff score for passing is determined by the licensing board.

Step 4: Once licensed, the journalist must renew the license periodically, demonstrating continued education by earning continuing education credits (determined by the licensing board). Members of the public can bring complaints against a journalist for violating the ethical code; if the licensing board determines after a hearing that the charges have merit, the journalist's license may be revoked.

I'm not a journalist (clearly) and so this proposal may be unworkable or naive. But with the proliferation of "news" sources on the internet, in print, and on television, it seems to me that there must be some accountability and responsibility.

Monday, September 1, 2008

Clarifications About The Dark Knight



In response to a comment about an earlier blog entry, I noted the cumulative trauma that might explain why Harvey Dent went over the edge to become Two-Face:
his parents and sibling killed; the love of his life killed, after she'd agreed to marry him; facial disfigurement; and extreme pain from the fire.
Since that post, I've read the "novelization" of the movie, written by former Batman writer/editor Denny O'Neil. (Click here, The Dark Knight, for a link to Amazon's webpage for the book.)

The novelization provided interesting backstories for the events in the movie, and helped clarify a couple of the movie's more confusing elements. The novelization makes it clear that:
  1. The two murdered individuals with "Harvey" and "Dent" identification tags had no family relationship to Harvey Dent;
  2. The Joker intended to misdirect Batman when he revealed the locations where Rachel Dawes and Harvey Dent were being held. He tells Batman that Rachel is at one address and Dent is at the other. When Gordon then asks Batman which one he is going to, Batman replies, "Dent knew the risks," and he rushes off to the location that the Joker said housed Rachel.
This line was deleted from the movie, which is too bad, because it makes the subsequent sequence much clearer. And once Batman found Dent rather than Rachel at the location, the novelization explains (page 231):
"He had expected to see Rachel, to free her and get her out of the building. To tell her he loved her and would keep her safe forever. Instead, he saw Harvey Dent lying in a black puddle, bound to a chair. Next to him were two barrels, one on its side, and a timer. Shock and horror flooded over Batman…Rachel, thought Batman. Good God, I've failed her…"

Saturday, August 16, 2008

Dr Robin on BlogTalkRadio.com

Today on the BlogTalkRadio show Pop Culture America, I discussed with hosts John and Dave possible reasons for the success of The Dark Knight and chatted about the psychology of superheroes generally. Click on the link above to hear the show—my segment begins 30 minutes in.

Saturday, August 2, 2008

A Cohort Effect to The Dark Knight Experience?

I just returned from my second viewing of The Dark Knight, which I saw in an IMAX theatre. Let's get the IMAX part out of the way—Wow! The aerial scenes were breathtaking, literally. As a viewer, I felt that I was in the helicopter that housed the camera, flying above the city. And the close-ups of the Joker were terrifying—it felt as if he had me in a stranglehold and was forcing me to look at him, as he did with Rachel Dawes.

I quickly habituated to the IMAX features of the movie (except for the aerial shots) and focused on the movie itself—plot, story, visual aspects, message. During my first viewing, I must confess that during the last third of the film, I was expending a lot of energy trying to keep up with the fast-paced twists and turns of the plot, and the cuts between different characters and storylines. The first time, I didn't quite catch some of the quietly spoken words, and didn't understand some of the quick visuals. I was still processing the implications of one scene—in terms of the plot, the character development, the emotional consequences—when I was ripped from that scene to the next complex scene.

There were times when I wished that I could press a pause button so that I could fully digest a scene before moving on to the next one. It was frustrating. I felt as if I was on a treadmill that would sometimes shift into a faster pace than I was prepared for, and it was all I could do to keep up and not fall off. In the last third of the film, rather than feeling energized, the pace left me fatigued. The younger people that I know who saw the movie did not feel this way, and they seemed to understand most everything that happened. And they didn't feel tired during the film, only afterward, when the adrenaline rush stopped.

In my second go-round with The Dark Knight, I could appreciate the finer details of the story that I missed the first time (such as the extent of Dent's grudge against Jim Gordon, and Dent's use of his coin before versus after his kidnapping).

After this second viewing, my adult companion who had not seen the film before commented that he felt overloaded by the last third of the film. As with my first viewing, his cognitive and emotional energy to handle the film diminished with its length and the horror the Joker wrought.

Why the different responses between older viewers (I'm guessing the dividing line is at around 40+ years old) and younger viewers? You could chalk it up to an age difference—that younger people, by virtue of their age, are better able to track fast-paced events, and so don't become particularly tired by them. I don't think that's the answer, though. Rather, I think it's because people who are in their 20s and younger have been exposed to things that my generation wasn't exposed to as children: computer and console games, extreme multitasking with instant messaging and internet surfing, and fast-paced television shows and movies. That exposure has influenced how this younger generation processes and responds to information. Had my generation been exposed to these technologies in our formative years, I think we wouldn't be fatigued by the end of The Dark Knight either.



Psychologists call this a cohort effect: The impact of a common event or experience on a group of people, compared to those who do not share the event or experience. My hypothesis is that there is a cohort effect in how tired people feel after watching The Dark Knight. Younger viewers, by virtue of their technological and media experiences during their formative years, experience the film differently than do older viewers.

There's an easy way to test my hypothesis about the cohort effect: Assess the reactions of two types of The Dark Knight viewers, comparing young viewers who grew up with significant amounts of computer, gaming, and media exposure with those young people who did not. I welcome any comments you have about my hypothesis!

[On a related note, younger people in general are "smarter" than their elders: IQ scores have been increasing by, on average, 3 IQ points every 10 years. This finding is called the Flynn effect, named after the social scientist James Flynn, whose research revealed the increase. There are various explanations for the Flynn effect, including increasingly better health and nutrition in children, increased familiarity with IQ-type questions, and an increasing complexity in the environments in which children are growing up. These are discussed in his most recent book, What Is Intelligence?: Beyond the Flynn Effect. To read Malcolm Gladwell's article about the Flynn effect, click here.]
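To make the 3-points-per-decade figure concrete, here's a back-of-the-envelope sketch; the rate comes from the paragraph above, and the 30-year scenario is just an illustration:

    RATE_PER_DECADE = 3.0  # average IQ-point gain per decade (the figure cited above)

    def score_on_older_norms(current_score, years_since_norming):
        # Roughly what someone scoring current_score against today's norms
        # would score against norms set years_since_norming years ago.
        return current_score + RATE_PER_DECADE * (years_since_norming / 10.0)

    print(score_on_older_norms(100, 30))  # an average scorer today: about 109 on 30-year-old norms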

Monday, July 28, 2008

Can the Joker Be a Psychopath and Have Antisocial Personality Disorder?


In a brief review in Creative Loafing, I discussed whether the Joker was antisocial—whether he had antisocial personality disorder. The answer was "yes." In a previous blog entry, I discussed whether he was a psychopath. The answer was "yes." Are the two terms the same? No. What's the difference between the two, and can he be both a psychopath and antisocial?

Antisocial personality disorder is a psychiatric diagnosis in the Diagnostic and Statistical Manual (DSM, fourth edition). The DSM includes 10 different personality disorders, each of which involves a different set of maladaptive behavior patterns. The criteria for antisocial personality disorder require a pattern of at least three types of criminal behaviors (e.g., repeatedly breaking the law, even in small ways, such as disturbing the peace) or somewhat related behaviors, such as being impulsive, irresponsible, aggressive, or lying. To see the complete set of diagnostic criteria, click here. Note that when I said that the Joker had antisocial personality disorder in my 300-word review (!), I was addressing criteria A1-A7 but not criteria B-D, as Jared's comment, below, rightly points out (thanks, Jared!).

People can engage in these behaviors for any number of reasons: hanging out with the "wrong crowd"; going through a period of being angry at "society" and not caring about the future; being temperamentally driven to seek out stimulating activities, some of which are illegal; or finding the suffering of others to be amusing. The diagnostic criteria are limited to the behaviors themselves, not the underlying reasons for the behaviors.

In studies, about 50-80% of American male prisoners meet the criteria for antisocial personality disorder. (I'm limiting this discussion to males because males are three times more likely than females to have this disorder, and there is much more research on males with the disorder than on females.)

In contrast, psychopathy (the "disorder" of psychopaths, although it is not listed in the DSM; the term sociopath is used interchangeably with psychopath) focuses less on behavior than on underlying traits that contribute to antisocial behavior patterns. These traits and behaviors can be categorized into two broad factors; in brackets I note whether each characteristic seems true of the Joker, tallied in the sketch after the lists:

Interpersonal/emotional, characterized by:
• Superficial charm [true]
• A grandiose sense of self-worth [no, because his sense of what he can do—what he's worth—seems accurate]
• Pathological lying [true]
• Tendency to manipulate others [true]
• Doesn’t feel guilt or remorse [true]
• Shallow feelings [hard to say for sure]
• Lack of empathy [true]
• Doesn't accept responsibility for his or her actions [true—although he "claims" responsibility, he seeks to evade any negative repercussions of his actions]


Social deviance, characterized by:
• Getting easily bored and needing frequent stimulation [Hard to say—he was able to plan and carry out capers and murders that would be difficult for someone who got bored easily. However, his escalating crimes suggest that he does "need" increasingly outrageous crimes]
• No realistic long-term goals [no; his long-term goal was to get the Batman, and he planned out a series of crimes in order to do so]
• Impulsive behavior [no]
• Having difficulty controlling behavior [doesn’t seem to be the case]
• Irresponsibility [true]
• Behavioral problems that arose at an early age, possibly with juvenile delinquency [unknown at this time]
• Engaging in different types of criminal behavior [true]
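Here's a toy tally of those bracketed judgments. To be clear, this is just my own encoding of the calls made above (True/False, with the "hard to say" and "unknown" items coded as None), not an actual PCL-R scoring:

    # Encoding the bracketed judgments from the two lists above.
    joker_traits = {
        # Interpersonal/emotional factor
        "superficial charm": True,
        "grandiose sense of self-worth": False,
        "pathological lying": True,
        "manipulates others": True,
        "no guilt or remorse": True,
        "shallow feelings": None,                 # hard to say
        "lack of empathy": True,
        "doesn't accept responsibility": True,
        # Social deviance factor
        "easily bored, needs stimulation": None,  # hard to say
        "no realistic long-term goals": False,
        "impulsive behavior": False,
        "poor behavioral controls": False,
        "irresponsibility": True,
        "early behavioral problems": None,        # unknown
        "criminal versatility": True,
    }

    present = sum(1 for v in joker_traits.values() if v is True)
    absent = sum(1 for v in joker_traits.values() if v is False)
    unclear = sum(1 for v in joker_traits.values() if v is None)
    print(f"present: {present}, absent: {absent}, unclear: {unclear}")  # 8, 4, 3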

So the Joker has almost all of the interpersonal and emotional qualities of a psychopath, but fewer of the socially deviant qualities. It would be interesting to know which of the Joker's various origin stories Nolan had in mind when developing the script and directing Heath Ledger, or whether Nolan created a new origin story. As a viewer, I wanted to know more about how the Joker came into being: when and how he crossed the line from being a criminal or mob boss (as in the Jack Nicholson/Tim Burton incarnation of the Joker) to the wildly sick and sadistic individual in The Dark Knight.

Back to the relationship between antisocial personality disorder and psychopathy: Only a minority (less than 40%) of men with antisocial personality disorder are also considered to be psychopaths. But over 80% of men with psychopathy also have antisocial personality disorder. And lest you think that prisons are overrun with psychopaths, that's not the case—only 15% of males in prison meet the criteria for psychopathy.
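As a quick sanity check, those figures hang together arithmetically. Here's a sketch applying Bayes' rule within the male prison population; the prevalences are the ones cited above, and the ASPD midpoint is my assumption:

    # Do the cited prison statistics cohere? A rough check with Bayes' rule.
    p_aspd = 0.65              # my assumed midpoint of the cited 50-80% of male prisoners
    p_psychopathy = 0.15       # cited: ~15% of male prisoners meet psychopathy criteria
    p_aspd_given_psych = 0.80  # cited: over 80% of psychopaths also have ASPD

    # P(psychopathy | ASPD) = P(ASPD | psychopathy) * P(psychopathy) / P(ASPD)
    p_psych_given_aspd = p_aspd_given_psych * p_psychopathy / p_aspd
    print(f"P(psychopathy | ASPD) = {p_psych_given_aspd:.0%}")  # ~18%, comfortably under the cited 40%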

[The psychopathy factors are based on the Psychopathy Checklist-Revised, developed by psychologist Robert Hare, who has studied psychopathy extensively. Two of his books on the subject are Snakes in Suits: When Psychopaths Go to Work and Without Conscience: The Disturbing World of the Psychopaths Among Us.]

Monday, July 21, 2008

The Dark Knight--A Psychologist's View




This second installment of Christopher Nolan's vision of Batman and his world offers a psychologically rich and multi-layered portrayal of characters struggling with all-too-real issues of responsibility, trust, betrayal, the role of physical appearance in forming impressions of people, and psychopathy unleashed. Whereas Batman Begins explores Bruce Wayne's evolution from anxious child to the Caped Crusader, The Dark Knight brings us a matured Batman battling a terrifying psychopath whose main motivation appears to be similar to a two-year-old's, but with deadlier consequences: he does "stuff" just to see what happens.

This film is really about the Joker. We're lured into his world, where we learn what he's capable of and what he cares about—what motivates him. Learning more about him is like watching a car accident unfold, but worse and more frightening, because it feels like you might be hit next. Nolan's incarnation of the Joker, and Batman's reactions to him, seem so real that The Dark Knight doesn't feel like a superhero movie, but like a documentary on the emergence of a terrorist-cum-serial killer.

I was repeatedly struck by the Joker's cleverness and restraint: [spoiler alert!] He planned three simultaneous murders, Godfather style. [end of spoiler alert] That takes extraordinary planning abilities, meticulous attention to detail, and the ability to defer gratification. Sound like anyone else in the movie—like Batman? This Joker is neither impulsive nor capricious, although he may appear that way at first blush. Just as with Batman, the Joker's actions are designed to create a particular impression, an impression that puts his adversaries at a disadvantage: that he's weird and unpredictable. That you never know how far he'll push something, so take him seriously. This, too, is part of the impression that Batman tries to create. But the Joker's got Batman's number because he knows that Batman isn't entirely unpredictable—Batman lives within certain self-imposed and societally imposed rules. Because of those rules, Batman becomes predictable…at least to the Joker. Two men with similar talents, but in the Joker's case, his talents are used to create anarchy for his own amusement. Is he a psychopath? Let's investigate.

A psychopath is someone who displays personality traits that involve more than simply engaging in criminal acts. A psychopath:

(1) can be superficially charming and manipulative. The Joker is that. Imagine him without his clown makeup—imagine him with a handsome face doing and saying what he does in the film. He'd make a compelling—versus repellant—figure to some, wouldn't he?
(2) is emotionally callous, without remorse or guilt. The Joker lies freely when it suits him (he gives two different stories for how he came to have his scars, both likely untrue). He doesn't feel bad about his dastardly deeds—the lives he's taken or ruined.
(3) has a socially deviant lifestyle. Not only is the Joker a criminal, but he seeks out the excitement of his life of crime. His crimes act as a drug and he needs the regular fixes of the intense stimulation his actions bring. Unfortunately, he seems to develop tolerance to the drug, wanting ever-increasing "doses" and so planning increasingly sadistic and elaborate capers.

Hats off to Christopher Nolan, Jonathan Nolan, and the cast and crew. Heath Ledger's performance deserves an Oscar.

For more about my take on Batman, look for me on the History Channel's Batman Unmasked: The Psychology of the Dark Knight and in an essay I wrote, "What's the Matter With Bruce Wayne?," in the anthology Batman Unauthorized.

Sunday, June 29, 2008

Dr. Robin on the History Channel



The History Channel aired a segment called Batman Unmasked: The Psychology of the Dark Knight. I spoke about Bruce Wayne, the trauma he experienced in watching his parents' murder, and how it has transformed him.


Here are links to riffs I've given on other superheroes:

Superheroes generally (in the Boston Phoenix) and movie reviews of the first Hellboy and Hulk movies.

Thursday, June 12, 2008

The Power of Imagination




Last week, Harry Potter author J. K. Rowling spoke at Harvard University's commencement exercises (for the full text of her speech, click here; you can also watch it). In her moving speech, Rowling made explicit the links between imagination and empathy, and between empathy and social action:

Amnesty mobilises thousands of people who have never been tortured or imprisoned for their beliefs to act on behalf of those who have. The power of human empathy, leading to collective action, saves lives, and frees prisoners. Ordinary people, whose personal well-being and security are assured, join together in huge numbers to save people they do not know, and will never meet. My small participation in that process was one of the most humbling and inspiring experiences of my life.

Unlike any other creature on this planet, humans can learn and understand, without having experienced. They can think themselves into other people's minds, imagine themselves into other people's places.

Of course, this is a power, like my brand of fictional magic, that is morally neutral. One might use such an ability to manipulate, or control, just as much as to understand or sympathise.

And many prefer not to exercise their imaginations at all. They choose to remain comfortably within the bounds of their own experience, never troubling to wonder how it would feel to have been born other than they are. They can refuse to hear screams or to peer inside cages; they can close their minds and hearts to any suffering that does not touch them personally; they can refuse to know.

I was incredibly moved by her speech and, as a psychologist, I thought the links she made with empathy were profound insights into human nature: by imagining ourselves in other people's shoes, we develop a heightened empathy for them. In turn, this heightened empathy moves some of us to act on behalf of others. Her remarks also left me wondering about the implications for people with autism and Asperger's disorder, disorders that are typically marked by low levels of empathy. Does their difficulty with empathy mean that they also have difficulty imagining? It turns out that Rowling was on to something: children with autism engage in less imaginative play—they have difficulty generating novel ideas, particularly those that involve language (click here for an abstract about this work). Thus, some of the implications of her observations are supported by research.

Saturday, May 31, 2008

What Makes Iron Man So Good?




I just came back from seeing Iron Man, and it's been years since I've seen a superhero movie this good. Within the first 30 minutes I was hooked. (Truth be told, I was hooked within the first 5 minutes.) I sat in the theatre drawn into the story, but also trying to hang back enough to figure out what the director, writers, and actors had done to make the film work so well. Here's what I came up with.

First, the film has wonderful character development. Tony Stark's character—his personality and motivation—comes to life within minutes, but in a way that feels closer to three-dimensional than two-dimensional. We see that he's an incredibly brilliant and bored hedonist, playing out the role of smart and smart-ass industrialist without a real purpose in life. The first 30 minutes of the film don't feel like a superhero film because the characters are so well explicated and the relationships among them so clear. Unlike many superhero films, Iron Man shows the relationships among the characters rather than simply telling the viewer about them through narration or embedding them in dialogue.

Second, the story arc of Stark's transformation from a smart-ass hedonist into a superhero is believable. No science fiction needed—no aliens, no superpowers, no genetic mutations. Just a brilliant and creative engineer motivated to save himself and destroy his captors. (I suppose if I were an engineer I might think that the technical aspects of the story were beyond the realm of possibility, but I'm not, so the story didn't seem outlandish to me.)

Third, Stark fights two sets of enemies—one that is clearly the "bad guys" (his captors), and the other, someone close to him. I won't give away too much here in case you haven't yet seen the film. But because Stark has enemies who are not quite so easy to spot, the story is less black-and-white than some other superhero movies in which the villain is clear from the outset, such as the first Spider-Man film.

I could go on, but I urge you to see the movie and make your own analysis.

Sunday, May 25, 2008

Superheroes: Fashion and Fantasy



I just returned from the new Superheroes: Fashion and Fantasy exhibit at the Metropolitan Museum of Art in New York.

One thing that struck me about the "fashion" part of the exhibit—that is, the runway costumes, not the actual superhero costumes—is that the women's attire generally looked horribly uncomfortable to wear, whereas the men's runway costumes looked much more comfortable. For instance, in the photo above, Batman's costume (from this year's The Dark Knight film) is on the left, then there are two male fashion versions, and three women's fashion versions.

Would you rather wear the male or female costumes if you were going to be out all evening?

Of course I really shouldn't have been surprised by the difference in comfort and utility between men's "costumes" and women's "costumes". It's true of fashion in general. Men's clothes are built for comfort and practicality—ample deep pockets that can safely contain important objects (cell phones, keys, wallet) without danger of falling out or spearing buttocks when the wearer sits down.

In contrast, women's clothes are generally built to show off the woman's shape in the best light, regardless of comfort: tight dresses and skirts that make it impossible to take long strides when walking; high heels that make standing or walking for even moderate periods of time painful; no functional pockets, so purses are required (another opportunity for the fashion industry to make money with accessories!). I often wonder how men would react if men's suit designers did away with pockets so that men needed to carry some type of satchel for their important belongings.

But I digress…back to the Superheroes fashion exhibit and the utility of a costume. One more example: the "armor" section includes the costume for Iron Man, and then a women's fashion version (on the right, in case you couldn't tell the two apart):



More of women's armor fashions are below. If I needed armor, I'd go for complete coverage.

Saturday, May 10, 2008

Brave New World, Part III: Instant Communication and Regulation Problems

(This is the third of three blog entries about possible cultural ramifications of growing up in the age of instant communications)

One upside of instant communications (phone, texting, and the Internet, including email and instant messaging) is that they make it easier for people who are experiencing emotional turmoil to obtain information and resources to help them, such as social support—from friends and family, even from strangers on the Internet through websites and chat rooms.

This instantaneous knowledge and support has a downside. Consider that when someone is angry or frustrated, it's almost too easy to vent those feelings:
• Directly and privately to the person via email, texting, or phone;
• Indirectly and privately, by communicating with others about the person via email, texting, or phone;
• Publicly, by posting the negative feelings on a website.

These rapid communications mean that people no longer have to "sit with" uncomfortable feelings: If we're annoyed by the service at a restaurant, we can, with our web-enabled phone, rant about it on our blog or as a "reviewer" on a food website. If we're upset about an interaction with a friend, we can call most anyone to talk about it via our cell phone (and theirs), or send a long email about it. As soon as we have an uncomfortable emotion, instant communication technologies give us interactive outlets to try to transform these negative feelings.

Before instant communications, people coped with life's emotional vicissitudes in a variety of ways:
• We talked to a friend or family member in person;
• We tried to distract ourselves with activities;
• We wrote, in a diary or in a letter to a friend, family member, or to the person who aroused the feelings in us, to vent or better understand our feelings;
• We thought about and reflected on our feelings, the events that precipitated them, and the effects of possible courses of action.

What these activities had in common is that they typically required some effort, and sometimes some patience and the ability to tolerate our feelings until we could transform them or act on them in a significant way—we had to find the people we wanted to talk to (no cell phones or pagers), and we might have had to wait until they were available to speak to us; we had to work a bit at distraction—leaving our homes or offices, or at least leaving our chairs (rather than distracting ourselves with the internet on a computer always at hand). And thinking or writing about what had transpired gave us pause to reflect without acting on our feelings.

People can still cope in these ways, or with their modern communication equivalents; however, these newer modes of communication require significantly less effort, and—let's face it—why not go with the easier path? Moreover, we can express ourselves, and obtain a response, much faster. This speed can be a good thing, minimizing the time that people experience discomfort without relief. But it also has a downside: People have less experience and practice managing their emotions without acting on them rapidly. Unfortunately, the instant aspect of these communications seems to discourage deep reflection and encourage immediate reaction, leading to simple venting or "look at me" communications—profile updates and photos, comments on blogs or websites, and YouTube videos—that are less an attempt to share considered, reflective thoughts and images than to be noticed.

For each of us, these new technologies can become, in essence, extensions of ourselves that help us regulate our emotions. Feeling blue? We can shop (or window-shop) online, communicate with friends, find solace and support in a chat room, post a comment on a website. Angry? Vent your anger in a call, email, or post (in which you can be anonymous).

We've become used to acting quickly in response to uncomfortable feelings; we've become less adept at managing or regulating our feelings without acting. Then, in cases where instant communication fails us (either technologically or emotionally—because our feelings are so powerful or the instant communications don't work sufficiently), we are less able to tolerate or regulate the uncomfortable feelings. And when we're emotionally aroused, we become jangled and less able to reason and use our good sense. Being used to acting instantly, combined with being less able to reason things through, leads us to be more likely to act impulsively, without thought for the long-term consequences.

Are my speculations in this three-part missive merely the typical grumbling of someone from an "older generation" complaining about changes arising with the times and affecting newer generations? I don't think so. Neuroscience research has accumulated to the point where we know that brain development is affected by the patterns in our emotions, thoughts, and behavior. The new instant communications now available are influencing the ways that our brains develop—particularly for younger generations who grew up, or are growing up, with the new technologies during their formative years. We just don't yet know the specific ways that brains and behavior are being affected.

We can't unplug the internet, nor should we. I just think that we should try to anticipate the unintended but predictable consequences, some of which I've laid out, and then try to prepare for their negative effects—a less reflective, more action-oriented (and perhaps impulsive) populace, with a greater desire to be seen and heard than previous generations, who have at their disposal an ever-increasing smorgasbord of ways to obtain attention.

Wednesday, April 30, 2008

Brave New World, Part II: The Objectification of a Generation

(This is the second of three blog entries about possible cultural ramifications of growing up in the age of instant communications)

The internet is a powerful communication tool that allows us to express ourselves and to respond to other people's expressions by sending emails or posting comments on websites (theirs, ours, or some other website). Our self-expressions are shared through the internet with friends and family, and often with a nameless, faceless audience.

Combine this communication power with the uncommonly strong desire of young people to be seen and heard—to be noticed and to have an effect. To feel as if they're having an effect, teenagers may skirt (or cross over) the edge of appropriate behavior just to see what happens: Will anyone notice? And if people do, what will they do? On top of it all, teenagers are constantly "connected"—it's almost as if they think they don't exist if they're not connected. When they post photos of themselves or (yet again) update their profiles, it's almost as if it's more about being seen by others than about how they actually feel or how they experience themselves. That is, it's about being an object rather than a subject.

I think that the way these new tools are used promotes young people's thinking about themselves predominantly as "seen by others"—from the outside—and less based on an internal awareness of themselves. This process is referred to as objectification: the experience of being treated as an object results in coming to see oneself that way. Objectification theory was originally conceived of as primarily pertaining to females (the original article, by Barbara Fredrickson and Tomi-Ann Roberts in 1997, is here).

Unfortunately, the internet is propagating an extreme focus on appearance and the objectification of males as well as females. Both sexes are increasingly undergoing medical procedures to alter their appearance. Consider that in 2007 almost 11.7 million cosmetic procedures were performed, compared to 2.1 million in 1997. Although females receive most of the procedures, the number of men is increasing (nearly 1 million in 2007), and will undoubtedly continue to increase.
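For what it's worth, those two figures imply rapid growth. A quick, purely illustrative calculation of the implied average annual rate:

    # Implied average annual growth in cosmetic procedures, from the figures cited above.
    start, end, years = 2.1e6, 11.7e6, 10  # 1997 -> 2007
    annual_growth = (end / start) ** (1 / years) - 1
    print(f"about {annual_growth:.0%} per year")  # roughly 19% per year, compounded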

It's not only about appearance; it's about being noticed in any way—standing out in the cacophony of the crowd. And what's clearer than it's ever been is that there's no such thing as bad publicity. Talking heads get noticed by being ever-more controversial and outrageous. When people want to comment on some topic, they may not necessarily set out to write or say what they think; instead, they write or say things to make other people notice. So they're thinking about their own opinions as object rather than subject.

What this means is that a generation (and likely subsequent generations) may well develop a self-concept that is more about how they appear to others than about how they feel inside. Current internet-based communications amplify the tendency for teens to be exhibitionists; they are reinforced—with attention—for being revealing. The attention, in turn, keeps them focused on how they are seen by others. They thus are less likely to "know" themselves—what they like, what they need, how they work best—except through the eyes of others.

I'm reminded of the old philosophy question: If a tree falls in the woods and no one hears it, did it make a sound? Will those raised in the wake of the internet feel they don't really exist if they are not seen/heard (online)? Are we raising a generation of exhibitionists—people who only feel alive when they are observed or heard by others? And if so, will that lead to an ever greater cacophony of "self-expression," with increased outrageousness in order to be noticed above the din? What can we, as a society, do to help upcoming generations feel alive even when no one is seeing or hearing them?


Coming up in Brave New World, Part III: Is the Internet raising a generation of people who will have problems regulating their feelings and impulses?

(Click on "Subscribe" at the bottom of the page to get an RSS feed so you'll know when the next entry is posted.)

Monday, April 14, 2008

Brave New World, Part I

(This is the first of three blog entries about possible cultural ramifications of growing up in the age of instant communications)


Stories abound in the news and among parents of teenagers and young adults about young people revealing risqué or possibly self-incriminating information through new technologies: on websites, in emails and instant messages, in text messages. Teens and young adults may put descriptions or photos of themselves involving sex, alcohol, or drug use (even if feigned) online or in other communications. Although older adults try to caution them that, like plutonium, these communications are around forever, many young people don't seem to understand that even if they remove a photo from their website, it still exists on someone's—or some company's—hard drive. Even when admonished that college admissions or job offers may be rescinded, that they may get in trouble with their schools or their parents, or that it may come back to haunt them years later, many teenagers and young adults continue to parade their inappropriate behavior in public or incautiously write about matters that shouldn't be public. Why do they persist?

I think there are at least three factors at work:

1. Young people tend to underestimate the risk that their own actions will lead to negative events. Research on perceived risks and health-related behaviors reveals that young people particularly tend to underestimate the risks of untoward events happening to them. For instance, they tend to underestimate their risk of developing a sexually transmitted disease from unprotected sex. The good news is that when they do think they are at risk, they are more likely to take action to prevent the untoward event. (Neil Weinstein, now at the University of Arizona College of Medicine, has done extensive research on risk perception and preventive health behavior, such as smoking, vaccination, and screening for medical problems.)

I'd venture a guess that a similar process occurs with the new communication technologies: Young people underestimate their risk of negative consequences from their edgy electronic communications. Such communications include posting risqué profiles or photos of themselves online, and sending emails or text messages that could make trouble if forwarded into the wrong hands.

2. They are used to immediate gratification. The current generation of young adults, and those that follow, have received levels of immediate gratification previously unknown. There are very few things the youngsters of this generation have had to wait for, except perhaps to grow up. Consider that during the childhoods of most Americans currently alive, people had to wait until summer to be able to eat "summer" fruits and vegetables. Similarly, most markets and stores were closed on Sundays (so you or your parents had to plan ahead for necessities or do without that day). If the stores near you didn't have the item you wanted, you probably couldn't get it unless you special-ordered it, and that could take weeks. And when people were out and about their business, away from a land-line phone, there was no way to reach them short of physically going out and looking for them. Waiting was a part of life.

Not so for the current generation of young adults. These children have had little training in waiting—in delaying gratification or developing a tolerance for frustrated desire. Technology provides the possibility of instant knowledge, instant communication, and instant purchases (with overnight delivery!). You can answer most any question with a Google search (although the answer isn't necessarily correct). You can reach out and electronically "touch someone" in seconds: cell phones, text messaging, emails, instant messaging, and GPS locators. And you can purchase your heart's desire over the Internet; even the basest longings can find a quick outlet. No need to wait—at least not for very long.

Moreover, this generation may well have grown up with family interaction patterns that amplify the experience of immediate gratification: parents who inadvertently promoted their child's reliance on it. Many boomer parents avoid setting limits with their children to minimize conflict, for a variety of reasons. They may want to be "friends" with their children, sometimes forgetting that children need clear limits and that it's important for children to learn that their actions can have negative consequences for them. Or parents may simply be too emotionally drained to set limits that they don't feel able to enforce: A tired, harried parent has a harder time saying "no" than does one who isn't emotionally spent at the end of the day or week. (Why so tired? More families now have both parents working outside the home, and the jobs are more demanding and may require longer hours; more parents are also likely to be separated or divorced. All these factors leave parents more emotionally spent.)

When I was a kid and expressed my impatience about something, my father would say, "Patience is a virtue." As a child, I had no idea what he was talking about—why would it be a virtue? But it wasn't only patience he was advocating; he was trying to help me develop a tolerance for waiting—for wrestling with the urge to get something I wanted "now." It seems to me there's a lot less wrestling going on these days.

So if the next generations are being gratified with less and less delay, it makes sense that they'll send their electronic communications right away—they don't even have to wait until they get home. They can post the photos they took with their cell phones directly to their blogs via a web-enabled phone. Think it would be a blast to post those drunken photos online? Do it now. Even if they "understand" that the Internet is forever, they don't want to be deprived of the pleasure of sharing the photo, the joke, the edgy communication with friends, or of the satisfaction of seeing themselves or their words sent out into the world for others to see. They want to be noticed, and they don't want to wait.

3. They have immature frontal lobes. Research by psychologists and brain scientists has found evidence that adolescents, and even young adults, may not have brains that are as mature as their bodies. Teenagers' frontal lobes—the area of the brain involved in planning, foresight, inhibition, and judgment—aren't as adult-like as we may have previously thought. (For an overview of teenagers and brain development, see the PBS show on the subject: Inside the Teenage Brain.) So young adults may have a harder time figuring out the most prudent course of behavior in a given situation; even when they know it, they may have a hard time inhibiting themselves from pursuing the most immediately gratifying path.

Back to why a teenager might post information (or send an email or text message) that could later come back to haunt them. In sum, I think it's because it's hard to exercise good judgment and inhibit yourself from doing something you want to do when: (1) you think that bad consequences are unlikely to happen to you; (2) you really want to do it and you're used to instant gratification; and (3) your brain's development makes it harder to gauge the possible consequences of an action.

I'm not saying we should do away with the Internet or cell phones (as if we could!). But there are unforeseen and unintended consequences of the cultural changes wrought by the new technologies, and I'm suggesting we try to glimpse ahead to some of those changes and consequences so we can be better prepared for them. I'm reminded of China's one-child policy: an unintended consequence was a generation with significantly more males than females, creating a dating and mating problem for men, which in turn can lead to angry young men "competing" for women—a potentially unstable political situation. (How did this happen? In a culture that valued males, female infants (first-borns) were sometimes killed or abandoned, and some of those abandoned girls were adopted by foreigners. China has since changed its adoption policy, creating very stringent criteria for foreign adoptions, ostensibly to protect the orphans, but more likely to prevent a continued shortage of females.)

Let's try to glimpse downstream so we're better prepared for the repercussions that will inevitably arise from these changes in communication.

Coming up in Brave New World, Part II: Is the Internet raising a generation of people who view themselves as objects rather than subjects? (Click on "Subscribe" at the bottom of the page to get an RSS feed so you'll know when the next entry is posted.)

Sunday, March 23, 2008

A Community of Women

On a recent trip to New York, I went to the clothing store Loehmann's. If you've never been to Loehmann's fitting rooms, let me explain. A fitting room there is generally a largish, mirrored room that can hold eight or more women. Unlike the fitting rooms in most stores, the typical Loehmann's fitting room is communal—the women share the space and can see each other try on their selected clothes. The atmosphere in these communal fitting rooms is a pleasure: women of all sizes, shapes, and colors creating a sense of camaraderie and sororité (like the French "fraternité," but for women).

For instance, if a woman tries on a garment with a hard-to-reach zipper, there's always someone happy to help her with it. Need an opinion about how a pair of pants looks? Just ask, and the women will, nicely and tactfully, tell you when it doesn't suit you. Even when no advice is sought or given, the other patrons are warm and friendly. (Some women are uncomfortable trying on clothes in front of other women, so Loehmann's has some individual fitting rooms. But at peak shopping times there is a significant wait for the individual rooms, whereas the group fitting rooms usually don't have a wait.)

What is it about the communal fitting rooms that brings out such camaraderie among women? I think it's because we're willing to be seen by the women in the room with us, and we recognize that they're in the same position. In a society that promotes unrealistic and unhealthy ideals of beauty for women, each woman in the communal fitting room pushes back against those ideals—we are (semi)publicly acknowledging our bodies' imperfections and, in essence, adding, "So what?"

Thursday, February 28, 2008

What's Wrong With "In Treatment"

I just watched the first week's episodes of the HBO series In Treatment. As a clinical psychologist, I feel compelled to educate people about the ways in which the therapist, Paul, violates the legal mandates that govern health care providers, and the ways his work with at least some of his patients flies in the face of good clinical practice. There were enough legal and ethical transgressions that I'm concerned about what viewers will come to believe about therapy in the "real world." Of course it's only a television show, but it can shape viewers' beliefs and expectations about therapy (and about whether to begin therapy), which in turn will influence their behavior. I'm not going to address Paul's theoretical approach to his clients or his specific interventions—or lack thereof. I just want to set the record straight about what should happen during a first session with a mental health professional.

Health Insurance Portability and Accountability Act: Procedures, Policies and Confidentiality
If you've been to a health care provider in the last couple of years, then at your first visit you received a copy of his or her "Notice," as mandated by the Health Insurance Portability and Accountability Act (HIPAA); this Notice outlines the policies and practices of the health professional, including matters regarding confidentiality and its limits.

During Paul's first sessions with his two new patients, Alex and Sophie, Paul makes no mention of HIPAA or of the limits of confidentiality, nor does he ask whether the patients have any questions about confidentiality or other office policies or procedures. It is always good therapeutic practice to discuss the limits of confidentiality with new patients and to ask whether they have any questions. Yet Paul doesn't do this.

Evaluations as Part of Legal Cases and Treating a Minor
One of Paul's new patients, Sophie, is a 17-year-old girl who comes to him for an evaluation related to her dispute with an insurance company; she was hit by a car, and the insurance company suspects that the accident was actually a suicide attempt. There are three significant pieces of information that Paul did not explain to his minor patient:

1. Sophie is being evaluated by Paul as part of a dispute with an insurance company. Given the possibility that Paul may later have to testify, he should address with her at the outset the issue of confidentiality and its legal variant, privileged communication—who has the legal right to decide what is disclosed to others? Good clinical practice requires that, at the outset of an evaluation that may involve the legal system, the mental health clinician explain to the patient the nature of privileged communication and the limits of patient confidentiality. (When the evaluation is court-ordered as part of a criminal case, the patient usually is not the person who decides how the information is used.)

2. Given that Sophie is a minor, she cannot legally consent to having Paul release any information to her lawyer or to the insurance company. Her parents would have to consent, and according to Paul, he had not spoken with her parents. Paul should have made it clear to Sophie and her parents that her parents are the ones who must give consent.

3. For patients or clients over the age of 18, the therapist is not legally allowed to disclose any information without the patient's explicit permission, except to speak with the patient's other caregivers or in circumstances in which the clinician is legally mandated to disclose information, such as imminent suicide risk or suspected child abuse. With patients who are minors, though, parents and their child's therapist are legally allowed to speak with each other, even without the child's permission. Good clinical practice, however, dictates that the therapist let the minor know about the limits of confidentiality and the parents' rights; when the child doesn't want the therapist to tell the parents certain information, the therapist usually tries to work with the child and the parents to honor the child's wishes, unless there are legitimate concerns about withholding that information from the parents (e.g., suicide risk). Paul said nothing to Sophie about any of this.

When Therapists Want Help
On the final show of the first week, Paul's own troubling responses to his patients lead him to see Dr. Gina Toll in her office. What is the nature of their relationship, and what type of help is he seeking from her?

When therapists are having a hard time managing their own feelings about patients, they may seek supervision, a time-limited (e.g., 1-4 session) consultation for advice, or their own therapy. Although the lines among these three forms of assistance may at times be blurry, both the therapist and his or her "helper" (supervisor, consultant, or therapist's therapist) are best served by a clear discussion of the goals of the visits: Supervision/consultation (to help the therapist become more effective with his or her patients) versus therapy (to help the therapist address personal issues that may, coincidentally, be affecting his or her work with patients). Neither Paul nor Dr. Toll clarified the nature of his visit (therapy, supervision, consultation) or his goals in seeking her out.

Sunday, February 10, 2008

Cultural Learned Helplessness

What do these societies have in common: Gotham City before Batman, the Star Wars universe under Emperor Palpatine, the wizarding world of Harry Potter, and the United States under George W. Bush? The citizens of these societies experienced learned helplessness. What's learned helplessness? It's the kind of "giving up" that happens when people are repeatedly or chronically in an aversive situation and nothing they do can stop it or help them escape from it. The original research on learned helplessness, by psychologists Bruce Overmier and Martin Seligman, went like this: A dog was placed in a cage and received shocks from which it could not escape. After an initial response of agitation and making noises, the dog eventually became passive in response to the shocks—as if it had given up. The dog was then put in a new cage with two halves separated by a low barrier. If the dog jumped over the barrier when shocked, it could escape the shock. But the dog never did go over the barrier. (Dogs that hadn't experienced inescapable shock would jump the barrier when the shocks began.) Learned helplessness became a model of depression for people who experience uncontrollable or inescapable aversive situations such as abuse and discrimination. It can be a helpful lens through which to examine the citizenry of Gotham City, the Galactic Empire, and the magical world of Harry Potter—as well as contemporary America.

First, Gotham City: Before Batman's arrival, criminals controlled the city, and the police force was riddled with corruption. Citizens were afraid to go out at night, and they—and the good cops on the force—felt helpless against the ever-encroaching tide of crime and graft. Gotham's residents seemed to have developed learned helplessness in response to the city's crime: nothing they did made a difference. But Bruce Wayne didn't feel this way. He became an agent of change on behalf of Gotham's citizens. Why didn't he experience learned helplessness, particularly after witnessing the murder of his parents? His wealth and position in society shielded him from the chronic and inescapable crime and corruption experienced by most of Gotham's residents—and hence from the learned-helplessness-inducing cage of the city. So Wayne was in a better position to feel hope (or at least to feel that there was something he could do to turn things around). And he was right: When Batman arrived, his image and actions reversed the culture of the city; now it was criminals who felt fear, and the upstanding citizens and good cops could mobilize to take back their city. Wayne's being an outsider was instrumental to his belief that he could change things.

Next, the Star Wars universe (Episodes IV-VI): Emperor Palpatine was a ruthless tyrant who ruled the galaxy with an iron fist. He stopped at nothing to instill fear in those who would oppose him and used terror tactics to hound the populace into submission. He sought to create learned helplessness across his Empire—a citizenry ready to do his bidding with no questions asked. In Episode IV ("A New Hope"), only a relatively small band of rebels was sufficiently motivated and able to fight back in any way they could. Two of these rebels were Princess Leia and Luke Skywalker. (In fact, there's a sense in which Leia was the key player: she inspired Luke to join the Rebel Alliance. Han Solo only became part of the rebel cause through circumstance and his relationships with Leia and Luke.) How did Luke and Leia escape the learned helplessness prevalent throughout the Galactic Empire? Like Bruce Wayne, they were raised outside the dark cloud of fear and oppression, in remote regions of the galaxy beyond the Emperor's glare. Leia, like Bruce Wayne, was raised in a position of privilege and brought up to be a leader. Luke grew up on an even more remote planet. Both were untouched by the pervasive hopelessness of the larger society; they were outsiders, rebelling against the dark forces and instilling hope.

On to Harry Potter's wizarding world. In Goblet of Fire, Harry's godfather, Sirius Black, describes their world during Voldemort's first ascendance: "You know [Voldemort] can control people so that they do terrible things without being able to stop themselves. You're scared for yourself, and your family, and your friends. Every week, news comes of more deaths, more disappearances, more torturing … The Ministry of Magic's in disarray, they don't know what to do … Terror everywhere … panic … confusion … that's how it used to be." Here again, the heroes are outsiders who rebel against dark forces and inspire others. Harry and Hermione grew up in the Muggle world, outside the terror of Voldemort's reign. (Ron, raised within a terrorized society, is less a leader than a follower. Like Han Solo, it is his relationships with the leaders that impel him to take the risks that he does.)

I think that for many people in the United States, the years of George W. Bush's administration have been a period of learned helplessness. The administration and its cronies have engaged in fear-mongering, corruption and graft (legal though they may be in some cases), the curtailing of civil liberties, and outright lying about events relating to the Iraq war and other matters. Citizens who see these abuses of power have come to feel helpless to change them during his tenure; they are counting down the days until he leaves office. People also feel ground down by years of bitter political fighting between the parties.

Barack Obama is popular, in part, because he seems to be outside the system relative to the other candidates. Like Batman, Princess Leia, and Harry and Hermione, when he looks at his society, he sees what's possible and wants to make that possibility a reality. But Obama goes one better than the heroes and heroines I've noted: In his work as a community organizer, he did more than fight the good fight and inspire people; he encouraged people to act on their own behalf. He didn't do all the work for them; he helped them to help themselves. I think that's part of his appeal. To use a Batman metaphor, he doesn't offer to swoop in and round up all the bad guys; he'll help people figure out what they need in order to do their own policing, and then try to make sure they get it. People talk about Obama being more than a candidate—that he's the center of a movement. I think that analysis is correct: for many people, his candidacy isn't about what he'll do for Americans, but about what he'll inspire us to do for ourselves.

Robin S. Rosenberg is a clinical psychologist and author; most recently she contributed to and edited the book The Psychology of Superheroes.