
Homo Ludens—Is Playing Good for Us? November 30, 2010

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Human Nature.

Years ago the Dutch historian Johan Huizinga came out with a book, Homo Ludens, “The Playing Human,” which claimed that playing is older than human culture, that even adults play for the fun of it, and that it’s good for us. That was actually an eye-opener for most people at the time. Since then the scope of play-behavior analysis has been extended to the study of social animals (see Bekoff and Pierce, Wild Justice), suggesting that social play allows for the development of a sense of fairness and justice, not only in humans but in some species of animals as well.

In this article, “Why We Can’t Stop Playing,” we see the positive analysis of play continued—but this time the spotlight isn’t on play as a social activity, but on a very much solitary experience: “casual games” played on our computers and cell phones, mainly to pass the time while waiting for appointments:

Why do smart people love seemingly mindless games? Angry Birds is one of the latest to join the pantheon of “casual games” that have appealed to a mass audience with a blend of addictive game play, memorable design and deft marketing. The games are designed to be played in short bursts, sometimes called “entertainment snacking” by industry executives, and there is no stigma attached to adults pulling out their mobile phones and playing in most places. Games like Angry Birds incorporate cute, warm graphics, amusing sound effects and a reward system to make players feel good. A scientific study from 2008 found that casual games provide a “cognitive distraction” that could significantly improve players’ moods and stress levels.

Game designers say this type of “reward system” is a crucial part of the appeal of casual games like Angry Birds. In Bejeweled 2, for example, players have to align three diamonds, triangles and other shapes next to each other to advance in the game. After a string of successful moves, a baritone voice announces, “Excellent!” or “Awesome!”

In the 2008 study, sponsored by PopCap, 134 players were divided into groups playing Bejeweled or other casual games, and a control group that surfed the Internet looking for journal articles. Researchers, who measured the participants’ heart rates and brain waves and administered psychological tests, found that game players had significant improvements in their overall mood and reductions in stress levels, according to Carmen Russoniello, director of the Psychophysiology Lab and Biofeedback Clinic at East Carolina University’s College of Health and Human Performance in Greenville, N.C., who directed the study.

In a separate study, not sponsored by PopCap, Dr. Russoniello is currently researching whether casual games can be helpful to people suffering from depression and anxiety.

Hardly an incentive for further development of one’s sense of fairness and justice, like social play! But it may still have merit, if it can offset the unnaturally high levels of stress most of us labor under. For one thing, we can conclude that playing games by oneself adds an important dimension to the play-behavior phenomenon; for another, I find it fascinating that the article doesn’t end with a caveat such as, “You’re just being childish, needing approval from the world,” or “If you play too much you’ll become aggressive/a mass murderer/go blind,” or whatever. For decades we’ve heard about the bad influence of computer gaming, as a parallel to the supposed bad influence of violent visual fiction. But the debate is ancient: to put it into classical philosophical terms, Plato warned against going to the annual plays in Athens, because he thought they would stir up people’s emotions and thus impair their rational, moral judgment; Aristotle, who loved the theater, suggested that watching dramas and comedies would relieve tension and teach important moral lessons. In the last two or three decades most analyses of the influence of entertainment have, almost predictably, ended with a Platonic warning about the dangers of violent TV, movies, and videogames. Are we slowly moving in an Aristotelian direction? That would be fascinating, but here we should remember that Aristotle didn’t want us to OD on entertainment: the beneficial effects are only present if entertainment is enjoyed in moderation. Fifteen minutes of “Angry Birds” ought to be just enough…


Magical Thinking, in Moderation December 24, 2009

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts, Philosophy of Literature.

Remember when children’s books weren’t allowed to contain anything imaginary? At least according to the recommendations of child psychologists. We’re talking about the 1970s and well into the Eighties. No fairy tales allowed, no Tooth Fairy, no Santa, and above all no imaginary friends, because one wouldn’t want children to grow up with a bunch of illusions that life could never measure up to, would one? So instead they wrote children’s books about parents divorcing, Fluffy the dog dying, and other realistic in-your-face topics, to train kids for more in-your-face adult hardship. Oh joy! That wasn’t much fun, was it? And I suspect that magical thinking never went away; it just went underground—and resurfaced in graphic novels. So for a while we’ve been used to superheroes being part of the Collective Unconscious of kids. But now we even hear from psychologists that it is downright healthy for kids not only to be exposed to fantastic tales, but even to make up stories themselves. Imaginary friends are to be encouraged and welcomed into the family! Apparently, children’s cognitive powers thrive by being exposed to, and learning to be comfortable within, an imaginary universe.

Psychologists like Jacqueline Woolley, a professor at the University of Texas at Austin, are studying the process of “magical thinking,” or children’s fantasy lives, and how kids learn to distinguish between what is real and what isn’t.

The hope is that understanding how children’s cognition typically develops will also help scientists better understand developmental delays and conditions such as autism. For instance, there is evidence that imagination and role play appear to have a key role in helping children take someone else’s perspective, says Dr. Harris. Kids with autism, on the other hand, don’t engage in much pretend play, leading some to suggest that the lack of such activity contributes to their social deficits, according to Dr. Harris.

…It is important but not necessary for parents to encourage fantasy play in their children, says Dr. Woolley. If the child already has an imaginary friend, for instance, parents should follow their children’s lead and offer encouragement if they are comfortable doing so, she says. Similarly, with Santa, if a child seems excited by the idea, parents can encourage it. But if parents choose not to introduce or encourage the belief in fictitious characters, they should look for other ways to encourage their children’s imaginations, such as by playing dress-up or reading fiction.

For a narrative ethicist like myself this is of course fun stuff: psychologists advocating magical storytelling as an enhancement of social skills! That’s what narrative ethicists call a moral thought experiment. All over the world, raconteurs of children’s stories have always engaged in such thought experiments, but it is encouraging to see such an activity being promoted by psychologists. However… there’s got to be more to the study than that. Exactly how, and when, does the child learn the difference between what’s real and what isn’t? Where is the built-in reality check? How far is the encouragement supposed to go? And is there an upper age limit? Are we supposed to engage in magical thinking into adulthood? (Which of course brings up the whole question of religion, and numerous anthropological studies.) This could be the flip side of the austere no-fairy-tales attitude: an indiscriminate acceptance of fantasies and magic, and I’m already beginning to yearn for stories like “When Mom and Dad Split Up.” Storytelling as a cognitive/ethical device has to include a measure of moderation, and a clear understanding that fantasy only “works” when contrasted with reality. And the studies referred to surely must include just such an understanding—it’s just not apparent from the article.

Be that as it may, there is another aspect that fascinates me: the similarity to the old discussion between Plato (who discouraged an interest in fiction) and Aristotle (who encouraged it). Arguments that were presented 24 centuries ago are still valid today: Plato’s concern that exposure to emotional fiction (in the theater) can make the audience forget the all-important self-control provided by rationality, contrasted with Aristotle’s enthusiasm for the moral and psychological cleansing provided by a good, emotional drama. But both Plato and Aristotle lived in a world where moderation (meden agan, “nothing in excess”) was a moral and aesthetic ideal. So if we go down the Aristotelian path and encourage an immersion in dramatic fiction, we should remember that he never meant for it to replace our sense of reality, but to enhance it. Some imagination is good, and even necessary in order to understand other minds and other possibilities. Too much of it is not a good thing!

So, getting back to the imaginary friends: since this is Christmas Eve, is our imaginary friend Santa a plus or a minus in the cognitive development of a child? You decide. I never had a problem with Santa, not even when I realized (around the age of 5) that he was my granddad. And I was very careful not to let on that I had figured him out, because he was so jolly, and I didn’t want to ruin his Christmas…