
2013 was the 50th Anniversary of “The Banality of Evil” December 17, 2013

Posted by Nina Rosenstand in Criminal Justice, Culture, Current Events, Ethics, Nina Rosenstand's Posts.
3 comments
While it is still 2013 I want to mark the anniversary of an important concept in moral philosophy: the re-legitimizing of the concept of evil through the writings of German philosopher Hannah Arendt. 50 years ago Arendt launched a concept that was to have an enormous influence on discussions in the field of ethics for the rest of the 20th century, discussions that continue to this day––an influence that has had wider consequences than she probably imagined. She launched the idea of the banality of evil.
Interestingly, philosophers in the early 20th century, especially within the Anglo-Saxon philosophical tradition, had all but given up on the word evil. Too much religious baggage (just think of the problem of evil, the theodicy: how can a good god allow horrible things to happen?), too judgmental, too moralizing––at a time when most English-speaking philosophers were grappling with the meaning of words rather than with the meaning of life. Back in the 19th century and further back, thinkers such as Immanuel Kant had no problem throwing themselves and their readers into discussions about the ultimate meaning and values in life, and the notion of an evil person was not alien or uncomfortable as a topic of analysis. But what is a moral philosopher to do when, in the mid-twentieth century, the ultimate worst conceivable behavior runs rampant across the continent of Europe? Call it “bad behavior”? Call it “behavior frowned upon by most people according to the standards of Western societies?” Call it “a moral choice made by a cultural subset?” Or “just another form of acculturation?” What words could a mid-twentieth century philosopher use to describe Nazi atrocities that would seem sufficient? In 1963 Arendt, a German Jew who had narrowly escaped the Holocaust, witnessed the trial in Jerusalem of the captured Nazi official and one of the chief organizers of the Holocaust, Adolf Eichmann, and observed that to her surprise he did not look like a monster. He looked, and sounded, frighteningly normal. And that, she said in her book, Eichmann in Jerusalem: A Report on the Banality of Evil, was the terrifying key to the Nazi engagement in terror, torture and murder: The Nazi torturers were normal people who, either under pressure from superiors, or from twisted values and twisted thinking, had reached the conclusion that torturing and killing innocent human beings was the right and normal thing to do. It had become a banality––the banality of evil.
And thus was launched a renewal of the concept of evil in philosophy; removed from a religious context, it focused on the inner “Schweinehund,” our baser, cowardly instincts that make us follow the crowd, or go with the lowest common denominator when it comes to standing up for simple decency and humanity, to the point where we are willing to disregard the humanity of another person if someone in authority tells us we’re not responsible, that it is for the good of all, or that bad things will happen to us if we don’t comply. Stanley Milgram corroborated the phenomenon with his “electric shock experiments”––to see how far a person would go when told to shock people who had given the wrong answer to a question. It is a relief to know that nobody was really being shocked, but utterly disconcerting that most of Milgram’s subjects were quite willing to shock people to death when told that they had to. Banality of evil. And a few years later Philip Zimbardo ran his now infamous Stanford Prison Experiment, unintentionally resulting in real harm caused to “prisoners” by “prison guards.” The participants were merely students doing role-playing; the “prison guards” had been told to be ruthless to the “prisoners,” but some of them took to their roles with zest, and very quickly. Zimbardo himself pointed out, in his book The Lucifer Effect from 2007, that the willingness of the “prison guards” to commit atrocities was the banality of evil rising in completely normal people put into a situation where their moral compass had somehow been disengaged—a situation that he saw paralleled in Abu Ghraib. (I’ve visited the subject several times before in this blog, such as in “The Concept of Evil, and Joseph Duncan,” and “On a Scale of 1 to 22…”, so essentially I’m merely summing up what I’ve said several years ago, but with the specific anniversary of the concept in mind.)
So since the 1960s we have, in philosophy, been able to use the term “evil” for the moral weakness lurking in almost every individual, a weakness that can make us follow, or even give, unspeakably vicious orders and forget about the humanity of other people. But is that enough? It took courage for Arendt to reintroduce the concept of evil into the philosophical vocabulary; it took integrity to give voice to a condemnation that most thinkers felt, but perhaps were too polite and well-schooled in non-judgmental meta-ethics to engage in.
And now the question is whether we’re ready to, once again, redefine the concept of evil. Because surely not all acts that cause irreparable and deliberate harm are committed by people following orders or bad role models. Sometimes some individuals make choices they know will cause harm, and they know it is considered wrong by society, but they want to do these things anyway, for gratification or profit. The march goes on, from school shootings to the imprisonment of women (sometimes strangers, sometimes daughters!) in locked rooms or basements for sexual gratification, to serial killings. Is that the banality of evil, too? I don’t believe so. Immanuel Kant would say that such choices deserve the term evil because they are (1) deliberately done, (2) with awareness that they violate society’s standards, (3) causing intense physical or psychological pain or damage to innocent people, and (4) done for selfish reasons. Is it possible that those individuals are more mentally ill than “evil”? Perhaps, but as long as our legal system allows us to recognize a criminal as “sane” if he or she has a sense of the moral rules of society, then breaking the rules is a deliberate act more so than the act of a helplessly sick person. So perhaps it is time to allow an old concept to return to our modern vocabulary of ethics? In my textbook, The Moral of the Story, I have adopted the term “egregious evil” for such behavior, to differentiate it from “banality of evil,” but it is still a good question whether we are referring to evil acts, or evil persons, because therein lies a world of difference.
Maybe in the future we can come up with a term that is better suited to describe such acts of deliberate, egregious harm, a term that will allow a society to express its moral outrage while at the same time recognizing whatever neuroscientific insight into criminal pathology we may acquire. Until then I will feel comfortable with a cautious application of the old word evil in its two applications—banal and egregious.

The Ethics of Food: Why not Horse Meat? March 2, 2013

Posted by Nina Rosenstand in Culture, Current Events, Ethics, Food and Drink, Nina Rosenstand's Posts, Philosophy of Food.
6 comments

A scandal is shaking up Europe: meat departments in supermarkets have been pulling “beef” from the meat counters, because horse DNA has turned out to be present in what was sold as beef. And lately the Swedish furniture giant Ikea has been pulling its meatballs (oh no, not that!) and sausages from stores in 21 European countries. According to an AP report,

“Monday’s move comes after authorities in the Czech Republic said they had detected horse DNA in tests of 1-kilogram (2.2-pound) packs of frozen meatballs labeled as beef and pork.”

According to Ikea, their stores in the U.S. are not affected, because they use meat from suppliers in the U.S. However, Burger King recently severed ties with an Irish supplier because of horse flesh contamination, and Taco Bell has had similar issues.

So this is now an expanding scandal–but what exactly is the problem? First of all, it is of course a matter of consumer confidence: You buy something believing it is beef, so you don’t want something that’s not beef. (If you were actually shopping for a horse burger in France, you would not appreciate it if the meat had been mixed with pork, or ostrich. It is a matter of consumer expectations.) But second of all, there’s the horse thing.

 In some places of the world they eat horse, and like it. Growing up in Europe, I’ve been served horse burgers myself, and I didn’t much care for them; they tasted too sweet for me, like a hamburger with honey. In certain cultures in Southeast Asia dog and cat meat is on the menu. In some villages in Africa they have, at least until recently, eaten gorilla. In some remote locations in the South Pacific “long pig” was considered an acceptable food item (at least according to legends and Hollywood movies) until well into the 20th century. And we all come from distant ancestors who ate just about anything that would keep them alive. In some places they have even eaten dirt, but that’s not digestible. Meat is. Food can be many things to many people, and just because something can be digested doesn’t mean we accept it as food. Food taboos are known all over the world, and some are founded in the culture’s religion (such as the ban on consuming pork in Judaism as well as in Islam, and the ban on eating beef in Hinduism), while others reflect memories of past contaminations (and historians speculate that perhaps most food taboos have such contamination fears as their point of origin).

But some of the food taboos in a modern, largely secular culture such as ours are neither founded in religion nor based on past memories of contaminants. It isn’t inherently any more unhealthy to eat horse, dog, or cat than it is to eat beef, but most of us wouldn’t dream of serving or eating those animals, because we regard them as pets, and even as family members. So there is the familiarity factor, and the cuteness factor, but of course the food taboo can also include a “Yuck” factor such as in our reluctance to eat rats. (And how about snails? Oysters? Prairie oysters? Depends on what we’re used to. When the eponymous hero in the movie Tom Horn is served lobster for the first time, he quips, “I’ve never eaten a bug that big.”)

Our legislation doesn’t always reflect such taboos, nor is it always clear about the prohibitions and the reasoning behind them. We can’t slaughter, serve or eat dogs and cats. Up until 2011 a horse could not be slaughtered (for human consumption) in the US, but Congress did not extend the ban, which then expired.

  In Nov. 2011, Congress decided not to extend a ban on USDA horse meat inspections. Over the five years prior to that, Congress banned the USDA from using any taxpayer funds for horse slaughter inspections through its annual budget appropriations for the department. And since the Federal Meat Inspection Act requires the USDA’s Food Safety and Inspection Service to inspect animals for slaughter, carcass by carcass, there was no way for horses to make it to American dinner tables.

 But since the ban has been lifted, there still are no protocols for the USDA to conduct equine inspections.

Despite a November 2011 decision by Congress not to extend the ban on horse slaughter, the USDA says there are no establishments in the United States that slaughter horses.

“It is a hugely political issue – it has to do with the slaughter of horses and whether that’s acceptable to U.S. society or not – and so there are two sides to the argument,” said William Hallman, director of the Food Policy Institute at Rutgers University in New Jersey.

 Opponents of horse slaughter essentially say eating horses is not part of American culture, equating it to the slaughter of other pets.

“We have a 250 year relationship in the United States with horses and eating them has never been a part of the equation,” said Wayne Pacelle, president and CEO of The Humane Society of the United States. “It would be quite a turn in the road to view animals who helped us settle the country as an appetizer or main course.”

But didn’t oxen also help us settle the country? Those big Conestoga wagons were sometimes pulled by oxen. And oxen have pulled plows. Every time we eat a steak or a burger, we bite into the remains of a steer. Some gratitude! The fact remains that our food taboos are selective, and based on feelings as well as tradition and convenience. Some people won’t eat “anything with a face.” Some won’t eat anything with a cute face. Some will eat anything as long as it no longer has a face. How do you feel about the horse meat issue? Would you eat horse? Why or why not? And is there an inherent moral difference between eating horse, beef, pork, snake, kangaroo, or grubs? Not to mention “long pig”? Let’s assume that none of the species are endangered… So where do we draw the line? At the level of intelligence, a Kantian response? Pigs are far more intelligent than horses, according to the experts. How about according to the amount of suffering, a utilitarian approach? If emotional suffering (=fear) counts, then we all know what “Silence of the Lambs” means, and animal behaviorist Temple Grandin has taught us that the fear factor is very high in animals being led to the slaughter. How about another utilitarian angle, a distinction between the suffering of one animal feeding many people vs. the suffering of one animal feeding just a few? (A steer vs. a chicken, for example.) (Or how about the choice of ethical egoism: satisfy your own needs in pursuit of your own happiness?) Regardless of our underlying moral theory we make choices, and they are grounded partly in our traditions, and partly in our feelings, rarely in dispassionate logic. So granted that our cultural choices of food are more driven by emotion than by other considerations (unless we’re starving), at what point does your food ethic kick in?

‘Tis the Season to Give (and Receive) Gifts December 9, 2012

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts.
1 comment so far

The philosophy of giving is an interesting little branch off the big branch of Generosity, growing on the tree of Virtue Ethics. Interestingly, Generosity is entangled with another branch, Gratitude. (I even wrote something about that in The Moral of the Story Chapter 11). So when an article in the Wall Street Journal recently focused on giving and regifting, I thought I’d share some of its points with you. First of all, an interesting illustration:

[Illustration from the Wall Street Journal article]

Next, some fascinating points made:

Some gift givers spend time and energy trying to find just the right gift. But thoughtful gifts don’t necessarily lead to greater appreciation, according to a study published in November in the Journal of Experimental Psychology: General. The benefit of a thoughtful gift actually accrues mainly to the giver, who derives a feeling of closeness to the other person, the study found.

People are more appreciative when they receive a gift they have explicitly requested, according to a similar study published last year in a separate publication called the Journal of Experimental Social Psychology.

Sharon Love once received a book that was clearly regifted: It was inscribed to the giver. She gave it back to him the following year. Ms. Love, who heads a marketing agency in New York, is herself a regifter when a gift is appropriate for another person.

“It turns out it’s not the thought that counts, it’s the gift that counts,” says Nicholas Epley, a professor of behavioral science at the University of Chicago…

Oh, where to start? What a smorgasbord of philo-associations!

Psychological egoism: They’ve said it all along, we give so we’ll feel good! BUT if the receiver doesn’t appreciate our gift, we won’t feel nearly as good, so we must have at least some interest in actually pleasing someone else.

Aristotle’s Golden Mean: There are a thousand ways to miss the bull’s-eye, and only one right way to hit it. There is one right gift for our friend/mom/dad/spouse/child/colleague out there, and if we have an excellent character we will know what that is.

The Revision-of-the-Golden-Rule philosophy/The Platinum Rule: And the right thing is how they want to be treated, not what you want to give them (because that’s what you’d want yourself! Think of Homer Simpson and the bowling ball for Marge) 🙂

And there is more support for Aristotle here:

Another study found spending more money on a gift doesn’t necessarily translate into greater appreciation. That might come as a surprise to many gift givers, who often assume that a more expensive gift conveys a higher level of thoughtfulness, according to the research, published in 2009 in the Journal of Experimental Social Psychology.

I don’t mean to sound sanctimonious, but some of us grew up in a less materialistic world, and the idea of “the more expensive, the better” is somewhat alien to us. But there’s always the assumption that if someone is going to return our gift to the store, then it looks better if they can get another gift at the value of $50 than at $15…That’s just human nature. But again, what would Aristotle say? The Golden Mean is a mean between two extremes, too much and too little. For each situation there is an appropriate action/feeling (and purchase), and sometimes what your recipient really really wants is something small and simple. Sometimes it is huge and expensive, to be sure, but then Aristotle would say that you are guided by the Golden Mean of your ability to give, and fondness for/past history with the recipient.

And then there are thoughts about regifting, about a gifted purse:

“I thought, ‘You know, I know someone else would like it more than I would.’ So I gave it to one of my friends for her birthday,” Ms. Sayeed says. About six months later, the friend came over to Ms. Sayeed’s aunt’s house, purse in hand, and the aunt exclaimed, “You know, Humera has a purse just like that!”

“I said, ‘You know Auntie, I loved it so much that I got her the same one,’ ” Ms. Sayeed fibbed. “I had a moment to probably come clean about it and I just decided it would be better not to, which I guess is why people feel sneaky about regifting.”

So a kind little utilitarian lie is also part of the discussion…And the upshot is,

The adage “It’s the thought that counts” was largely debunked by the recent study in the Journal of Experimental Psychology: General, which concluded that gift givers are better off choosing gifts that receivers actually desire rather than spending a lot of time and energy shopping for what they perceive to be a thoughtful gift. The study found thoughtfulness doesn’t increase a recipient’s appreciation if the gift is a desirable one. In fact, thoughtfulness only seemed to count when a friend gives a gift that is disliked.

And that brings me to my final branch of this discussion, on the tree of Virtue Ethics: the virtue of Gratitude. And this is where we switch from “descriptive” to “normative.” After all, we’re not doing psychology but philosophy here. So my response would be, Then start showing some gratitude for the thought, for goodness’ sake! Gratitude is not just a feeling, but an attitude (yes I know, it actually rhymes). You can show gratitude even if you don’t have that warm, overwhelming feeling. If you wait for the feeling to arrive, somebody didn’t raise you right. So when you get that yucky somethingorother, regifted or not, then smile and say thank you, and if you can tell that somebody actually spent a lot of effort in getting that one thing to you, tell them it’s amazing how well they know you. And since I’m not a fan of regifting at all (you risk offending a kind giver irreparably), I suggest you donate the gift that wasn’t perfect. Somebody out there in a thrift shop will thank you.

Merry Christmas/Happy Holidays!

Russell Means in Memoriam: Mitaku Oyasin October 25, 2012

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Gender, Political Philosophy, Teaching.
1 comment so far

Many of my students have heard me talk about Russell Means over the years. A complex man in complicated times, who believed he saw a simpler solution to the culture war that, in his view, still existed between the American Indian communities and mainstream America. A man who had his own vision, and sometimes version, of history. Russell Means passed away October 22 from throat cancer, and the spectrum of Americans in the public eye has lost a unique dimension.

So who was Russell Means? An Oglala Sioux Indian, with many different facets to his life. The American Indian activist Means was the chief organizer of the second Wounded Knee uprising in 1973, and was involved in various American Indian movements. His activist accomplishments are outlined here. The politician Russell Means ran as the vice presidential candidate on Larry Flynt’s presidential ticket in 1983, and ran against Ron Paul for the Libertarian presidential nomination in 1987 (Paul won). The actor Russell Means was a significant Hollywood presence, playing the iconic character Chingachgook in the 1992 movie The Last of the Mohicans—and provided a special ambiance to the 2004 season of the HBO series Curb Your Enthusiasm. His film credits are numerous, as you can see on the Wikipedia site. The businessman Russell Means ran a website where he told his story, and the story of the plight of the American Indian, and sold CDs, art and t-shirts in support of the American Indian cause. And (this is where my connection to him comes in) the lecturer Russell Means would travel around the United States to college campuses, educating new generations of students in what became his own version of American Indian history. He was a speaker at San Diego Mesa College in the 1990s, and that was where I met him. (And I should also mention: the private citizen Russell Means had domestic problems, and was arrested for assault and battery toward his father-in-law back on the Navajo reservation. And he had more severe legal problems earlier on, when he was indicted for murder on a reservation but acquitted.) And his image is familiar to Andy Warhol fans–Warhol painted Means 18 times.

But back to the lecturer persona. I wish I could tell you exactly when Means visited the Mesa College campus, but I don’t see any references to his visit on the Mesa College website, and all I remember is that it was in the early years when I was first teaching Philosophy of Women; so: the late 1990s. He was scheduled to give a talk in the room behind the cafeteria, and I decided to bring one of my classes to the talk. The room was packed with people, sitting on chairs, standing, sitting on the floor, and Means, 6 ft tall or more, hair in braids, was a very imposing sight, and a gifted speaker. Well, he kept on talking past the class period, and I ran back to the classroom and collected the students waiting for my next class, and brought them down to the cafeteria; Means was still talking. And he kept on talking for a good four hours, about what it is like to be an American Indian, about his battles and his careers, about American Indian traditions, about discrimination and near-genocide, and about the term “Indian” itself. He shocked most of us PC college people by declaring that he didn’t mind being called an Indian, and that the proper term to use was either “American Indian” or the tribal name such as Oglala Sioux Indian, but never Native American. He said it was a term invented by Washington for funding/political purposes (which is why I, to this day—and my students and readers of my books can verify that—always use the term American Indian). And the term Indian itself? Wikipedia (below) got his argument right, but whether it is also historically correct is something I can’t determine (and I have yet to find a historian who agrees with Means): 

Since the late 20th century, there has been a debate in the United States over the appropriate term for the indigenous peoples of North America. Some want to be called Native American; others prefer American Indian. Means said that he preferred “American Indian”, arguing that it derives not from explorers’ confusion of the people with those of India, but from the Italian expression in Dio, meaning “in God”.[17][18] In addition, Means noted that since treaties and other legal documents in relation to the United States government use “Indian”, continuing use of the term could help today’s American Indian people forestall any attempts by others to use legal loopholes in the struggle over land and treaty rights.

In addition he talked about what I referred to above as a culture war between the mainstream Euro-American tradition and the American Indian peoples. He said that the Euro-Americans are a culture of swords and violent domination, while the American Indians are a sharing culture, a culture of partnerships symbolized by a bowl or drinking cup. At that point my ears perked up, because I had just been reading Riane Eisler’s book The Chalice and the Blade, about the ancient gynocentric (female-oriented) partnership cultures of Old Europe symbolized by the chalice, vs. the invading patriarchal dominator cultures worshipping the “lethal Blade.” So I asked Means if there was a connection, and he said, “Yes, there is this woman who wrote a book about the same phenomenon in Europe, and it fits the situation in this country between the indigenous peoples and the invaders.” From a gender-philosophical standpoint I found it fascinating that he would adopt the Eisler theory as an explanatory model for the American Indian culture (even if Eisler is also considered an activist and in no way a historian). I tend to be skeptical of such arguments, which tend to simplify very complex matters and fan an ongoing (and possibly outdated) tension and enmity, and I see no reason to find Eisler’s theory more historically accurate simply because Means liked it, but I found the confluence of research, activism and tradition to be intriguing. If you want to experience him talking about the topic of partnership cultures and gynocentric (matriarchal) cultures, watch this YouTube clip.

Means ended his 4-hour lecture at Mesa by teaching his audience the end of every Oglala and Lakota prayer: two words that embrace all of creation, everywhere and for all time: Mitaku Oyasin: We are all related. And while much of his lecture was, to a scholar such as myself, a creative journey into personal interpretations rather than facts (and sometimes interpretations that were hard to swallow), his passionate sincerity rang true, and has stayed with me as a cherished memory. And his prayer still comes to mind sometimes when I’m looking for connections and common ground rather than analytical differences. So: Thanks for the lessons, Russell, and Mitaku Oyasin…

Cross-posted at Rosenstand’s Alternative Voice blog for Rosenstand’s Mesa students.

The Ethics of…Game of Thrones! May 29, 2012

Posted by Nina Rosenstand in Culture, Ethics, Film, Nina Rosenstand's Posts.
4 comments

Time for some early summer fun; lucky are those (like me) for whom fun and work often end up merging, as they do in narrative ethics. And I’ve found the HBO series Game of Thrones to be seriously fun, once you get into the fictional, pseudo-historical universe. (I haven’t read any of the books yet—I understand the TV series is deviating from the original in increasingly dramatic ways.) If anybody wants to catch up on the series before the season finale on Sunday, HBO is running the entire season this week. You can have an early-summer GoT marathon—and afterwards you can acquire a copy of Game of Thrones and Philosophy from Blackwell, which I have just ordered as a light summer read! If you are not worried about reading a spoiler article, take a look at Time’s review of episode 9, “Game of Thrones Watch: Smoke on the Water, Fire in the Sky” (those of you over 50 can start humming now…). It contains good character analyses, and a particularly insightful view of the character who emerges as the real focal point of the story, Tyrion Lannister:

I was on the verge of calling Tyrion’s behavior “heroic,” but that’s not really the term. Notably, we see that this is not Tyrion rising to his true calling or discovering that it is a far, far greater thing he does, &c., &c. It’s a practical decision, in that if the defenders of the city are not inspired, he will die. He plays the part (and Peter Dinklage does) masterfully, but he rouses his men with a purely practical argument too: “Don’t fight for your king, and don’t fight for his kingdoms. Don’t fight for honor, don’t fight for glory. Don’t fight for riches, because you won’t get any.”

And the reviewer could/should have added what comes next—what Tyrion tells his army: “Fight for your homes.” Because Tyrion may be pragmatic, but he is not altogether a cynic.

All in all, it is a story about moral decisions, big and small—split-second decisions that come from the heart, or weighed by a calculating mind, and which all have consequences. Some decisions are made from a utilitarian stance, some from a deontological one. Lots of ethical egoism in there, too, and just knee-jerk egoism. And some characters are pure at heart, and we see their ethic of care, their virtue ethics unfold, such as Sansa, who goes from being a victim to suddenly finding strength in helping others.

And so forth! If you’re looking for a joyride this week, leading up to the season finale on Sunday (and have cable), watch the nine episodes on HBO and look for all the moral, immoral and amoral viewpoints swirling around. A well-told tale, well acted, just right for some summer speculations about the fictional problems of fictional characters.

Enjoy your summer!


Titanic–a Tale to Remember April 14, 2012

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts.
1 comment so far

So now we think we know what happened, on that night exactly 100 years ago. Divers have explored the wreck, animated computer models have been presented, rescued artifacts are making their rounds around the world, stories of lost souls and survivors have been told, documentaries and movies have been made. So after the 100th anniversary, can we now close the book on Titanic? Or will it become one of the stories of humankind that we will never quite be done with? If so, it will be because, for one thing, it speaks to something perennial in the human psyche—and for another, because the story is broad and deep enough that different times and ages can find their own reflections in it.

When the disaster happened, the world was different—and I’m not talking about technology. The very mindset of the western world in 1912 was vastly different from today, because of the enormous optimism felt on two continents: the new century was going to be magnificent; the advances in medicine would soon conquer all diseases; technology would take humanity to far-away places on the planet, at break-neck speeds; politically, democracies were spreading, and war seemed like a primitive option, left behind in the turmoil of the 19th century (and few people were in a position to predict the start of the Great War (WWI) just two years later). And nature, in all its forms, would soon be conquered by human know-how and willpower. And what better symbol of the new age than the sister ships being built in Belfast, the Olympic and the Titanic? And when the Titanic, the carrier of the dream of the future, sank on the night of April 14, 1912, the dream of an invincible 20th century perished, too, and in its place rose a wave of cynicism that we have, in effect, been riding ever since.

As we all know from Cameron’s movie (if we didn’t know already): It wasn’t the architect who claimed the ship was unsinkable—the concept came from the owners and the advertisers. The sinking of Titanic gave rise to cynicism and skepticism about what authorities tell you (don’t worry, there will be another lifeboat), about what advertisers tell you, about the promises of technology and even the wisdom of applying it. In short, Titanic now became a symbol for human hubris and nemesis, and that is the mirror Titanic has held up to us for a century.  

But now? With the 100th anniversary the drumbeat of the moral lessons of Titanic is sounding a new beat, coming from James Cameron himself. Two themes are emerging that one hundred years ago were not high on the agenda; one wasn’t even on the horizon. The recently corroborated fact that of the 1500 people who died that night, a great number were 3rd class passengers, locked up in steerage like rats, without even a chance of escaping, has become a new theme: When disaster strikes, everybody suffers, but some may be suffering more than others: the have-nots. According to statistics, 75 percent of steerage passengers died, while among the first class passengers “only” 37 percent were lost. So the social aspect of Titanic as a class experience has emerged as a moral lesson, added to the hubris theme. But Cameron sees yet another moral caveat in the story of Titanic: the hubris of a planet thinking it can go full steam ahead, for the sake of profit, without worrying about icebergs. For him, Planet Earth is a Titanic forging ahead into climate change.

So is that an appropriate lesson to be learned from the story of Titanic, or does it somehow deflect and detract from the actual tragedy happening to real people 100 years ago? Are they being used merely as a means to a political end? That is up to us to decide, individually. What fascinates me is that the doomed ship can take on a new narrative role as a teacher of moral lessons that go far beyond the concerns of 100 years ago. But perhaps that is the case with all good stories; they not only tell a timeless tale, but their lesson can be adapted to new ages and different problems.

Facebook Revisited–New Policies for Professors April 25, 2011

Posted by Nina Rosenstand in Culture, Education, Nina Rosenstand's Posts, Teaching.
7 comments

It’s taken a while, but there is finally a growing realization among professors that “friending” their students is not such a good idea.  And school administrators are certainly also catching on. This from The Guardian (UK):

Teachers are being warned not to “friend” pupils on Facebook amid concerns over the blurring of boundaries between school staff’s professional and private lives.

In a fringe meeting at the National Union of Teachers’ annual conference on Sunday, teachers were told that pupils are getting access to potentially embarrassing information about teachers on their Facebook pages, while headteachers and school governors are increasingly using information posted on social networking sites to screen candidates for jobs.

Karl Hopwood, an internet safety consultant and former headteacher, told the NUT fringe meeting: “The line between private life and professional life is blurred now because of social media.”

The same concerns extend to the world of college professors and students, sharing a daily environment—but on a professional level, not a personal one. That distinction needs to be reestablished in this age of the social media, regardless of what Mark Zuckerberg may think about the declining value of the concept of privacy. I talked about the subject on this blog last year, where I explained my take on professors friending students (and got a great deal of very interesting comments), and my concerns then have only been confirmed in the past year. In the real world you have to be able to distinguish between who is your colleague, who is your client (for lack of a better word), who is your acquaintance, and who is your Friend…and then all the others who are just faces on Facebook.

Happiness is a “Moment of Grace”? January 23, 2011

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Human Nature.
2 comments

The Philosophy of Happiness is a hot topic these days; what St. Augustine said about time, I think we can safely say about happiness, too: When you don’t ask me, I know what it is—when you ask me, I don’t. Here, in The Guardian, is an interesting interview with French philosopher and novelist Pascal Bruckner who focused on happiness before happiness was cool.

Now, 10 years after its French publication, Bruckner’s treatise on the nature of happiness has finally received an English translation under the title Perpetual Euphoria: On the Duty to Be Happy. As Bruckner acknowledges, happiness is a notoriously difficult concept to pin down. We can take it to mean wellbeing, contentment, joy and pleasure, as well as several other definitions, but whatever it entails, it’s a philosophical topic that dates back to the very beginnings of the discipline.

For the ancient Greeks, happiness was synonymous with the good life. To be happy was to fulfil a harmonious role in an ordered society. Christianity replaced happiness with salvation, a life of denial for the promise of eternal bliss after death. It was the Enlightenment that returned happiness to earth. Most famously, the American Declaration of Independence guaranteed the right “to life, liberty and pursuit of happiness”.

Today, however, says Bruckner, people feel an obligation to be happy, and if they can’t live up to it, their lives collapse:

Bruckner suggests that with nothing standing between ourselves and happiness, other than our willingness to grasp it, there is a moral compulsion weighing on us to be happy – and it’s precisely this social pressure that makes so many people unhappy. “We should wonder why depression has become a disease. It is a disease of a society that is looking desperately for happiness, which we cannot catch. And so people collapse into themselves.”

 Bruckner’s book is a rich mixture of philosophy, literary learning and social observation; a cultured diagnosis rather than a populist cure. He does not believe that happiness can be reliably identified, much less measured. “Wellbeing is the object of statistics,” he says. “Happiness is not.” But he is not above issuing advice. “You can’t summon happiness like you summon a dog. We cannot master happiness, it cannot be the fruit of our decisions. We have to be more humble. Not because we should praise frailty or humility but because people are very unhappy when they try hard and fail. We have a lot of power in our lives but not the power to be happy. Happiness is more like a moment of grace.”

Bruckner is at pains to emphasise that happiness has more in common with an accident than a self-conscious choice. Interestingly, the origin of the word lies in the Old Norse word for chance: happ. But leaving happiness to chance, warns Bruckner, is not the same as ignoring it. “It’s said that if you don’t look for happiness, it will come. In fact, it’s not so easy. If you turn your back on happiness, you might miss it. It’s a catch-22 and I don’t think there’s any way out, except perhaps that real happiness doesn’t care about happiness. You can reach it only indirectly.”

But how similar are we in our experience of happiness? As much as I am skeptical of the merits of relativism, it is obvious that different cultures have different views of the achievement and experience of happiness;  the sense of happiness achieved by a Frenchman may be ontologically and morally different than that of a Dane (and of course the Danes, my ancestral people, are supposed to be the happiest people on Earth).  Here we have a good example of a field of research that needs input from psychology, neuroscience, and anthropology, with a dash of literature and poetry, and a philosopher’s touch to tie it all together. Looking forward to reading Bruckner’s text.

Imagining John Lennon December 8, 2010

Posted by Nina Rosenstand in Art and Music, Culture, Current Events, Nina Rosenstand's Posts.
3 comments

Today we should commemorate another passing, but this one lies 30 years in the past. On Dec. 8, 1980, John Lennon was murdered. For my generation it is a date we remember, always, because of Lennon’s standing as a cultural personality, as well as the symbolism of his passing. Now Rolling Stone magazine has published his final interview, with audio clips. For those of you who were around on that winter’s day in 1980, it may remind you of how so many of us felt. For those of you for whom this is ancient history, maybe this will give you a bit of a feel for why Lennon was such a significant person—even a philosopher, as some would call him, and why his death was so devastating for an entire generation. Some of us see the world through different eyes now, but that doesn’t mean his words have stopped resonating, because they came from the heart of a great artist.

Here is what MTV has to say today:

It was 30 years ago today that former Beatle John Lennon was murdered by a crazed fan outside his home in New York. To mark that tragic event, fans around the world are planning commemorations of the singer’s life and legacy on Wednesday (December 8), remembering his message of peace and love and paying tribute to one of the premier songwriters of the modern era.

As part of that celebration of Lennon’s life, Rolling Stone magazine has devoted its final 2010 issue to a nine-hour interview the singer did just three days before his death on December 8, 1980. Select excerpts from the interview writer Jonathan Cott conducted with Lennon ran in a tribute issue put out by the magazine in January 1981, but the full talk sat on a shelf in Cott’s closet for nearly 30 years.

In audio excerpts from the interview on Rolling Stone‘s website, Lennon laments, “I cannot live up to other people’s expectations of me, because they’re illusory,” he said of his efforts to include positive messages of hope and togetherness in his music and the pressure to live up to his legacy. “Give peace a chance, not shoot people for peace … I only put out songs and answer questions … I cannot be 18 and be a punk … I see the world through different eyes. I still believe in love, peace and understanding, as Elvis Costello says.”

Homo Ludens—Is Playing Good for Us? November 30, 2010

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Human Nature.
9 comments

Years ago a Dutch historian, Johan Huizinga, came out with a book, Homo Ludens, “The Playing Human,” which claimed that playing is older than human culture, that even adults play for the fun of it, and that it’s good for us. That was actually an eye-opener for most people at the time. Since then the scope of play behavior analysis has been extended to studying social animals (see Bekoff and Pierce, Wild Justice), suggesting that social play allows for the development of a sense of fairness and justice, not only in humans, but in some species of animals as well.

In this article, “Why We Can’t Stop Playing,” we see the positive analysis of play continued—but this time the spotlight isn’t on playing as a social activity, but on what is very much a solitary experience: “casual games” played on our computers and our cell phones, mainly to pass the time while waiting for appointments:

Why do smart people love seemingly mindless games? Angry Birds is one of the latest to join the pantheon of “casual games” that have appealed to a mass audience with a blend of addictive game play, memorable design and deft marketing. The games are designed to be played in short bursts, sometimes called “entertainment snacking” by industry executives, and there is no stigma attached to adults pulling out their mobile phones and playing in most places. Games like Angry Birds incorporate cute, warm graphics, amusing sound effects and a reward system to make players feel good. A scientific study from 2008 found that casual games provide a “cognitive distraction” that could significantly improve players’ moods and stress levels.

Game designers say this type of “reward system” is a crucial part of the appeal of casual games like Angry Birds. In Bejeweled 2, for example, players have to align three diamonds, triangles and other shapes next to each other to advance in the game. After a string of successful moves, a baritone voice announces, “Excellent!” or “Awesome!”

In the 2008 study, sponsored by PopCap, 134 players were divided into groups playing Bejeweled or other casual games, and a control group that surfed the Internet looking for journal articles. Researchers, who measured the participants’ heart rates and brain waves and administered psychological tests, found that game players had significant improvements in their overall mood and reductions in stress levels, according to Carmen Russoniello, director of the Psychophysiology Lab and Biofeedback Clinic at East Carolina University’s College of Health and Human Performance in Greenville, N.C., who directed the study.

In a separate study, not sponsored by PopCap, Dr. Russoniello is currently researching whether casual games can be helpful in people suffering from depression and anxiety.

Hardly an incentive for further development of one’s sense of fairness and justice, like social play! But it may still have merit, if it can offset the unnaturally high levels of stress most of us labor under. For one thing, we can conclude that playing games by oneself adds an important dimension to the play behavior phenomenon; for another, I find it fascinating that the article doesn’t end with a caveat such as, “You’re just being childish, needing approval from the world,” or “If you play too much you’ll become aggressive/a mass murderer/go blind” or whatever. For decades we’ve heard about the bad influence of computer gaming, as a parallel to the supposed bad influence of violent visual fiction. But the debate is ancient: to put it into classical philosophical terms, Plato warned against going to the annual plays in Athens, because he thought they would stir up people’s emotions and thus impair their rational, moral judgment; Aristotle, who loved the theater, suggested that watching dramas and comedies would relieve tension and teach important moral lessons. In the last two or three decades most analyses of the influence of entertainment have, almost predictably, ended with a Platonic warning about the dangers of violent TV, movies, and videogames. Are we slowly moving in an Aristotelian direction? That would be fascinating, but here we should remember that Aristotle didn’t want us to OD on entertainment: the beneficial effects are only present if entertainment is enjoyed in moderation. 15 minutes of “Angry Birds” ought to be just enough…