
Are We Stories? Do We Want to Be? November 26, 2014

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy, Philosophy of Human Nature, Philosophy of Literature.

Every student of mine will know that sooner or later I will be introducing them to some story which illustrates some philosophical idea to perfection. And I am indeed a firm believer in the ability of good stories–film as well as literature–to provide the “meat” for the “bones” of a dry or complicated philosophical theory, especially in moral philosophy. Just think of Ursula Le Guin’s “The Ones Who Walk Away from Omelas” as a critical exposé of utilitarianism. The film Extreme Measures, same thing. Ethical relativism, look no further than Kingsolver’s The Poisonwood Bible. And my latest addition to the moral universe of fiction: the television series Longmire, with Sheriff Walt Longmire being the most Kantian of heroes since Will Kane in High Noon. But rarely do we get into the core of narratology, the notion of personhood being inextricably linked with the ability of a person to tell his or her own story; it is really only in my Phil 111, Philosophy in Literature class that we have the luxury of getting into that corner of philosophy and storytelling. But this is where the field first saw the light of day, in the 1980s and 1990s, with philosophers such as Alasdair MacIntyre, Paul Ricoeur, Martha Nussbaum, Daniel Dennett, and a number of literature scholars such as Wayne Booth and David Carr. The idea that we become who we are because of our capacity to “connect the dots” in our lives into a narrative whole has caught on, so that narratology today has two distinct areas: an epistemological/ontological side, where the personal narrative becomes our human mode of being, and an ethical side, where gathering the events of one’s life into a story becomes a moral requirement for being a human being with care and direction.

But now there are voices questioning the truth of “humans being storytelling animals,” at least as far as our own stories go. Because when we tell the story of our life, we are (as Ricoeur said) always in the middle: we don’t remember our beginning, and we won’t be able to tell the story of our end. In New Philosopher (11/25/2014), Patrick Stokes writes,

Biographers can describe a human life in narrative terms quite successfully, but they can only do so successfully from a certain distance, leaving out lots of trivial everyday detail. Zoom in close enough, and the ‘story’ of a human life starts to look like a pretty ineptly-scripted one, full of abandoned subplots and details that signify nothing and go nowhere.

Our lives don’t always resolve across a neat five-act structure either. 17th century French philosopher Blaise Pascal noted that the final act is always bloody, but very often that final act comes out of nowhere, a jarring interruption to the narrative coherence of our lives rather than a neat conclusion. And even if our lives are stories, we won’t be around to find out how they end.

That’s a problem for narrativists, because how stories end is central to their meaning. An alternative version of Romeo and Juliet where the protagonists survive isn’t the same story with a different ending – it’s a completely different story. The narrative meaning of everything leading up to the end turns out to be very different.

Stories have narrative shape, and only things with boundaries can have a shape. How a story begins and ends is an integral part of its narrative meaning and trajectory. But we have no idea how our lives will end, and quite possibly won’t know about it when they do. If that happens, we won’t ever have access to the final narrative meaning of our lives, we will never have known whether it was a tragic story of star-crossed loves or a tale of triumph. It’s like we’re watching a movie where we actually have some direct control of the plot, but realise we might never find out how it ends.

For one thing, Ricoeur addressed that problem in his book Oneself as Another: he suggests that we imagine our ending, and relate to the imagined unity of our life that way. We can’t control our fate, but we can influence its direction through the story we tell. But there is another problem with seeing our lives as stories, and that is something that has made me a little more reluctant to embrace the theory of us being our stories. Because a good story, in order to have a point, invariably has to involve problems, problems that will then get resolved at the end. Maybe even horrific problems, tragedies, horror stories, tales of loss and grief, the depths of human misery. Because nobody wants their life to be a comedy, right? So if our story is supposed to be serious, we must embrace the drama, the tragedy. But perhaps most of us would rather just have a boring, safe life with predictable events, just some fun, some love, some sweetness, and then whatever problems arise, get rid of them/get over them as fast as we can? But those lives don’t make great stories. In order to leave behind a worthy tale of our lives, we need to include the drama, the tragic, and then overcome it through a character arc.
Aside from the fact that most people’s lives will include tragedy whether we want it or not, it hardly seems like something to strive for, just so we can say that we improved our character. Maybe most of us would prefer to read/watch fictional stories and biographies about other people’s tragedies and hope those things don’t happen to us…

2013 was the 50th Anniversary of “The Banality of Evil” December 17, 2013

Posted by Nina Rosenstand in Criminal Justice, Culture, Current Events, Ethics, Nina Rosenstand's Posts.
While it is still 2013 I want to mark the anniversary of an important concept in moral philosophy, the re-legitimizing of the concept of evil through the writings of German philosopher Hannah Arendt. 50 years ago Arendt launched a concept that was to have an enormous influence on discussions in the field of ethics for the rest of the 20th century, discussions that continue to this day––an influence that has had wider consequences than she probably imagined. She launched the idea of the banality of evil.
Interestingly, philosophers in the early 20th century, especially within the Anglo-Saxon philosophical tradition, had all but given up on the word evil. Too much religious baggage (just think of the problem of evil, the theodicy: how can a good god allow horrible things to happen?), too judgmental, too moralizing––at a time when most English-speaking philosophers were grappling with the meaning of words rather than with the meaning of life. Back in the 19th century and further back, thinkers such as Immanuel Kant had no problem throwing themselves and their readers into discussions about the ultimate meaning and values in life, and the notion of an evil person was not alien or uncomfortable as a topic of analysis. But what is a moral philosopher to do when, in the mid-twentieth century, the ultimate worst conceivable behavior runs rampant across the continent of Europe? Call it “bad behavior”? Call it “behavior frowned upon by most people according to the standards of Western societies?” Call it “a moral choice made by a cultural subset?” Or “just another form of acculturation?” What words could a mid-twentieth century philosopher use to describe Nazi atrocities that would seem sufficient? In 1963 Arendt, a German Jew who had narrowly escaped the Holocaust, witnessed the trial in Jerusalem of captured Nazi official and organizer of the Holocaust Adolf Eichmann, and observed that to her surprise he did not look like a monster. He looked, and sounded, frighteningly normal. And that, she said in her book, Eichmann in Jerusalem: A Report on the Banality of Evil, was the terrifying key to the Nazi engagement in terror, torture and murders: The Nazi torturers were normal people who, either under pressure from superiors, or from twisted values and twisted thinking, had reached the conclusion that torturing and killing innocent human beings was the right and normal thing to do. It had become a banality––the banality of evil.
And thus was launched a renewal of the concept of evil in philosophy; removed from a religious context, it focused on the inner “Schweinehund,” our baser, cowardly instincts that make us follow the crowd, or go with the lowest common denominator when it comes to standing up for simple decency and humanity, to the point where we are willing to disregard the humanity of another person if someone in authority tells us we’re not responsible, that it is for the good of all, or that bad things will happen to us if we don’t comply. Stanley Milgram corroborated the phenomenon with his “electric shock experiments”––to see how far a person would go when told to shock people who had given the wrong answer to a question. It is a relief to know that nobody was really being shocked, but utterly disconcerting that most of Milgram’s subjects were quite willing to shock people to death when told that they had to. Banality of evil. And a decade later Philip Zimbardo ran his now infamous Stanford Prison Experiment, unintentionally resulting in real harm caused to “prisoners” by “prison guards.” The participants were merely students doing role-playing; the “prison guards” had been told to be ruthless to the “prisoners,” and some of them took to their roles with zest, and very quickly. Zimbardo himself pointed out, in his book The Lucifer Effect from 2007, that the willingness of the “prison guards” to commit atrocities was the banality of evil rising in completely normal people put into a situation where their moral compass had somehow been disengaged—a situation that he saw paralleled in Abu Ghraib. (I’ve visited the subject several times before in this blog, such as in “The Concept of Evil, and Joseph Duncan,” and “On a Scale of 1 to 22…”, so essentially I’m merely summing up what I’ve said several years ago, but with the specific anniversary of the concept in mind.)
So since the 1960s we have, in philosophy, been able to use the term “evil” for the moral weakness lurking in almost every individual which can make us follow or even give unspeakably vicious orders and forget about the humanity of other people. But is that enough? It took courage for Arendt to reintroduce the concept of evil into the philosophical vocabulary; it took integrity to give voice to a condemnation that most thinkers felt, but perhaps were too polite and well-schooled in non-judgmental meta-ethics to engage in.
And now the question is whether we’re ready to, once again, redefine the concept of evil. Because surely not all acts that cause irreparable and deliberate harm are committed by people following orders or bad role models. Sometimes some individuals make choices they know will cause harm, and they know it is considered wrong by society, but they want to do these things anyway, for gratification or profit. The march goes on, from school shootings to the imprisonment of women (sometimes strangers, sometimes daughters!) in locked rooms or basements for sexual gratification, to serial killings. Is that the banality of evil, too? I don’t believe so. Immanuel Kant would say that such choices deserve the term evil because they are (1) deliberately done, (2) with awareness that they violate society’s standards, (3) causing intense physical or psychological pain or damage to innocent people, and (4) done for selfish reasons. Is it possible that those individuals are more mentally ill than “evil”? Perhaps, but as long as our legal system allows us to recognize a criminal as “sane” if he or she has a sense of the moral rules of society, then breaking the rules is a deliberate act more so than the act of a helplessly sick person. So perhaps it is time to allow an old concept to return to our modern vocabulary of ethics? In my textbook, The Moral of the Story, I have adopted the term “egregious evil” for such behavior, to differentiate it from “banality of evil,” but it is still a good question whether we are referring to evil acts, or evil persons, because therein lies a world of difference.
Maybe in the future we can come up with a term that is better suited to describe such acts of deliberate, egregious harm, a term that will allow a society to express its moral outrage while at the same time recognizing whatever neuroscientific insight into criminal pathology we may acquire. Until then I will feel comfortable with a cautious application of the old word evil in its two applications—banal and egregious.

The Ethics of Food: Why Not Horse Meat? March 2, 2013

Posted by Nina Rosenstand in Culture, Current Events, Ethics, Food and Drink, Nina Rosenstand's Posts, Philosophy of Food.

A scandal is shaking up Europe: supermarkets have been pulling “beef” from the meat counters, because it has turned out that horse DNA is present in products sold as beef. And lately the Swedish furniture giant Ikea has been pulling their meatballs (oh no, not that!) and sausages from stores in 21 European countries. According to an AP report,

“Monday’s move comes after authorities in the Czech Republic said they had detected horse DNA in tests of 1-kilogram (2.2-pound) packs of frozen meatballs labeled as beef and pork.”

According to Ikea, their stores in the US are not affected, because they use meat from suppliers in the U.S. However, Burger King recently severed ties with an Irish supplier because of horse flesh contamination, and Taco Bell has had similar issues.

So this is now an expanding scandal–but what exactly is the problem? First of all, it is of course a matter of consumer confidence: You buy something believing it is beef, so you don’t want something that’s not beef. (If you were actually shopping for a horse burger in France, you would not appreciate it if the meat had been mixed with pork, or ostrich. It is a matter of consumer expectations.) But second of all, there’s the horse thing.

In some parts of the world they eat horse, and like it. Growing up in Europe, I was served horse burgers myself, and I didn’t much care for them; they tasted too sweet for me, like a hamburger with honey. In certain cultures in Southeast Asia dog and cat meat is on the menu. In some villages in Africa they have, at least until recently, eaten gorilla. In some remote locations in the South Pacific “long pig” was considered an acceptable food item (at least according to legends and Hollywood movies) until well into the 20th century. And we all come from distant ancestors who ate just about anything that would keep them alive. In some places they have even eaten dirt, but that’s not digestible. Meat is. Food can be many things to many people, and just because something can be digested doesn’t mean we accept it as food. Food taboos are known all over the world, and some are founded in the culture’s religion (such as the ban on consuming pork in Judaism as well as in Islam, and the ban on eating beef in Hinduism), while others reflect memories of past contaminations (and historians speculate that perhaps most food taboos have such contamination fears as their point of origin).

But some of the food taboos in a modern, largely secular culture such as ours are neither founded in religion nor based on past memories of contaminants. It isn’t inherently any more unhealthy to eat horse, dog, or cat than it is to eat beef, but most of us wouldn’t dream of serving or eating those animals, because we regard them as pets, and even as family members. So there is the familiarity factor, and the cuteness factor, but of course the food taboo can also include a “Yuck” factor such as in our reluctance to eat rats. (And how about snails? Oysters? Prairie oysters? Depends on what we’re used to. When the eponymous hero in the movie Tom Horn is served lobster for the first time, he quips, “I’ve never eaten a bug that big.”)

Our legislation doesn’t always reflect such taboos, nor is it always clear about the prohibitions and the reasoning behind them. We can’t slaughter, serve or eat dogs and cats. Up until 2011 a horse could not be slaughtered (for human consumption) in the US, but Congress then declined to extend the ban, which expired.

  In Nov. 2011, Congress decided not to extend a ban on USDA horse meat inspections. Over the five years prior to that, Congress banned the USDA from using any taxpayer funds for horse slaughter inspections through its annual budget appropriations for the department. And since the Federal Meat Inspection Act requires the USDA’s Food Safety and Inspection Service to inspect animals for slaughter, carcass by carcass, there was no way for horses to make it to American dinner tables.

 But since the ban has been lifted, there still are no protocols for the USDA to conduct equine inspections.

 “Despite a November 2011 decision by Congress not to extend the ban on horse slaughter, the USDA says there are no establishments in the United States that slaughter horses.

“It is a hugely political issue – it has to do with the slaughter of horses and whether that’s acceptable to U.S. society or not – and so there are two sides to the argument,” said William Hallman, director of the Food Policy Institute at Rutgers University in New Jersey.

 Opponents of horse slaughter essentially say eating horses is not part of American culture, equating it to the slaughter of other pets.

 “We have a 250 year relationship in the United States with horses and eating them has never been a part of the equation,” said Wayne Pacelle, president and CEO of The Humane Society of the United States. “It would be quite a turn in the road to view animals who helped us settle the country as an appetizer or main course.” “

But didn’t oxen also help us settle the country? Those big Conestoga wagons were sometimes pulled by oxen. And oxen have pulled plows. Every time we eat a steak or a burger, we bite into the remains of a steer. Some gratitude! The fact remains that our food taboos are selective, and based on feelings as well as tradition and convenience. Some people won’t eat “anything with a face.” Some won’t eat anything with a cute face. Some will eat anything as long as it no longer has a face.

How do you feel about the horse meat issue? Would you eat horse? Why or why not? And is there an inherent moral difference between eating horse, beef, pork, snake, kangaroo, or grubs? Not to mention “long pig”? Let’s assume that none of the species are endangered…So where do we draw the line? At the level of intelligence, a Kantian response? Pigs are far more intelligent than horses, according to the experts. How about according to the amount of suffering, a utilitarian approach? If emotional suffering (=fear) counts, then we all know what The Silence of the Lambs means, and animal behaviorist Temple Grandin has taught us that the fear factor is very high in animals being led to the slaughter. How about another utilitarian angle, a distinction between the suffering of one animal feeding many people vs. the suffering of one animal feeding just a few? (A steer vs. a chicken, for example.) (Or how about the choice of ethical egoism: satisfy your own needs in pursuit of your own happiness?) Regardless of our underlying moral theory we make choices, and they are grounded partly in our traditions, and partly in our feelings, rarely in dispassionate logic. So granted that our cultural choices of food are more driven by emotion than other considerations (unless we’re starving), then at what point does your food ethic kick in?

Chris Dorner: Not a Folk Hero February 14, 2013

Posted by Nina Rosenstand in Criminal Justice, Current Events, Ethics, Nina Rosenstand's Posts.

It seems the saga of former LAPD cop and spree killer Chris Dorner has now come to an end, in a way that he himself predicted: He would not survive to experience the fallout. And I suspect that many of you, like myself, have been eerily mesmerized by the unfolding story over the past week. More fortunate than most, I have been able to discuss the case with a bunch of intelligent students, and we have exchanged viewpoints. I have also listened to talk shows, read online commentaries, followed news briefs, and read most of the Manifesto which Dorner had posted to Facebook. And I’m sitting here with a very bad feeling—not just for the four people who fell victim to Dorner’s vengeful rage, and for their families, but a bad feeling about the voices in the media who somehow seem to have elevated Dorner to some sort of folk hero, a Rambo, a Jason Bourne kind of character (as a guest on a talk show pointed out). When such views have been expressed, they have generally been prefaced with, “Yes, of course what he has done is wrong, BUT he has a point,” or “Of course he shouldn’t kill people, BUT even so, he is fighting the good fight.” In other words, his actions may be wrong/over the top, but somehow it is in a noble cause.

Now that upsets me. It upsets me, because that kind of evaluation shows a fundamental misunderstanding of the connection between having a cause and taking action, and perhaps even a politically motivated willingness to overlook certain very disturbing facts in favor of some subtext that some people feel ought to be promoted, such as “the LAPD is in need of reforms.”

So let us look at what Dorner actually did (allegedly, of course): He shot and killed a young woman and her fiancé. The young woman was the daughter of an ex-cop from the LAPD who had been Dorner’s lawyer. He also shot and killed a Riverside police officer, as well as a San Bernardino deputy. In addition, he deprived three people of their right not to have their liberty interfered with (he tied up an elderly boat owner in San Diego, and two maids in Big Bear), he wounded several police officers, and he stole two cars. And for what purpose? In the Facebook Manifesto he states it clearly: Because he felt he had been wronged when fired from the LAPD in 2009, the only way to “clear his name” was to kill members of the LAPD and their families.

Martha Nussbaum, the American philosopher, says that emotions should be considered morally relevant, provided that they are reasonable, meaning that they arise as a logical response to a situation, and thus inspire moral decisions/actions that are somehow reasonable/proportionate to the event that caused the anger (Nussbaum is also a philosopher of law). So let us allow for the possibility that Dorner experienced an emotion that was a relevant response to his (perhaps) unfair dismissal from the LAPD: He was angry. But exactly what is reasonable anger? That would be (according to Aristotle, whom Nussbaum admires) righteous anger that is directed toward the right people, for the right reason, at the right time, in the right amount. But even if he was unfairly dismissed (which is a common experience for many people), and even if he had experienced racism at his workplace, would it ever be morally reasonable for him to exact revenge on the daughter of his lawyer? Or her fiancé? Neither of them had anything to do with his being fired. The murders were simply a means to cause pain to her father. (For you Kant-aficionados: Dorner used his lawyer’s daughter merely as a means to the end of getting back at him.) The moment Dorner made good on his threat to start killing the relatives of LAPD officers was the moment when he lost any claim to a moral high ground, any claim to a righteous anger or any claim to taking justifiable action. That was the moment when he went from somebody with possibly a justified grievance to merely being a thug, and a petty, selfish one at that, taking his anger out on innocent victims.

And the killing of Riverside and San Bernardino law enforcement officers? That seems to have been dictated by his poor judgment, and his attempt to escape the dragnet cast over all of Southern California, not by his manifesto. He claimed to go after LAPD officers because the LAPD had “done him wrong,” but in the end, it was Riverside and San Bernardino that lost members of their police departments. We can discuss, in the weeks to come, whether he was actually mentally stable in his final week. We can discuss whether the manifesto reveals an intelligent, reflective mind, or a person on the brink of insanity. We can discuss whether another outcome would have been possible. We can even discuss whether his manifesto made some valid points. But the fact that he broke the basic covenant that he had been taught, as a police officer, to protect and serve those who need protection, and showed abysmal disregard for the lives of innocents, resulting in a chain of events that cost additional lives, removes him from the realm of folk heroes and reduces him to merely another criminal who will be remembered for the lives he took, not for his rationale. Even if it should turn out that his original rationale was justified—he may have been right that he was treated unjustly—that does not justify in any way what he has done. And for some media voices to overlook that fact is very disturbing…

‘Tis the Season to Give (and Receive) Gifts December 9, 2012

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts.

The philosophy of giving is an interesting little branch off the big branch of Generosity, growing on the tree of Virtue Ethics. Interestingly, Generosity is entangled with another branch, Gratitude. (I even wrote something about that in The Moral of the Story, Chapter 11.) So when an article in the Wall Street Journal recently focused on giving and regifting, I thought I’d share some of its points with you. First of all, an interesting illustration:

[image]

Next, some fascinating points made:

Some gift givers spend time and energy trying to find just the right gift. But thoughtful gifts don’t necessarily lead to greater appreciation, according to a study published in November in the Journal of Experimental Psychology: General. The benefit of a thoughtful gift actually accrues mainly to the giver, who derives a feeling of closeness to the other person, the study found.

People are more appreciative when they receive a gift they have explicitly requested, according to a similar study published last year in a separate publication called the Journal of Experimental Social Psychology.

Sharon Love once received a book that was clearly regifted: It was inscribed to the giver. She gave it back to him the following year. Ms. Love, who heads a marketing agency in New York, is herself a regifter when a gift is appropriate for another person.

“It turns out it’s not the thought that counts, it’s the gift that counts,” says Nicholas Epley, a professor of behavioral science at the University of Chicago…

Oh, where to start? What a smorgasbord of philo-associations!

Psychological egoism: They’ve said it all along, we give so we’ll feel good! BUT if the receiver doesn’t appreciate our gift, we won’t feel nearly as good, so we must have at least some interest in actually pleasing someone else.

Aristotle’s Golden Mean: There are a thousand ways to miss the bull’s-eye, and only one right way to hit it. There is one right gift for our friend/mom/dad/spouse/child/colleague out there, and if we have an excellent character we will know what that is.

The Revision-of-the-Golden-Rule philosophy/The Platinum Rule: And the right thing is how they want to be treated, not what you want to give them (because that’s what you’d want yourself! Think of Homer Simpson and the bowling ball for Marge) :)

And there is more support for Aristotle here:

Another study found spending more money on a gift doesn’t necessarily translate into greater appreciation. That might come as a surprise to many gift givers, who often assume that a more expensive gift conveys a higher level of thoughtfulness, according to the research, published in 2009 in the Journal of Experimental Social Psychology.

I don’t mean to sound sanctimonious, but some of us grew up in a less materialistic world, and the idea of “the more expensive, the better” is somewhat alien to us. But there’s always the assumption that if someone is going to return our gift to the store, then it looks better if they can get another gift at the value of $50 than at $15…That’s just human nature. But again, what would Aristotle say? The Golden Mean is a mean between two extremes, too much and too little. For each situation there is an appropriate action/feeling (and purchase), and sometimes what your recipient really really wants is something small and simple. Sometimes it is huge and expensive, to be sure, but then Aristotle would say that you are guided by the Golden Mean of your ability to give, and fondness for/past history with the recipient.

And then there are thoughts about regifting, about a gifted purse:

“I thought, ‘You know, I know someone else would like it more than I would.’ So I gave it to one of my friends for her birthday,” Ms. Sayeed says. About six months later, the friend came over to Ms. Sayeed’s aunt’s house, purse in hand, and the aunt exclaimed, “You know, Humera has a purse just like that!”

“I said, ‘You know Auntie, I loved it so much that I got her the same one,’ ” Ms. Sayeed fibbed. “I had a moment to probably come clean about it and I just decided it would be better not to, which I guess is why people feel sneaky about regifting.”

So a kind little utilitarian lie is also part of the discussion…And the upshot is,

The adage “It’s the thought that counts” was largely debunked by the recent study in the Journal of Experimental Psychology: General, which concluded that gift givers are better off choosing gifts that receivers actually desire rather than spending a lot of time and energy shopping for what they perceive to be a thoughtful gift. The study found thoughtfulness doesn’t increase a recipient’s appreciation if the gift is a desirable one. In fact, thoughtfulness only seemed to count when a friend gives a gift that is disliked.

And that brings me to my final branch of this discussion, on the tree of Virtue Ethics: the virtue of Gratitude. And this is where we switch from “descriptive” to “normative.” After all, we’re not doing psychology but philosophy here. So my response would be, Then start showing some gratitude for the thought, for goodness’ sake! Gratitude is not just a feeling, but an attitude (yes I know, it actually rhymes). You can show gratitude even if you don’t have that warm, overwhelming feeling. If you wait for the feeling to arrive, somebody didn’t raise you right. So when you get that yucky something-or-other, regifted or not, then smile and say thank you, and if you can tell that somebody actually spent a lot of effort in getting that one thing to you, tell them it’s amazing how well they know you. And since I’m not a fan of regifting at all (you risk offending a kind giver irreparably), donate the gift that wasn’t perfect instead. Somebody out there in a thrift shop will thank you.

Merry Christmas/Happy Holidays!

Russell Means in Memoriam: Mitaku Oyasin October 25, 2012

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Gender, Political Philosophy, Teaching.

Many of my students have heard me talk about Russell Means over the years. A complex man in complicated times, who believed he saw a simple solution to the culture war that, in his view, still existed between the American Indian communities and mainstream America. A man who had his own vision, and sometimes version, of history. Russell Means passed away October 22 from throat cancer, and the spectrum of Americans in the public eye has lost a unique dimension.

So who was Russell Means? An Oglala Sioux Indian, with many different facets to his life. The American Indian activist Means was the chief organizer of the second Wounded Knee uprising in 1973, and was involved in various American Indian movements. His activist accomplishments are outlined here. The politician Russell Means ran as the vice presidential candidate on Larry Flynt’s presidential ticket in 1983, and ran against Ron Paul for the Libertarian presidential nomination in 1987 (Paul won). The actor Russell Means was a significant Hollywood presence, playing the iconic character Chingachgook in the 1992 movie The Last of the Mohicans—and provided a special ambiance to the 2004 season of the HBO series Curb Your Enthusiasm. His film credits are numerous, as you can see on the Wikipedia site. The businessman Russell Means ran a website where he told his story, and the story of the plight of the American Indian, and sold CDs, art and t-shirts in support of the American Indian cause. And (this is where my connection to him comes in) the lecturer Russell Means would travel around the United States to college campuses, educating new generations of students to what became his own version of American Indian history. He was a speaker at San Diego Mesa College in the 1990s, and that was where I met him. (And I should also mention: the private citizen Russell Means had domestic problems, and was arrested for assault and battery against his father-in-law on the Navajo reservation. And he had more severe legal problems earlier on, when he was indicted for murder on a reservation, but acquitted.) And his image is familiar to Andy Warhol fans–Warhol painted Means 18 times.

But back to the lecturer persona. I wish I could tell you exactly when Means visited the Mesa College campus, but I don’t see any references to his visit on the Mesa College website, and all I remember is that it was in the early years when I was first teaching Philosophy of Women; so: the late 1990s. He was scheduled to give a talk in the room behind the cafeteria, and I decided to bring one of my classes to the talk. The room was packed with people, sitting on chairs, standing, sitting on the floor, and Means, 6 ft tall or more, hair in braids, was a very imposing sight, and a gifted speaker. Well, he kept on talking past the class period, and I ran back to the classroom and collected the students waiting for my next class, and brought them down to the cafeteria; Means was still talking. And he kept on talking for a good four hours, about what it is like to be an American Indian, about his battles and his careers, about American Indian traditions, about discrimination and near-genocide, and about the term “Indian” itself. He shocked most of us PC college people by declaring that he didn’t mind being called an Indian, and that the proper term to use was either “American Indian” or the tribal name such as Oglala Sioux Indian, but never Native American. He said it was a term invented by Washington for funding/political purposes (which is why I, to this day—and my students and readers of my books can verify that—always use the term American Indian). And the term Indian itself? Wikipedia (below) got his argument right, but whether it is also historically correct is something I can’t determine (and I have yet to find a historian who agrees with Means): 

Since the late 20th century, there has been a debate in the United States over the appropriate term for the indigenous peoples of North America. Some want to be called Native American; others prefer American Indian. Means said that he preferred “American Indian”, arguing that it derives not from explorers’ confusion of the people with those of India, but from the Italian expression in Dio, meaning “in God”.[17][18] In addition, Means noted that since treaties and other legal documents in relation to the United States government use “Indian”, continuing use of the term could help today’s American Indian people forestall any attempts by others to use legal loopholes in the struggle over land and treaty rights.

In addition he talked about what I referred to above as a culture war between the mainstream Euro-American tradition and the American Indian peoples. He said the Euro-Americans are a culture of swords and violent domination, while the American Indians are a sharing culture, a culture of partnerships symbolized by a bowl or drinking cup. At that point my ears perked up, because I had just been reading Riane Eisler’s book The Chalice and the Blade, about the ancient gynocentric (female-oriented) partnership cultures of Old Europe symbolized by the chalice, vs the invading patriarchal dominator cultures worshipping the “lethal Blade.” So I asked Means if there was a connection, and he said, “Yes, there is this woman who wrote a book about the same phenomenon in Europe, and it fits the situation in this country between the indigenous peoples and the invaders.” From a gender-philosophical standpoint I found it fascinating that he would adopt the Eisler theory as an explanatory model for the American Indian culture (even if Eisler is also considered an activist and in no way a historian). I tend to be skeptical of such arguments, which tend to simplify very complex matters and fan an ongoing (and possibly outdated) tension and enmity, and I see no reason to find Eisler’s theory more historically accurate simply because Means liked it, but I found the confluence of research, activism and tradition to be intriguing. If you want to experience him talking about the topic of partnership cultures and gynocentric (matriarchal) cultures, watch this YouTube clip.

Means ended his 4-hour lecture at Mesa by teaching his audience the end of every Oglala and Lakota prayer: two words that embrace all of creation, everywhere and for all time: Mitaku Oyasin: We are all related. And while much of his lecture was, to a scholar such as myself, a creative journey into personal interpretations rather than facts (and sometimes interpretations that were hard to swallow), his passionate sincerity rang true, and has stayed with me as a cherished memory. And his prayer still comes to mind sometimes when I’m looking for connections and common ground rather than analytical differences. So: Thanks for the lessons, Russell, and Mitaku Oyasin…

Cross-posted at Rosenstand’s Alternative Voice blog for Rosenstand’s Mesa students.

Kevin Coe Revisited: Locked Up for Life September 29, 2012

Posted by Nina Rosenstand in Criminal Justice, Current Events, Nina Rosenstand's Posts.

I had to look up my original post about Kevin Coe, and to my surprise it’s been four years since I wrote it. The wheels of justice grind slowly, or at least the wheels of the justice system. If you go back to that post from October 2008, you can refresh your memory about the serial rapist Fred (Kevin) Coe, who had been serving a 25-year sentence for one rape, the only conviction they could make stick after several others were overturned because of a technicality. When the date for Coe’s release was approaching, the state of Washington decided that he was too dangerous to be let loose, so a new law was applied that allowed the WA court system to commit Coe to a mental institution. And now Coe’s final WA Supreme Court appeal has been denied, and his life will be spent on McNeil Island (barring new developments, of course).

Coe and his attorneys appealed his commitment on the argument that Superior Court Judge Kathleen O’Connor erred when she allowed assistant attorneys general to introduce evidence from 36 sexual assaults that did not result in criminal charges against Coe. They also contended that he had ineffective counsel and that he deserved a new trial.

“Finding no reversible error in any of Coe’s claims, we affirm his commitment,” Justice Susan Owens wrote for the majority.

“In general, I believe allegations of uncharged crimes should not be admitted into evidence,” Chambers wrote. “Experts should not act as funnels to allow lawyers to get into evidence through their expert opinion what is otherwise inadmissible.”

Yet, Chambers noted that he agreed with the decision to uphold the conviction because of the “unusual elements” of the case.

“In many, though not all, of the uncharged crimes, the perpetrator put fingers into victims’ mouths; attempted to induce the victim to urinate or defecate upon him; and asked personal and offensive questions,” he wrote. “The overwhelming untainted evidence supports the jury’s verdict.”

The decision likely ends Coe’s legal pursuit of exoneration after his arrest in 1981, following dozens of rapes attributed to the so-called South Hill rapist.

In my post from 2008 I expressed my satisfaction with the judges’ decision, but at the same time I found it disturbing that principles seemed to be pushed aside for the sake of pragmatism. I can’t say I’ve changed my mind—I am still relieved that Coe is not going to be out and about, and I still find a forward-looking approach to justice disquieting. But look how common sense prevailed, in Chambers’s note. They had the guilty person all these years; he made so many women suffer and never had to answer for it except in one case; and now—he is not being punished for something he hasn’t been convicted of doing, but the community is being kept safe based on evidence of habitual activities. Justice.

Intoxication September 27, 2012

Posted by Dwight Furrow in Dwight Furrow's Posts, Food and Drink, Philosophy of Food.


Cross-Posted at Edible Arts

A bottle of Jack Daniels is intoxicating if you drink enough of it. The ambient music of Steve Roach is intoxicating as well. Clearly, they are not intoxicating in the same way.

The Jack Daniels will cause drunkenness; but the experience of drinking plays no role in the intoxication just as the experience of taking a sleeping pill has nothing to do with its effects. The effect is all that matters, and you will be just as intoxicated if you drink absentmindedly. Drunkenness is an experience, but it is an experience in which our attention is not directed at anything in particular.

By contrast, with music, the listening itself is crucial to the intoxication. The intoxication is not just an effect of the music; the experience itself, and the attention we give to it, is a necessary component. The hearing is itself intoxicating, and the experience is about something—namely the music.

Happily, wine is intoxicating in both respects. In sufficient quantities it causes the intoxication of drunkenness but the experience of tasting wine is itself intoxicating. The smells, flavors, and textures of wine can be moving and exhilarating just as the sounds of music can be when we direct our attention to them.

As Roger Scruton writes in his article “The Philosophy of Wine” (available in this anthology):

“The intoxication that I feel is not just caused by the wine: it is, to some extent at least, directed at the wine, and not just a cause of my relishing the wine, but in some sense a form of it. The intoxicating quality and the relishing are internally related, in that the one cannot be properly described without reference to the other….I have not swallowed the wine as I would a tasteless drug; I have taken it into myself, so that its flavour and my mood are inextricably bound together.”

Scruton’s analysis seems right up to a point, but I doubt that aesthetic intoxication is wholly unrelated to the mild, alcoholic buzz induced by wine. The flush of exhilaration caused by the alcohol (in small quantities) seems to sharpen one’s anticipation, and lends itself to feelings of enchantment that may influence our perceptions and judgments. Even in contexts where I taste and evaluate many wines, and must spit and dump to remain sharp, enough alcohol is absorbed through mouth tissue and accidental swallowing to influence my mood. The attentional focus of relishing and savoring is important, but I doubt that it is the whole story.

The intoxication of music may also depend on effects that go beyond savoring. Music influences our moods and expectations in ways that are likely to profoundly influence our judgments about the music.

Recent research has demonstrated the role of neurotransmitters in our enjoyment of music:

Our experience of the music we love stimulates the pleasure chemical dopamine in our brain, concludes a new study produced by a slew of scholars at McGill University. The researchers followed the brain patterns of test subjects with MRI imaging, and identified dopamine streaming into the striatum region of their forebrains “at peak emotional arousal during music listening.”

Not only that, but the scientists noticed that various parts of the striatum responded to the dopamine rush differently. The caudate was more involved during the expectation of some really nice musical excerpt, and the nucleus accumbens took the lead during “the experience of peak emotional responses to music.”

In other words, just the anticipation of our favorite passage stimulates the production of dopamine.

I doubt that this kind of influence necessarily involves critical reflection, although the study does not explicitly address this point. It is also not  surprising that increased levels of dopamine are implicated in drunkenness.

Scruton wants to distinguish between intoxication I (drunkenness) and intoxication II (aesthetic appreciation) by insisting that relishing or savoring—a kind of critical inspection—is involved in the latter but not in the former. It is that moment of thoughtful reflection, and our ability to form a representation of the music or wine, that enables us to appreciate the finer points of wine or music.

But I doubt that the content of that critical inspection can be sharply distinguished from causal effects of the wine or music that may not be part of our representation of the wine or music.

It may be that the two forms of intoxication are more closely related than Scruton allows.

Dionysus would not be surprised.

Neandertals Adorned with Feathers, Thinking Symbolically September 22, 2012

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.

Here is a wonderful example of why I, as a philosopher, have a passion for every bit of new info and speculation coming out about human evolution. To me there is no deeper philosophical question than the one about human identity: Who are we? Who were we? And how do we differ from those who are our close relatives today (the apes), and from those who were our even closer relatives in the past (now three separate, relatively recent groups of hominins coexisting with early Homo sapiens: the Neandertals, the Denisovans, and the elusive “Hobbits”, Homo floresiensis)? The categories we have used to mark our extraordinary human nature have been steadily challenged in recent decades. We used to be the only tool users. Then, because we found that apes (and birds) use tools, too, we became the only tool makers. But apes and birds make tools, too. So we became the only rational species. Ah, but now it turns out that many other species are quite capable of basic reasoning. Then we were the only species that has self-recognition. But so do apes, dolphins, elephants, ravens, magpies, pigs, and maybe even (if we are to believe the very latest findings) all big-brained, social species. But aren’t we at least the only ones who deliberately create art, and use body decorations? Because a brain that can conceive of art and decorations is capable of thinking symbolically. As late as ten years ago the great anthropologist Ian Tattersall claimed that humans were the only ones with the capacity for symbolic thinking. The Neandertals, with their big brains, still didn’t count as a self-aware species because they didn’t have symbolic thinking. Well, according to Scientific American blogger Kate Wong, they did:

Experts agree that Neandertals hunted large game, controlled fire, wore animal furs and made stone tools. But whether they also engaged in activities deemed to be more advanced has been a matter of heated debate. Some researchers have argued that Neandertals lacked the know-how to effectively exploit small prey, such as birds, and that they did not routinely express themselves through language and other symbolic behaviors. Such shortcomings put the Neandertals at a distinct disadvantage when anatomically modern humans availed of these skills invaded Europe—which was a Neandertal stronghold for hundreds of thousands of years—and presumably began competing with them, so the story goes.

Over the past couple decades hints that Neandertals were savvier than previously thought have surfaced, however. Pigment stains on shells from Spain suggest they painted, pierced animal teeth from France are by all appearances Neandertal pendants. The list goes on. Yet in all of these cases skeptics have cautioned that the evidence is scant and does not establish that such sophistication was an integral part of the Neandertal gestalt.

But now some new results have come in: Neandertals, across western Eurasia, wore feathers they harvested from birds of prey—in particular black feathers.

Exactly what the Neandertals were doing with the feathers is unknown, but because they specifically sought out birds with dark plumage, the researchers suspect that our kissing cousins were festooning themselves with the resplendent flight feathers. Not only are feathers beautiful, they are also lightweight, which makes them ideal for decoration, Finlayson points out. “We don’t think it’s a coincidence that so many modern human cultures across the world have used them.”

Speakers at a conference on human evolution held in Gibraltar last week extolled the study, and agreed with the team’s interpretation of the remains as evidence that Neandertals adorned themselves with the feathers as opposed to using them for some strictly utilitarian purpose. If the cutmarked bones from Gibraltar had been found in association with early modern humans, researchers would assume that the feathers were symbolic, paleoanthropologist John Hawks of the University of Wisconsin notes. The same standards should apply to Neandertals. “We’ve got to now say that Neandertals were using birds. Period. They were using them a lot. They were wearing around their feathers,” he comments. “They clearly cared. A purely utilitarian kind of person does not put on a feathered headdress.”

So. The Neandertals had symbolic thinking after all. (And those researchers who pointed out, over ten years ago, that the jewelry found in Neandertal archeological sites would indicate as much, as well as the little fact that they buried their dead, can now feel vindicated.) And how far back in time did the symbolic, self-aware thinking originate?

 

“[This] is something many of us thought was unique to Homo sapiens,” [John] Shea adds. “But [it] turns out to be either convergently evolved with Neandertals or more likely something phylogenetically ancient we simply haven’t picked up in the more ancient archaeological record. It’s probably something [our common ancestor] Homo heidelbergensis did, we just haven’t found archaeological evidence for it yet.”

Homo heidelbergensis. At least 500,000 years ago. So we are not unique in our symbolic thinking. Now that doesn’t mean humans are not exceptional. Of course we are. We have managed to extend our influence and interest into space (literally), and time, by our research and imagination, reaching into the dim past as well as affecting and imagining possible futures. We can leave our legacy through our languages, our imagery (provided it doesn’t all go digital and disappear), our artifacts, our music, our buildings (and also the strip mines, the polluted lakes, the mass graves of discarded civilians, and all the other less wonderful stuff that is part of human history). Our reach, for better and for worse, is far greater than that of the other social animals on this planet. But the point is, it now seems to be fundamentally a matter of degree, not of a radically different kind.

Authenticity and Food Rules August 30, 2012

Posted by Dwight Furrow in Dwight Furrow's Posts, Food and Drink, Philosophy of Food.

Cross-posted at Edible Arts

Cooking is an art mired in tradition. Each nation has its food rules encrusted with the patina of age, and each region within each nation has its way of doing things that seems natural and “right”; violations are met with moral indignation and contempt.

In Italy, grated cheese is never added to seafood, oil and vinegar is the only proper salad dressing, and coffee is never consumed during a meal. In France, salad is always eaten after the main dish, never before, and ketchup is not a condiment for pommes frites. Even in the “anything goes” United States, beans are part of a chili recipe only in certain regions of the country, and you do not eat Carolina barbecue in Texas.

But of what value is authenticity? Does it matter if these rules are followed or broken?

Italian chef Sara Jenkins points out that such “food rules” conceal more than they reveal.

Italian food and flavors changed dramatically after 1492 with the influx of the New World fruits and vegetables — tomatoes, corn, beans, peppers, potatoes — that were gradually integrated over four centuries of gardening and cooking and are at the core of today’s version of Italian food. If we wanted to be really authentic with Italian food, shouldn’t we do away with all the invasive species? Doesn’t that make tomato sauce and polenta inauthentic?

“Food rules” ignore the fact that all food traditions have been influenced by outsiders; all are a hybrid hash of influences thrown together by the movement of populations. Whatever “authenticity” means, it cannot mean pure or unadulterated.

Authenticity is not about origins but about the commitments people make and what those commitments reveal about their sensibility. There is a reason why tomato sauces marry nicely with pasta and why a tomato served with olive oil and basil is heavenly. Tomatoes may not be originally Italian, but Italians have done wonderful things with tomatoes. They committed themselves to tomatoes, discovered how they resonate with their local ingredients, and now there is a certain way with tomatoes that is uniquely Italian.

So should we just throw out the food rules?  I think not. Food rules must be respected because they set the table for innovation—they define the standards that innovation must meet. Food rules say: “If you want to violate this tradition it better be good.” Without tradition, innovation is just novelty.

However, anyone who is just a slave to tradition and rigidly conforms without entertaining new ideas is violating the very identity of the tradition—its ability to be affected. That is, after all, what sensibility is. What makes traditions great—and this is certainly true of Italian food traditions—is their capacity to seamlessly absorb new influences.

Tradition and authenticity are not opposed to innovation–they depend on it. No tradition can remain alive if it does not innovate by accepting and transforming influences from abroad. Jenkins wonders whether innovations can be authentic:

I have found the combination of soy sauce and extra virgin olive oil to be delicious. Is that a bad thing? It’s certainly inauthentic right now, but will it be considered a standard element in Italian cuisine 50 years from now?

If there is something about Italian cuisine that is enhanced by soy sauce, then soy sauce will become authentically Italian. If I should hazard a guess, it will gain entry as an addition to fig puree or the secret ingredient in a meat ragu. Or perhaps, if Chef Jenkins is bold, she will offer it as a variation on Florentine Steak.
