
Are We Stories? Do We Want to Be? November 26, 2014

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy, Philosophy of Human Nature, Philosophy of Literature.
1 comment so far

Every student of mine knows that sooner or later I will introduce them to some story that illustrates a philosophical idea to perfection. And I am indeed a firm believer in the ability of good stories–film as well as literature–to provide the “meat” for the “bones” of a dry or complicated philosophical theory, especially in moral philosophy. Just think of Ursula Le Guin’s “The Ones Who Walk Away from Omelas” as a critical exposé of utilitarianism. The film Extreme Measures, same thing. Ethical relativism? Look no further than Kingsolver’s The Poisonwood Bible. And my latest addition to the moral universe of fiction: the television series Longmire, with Sheriff Walt Longmire being the most Kantian of heroes since Will Kane in High Noon. But rarely do we get into the core of narratology, the notion of personhood being inextricably linked with a person’s ability to tell his or her own story; it is really only in my Phil 111, Philosophy in Literature, class that we have the luxury of getting into that corner of philosophy and storytelling. But this is where the field first saw the light of day, in the 1980s and 1990s, with philosophers such as Alasdair MacIntyre, Paul Ricoeur, Martha Nussbaum, and Daniel Dennett, and a number of literary scholars such as Wayne Booth and David Carr. The idea that we become who we are through our capacity to “connect the dots” of our lives into a narrative whole has caught on, and narratology today has two distinct areas: an epistemological/ontological side, on which the personal narrative becomes our human mode of being, and an ethical side, on which gathering one’s events into a story becomes a moral requirement for being a human being with care and direction.

But now there are voices questioning the truth of “humans being storytelling animals,” at least as far as our own stories go. Because when we tell the story of our life, we are (as Ricoeur said) always in the middle: we don’t remember our beginning, and we won’t be able to tell the story of our end. In New Philosopher (11/25/2014), Patrick Stokes writes,

Biographers can describe a human life in narrative terms quite successfully, but they can only do so successfully from a certain distance, leaving out lots of trivial everyday detail. Zoom in close enough, and the ‘story’ of a human life starts to look like a pretty ineptly-scripted one, full of abandoned subplots and details that signify nothing and go nowhere.

Our lives don’t always resolve across a neat five-act structure either. 17th century French philosopher Blaise Pascal noted that the final act is always bloody, but very often that final act comes out of nowhere, a jarring interruption to the narrative coherence of our lives rather than a neat conclusion. And even if our lives are stories, we won’t be around to find out how they end.

That’s a problem for narrativists, because how stories end is central to their meaning. An alternative version of Romeo and Juliet where the protagonists survive isn’t the same story with a different ending – it’s a completely different story. The narrative meaning of everything leading up to the end turns out to be very different.

Stories have narrative shape, and only things with boundaries can have a shape. How a story begins and ends is an integral part of its narrative meaning and trajectory. But we have no idea how our lives will end, and quite possibly won’t know about it when they do. If that happens, we won’t ever have access to the final narrative meaning of our lives, we will never have known whether it was a tragic story of star-crossed loves or a tale of triumph. It’s like we’re watching a movie where we actually have some direct control of the plot, but realise we might never find out how it ends.

For one thing, Ricoeur solved that one in his book Oneself as Another: he suggests that we imagine our ending and relate to the imaginary unity of our life that way. We can’t control our fate, but we can influence its direction through the story we tell. But there is another problem with seeing our lives as stories, and it has made me a little more reluctant to embrace the theory that we are our stories. A good story, in order to have a point, invariably has to involve problems, problems that then get resolved at the end. Maybe even horrific problems: tragedies, horror stories, tales of loss and grief, the depths of human misery. Because nobody wants their life to be a comedy, right? So if our story is supposed to be serious, we must embrace the drama, the tragedy. But perhaps most of us would rather just have a boring, safe life with predictable events, just some fun, some love, some sweetness, and whatever problems arise, we get rid of them or get over them as fast as we can. But those lives don’t make great stories. In order to leave behind a worthy tale of our lives, we need to include the drama, the tragic, and then overcome it through a character arc.
Aside from the fact that most people’s lives will include tragedy whether we want it or not, tragedy hardly seems like something to strive for just so we can say that we improved on our character. Maybe most of us would prefer to read/watch fictional stories and biographies about other people’s tragedies, and hope those things don’t happen to us…

Neandertals Adorned with Feathers, Thinking Symbolically September 22, 2012

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Here is a wonderful example of why I, as a philosopher, have a passion for every bit of new information and speculation coming out about human evolution. To me there is no deeper philosophical question than the one about human identity: Who are we? Who were we? And how do we differ from those who are our close relatives today (the apes), and from those who were our even closer relatives in the past (we now know of three separate, relatively recent groups of hominins coexisting with early Homo sapiens: the Neandertals, the Denisovans, and the elusive “Hobbits,” Homo floresiensis)? The categories we have used to mark our extraordinary human nature have been steadily challenged over the last few decades. We used to be the only tool users. Then, because we found that apes (and birds) use tools, too, we became the only tool makers. But apes and birds make tools, too. So we became the only rational species. Ah, but now it turns out that many other species are quite capable of basic reasoning. Then we were the only species that has self-recognition. But so do apes, dolphins, elephants, ravens, magpies, pigs, and maybe even (if we are to believe the very latest findings) all big-brained, social species. But aren’t we at least the only ones who deliberately create art and use body decorations? Because a brain that can conceive of art and decorations is capable of thinking symbolically. As late as ten years ago, the great anthropologist Ian Tattersall claimed that humans were the only ones with the capacity for symbolic thinking. The Neandertals, with their big brains, still didn’t count as a self-aware species because they didn’t have symbolic thinking. Well, according to Scientific American blogger Kate Wong, they did:

Experts agree that Neandertals hunted large game, controlled fire, wore animal furs and made stone tools. But whether they also engaged in activities deemed to be more advanced has been a matter of heated debate. Some researchers have argued that Neandertals lacked the know-how to effectively exploit small prey, such as birds, and that they did not routinely express themselves through language and other symbolic behaviors. Such shortcomings put the Neandertals at a distinct disadvantage when anatomically modern humans possessed of these skills invaded Europe—which was a Neandertal stronghold for hundreds of thousands of years—and presumably began competing with them, so the story goes.

Over the past couple of decades hints that Neandertals were savvier than previously thought have surfaced, however. Pigment stains on shells from Spain suggest they painted; pierced animal teeth from France are by all appearances Neandertal pendants. The list goes on. Yet in all of these cases skeptics have cautioned that the evidence is scant and does not establish that such sophistication was an integral part of the Neandertal gestalt.

But now some new results have come in: Neandertals across western Eurasia wore feathers they harvested from birds of prey—in particular black feathers.

Exactly what the Neandertals were doing with the feathers is unknown, but because they specifically sought out birds with dark plumage, the researchers suspect that our kissing cousins were festooning themselves with the resplendent flight feathers. Not only are feathers beautiful, they are also lightweight, which makes them ideal for decoration, Finlayson points out. “We don’t think it’s a coincidence that so many modern human cultures across the world have used them.”

Speakers at a conference on human evolution held in Gibraltar last week extolled the study, and agreed with the team’s interpretation of the remains as evidence that Neandertals adorned themselves with the feathers as opposed to using them for some strictly utilitarian purpose. If the cutmarked bones from Gibraltar had been found in association with early modern humans, researchers would assume that the feathers were symbolic, says paleoanthropologist John Hawks of the University of Wisconsin. The same standards should apply to Neandertals. “We’ve got to now say that Neandertals were using birds. Period. They were using them a lot. They were wearing around their feathers,” he comments. “They clearly cared. A purely utilitarian kind of person does not put on a feathered headdress.”

So. The Neandertals had symbolic thinking after all. (And those researchers who pointed out, over ten years ago, that the jewelry found at Neandertal archeological sites would indicate as much, not to mention the little fact that they buried their dead, can now feel vindicated.) And how far back in time did the symbolic, self-aware thinking originate?


“[This] is something many of us thought was unique to Homo sapiens,” [John] Shea adds. “But [it] turns out to be either convergently evolved with Neandertals or more likely something phylogenetically ancient we simply haven’t picked up in the more ancient archaeological record. It’s probably something [our common ancestor] Homo heidelbergensis did, we just haven’t found archaeological evidence for it yet.”

Homo heidelbergensis. At least 500,000 years ago. So we are not unique in our symbolic thinking. Now that doesn’t mean humans are not exceptional. Of course we are. We have managed to extend our influence and interest into space (literally), and time, by our research and imagination, reaching into the dim past as well as affecting and imagining possible futures. We can leave our legacy through our languages, our imagery (provided it doesn’t all go digital and disappear), our artifacts, our music, our buildings (and also the strip mines, the polluted lakes, the mass graves of discarded civilians, and all the other less wonderful stuff that is part of human history). Our reach, for better and for worse, is far greater than that of the other social animals on this planet. But the point is, it now seems to be fundamentally a matter of degree, not of a radically different kind.

Scientists: Humans and Non-Humans—We Are All Conscious August 26, 2012

Posted by Nina Rosenstand in Animal Intelligence, Current Events, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

A watershed of an event happened recently–if you’re in any way interested in the nature of consciousness. My students from Phil 107 and 108, and readers of my book The Human Condition, know how vital I consider this topic, both in its ontological and its ethical aspects. I hope to expand this post later. For now, let me just share the URLs and a few quotes:

http://io9.com/5937356/prominent-scientists-sign-declaration-that-animals-have-conscious-awareness-just-like-us

An international group of prominent scientists has signed The Cambridge Declaration on Consciousness in which they are proclaiming their support for the idea that animals are conscious and aware to the degree that humans are — a list of animals that includes all mammals, birds, and even the octopus. But will this make us stop treating these animals in totally inhumane ways?

 While it might not sound like much for scientists to declare that many nonhuman animals possess conscious states, it’s the open acknowledgement that’s the big news here. The body of scientific evidence is increasingly showing that most animals are conscious in the same way that we are, and it’s no longer something we can ignore.

http://www.huffingtonpost.com/christof-koch/consciousness-is-everywhere_b_1784047.html

The two principal features that distinguish people from other animals are our hypertrophied ability to reflect upon ourselves (self-consciousness) and language. Yet there is little reason to deny consciousness to animals simply because they are mute or, for that matter, to premature infants because their brains are not fully developed. There is even less reason to deny it to people with severe aphasia who, upon recovery, can clearly describe their experiences while they were incapable of speaking. The perennial habit of introspection has led many intellectuals to devalue the unreflective, nonverbal character of much of life. The belief in human exceptionalism, so strongly rooted in the Judeo-Christian view of the world, flies in the face of all evidence for the structural and behavioral continuity between animals and people.

And here is the declaration in its entirety:

http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf

The Young Brain—Why Does it Take So Long to Grow Up? January 30, 2012

Posted by Nina Rosenstand in Education, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Welcome to the Spring 2012 semester, where we will post occasional blog entries as our schedules and moods allow! Here is something that I think will interest those of you who are under 25, or happen to know someone who is! Finally we understand the adolescent brain, and furthermore, we understand that the adolescent brain lasts well into the young adult years these days, because what makes a brain “adult” is that it has responsibilities. Uh-oh! Does that mean some people will never grow up? Maybe…and there is a name for that: the Peter Pan Syndrome. Perhaps there will be a neurological explanation for that, now…

But in the meantime, this is what professor of psychology Alison Gopnik writes in her article, “What’s Wrong With the Teenage Mind?”: Puberty is happening earlier, but adulthood seems to be delayed. So we will have to live with “teenage weirdness” longer than in past centuries.

The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again.

The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards.

Recent studies in the neuroscientist B.J. Casey’s lab at Cornell University suggest that adolescents aren’t reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults. Think about the incomparable intensity of first love, the never-to-be-recaptured glory of the high-school basketball championship.

The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.

This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience. You come to make better decisions by making not-so-good decisions and then correcting them.

In the past (from hunter-gatherer societies all the way up to quite recently) those two systems were in sync, but they no longer are.

The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences. The pediatrician and developmental psychologist Ronald Dahl at the University of California, Berkeley, has a good metaphor for the result: Today’s adolescents develop an accelerator a long time before they can steer and brake.

This doesn’t mean that adolescents are stupider than they used to be. In many ways, they are much smarter. An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school, and there is even some evidence that higher IQ is correlated with delayed frontal lobe development….

But there are different ways of being smart. Knowing physics and chemistry is no help with a soufflé. Wide-ranging, flexible and broad learning, the kind we encourage in high-school and college, may actually be in tension with the ability to develop finely-honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies. For most of our history, children have started their internships when they were seven, not 27.

Recognize the problems of Will Hunting in Good Will Hunting? He has all the theoretical knowledge in the world, but no idea how to live (and doesn’t even dare to try). So what to do about it? Gopnik suggests increasing the young person’s level of varied, hands-on experience, an extended apprenticeship-adolescence with responsibilities:

Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship. AmeriCorps, the federal community-service program for youth, is an excellent example, since it provides both challenging real-life experiences and a degree of protection and supervision.

“Take your child to work” could become a routine practice rather than a single-day annual event, and college students could spend more time watching and helping scientists and scholars at work rather than just listening to their lectures. Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.

Hmmm…maybe we professors should recruit teams of secretaries and teaching assistants from among our students, for their own good?


So Did or Did We Not Interbreed with Neandertals? August 26, 2011

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
4 comments

Back on the perch, folks–time for a new season of occasional insights or at least sharing of interesting stories from the web!

Only last week I watched another show in a long line of mockumentaries/supposedly nonfictional shows with a good deal of play-acting about Neandertals and early humans, the Cro-Magnons. I’m a sucker for those. I love to see human actors in some kind of crude make-up depicting the latest ideas of what our closest relatives ever on this planet may have looked and acted like. I also love to see the scientists act in front of the camera, in fairly minimal make-up. But I was surprised to see that the scientists interviewed came down massively against the idea that there might be Neandertal DNA in the human gene pool—after all, in 2010, after the Neandertal genome was decoded, Max Planck Institute researcher Svante Paabo was quoted as saying that 1-4 percent of the genetic material in the human populations that left Africa around 60,000 years ago came from sexual encounters with Neandertals. So why the categorical denial? Of course it could have been a dated show, but my impression was that it was recent. The fact that the show was centered around renowned paleoanthropologist Ian Tattersall could have had something to do with it—he has, for years, argued that (1) there is in all likelihood no genetic connection between living humans and Neandertals, and (2) Neandertals probably couldn’t speak or even think rationally because they lacked symbolic thinking. (The fact that crude jewelry has been found among Neandertal remains apparently hasn’t been enough to change his mind, although philosophically I’d have to say that deliberately adorning oneself with body art/ornaments shows some kind of symbolic thought, and their brain and throat structures do not exclude the power of speech.) Otherwise the show had interesting moments, such as floating the theory that perhaps the Cro-Magnons didn’t actually exterminate the Neandertals by force, but by transferring diseases to them to which they had no immunity, much like what happened to the American Indian population in the 19th century.

And then we have the news, now quoted and tweeted all over cyberspace, that it seems that we—at least the descendants of those who migrated out of Africa—have Neandertal DNA in our genes after all! And it may have helped us become the extraordinarily successful species that we are (at least in the short term–who knows how long we’ll last?) by adding an immunity boost to our constitution. That, and possible interbreeding with that mysterious, newfound Siberian hominin species, the Denisovans, may have secured our survival:

Indeed, DNA inherited from Neanderthals and newly discovered hominids dubbed the Denisovans has contributed to key types of immune genes still present among populations in Europe, Asia and Oceania. And scientists speculate that these gene variants must have been highly beneficial to modern humans, helping them thrive as they migrated throughout the world.

This DNA has had “a very profound functional impact in the immune systems of modern humans,” said study first author Laurent Abi-Rached, a postdoctoral researcher in the lab of senior author Peter Parham of the Stanford University School of Medicine.

From the analysis, the scientists estimated, for example, that more than half of the genetic variants in one HLA gene in Europeans could be traced to Neanderthal or Denisovan DNA. For Asians, that proportion was more than 70%; in people from Papua New Guinea, it was as much as 95%.

“We expected we’d see some, but the extent that these contributed to the modern [genomes] is stunning,” Abi-Rached said of the findings, released Thursday by the journal Science.

Though the researchers haven’t proved it, the vast reach of these gene variants in people today suggests that they probably gave some early modern humans an advantage over others, he said.

Our ancestors’ HLA systems may have been perfectly tailored for Africa but naive to bacteria, viruses and parasites that existed in Europe or Asia, rendering them susceptible to disease.

Mating and mixing their genomes with those of their Neanderthal and Denisovan relatives could have been a speedy way to set up their immune systems to combat new, unencountered threats.

What is philosophically interesting from the point of view of speculations about human nature (philosophical anthropology) is not so much whether we slept with Neandertals or not. My own hunch is that we did interbreed and created viable offspring, but like I posted in an earlier blog entry, it was probably because of hunters raping women of the other species rather than nice, romantic interspecies marriages. What is philosophically interesting is our reaction to these theories: Why is it so important for some people to see it verified that we didn’t interbreed? And what makes it so vital for others that we did? I’m not saying that the scientists work out theories that fit their preferred view, but many laypeople (such as myself) who follow these stories have usually taken sides. Can this be boiled down to on the one hand a wish to keep human nature separate and special, and on the other hand a wish to see us closely related to all life on this planet? Competing visions of exclusivity vs. inclusivity? And where will such visions take us? Just remember Kennewick Man and the battle over his origins: Was he an early European, an American Indian ancestor, or perhaps a visitor from Asia? Each explanation carries its own political slant. Ask yourself, in your heart, would you rather that humans who migrated out of Africa were distantly related to Neandertals, or would you rather they/we weren’t? And then ask yourself, Why?

Two Little Girls–One Mind? June 2, 2011

Posted by Nina Rosenstand in Current Events, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Two little girls in British Columbia will grow up as conjoined twins; they are craniopagus, connected at the head and sharing a part of their brain structure. Separating them is apparently not an option. That phenomenon, disturbing as it may be, is not in itself the reason why these little 4-year-old girls are getting attention from cognitive neuroscientists. It is because they apparently share not only brain matter, but also sensory experiences:

Twins joined at the head — the medical term is craniopagus — are one in 2.5 million, of which only a fraction survive. The way the girls’ brains formed beneath the surface of their fused skulls, however, makes them beyond rare: their neural anatomy is unique, at least in the annals of recorded scientific literature. Their brain images reveal what looks like an attenuated line stretching between the two organs, a piece of anatomy their neurosurgeon, Douglas Cochrane of British Columbia Children’s Hospital, has called a thalamic bridge, because he believes it links the thalamus of one girl to the thalamus of her sister. The thalamus is a kind of switchboard, a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness. Because the thalamus functions as a relay station, the girls’ doctors believe it is entirely possible that the sensory input that one girl receives could somehow cross that bridge into the brain of the other. One girl drinks, another girl feels it.

The girls surely have a complicated conception of what they mean by “me.” If one girl sees an object with her eyes and the other sees it via that thalamic link, are they having a shared experience? If the two girls are unique individuals, then each girl’s experience of that stimulus would inevitably be different; they would be having a parallel experience, but not one they experienced in some kind of commingling of consciousness. But do they think of themselves as one when they speak in unison, as they often do, if only in short phrases? When their voices joined together, I sometimes felt a shift — to me, they became one complicated being who happened to have two sets of vocal cords, no less plausible a concept than each of us having two eyes. Then, just as quickly, the girls’ distinct minds would make their respective presences felt: Tatiana smiled at me while her sister fixated on the television, or Krista alone responded with a “Yeah?” to the call of her name.

Although each girl often used “I” when she spoke, I never heard either say “we,” for all their collaboration. It was as if even they seemed confused by how to think of themselves, with the right language perhaps eluding them at this stage of development, under these unusual circumstances — or maybe not existing at all. “It’s like they are one and two people at the same time,” said Feinberg, the professor of psychiatry and neurology at Albert Einstein College of Medicine. What pronoun captures that?

The average person tends to fall back on the Enlightenment notion of the self — one mind, with privacy of thought and sensory experience — as a key characteristic of identity. That very impermeability is part of what makes the concept of the mind so challenging to researchers studying how it works, the neuroscientist and philosopher Antonio Damasio says in his book, “Self Comes to Mind.” “The fact that no one sees the minds of others, conscious or not, is especially mysterious,” he writes. We may be capable of guessing what others think, “but we cannot observe their minds, and only we ourselves can observe ours, from the inside, and through a rather narrow window.”

And yet here are two girls who can possibly — humbly, daily — feel what the other feels. Even that extraordinary dynamic would still put the girls on the continuum of connectivity that exists between ordinary humans. Some researchers believe that when we observe another person feeling, say, the prick of a pin, our neurons fire in a way that directly mimics the neurons firing in the person whom the pin actually pricks. So-called mirror neurons are thought to foster empathy, creating connections of which we are hardly aware but that bind us in some kind of mutual understanding at a neurological level.

The article, written by Susan Dominus of the New York Times Magazine, who visited with the girls, includes several incidents that would indicate some form of shared sensory experience. I recommend that you read the rest of the article. The girls have not been studied extensively because of their young age, but if they remain healthy we may be treated to insight into one of the many ways of being human that just hasn’t been scientifically explored yet—the sharing of a mind…The philosophical implications of this phenomenon are overwhelming, to say the least.

Genderless, or Clueless? May 24, 2011

Posted by Nina Rosenstand in Current Events, Nina Rosenstand's Posts, Philosophy of Gender, Philosophy of Human Nature.
4 comments

First of all, Hi everybody—sorry I’ve been so quiet lately. Just finishing the stacks of papers to be graded, and other work to be completed before the summer—it’s been a busy semester. Good classes, good discussions, but very little energy left over for blogging. I have, however, been tweeting! You can find my tweets under “@Socalethicsprof.”

Next, the story: I read it this morning, and it has been poking at me ever since: A family in Toronto has made a decision which seems to me right out of the Seventies (yes, I remember them well): they are raising their third baby without telling anyone his/her gender.

“When the baby comes out, even the people who love you the most and know you so intimately, the first question they ask is, ‘Is it a girl or a boy?’” says Witterick, bouncing Storm, dressed in a red-fleece jumper, on her lap at the kitchen table.

“If you really want to get to know someone, you don’t ask what’s between their legs,” says Stocker.

When Storm was born, the couple sent an email to friends and family: “We’ve decided not to share Storm’s sex for now — a tribute to freedom and choice in place of limitation, a stand up to what the world could become in Storm’s lifetime (a more progressive place? …).”

It seems to me that the parents are trying to do two different things, and they aren’t necessarily compatible. For one thing, they’re trying to educate the world about its knee-jerk gender assumptions. Well, that’s been attempted since the 1960s, and while it is a noble thought—and I’ve done my share of attempting to Educate the World over the years, giving my cousins’ and friends’ babies stuffed toys and farm animal figures instead of dolls and toy trucks, and choosing green and yellow baby clothes instead of pinks and blues—you’re up against 100,000 years of Homo sapiens stereotypes. And it is somewhat naive of the parents to think they can put a dent in hardwired human nature. However, things have changed since the mid-20th century, and gender roles have become more flexible, due to new ideals and a willingness to be nonconformist. But the other side of their project seems to me far less noble, and mostly self-serving: they are trying to force Storm into a mold that they consider politically preferable, a world where gender roles are a matter of choice. They’re waiting to see what kind of person s/he will choose to be—but after the sad case of David Reimer in the 1990s and other failed attempts at enforcing the psychosexual neutrality theory, haven’t we all had to realize that a fair amount of sexual identity is hardwired? In other words, Storm will discover who s/he is, not choose it, and no amount of societal pressure from people making assumptions about her/his gender is going to make a bit of difference. I’m afraid the only thing the parents will accomplish is turning their child into a social experiment. In a way all of us, as children, have of course been social experiments, and most of us have turned out fairly well-functioning, but part of being a child is being allowed to feel safe, and to belong. Children are hungry for rules and predictability, and little Storm is being set up to be the oddball of whatever community s/he becomes part of. Choice is great, but not until one is mature enough to know what one is choosing.

A psychologist, Diane Ehrensaft, author of Gender Born, Gender Made, has some good comments on the story:

Ehrensaft believes there is something innate about gender, and points to the ’70s, when parents experimented by giving dolls to boys and trucks to girls.

“It only worked up to a certain extent. Some girls never played with the trucks, some boys weren’t interested in ballet … It was a humbling experiment for us because we learned we don’t have the control that we thought we did.”

But she worries that by not divulging Storm’s sex, the parents are denying the child a way to position himself or herself in a world where you are either male, female or in between. In effect they have created another category: Other than other. And that could marginalize the child.

“I believe that it puts restrictions on this particular baby so that in this culture this baby will be a singular person who is not being given an opportunity to find their true gender self, based on also what’s inside them.”

Ehrensaft gets the “What the heck?!” reaction people may have when they hear about Storm. “I think it probably makes people feel played with to have that information withheld from them.”

As Socrates would say, a well-balanced person is not just someone who understands himself or herself, but also someone who is a well-adjusted citizen. You can’t become a well-adjusted citizen in a world where other people think you’re trying to fool them…

Behavioral Ethics–Explanation or Excuse? March 30, 2011

Posted by Nina Rosenstand in Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature.
5 comments

I read an interesting piece in Harvard Magazine, “On Behavioral Ethics,” by the authors of Blind Spots: Why We Fail to Do What’s Right and What to Do about It, Straus professor of business administration Max H. Bazerman and Ann E. Tenbrunsel, Martin professor of business ethics at Notre Dame’s Mendoza College of Business. Here is an excerpt from their piece, which is also drawn from Chapter 1 of their book:

In the wake of troubling decisions—cooking the books at Enron, going to war in Iraq on suspect grounds, making mortgage loans to indigent borrowers and passing the risk on to others—scholars in many fields are examining how individuals and organizations conduct themselves relative to ethical standards.

[The authors] seek answers not in philosophy, but through analysis of cognition and behaviors, such as “ethical fading.”

Ethics interventions have failed and will continue to fail because they are predicated on a false assumption: that individuals recognize an ethical dilemma when it is presented to them. Ethics training presumes that emphasizing the moral components of decisions will inspire executives to choose the moral path. But the common assumption this training is based on—that executives make explicit trade-offs between behaving ethically and earning profits for their organizations—is incomplete. This paradigm fails to acknowledge our innate psychological responses when faced with an ethical dilemma.

Findings from the emerging field of behavioral ethics—a field that seeks to understand how people actually behave when confronted with ethical dilemmas—offer insights that can round out our understanding of why we often behave contrary to our best ethical intentions. Our ethical behavior is often inconsistent, at times even hypocritical. Consider that people have the innate ability to maintain a belief while acting contrary to it. Moral hypocrisy occurs when individuals’ evaluations of their own moral transgressions differ substantially from their evaluations of the same transgressions committed by others.

Traditional approaches to ethics, and the traditional training methods that have accompanied such approaches, lack an understanding of the unintentional yet predictable cognitive patterns that result in unethical behavior. By contrast, our research on bounded ethicality focuses on the psychological processes that lead even good people to engage in ethically questionable behavior that contradicts their own preferred ethics.

If ethics training is to actually change and improve ethical decision-making, it needs to incorporate behavioral ethics, and specifically the subtle ways in which our ethics are bounded. Such an approach entails an understanding of the different ways our minds can approach ethical dilemmas and the different modes of decision-making that result.

Of course I have not read the entire book, so my evaluation is merely based on the excerpt, but while I at first thought the idea of bounded ethicality sounded interesting, on second thought I’m not so sure. Of course it is always interesting for a philosophy of human nature to figure out why people can’t live up to their own moral standards, in the business world or elsewhere. But that doesn’t mean we have to give up on those standards. For one thing, dismissing the entire tradition of moral philosophy because (business) people can’t live up to their own ideals is sort of like throwing the baby out with the bathwater–a waste, and hardly rational. For another, the supposed realization that people aren’t very ethical is hardly news. From “The spirit is willing but the flesh is weak” to “Do as I say, not as I do,” humans have struggled with that internal battle for as long as we’ve had records of human behavior. The difficulty of maintaining our moral ideals under pressure is precisely the raison d’être of ethics—moral values are traditionally hard to live up to. If it were easy to be ethical, it wouldn’t be a perennial topic for our arts, stories, religions, and other cultural expressions. It seems to me that the behavioral ethics project, as described here, amounts to (1) a mere psychological analysis of what people actually do, instead of a discussion of the normative concept of what they ought to do, and why, and (2) an excuse for not even trying to live up to a set of challenging moral standards. If the authors don’t want to do philosophy, that’s fine. But if they don’t want to include the concept of prescription in a study of ethics, well, then they’re simply not studying ethics in the traditional sense.

Update on Abbie Dorn March 26, 2011

Posted by Nina Rosenstand in Current Events, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature.
8 comments

You may remember the case of Abbie Dorn, who ended up brain-damaged as a result of a series of medical errors while giving birth to her triplets in 2006. Her mother requested that Abbie’s children, who are being raised by her (now ex-) husband, have regular visits with their mother, but their father refused on the grounds that it would be traumatic for the children, claiming that Abbie would not benefit from it either, due to her reduced mental state. Now a judge in Los Angeles has ruled that Abbie will indeed get visitation rights:

In a tentative 10-page ruling, Judge Frederick C. Shaller said that Abbie Dorn, 34, can see her daughter, Esti, and sons Reuvi and Yossi, for a five-day visit each year pending a trial in the acrimonious custody case. She is also entitled to a monthly online Skype visit. A trial date has yet to be set.

“We are thrilled,” said Felicia Meyers, one of Dorn’s attorneys.

Although “there is no compelling evidence that the visitations by the children will have any benefit to Abby,” Shaller wrote, “…there is no compelling evidence that visitation with Abby will be detrimental to the children.”

In my previous post about Abbie’s situation I concluded (and pardon me for quoting myself! It’s easier to paste it in, here on a Saturday morning, than to rephrase it),

It’s not such a hard question. Be Solomonic. Err on the side of inclusive personhood—as long as there is a chance that Abbie is having experiences and wishes, respect them, and her. She is on a long, dark journey, and adding insult to her terrible injury by disregarding her potential personhood is unworthy these days. On the other hand, there is no reason why visitation rights should be granted from one day to the next, with the risk of traumatizing her toddlers. After all, she’s not asking for custody. If Abbie’s parents, and Abbie, want the best for the children (who at this point don’t even know they have a mother), they should be left with their father, and slowly be introduced to the story, with pictures, video, etc. Writing letters and drawing pictures to their mother could be the start of a relationship, building up a unique situation over months. I would assume that having a mother without a voice, or without arms that can hold them, but with loving eyes speaking a language of their own (if indeed Abbie herself is still behind those eyes), is a whole lot better than having no birth mother at all in their lives, and being told the story later when it is too late to amend the situation …what “might have been” is going to be cold comfort…

It seems that Judge Shaller holds the same view of Abbie and her children.

The Ethics of Self-Sacrifice March 25, 2011

Posted by Nina Rosenstand in Current Events, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature.
add a comment

I have been following the story of the “Fukushima 50,” now up to 1,000 workers, still working in shifts under what seems to be an increasing threat level. Here is an excerpt from USA Today:

…Two workers have gone missing and 25 have been hurt or overexposed to radiation since the magnitude-9.0 earthquake hit March 11, according to the Tokyo Electric Power Co., which owns and runs the plant. Most of the injuries occurred during explosions that resulted from uncontrolled buildups of hydrogen and oxygen in two reactor units.

The latest injuries were reported Thursday, when TEPCO said two workers were sent to the hospital after their legs were contaminated with radiation, indicating the facility remains dangerous. Gregory Jaczko, chairman of the Nuclear Regulatory Commission (NRC), says it could be weeks before the radiation is under control.

“Anybody that voluntarily enters a situation that puts their lives on the line can be called a hero, and those workers certainly meet that definition,” says David Lochbaum, director of the nuclear safety project for the Union of Concerned Scientists.

“I don’t know any other way to say it, but this is like suicide fighters in a war,” says Keiichi Nakagawa, associate professor in the Department of Radiology at the University of Tokyo Hospital.

In the emergency, Japanese authorities increased the permissible radiation exposure to five times what plant workers normally are allowed in a year.

That move “ethically is a problem,” says Irwin Redlener, a pediatrician at Columbia University in New York and director of the National Center for Disaster Preparedness. “On the other hand, there are large-scale population needs and somehow that needs to be balanced. It’s basically men and women voluntarily putting themselves in harm’s way so thousands of others can be safe.”

Such self-sacrifice is not uniquely Japanese, Redlener says. “It is something about human nature in emergencies that people step up to the plate in the interest of the greater good,” he says, citing battlefield troops and responders who entered the burning World Trade Center towers on Sept. 11, 2001.

The key word here, from a Western moral perspective at least, is volunteering. Willingly taking on a burden that will help others but endanger your own life and wellbeing is what makes the ethics of altruism so challenging, and so fascinating. It is hard to evaluate what cultural and professional pressures may be involved in the current situation at Fukushima, because the Japanese tradition does value the ethics of self-sacrifice—but as long as we’re not talking about a company deliberately sacrificing its workers for the common good, utilitarian-style, a group-ethics pressure to volunteer for helpful but life-threatening work still requires a personal decision, and that decision is still a heroic act—even if it may be embedded in the cultural tradition, and expected in times of need. And, as the article points out, it is not unique to the Japanese tradition.