
Neandertals Adorned with Feathers, Thinking Symbolically September 22, 2012

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Here is a wonderful example of why I, as a philosopher, have a passion for every bit of new info and speculation coming out about human evolution. To me there is no deeper philosophical question than the one about human identity: Who are we? Who were we? And how do we differ from our close relatives today (the apes), and from our even closer relatives in the past (we now know of three relatively recent groups of hominins that coexisted with early Homo sapiens: the Neandertals, the Denisovans, and the elusive “Hobbits”, Homo floresiensis)? The categories we have used to mark our extraordinary human nature have been steadily challenged over the last few decades. We used to be the only tool users. Then, because we found that apes (and birds) use tools, too, we became the only tool makers. But apes and birds make tools, too. So we became the only rational species. Ah, but now it turns out that many other species are quite capable of basic reasoning. Then we were the only species with self-recognition. But apes, dolphins, elephants, ravens, magpies, pigs, and maybe even (if we are to believe the very latest findings) all big-brained, social species have that, too. But aren’t we at least the only ones who deliberately create art and use body decorations? Because a brain that can conceive of art and decoration is capable of thinking symbolically. As late as ten years ago the great anthropologist Ian Tattersall claimed that humans were the only ones with the capacity for symbolic thinking. The Neandertals, with their big brains, still didn’t count as a self-aware species because they didn’t have symbolic thinking. Well, according to Scientific American blogger Kate Wong, they did:

Experts agree that Neandertals hunted large game, controlled fire, wore animal furs and made stone tools. But whether they also engaged in activities deemed to be more advanced has been a matter of heated debate. Some researchers have argued that Neandertals lacked the know-how to effectively exploit small prey, such as birds, and that they did not routinely express themselves through language and other symbolic behaviors. Such shortcomings put the Neandertals at a distinct disadvantage when anatomically modern humans [who] availed [themselves] of these skills invaded Europe—which was a Neandertal stronghold for hundreds of thousands of years—and presumably began competing with them, so the story goes.

Over the past couple decades hints that Neandertals were savvier than previously thought have surfaced, however. Pigment stains on shells from Spain suggest they painted; pierced animal teeth from France are by all appearances Neandertal pendants. The list goes on. Yet in all of these cases skeptics have cautioned that the evidence is scant and does not establish that such sophistication was an integral part of the Neandertal gestalt.

But now some new results have come in: Neandertals across western Eurasia wore feathers they harvested from birds of prey—black feathers in particular.

Exactly what the Neandertals were doing with the feathers is unknown, but because they specifically sought out birds with dark plumage, the researchers suspect that our kissing cousins were festooning themselves with the resplendent flight feathers. Not only are feathers beautiful, they are also lightweight, which makes them ideal for decoration, Finlayson points out. “We don’t think it’s a coincidence that so many modern human cultures across the world have used them.”

Speakers at a conference on human evolution held in Gibraltar last week extolled the study, and agreed with the team’s interpretation of the remains as evidence that Neandertals adorned themselves with the feathers as opposed to using them for some strictly utilitarian purpose. If the cutmarked bones from Gibraltar had been found in association with early modern humans, researchers would assume that the feathers were symbolic, paleoanthropologist John Hawks of the University of Wisconsin notes. The same standards should apply to Neandertals. “We’ve got to now say that Neandertals were using birds. Period. They were using them a lot. They were wearing around their feathers,” he comments. “They clearly cared. A purely utilitarian kind of person does not put on a feathered headdress.”

So. The Neandertals had symbolic thinking after all. (And those researchers who pointed out, over ten years ago, that the jewelry found at Neandertal archeological sites, not to mention the little fact that they buried their dead, would indicate as much, can now feel vindicated.) And how far back in time did the symbolic, self-aware thinking originate?


“[This] is something many of us thought was unique to Homo sapiens,” [John] Shea adds. “But [it] turns out to be either convergently evolved with Neandertals or more likely something phylogenetically ancient we simply haven’t picked up in the more ancient archaeological record. It’s probably something [our common ancestor] Homo heidelbergensis did, we just haven’t found archaeological evidence for it yet.”

Homo heidelbergensis. At least 500,000 years ago. So we are not unique in our symbolic thinking. Now that doesn’t mean humans are not exceptional. Of course we are. We have managed to extend our influence and interest into space (literally), and time, by our research and imagination, reaching into the dim past as well as affecting and imagining possible futures. We can leave our legacy through our languages, our imagery (provided it doesn’t all go digital and disappear), our artifacts, our music, our buildings (and also the strip mines, the polluted lakes, the mass graves of discarded civilians, and all the other less wonderful stuff that is part of human history). Our reach, for better and for worse, is far greater than that of the other social animals on this planet. But the point is, it now seems to be fundamentally a matter of degree, not of a radically different kind.

Scientists: Humans and Non-Humans—We Are All Conscious August 26, 2012

Posted by Nina Rosenstand in Animal Intelligence, Current Events, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

A watershed of an event happened recently–if you’re in any way interested in the nature of consciousness. My students from Phil 107 and 108, and readers of my book, The Human Condition, know how vital I consider this topic, both in its ontological and ethical aspects. I hope to expand this post later. For now, let me just share the URLs and a few quotes:

http://io9.com/5937356/prominent-scientists-sign-declaration-that-animals-have-conscious-awareness-just-like-us

An international group of prominent scientists has signed The Cambridge Declaration on Consciousness in which they are proclaiming their support for the idea that animals are conscious and aware to the degree that humans are — a list of animals that includes all mammals, birds, and even the octopus. But will this make us stop treating these animals in totally inhumane ways?

 While it might not sound like much for scientists to declare that many nonhuman animals possess conscious states, it’s the open acknowledgement that’s the big news here. The body of scientific evidence is increasingly showing that most animals are conscious in the same way that we are, and it’s no longer something we can ignore.

http://www.huffingtonpost.com/christof-koch/consciousness-is-everywhere_b_1784047.html

The two principal features that distinguish people from other animals [are] our hypertrophied ability to reflect upon ourselves (self-consciousness) and language. Yet there is little reason to deny consciousness to animals simply because they are mute or, for that matter, to premature infants because their brains are not fully developed. There is even less reason to deny it to people with severe aphasia who, upon recovery, can clearly describe their experiences while they were incapable of speaking. The perennial habit of introspection has led many intellectuals to devalue the unreflective, nonverbal character of much of life. The belief in human exceptionalism, so strongly rooted in the Judeo-Christian view of the world, flies in the face of all evidence for the structural and behavioral continuity between animals and people.

And here is the declaration in its entirety:

http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf

The Young Brain—Why Does it Take So Long to Grow Up? January 30, 2012

Posted by Nina Rosenstand in Education, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Welcome to the Spring 2012 semester, where we will post occasional blog entries as our schedules and moods allow! Here is something that I think will interest those of you who are under 25, or happen to know someone who is! Finally we understand the adolescent brain, and furthermore, we now know that it lasts well into the young adult years these days, because what makes a brain “adult” is that it has responsibilities. Uh-oh! Does that mean some people will never grow up? Maybe…and there is a name for that: the Peter Pan Syndrome. Perhaps there will be a neurological explanation for that, now…

But in the meantime, this is what professor of psychology Alison Gopnik writes in her article, “What’s Wrong With the Teenage Mind?”: Puberty is happening earlier, but adulthood seems to be delayed. So we will have to live with “teenage weirdness” longer than in past centuries.

The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again.

The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards.

Recent studies in the neuroscientist B.J. Casey’s lab at Cornell University suggest that adolescents aren’t reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults. Think about the incomparable intensity of first love, the never-to-be-recaptured glory of the high-school basketball championship.

The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.

This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience. You come to make better decisions by making not-so-good decisions and then correcting them.

In the past (from hunter-gatherers all the way to the recent past) those two systems were in sync, but they are no longer.

The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences. The pediatrician and developmental psychologist Ronald Dahl at the University of California, Berkeley, has a good metaphor for the result: Today’s adolescents develop an accelerator a long time before they can steer and brake.

This doesn’t mean that adolescents are stupider than they used to be. In many ways, they are much smarter. An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school, and there is even some evidence that higher IQ is correlated with delayed frontal lobe development….

But there are different ways of being smart. Knowing physics and chemistry is no help with a soufflé. Wide-ranging, flexible and broad learning, the kind we encourage in high-school and college, may actually be in tension with the ability to develop finely-honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies. For most of our history, children have started their internships when they were seven, not 27.

Recognize the problems of Will Hunting in Good Will Hunting? He has all the theoretical knowledge in the world, but has no idea how to live (and doesn’t even dare to). So what to do about it? Gopnik suggests increasing the young person’s level of varied hands-on experience, an extended apprenticeship-adolescence with responsibilities:

Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship. AmeriCorps, the federal community-service program for youth, is an excellent example, since it provides both challenging real-life experiences and a degree of protection and supervision.

“Take your child to work” could become a routine practice rather than a single-day annual event, and college students could spend more time watching and helping scientists and scholars at work rather than just listening to their lectures. Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.

Hmmm…maybe we professors should recruit teams of secretaries and teaching assistants from among our students, for their own good?


Culture–It’s Not Just for Humans Anymore October 24, 2011

Posted by Nina Rosenstand in Animal Intelligence, Nina Rosenstand's Posts, Science.
2 comments

What a difference a couple of decades make. Back in the twentieth century they used to tell us that humans were the only beings who had culture, and whatever traditions nonhuman animals displayed in their groups could be explained as instinct. That concept began to erode as early as Jane Goodall’s research, although we still encounter holdout animal behaviorists who maintain that whatever it is that chimpanzees do when they share and transmit inventions and traditions, it isn’t culture (which brings to mind that long-range visionary David Hume, who not only thought that emotions have primacy over rationality, but also that if nonhuman animals display emotional and intellectual behavior similar to humans, it should be given similar labels). So what would an example of chimp culture look like? From a Scientific American blog, “Cultural Transmission in Chimpanzees”:

While nonhuman primates don’t have obvious cultural traditions the same way humans do, such as variation in their clothing or adding extra spice to their food, primatologists have nonetheless identified behavioral practices that vary between communities and which are transmitted through social learning. For a behavior to be considered a cultural practice in nonhuman primates it must meet certain conditions: the behavior must be practiced by multiple members of the community, it must vary between societies, and the potential for that same behavior must exist in other societies.

A good example of such a cultural trait was just discovered last year and published in the journal Current Biology (review here). Kibale Forest chimpanzees were found to use sticks to get at the honey in a fallen log, whereas Budongo Forest chimpanzees used chewed leaves as sponges to collect the same thing. Both societies had the same tools at their disposal, but they each chose a different approach. A single individual first used one of these techniques and other members of the group adopted it through imitation and social learning. This is merely the latest example of cultural traditions in different chimpanzee societies.

So let’s assume that we are convinced that chimps invent and transmit culture; the question now becomes how? In a Swedish study quoted by the Scientific American blog, a new idea has been proposed: that culture is transmitted by female chimps. Chimp societies are patrilocal (the males stay put, the females move between groups), so whatever traditions the females have learned growing up in one group, they bring with them to their new home and teach to their kids:

Because females express and transmit more culture than males, and because females transfer between communities bringing with them their cultural knowledge, the number of cultural traits present in any given chimpanzee community should depend on the number of females in that community. Thus, we hypothesize that the number of cultural traits in chimpanzee communities should correlate with the average number of females in chimpanzee communities, but not with the average number of males.

This implies that females are critical in chimpanzees for transmitting cultural traits and maintaining cultural diversity. The reported pattern may be explained by the fact that females transfer between communities, bringing with them novel cultural traits and consequently increasing the cultural diversity of the community as a whole.

And that’s not all: from a group of Swiss anthropologists we now hear that orangutans also have culture–particularly interesting, because orangutans aren’t perceived (by most of us laypeople) as being as social as chimps:

Researchers from the University of Zurich have now studied whether the geographic variation of behavioral patterns in nine orangutan populations in Sumatra and Borneo can be explained by cultural transmission. They have concluded that it can.

The team analyzed more than 100,000 hours of behavioral data and created genetic profiles of more than 150 wild orangutans. They measured the ecological differences between the habitats of the different populations using satellite imagery and remote sensing techniques.

Co-author of the study, published in Current Biology, Carel van Schaik said: “The novelty of our study is that, thanks to the unprecedented size of our dataset, we were the first to gauge the influence genetics and environmental factors have on the different behavioral patterns among the orangutan populations.”

It seems that the days when researchers would claim that only humans have culture will be over fairly soon. No word on whether orangutan females play the same role as chimp females.


So Did We or Did We Not Interbreed with Neandertals? August 26, 2011

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
4 comments

Back on the perch, folks–time for a new season of occasional insights or at least sharing of interesting stories from the web!

Only last week I watched another show in a long line of mockumentaries/supposedly nonfictional shows with a good deal of play-acting about Neandertals and early humans, the Cro-Magnons. I’m a sucker for those. I love to see human actors in some kind of crude make-up depicting the latest ideas of what our closest relatives ever on this planet may have looked and acted like. I also love to see the scientists act in front of the camera, in fairly minimal make-up. But I was surprised to see that the scientists interviewed came down massively against the idea that there might be Neandertal DNA in the human gene pool—after all, in 2010, after the Neandertal genome was decoded, Max Planck Institute researcher Svante Paabo was quoted as saying that 1-4 percent of the genetic material in the human population that left Africa around 60,000 years ago came from sexual encounters with Neandertals. So why the categorical denial? Of course it could have been a dated show, but my impression was that it was recent. The fact that the show was centered on renowned paleoanthropologist Ian Tattersall could have had something to do with it—he has, for years, argued that (1) there is in all likelihood no genetic connection between living humans and Neandertals, and (2) Neandertals probably couldn’t speak or even think rationally because they lacked symbolic thinking. (The fact that crude jewelry has been found among Neandertal remains apparently hasn’t been enough to change his mind, although philosophically I’d have to say that deliberately adorning oneself with body art/ornaments shows some kind of symbolic thought, and their brain and throat structures do not exclude the power of speech.) Otherwise the show had interesting moments, such as floating the theory that perhaps the Cro-Magnons didn’t actually exterminate the Neandertals by force, but by transferring diseases to them to which they had no immunity, much as happened to the American Indian population in the 19th century.

And then we have the news, now quoted and tweeted all over cyberspace, that it seems that we—at least the descendants of those who migrated out of Africa—have Neandertal DNA in our genes after all! And it may have helped us become the extraordinarily successful species that we are (at least in the short term–who knows how long we’ll last?) by adding an immunity boost to our constitution. That, and possible interbreeding with that mysterious new-found Siberian hominin species the Denisovans may have secured our survival:

Indeed, DNA inherited from Neanderthals and newly discovered hominids dubbed the Denisovans has contributed to key types of immune genes still present among populations in Europe, Asia and Oceania. And scientists speculate that these gene variants must have been highly beneficial to modern humans, helping them thrive as they migrated throughout the world.

This DNA has had “a very profound functional impact in the immune systems of modern humans,” said study first author Laurent Abi-Rached, a postdoctoral researcher in the lab of senior author Peter Parham of the Stanford University School of Medicine.

From the analysis, the scientists estimated, for example, that more than half of the genetic variants in one HLA gene in Europeans could be traced to Neanderthal or Denisovan DNA. For Asians, that proportion was more than 70%; in people from Papua New Guinea, it was as much as 95%.

“We expected we’d see some, but the extent that these contributed to the modern [genomes] is stunning,” Abi-Rached said of the findings, released Thursday by the journal Science.

Though the researchers haven’t proved it, the vast reach of these gene variants in people today suggests that they probably gave some early modern humans an advantage over others, he said.

Our ancestors’ HLA systems may have been perfectly tailored for Africa but naive to bacteria, viruses and parasites that existed in Europe or Asia, rendering them susceptible to disease.

Mating and mixing their genomes with those of their Neanderthal and Denisovan relatives could have been a speedy way to set up their immune systems to combat new, unencountered threats.

What is philosophically interesting from the point of view of speculations about human nature (philosophical anthropology) is not so much whether we slept with Neandertals or not. My own hunch is that we did interbreed and created viable offspring, but like I posted in an earlier blog entry, it was probably because of hunters raping women of the other species rather than nice, romantic interspecies marriages. What is philosophically interesting is our reaction to these theories: Why is it so important for some people to see it verified that we didn’t interbreed? And what makes it so vital for others that we did? I’m not saying that the scientists work out theories that fit their preferred view, but many laypeople (such as myself) who follow these stories have usually taken sides. Can this be boiled down to on the one hand a wish to keep human nature separate and special, and on the other hand a wish to see us closely related to all life on this planet? Competing visions of exclusivity vs. inclusivity? And where will such visions take us? Just remember Kennewick Man and the battle over his origins: Was he an early European, an American Indian ancestor, or perhaps a visitor from Asia? Each explanation carries its own political slant. Ask yourself, in your heart, would you rather that humans who migrated out of Africa were distantly related to Neandertals, or would you rather they/we weren’t? And then ask yourself, Why?

Two Little Girls–One Mind? June 2, 2011

Posted by Nina Rosenstand in Current Events, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Two little girls in British Columbia will grow up as conjoined twins; they are craniopagus, connected at the head, sharing a part of their brain structure. Separating them is apparently not an option. That phenomenon, disturbing as it may be, is not in itself the reason why these little 4-year-old girls are getting attention from cognitive neuroscientists. It is because they apparently share not only brain matter, but also sensory experiences:

Twins joined at the head — the medical term is craniopagus — are one in 2.5 million, of which only a fraction survive. The way the girls’ brains formed beneath the surface of their fused skulls, however, makes them beyond rare: their neural anatomy is unique, at least in the annals of recorded scientific literature. Their brain images reveal what looks like an attenuated line stretching between the two organs, a piece of anatomy their neurosurgeon, Douglas Cochrane of British Columbia Children’s Hospital, has called a thalamic bridge, because he believes it links the thalamus of one girl to the thalamus of her sister. The thalamus is a kind of switchboard, a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness. Because the thalamus functions as a relay station, the girls’ doctors believe it is entirely possible that the sensory input that one girl receives could somehow cross that bridge into the brain of the other. One girl drinks, another girl feels it.

The girls surely have a complicated conception of what they mean by “me.” If one girl sees an object with her eyes and the other sees it via that thalamic link, are they having a shared experience? If the two girls are unique individuals, then each girl’s experience of that stimulus would inevitably be different; they would be having a parallel experience, but not one they experienced in some kind of commingling of consciousness. But do they think of themselves as one when they speak in unison, as they often do, if only in short phrases? When their voices joined together, I sometimes felt a shift — to me, they became one complicated being who happened to have two sets of vocal cords, no less plausible a concept than each of us having two eyes. Then, just as quickly, the girls’ distinct minds would make their respective presences felt: Tatiana smiled at me while her sister fixated on the television, or Krista alone responded with a “Yeah?” to the call of her name.

Although each girl often used “I” when she spoke, I never heard either say “we,” for all their collaboration. It was as if even they seemed confused by how to think of themselves, with the right language perhaps eluding them at this stage of development, under these unusual circumstances — or maybe not existing at all. “It’s like they are one and two people at the same time,” said Feinberg, the professor of psychiatry and neurology at Albert Einstein College of Medicine. What pronoun captures that?

The average person tends to fall back on the Enlightenment notion of the self — one mind, with privacy of thought and sensory experience — as a key characteristic of identity. That very impermeability is part of what makes the concept of the mind so challenging to researchers studying how it works, the neuroscientist and philosopher Antonio Damasio says in his book, “Self Comes to Mind.” “The fact that no one sees the minds of others, conscious or not, is especially mysterious,” he writes. We may be capable of guessing what others think, “but we cannot observe their minds, and only we ourselves can observe ours, from the inside, and through a rather narrow window.”

And yet here are two girls who can possibly — humbly, daily — feel what the other feels. Even that extraordinary dynamic would still put the girls on the continuum of connectivity that exists between ordinary humans. Some researchers believe that when we observe another person feeling, say, the prick of a pin, our neurons fire in a way that directly mimics the neurons firing in the person whom the pin actually pricks. So-called mirror neurons are thought to foster empathy, creating connections of which we are hardly aware but that bind us in some kind of mutual understanding at a neurological level.

The article, written by Susan Dominus (New York Times Magazine), who visited with the girls, includes several incidents that would indicate some form of shared sensory experience. I recommend that you read the rest of the article. The girls have not been studied extensively because of their young age, but if they remain healthy we may be treated to insight into one of the many ways of being human that just hasn’t been scientifically explored yet—the sharing of a mind…The philosophical implications of this phenomenon are overwhelming, to say the least.

Red Pill or Blue Pill? April 5, 2011

Posted by Nina Rosenstand in Ethics, Nina Rosenstand's Posts, Science.
6 comments

I can’t even begin to say how nauseated this article from The Guardian made me feel:

A pill to enhance moral behaviour, a treatment for racist thoughts, a therapy to increase your empathy for people in other countries – these may sound like the stuff of science fiction but with medicine getting closer to altering our moral state, society should be preparing for the consequences, according to a book that reviews scientific developments in the field.

Drugs such as Prozac that alter a patient’s mental state already have an impact on moral behaviour, but scientists predict that future medical advances may allow much more sophisticated manipulations.

The field is in its infancy, but “it’s very far from being science fiction”, said Dr Guy Kahane, deputy director of the Oxford Centre for Neuroethics and a Wellcome Trust biomedical ethics award winner.

“Science has ignored the question of moral improvement so far, but it is now becoming a big debate,” he said. “There is already a growing body of research you can describe in these terms. Studies show that certain drugs affect the ways people respond to moral dilemmas by increasing their sense of empathy, group affiliation and by reducing aggression.”

Researchers have become very interested in developing biomedical technologies capable of intervening in the biological processes that affect moral behaviour and moral thinking, according to Dr Tom Douglas, a Wellcome Trust research fellow at Oxford University’s Uehiro Centre. “It is a very hot area of scientific study right now.”

He is co-author of Enhancing Human Capacities, published on Monday, which includes a chapter on moral enhancement.

But would pharmacologically-induced altruism, for example, amount to genuine moral behaviour? Guy Kahane, deputy director of the Oxford Centre for Neuroethics and a Wellcome Trust biomedical ethics award winner, said: “We can change people’s emotional responses but quite whether that improves their moral behaviour is not something science can answer.”

He also admitted that it was unlikely people would “rush to take a pill that would make them morally better.

“Becoming more trusting, nicer, less aggressive and less violent can make you more vulnerable to exploitation,” he said. “On the other hand, it could improve your relationships or help your career.”

And on it goes, concluding that such chemicals would be nifty in the criminal justice system. Can anyone say A Clockwork Orange? Undoubtedly, this is the way we’re heading. It probably has its pros, but all I see right now are cons. I’m one of those philosophers who regard the new connections between philosophy and neuroscience with a lot of optimism. Well, let’s just say I feel less optimistic this morning…

Human Nature Decoded–No Whiskers, and No Penile Spines! March 10, 2011

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
add a comment

I’ve written a book about the Philosophy of Human Nature, The Human Condition. I’ve given talks, and written papers and blogs, and tweeted about the subject. I’ve been devouring every morsel of information about human evolution I could get my hands on since I was 13 years old. I teach two classes per year focusing on Phil of Human Nature. And what do I read this morning in the CNN online Health section? Brand new research about a significant difference between apes and humans: Ape males have penile spines and human males don’t. First thing I thought was, “And a good thing, too!” Now wipe the smirks off your faces—this finding turns out to have serious philosophical consequences:

We know that humans have larger brains and, within the brain, a larger angular gyrus, a region associated with abstract concepts. Also, male chimpanzees have smaller penises than humans, and their penises have spines. Not like porcupine needles or anything, but small pointy projections on the surface that basically make the organ bumpy.

Gill Bejerano, a biologist at Stanford University School of Medicine, and colleagues wanted to further investigate why humans and chimpanzees have such differences. They analyzed the genomes of humans and closely related primates and discovered more than 500 regulatory regions — sequences in the genome responsible for controlling genes — that chimpanzees and other mammals have, but humans do not. In other words, they are making a list of DNA that has been lost from the human genome during millions of years of evolution. Results from their study are published in the journal Nature.

…[The scientists] found that in one case, a switch that had been lost in humans normally turns on an androgen receptor at the sites where sensory whiskers develop on the face and spines develop on the penis. Mice and many other animals have both of these characteristics, and humans do not.

“This switch controls the expression of a key gene that’s required for the formation of these structures,” said David Kingsley, a study co-author at Stanford University. “If you kill that gene — smash the lightbulb — which has been done previously in mouse genetics, the whiskers don’t grow as much and the penile spines fail to form at all.”

To sum up: Humans lack a switch in the genome that would “turn on” penile spines and sensory whiskers. But our primate relatives, such as chimpanzees, have the switch, and that’s why they differ from us in these two ways.

So what does it matter, other than, presumably, a different female sexual experience, and a lack of ability to sense things a few inches from our faces?

The other “switch” examined in this study probably has to do with the expansion of brain regions in humans. Kingsley and colleagues believe they have found a place in their genome comparisons where the loss of DNA in humans may have contributed to the gain of neurons in the brain. That is to say, when humans evolved without a particular switch, the absence of that switch allowed the brain to grow further.

The earliest human ancestors probably had sensory whiskers, penile spines and small brains, Kingsley said. Evolutionary events to remove the whiskers and spines and enlarge the brain probably took place after humans and chimpanzees split apart as separate species (some 5 million to 7 million years ago), but before Neanderthals and humans diverged (about 600,000 years ago), Kingsley said.

So there you have it: We were on the fast track to becoming Homo sapiens when the switch for sensory whiskers and penile spines was turned off! Make of that what you want, in this Women’s History Month! For me, that story made my day!

(I thought of calling this blog post “Of Mice and Men”, but that would be unfair to Steinbeck.)

The Winner is Watson February 17, 2011

Posted by Nina Rosenstand in Artificial Intelligence, Current Events, Nina Rosenstand's Posts, Science, Technology.
2 comments

So it has finally happened: a computer has outwitted the humans—Watson won on “Jeopardy.” As reported by the New York Times’ John Markoff,

For I.B.M., the showdown was not merely a well-publicized stunt and a $1 million prize, but proof that the company has taken a big step toward a world in which intelligent machines will understand and respond to humans, and perhaps inevitably, replace some of them.

Watson, specifically, is a “question answering machine” of a type that artificial intelligence researchers have struggled with for decades — a computer akin to the one on “Star Trek” that can understand questions posed in natural language and answer them.

One of Watson’s developers, Dr. Ferrucci, refers to the computer as though it were a person who actually deliberates. That, for you Trekkers, is also reminiscent of numerous Star Trek episodes:

Both Mr. Jennings and Mr. Rutter are accomplished at anticipating the light that signals it is possible to “buzz in,” and can sometimes get in with virtually zero lag time. The danger is to buzz too early, in which case the contestant is penalized and “locked out” for roughly a quarter of a second.

Watson, on the other hand, does not anticipate the light, but has a weighted scheme that allows it, when it is highly confident, to buzz in as quickly as 10 milliseconds, making it very hard for humans to beat. When it was less confident, it buzzed more slowly. In the second round, Watson beat the others to the buzzer in 24 out of 30 Double Jeopardy questions.

“It sort of wants to get beaten when it doesn’t have high confidence,” Dr. Ferrucci said. “It doesn’t want to look stupid.”
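(For the programmers among my readers: here is a minimal toy sketch, in Python, of what a confidence-weighted buzzing rule of that general kind might look like. The function name, the threshold, and the delay numbers are my own inventions for illustration; only the 10-millisecond figure comes from the article, and this is emphatically not IBM’s actual scheme.)

```python
from typing import Optional

def buzz_delay_ms(confidence: float,
                  fastest_ms: float = 10.0,    # floor mentioned in the article
                  slowest_ms: float = 500.0,   # invented for illustration
                  threshold: float = 0.5) -> Optional[float]:
    """Return a buzz-in delay in milliseconds, or None to stay silent.

    `confidence` is assumed to lie in [0, 1]. Below the threshold the
    machine does not buzz at all; above it, higher confidence maps
    linearly to a shorter delay.
    """
    if confidence < threshold:
        return None  # not confident enough: let the humans have it
    # Linear interpolation: confidence at the threshold buzzes slowest,
    # confidence of 1.0 buzzes at the fastest (10 ms) floor.
    scale = (confidence - threshold) / (1.0 - threshold)
    return slowest_ms - scale * (slowest_ms - fastest_ms)

if __name__ == "__main__":
    for c in (0.4, 0.6, 0.9, 0.99):
        print(f"confidence {c}: delay {buzz_delay_ms(c)} ms")
```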

And what’s next?

For I.B.M., the future will happen very quickly, company executives said. On Thursday it plans to announce that it will collaborate with Columbia University and the University of Maryland to create a physician’s assistant service that will allow doctors to query a cybernetic assistant. The company also plans to work with Nuance Communications Inc. to add voice recognition to the physician’s assistant, possibly making the service available in as little as 18 months.

“I have been in medical education for 40 years and we’re still a very memory-based curriculum,” said Dr. Herbert Chase, a professor of clinical medicine at Columbia University who is working with I.B.M. on the physician’s assistant. “The power of Watson-like tools will cause us to reconsider what it is we want students to do.”

I.B.M. executives also said they are in discussions with a major consumer electronics retailer to develop a version of Watson, named after I.B.M.’s founder, Thomas J. Watson, that would be able to interact with consumers on a variety of subjects like buying decisions and technical support.

But…here’s the ultimate Star Trek question: Will Watson and others of its kind have the right to refuse the tasks they will be assigned to do? Because otherwise (thank you, Melinda Snodgrass, writer of that classic Star Trek: The Next Generation episode, “The Measure of a Man”) we will have created a new breed of—slaves. Provided that Watson actually develops a sense of self. But we have yet to see evidence of that. 🙂

Stem Cells Without Scruples? November 8, 2010

Posted by Nina Rosenstand in Ethics, Nina Rosenstand's Posts, Science.
1 comment so far

I think we bloggers often express several kinds of misgivings about the future on this blog, for different reasons, but here’s something that should make us rejoice: medical news has reached a stage that I could only dream about when I devoured science-fiction novels in the 1980s to prepare myself for the what-if scenarios of good ethical discussions (and also because I enjoyed a good space yarn):

 First, the prospect of lab-grown livers is now becoming a reality:

The researchers created “working livers” the size of a walnut which functioned normally in laboratory conditions.

They believe that in around five years they will be able to upscale the process and transfer the procedure from laboratory to hospital.

 The development could eventually solve the transplant shortage and also remove the need for powerful drugs to prevent the body rejecting the organ.

“We are excited about the possibilities this research represents, but must stress that we’re at an early stage and many technical hurdles must be overcome before it could benefit patients,” said the project director, Associate Professor Shay Soker from the Wake Forest Institute for Regenerative Medicine in North Carolina.

 The technology opens up the prospect of growing other replacement organs, including kidneys or pancreases, for patients who are able to donate stem cells.

Artificially grown livers could be transplanted into patients or used to test the safety of experimental drugs.

 This could be a milestone—not only because it may solve the transplant shortage, but also because it will remove the fear of, and hypothetical need for, reproductive cloning for the sake of organs, the scenario in the movie The Island, for you movie buffs. Which means that other, more realistic arguments for and against cloning can proceed.

And here is the other amazing piece of news, about creating artificial blood supplies, perhaps even in abundance:

Canadian scientists have turned human skin cells directly into blood cells, the first time one kind of mature human cell has been converted into another, they reported Sunday in the journal Nature.

The transformation was completed without first rewinding the skin cells into the flexible pluripotent stem cells that have most frequently been used to grow needed tissues. By skipping the pluripotent step, the researchers believe they have skirted the risk that the replacement cells might form dangerous tumors.

The team created blood progenitor cells — the mother cells that multiply to produce other blood cells — as well as mature blood cells, according to the report. Both types of cells could be useful in medical treatments, said study leader Mick Bhatia, a stem cell scientist at McMaster University in Hamilton, Ontario.

“There is a great need for alternative sources of human blood,” Bhatia said. “Since this source would come from a patient’s own skin, there would be no concern of rejection of the transplanted cells.”

For some of us the idea of therapeutic cloning, using stem cell research to further life-saving medical intervention, is not a morally questionable issue at all, but there are many Americans for whom the thought of using stem cells is morally repugnant, and while I don’t share that view, I respect it. It isn’t clear what the source of the liver stem cells is, but the creation of blood progenitor cells bypasses the entire moral issue of using embryonic stem cells, because they come from the adult person’s own skin. We can always be cynical about the ultimate cost, availability, and potential for political manipulation of such new methods, but for now let’s just rejoice that there’s good news to report!