
Scientists: Humans and Non-Humans—We Are All Conscious August 26, 2012

Posted by Nina Rosenstand in Animal Intelligence, Current Events, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

A watershed event happened recently–if you’re in any way interested in the nature of consciousness. My students from Phil 107 and 108, and readers of my book, The Human Condition, know how vital I consider this topic, both in its ontological and ethical aspects. I hope to expand this post later. For now, let me just share the URLs and a few quotes:

http://io9.com/5937356/prominent-scientists-sign-declaration-that-animals-have-conscious-awareness-just-like-us

An international group of prominent scientists has signed The Cambridge Declaration on Consciousness in which they are proclaiming their support for the idea that animals are conscious and aware to the degree that humans are — a list of animals that includes all mammals, birds, and even the octopus. But will this make us stop treating these animals in totally inhumane ways?

 While it might not sound like much for scientists to declare that many nonhuman animals possess conscious states, it’s the open acknowledgement that’s the big news here. The body of scientific evidence is increasingly showing that most animals are conscious in the same way that we are, and it’s no longer something we can ignore.

http://www.huffingtonpost.com/christof-koch/consciousness-is-everywhere_b_1784047.html

The two principal features that distinguish people from other animals is our hypertrophied ability to reflect upon ourselves (self-consciousness) and language. Yet there is little reason to deny consciousness to animals simply because they are mute or, for that matter, to premature infants because their brains are not fully developed. There is even less reason to deny it to people with severe aphasia who, upon recovery, can clearly describe their experiences while they were incapable of speaking. The perennial habit of introspection has led many intellectuals to devalue the unreflective, nonverbal character of much of life. The belief in human exceptionalism, so strongly rooted in the Judeo-Christian view of the world, flies in the face of all evidence for the structural and behavioral continuity between animals and people.

And here is the declaration in its entirety:

http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf


Poetry in Food August 23, 2012

Posted by Dwight and Lynn Furrow in Dwight Furrow's Posts, Food and Drink, Philosophy of Food.
4 comments


Cross-posted at Edible Arts

One of the great obstacles to thinking of food as a form of art is that we are accustomed to thinking of food as a collection of flavors and textures that, although pleasurable, lack meaning. Flavors and textures, so it is argued, are not about anything and thus are not representations of an object, place, or person. In this they differ from painting, the linguistic arts, and, more controversially, music, all of which have meaning, which is what qualifies them as art forms.

Chef Crenn, owner of Atelier Crenn, a restaurant in San Francisco, is pushing against this view and understands the depth of meaning that food can have.

Ms. Crenn’s dishes, which she dubs “poetic culinaria,” are all meant to express artistic ideas, in the same way that a line of poetry is meant to communicate more than the sum of its words. A recent 12-course, $160 grand tasting menu was also written as a poem. On the menu, the line “a shallow pool stirs,” for example, accompanied a dish of radish tea with sea urchin and caviar; “as first buds appear” went with a dish of oysters and egg-white foam decorated with tiny flowers.

The rest of the article describes how Crenn used a bird’s nest spotted on a walk as inspiration for a dish called “Birth,” which resembled a bird’s nest and which signified the new beginning she would have to undertake once the foie gras ban in California went into effect.

One could argue that Crenn’s cooking gets its meaning, and thus its artistry, from the stunning visual appearance of the food and the title of the dish. Thus, it is poaching on the visual and linguistic dimensions for its claim to be art. In other words, the flavors and textures, the elements related to taste, are not doing much artistic work. Having not tasted Crenn’s intriguing culinaria, I cannot say what work flavor is doing to enhance the perception of genuine artistry. But there is nothing in the nature of art that entails that art can employ only a single sensory modality. Film, for instance, employs many sensory modalities. And the dish did include remnants of her foie gras supply, so flavor and texture clearly contribute to the meaning of the dish.

Many works of art get some of their meaning from language. We would be hard pressed to grasp the meaning of a work such as Debussy’s La Mer (The Sea) if he hadn’t given it that title. Yet surely instrumental music is an art form despite the difficulty in locating its meaning.

The exclusion of food (and wine) from the realm of fine art increasingly seems like a mere prejudice (or a matter of historical practice) thanks to chefs such as Ms. Crenn, whose cooking I look forward to sampling the next time I’m in San Francisco.

The Ethics of…Game of Thrones! May 29, 2012

Posted by Nina Rosenstand in Culture, Ethics, Film, Nina Rosenstand's Posts.
4 comments

Time for some early summer fun; lucky are those (like me) for whom fun and work often end up merging, as they do in narrative ethics. And I’ve found the HBO series Game of Thrones to be seriously fun, once you get into the fictional, pseudo-historical universe. (Haven’t read any of the books yet—I understand the TV series is deviating from the original in increasingly dramatic ways.) If anybody wants to catch up on the series before the season finale on Sunday, HBO is running the entire season this week. You can have an early-summer GoT marathon—and afterwards you can acquire a copy of Game of Thrones and Philosophy from Blackwell, which I have just ordered as a light summer read! If you are not worried about reading a spoiler article, take a look at Time’s review of episode 9, “Game of Thrones Watch: Smoke on the Water, Fire in the Sky” (those of you over 50 can start humming now…). It contains good character analyses, and a particularly insightful view of the character who emerges as the real focal point of the story, Tyrion Lannister:

I was on the verge of calling Tyrion’s behavior “heroic,” but that’s not really the term. Notably, we see that this is not Tyrion rising to his true calling or discovering that it is a far, far greater thing he does, &c., &c. It’s a practical decision, in that if the defenders of the city are not inspired, he will die. He plays the part (and Peter Dinklage does) masterfully, but he rouses his men with a purely practical argument too: “Don’t fight for your king, and don’t fight for his kingdoms. Don’t fight for honor, don’t fight for glory. Don’t fight for riches, because you won’t get any.”

And the reviewer could/should have added what comes next—what Tyrion tells his army: “Fight for your homes.” Because Tyrion may be pragmatic, but he is not altogether a cynic.

All in all, it is a story about moral decisions, big and small—split-second decisions that come from the heart, or decisions weighed by a calculating mind, all of which have consequences. Some decisions are made from a utilitarian stance, some from a deontological one. There is a lot of ethical egoism in there, too, and plenty of plain knee-jerk egoism. And some characters are pure at heart, and we see their ethic of care, their virtue ethics, unfold—such as Sansa, who goes from being a victim to suddenly finding strength in helping others.

And so forth! If you’re looking for a joyride this week leading up to the season finale on Sunday (and you have cable), watch the nine episodes on HBO and look for all the moral, immoral, and amoral viewpoints swirling around. A well-told tale, well acted, just right for some summer speculations about the fictional problems of fictional characters.

Enjoy your summer!

 

 

The Moral of the Story 7/e is Out! April 15, 2012

Posted by Nina Rosenstand in Education, Ethics, Nina Rosenstand's Posts, Philosophy Profession.
5 comments

I’m happy to announce that the seventh edition of my ethics textbook The Moral of the Story is now available.

The cover painting is by Karen Barbour, a Bay Area artist, and every edition of the book has had a painting by her on the cover. She has a wonderfully visionary style, and I love being able to maintain that visual consistency in this new edition. This image in particular perfectly illustrates the maze of thoughts we often find ourselves in when it comes to moral issues. (And as with all mazes, there is always a way out, even if it is not within view…)

McGraw-Hill has a website where you can check out the Table of Contents and other features of the new edition. Instructors can request a desk copy. Among the new features are a thoroughly updated Chapter 1; sections on happiness studies and moral naturalism; updated research on ethics and neuroscience, and on ethics and empathy; a new Nietzsche section; an updated Ayn Rand section; and several new movies and novels, including Avatar, State of Play, True Grit, The Invention of Lying, and A Thousand Splendid Suns. And Chapter 10 has a picture of Dwight Furrow! 🙂

Titanic–a Tale to Remember April 14, 2012

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts.
1 comment so far

So now we think we know what happened on that night exactly 100 years ago. Divers have explored the wreck, animated computer models have been presented, recovered artifacts are making their rounds around the world, stories of lost souls and survivors have been told, documentaries and movies have been made. So after the 100th anniversary, can we now close the book on Titanic? Or will it become one of the stories of humankind that we will never quite be done with? If so, it will be because, for one thing, it speaks to something perennial in the human psyche—and for another, because the story is broad and deep enough that different times and ages can find their own reflections in it.

When the disaster happened, the world was different—and I’m not talking about technology. The very mindset of the western world in 1912 was vastly different from today’s, because of the enormous optimism felt on two continents: the new century was going to be magnificent; advances in medicine would soon conquer all diseases; technology would take humanity to far-away places on the planet, at break-neck speeds; politically, democracies were spreading, and war seemed like a primitive option, left behind in the turmoil of the 19th century (and few people were in a position to predict the start of the Great War (WWI) just two years later). And nature, in all its forms, would soon be conquered by human know-how and willpower. And what better symbol of the new age than the sister ships being built in Belfast, the Olympic and the Titanic? And when the Titanic, the carrier of the dream of the future, sank on April 14, 1912, the dream of an invincible 20th century perished, too, and in its place rose a wave of cynicism that we have, in effect, been riding ever since.

As we all know from Cameron’s movie (if we didn’t know already): it wasn’t the architect who claimed the ship was unsinkable—the claim came from the owners and the advertisers. The sinking of Titanic gave rise to cynicism and skepticism about what authorities tell you (don’t worry, there will be another lifeboat), about what advertisers tell you, about the promises of technology and even the wisdom of applying it. In short, Titanic became a symbol of human hubris and nemesis, and that is the mirror it has held up to us for a century.

But now? With the 100th anniversary, the moral lessons of Titanic are being given a new drumbeat, coming from James Cameron himself. Two themes are emerging that one hundred years ago were not high on the agenda; one wasn’t even on the horizon. The recently corroborated fact that, of the 1,500 people who died that night, a great number were third-class passengers, locked up in steerage like rats, without even a chance of escaping, has become a new theme: when disaster strikes, everybody suffers, but some may suffer more than others: the have-nots. According to the statistics, 75 percent of steerage passengers died, while among the first-class passengers “only” 37 percent were lost. So the social aspect of Titanic as a class experience has emerged as a moral lesson, added to the hubris theme. But Cameron sees yet another moral caveat in the story of Titanic: the hubris of a planet thinking it can go full steam ahead, for the sake of profit, without worrying about icebergs. For him, Planet Earth is a Titanic forging ahead into climate change.

So is that an appropriate lesson to be learned from the story of Titanic, or does it somehow deflect and detract from the actual tragedy happening to real people 100 years ago? Are they being used merely as a means to a political end? That is up to us to decide, individually. What fascinates me is that the doomed ship can take on a new narrative role as a teacher of moral lessons that go far beyond the concerns of 100 years ago. But perhaps that is the case with all good stories; they not only tell a timeless tale, but their lesson can be adapted to new ages and different problems.

Ugly Food March 18, 2012

Posted by Dwight and Lynn Furrow in Art and Music, Dwight Furrow's Posts, Philosophy of Food.
2 comments


Cross-posted at Edible Arts

One persistent, serious argument against the view that food preparation can be an art is that food preparation, unlike the visual arts, lacks deep meaning and the ability to represent the many dimensions of human life. While paintings can represent and comment on the horrors of war, mine the endless permutations of modern alienation, or subtly expose the character flaws of a fatuous nitwit, food is about only flavor and texture. We learn little about ourselves or the world through food, no matter how well prepared it is, or so the argument goes.

Food writer John Mariani recently gave a version of this argument:

There is ugly art (Hieronymus Bosch) and troubling art (Goya’s Disasters of War) and art that is deliberately in your face (Kerouac’s On the Road), disorienting (Kubrick’s 2001), even repulsive (the Sex Pistols’ “God Save the Queen”). Cooking, on the other hand, should be none of these things except, perhaps, beautiful to look at on the plate and delicious on the tongue. Creative cooking might well enlighten a person to new possibilities or ways of thinking about a pea shoot, and that is a good thing in a world of fast, frozen, chemically-enhanced foods. Cooking can be provocative, but it is the rare chef who makes food that is deliberately distasteful or that seeks to outrage people, as great art often does.

Apparently, Mariani missed the “Wicked Meal” episode of Top Chef, where the challenge was to make “evil” food for the Evil Snow Queen, fetchingly played by Charlize Theron. And he must have missed the anthropological accounts of women expressing anger and resentment through the inedible dishes they serve to guests and families. (See “Thick Sauce” by Stoller and Olkes, reprinted here.) Clearly, chefs and cooks, when they are so inclined, can make food that represents the horrible and the ugly. Nevertheless, Mariani is right that food, in the ordinary contexts in which it is served, must taste good or it will not serve its main functions of nourishment and enjoyment. The food served during the above-referenced episode of Top Chef was tasty despite the grotesque connotations.

However, I think that episode of Top Chef is in fact instructive, not only regarding the nature of food but also regarding the nature of art. The wasted, deformed bodies depicted in Goya’s Disasters of War are indeed grotesque. Yet even the ugly must seduce if it is to be art.


We don’t recoil from viewing these etchings and run screaming from the museum in a fit of rage or fright. We are fascinated by Goya’s extraordinary ability to use line and shadow as a vehicle to highlight atrocity. The spectacle of an artist relishing violence and mayhem is itself seductive, and the contrast between the blindness of atrocity and the prurient insight we gain from viewing it is part of the seduction. Formally, the rough lines and use of shading focus our attention, but the muted colors have a distancing effect on the viewer. The aim is to spark reflection on atrocity, but the vaguely cartoonish characters contain a different message—the slaughter bench of history is so pervasive that in the end one can only laugh.

We might react emotionally and empathically to violent visual art, but we do so because we view it from a safe distance—one where real fear or real revulsion would be out of place. Our response to the ugly and horrifying is sublime in Edmund Burke’s sense of the term—art puts us at a safe distance so we can reflect, not merely react. Something similar could be said of the Sex Pistols—they use revulsion in order to depict the sterility and nihilism of modern society. But if we felt revulsion toward the Sex Pistols themselves, we simply would not listen. What they represent is repulsive, but their means of representation is not, at least for their fans.

Art, like food, must “taste good,” must give us pleasure, if its representation is to succeed. I do not know, and do not wish to know, anyone whose aesthetic appreciation is of the ugly as such—who takes pleasure not only in viewing what is ugly but in reveling in the ugliness of the presentation. That is surely pathological.

Our reaction to food is quite similar to our response to Goya’s etchings. There is nothing in ordinary life more violent than the act of eating. We rend and tear at our food after it has been slaughtered, butchered, and burned to a crisp—and then we swallow it and assimilate it into our own substance. Yet we are attracted to the act of eating via the pangs of hunger and the charms of flavor and aroma. All eating represents the horrible and the grotesque. That we fail to attend to this is testimony both to our capacity for self-deception and to the talents of chefs who induce us to find pleasure in their presentations.

Food may be limited in what it can depict (although I think its limitations are exaggerated) but it is not mute when it comes to representing the ugly.

My comments on other aspects of Mariani’s argument are here.

Real Men Don’t Eat Fiddly Foods! February 15, 2012

Posted by Dwight and Lynn Furrow in Dwight Furrow's Posts, Food and Drink, Philosophy of Food.
4 comments


Cross-posted at Edible Arts

 

Esquire’s “Eat Like a Man” blog features John Mariani confidently contending that cooking is a craft, never an art.

Thus, imagination and creativity go into cooking, often at a very high level, at which point it is called haute cuisine. But there is nothing that rises to the level of true art in a craft whose very existence depends on the constant replication of a dish, night after night, week after week.

The occasion for Mariani’s diatribe against culinary art is a new book which consists mainly of pictures of:

…cooks’ hands putting the final touches on dishes — a periwinkle on tapioca, a dot of sauce on octopus, a blow torch used on cactus pads.

Given the venue, I suppose the subtext here is that real men don’t eat fiddly foods topped with periwinkles, when the carcasses of large-boned animals can be slathered with Q-sauce and washed down with a pitcher of Bud Light for a fraction of the price.

Subtext aside, Mariani’s arguments are interesting in much the same way a speech by Newt Gingrich is interesting—one shivers in anticipation of impending collapse when bluster is so perilously perched on non sequitur. So it is worth unpacking the arguments, if only for the spectacle.

With a healthy dose of charity, I can discern five arguments in Mariani’s piece:

(1) Cooking requires the constant replication of a dish and is thus inherently a reproduction; works of art are unique.

(2) Cooking is science-based and thus cannot be an art.

(3) Art can be ugly, troubling, or repulsive; food by contrast cannot be deliberately distasteful.

(4) In cooking, form must follow function. Thus, cooks must make guests happy, and this often requires simplicity and making things “taste like what they are”. In art (by implication), form is not bound to function, simplicity is not a virtue, and art is essentially about creatively modifying the object being represented, not showing it as it is.

(5) What is typically called culinary art involves extravagant display or adding decorative flourishes to traditional ingredients. This is not art because (by implication) art is not about decoration or extravagance.

There is too much misunderstanding of both cooking and the arts to reply in one blog post. So I will take up these arguments in separate posts over the next week or so.

But his first argument, that individual dishes are reproductions and thus cannot be original works, is simple nonsense. Copies of paintings are indeed mere reproductions, not original works. A print of the Mona Lisa is not a work of art because painting is an autographic art—only the painter can directly cause the work to exist, and there can be only one legitimate instance of it. But many arts are allographic—copies of an original are genuine instances of the original. My copy of Hamlet is a work of art even though it is a duplication of the original. CDs by Springsteen or performances of Beethoven are instances of works of art despite the fact that they are reproductions.

Cooking is similarly allographic. Individual dishes are instances of a recipe, just as a performance of Beethoven’s Ninth Symphony is an instance of its score. So the fact that line cooks churn out 25 copies of a dish in no way shows that cooking is not an art–unless Mariani is prepared to claim that Beethoven and Shakespeare were mere craftsmen.

The Young Brain—Why Does it Take So Long to Grow Up? January 30, 2012

Posted by Nina Rosenstand in Education, Nina Rosenstand's Posts, Philosophy of Human Nature, Science.
2 comments

Welcome to the Spring 2012 semester, where we will post occasional blog entries as our schedules and moods allow! Here is something that I think will interest those of you who are under 25, or happen to know someone who is! Finally we understand the adolescent brain, and furthermore, that the adolescent brain will last well into the young adult years these days, because what makes a brain “adult” is that it has responsibilities. Uh-oh! Does that mean some people will never grow up? Maybe…and there is a name for that: the Peter Pan Syndrome. Perhaps there will be a neurological explanation for that now…

But in the meantime, this is what professor of psychology Alison Gopnik writes in her article, “What’s Wrong With the Teenage Mind?”: Puberty is happening earlier, but adulthood seems to be delayed. So we will have to live with “teenage weirdness” longer than in past centuries.

The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again.

The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards.

Recent studies in the neuroscientist B.J. Casey’s lab at Cornell University suggest that adolescents aren’t reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults. Think about the incomparable intensity of first love, the never-to-be-recaptured glory of the high-school basketball championship.

The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.

This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience. You come to make better decisions by making not-so-good decisions and then correcting them.

In the past (from hunter-gatherers all the way up to the recent past), those two systems were in sync; they no longer are.

The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences. The pediatrician and developmental psychologist Ronald Dahl at the University of California, Berkeley, has a good metaphor for the result: Today’s adolescents develop an accelerator a long time before they can steer and brake.

This doesn’t mean that adolescents are stupider than they used to be. In many ways, they are much smarter. An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school, and there is even some evidence that higher IQ is correlated with delayed frontal lobe development….

But there are different ways of being smart. Knowing physics and chemistry is no help with a soufflé. Wide-ranging, flexible and broad learning, the kind we encourage in high-school and college, may actually be in tension with the ability to develop finely-honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies. For most of our history, children have started their internships when they were seven, not 27.

Recognize the problems of Will Hunting in Good Will Hunting? He has all the theoretical knowledge in the world, but he has no idea how to live (and doesn’t even dare to try). So what to do about it? Gopnik suggests increasing the level of varied, hands-on experience for the young person, an extended apprenticeship-adolescence with responsibilities:

Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship. AmeriCorps, the federal community-service program for youth, is an excellent example, since it provides both challenging real-life experiences and a degree of protection and supervision.

“Take your child to work” could become a routine practice rather than a single-day annual event, and college students could spend more time watching and helping scientists and scholars at work rather than just listening to their lectures. Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.

Hmmm…maybe we professors should recruit teams of secretaries and teaching assistants from among our students, for their own good?

 

 

Tasteless Philosophy November 21, 2011

Posted by Dwight and Lynn Furrow in Dwight Furrow's Posts, Philosophy.
1 comment so far

Despite being preoccupied with analyzing sensory experience, philosophers have ignored taste, smell, and touch, focusing instead on vision (and to a degree sound) as the most important sense.

Hans Jonas’s “The Nobility of Sight” is a representative example. Only vision, he argued, points us in the direction of the eternal, universal truths that have been philosophy’s concern throughout most of its history. Vision puts us in mind of the eternal because time is not essential to its functioning. When we view a landscape, we see the visual field displayed all at once, in no time; and an object can be visually identified immediately, without a sequence of appearances over time, in contrast to sound, touch, or taste, which need time to reveal the character of their objects. And visual objects have stability. We can view an object, look away, and then return to the very same object as if nothing has changed, unlike the fleeting, ever-changing objects of taste, smell, and sound.

Furthermore, Jonas argues, with vision we can see things better if we maintain a distance from them. Touch, smell, and taste require that we be intimate with the object, which increases the chances that personal bias might influence our understanding of it.

Despite their illustrious pedigree, these are very bad arguments. We learn nothing of the eternal through vision, or through any other sensory mechanism, and vision without the opportunity for subsequent confirmation, in time, would be a source of constant error. Furthermore, our sense of the stability of objects is as dependent on the sense of touch as on vision. The stability of our visual field depends on the body’s orientation in space, which is maintained, in part, by our tactile contact with solid objects.

As to the alleged objectifying distance of vision, science shows that vision involves intimate contact with physical objects–swarms of photons. And we seem just as capable of misinterpreting those photons as we are the signals from our taste buds. Recent psychological research is demonstrating the unreliability of eyewitness testimony. If anything introduces subjective bias into perceptual judgments, it is the fact that objects are often seen at a distance or under conditions otherwise unsuitable for reliable identification. Apparently, seeing is misbelieving.

At best, vision’s distance and the illusion of simultaneity allow us to spin metaphors about the eternal and universal. But misleading metaphors are bad metaphors.

There is an important contrast between vision and the other senses, however. Through vision we do gain a sense of a horizon, an area beyond our present space. This is surely important for the development of our imagination.

By contrast, sound, touch, taste, and smell root us in the here and now. Objects must be spatially and temporally present in order to affect these sensory modalities. But why should experience rooted in the here and now be uninteresting to philosophy?

If taste is philosophically uninteresting, perhaps it is because philosophers lack taste.

The Synergy of Music and Wine (or how to waste time on the Internet) November 15, 2011

Posted by Dwight and Lynn Furrow in Dwight Furrow's Posts, Food and Drink.
add a comment

Synergy occurs when two or more things function together to produce a result that they cannot achieve independently.

Synergy is essential in the world of food and wine. Good food and wine pairings are an example of synergy. Adding salt or acidity to a dish often enhances other flavors—another example of synergy.

But what about synergy between music and drink? Are there natural affinities between music and particular consumables? A new website, called Drinkify, assumes so. Enter the name of an artist you want to listen to, and a song by that artist starts playing and a drink recommendation pops up.

The idea was conjured at a recent meeting of Music Hack Day Boston, where tech geeks gather to meld software and music.

I usually ignore web-based gimmicks. But I couldn’t resist this. So I plugged in one of my favorite bands, Steely Dan, and received the recommendation to drink a bottle of red wine—topped with nutmeg? Now if you happen to like red wine and Steely Dan, I’m sure they will enhance each other, especially towards the bottom of the bottle. But is there some further connection here? The music of Steely Dan is sophisticated and complex, and some red wine is sophisticated and complex as well, but the last thing I’m going to do with a sophisticated, complex wine is sprinkle nutmeg on top! Nutmeg is a flavor note one often detects in pinot noir. I guess if all I had was a bottle of Two Buck Chuck, I could sprinkle on a little nutmeg and pretend to be tasting Burgundy. But why bother?

I’m beginning to suspect this is nonsense.

But wait. Here’s another hypothesis. Steely Dan got their name from a William Burroughs reference to a dildo in Naked Lunch. And Burroughs killed his wife trying to shoot a wine glass off her head in a drunken game of William Tell. Ah. I guess that’s the connection.

I decided to go classical and plugged in Stravinsky. The drink recommendation: Ogogoro, a Nigerian beverage distilled from the sap of palm trees. Well, Stravinsky’s Rite of Spring featured primitive themes and syncopated rhythms inspired by African music. Not bad. A bit more precise than the Steely Dan reference.

How about some Coltrane? 4 oz of red wine with the instructions to serve neat and stir vigorously. Huh?

I’m beginning to suspect random associations.

Oh, just one more. Elvis Costello. The recommendation: 8 oz. of fassionola, which is a red syrup used in bar drinks, 10 oz. of water, and 8 oz. of half-and-half.

That is just disgusting.

I can’t believe I just wasted 20 minutes on this.

Cross-posted at Edible Arts