
The Emperor’s Feast November 8, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, Food and Drink.

From the famed cookbook of Apicius, a Roman cookbook from the 4th century.

The proposed menu for a banquet:


Jellyfish and eggs; sow’s udders stuffed with salted sea urchins; patina of brains cooked with milk and eggs; boiled tree fungi with peppered fish-fat sauce; sea urchins with spices, honey, oil and egg sauce


Fallow deer roasted with onion sauce, rue, Jericho dates, raisins, oil and honey; boiled ostrich with sweet sauce; turtle dove boiled in its feathers; roast parrot; dormice stuffed with pork and pine kernels; ham boiled with figs and bay leaves, rubbed with honey, baked in pastry crust; flamingo boiled with dates.


Fricassee of roses with pastry; stoned dates stuffed with nuts and pine kernels, fried in honey; hot African sweet-wine cakes, with honey. (h/t Brian Leiter)

Does anyone know where I can get sow’s udder in San Diego?

Meanwhile back in the contemporary world, empire just isn’t what it used to be.

From Talking Points Memo:

The Cheese Industrial Complex

Here’s an article in the Times that is both disturbing and oddly comic, if darkly so. The US government is now making a major push to combat obesity. It’s the First Lady’s big cause. But for years Americans have been moving away from full-fat to reduced fat or skim milks. And this has created a surplus of whole milk and milk fat.

So what to do? While trying to get Americans to reduce fat intake and eat better, the USDA has also created a marketing arm called ‘Dairy Management’ which has the job of teaming with companies to find ways to get more cheese into consumers’ diets.

The story in the lede is about how ‘Dairy Management’ helped Dominos overcome sagging pizza sales by introducing pizzas with 40% more cheese. It’s been a rousing success and sales have doubled.

Is this progress?

Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

For political commentary by Dwight Furrow visit: www.revivingliberalism.com


A More Nuanced View of the French Protests October 25, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, politics.

We have been hearing a lot in the press about the strikes, protests, and demonstrations that have brought France to a standstill. They are expressing opposition to a government proposal to raise the age for a minimum pension from 60 to 62. Typical of the mainstream press is the following headline which appeared on the San Diego Union Tribune’s website:

The French are striking over what? Retiring at 62?

What’s French for “huh?” France is on strike, the population outraged by a proposed pension reform that would raise the retirement age. From — zut alors! — 60 to 62.

The attitude of the U.S. media has been to poke fun at those silly French who hate to work and want to retire when they’re 60 and sip Calvados on the public dime.

But, of course, as usual the mainstream press is very likely misleading the public (especially the odious right-wing UT).

Here is an alternative perspective from Bob Vallier:

Currently, a worker has to contribute to social security—which is not at all the same as American social security, in that it also includes universal health coverage, unemployment insurance, and a whole host of other social benefits that constitute the social safety net for all citizens—for 40.5 years; under Sarkozy's reform, a year would be added on to the contribution (and again, several members of the left agree that this may be necessary, and in itself is not so bad). If someone starts working in a public sector job, as for example a mechanic at the SNCF, at the age of 18, then even with the reforms, the absolute earliest they could retire would be 60. Of course, almost no one starts working at the age of 18 at such jobs, because (a) the unemployment rate among 18-to-26 year olds is the highest, at 38%, and (b) such jobs require qualifications that you can get only after at least two years of training and apprenticeship. So a minimum retirement age of 62 is mathematically realistic and fiscally responsible, and everyone knows it. That's not really the problem. The problem is that once you reach the minimum retirement age, you could retire only if you've been paying into social security for 41.5 years, and even then, you could retire only on a partial pension. You are currently not entitled to a full pension until you are 65, and under the proposed reforms, this would be raised to 67, which implies that you would not start working and contributing in full until you are 25.5, which, given unemployment rates, is by no means obvious. Any time off for disability or due to a period of unemployment between jobs—i.e., when you are not earning a salary and thus not contributing to social security—would actually count against you, forcing you to work longer.
If you do all the math, it soon becomes apparent that the real age at which you would be eligible to take your retirement would be approaching 65 or 66, while the age at which you could receive a full pension is approaching 70 or 71.  So, it’s not at all a matter of adding just two years on to the minimum retirement age; in real practice, these reforms would add between 8 and 10 years onto the time you’d have to wait before you’d be eligible for retirement at full pension. […]

After describing how France's social safety net works and the costs it imposes on employers, Vallier continues:

In the past few years, Sarko (and Chirac before him) has tried to reform social security (and again, everyone recognizes that it needs to be reformed), and the proposed reforms (which largely failed because of strikes similar to those we see today) were all about shifting the costs of social security away from employers and to employees, i.e., increasing the rate of employee contributions. Sarko and company argue that such reforms would stimulate employment, but what such reforms would mean on a practical level is that each employee would be taking home even less in real net income. So once again, the strikes today are not just about raising the minimum retirement age; they are about protecting a broad range of employee benefits, which are rightly viewed as under threat. If these present reforms succeed, then Sarko and his government will have a strong hand (even if his approval rating is a dismal 26%) to pursue other reforms in social security that will be deleterious to workers, and the various social agents (unions, etc.) will be viewed as weak, ineffectual, and unable to protect les acquis, the rights and entitlements they have all fought for. And it wouldn't be just the working class that is affected; it would be everyone. And that's why there is such strong support for the present actions.

And it turns out, according to Vallier, that the unions have proposed their own pension and social security reforms that would finance the system, but because they would be paid for by big business, they cannot get a hearing.

I have no independent knowledge of the French situation, and I am unfamiliar with the writer here, so I don't know whether all of this is accurate. But it is nuanced, unlike the drivel we get in the media.

I would not be a bit surprised if the U.S. press accounts are systematically misleading.


Welcome to the Future of Academia October 25, 2010

Posted by Nina Rosenstand in Culture, Education, Nina Rosenstand's Posts, Teaching.

Following up on the “Sad Stats” post below, perhaps it isn’t too hard to discern a trend here, in this excerpt from a recent article in the Wall Street Journal, “Putting a Price on Professors”:

As budget pressures mount, legislators and governors are increasingly demanding data proving that money given to colleges is well spent. States spend about 11% of their general-fund budgets subsidizing higher education. That totaled more than $78 billion in fiscal year 2008, according to the National Association of State Budget Officers.

The movement is driven as well by dismal educational statistics. Just over half of all freshmen entering four-year public colleges will earn a degree from that institution within six years, according to the U.S. Department of Education.

And among those with diplomas, just 31% could pass the most recent national prose literacy test, given in 2003; that’s down from 40% a decade earlier, the department says.

“For years and years, universities got away with, ‘Trust us—it’ll be worth it,'” said F. King Alexander, president of California State University at Long Beach.

But no more: “Every conversation we have with these institutions now revolves around productivity,” says Jason Bearce, associate commissioner for higher education in Indiana. He tells administrators it’s not enough to find efficiencies in their operations; they must seek “academic efficiency” as well, graduating more students more quickly and with more demonstrable skills. The National Governors Association echoes that mantra; it just formed a commission focused on improving productivity in higher education.

This new emphasis has raised hackles in academia. Some professors express deep concern that the focus on serving student “customers” and delivering value to taxpayers will turn public colleges into factories. They worry that it will upend the essential nature of a university, where the Milton scholar who teaches a senior seminar to five English majors is valued as much as the engineering professor who lands a million-dollar research grant.

And they fear too much tinkering will destroy an educational system that, despite its acknowledged flaws, remains the envy of much of the world. “It’s a reflection of a much more corporate model of running a university, and it’s getting away from the idea of the university as public good,” says John Curtis, research director for the American Association of University Professors.

Efforts to remake higher education generally fall into two categories. In some states, including Ohio and Indiana, public officials have ordered a new approach to funding, based not on how many students enroll but on what they accomplish.

You need to read the rest of the article. It’s too long to copy here, but I’ll end this post with the voice of one professor who tries to argue why it can’t all be boiled down to retention, exams, funding and profit:

Mr. Dunning teaches two classes a semester and has won several teaching awards. His salary of about $90,000 a year also covers the time he spends researching Russian literature and history. His most recent book argues that Alexander Pushkin’s drama “Boris Godunov” was a comedy, not a tragedy.

 Mr. Dunning says his scholarly work animates his teaching and inspires his students. “But if you want me to explain why a grocery clerk in Texas should pay taxes for me to write those books, I can’t give you an answer,” he says.

His eyes sweep his cramped office, lined with books. Then Mr. Dunning finds his answer. “We’ve only got 5,000 years of recorded human history,” he says, “and I think we need every precious bit of it.”

American Sins Against Socrates September 27, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, politics, Uncategorized.

David Schneider laments our lack of self-reflection:

I’ll give one thing to the demagogues – they sure know something about basic human psychology. For those of us waterboarded by the economy, we’re close to Depression desperation. It’s a commonplace that depression is “anger focused inward”; and the cheap-and-easy way out, if you’re too cash-strapped for the shrink or the meds, is to displace that anger outward to the nearest, easiest target.

O America, if there’s anything we suck at, it’s adequate self-reflection. Oh sure, we love looking at ourselves, we paragons of self-flattery on the flat screen; but thinking about ourselves (by which we mean, interrogating history) – well, that’s injurious to our self-esteem. After all, we tried it a couple times: Jimmy Carter, and what the right-wing called the “politics of resentment” in the “radical left-wing” academy of the ’80s and ’90s. Reagan’s “Morning in America,” and the Neoconservative revels after Communism’s collapse, sure showed those liberal pantywaists. The power of positive thinking. Huh.

I’ve thought a lot about the acolytes of that cipher, George W. Bush, as the last decade broke and darkened. And I thought of my father, who, as I was growing up, could do almost anything but admit he was wrong. I thought about hard-line Communists in the Politburo, as the Soviet Union dissolved: what happens when everything you’ve believed in is a lie?

When the economy collapses and your phallus is your finances, you’re getting kicked in the nuts. Pretty humiliating.

So you can actually feel really embarrassed, humiliated and ashamed – and pledge to reform, and actually reform – but that involves a lot of thinking, and gee, there’s so much to think about already. On the other hand, you can get angry. Throw that anger away from yourself, as far as you possibly can: to the Other: socialists, terrorists, illegal immigrants, and the mythical chimaera of all three, the President of the United States of America.

In Britain, August is “the silly season”; in America, we scapegoat. It’s a necessary action, according to the Old Testament – all the sins of the Israelites, placed upon a goat’s head, which is then thrown off a cliff or banished to the wilderness. It’s the prerequisite to Atonement, which Sarah Palin and Glenn Beck pantomimed before the giant of Lincoln, in the shadow of Martin Luther King, Jr. Only then, after the scapegoat is cast out, and the ceremony of Atonement is complete, can you re-establish the Covenant, and be written into the Book of Life again, as the new Republican Pledge attempts.

Tragedy is the goat’s song.

I’m theorizing here, with no more or less credence than the Beck himself. (Heck, he made bank off his conspiracy theories; why can’t I?) I’m only trying to dig into the deep substrata of our national mythologies, attempting to discover any rationale for America’s persistent avoidance of self-knowledge: that we were taken for fools. Every day, we are confronted by our own financially fatal gullibility and the deceit of our neighbors. The litany is so omnipresent, so perpetual, that we are apt to plug our fingers in our ears and shout “LA LA LA!” In the last month alone, I’m appalled to read about Nevin Shapiro, who pled guilty to defrauding investors across America of $880 million; George L. Theodule, “man of God,” who stole at least $4 million (and as much as $23 million) from his Haitian-American church congregation; Marcia Sladish, a Giants Stadium ticket collector, who collected $15 million from a Reverend Sun Myung Moon-affiliated church congregation and is now serving 70 months in prison; the trio of miscreants who, until recently, ran North Providence, R.I., blackmailing and cajoling bribes out of anyone who wanted to do a bit of honest business; and the entire city council of Bell, California, which ran their poverty-stricken town like malevolent lords over a provincial fiefdom.

It’s pretty much the same story across the board, from John Farahi in southern California to Scott Rothstein in my hometown of Fort Lauderdale: be charismatic and charming, promise the world to your fellow believers, take their money, buy some hot cars and chic restaurants and maybe a mansion or three. Beat the Johnsons. Repeat as necessary until you’re in the dock, blubbering for leniency, very LiLo-like.

It’s sickening.

And it’s easy to get angry.
It’s easy to be misanthropic.
It’s tempting to look for easy answers.

But the fact is, many of the fraudsters who’ve downed our economy are being exposed due to the diligence of the Obama administration, and quite perversely, we don’t like it.

As far back as 2004, the FBI was complaining that mortgage fraud was a major threat to the American economy. The Bush administration had shifted the vast majority of the FBI’s manpower toward counterterrorism efforts (a fact often emphasized in The Wire), leaving the agency unable to respond to financial crimes. Each year, the FBI petitioned the Bush administration for more agents; each year, the requests were denied.

Under the Obama administration, the FBI radically stepped up investigations and prosecutions of financial fraud, according to last Wednesday’s testimony before the Senate Judiciary Committee. For a mere three-and-a-half months, the FBI’s been engaged in a sweep called Operation Stolen Dreams, arresting 525 people allegedly responsible for more than $3 billion in losses. And, if you read the report, that’s just the tip of the iceberg.

We, the people, are furious (according to the mainstream media); we decry “porkbarreling” and “sweetheart deals” in Congress; we are terrified that the economy will not “recover” to its “previous level.” The fact is, the economy was never at its “previous level.” Scuppered by our own self-aggrandizement (which we euphemize as “self-esteem”) we have defrauded ourselves to believe that we are worth much more than we are. Often, we’ve deluded ourselves and others. Some of us have done so to a degree that is criminal. And those that have done so are guilty, and ashamed, and in denial, and are angry at themselves, and may well take shelter under the right wing of the tea partiers, who repent for us all, and champion the unbounded freedom to hoodwink us to our national ruin.

After all, one must protect one’s own interests. That’s the American way.

The press claims the upcoming election is a referendum on Obama’s economic plan. But Schneider is right that much of our current political debate is the politics of projection, avoidance and self-deception. The upcoming election is really a referendum on the American public and its capacity for self-reflection.


Wikipedia on Trial August 3, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, Education.

Larry Sanger is one of the founders of Wikipedia, although he quit the project because of disagreements about the quality of Wikipedia articles. He was also trained as a philosopher with a specialization in epistemology, and thus has an interesting perspective on some of the problems of using Wikipedia as a source of knowledge.

Here is an excerpt from his Slate interview:

Why did you feel so strongly about involving experts?

Because of the complete disregard for expert opinion among a group of amateurs working on a subject, and in particular because of their tendency to openly express contempt for experts. There was this attitude that experts should be disqualified [from participating] by the very fact that they had published on the subject—that because they had published, they were therefore biased. That frustrated me very much, to see that happening over and over again: experts essentially being driven away by people who didn’t have any respect for those who make it their lives’ work to know things.

Where do you think that contempt for expertise comes from? It seems odd to be committed to a project that’s all about sharing knowledge, yet dismiss those who’ve worked so hard to acquire it.

There’s a whole worldview that’s shared by many programmers—although not all of them, of course—and by many young intellectuals that I characterize as “epistemic egalitarianism.” They’re greatly offended by the idea that anyone might be regarded as more reliable on a given topic than everyone else. They feel that for everything to be as fair as possible and equal as possible, the only thing that ought to matter is the content [of a claim] itself, not its source.

It seems to me that this conflict between amateurs and experts boils down to a conflict between egalitarianism and credibility. You gestured toward this conflict in an essay on Edge.com, where you wrote, “It’s Truth versus Equality, and as much as I love Equality, if it comes down to choosing, I’m on the side of Truth.” Do you find that it really is a zero-sum game: that, as a practical matter, we need to choose between these two goods?

I doubt very much that it’s a zero-sum game. I think it’s absolutely a great thing that people regardless of their credentials can contribute to the shaping of knowledge. And I think we have to creatively design ways of recognizing both the value of amateur work, on the one hand, and the objective value of the knowledge of people who are experts in various fields.

The idea behind Wikipedia is that by pooling information held by multiple authors truth will emerge in the marketplace of ideas. No planner or centralized authority is necessary because multiple authors will be self-correcting. If one author makes a mistake, other authors will notice the mistake and correct it.

But as Sanger points out, it is not obvious that Wikipedia actually works that way. The loudest or most persistent voice is not necessarily the voice of truth. The idea that a talented amateur is in a position to trump the judgment of experts who have spent years studying a subject is a modern but pernicious conceit.


Anti-Americanism Explained June 30, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, Food and Drink.

Via Huffington Post

The video below has been making the rounds, and for good reason. In it, an American tourist (YouTube user simoneharuko) visits what she calls “pretty much the coolest grocery store of all time” in Alexanderplatz, Berlin, and finds something she had never seen before: an American ethnic section. As Eater pointed out, “most of this stuff seems to be there for expatriates who want brands they recognize.”

Here is the video.


And here is the list of products included in the “U.S.A” ethnic food section:

  • Swiss Miss Hot Chocolate mix
  • Cans of V8
  • Hershey’s Chocolate Syrup (original and “Shell” style)
  • Maple syrup
  • Regular old syrup
  • Betty Crocker Baking Mixes: Blueberry, Chocolate Chip Cookie, Brownie, Cake, Muffins, Bisquick
  • Betty Crocker frosting: Vanilla and Chocolate
  • Five (5) Pain Is Good Hot Sauce varieties
  • Jim Beam Barbecue Sauce, Steak Sauce, Hot Sauce, and Mustard
  • Four (4) Jack Daniel’s Barbecue Sauces
  • Paul Prudhomme “Magic” Seasoning blends
  • Paul Newman salad dressings
  • Hellman’s Mayonnaise
  • Wish Bone Blue Cheese Dressing
  • Marshmallow Fluff (original and strawberry)
  • Kraft Macaroni & Cheese
  • Cheese Zip (cheese whiz)
  • Head Country Barbecue Sauces
  • Bull’s Eye Barbecue Sauce
  • Hunt’s Barbecue Sauces
  • Cheddar, Nacho, and Jalapeno-flavored squeeze-bottled cheese
  • Mustards
  • Heinz sweet relish
  • Crisco shortening
  • Marshmallows
  • Campbell’s Soups

There is not much here worth eating.

I’ve just returned from Spain and Portugal, and I have spent some time in Italy and Germany as well. If there is one thing Europeans do well it is eat. If you are an ex-pat American living in Europe do you really pine for this stuff? You really want cheese whiz when you can have a nice Allgau Emmentaler?

Of course, our ethnic food sections don’t look much like a real market either. But much of what one finds there is at least edible and sometimes interesting.

A box of Kraft macaroni and cheese might induce all manner of speculation about the “American character” deficit.


Is the Future Over? June 20, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, Science, Technology.

William Gibson thinks maybe so:

Say it’s midway through the final year of the first decade of the 21st Century. Say that, last week, two things happened: scientists in China announced successful quantum teleportation over a distance of ten miles, while other scientists, in Maryland, announced the creation of an artificial, self-replicating genome. In this particular version of the 21st Century, which happens to be the one you’re living in, neither of these stories attracted a very great deal of attention.

In quantum teleportation, no matter is transferred, but information may be conveyed across a distance, without resorting to a signal in any traditional sense. Still, it’s the word “teleportation”, used seriously, in a headline. My “no kidding” module was activated: “No kidding,” I said to myself, “teleportation.” A slight amazement.

The synthetic genome, arguably artificial life, was somehow less amazing. The sort of thing one feels might already have been achieved, somehow. Triggering the “Oh, yeah” module. “Artificial life? Oh, yeah.”

New devices are cool; new human possibilities with new meaning? Eh. Not so much.

Alvin Toffler warned us about Future Shock, but is this Future Fatigue? For the past decade or so, the only critics of science fiction I pay any attention to, all three of them, have been slyly declaring that the Future is over. I wouldn’t blame anyone for assuming that this is akin to the declaration that history was over, and just as silly. But really I think they’re talking about the capital-F Future, which in my lifetime has been a cult, if not a religion. People my age are products of the culture of the capital-F Future. The younger you are, the less you are a product of that. If you’re fifteen or so, today, I suspect that you inhabit a sort of endless digital Now, a state of atemporality enabled by our increasingly efficient communal prosthetic memory. I also suspect that you don’t know it, because, as anthropologists tell us, one cannot know one’s own culture.

The Future, capital-F, be it crystalline city on the hill or radioactive post-nuclear wasteland, is gone. Ahead of us, there is merely…more stuff. Events. Some tending to the crystalline, some to the wasteland-y. Stuff: the mixed bag of the quotidian.

The future used to be a place of radically new promises and perils, game changers made possible by science. But he welcomes this new realism.

This newfound state of No Future is, in my opinion, a very good thing. It indicates a kind of maturity, an understanding that every future is someone else’s past, every present someone else’s future. Upon arriving in the capital-F Future, we discover it, invariably, to be the lower-case now.

As he points out (and he should know), science fiction is more about present hopes and fears than it is about the future.

If you are a William Gibson fan, his comments on his own writing career and his forthcoming new book are quite interesting.

If Pattern Recognition was about the immediate psychic aftermath of 9-11, and Spook Country about the deep end of the Bush administration and the invasion of Iraq, I could say that Zero History is about the global financial crisis as some sort of nodal event, but that must be true of any 2010 novel with ambitions on the 2010 zeitgeist. But all three of these novels are also about that dawning recognition that the future, be it capital-T Tomorrow or just tomorrow, Friday, just means more stuff, however peculiar and unexpected. A new quotidian. Somebody’s future, somebody else’s past.


The Gaze of Empathy June 1, 2010

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature, Teaching.

In the midst of scientific reports that humans in general are far more empathetic than selfish (at least by nature) we all of a sudden hear that college students are less empathetic now than in generations past.

 Researchers analyzed data from studies conducted between 1979 and 2009, and found the sharpest drop in empathy occurred in the last nine years.

 For instance, today’s students are less likely to agree with statements like, “I sometimes try to understand my friends better by imagining how things look from their perspective” and “I often have tender, concerned feelings for people less fortunate than me.”

According to one of the lead researchers, Ed O’Brien, “It’s harder for today’s college student to empathize with others because so much of their social lives is done through a computer and not through real life interaction.”

So some researchers blame computers and the social sites like, yes, Facebook. You can communicate about yourself endlessly, without being expected to reciprocate (“Thanks for asking about my day—how was yours?”). But one comment, from “Cricket,” on the article quoted above really adds something to the discussion:

A fellow storyteller noticed that this year’s Master of Library Science class in storytelling (don’t laugh — good storytelling and story collecting involves a huge amount of research) didn’t make eye contact. This is an affluent group of white females — a culture in which eye contact has always been considered appropriate. (In some cultures it’s an invasion of privacy.) After discussing it with them, she learned they didn’t realize eye contact was appropriate. I remember parents and teachers used to insist on it: “Look at me when I’m talking to you / when you’re talking to me.” Since then, they have said that her class is more friendly than others, and it’s the only class where they socialize together after class.

That comment triggered a veritable Aha-moment for me, because I have observed the same phenomenon in my classes, increasingly, over the past decade: there are some students who hide and avoid eye contact because they haven’t studied the material. That’s nothing new—we’ve all done that when we were in school. And then there are students from some non-Western cultures who may have been taught that it is rude to look a person of authority straight in the eye. So cultural differences can account for some incidents.  But when good students with a Western cultural background are avoiding eye contact, it gets interesting. Increasingly I have students who bring their laptops or their Kindle devices to class. Some instructors prohibit such devices, I don’t—yet. I just ban non-class-related activity. And what I see is those students—the good ones— being utterly absorbed by what it is they’re watching, or doing, on the screen. Usually it’s note taking, and not game-playing (and I check!)…. But even when you take notes, you’re supposed to look up once in a while and look at the instructor performing his or her stand-up routine there in front of you. We’re not just standing up there at the whiteboard to repeat a lesson, like Tivo on a 3-D TV—we’re actually there to create a teaching moment from scratch every day, and some of it is improv! What creates the most significant difference between a classroom experience and an online course is the face-to-face encounter with questions and ideas. But without the basic eye contact participation you might as well be at home behind your screen, taking an online course (which has its merits, but the face-to-face learning moment isn’t one of them). When I have told my students that I expect eye contact from them, they have—to my enormous consternation—been surprised. 
And now I realize that they simply may not be accustomed to eye contact being appropriate, because they have grown up frequently—maybe even primarily—communicating electronically with their peers. The first generation in the history of humanity for whom eye contact is no longer the first clear human outreach? Now that is fundamentally frightening. The gaze of the Other is fundamental to many 20th-century philosophies—in particular Sartre’s, who sees it (by and large) as a competition, and Levinas’s, who sees it as humanity looking right at you, asking for your empathy. Look at Vermeer’s “Girl with a Pearl Earring,” the picture I use for my “Gravatar,” as well as for the cover of my book, The Human Condition:

[Image: detailed view of the girl’s face]

This is the face of the Other. She is looking right at you, with the gaze of a human being, real and timeless. She expects a response. But if we withhold our gaze and think that’s normal, well, then there is no empathy coming forth.

More on Facebook and Privacy May 17, 2010

Posted by Dwight Furrow in Culture, Dwight Furrow's Posts, Ethics, Technology.
Tags: , , ,

Nina’s post about privacy on Facebook thoroughly covered the issue.

But Facebook’s habit of thumbing its nose at privacy concerns has provoked a couple of interesting posts on Crooked Timber as well.

Apparently, Mark Zuckerberg, founder and owner of Facebook, is quoted in a forthcoming book making some dismissive remarks about privacy concerns:

“You have one identity,” he emphasized three times in a single interview with David Kirkpatrick in his book, “The Facebook Effect.” “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly.” He adds: “Having two identities for yourself is an example of a lack of integrity.”

As Henry at Crooked Timber points out:

Facebook appears to be deliberately and systematically making it harder and harder for people to vary their self-presentations according to audience. I think that this broad tendency (if it continues and spreads) impoverishes public life.

Kirkpatrick explains what is wrong with this:

Individuals are constantly managing and restricting flows of information based on the context they are in, switching between identities and personas. I present myself differently when I’m lecturing in the classroom compared to when I’m having a beer with friends. I might present a slightly different identity when I’m at a church meeting compared to when I’m at a football game. This is how we navigate the multiple and increasingly complex spheres of our lives.

And Kieran Healy argues that having integrity is not about having a consistent self-presentation:

Having an identity and having a secret are in fact quite closely related, and not just for superheroes. Here’s a piece from the Times from the pre-FB era that makes the point:

“In a very deep sense, you don’t have a self unless you have a secret, and we all have moments throughout our lives when we feel we’re losing ourselves in our social group, or work or marriage, and it feels good to grab for a secret, or some subterfuge, to reassert our identity as somebody apart,” said Dr. Daniel M. Wegner, a professor of psychology at Harvard. … Psychologists have long considered the ability to keep secrets as central to healthy development. Children as young as 6 or 7 learn to stay quiet about their mother’s birthday present. In adolescence and adulthood, a fluency with small social lies is associated with good mental health. … The urge to act out an entirely different persona is widely shared across cultures as well, social scientists say, and may be motivated by curiosity, mischief or earnest soul-searching. Certainly, it is a familiar tug in the breast of almost anyone who has stepped out of his or her daily life for a time, whether for vacation, for business or to live in another country. “It used to be you’d go away for the summer and be someone else, go away to camp and be someone else, or maybe to Europe and be someone else” in a spirit of healthy experimentation, said Dr. Sherry Turkle, a sociologist at the Massachusetts Institute of Technology. Now, she said, people regularly assume several aliases on the Internet, without ever leaving their armchair …”

This idea that it is dishonest or insincere to withhold information about oneself is fundamentally mistaken. Social life isn’t enhanced by brutal honesty and integrity is not about having a single self-presentation.

Integrity is a matter of consistently acting on the basis of one’s system of values and sustaining the value of the variety of things we care about. Not only is that consistent with having different self-presentations in different contexts—integrity requires a variety of self-presentations.

If I value my students and their education some facets of my private life will be irrelevant or inimical to their development. And if I value my family relationships, my self-presentation as a teacher must at times be suppressed.

But Zuckerberg does provide us with an example of a lack of integrity. As one commenter on Crooked Timber puts it:

Hey, you know what really is a lack of integrity is trying to conceal very obvious monetary motives behind a veneer of moralizing. How much more honest would it be if Zuckerberg just came out and said, yeah, we don’t give a damn about your privacy, this is how we’re going to make money. Then we could all know where we stand. The worst aspect of all of this is the pretense that anyone on Facebook’s corporate end cares about this and their projection of their own moral deficiencies onto people with legitimate privacy concerns. Not that I’m, like, surprised or anything.

It is easy for a straight, privileged man like Zuckerberg to extol the virtues of a single identity while hiding behind his bodyguards and wealth. Women and anyone from marginalized social groups cannot afford to be so sanguine about privacy. But of course straight, privileged men tend to think they are the only people who matter.

Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

For political commentary by Dwight Furrow visit: www.revivingliberalism.com


Facebook–Where Everyone Knows Your Name May 6, 2010

Posted by Nina Rosenstand in Culture, Current Events, Ethics, Nina Rosenstand's Posts.
Tags: , , , ,

An Indiana woman’s home was burglarized recently, while she was at a concert. The culprits turned out to be Facebook “friends”; she had announced, online, that she’d be at the concert. With friends like that, we surely don’t need any enemies, as the old saying goes. Facebook, along with MySpace and Twitter, is one of the institutions in which a generation may see itself mirrored and reach self-comprehension, and it is a fascinating phenomenon, socially, psychologically and philosophically. Most of my students, and most of my friends’ kids, have Facebook pages, and I see the amazing accumulation of “friends” displayed on their sites—in some cases thousands. I think it probably compares to “counting coup” in the Old West—a new form of collection mania, or a transition rite of adolescence (such as collecting phone numbers that you’ll never, ever call—as if they’re proofs of friendship). I assume that everybody knows this is just a new term for temporary, occasional contacts, and not genuine friends, but even so, words are seductive, and some of these contacts get to know a wealth of details about each other that I (coming from a different time and place—I’m kind of a time traveler; we all are, the older we get) would reserve for perhaps only two or three people in my entire life. A friend, to me, is someone you do activities with (according to Deborah Tannen: the male friendship model), and/or talk about big and small things with (Tannen: the female model), or both. It doesn’t have to involve proximity: a friend can be a good friend even if you don’t see them for years. Online/phone contact makes up for physical presence in many of our current friendships. On the other hand, people you see every day and deal with on a superficial level are acquaintances, not friends. So I am not a big fan of the friending phenomenon online, or of the social websites where some people spend part of their social life—perhaps even all of it.

However, I, too, have a Facebook page, and there is nothing, absolutely nothing personal on it, on or behind the Wall. I don’t check it very often, because I don’t maintain it to accumulate friends. From time to time I get “friend” requests from strangers, and I ignore them. But quite often I get such requests from students—former and present. I appreciate the (presumably) amicable intent, and I don’t want to seem rude and alienate nice people—but on the other hand, sharing personal information with students is downright unprofessional for an instructor, and may even be construed as professionally unethical: are you more “friends” with the students on your Facebook page than with the students who aren’t on your “friends” list? That could lead to the suspicion of preferential treatment of some students. In addition, it may in some cases invite trouble: some people can’t tell the difference between a real Friend and a Facebook contact, and they don’t know where the line should be drawn. So I don’t add anyone as a friend who is not either a real old face-to-face friend or a colleague I know personally, and on my page I state specifically that I don’t add students as friends.

But this issue goes way beyond such personal choices in changing times: it illustrates the new questions arising about how much and when to make oneself available to friends, to students, colleagues, teachers, and the world in general—because this is not an innocent world. Years ago, when I was the same age as the students now collecting friends on Facebook, we loved Carlos Castaneda’s Don Juan books, about the old brujo teaching the young anthropologist the secrets of power (as some of us suspected, most of those books were, shall we say, fantasies rather than actual anthropological reporting). One ground rule was, loosely paraphrased: Don’t give away too much information about yourself. The more you spread your information out there, the less control you have over your life. Now, Carlos wanted to use this rule for a deeper understanding and use of the powers of the mind, but I’d say that it is a pretty good rule to bring back in these days when privacy is becoming a thing of the past. Our intimate information will soon be out there anyway, whether through ubiquitous webcams, health records online, tax records online, or other means. And enterprising people—with or without political and legal legitimacy—will be able to mine all that information for power and profit. It is already happening. Why add to it by sharing details about your life, simply for narcissistic reasons? Facebook is being challenged by U.S. lawmakers over changes to its privacy policy that would allow Facebook members other than your friends to access personal information about you—but even if Facebook restricts access to “Friends,” that would not be much protection when people add “friends” indiscriminately as a form of trophy collecting, and share details about their lives with untold strangers because it feels good. In addition, the phenomenon of phishing is getting increasingly sophisticated.
This excerpt comes from a blogger who is a regular user of Facebook, Dan Tynan from ITWorld:

I still have a dozen other group invitations from various friends. I don’t trust any of them now. I don’t even want to click “ignore” on the odd chance it will somehow corrupt my account and spam all 700-odd people in my FB posse. So this spam attack has effectively killed that feature for me. And if spammers can manipulate Facebook’s group recommendations that easily, imagine what they could do to Facebook’s plan to butter “Like” buttons all over the Web.

We’ll see much more of this erosion of privacy in the future. So your old Professor Cautious recommends: think twice before you share your personal information with selected friends and accumulated strangers on Facebook and elsewhere in Cyberspace…

PS  The latest development from The Atlantic: The Facebook Privacy Wars Heat Up.

PPS May 11: In case you were in doubt: here’s what’s been going on since December, according to Wired Epicenter (long and informative article):