
Russell Means in Memoriam: Mitaku Oyasin October 25, 2012

Posted by Nina Rosenstand in Culture, Nina Rosenstand's Posts, Philosophy of Gender, Political Philosophy, Teaching.

Many of my students have heard me talk about Russell Means over the years. A complex man in complicated times, he believed he saw a simpler solution to the culture war he felt still existed between the American Indian communities and mainstream America. A man who had his own vision, and sometimes version, of history. Russell Means passed away October 22 from throat cancer, and the spectrum of Americans in the public eye has lost a unique dimension.

So who was Russell Means? An Oglala Sioux Indian, with many different facets to his life. The American Indian activist Means was the chief organizer of the second Wounded Knee uprising in 1973, and was involved in various American Indian movements. His activist accomplishments are outlined here. The politician Russell Means ran as the vice presidential candidate on Larry Flynt’s presidential ticket in 1983, and sought the Libertarian presidential nomination in 1987, losing to Ron Paul. The actor Russell Means was a significant Hollywood presence, playing the iconic character Chingachgook in the 1992 movie The Last of the Mohicans—and providing a special ambiance to the 2004 season of the HBO series Curb Your Enthusiasm. His film credits are numerous, as you can see on the Wikipedia site. The businessman Russell Means ran a website where he told his story, and the story of the plight of the American Indian, and sold CDs, art, and t-shirts in support of the American Indian cause. And (this is where my connection to him comes in) the lecturer Russell Means would travel around the United States to college campuses, educating new generations of students in what became his own version of American Indian history. He was a speaker at San Diego Mesa College in the 1990s, and that was where I met him. (And I should also mention: the private citizen Russell Means had domestic problems, and was arrested for assault and battery against his father-in-law back on the Navajo reservation. And he had more severe legal problems earlier on when he was indicted for murder on a reservation, but acquitted.) And his image is familiar to Andy Warhol fans—Warhol painted Means 18 times.

But back to the lecturer persona. I wish I could tell you exactly when Means visited the Mesa College campus, but I don’t see any references to his visit on the Mesa College website, and all I remember is that it was in the early years when I was first teaching Philosophy of Women; so: the late 1990s. He was scheduled to give a talk in the room behind the cafeteria, and I decided to bring one of my classes to the talk. The room was packed with people sitting on chairs, standing, and sitting on the floor, and Means, six feet tall or more, hair in braids, was an imposing sight and a gifted speaker. He kept on talking past the class period, so I ran back to the classroom, collected the students waiting for my next class, and brought them down to the cafeteria; Means was still talking. And he kept on talking for a good four hours: about what it is like to be an American Indian, about his battles and his careers, about American Indian traditions, about discrimination and near-genocide, and about the term “Indian” itself. He shocked most of us PC college people by declaring that he didn’t mind being called an Indian, and that the proper term to use was either “American Indian” or the tribal name, such as Oglala Sioux Indian, but never Native American. He said it was a term invented by Washington for funding/political purposes (which is why I, to this day—and my students and readers of my books can verify that—always use the term American Indian). And the term Indian itself? Wikipedia (below) got his argument right, but whether it is also historically correct is something I can’t determine (and I have yet to find a historian who agrees with Means):

Since the late 20th century, there has been a debate in the United States over the appropriate term for the indigenous peoples of North America. Some want to be called Native American; others prefer American Indian. Means said that he preferred “American Indian”, arguing that it derives not from explorers’ confusion of the people with those of India, but from the Italian expression in Dio, meaning “in God”.[17][18] In addition, Means noted that since treaties and other legal documents in relation to the United States government use “Indian”, continuing use of the term could help today’s American Indian people forestall any attempts by others to use legal loopholes in the struggle over land and treaty rights.

In addition he talked about what I referred to above as a culture war between the mainstream Euro-American tradition and the American Indian peoples. The Euro-Americans, he said, are a culture of swords and violent domination, while the American Indians are a sharing culture, a culture of partnerships symbolized by a bowl or drinking cup. At that point my ears perked up, because I had just been reading Riane Eisler’s book The Chalice and the Blade, about the ancient gynocentric (female-oriented) partnership cultures of Old Europe symbolized by the chalice, vs. the invading patriarchal dominator cultures worshipping the “lethal Blade.” So I asked Means if there was a connection, and he said, “Yes, there is this woman who wrote a book about the same phenomenon in Europe, and it fits the situation in this country between the indigenous peoples and the invaders.” From a gender-philosophical standpoint I found it fascinating that he would adopt the Eisler theory as an explanatory model for the American Indian culture (even if Eisler is also considered an activist and in no way a historian). I tend to be skeptical of such arguments, which simplify very complex matters and fan an ongoing (and possibly outdated) tension and enmity, and I see no reason to find Eisler’s theory more historically accurate simply because Means liked it; but I found the confluence of research, activism, and tradition intriguing. If you want to experience him talking about the topic of partnership cultures and gynocentric (matriarchal) cultures, watch this YouTube clip.

Means ended his 4-hour lecture at Mesa by teaching his audience the end of every Oglala and Lakota prayer: two words that embrace all of creation, everywhere and for all time: Mitaku Oyasin: We are all related. And while much of his lecture was, to a scholar such as myself, a creative journey into personal interpretations rather than facts (and sometimes interpretations that were hard to swallow), his passionate sincerity rang true, and has stayed with me as a cherished memory. And his prayer still comes to mind sometimes when I’m looking for connections and common ground rather than analytical differences. So: Thanks for the lessons, Russell, and Mitaku Oyasin…

Cross-posted at Rosenstand’s Alternative Voice blog for Rosenstand’s Mesa students.


Facebook Revisited–New Policies for Professors April 25, 2011

Posted by Nina Rosenstand in Culture, Education, Nina Rosenstand's Posts, Teaching.

It’s taken a while, but there is finally a growing realization among professors that “friending” their students is not such a good idea.  And school administrators are certainly also catching on. This from The Guardian (UK):

Teachers are being warned not to “friend” pupils on Facebook amid concerns over the blurring of boundaries between school staff’s professional and private lives.

In a fringe meeting at the National Union of Teachers’ annual conference on Sunday, teachers were told that pupils are getting access to potentially embarrassing information about teachers on their Facebook pages, while headteachers and school governors are increasingly using information posted on social networking sites to screen candidates for jobs.

Karl Hopwood, an internet safety consultant and former headteacher, told the NUT fringe meeting: “The line between private life and professional life is blurred now because of social media.”

The same concerns extend to the world of college professors and students, who share a daily environment—but on a professional level, not a personal one. That distinction needs to be reestablished in this age of social media, regardless of what Mark Zuckerberg may think about the declining value of the concept of privacy. I talked about the subject on this blog last year, where I explained my take on professors friending students (and got a great deal of very interesting comments), and my concerns have only been confirmed in the past year. In the real world you have to be able to distinguish between who is your colleague, who is your client (for lack of a better word), who is your acquaintance, and who is your Friend…and then all the others who are just faces on Facebook.

Welcome to the Future of Academia October 25, 2010

Posted by Nina Rosenstand in Culture, Education, Nina Rosenstand's Posts, Teaching.

Following up on the “Sad Stats” post below, perhaps it isn’t too hard to discern a trend here, in this excerpt from a recent article in the Wall Street Journal, “Putting a Price on Professors”:

As budget pressures mount, legislators and governors are increasingly demanding data proving that money given to colleges is well spent. States spend about 11% of their general-fund budgets subsidizing higher education. That totaled more than $78 billion in fiscal year 2008, according to the National Association of State Budget Officers.

The movement is driven as well by dismal educational statistics. Just over half of all freshmen entering four-year public colleges will earn a degree from that institution within six years, according to the U.S. Department of Education.

And among those with diplomas, just 31% could pass the most recent national prose literacy test, given in 2003; that’s down from 40% a decade earlier, the department says.

“For years and years, universities got away with, ‘Trust us—it’ll be worth it,'” said F. King Alexander, president of California State University at Long Beach.

But no more: “Every conversation we have with these institutions now revolves around productivity,” says Jason Bearce, associate commissioner for higher education in Indiana. He tells administrators it’s not enough to find efficiencies in their operations; they must seek “academic efficiency” as well, graduating more students more quickly and with more demonstrable skills. The National Governors Association echoes that mantra; it just formed a commission focused on improving productivity in higher education.

This new emphasis has raised hackles in academia. Some professors express deep concern that the focus on serving student “customers” and delivering value to taxpayers will turn public colleges into factories. They worry that it will upend the essential nature of a university, where the Milton scholar who teaches a senior seminar to five English majors is valued as much as the engineering professor who lands a million-dollar research grant.

And they fear too much tinkering will destroy an educational system that, despite its acknowledged flaws, remains the envy of much of the world. “It’s a reflection of a much more corporate model of running a university, and it’s getting away from the idea of the university as public good,” says John Curtis, research director for the American Association of University Professors.

Efforts to remake higher education generally fall into two categories. In some states, including Ohio and Indiana, public officials have ordered a new approach to funding, based not on how many students enroll but on what they accomplish.

You need to read the rest of the article. It’s too long to copy here, but I’ll end this post with the voice of one professor who tries to argue why it can’t all be boiled down to retention, exams, funding and profit:

Mr. Dunning teaches two classes a semester and has won several teaching awards. His salary of about $90,000 a year also covers the time he spends researching Russian literature and history. His most recent book argues that Alexander Pushkin’s drama “Boris Godunov” was a comedy, not a tragedy.

 Mr. Dunning says his scholarly work animates his teaching and inspires his students. “But if you want me to explain why a grocery clerk in Texas should pay taxes for me to write those books, I can’t give you an answer,” he says.

His eyes sweep his cramped office, lined with books. Then Mr. Dunning finds his answer. “We’ve only got 5,000 years of recorded human history,” he says, “and I think we need every precious bit of it.”

Sad Stats About Community Colleges October 21, 2010

Posted by Nina Rosenstand in Education, Nina Rosenstand's Posts, Teaching.

For those of us who spend our everyday professional lives teaching at 2-year/community colleges in California, it usually gives us a warm and fuzzy feeling to think about the importance of the services we render: a ground-level quality education (mostly!) that allows large numbers of hopeful, skillful young and youngish people to pursue their dreams and seek further higher learning, good careers, and a fulfilling life by transferring to 4-year colleges. We believe we add to that elusive concept of flourishing that happiness is all about by channeling students into further academic studies. But apparently we need to do a reality check. According to a study by the Institute for Higher Education Leadership & Policy at Cal State Sacramento,

Seventy percent of students seeking degrees at California’s community colleges did not manage to attain them or transfer to four-year universities within six years, according to a new study that suggests that many two-year colleges are failing to prepare the state’s future workforce.

Conducted by the Institute for Higher Education Leadership & Policy at Cal State Sacramento, the report, released Tuesday, found that most students who failed to obtain a degree or transfer in six years eventually dropped out; only 15% were still enrolled.

In addition, only about 40% of the 250,000 students the researchers tracked between 2003 and 2009 had earned at least 30 college credits, the minimum needed to provide an economic boost in jobs that require some college experience.

And the affirmative action efforts seem not to be working, either:

There were also significant disparities in the outcomes of black and Latino students. Only 26% of black students and 22% of Latino students had completed a degree or certificate or transferred after six years, compared to 37% of whites and 35% of Asian Pacific Islanders.

Latino students were half as likely as white students to transfer to a four-year university — 14% versus 29% — and black students were more likely than others to transfer to private, for-profit institutions without obtaining the credits needed for admission to the University of California or Cal State.

So why is this happening?

Students face many barriers, including not being prepared for college-level study, as well as financial, work and family constraints. Black and Latino students, the study notes, are more likely to have attended segregated and overcrowded elementary and high schools and to have had less access to highly qualified teachers and counselors. But some community college campuses do a better job than others, and the research found that students who pass college-level math and English early in their college careers and complete at least 20 credits in their first year of enrollment had higher rates of success.

Ah, here comes the recommendation:

The study encourages community colleges to improve data collection about enrollment patterns and student progress and also calls for a new state funding model that rewards schools when students complete degrees and transfer.

The community colleges already are putting more emphasis into ensuring that students master basic math and English skills early in their college careers, she said. Legislation signed this year also establishes an associate’s degree that will provide more seamless transfer of community college students to UC and Cal State University. Under another new law, the community colleges’ Board of Governors will create a task force to consider ways to improve retention and degree attainment.

This sounds proactive and nice, and in many ways it is, but those of us in the CC trenches read an additional subtext: a demand for even more faculty time spent on managing and monitoring students, in a field where the workload is already heavy (5 classes taught per semester, plus school-related work). I will need to hear more about that report—but off the cuff it seems to me to be an argument for more faculty involvement in class management and paperwork, rather than more emphasis on what we are good at doing, and like to do: teaching good classes and firing up fresh minds, thus improving retention and hopes of transfer. So now I’m really depressed.

Study Hard–But How? September 20, 2010

Posted by Nina Rosenstand in Nina Rosenstand's Posts, Teaching.

All you good people who started fall classes recently—you think you know how to study? Well, maybe your teachers and professors have been wrong all along. According to psychologists, studying in a quiet room without any distractions doesn’t give you particularly good memory retention; neither does studying the same material twice. It’s good for short-term memorization, but not for the long term, and as we all know, you’re supposed to learn for life, not for school—non scholae, sed vitae. In a recent New York Times piece by Benedict Carey, “Forget What You Know About Good Study Habits,” we learn that some of the things we instructors have bent over backwards trying to incorporate into our classroom style are, apparently, nothing but BS.

Take the notion that children have specific learning styles, that some are “visual learners” and others are auditory; some are “left-brain” students, others “right-brain.” In a recent review of the relevant research, published in the journal Psychological Science in the Public Interest, a team of psychologists found almost zero support for such ideas. “The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing,” the researchers concluded.

So what really works? In the classroom setting, the student should be exposed to several kinds of stimuli:

Varying the type of material studied in a single sitting — alternating, for example, among vocabulary, reading and speaking in a new language — seems to leave a deeper impression on the brain than does concentrating on just one skill at a time.

In addition, the article points out that what really works, after the student has studied the material once, is being faced with a hard test afterwards!

Of course, one reason the thought of testing tightens people’s stomachs is that tests are so often hard. Paradoxically, it is just this difficulty that makes them such effective study tools, research suggests. The harder it is to remember something, the harder it is to later forget.

Pretty interesting article. The ultimate learning environment is one of a multitude of sensory impressions? Such as a classroom with a window to the outside, a textbook/laptop in front of you, a bit of PowerPoint stuff once in a while, some discussion, some lecturing? Sounds like a normal, modern classroom setting to me, so perhaps we’re doing something right. But I can’t really warm up to these sweeping ideas that Everything You Think You Know is Wrong. Learning has been going on for a long time in the history of humanity, and folk wisdom tells us that there really is no shortcut. Learning takes time and effort. Maybe not the kind of effort we used to think was necessary, but a generation of school kids has served as guinea pigs for new learning methods that apparently aren’t so useful after all, either. Some people just have the knack for learning anything and everything, and the rest learn how to learn when exposed to the right material, the right teacher, and the right motivation. And as we all know, teachers have vastly different teaching styles. If you, the motivated philosophy student, can hook up with the instructor whose teaching style appeals to you, and challenges you the most, then good for you. But regardless of whether you’ve found the perfect stand-up philosopher or not, you’d better expect the learning situation to consist mainly of reading and analyzing texts, and being tested on them, regardless of whether the learning process is accompanied by nice views, soft music, jokes in a foreign tongue, or whatever else the psychologists can think of…

Intellectual Illness September 16, 2010

Posted by Dwight Furrow in Dwight Furrow's Posts, Philosophy, Teaching, Uncategorized.

Laurie Fendrich reported on a classroom experience that indicates a pervasive intellectual illness:

The other day, during a class I was teaching on Leonardo da Vinci, the subject of how we know what we know about the artist came up. During the discussion, a student casually asserted, without rancor or even a touch of political commentary, that he thought it a “good possibility that Obama was a Muslim.” Another student nodded in agreement. Might be true, might not, they seemed to be saying. I got the distinct feeling that they thought that in their openness to the “possibility” that Obama was a Muslim, they were demonstrating their general openness to ideas—something I, as their professor, would be pleased to see. […]
Like everyone who pays attention to things, I know very well about the Pew study from this past August showing nearly one in five Americans think Obama is a Muslim. Why shouldn’t a few of those one in five Americans show up in a college classroom?
The students I encounter in my courses generally work hard and want to do well in college. They are intelligent. They want to learn things. But the “critical thinking” that’s been touted for the past several years seems to be yielding students who think that it’s a waste of time to think about things in terms of whether they are true or false.  Instead, many seem to be learning that the proper and best attitude toward everything they encounter is doubt, and that nonstop doubt is the equivalent of being open-minded.
Some of the blame for this can be laid at the feet of modernism, which celebrated doubt. Modernist doubt grew out of philosophical skepticism, on the one hand, and on the other hand, the disastrous 20th century, which took the wind out of the sails of Western civilization in the minds of many. (Two vicious world wars that killed millions, it seems, caused certain sensitive people to question the civilization that brought them on.) Sometimes, the reaction to modern doubt is to retreat to certainty—God said it, I believe it, that settles it.  Oftentimes, the reaction is increased tolerance—openness to new ideas and a tolerance of others who are different. With many college students today, however, it seems to mean simply giving credence to anything, no matter how absurd. In absorbing the lesson that there are limits to reason, they’re concluding that more or less nothing can be ruled out by reason. Their philosophy can be summed up in these words:  “I’m just saying, who knows?”
They’re cool with amorphous ideas that contain no rigor. Why try to figure it out? Maybe some people can talk to the dead; global warming may or may not be true; Princess Diana possibly was murdered; maybe 400 mcg’s of folic acid, taken three times a day, makes you smarter. Or maybe Obama is a Muslim. Who knows?

Skepticism can be an aid to critical thinking. It is essential to philosophy and to science. It drives inquiry forward by sustaining the uncomfortable feeling of doubt and discouraging the premature leap to an unwarranted conclusion. But it is only a virtue if it is accompanied by a fierce commitment to seek the truth, as it was for Socrates or Descartes. Without a commitment to truth, skepticism is toxic, an invitation to intellectual laziness, boredom, and ultimately a stultifying inability to act.

Fendrich is right to be concerned. […]

While higher-education critics are diligently trying to figure out how much students are not learning and how much it’s costing taxpayers (and the students and their parents) not to learn anything, we’re facing a slow meltdown of knowledge—an insidious, ongoing event, on a colossal scale, whose consequence is unpredictable. As surely as knowledge disappeared in the past by the burning of the great library at Alexandria, and the loss of ancient languages (in Europe in the late middle ages), knowledge can be destroyed by the attitude of indifference. If knowledge collapses into nothing but, “Maybe, who knows?” we’ll end up longing for the glory days when students lovingly visited Wikipedia in order to find the truth.

I am afraid we often encourage the skepticism but leave out the bit about the pursuit of truth. That is a tragic error.

Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

For political commentary by Dwight Furrow visit: www.revivingliberalism.com

The Gaze of Empathy June 1, 2010

Posted by Nina Rosenstand in Culture, Ethics, Nina Rosenstand's Posts, Philosophy of Human Nature, Teaching.

In the midst of scientific reports that humans in general are far more empathetic than selfish (at least by nature) we all of a sudden hear that college students are less empathetic now than in generations past.

Researchers analyzed data from studies conducted between 1979 and 2009, and found the sharpest drop in empathy occurred in the last nine years.

For instance, today’s students are less likely to agree with statements like, “I sometimes try to understand my friends better by imagining how things look from their perspective” and “I often have tender, concerned feelings for people less fortunate than me.”

According to one of the lead researchers, Ed O’Brien, “It’s harder for today’s college student to empathize with others because so much of their social lives is done through a computer and not through real life interaction.”

So some researchers blame computers and social sites like, yes, Facebook. You can communicate about yourself endlessly, without being expected to reciprocate (“Thanks for asking about my day—how was yours?”). But one comment, from “Cricket,” on the article quoted above really adds something to the discussion:

A fellow storyteller noticed that this year’s Master of Library Science class in storytelling (don’t laugh — good storytelling and story collecting involves a huge amount of research) didn’t make eye contact. This is an affluent group of white females — a culture in which eye contact has always been considered appropriate. (In some cultures it’s an invasion of privacy.) After discussing it with them, she learned they didn’t realize eye contact was appropriate. I remember parents and teachers used to insist on it: “Look at me when I’m talking to you / when you’re talking to me.” Since then, they have said that her class is more friendly than others, and it’s the only class where they socialize together after class.

That comment triggered a veritable Aha-moment for me, because I have observed the same phenomenon in my classes, increasingly, over the past decade: there are some students who hide and avoid eye contact because they haven’t studied the material. That’s nothing new—we’ve all done that when we were in school. And then there are students from some non-Western cultures who may have been taught that it is rude to look a person of authority straight in the eye. So cultural differences can account for some incidents. But when good students with a Western cultural background are avoiding eye contact, it gets interesting. Increasingly I have students who bring their laptops or their Kindle devices to class. Some instructors prohibit such devices; I don’t—yet. I just ban non-class-related activity. And what I see is those students—the good ones—being utterly absorbed by what it is they’re watching, or doing, on the screen. Usually it’s note taking, and not game-playing (and I check!). But even when you take notes, you’re supposed to look up once in a while and look at the instructor performing his or her stand-up routine there in front of you. We’re not just standing up there at the whiteboard to repeat a lesson, like TiVo on a 3-D TV—we’re actually there to create a teaching moment from scratch every day, and some of it is improv! What creates the most significant difference between a classroom experience and an online course is the face-to-face encounter with questions and ideas. But without basic eye-contact participation you might as well be at home behind your screen, taking an online course (which has its merits, but the face-to-face learning moment isn’t one of them). When I have told my students that I expect eye contact from them, they have—to my enormous consternation—been surprised.
And now I realize that they simply may not be accustomed to eye contact being appropriate, because they have grown up frequently—maybe even primarily—communicating electronically with peers. The first generation in the history of humanity where eye contact is no longer the first clear human outreach? Now that is fundamentally frightening. The gaze of The Other is fundamental to many 20th century philosophies, in particular Sartre’s, who sees it as (by and large) a competition, and Levinas’s, who sees it as humanity looking right at you, asking for your empathy. Look at Vermeer’s “Girl with a Pearl Earring,” the picture I use for my “Gravatar,” as well as for the cover of my book, The Human Condition:

[detail of Vermeer’s “Girl with a Pearl Earring”]

This is the face of the Other. She is looking right at you, with the gaze of a human being, real and timeless. She expects a response. But if we withhold our gaze and think that’s normal, well, then there is no empathy coming forth.

Educating Teachers March 7, 2010

Posted by Dwight Furrow in Dwight Furrow's Posts, Education, Teaching.

I am skeptical of education reform in this country.

One sort of reform wants to spend more money to improve education, despite the fact that throwing money at the problem hasn’t worked. The other sort wants to use standardized tests to measure teacher performance, institute merit pay for successful teachers, and fire the unsuccessful ones. But this assumes that teachers have the knowledge and skills to teach well but are just too lazy to do the job without a financial incentive. This is a wholly unwarranted assumption that has circulated among right-wing, anti-union groups for years and has now escaped into the mainstream, apparently influencing the Obama Administration.

Lack of motivation is not the problem. Most teachers are dedicated people who care deeply about their students. Teaching complex material to unprepared, unmotivated, distracted students will always be a difficult challenge at best. But we have to get better at it if our society is to flourish. Punitive measures are not sufficient.

Most recent attempts to find models of education that work involve cherry-picking the best teachers, administrators, and students, putting them together with adequate funding and some new, bright idea about curriculum, and then pointing to their success as evidence that—what? Well, I guess that good students will learn from good teachers. But we already knew that.

The problem with these experiments is that they are not scalable. We need thousands of new teachers each year to teach millions of students. Thus, neither the teachers nor the students will be the “cream of the crop”. Educational policy cannot be about hiring the best and the brightest—we need too many teachers for that. Among a workforce of millions of teachers, there will be some good ones and some bad ones. But rewarding the good ones and firing the bad ones will have little impact on outcomes. What matters is the average teacher. A successful educational policy will get average people to perform to the best of their ability.

This article in the New York Times Magazine is interesting because it reports on new research in teacher training that actually might do some good.

Working with Hyman Bass, a mathematician at the University of Michigan, Ball began to theorize that while teaching math obviously required subject knowledge, the knowledge seemed to be something distinct from what she had learned in math class. It’s one thing to know that 307 minus 168 equals 139; it is another thing to be able to understand why a third grader might think that 261 is the right answer. Mathematicians need to understand a problem only for themselves; math teachers need both to know the math and to know how 30 different minds might understand (or misunderstand) it. Then they need to take each mind from not getting it to mastery. And they need to do this in 45 minutes or less. This was neither pure content knowledge nor what educators call pedagogical knowledge, a set of facts independent of subject matter, like Lemov’s techniques. It was a different animal altogether. Ball named it Mathematical Knowledge for Teaching, or M.K.T. She theorized that it included everything from the “common” math understood by most adults to math that only teachers need to know, like which visual tools to use to represent fractions (sticks? blocks? a picture of a pizza?) or a sense of the everyday errors students tend to make when they start learning about negative numbers. At the heart of M.K.T., she thought, was an ability to step outside of your own head. “Teaching depends on what other people think,” Ball told me, “not what you think.”
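To make the third grader’s answer concrete: 261 is what you get from the classic “smaller-from-larger” misconception, where the student subtracts the smaller digit from the larger one in every column instead of borrowing. Here is a small Python sketch of that buggy procedure (my own illustration, not anything from Ball’s research; it assumes the first number has at least as many digits as the second):

```python
def buggy_subtract(a, b):
    """Simulate the common 'smaller-from-larger' error: in each
    column, subtract the smaller digit from the larger digit and
    never borrow. Assumes a has at least as many digits as b."""
    da = str(a)
    db = str(b).rjust(len(da), "0")  # pad b with leading zeros
    digits = [abs(int(x) - int(y)) for x, y in zip(da, db)]
    return int("".join(str(d) for d in digits))

print(buggy_subtract(307, 168))  # the student's answer: 261
print(307 - 168)                 # the correct answer: 139
```

Column by column: |7 − 8| = 1, |0 − 6| = 6, |3 − 1| = 2, giving 261. A teacher who recognizes this pattern can diagnose the misconception at a glance.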

The idea that just knowing math was not enough to teach it seemed legitimate, but Ball wanted to test her theory. Working with Hill, the Harvard professor, and another colleague, she developed a multiple-choice test for teachers. The test included questions about common math, like whether zero is odd or even (it’s even), as well as questions evaluating the part of M.K.T. that is special to teachers. Hill then cross-referenced teachers’ results with their students’ test scores. The results were impressive: students whose teacher got an above-average M.K.T. score learned about three more weeks of material over the course of a year than those whose teacher had an average score, a boost equivalent to that of coming from a middle-class family rather than a working-class one. The finding is especially powerful given how few properties of teachers can be shown to directly affect student learning. Looking at data from New York City teachers in 2006 and 2007, a team of economists found many factors that did not predict whether their students learned successfully. One of two that were more promising: the teacher’s score on the M.K.T. test, which they took as part of a survey compiled for the study. (Another, slightly less powerful factor was the selectivity of the college a teacher attended as an undergraduate.)

Ball also administered a similar test to a group of mathematicians, 60 percent of whom bombed on the same few key questions.

The whole article is worth reading. But what stands out is the recognition that teachers need to know more than subject matter and educational theory—the two main elements of teacher training. They also need a detailed understanding of how unformed minds can misunderstand the subject matter.

I suspect that the difference between an experienced teacher and an inexperienced one is that the experienced teacher has a wealth of information about what students find hard in the subject she teaches.

Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

For political commentary by Dwight Furrow visit: www.revivingliberalism.com

Only 6% of Scientists are Republican August 6, 2009

Posted by Dwight Furrow in Dwight Furrow's Posts, Science, Teaching.
Tags: , ,
1 comment so far

Professors in the humanities and social sciences tend overwhelmingly to be liberal and to vote Democratic, according to some studies. Unsurprisingly, conservatives have accused academia of discrimination in its hiring practices, although no evidence of overt discrimination has surfaced.

It is more difficult to plausibly charge discrimination in light of this recent study, via Leiter Reports:

A new Pew study finds that just 6% of scientists are Republicans.

How would partisan politics enter into hiring decisions regarding biochemists, physicists, or astronomers?

As Brian Leiter remarks:

So this raises the question whether other factors are at work in explaining political party affiliation.  A serious investigation of the question would have to consider what role intelligence, emotional or psychological health, and/or bigotry play in explaining why the Republican Party can attract only a small minority of intellectuals and scholars to its ranks any longer.


Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

or Visit the Website: www.revivingliberalism.com

Ethicists Are Not Ethical June 18, 2009

Posted by Dwight Furrow in Dwight Furrow's Posts, Ethics, Philosophy Profession, Teaching.
Tags: , ,

Via Inside Higher Ed:

According to a paper written by two philosophy professors, Eric Schwitzgebel of the University of California at Riverside and Joshua Rust of Stetson University, a college professorship in ethics does not necessarily translate into moral behavior. At least, that’s what the people who work with ethicists say.

Their results:

Most of the 277 survey respondents reported no positive correlation between a professional focus on ethics and actual moral behavior. Respondents who were ethicists themselves shied away from saying that ethicists behave worse than those outside the discipline – generally reporting that ethicists behave either the same or better – but non-ethicists were mostly split between reporting that ethicists behave the same as or worse than others. Even those ethicists who did rank their peers’ behavior as better than average said their moral behavior is just barely better than average – hardly a ringing endorsement.

I don’t find this surprising. Why think that people who study ethics are morally superior to people who don’t? Are psychologists more mentally stable than non-psychologists? Are chemists better cooks?

One of the paper’s authors goes on to express some doubt about whether ethics courses improve students’ behavior:

“People do sometimes justify ethics courses on the assumption that taking ethics courses will improve students’ behavior down the road,” Schwitzgebel said, noting legal and business ethics as examples, although they are separate from ethics courses in the philosophy department. “I think there is a potential this line of research could undercut the justification for those classes.”

But, as Schwitzgebel was quick to point out, his study does not imply that. The jump from ethics professors’ immoral behavior to students’ benefiting (or not) from ethics courses is a long one to make, he said.

I think there is some confusion here. People who behave well tend to be well-motivated. But theorizing about ethics probably has little influence on motivational states. People who lack moral motives because they are narcissistic, excessively selfish, authoritarian, etc. will not acquire moral motives through theoretical reasoning. (My Kantian friends might disagree.)

However, if a person is well-motivated, studying ethical theory can give her the tools to think more clearly and consistently about ethical behavior. Studying ethics makes well-motivated people better; scoundrels will need more than a finely-honed argument to get well.

For an entertaining debate about this, head over to Crooked Timber.


Dwight Furrow is author of

Reviving the Left: The Need to Restore Liberal Values in America

or Visit the Website: www.revivingliberalism.com