Escape from nowhere: more reasons for community-centered schools

Higher IQs but lower test scores? What’s going on?

From World War II until now, the average American IQ rose by more than 15 points. That’s a startling change. “The average child in 2010 would have been exceptional in 1950,” said Mark Bauerlein, senior editor of First Things, in “The Troubling Trend of Cultural IQ.”

Kids are smarter now but they can’t read as well (as they did in 1950).

What’s even more startling is that as those historic gains were occurring, school performance as measured by standardized tests plummeted. Both college professors and employers are struck by how many students and younger workers are “terribly deficient” in basic knowledge and skills. Although test scores have been quite static since 1980 (despite massive commotion due to a series of “reform” initiatives beginning in 1983), from 1962 to 1980 scores on the SAT verbal exam dropped a shocking fifty-four points. That loss has never been made up.

The number of incoming college freshmen who need remediation has kept climbing, and the numbers are now 10% at selective schools, 30% at typical colleges, and 60% at two-year schools. The National Assessment of Educational Progress (NAEP), our best benchmark for educational improvement or decline, has shown small gains in basic reading skills by young children, but these do not translate into measurable gains by high schoolers trying to read adult literature. Bauerlein said this is because “the reading tests include passages with diction exceeding the gains made in elementary school.”

So why haven’t large gains in IQ led to any improvement in academic performance? Bauerlein said this is easily understood by drilling down into the IQ data. IQ tests consist of several subtests that measure different mental functions, such as memory, attention, or spatial reasoning. Over the years, changes in the various subtests have varied dramatically. What is crucial to understand in relation to academic proficiency is that students’ performance on the subtests for arithmetic and vocabulary has been essentially flat. This is consistent with what the NAEP shows. From 1972 to 2002, general information knowledge scores showed no improvement and vocabulary moved only minimally. Students today are no more capable of comprehending difficult texts than they were before decades of “school reform.” Most are not ready for either college or the modern workplace.

What the school reform movement has made clear—after the initiatives and the remedial classes and the revised curriculums and the literacy coaches—is that there are no magic bullets or quick fixes. Perhaps we should think harder and commit more deeply, because the verbal reasoning that fell in the 1960s and 1970s is vital for civic engagement in any setting amid the marketplace of ideas, including universities and the professional and managerial workplace. As things stand now, those higher IQ scores are not helping people to evaluate the rhetoric of a Barack Obama or a Donald Trump, or to perceive the veiled bias of a news story, or to comprehend the moral distance between the competing claims of pop culture movements. They aren’t helping mothers and fathers find wisdom amid the sea of blarney that washes over citizens in the information age.

Youth culture can isolate adolescents from adult voices.

Another interesting fact Bauerlein points out is that adults have shown gains in knowledge and vocabulary as measured by the Wechsler Adult Intelligence Scale (WAIS). This is most likely because many of them have attended college and taken classes in such core subjects as literature, history, psychology, economics, and science. This raises an obvious question: why, then, haven’t their children shown gains? We would expect larger vocabularies and more knowledge to affect both the reading and the conversation of adults, which should create a richer intellectual context in the home for their children. So we would expect rising rather than flat test scores for their children.

Why hasn’t this happened? One interesting possibility is that high schools themselves isolate teenagers from the adult intelligence that might otherwise surround them. According to New Zealand social scientist James R. Flynn (whose studies brought widespread attention to the rising IQ scores), since the 1950s a teenage subculture has developed that insulates young people from “adult speak.” Adolescents hang out together, adopting their own idiom, fashions, mores, movies, and music—creating what the great education researcher James Coleman called “the adolescent society.” An uncharitable observation would be that our teenagers are failing to learn very much because they are cooped up in high schools all day.

In 1909, fewer than 9% of Americans graduated from high school, the rest moving quickly into an adult-centered society. But by 1960, about 70% of teenagers stayed in high school all four years. They saw each other all day in classes, in the halls, at the cafeteria, and they made after-school plans. A youth subculture formed and the authority of adult voices waned. This matters because the lingo of youth culture is less sophisticated than adult conversation, less rich in the content knowledge grownups use to make sense of their world. Teens immersed in youth culture tend to have stunted vocabularies and thin knowledge of art, politics, economics, history, religion, science, and philosophy. The language and the facts such young people most need to act intelligently in the world (not to mention to score well on standardized tests) are not often present in the company that dominates their world.

Some schools mimic the liturgy of rock concerts in their design of assemblies.

Many schools no longer offer much resistance to youth culture. A visit makes it clear that many schools are shaped and formed more by pop culture than by whatever academic communities survive in our universities. Many are adopting a marketing approach, trying to offer whatever “sells” in the youth market. This makes perfect sense to anyone whose main intellectual context is pop culture. Some administrators have begun to mimic the liturgical form of the rock concert for school assemblies. The lobbies are full of propaganda, somewhat resembling the Capitol in the Hunger Games films.

In response to a widely perceived sorry state of affairs, the goal of the Common Core State Standards, a Gates Foundation-funded initiative, was to prepare low- and middle-income students for the rigors of a college education. Predictably, it crashed upon the reality that a college curriculum is presented in language beyond the reach of many students. Slogans such as “every child can learn” and “no child left behind” have no effect on the fact that the College Board sets college readiness at a score of 1180 on the SAT but we’ve only managed to get 10% of seventeen-year-olds reading at that level.

We push college for everyone, so now more students than ever begin college, but graduation rates have been stuck in the low thirties, suggesting an intellectual barrier we have learned no way to breach. So large numbers of first-year students pay college-level fees for remedial courses but cannot stick it out till graduation, leaving without diplomas but with unconscionable levels of debt.

Is hope justified?

Is there a solution? Bauerlein doesn’t offer one. He observes that “parents and mentors need to spend more time conversing with youths, reading the newspapers together, going on cultural outings. . . and adding grownup affairs to the menu of adolescence.” But he recognizes that saying such things isn’t a solution. “The parents and mentors inclined to heed our exhortations probably already recognize the problem and strive to restrain it—they don’t need our advice—while the others haven’t the space to listen or the disposition to act.”

American society has operated for decades now on flawed understandings of what is best for adolescents. “Few things in this world,” he said, “have stronger momentum than cultural mores and values that settle into people’s heads as the way reality operates.”

The great need, to the extent that Bauerlein is right, is for teens to spend more time talking with adults about grownup matters. I’m at least as skeptical as Bauerlein is that we are going to get to such a society—where high schoolers’ performance is a match for their IQs—anytime soon. I’m quite sure that yet another argument with reasons and statistics is not going to have much influence on schools. The trouble, if that’s what it is, arises in the culture from which today’s Americans get their notions of what is worth wanting, what is worthy of effort, and what the point of all our striving might be. A rock star influences pop culture, and thus school climate, more by intoning “We don’t need no education” than a professor does by publishing the latest article in Educational Leadership.

My personal experience

Students in St. Ignatius, Montana, interview Hermann Detert in his home as part of an oral history project.

I’ve earned my skepticism through hard work and money spent. Over a dozen years I spent more than $8 million promoting a different vision for schooling. Working with the Library of Congress and the Liz Claiborne and Art Ortenberg Foundation, I directed the Heritage Project, enlisting 34 Montana high schools to reconnect high schoolers with the people in the community who were doing the adult work of building and sustaining communities. The heart of the project was having those adults assist students with collaborative research on real concerns in real places. The way forward was to escape from nowhere—the abstract curriculum and impersonal teaching championed by people from away, sitting at a big table in the convention center.

I wrote a book based on that experience. At that time, I talked about “community-centered” teaching practices (which were a form of pushback against the “student-centered” teaching that dominated professional training at the time). The romantic urge to cater to the fast-moving attention of high schoolers was very strong among teachers and administrators, and in many conversations about how to advance the work I encountered little interest in including students in the circle of grownups talking about larger and more enduring concerns. We have too many adults trying to join the conversations of youth culture rather than trying to bring teenagers into adult conversations. Eudora Welty wisely observed that “To cater to is not to serve, and it’s not to love very well either.”

I began with a lot of optimism. “Montana’s future is being decided right now in its 176 public high schools,” I said. “They are foundational institutions. If they fail, none of our economic or cultural developments will succeed.” My optimism grew in part from an “integrating vision” that I observed growing in the nation—one that both Democrats and Republicans supported. I thought I saw a grassroots movement spreading through America, going by many names: character education, civic education, service learning, and place-based instruction. I tried to unify these various movements under the phrase “community-centered teaching.” At the heart of these various approaches was a simple and unifying insight: we cannot separate education from the community (a corollary was that community development and school improvement are two sides of the same coin).

It seemed to me that various strands of this insight led to an equally simple conclusion: we can revitalize our high schools by making the study of community their central organizing principle. This would mean offering classes that study our civic institutions as they have developed in time and as they are practiced in the real world of our particular communities. It would mean studying history and ecology by including local illustrations. It would mean providing every student opportunities to study ways the local community interacts with its ecological, geographical, business, and historical contexts. Every subject could inject real life into its curriculum by considering what the community had to teach–either by good example or bad. It’s a truism that the only place the universe can actually be studied is locally. There need be nothing narrow or parochial about local studies (though the danger of failing to link local findings to the larger issues is real).

Such studies could go beyond textbook abstractions into detailed examinations of such topics as the role of forests in local economies and in watersheds or the engineering constraints for local water and sewer systems. Working with state and local agencies, students might conduct feasibility studies for businesses or sociological comparisons of varying cultural practices and their impacts on health. They might study historical effects of immigration or infrastructure projects on particular people.

It was hardly a secret that such approaches had been called for repeatedly by leading educational researchers. High school students are at the developmental stage when they are beginning to form communities, which is why they tend to be so cliquish. The most important educational need of adolescents is to be guided into intelligent explorations of community in all its aspects. One great risk for youth in today’s America is intellectual and spiritual capture by one of the unintelligent communities, real or virtual, that surround young people and compete for their allegiance. Gangs are only the worst example. Young people are hard-wired to join, and if intelligent communities are unavailable or unattractive then stupid ones will do.

Furthermore, we know that classroom instruction unrelated to real situations often does not lead to understanding or the ability to transfer knowledge from the classroom to the world. It was my faith, confirmed by the work of many excellent teachers, that when young people use academic skills to analyze real issues in the world they know, they move from dull abstractions to deep learning.

They also create social capital. Through the 1950s, one teacher in Pennsylvania connected his high school seniors with local officials to research aspects of the local community. Thirty years later researchers tracked down these students to see whether the experience had measurable long-term effects. The results were stunning. Students who had been involved in local studies in high school were four times more likely than other students to have joined voluntary associations.
By tackling the real issues in their communities alongside committed adults, those students felt a part of the community. They learned to find meaning in shared work. They developed a commitment to civic engagement that lasted throughout their lives. “Imagine the impact on Montana’s future if every student in every high school had similar opportunities,” I said.

I thought of it as a beginning. As schools became more community-centered, communities would become more education-centered. All our agencies, public and private, could have parts to play. Television stations, artists, newspapers, tribal elders, museums, parks, clubs, businesses, chambers of commerce, grandparents, and cowboys could re-examine their roles, seeing what resources they could contribute to the work of engaging our youth in understanding the world in which we make our place. It didn’t seem too much of a stretch: lots of agencies have already figured out they can’t fulfill their missions without educating the public.

What we needed, I thought, was leadership in building suitable frameworks for collaboration. The phrase “citizen science” wasn’t common then, but today I would point to Cornell’s fabulous eBird project, which is channeling the data provided by an army of nonscientist birders into huge computers that are forming a much more complex and fluid picture of our world than has been available before. We need more such projects, with support for high school teachers. I suggested that university researchers could guide rigorous research projects into local communities and ecosystems, using high school classes in a variety of ways. This would involve training teachers, but also guiding local projects and sending graduate students into the field to help students gather, organize, preserve, and interpret their field data.

Scientists with the Long-Term Ecological Research Network had used students to assist with cutting-edge scientific problems. In one project, classes at a high school in Seattle and at one in Tuscaloosa took measurements of temperature, pH, dissolved oxygen, nitrate, phosphate, total dissolved solids, total bacteria counts and net primary production while a group of scientists measured the same variables at a pristine site in Antarctica. This allowed researchers both to follow what was happening at each site and to make cross-site comparisons.

Through my work, the Library of Congress gained experience using high school students to collect oral histories of veterans throughout the nation. Its experience with the Heritage Project led it to create the ongoing Veterans History Project, modeled on the work we did in Montana. High schools and other community organizations are invited to conduct historical research and document contemporary aspects of community life for the Library’s permanent collections. At that time, I said that “Our educational leaders should be talking in earnest about what research can be undertaken in collaboration with high schools, and our communities should be talking in earnest about what informational infrastructure they need to build, starting with the schools.”

The vision entertained the possibility that when most high schools in Montana were involved in linked, statewide research projects through the universities, our libraries and museums and other cultural institutions, as well as our land management agencies, our students’ educations would get a powerful boost at the same time as we would all get useful information in an accessible form. Most information in the information age is local, because we need detailed local knowledge for our own purposes. Foresters prepare prescriptions for specific sites, based on careful study and historical data. Entrepreneurs conduct original research that closely examines possibilities at particular locations. I know what roses grow well in that spot just north of the two blue spruce trees.

“Montana, and every community in Montana, needs to study itself extensively if it is to thrive,” I said. “No one else will do it for us.”

It’s how we survive and thrive

To a great degree, the issue is bigger than what we usually mean by “education.” The global economy doesn’t—can’t—care what happens here, though it’s become a habit to associate education with the global economy—mainly because the people who benefit most from globalization also tend to be manipulating our laws and institutions for their own benefit. We need to remember that the global economy is never going to have a place for all of us. This will become more and more the case as the robotics revolution proceeds. The global economy needs to be augmented by robust local economies, and it is in the interactions of local economies that we develop our social connections, find the dignified and important roles that make our lives matter, decrease our vulnerability to the restructurings that are routine in global markets, and make it more likely that we will be able to find fresh vegetables and plumbers.

“Most of Montana’s economy will always be local,” I said. “More than anything, Montana needs a generation of educated young people who understand the places they live and want to stay, and who have an entrepreneurial spirit, confidence, and commitment to finding new ways to live well. To develop a thriving local economy, we need to develop a thriving local culture of people who are self-aware, committed to mutual support, and prepared to inquire and learn.

“By organizing our high schools around local studies, we can create what we need.” I still think that’s true. And more than ever, I think saying so is unlikely to make much difference. But then, some things take time.

Lessons Learned

My experiences have suggested several insights, none of them earth-shaking:

A student visits with philanthropist Art Ortenberg at a Youth Heritage Festival in the state capitol. The active participation of Art, and his wife Liz Claiborne, was helpful for getting the state’s major cultural institutions on board, including the Office of Public Instruction, the Montana Historical Society, the Montana Committee for the Humanities, and the Montana Arts Council.

1. The imprimatur of prestigious institutions such as the Library of Congress affects school administrators in ways that tightly reasoned professional publications with footnotes and everything do not. School-level leaders adopt programs more readily when doing so involves meeting famous people or hearing that they may find opportunities for professional advancement. Schools are more often led by careerists than by scholars (though the two categories are rarely mutually exclusive).

2. Prestigious institutions are hard to enlist in education initiatives but are not so hard to bribe with promises of foundation money and “public/private partnerships.” Art Ortenberg suggested approaching recalcitrant officials by using “the force of money.”

3. Students believe things are important more readily when prestigious leaders say they are important. They will work harder for recognition (and the chance to travel) than they will to raise their SAT scores. Great things happen when they are invited to do something that matters, supported as they work at it, and then recognized far and wide for what they accomplish.

4. It’s best to work only with teachers who have voluntarily joined. Teachers who are only pretending to be on board (a routine schoolish tactic) are like sludge in the machinery. They use up scarce resources (mostly time) to no real purpose.

5. Teachers respond best to leadership from beyond the school when they are led to form enduring teams with considerable control over ways to incorporate the principles espoused by the outside agency. Regular face-to-face meetings with the other team members are a necessary part of the work.

6. Developing the vision and learning how to collaborate are the “secrets” to accomplishing enduring change. They remain secrets in spite of being broadcast from rooftops because both are hard to do well. Everything worth doing is difficult, at first and for a while.

7. High schools aren’t actually necessary for the real work. It’s just that right now that’s where the young people are. This is helpful to keep in mind now that there are signs that they are dissolving.


Witch hunt! “It’s dangerous to believe” –Part 2

Review: Mary Eberstadt, It’s Dangerous to Believe: Religious Freedom and Its Enemies

Eberstadt’s understanding of our culture war is that it’s a moral panic—the same pattern as the Salem witch trials, the McCarthy hearings, and other purity crusades in which people aflame with self-righteousness destroyed others without good evidence.

Proof of transgression resides not in actual evidence but whether the accusations issue from a socially-approved class of inquisitors.


In chapter 2, she lays out the case that attacks on Christians in contemporary America are similar to the day-care panic of 1983, or the McCarthy hearings of the 1950s, or the witch trials of Salem in 1692. People believe things that are not true and act on the basis of imagined evidence. She cites Stacy Schiff, author of a recent book on the Salem trials: “We too have been known to prefer plot to truth; to deny the evidence before us in favor of the ideas behind us; to do insane things in the name of reason; to take that satisfying step from the righteous to the self-righteous.”

She has in mind “ubiquitous shouts of ‘bigot’ and ‘hater’ aimed at people who harbor newly impermissible opinions about marriage.” She cites many examples of “the targeting of believers in workplaces, on campuses, and elsewhere,” noting that “today’s secularist campaign abounds with one element essential to all witch hunts: inquisitorial zeal.” Activists indulge in “moral irrationalism” to accuse people who hold unpopular beliefs in the name of making society a “safer” place. “Under this new dispensation, ‘bigot’ and ‘hater’ are the new ‘wizard’ and ‘witch.’”

Since the 1960s there has been a sea change of belief about the moral structure of the universe and a fundamental belief of the new morality is “self-will.” The master ethic is “doing what you want.” So it follows that “traditional moral codes represent systems of unjust repression.” Yesterday’s sinners “have become the new secular saints,” and yesterday’s sins are now virtues, “positive expressions of freedom.”

She sees that the primary battleground in the larger conflict between cultures is in attitudes about sex. Of the many movements swirling together in the cultural revolution of the 1960s, it is the sexual revolution that has become the absolutist core of the new faith. Most of the saints of secular modernity have been warriors in the sexual revolution:

. . .proselytizers for abortion and contraception, like Margaret Sanger and Helen Gurley Brown and Gloria Steinem; crypto-scholastics whose work is revered by generation after generation of the faithful and off-limits for intellectual revisionism, such as Alfred Kinsey and Margaret Mead; quasi-monastic ascetics, like the grim public priestesses of the National Abortion Rights Action League and Planned Parenthood and Emily’s List, fighting to end the pregnancies of other women; and even foreign “missionaries,” in the form of representatives within progressive charities and international bureaucracies—those who carry word of the revolution, and the sacraments of contraception and abortion, to women in poorer countries around the world.

The logic of the revolution is not exactly Aristotelian, Eberstadt says. “Syllogisms include ‘if you are against abortion, therefore you are anti-woman’; ‘if you believe in Christian teaching, therefore you hate people who endorse same-sex marriage.’” But fallacious reasoning has never been fatal to revolutionary passion.

Actors in the era of political correctness have become timid about doing anything that might inflame the anti-Christian forces that monitor them. Alastair Bruce, whose job it was to ensure the historical accuracy of the popular television series Downton Abbey, admitted that a paramount concern was hiding the religious practice that was so much a part of daily life in the early twentieth century. For example, the show never depicts the beginning of a meal, because it would have been unthinkable for such characters to have begun eating without saying grace. Bruce worried that showing such details would have induced a “panic.”

Religion is perceived “as menacing laissez-faire sexual morality.” Christianity’s historical morality has celebrated sex within marriage and condemned all sex outside of marriage, but “the sexual revolution. . .is the centerpiece of a new orthodoxy and a new morality that elevates pleasure and self-will to first principles. This has become, in effect, a rival religion.”

It is the religious zeal of the new faith that leads Eberstadt to see parallels with old Salem. She observes that Facebook offers 58 gender options for American users but “priests cannot use the title ‘Fr.’ on their personal pages, and are shut down if they attempt to—even though Facebook’s official policy is that people should use the names they are known by, and even though most Catholic priests are known as ‘Father.’”

Such forms of banishment make sense to people under the influence of what psychologists and economists call “herd behavior,” where “large numbers of people act the same way at the same time.” Many universities have become zones of herd-like conformity: “99 percent of the faculty and staff at Princeton University who donated to presidential candidates gave to Barack Obama. In 2016, 91 percent of Harvard’s faculty donations went to Hillary Clinton.” Such panics are unified by their common mythology. Hugh Trevor-Roper said of the European witch craze that “the mythology created its own evidence, and effective disproof became ever more difficult.” People are believed to be “bigots” or “phobic” simply by virtue of being religious believers.

Once someone is accused by a Puritan minister or a crusading congressman, the accused faces the difficult logical task of proving a negative. It’s not simple to prove such claims as “I am not a witch.” “I am not committing ritual blood libel.” “I am not controlling the media/Pentagon/banks.” “I am not a hater.” And for true believers, such proof would not be persuasive. “In Western societies today, as in Salem, ‘proof’ of transgression—in this case, against newly built orthodoxy concerning the sexual revolution—resides not in actual evidence of wrongdoing; but rather in whether the accusations issue from a socially-approved, priestly class of inquisitors.”

Some people played along with the trials in Salem hoping to avoid being accused themselves. Something similar is likely true in America today. And those who are not immediately in the dock have reason to be afraid. An interesting fact about revolutionary purges and witch hunts is that formerly “safe” inquisitors do end up facing accusers themselves. Revolutions do devour their children, as a journalist watching the end stages of the French Revolution observed. The revolutionary fervor either advances or it dies, and the way it advances is by expanding the list of sins and the list of enemies. In Salem at the end, Minister Samuel Parris found himself the object of the fury he helped unleash.

At the present moment, we see transgender activists turning their ire toward formerly esteemed feminists, such as Germaine Greer, for her brazen insistence that surgery cannot make a man into a woman, thus violating the new orthodoxy. Andrew Sullivan, one of the first leaders of the same-sex marriage movement, has recently argued that “religious freedom is fundamental to this country,” for which a Twitter mob named him “offensive, misogynist, and transphobic.”

Eberstadt uses history to better grasp what is happening, and her knowledge of history also gives her faith that the current moral panic will pass. “Within just a few years of hanging the last witch, a new social consensus formed according to which the entire episode had been a massive injustice,” she said. “Less than a hundred years later, John Adams would write that the trials were a ‘foul stain’ on the country, and almost everyone else would henceforth agree. Cotton Mather, for all his other accomplishments—he was the first to introduce inoculation to the New World, among other innovations—would nonetheless go down through the centuries as one of history’s villains.”

A few thoughts on planning an oral history project in China

I just returned from Changsha, China, where I was invited to a conference at Hunan Library to discuss my experiences with dozens of oral history projects in 33 rural communities in Montana, using high schoolers as the primary researchers. The sponsor of the conference was the Evergreen Education Foundation, which has been doing good work in rural China for many years.

Hunan Library

Hunan Library in Changsha, which hosted the conference in partnership with the Evergreen Education Foundation.

I confess I was a bit wary. It had been a while since I attended a conference sponsored by one of the big foundations or socialized with the tribe that gathers there. They tend to be people drawn to the humane slogans of late modernity, which have replaced older traditions. It was all so familiar—the endless talk about more precise assessments, improved monitoring, better implementation and dissemination, and, of course, sustainability. Such concerns are expressed in a framework of humane aspirations having to do with social justice. We are, after all, nice people. Still, to tweak Drucker’s phrase, doing things the right way is much easier than doing the right things.

I understand the need to be cautious when straying from our accountability rituals. The models are adapted from the corporate world, where ambitious people have shown, if nothing else, that they can organize lots of people into vast projects focused on measurable outcomes. How else could the world be run from the commanding heights? Still, it seems important to have mixed feelings about how eagerly newcomers to such conferences are attracted to the bright lights and big names, how quickly they adopt the vocabulary of the people on stage. It could be tragic to mislead them.

I easily blended in with the veteran attendees as they shared experiences, enjoyed the buffets, greeted old friends and luxuriated in a reliable sense of deja vu. Lots of nice people. And it did feel nice to be there, invited to a conversation about humane values at a costly hotel where insiders gathered amid chandeliers and wine glasses, comfortable with warm dreams backed by resources. The allure of money—of being invited to the table—can be enchanting.

The real work


Weiming Tu, one of the most influential thinkers about China of our time. He is founding director of the Institute for Advanced Studies at Peking University and a Senior Fellow of the Asia Center at Harvard University.

But will it work? Are we oriented in the direction we need to go? One topic that stayed on my mind throughout the conference—a topic that did not get enough attention, I thought—was how to understand governance more powerfully than through the business accountability models we’ve all learned. The keynote speaker, Weiming Tu, founding director of the Institute for Advanced Studies at Peking University and Senior Fellow of the Asia Center at Harvard, spoke to the point, presenting a big-picture view of what the real work we now face may be.

His plea was essentially for better character education—through the classic liberal arts method of aiming at a moral outcome through intellectual means. Right reason will lead to right action. Our current plight, Tu suggested, is that we must regain the wisdom to make choices inspired by desires more intelligent than those inflamed by consumer culture. In China, Confucianism is important to such an education of desire. “We need curriculum reform that includes Chinese classical learning in college but also in primary education,” he said. We need to foster a conversation between Enlightenment values and our older spiritual traditions. Though the Enlightenment has been the most powerful ideology in world history—advancing such values as rationality, liberty, equality and the dignity of the individual—and though because of it the modern world is better than the pre-modern world, we have now arrived at a point where we see clearly that Enlightenment values alone are not enough. Without powerful spiritual values, a kind of anthropocentrism has emerged wherein reason has become mainly instrumental, aiming not at self-realization but at power. There is something “fundamentally discomforting” about current values, he said, which lead to the dominance of “Economic Man.”

He followed Samuel Huntington in calling for a conversation between Enlightenment values and Confucian values, as well as Christian values and those of other groups, aiming at clarifying principles that can be accepted by members of all religious traditions. The voice of spiritual humanism has become “quite feeble” in China.

If we do not know about invisible worlds–levels of meaning higher than money–and talk about them as though they matter, they will have little force in governing the world we are making. To a great extent, talking about them as though they matter, bringing them up in venues large and small, giving them form that makes them accessible, testifying in favor of them–this in itself may be our salvation. In the West, Socrates taught that we must ask the serious question: “What is the good life?” The good life, as he understood it, is to be forever asking the question again and again, in the light of each new circumstance.

Linking practice to big ideas


Faith Chao, Director of the Evergreen Education Foundation, translated for us during our visit to the ancient Yuelu Academy, founded during the Song Dynasty in 976 AD at what is now Hunan University. The Academy remained loyal to Confucian ideals of moral self-cultivation and community solidarity.

Most speakers focused on smaller issues—the practical matters involved in conducting and archiving oral history projects in rural places. Such matters are important, and getting more thoughtful and precise about them is fundamental. But it would be unfortunate if we let the details distract us from taking Professor Tu seriously, from asking the obvious question: Can our oral history projects provide suitable occasions for the sort of conversations about higher values that, Tu said, we may need if humanity is to survive?

I believe they can.

To make such conversations likely, care should be taken in how the projects begin and how they end. Specifically, the projects should be planned with big questions to be explored–the enduring questions that take us to the heart of our humanity–made clear and explicit at the beginning; they should end with original writing by the researchers in which they grapple with the meaning of their findings with reference to the enduring questions that began their quest. It is not necessary to come to tidy conclusions, like the perfunctory little upbeat platitude that often ends “human interest” stories in small town newspapers, but it is important to ponder the truths of the human condition as they are manifest, sometimes subtly, in the transcripts that are being added to the record of human experience.

Big questions
To begin, enduring questions can be formed by reading significant texts, classic or contemporary, that relate to the topic to be investigated. Good interviewers have spent time gaining the background knowledge they need to ask real questions and to demonstrate real interest to the interviewee. Gaining that background knowledge and creating a set of questions—both enduring questions to guide the researcher and more specific questions to ask during the interview—can be done while reading deep and rich texts.

The focus should be on only a few enduring questions–maybe three or four. Their purpose is not to limit the interviewing only to those issues that are clearly or directly linked to the big questions. Their purpose is to orient the researchers toward a general direction, which one might well forget at times while engaging the specificity of actual persons living through actual events. The focus, during interviews, should be on bringing as much love as one can bear in one’s attention to the interviewee, really listening and genuinely following his or her thoughts. Love is not often mentioned in how-to guides to doing oral history, but it is love that most readily opens a speaker to a hearer, and it is the “secret” of many who excel at asking and listening.

This is not, of course, inconsistent with a quest for light on such questions as these:

What should we part with?
What should we keep?
What should never be for sale?
What should one never do for money?
In recent times, what has been lost or is being lost?
What has been gained or is being gained?
What goods are in conflict?
What has changed?
What has not changed?

Enduring questions serve to focus the interviewer, but they are not usually questions that will be asked of the subject directly, though if the conversation tends that way they may be.

The interviewer should remember that the mental movement from event to meaning can be slow and difficult—and often very personal–and the oral historian or journalist who hopes to avoid the hard work of thought by asking the subject the big question directly will usually be disappointed by the answer, which is most likely to come in the form of either confusion at the impossibility of simple answers to vast queries or vague platitudes and rambling attempts at making sense.

The focus most often should be on the interviewee’s memory and experiences, with an aim of hearing richly detailed narratives or careful descriptions. Few people can address big philosophical questions off the cuff in an articulate way.

Instead, when the interviewer asks open-ended questions that invite the subject to share experiences and think out loud, the interviewer is more likely to be surprised and delighted by the answers. A certain modesty is required. The interviewer should not ask leading questions, even if they are very big leading questions. It may help to keep in mind the observation of the American anthropologist Clifford Geertz, who in his last essay spoke of how “the shattering of larger coherences … has made relating local realities with overarching ones … extremely difficult.” Indeed. “If the general is to be grasped at all,” Geertz wrote, “and new unities uncovered, it must, it seems, be grasped not directly, all at once, but via instances, differences, variations, particulars – piecemeal, case by case. In a splintered world, we must address the splinters.”

Getting at what it means


A highlight of the trip for Valerie and me was a visit to a local middle school, arranged for us by Jingchao Yan–part of Dr. Faith Chao’s staff. We were accompanied by Ruth Olson, of the University of Wisconsin-Madison.

To grasp the general via the particulars—that is the work of essays or presentations that researchers should do as the culmination of their projects, which may be similar to the last chapter of a dissertation—the conclusions and recommendations. Though reflection should have been occurring throughout the work, with frequent returns to the enduring questions to check how one’s understanding has changed or deepened, it is in synthesizing all one’s work into a final intellectual product or cultural artifact that reflection becomes the main work. If a student has read some Confucius on the duties of children, and then conducted an oral interview in which a person talked about her particular family during a tumultuous time in the past, the attempt to write an accurate and truthful account of what happened and what it might mean will be time spent pondering what really matters in this life. Perhaps the Great Foundations could do worse than give such documents careful attention when the time comes to evaluate what has been accomplished.

In doing such work, might we also be teaching our young that the art of living is in part the art of ordering one’s life as a series of research projects, with “research” understood as the process of seeking information, knowledge and wisdom in many intellectual and spiritual modes, from various sources? Confucius understood that the way to govern a people well is first to teach them to govern themselves by wise principles. Christians also believe this.

It’s everyone’s story
Another thing that was on my mind was how a project in Montana might collaborate with a project in China. One way that comes to mind is simply to begin with the same, or similar, enduring questions. I suspect that we would find many things in common—and not just in the experiences of minorities. It would be one way of having a conversation across cultures about core values that we share.

It isn’t just indigenous people whose culture is being hollowed out or trammeled by the peddlers and prophets of late modernity. All of us who remain disinclined to live mainly for money or whose souls are not transfixed by Apple’s latest wonder sense that things are being pushed aside to make way for things of less worth. Any Confucian or Christian is likely to experience moments, sometimes important moments, when one’s deepest commitments are taken as nothing by market zealots or crusading ideologues. The displacement of Native Americans due to the faith that powerful men at their big tables had in their own wisdom, in their certainty that everyone’s duty comes down to assimilation to technological innovation and expanding markets is, I think, one of those historical occurrences that resonates for many of us. It’s a timeless metaphor. In typological terms, it is everyone’s story.

The twentieth century happened to us all.

The spread of ideology and dogmatism in the school reform movement

"We have in our time a very peculiar generation of scholars who all are clear about it:  ideologies are finished.  Each one in his way has taken this or that ideology and criticized it so that nothing is left of it.  Nevertheless, he does not quite see what to do afterwards, so we have a peculiar fence-straddling generation.  These people are very serious;  but their having seen that all is wrong still doesn't mean they know what is right. . . ." —Eric Voegelin

“We have in our time a very peculiar generation of scholars who all are clear about it: ideologies are finished. Each one in his way has taken this or that ideology and criticized it so that nothing is left of it. Nevertheless, he does not quite see what to do afterwards, so we have a peculiar fence-straddling generation. These people are very serious; but their having seen that all is wrong still doesn’t mean they know what is right. . . .” —Eric Voegelin

Much of teaching can be quite routine because both the material and the sorts of difficulties commonly encountered by people new to the material are familiar. But if the classroom is not to become merely another spiritual desert in the institutionalized existence of children born to late modernity, the teacher needs to maintain an openness both to the material and to the students. In the classroom, the language through which curricular knowledge lives combines with the minds of students to constitute a field of experience in which the teacher must act as a participant if he is not to rigidify and die, hardening into a mere enforcer of a system.

Symptoms of such a death include the repetition of linguistic formulas in response to questions, the assertion of bland moralisms by way of escaping uncomfortable facts, and the inability to provide concrete illustrations of whatever he is talking about and talking about and talking about. Dogmatism and refusals of the Question are the hallmarks of ideological systems, which are never true but always opposed to truth.

All our systems are wrong, to the extent that they obscure reality by erecting between us and the real world a second reality of language, routinely protected by interdictions on the asking of questions. Nearly all school reform programs are, of course, such systems. Schooling in the age of reform has made both the life of the mind and the life of the spirit increasingly difficult, and we have few public forums where people can discuss education at the level of reality. A staff that has been sufficiently cowed into unreality will, at the end of an enervating hour or two of what is called professional development, have no questions. Institutions governed by ideology do not entertain questions aimed at the premises or the telos. Experienced practitioners recognize this and suffer the scotosis in silence.

The school change industry recruits participants who yearn to be stars in the professional society that their studies or their positions have opened for them. The usual panoply of goods is available to those who are willing to play: travel for conferences and site visits, release from mundane chores to sit at the big table, public praise, professional opportunities. Successful school reform leaders and consultants often have a fascination with conceptual schemes, and they mistake their ability to become fluent in such schemes for a grasp on reality.

As they master a second reality—the linguistic machine that underlies the reform plan—their sense of truth begins to shift and deform. Instead of accurate representations of the situations that practitioners actually face, they begin to judge as true those statements that are coherent with the conceptual scheme they have adopted. It can take considerable cognitive power to master complex conceptual schemes, such as Marxism or positivism, and some consultants find real intellectual pleasure in knowing their complicated things and in putting their knowledge on display.

Still, dogmatism is a formidable obstacle to anyone looking for truth and it is also the eternal enemy of teaching and learning.

Badlands: life sans religion, sans philosophy

Dakota Badlands

Kit and Holly enact a fairy tale made entirely of cliches and self-approval. They are anti-heroes of the American type.

Terrence Malick’s Badlands works as a period piece for that post-Vietnam time of self-absorption and loss of moral clarity that also gave us Butch Cassidy and the Sundance Kid and Bonnie and Clyde. But it’s not just a period piece—Kit Carruthers and Holly Sargis are somewhat timeless in their possession by unfocused, impulsive desires that attach to people and events in the kaleidoscope moments of a journey from unclear beginning to unknown terminus, and they “think” about what is happening and what they are doing entirely by repeating slogans and catch phrases they’ve picked up from the cultural milieu around them. In other words, they are quite like many people you know.

Most of what they say has a self-forgiving quality; their parallel monologues form a series of incoherent verbal gestures that help them feel good about themselves. As they bounce from murder to murder, they continue believing they are “good” people, though they are not. They are very bad people. They are bundles of appetites, no better than (and not much different from) snakes swallowing live mice. People are endowed with a moral sense. They should develop it.

Simple people may be saved by a good heart, as with Forrest Gump. But desire is not wise or good, for most of us, without some education and discipline. More and more of us get our moral education from our folkways, and our folkways are becoming increasingly toxic for inarticulate people with inarticulate desires. Holly and Kit never have thoughts, properly speaking.

“It sent a chill down my spine,” said Holly. “Where would I be this very moment if Kit had never met me? Or killed anybody? This very moment?” Her thinking never becomes more precise or more clear. “Kit never let on why he’d shot Cato. He said that just talking about it could bring us bad luck and that right now, we needed all the luck we could get.” That’s her “reflection” after one murder. About the next one, she observes: “He claimed that as long as you’re playing for keeps and the law is coming at ya, it’s considered OK to shoot all witnesses. You had to take the consequences, though, and not whine about it later. He never seemed like a violent person before, except for once, when he said he’d like to rub out a couple of guys whose names he didn’t care to mention. It all goes to show how you can know a person and not really know him at the same time.”

As for Kit, he segues from event to event, narrating his own story entirely in cliches and banalities. “Of course, uh, too bad about your dad. . .I can’t deny we’ve had fun though. . .it takes all kinds.” Nothing important can ever happen to him. He’s incapable of it.

Their moral sense has shrunk to effortless recitals of rationalizations—instead, they view life in aesthetic terms. Holly rejects the outlaw life because the wilderness is void of bright lights and pleasant food. Kit beams with a self-satisfied feeling of success when the officers escorting him to prison observe that he looks like James Dean.

The film endures because Malick is right about important things. He’s right about the woeful state of people whose minds are not enlivened by religion or enlightened by philosophy–in his stories, stupidity and evil are often kindred conditions. Malick’s films are frequently hideous, in precisely the way life among the folk is sometimes hideous.

By the dim and flaring lamps

The fiercely free individual is nothing against the vast forces of modernity. Nostalgia is weak against what is here and what is coming.


Savannah depicts a nostalgic and weak reaction against the principalities and powers that mostly rule the world. Ward Allen leaves the position and status he inherited to make a free life as a market hunter, but he doesn’t succeed. He achieves a sort of eccentricity and notoriety, but freedom eludes him.

The film has a beauty. I agree with Stephen Klugewicz that we “rightly revel in its broad and beautiful cinematic brushstrokes: its scene painting of the joys of the bucolic way of life, its depiction of the formative power of the past, its idealization of the thoroughly non-modern man. ‘Maybe we are here to remake everything, reshape everything, create our own new idea of perfection and leave God’s idea to the dim shades of history,’ Allen declares during one courtroom appearance. ‘And maybe I, having fought against that new idea, rejected that idea, found that idea abhorrent, maybe I was wrong. But I do not think so.'” It does, as Klugewicz suggests, warm the heart.

The film brought to my mind the Southern Agrarians and their reactionary manifesto, I’ll Take My Stand. It was a book brought to my attention by John Baden when I met him in his home near Gallatin Gateway, on one of my forays through Montana in search of a better conversation. The book is a collection of essays by something of a literary tribe, who understood their plight in terms of the loss of their Southern identity amid the displacements of “northern industrialism.” The Lost Cause was a conversation about being somebody in some place. Dixie was a place, unlike the trampling out the vintage, which was an abstraction. They sided with Thetis and against her son Achilles, that his shield should have borne the images of “White flower-garlanded heifers” and “athletes at their games” rather than nameless, faceless players acting their assigned roles. We should be thoughtful about what we fight for. Theirs was an ambiguous movement jousting ineffectually at the thousand tentacles of modernity. That book, too, has an air of nostalgia about it.

In Savannah, Ward Allen resists game laws and developments that drain the wild out of his river, leaving individuals amid places dying into nameless processes. “This is real,” he says to his wife, when he finally takes her to one of his sacred places, though by then it is too late. Many will sympathize with him. We see the soulless machinery of international financial conspiracy subject us all to corrupt law, we know something of the flattening education the Capitol favors, where young people “engage” in literacy tasks organized around reading passages nearly void of meaning, practicing the bland skills that might provide a paycheck in the institutional hallways and cubicles that await them out there. We sense that in the world they are making, there really is no place for us, and if we are not young, we know that the simulacrum offers no satisfying alternative.

Ward Allen does not know what to do, and his action at the end of the story has more to do with giving up than with finding a way. It is a film filled with beauty, evoking what is being lost. I would have liked him to say more about what he understood about God’s idea. Lesser topics may serve no good.

Irony and multiculturalists and a sense of place

I’m sometimes prone to a quixotic hope that knowing and loving a particular place might be an adequate antidote to modernity. Theories are always simpler than reality, and thus wrong. I’ve played with these thoughts before:

“Yai Ya!” my grandson called, as he opened the kitchen door and walked in. It’s the Salish name for one’s mother’s mother. This happened nearly daily or several times a day once he was old enough to walk the few blocks from his parents’ house to ours. He didn’t knock, of course. And he was partly calling his grandmother and partly just announcing he had arrived. It was partly greeting and partly invitation. He was here! Where we were! It was good.

How many generations of young Salish boys showed up at their grandparents’ homes with just those words? How long had those syllables been echoing, more or less unchanged, through the abodes of people living in this valley? This place?

Place is a tricky concept to nail down, but I tend to enjoy all the various ways people have tried. They end up talking about the central realities of human life–story, memory, kin, tradition, culture and land. My grandson traces his heritage in this place we share back, on his father’s side, into the “time immemorial” that the Salish like to talk about. I trace my own heritage back to Kansas, and then to Maryland and then to the Irish highlands on one side, and back to Utah, and then Ohio, then Massachusetts and then to the London slums on the other. Those connections, of course, are also part of my grandson’s history.

One of the ironies of the multiculturalists is that whether one talks to someone advocating a Salish language class on a reservation in Montana or activists resisting cultural domination of Islam by the Dutch in the Netherlands, one will encounter similar conceptual machinery leading to parallel categorizations of thought. Multiculturalists around the world share what French political philosopher Chantal Delsol calls a “clandestine ideology.” This unifying ideology, she says, is “ultimately the mandatory litmus test” that people of any culture must pass to avoid being marginalized.

To be acceptable among the right sort of people,

one must join the call for equal representation for both sexes in all spheres of power. One must consider delinquency to be a result of poverty caused by social injustice. Contemporary man must hate all moral order; he must equate the Catholic Church with the Inquisition, but never equate communism with its gulags. He must a priori be suspicious of profit and financial institutions; he must be suspicious of the virtuous, who invariably must be disguising hypocritical vices even more dangerous than the vices of the depraved. He must hate colonizers, unless they are former victims themselves. On the other hand, our contemporary must legitimize all behaviors and all ways of life. He must call for equality everywhere, and fight for ever greater freedom for ever younger individuals.

She predicts that most people who read the excerpt above “will immediately suspect the author of wanting to defend colonial powers or a strict moral order.” This, Delsol says, is precisely what

shows so clearly that a mandatory way of thinking really does exist, and that contemporary man is unable to distance himself from it. Whoever dares to question it, or to even express a doubt about the validity of this sacred discourse, doubtlessly belongs to the camp of the opponent.

There’s a moral certainty in the ideology of late modernity–an absolutism–that disguises itself in talk of openness and inclusion and tolerance. At bottom it flows from a metaphysical dream that first took its characteristic modern form in the cafes of the Palais-Royal in Paris before the Revolution, amid an intoxicating mixture of philosophy, drugs, food and sex. Amid the feasts of foreign delicacies and the prostitutes, distinctions of rank were obliterated, and hedonistic liberty created an atmosphere of social equality that combined illusion with gratification, making total secular happiness seem within reach. Reality would be reconstructed by intellectuals. The old morality would be dissolved.

Living as we do on this side of the cascading sequence of horrors orchestrated by secular ideology, from the Reign of Terror to Auschwitz to the gulags, we tend to wince and retreat a little when people begin speaking too confidently about their truths.

So we now often encounter moral certainty without truth. Our institutions are staffed with many who know what is right but who are also averse to real argument. Their moral certainty is often expressed as derision for those who have the wrong thoughts, and the aversion to discussion of fundamental assumptions appears as a smug distaste for the contentions caused by those who persist in old certainties. Better to maintain the peace–a bland equality without strong positions. Except, of course, for the modernist orthodoxy on which that world is premised.

Most years, I read D’Arcy McNickle’s novel Wind from an Enemy Sky with high schoolers on the Flathead Reservation, where I have always lived. McNickle’s father was Scottish and his mother was Métis, and they arrived on the Reservation in time to be included in the tribal rolls, so McNickle was an enrolled member of a tribe in which he had no actual blood. He did share the cultural experiences of many Salish children, and even attended a boarding school for natives. He spent his working life as a Ph.D.-bearing bureaucrat in the Bureau of Indian Affairs in Washington. He had a rich experience of cultural pluralism.

He crafted a novel about cultural misunderstandings, one that has no real villain, in the sense of someone intentionally causing harm–but that is nonetheless a tragedy. Those of us who grew up in the same place he grew up might find in its broad cast of characters the sort of dislocations and patterns of misunderstanding that are familiar.

My students and I encounter McNickle in the social context of public schools–in fact, the teaching of native literature is required by the state–that have taken much of their character from modern ideology, including texts such as this:

. . .if allegory is the attempt to move beyond beginnings, creating an abstracted colonial narrative, McNickle shows how this narrative continually fails, as the voice of the colonized continually erupts through it. Turning the colonizer into a corpse, McNickle ironically feeds off his displaced body. Through the ironic portrayal of the colonizers’ naivete, McNickle recontextualizes allegory, making a homeland for it, as well as turning it back as a weapon upon the enemy. It is in this sense that the figure of Washington is perpetually parodied for its ineffective allegories, turning its authority into yet another corpse–the emptied figurehead of colonial control.

I encounter no argument about whether colonial categories provide an adequate approach to this story–just an assumption that “colonialism” is the overarching structure within which we are to make our meanings. I do seem to encounter moral certainty, linked to clear categories. The categories support an enduring guilt–the presence that hovers over us all. The question the author angles up to is how white people, such as myself, should read native literature:

Let me then address the question outright: (how) should whites read indigenous texts? The “how” in this case is parenthetical because the sentence without it has never been adequately addressed. And yet the doubling of the discourse, along with the use of the English language, suggests that Euro-Americans are at least intended to be a partial audience. There is also the fact that native texts are being not only read, but taught, analyzed, and incorporated into an expanding English canon. So the question of “how” Euro-American culture should read these texts seems essential at this point, regardless of the intended audience.

She notes that when whites attempt to read texts written by indigenous writers, they often try to avoid their complicity by looking for native authenticity:

Possibly as a means of assuaging colonial guilt, indigenous literature is often treated with kid gloves–the “it’s not our place to say” mentality of colonial cultures who, while attempting to preserve some kind of native authenticity, simultaneously squat on their territory.

The author argues the Native American novel

serves to “interrogate” Euro-American discourse, rewriting European history in America through the “counter-discursive” practice of allegory. Through D’Arcy McNickle’s text, I have also attempted to reveal how Euro-Americans can read indigenous fiction for these counter-discursive practices. My hope has been that taking a seemingly postmodern trope (allegory) and reinvesting it with post-colonial interrogations might serve to define the genre of the Native American novel, which should be read differently than both the Euro-American novel and native oral tradition.

She cites revisionist historian Patricia Limerick, who said “the frontier only closed when the Indian was turned into an artifact.” In other words,

the representational system used to annex the receding frontier only became closed or complete when Indians were adequately accounted for as artifacts–unable to change or affect the new discourse. In a sense, this closing of the frontier is what has made natives appear safe, though inaccessible, for the first time. The closed nature of colonial discourse, which would turn natives into allegorical figures in a master discourse, or frame native literature in a precolonial moment, has had its day. It is time that the frontier was opened again.

That’s one way to think about what happened here and what McNickle is doing. It does seem, to me, to be a somewhat deadening discourse, hankering after fixed categories of oppressed and oppressor, white and indigenous–the categories with which intellectuals are wont to construe our world.

I’m not sure that those of us who live in such places as McNickle has in mind ever stopped experiencing race as a frontier–fluid and undergoing continuous redefinition and renegotiation. Time–as experienced and thought about for decades at particular places–is a dance within a constant flux of biology and culture–life. The ideological categories of oppressed and oppressor, Indian and white, fail to do that experience justice.

The historian Elliott West once observed that in terms of what actually happened in the American West, the metaphor of marriage seems more useful than that of war and conquest. The rigid categories of war and enemy did arise, but they were less frequent and less sustainable than other modes of intercourse as groups of actual, living people found each other in the American West. The fur traders often took native wives and got on with the business of human commerce. Marriage has been a primary means of cultural survival and continuity.

James Hunter made a fascinating study of all that in Scottish Highlanders, Indian Peoples: Thirty Generations of a Montana Family. He tells the story of a different Scottish family than McNickle’s–the McDonald family–which today is one of the largest tribal families on the reservation. Members of that family are descended both from medieval warriors who battled for independence from England in the Scottish Highlands and from Nez Perce and Salish warriors who contested with the formidable Blackfeet of the northern plains over access to buffalo country. At the center of that story is the 1842 marriage between Hudson Bay fur trader Angus McDonald and his wife Catherine, a Nez Perce/Iroquois woman.

A friend of mine whose father was fullblood Kootenai–a man of the twentieth century who made his living as a gypo logger–calls people who think about such things as emptied figureheads of colonial control “college Indians.” What they espouse they found at universities and not in traditional folkways.

A happier way of thinking about what happened here was offered to me by my friend Clarence Woodcock years ago. He was one of the local leaders in preserving and restoring Salish culture on the Flathead Reservation–including the creation of a Salish Culture Committee, to record, publish, teach and sustain traditional culture. Clarence was also a devout Catholic. The singing of Catholic hymns in Salish was a weekly feature of worship at the church in St. Ignatius, when he was alive. I asked him about that once–if he sometimes felt a conflict between his love and teaching of Salish culture and his Catholicism. He said he didn’t. “The cultures are much alike,” he said. “We didn’t know about Jesus, but the rest of it–knowing how to live in good ways–that was here.”

I’m neither Salish nor Catholic, but I think I understood.

A lynch mob is an extreme form of gossip

Communications technologies magnify destructive as well as constructive information

As our power increases, we often need to develop better discipline if we want to avoid self-destructive patterns. As food became more plentiful, people needed to learn to discipline their eating habits better. Obesity replaced starvation as our big problem with food. Increased wealth and power always put greater demands on character.

Our age is notable for dramatic increases in the ease and power of communication, and this is accompanied by a great democratization of communication–by which I mean hierarchical constraints on communication have eroded. The gatekeepers are gone. It was seldom easy to get a story in a national publication, and normally doing so required persuading seasoned and somewhat dispassionate editors who were as concerned about maintaining credibility as about making a splash. Fact checking, adequate sourcing, maintaining a temperate voice, providing enough context and balance to lend perspective–some editors took such things very seriously.

These days, speed tends to matter more, and in any case it’s no longer necessary to get past the gatekeepers. The comments sections of lots of big websites appear unmoderated. They are stuffed with commentary that would never have been published in the print age.

Unfortunately, communication is not an unmitigated good, any more than calories are. Speech can be destructive as well as constructive. Lies and misinformation can make us stupider rather than more intelligent. Our religious traditions warn us to discipline our tongues. Gossip, slander, backbiting, lying and the like are not innocent little hobbies. They are powerfully destructive forces in most societies.

Between 1889 and 1930, 3,724 people were lynched in the United States (more than 80 percent of them were black). In his study of this phenomenon, Arthur R. Raper described the pattern that led to these violent acts: “As the crowd grows and discusses the case, the details inevitably are exaggerated. These exaggerated reports, in turn, further excite the excited people who exaggerated them. After a time, the various stories of the crime take on a sort of uniformity, the most horrible details of each version having been woven into a supposedly true account. The milling process continues until an inflammatory speech, the hysterical cry of a woman, the repetition of a slogan, the accidental firing of a gun, the waving of a handkerchief, the racing of an automobile engine, the remarks of some bystander, or some other relatively trivial thing, throws the group into a frenzy and sets it on a career of arson, sadistic mutilations, and murder.”

It’s easy to see those same communications patterns in the Zimmerman case, as both men have been vilified by overexcited people eager to feel righteous. A lynch mob is an extreme form of gossip.

The lynchings were stopped, finally, by the imposition of a hierarchical system of justice that “disempowered” the local people, replacing pure democratic action with a system of authoritative constraints. Authorities constrained horizontal communications and forced communications to move vertically, and they customarily required messages to be associated with documentary evidence. We continue moving toward pure democracy, in which there is no law–only the will of the people as it changes from moment to moment, usually due to the persuasion of a charismatic leader.

Among the questions we face now that the verdict is in is whether the judicial system was corrupted by mob fervor rather than operating as a constraint. The question we face moving forward is whether mob fervor will corrupt other institutions rather than being constrained by them.

Was the world more beautiful 100 years ago?

Was the world a better place 100 years ago? Glenn Reynolds links to the photographic evidence, and one of the things I enjoy about many films set in the early twentieth century is how beautiful that world seems to be. This includes productions as diverse as the Harry Potter films, the television series Downton Abbey, and Malick’s visually sublime Days of Heaven.

Save the culture? First, we must save ourselves

Rod Dreher argues that conservatives need to do better at presenting their views through stories.

Argument has its place, but story is what truly moves the hearts and minds of men. The power of myth—which is to say, of storytelling—is the power to form and enlighten the moral imagination, which is how we learn right from wrong, the proper ordering of our souls, and what it means to be human. Russell Kirk, the author of The Conservative Mind whose own longtime residence in his Michigan hometown earned him the epithet “Sage of Mecosta,” considered tending the moral imagination to be “conservatism at its highest.”

Through the stories we tell, we come to understand who we are and what we are to do. This is true for both individuals and communities.

Stories, as carriers of ideas, have consequences. Lincoln, upon meeting Uncle Tom’s Cabin author Harriet Beecher Stowe, supposedly remarked, “Is this the little woman who made the great war?”

Dreher argues that ordinary people understand policies through stories. This is not a new idea. James Davison Hunter critiques it in To Change the World: The Irony, Tragedy, and Possibility of Christianity in the Late Modern World. He points out that Evangelicals “have been distinguished by their massive cultural output in books and book publishing, magazines, radio, music, bible studies, theology, Christian education at all levels, and so on” (29). This cultural production has not ended their cultural marginalization, and Hunter offers eleven propositions that might explain why creating conservative stories probably won’t lead to a conservative renaissance in the larger culture.

One of these is that “cultures change from the top down, rarely if ever from the bottom up.” Though sometimes economic revolutions and social movements appear to result from mobilizing ordinary people, “the work of world-making and world-changing are, by and large, the work of elites: gatekeepers who provide creative direction and management within spheres of social life. Even where impetus for change draws from popular agitation, it does not gain traction until it is embraced and propagated by elites” (41).

Dreher notes correctly that “Stories work so powerfully on the moral imagination because they are true to human experience in ways that polemical arguments are not. And because the moral imagination often determines which intellectual arguments—political, economic, theological, and so forth—will be admitted into consideration, storytelling is a vital precursor to social change.” It probably is a precursor. But the distance that will remain to be traveled even if conservatives develop powerful stories is daunting. Dreher stops short of arguing that stories are sufficient to cause widespread changes in the culture, and if Hunter is correct, such change usually requires the participation of elites and their institutions:

…cultures are profoundly resistant to intentional change—period. They are certainly resistant to the mere exertion of will by ordinary individuals or by a well-organized movement of individuals. The idea, suggested by James Dobson, that “in one generation, you change the whole culture” is nothing short of ludicrous. Change in political systems and economic conditions can occur relatively quickly but the most profound changes in culture typically take place over the course of multiple generations. The most profound changes in culture can be seen first as they penetrate into the linguistic and mythic fabric of a social order. In doing so, it then penetrates the hierarchy of rewards and privileges and deprivations and punishments that organize social life. It also reorganizes the structures of consciousness and character, reordering the organization of impulse and inhibition. One cannot see change taking place in these ways. It is not perceptible as an event or set of events currently unfolding. Rather, cultural change of this depth can only be seen and described in retrospect, after the transformation has been incorporated into a new configuration of moral controls.

In this light, we can see that evangelism, politics, social reform, and the creation of artifacts—if effective—all bring about good ends: changed hearts and minds, changed laws, changed social behaviors. But they don’t directly influence the moral fabric that makes these changes sustainable over the long term, sustainable precisely because they are implicit and as implicit, they form the presuppositional base of social life. Only indirectly do evangelism, politics, and social reform effect language, symbol, narrative, myth, and the institutions of formation that change the DNA of a civilization.

Imagine, in this regard, a genuine “third great awakening” occurring in America, where half of the population is converted to a deep Christian faith. Unless this awakening extended to envelop the cultural gatekeepers, it would have little effect on the character of the symbols that are produced and prevail in public and private culture. And, without a fundamental restructuring of the institutions of culture formation and transmission in our society—the market, government-sponsored cultural institutions, education at all levels, advertising, entertainment, publishing, and the news media, not to mention church—revival would have a negligible long-term effect on the reconstitution of the culture. Imagine further several social reform movements surrounding, say, educational reform and family policy, becoming very well organized and funded, and on top of this, serious Christians being voted into every major office and appointed to a majority of judgeships. Legislation may be passed and judicial rulings may be properly handed down, but legal and political victories will be short-lived or pyrrhic without the broad-based legitimacy that makes the alternatives seem unthinkable.

Dreher holds out the hope that “if conservatives become better storytellers, they might save the culture.” They might, but I suspect a more useful goal would be to strengthen families and churches so they can withstand onslaughts from a dominant culture that knows how to care for neither.