Wednesday, November 23, 2011

This Thanksgiving, Are We Celebrating Cooperation or Genocide?






Anyone at least my age remembers participating in Thanksgiving Day pageants in elementary school. In these pageants, some of us would dress like pilgrims, some like Indians (always an inaccurate term, now a distasteful one too), and some would even dress as turkeys. We would sing songs and perform little skits, all designed to reinforce the American mythos of a cooperative relationship between Native Americans and the early settlers of the "New World." Though the holiday (made official by Abraham Lincoln, undoubtedly for political purposes) had been intended to celebrate the cooperation between the Plymouth Colony and the Wampanoag tribe, it was often connected to Christopher Columbus, who we were always taught had "discovered" the New World. In fact, in Canada, Thanksgiving is celebrated on the same day the U.S. celebrates Columbus Day.

Of course, this myth has come under fire in the past couple of decades as well-meaning folks have asked whether or not we ought to be celebrating the fact that an imperialist European force crossed an ocean and subdued a continent full of people. For this reason, I was surprised to see my sons come home from day care the other day with construction paper headdresses rather like the ones I remember making for our Thanksgiving pageants. Of course, I also noticed that the headdresses were altered from those that I remember, as if they were hiding what they were. Instead of cutouts of feathers, the headdresses were adorned with cutouts of hand prints in yellow, green, and blue, combining the feather motif with our old hand-tracing technique for drawing turkeys.

These headdresses, along with the general spirit of the season, got me thinking. How should well-meaning people view Thanksgiving? How should we view our early history as a nation? Were we heroes in search of religious freedom, or wicked imperialists bent on grabbing as much land as we could at the expense of those who already lived here? Is either of these views accurate or fair?

Certainly, the myths of my childhood are inadequate. It will not do for me to celebrate Columbus (since we have connected these events in our perception of cultural history) for discovering a "New World." How can this be when millions of people already lived here? To do so requires that we take a decidedly Euro-centric view of history: only Europeans matter, and only their exploits are worthy of history.

But might the other extreme be lacking as well? Should I refuse to celebrate the holiday because of the crimes against humanity perpetrated by those imperialist Europeans from whom I descend and who are ultimately responsible for my being here? I would argue that this view also involves assumptions that are both Euro-centric and inaccurate.

This view, in most of its forms, seems to assume a benevolent population of Native Americans who were bewildered by a European understanding of land ownership and who were ultimately overcome by superior force. This view positions European settlers as pirates (Vonnegut used this term explicitly) but it also seems to position Native Americans as romantically naive, almost childish people who lived off the land, to which they believed they belonged.

This view is every bit as racist as the first. It ignores the richly complex sociopolitical milieu of the Native American tribes living on the east coast in the seventeenth century. It ignores the complex relationship, and the history of that relationship, between early settlers and Native Americans, particularly the Wampanoags, the tribe of the "first" Thanksgiving.

A nuanced account of the history of the early settlers and their relationship to the people already living in the Massachusetts Bay area can be found in Nathaniel Philbrick's Mayflower. His book provides wonderful historical insight into how the events we talk about at Thanksgiving actually occurred, who the people involved were, and what their relationship was like. Through this book, we gain a better sense of what we ought to regret and what we ought to celebrate.

The first generations of settlers at Plymouth and Massachusetts Bay were far from conquistadors looking for cities of gold. Instead, they were half-inept sailors, sick and weary from a long voyage. They might have resembled illegal immigrants who had just crossed the Arizona desert more than a well-armed force of confident Europeans. For their part, the Native Americans living in New England were certainly not romantic nomads quietly living off the land. Instead, they were farmers in permanent settlements. They lived in close proximity to other tribes and were thus politically savvy, engaging in land disputes and forging treaties, much like those in crowded Europe.

When colonists first arrived at Massachusetts Bay, they were hardly conquerors ready to subdue a bunch of disorganized natives. In fact, it seems likely that, had the Wampanoags wanted to, they could have pushed the settlers back into the sea. When fighting did begin to break out, it was localized. The first (and most important) major conflict came with King Philip's War, decades after the settling of Plymouth. Philbrick describes the ultimate cause of the war this way:




By the midpoint of the seventeenth century. . .the attitudes of many of the Indians and English had begun to change [from that of a spirit of cooperation and cohabitation]. . .Both sides had begun to envision a future that did not include the other.

This first war was much more evenly matched than one might assume. By this point, Native Americans had been able to acquire the same weapons as the settlers, and King Philip (the English name adopted by Metacomet, sachem of the Pokanokets) was well educated and often dressed in expensive clothing he bought in Boston. On the other side, one of the chief agitators (perhaps accidentally) of the war was a Harvard-educated Native American Christian named John Sassamon.

The point is, the breakdown of diplomacy between early settlers and Native Americans was hardly a one-sided affair, and it is hardly accurate to think of early settlers as marauding invaders sacking the villages of unsuspecting natives. Instead, two groups who had once lived in cooperation had each begun to want to push the other out. So, with a more nuanced view of how things really were, how are we to think about and celebrate Thanksgiving?

Without a doubt, the policies of the young United States toward Native Americans were deplorable. The evils of the White Man's Burden, western expansion, and so on should not be minimized, and certainly should not be celebrated. And it may be impossible to entirely separate these travesties from our early history as settlers. But the spirit of cooperation between Native Americans and settlers, which allowed Plymouth to survive, can and should be remembered and celebrated. As Philbrick points out:




For a nation that has come to recognize that one of its greatest strengths is its diversity, the first fifty years of Plymouth Colony stand as a model of what America might have been from the very beginning.

Without a doubt, much of the history that follows these first years is tragic and shameful. As greed and hatred are common to all mankind, tragedy and shame mar all of human history, not just that of this country. But this first generation of settlers shared with us the dream of a cooperative utopia, and though they were sick and poor, that dream was very nearly realized because two groups who did not know each other, could not speak the same language, and had no reason to trust one another nevertheless shared with one another. Thanksgiving, then, may and should still be celebrated as a picture of what could have been--a picture of what still may be.

Sunday, November 20, 2011

Brecht, Alienation, Reality TV

I wrote my first non-realistic play my senior year of college. It was decidedly (in retrospect at least) a closet drama, containing all the things I loved about absurdist drama: blatant Marxism, Brechtian alienation, and so on. That is to say, it was way too European for American audiences.

When I got to graduate school and began looking more closely at American and English non-realistic plays by writers like Edward Albee, Arthur Kopit, Tom Stoppard, Harold Pinter, and so on, I realized that these writers had not been able to do what continental European writers like Ionesco, Beckett, Adamov, Weiss, and others had considered crucial. They had been unable to banish subtext.

European Absurdists accepted Brecht's notion that an audience could never fully be convinced by characters in a play. They could never be convinced that the characters had conflicting motives, sexual desires, or inner conflict, because they could never be convinced that the people they saw on stage were characters at all. The fact that they were spectators in a theatre would always break any illusion an audience might have that these actors were real people or that their conflicts, problems, and relationships were genuine. Coleridge's notion of "willing suspension of disbelief," then, is in fact impossible. The space alone precludes it.

Since this is the case, according to Brecht, the theatre must find a means outside of emotion to communicate with an audience. For Brecht, this means was the intellect, through what he called the "Epic theatre." He wrote:

The essential point of the epic theatre is that it appeals less to the feelings than to the spectator's reason. Instead of sharing an experience the spectator must come to grips with things.

The theatre, then, is didactic. It "must not believe that one can identify oneself with our world by empathy, nor must it want this." Instead, the theatre should ask the audience to think. In order to accomplish this, Brecht espoused a technique (or class of techniques) known as "alienation." A play, rather than trying to bring the audience in, should continually remind an audience that it is apart from the action, that it is not invited into the world of the play, that there is, in fact, no world of the play at all, but only actors on a stage reciting lines.

So, as I thought of Brecht's desire that the theatre be didactic--that it cause an audience to think about things--I wondered how the theatre might carry this out without the European burden of alienating the audience. American audiences expect to be emotionally involved and want to be drawn in, even if this is not fully possible. How does the American theatre accomplish Brecht's goals (a theatre that forces an audience to think) when it cannot accept his methods?

I then thought of the ABC television show "What Would You Do?" In this reality show, which combines the "happening" of the 1960s with Chris Hansen-style investigative journalism, actors in public settings act out shocking scenarios in order to record the reactions of unwitting spectators. The idea is to conduct a social experiment investigating whether or not people will react ethically. "What Would You Do?" represents an anti-Brechtian form of didactic theatre. That is, it attempts to perform a didactic role by asking the audience to consider how and whether they would act in these shocking situations. Yet it does so not by alienating the audience so that it is reminded that the spectacle isn't real, but by convincing the audience that the scene is real. In order to do so, it blurs the lines between the traditional theatrical relationships of actor, audience, and space.

Actor and Audience:
In "What Would You Do?" the shocking situations that make up the dramatic conflict are scripted and performed by professional actors. And, obviously, the viewing audience represents a very traditional passive audience. But the show blurs the line between actor and audience by staging its scenes in public places full of unwitting people who just happen to be in the right place at the right time. By filming these people's reactions to events they do not know are scripted, and by enticing them into action, the show turns audience members into actors.

Also, when these people do react and intercede in the scenes, they take agency from the professional actors who had until then been performing roles. These actors, albeit briefly, become audience members. They must watch and listen to the intercessors in order to respond appropriately and continue the action, which has become improvised. Thus, the show subverts the roles of actor and audience, though both roles still clearly exist.

Space:
The use of space in these performances is the most important reason these ruses work. What makes the willing suspension of disbelief ultimately impossible in traditional theatre is the space itself. An audience member at a traditional play can't help but know he's in a theatre. After all, he has bought a ticket, he has probably dressed up, driven to the theatre, taken his wife out to eat before the show, read the program, and sat down when the lights dimmed. And throughout the action of the play, he sits in an area lit very differently from the space the actors occupy, likely in a row of chairs physically separated from the actors. The space, then, is alienating in and of itself.

In "What Would You Do?" on the other hand, the space the audience member enters is a public place, likely one she frequents. The space is not the Orpheum Theatre, but rather it is the restaurant that she would have gone to on the way to the theatre. She has no ticket and no program. She sits among the actors and in fact cannot know which are actors and which are other audience members like herself. She does not even know she's in a theatrical space.

Un-Alienation
This use of space allows the spectator to be convinced that what he is seeing is, in fact, real. There is no physical reminder that the actors are reciting a memorized script, so the audience member can fully empathize with the actor, whom he regards as a real person. This use of space, then, un-alienates (admittedly, a clumsy term) the spectator by bringing him within the same physical space as the actor. They share the stage.

The show also brings the audience member into the play in a very literal way. The entire purpose of the script is to build and build the action until some audience member reacts and intrudes into the scene. The show thus un-alienates the spectator by making him an actual part of the play. Of course, the beauty is that the spectator does not know he has become an actor in a play. Instead, he is convinced that his actions and the actions of the professional actors are real. He has unwillingly suspended his disbelief.

In this way the audience member has, through the theatrical event, been forced not only to think about what she would do, but she has been put to the test. She has had to either act or refuse to act. Similarly, the viewing audience has been able to empathize with the duped unwitting accidental actor because he could just as easily be tricked the next time he is on the subway or at a bar. He, therefore, must think about what he would have done if this shocking scene had presented itself to him.

The show, then, is didactic and theatrical, and it is so not by alienating but by un-alienating the audience. It has thus done what Brecht asked the theatre to do (caused the audience to think about things) while satisfying the aesthetic demand of American audiences to be "moved."

Saturday, October 08, 2011

The Hands-in-Pocket Revolution


Last night, Charissa and I sent our boys off to their grandparents' for the weekend, so we spent a night on the town. We went downtown for chocolates and coffee, then decided to take a stroll to see the OKC National Memorial because Charissa had never seen it at night. On the way, we were surprised to see a large group of people in a downtown park that is usually home to a number of homeless people. But this looked like a rally, complete with a News 9 van. So, seeing some Bricktown police officers I know, we went and asked what was going on.

Turns out, the Occupy Wall Street movement had come to Oklahoma City. So, in front of me was a group of well-meaning revolutionaries wanting to voice their concern that the wealth of this country is in the hands of a very small percentage of the population.

This is obviously not a new complaint, and it's a concern that, I should be careful to note here, I share. I think it's appropriate to ask whether or not a democracy where so much power is held by so few can be a democracy at all. When small groups of people hold the money, and therefore the power to make their voices heard, the rest must either agree with them or be silenced. This is obviously a huge problem in a democracy, and one that I think is a political reality in our country. I think it's fair to ask if our democracy isn't something more like an oligarchy.

I also sympathize with the complaint that wealth is not spread more equitably. So few have so much, while so many have so little. This is and always has been a reality in our world. But the problem seems somehow more insidious in a country in which hard work is supposed to be the key to success, where our own national mythos holds that if one simply works as hard as one can, one can and will rise. Of course, for many, this is simply untrue. For the uneducated, the poor, the handicapped, and many others in positions of disadvantage, it is awfully hard in this country to rise much above one's station no matter how hard one works. After all, if you go from being a fry cook to a shift leader, you haven't really gone so far after all. This is a sad and awful reality. One that compromises our democratic ideals. One that, perhaps, calls for revolution.

But what I saw in the park last night was not a revolution. It was not revolutionary. It was not even in touch with the real problem. The fact is, the whole "I am the 99%" line is a sham. I did not see the poor, the stricken, the underprivileged, the oppressed. I saw no minorities (though I did see one African American woman on the News 9 story), and in an interesting irony, I saw not one homeless person in a park where homeless people sleep every night. What I did see were a lot of upper-middle-class white people, many college students (one even carrying a sign with an OU logo printed on it), wearing interesting and expensive body art and piercings, taking pictures of each other with smart phones, and all-in-all having a jolly time.

Obviously, there's nothing wrong with upper-middle-class white people noticing and speaking out about social inequities (I am, after all, all of these things). But as I listened to their speeches and watched their interviews, I became more and more aware that the people I was listening to didn't have a clue, and in fact had no real intention of enacting anything like the kind of social revolution that would change this country.

I found that I wanted to get up and speak myself (I didn't). I wanted to ask for a show of hands in answer to these questions: How many of you know what is at 800 W. California? What about 1335 W. Sheridan? How many of you have ever been to SW 8th and Rockwood? SW 15th and Westwood? NE 28th and Kelley?

These addresses describe the City Rescue Mission and the Jesus House. SW 8th and Rockwood is an intersection less than a mile from the opulence of the new Devon Building. The street at this intersection, and for a few blocks on either side of it, is in such disrepair that it is not even paved. Meanwhile, the city that says it doesn't have money in the budget to fix the street is re-doing the Myriad Gardens--the front yard of the Devon Building. The other intersections are the locations of large public housing projects in the inner city.

Then I wanted to ask them how many of them had been to Bricktown? To a Thunder game? To an OU-Texas party?

What I saw last night was not a collection of people dedicated to bringing about change. Instead, I saw a collection of college students who wanted to take part in something and leftovers from the '60s wanting to relive their own college years. Occupy OKC may as well have been a flash mob.

For those of us in the upper middle class, real change must begin with our admission that we are not victims in some class struggle between millionaires and thousandaires. We are suspects. We are people who demand to be heard but who blithely ignore the problems of people much less fortunate than ourselves--people who really are oppressed and poverty stricken. We complain that our student loan rates are unfair considering our sorry job prospects while we ignore the man working six days a week sweeping the street after our protests. We don't even know that when we're sleeping in after a Saturday full of college football games and fun nights in the Paseo, he'll be getting up at 4:00 and walking to Labor Ready so he can be in line before all the good jobs are gone. We don't know about him. We don't care about him. Yet we demand that something be done about him.

So, my complaint in this blog is against pretend revolutionaries. My exhortation, though, is for real ones--would-be revolutionaries who don't know what to do. It's as simple as this:

Do Something!

Realize that standing in a park under the SandRidge building hoping Tom Ward will throw some money down does nothing to change the status quo. Get out of the park and into the community. I've often half-joked that I want a revolution led by Jesus Christ and Kurt Vonnegut. That's because these men both realized that change happens by getting among people and helping them, eating with them, hearing their problems, being a voice for them. And doing these things (as Paulo Freire has suggested) not as patronizing liberators but as neighbors.

Sure, both these men took to the streets as protesters (Jesus was, after all, killed for the things he said), but they both understood that this is not where the real work is done. The real work is done by being kind to people. By living among them.

You want to see wealth redistributed? Start with your own. Feed and clothe people, even if it costs you your smart phone. Care for the sick, even if it costs you a semester or two of college. Go into foreign countries or even poor neighborhoods in your own country, even if you're uncomfortable. Use the privilege of your education to understand and voice the concerns of people who need an advocate. Put down your sign and pick up your shovel.

Tuesday, August 30, 2011

Love, the Self, and the Divine

My friend and fellow writer and teacher Jessa Sexton once very flatteringly quoted me in a book. At some point, I told her that "you cannot love others until you learn to love yourself," or something like that. What I think I meant was that you cannot value others until you are able to value yourself, something that makes sense. As soon as Jessa told me that she wanted to quote me, I was embarrassed by this quote. Obviously, Jessa is my friend and I allowed her to quote me (I was pleased to have ever said anything that anyone found inspirational, especially someone as intelligent as my friend), but not without a little shame--mostly because the quote feels a bit cliched to me. But as I look at the quote now, I realize something else that I dislike about it. It assumes, I think, a kind of emotional state in which my ability to love others is dependent on my ability to feel good about myself. It's a matter of self esteem, a concept I've since come to see as dubious, fleeting, and perhaps even irrelevant.

At 32 years old, having been married now for ten years to a fantastic wife with whom I have had two sons, I find that my thinking about love and what it is has changed dramatically. Love is, and this is every bit as cliched as my quote in Jessa's book, much more active than my original quote assumed. It is not a feeling one has, though feeling is certainly a part of it. Love is something one does. It is service and sacrifice. It is treating others as if they have been made in God's image, an understanding of the value of individuals.

With that in mind, I'd like to revise my thinking in my old quote. In fact, I'd like to reverse it: I could not love myself until I learned to love others.

You see, despite my seemingly arrogant bravado, I am a very self-conscious person. I often doubt my intelligence, question my talent, and fail to see my own worth. Often, it seems to me that there's just not much to love about myself. But, rather in opposition to my old quote, it is when I love others that I begin to understand my place and consequently my own worth as a child of God.

My faith has taught me that God is, above any other aspect of his complex personality, defined by love. It is therefore no surprise that his most important commandment to the people he has declared to be created in his image is to love. My capacity to love, as God has loved, is one of the most important ways in which I am created in the image of God.

Therefore, it is when I love others--through service, through kindness, through grace, through sharing a meal or a kind word--that I am most able to see God in me. And when I realize that God lives within me, that I am in essence a container for the divine, I am able to see and accept my own value. When I see others through God's eyes, I see myself through them as well.

Thus, in my love for others, I develop a purpose and a sense of self--and a love for what I am and what I am called to do.

Monday, August 01, 2011

Who's Kelly Dodson? One year after the Bed Intruder

July 28th was the one-year anniversary of the original airing of the WAFF-Huntsville story that made Antoine Dodson a household name. The story itself went viral. Then it became a commercial success for Dodson when the Gregory Brothers auto-tuned the interview, creating the Bed Intruder Song. The song propelled both Dodson and the Gregory Brothers (who had been auto-tuning news clips for some time already) into Web 2.0 fame.

In the year since, I've shared both versions of the video on Facebook, talked about the Gregory Brothers remix in PhD-level rhetoric and composition courses, shown the video to four sections of English Composition, and read articles, papers, and blog posts about what the phenomenon means with regard to old notions of authorship, copyright, creativity, and so on.

As interesting as the phenomenon has been to me as a student of rhetoric, there is one element of the story that has always troubled me--that is, the exploitative element of the story itself.

When the original story first became a meme, I was troubled by what I saw as an act of exploitation by WAFF in its piece. I worried that what made the story so compelling, the reason it was being passed around the internet so much, and indeed the reason WAFF edited the interviews the way it did, was that Antoine Dodson is so laughable. The troubling thing is that what makes Dodson so laughable is that he seems to fit so many stereotypes. We instantly peg him as a ghetto-dwelling, Ebonics-speaking homosexual--proof that the inner city really is inhabited by the something-less-than-humans that middle-class suburbanites think it is.

Of course, the part of me that was troubled by the fact that a news station would take advantage of these aspects of Dodson's persona was soon silenced by Dodson himself. It was quickly apparent that he was soaking up the attention, and loving it. And as soon as the Gregory Brothers song began selling on iTunes, he began to make money off the story as well. So, if the story was exploitative, it was at least co-exploitative. Dodson gained from it as much as WAFF did.

But what gets lost in a discussion of whether or not Dodson was exploited is what is lost in the original story as well. That is: the victim, Kelly Dodson.

When the story first aired, I noticed the discrepancy between the amount of time the 2-minute, 3-second story devoted to Antoine Dodson (three clips for a total of 31 seconds) and the time given to Kelly Dodson, his sister (two clips for a total of 7 seconds). So, the brother of the victim gets more than four times the airtime of the victim herself. At the time, I took this as evidence that WAFF was exploiting Antoine Dodson's compelling ghetto character at the expense of Kelly. I'm ashamed to say that it was not until much later that I considered whether this was a disservice to Kelly more than it was to Antoine. We know that Kelly must have had more than 7 seconds' worth of things to say, because the story itself contains a clip of her talking and pointing that is dubbed over with the voice of the reporter. So, what of Kelly's story was edited out to make room for Antoine's ridiculous, self-aggrandizing rant?

I have little doubt that the story was edited the way it was to feature Antoine because Antoine was, frankly, more fun to watch--because he was so laughable. But the fact is, this was supposed to be a story about an attempted rape. A man violated the sanctity of Kelly Dodson's bedroom in an attempt to violate her body as well. Such a crime is not and should never have been a joke.

The fact that it has become one is certainly an interesting academic phenomenon. But it also ought to trouble us. A news station may very well have robbed a woman of her story in order to amuse its viewers with the antics of her brother. And we lapped it up and laughed it up. Should we be okay with this?

Monday, July 18, 2011

Notes on Hedging and Introversion

A friend recently pointed me (via Facebook, of course) to a Carl King blog post in which he outlines ten "myths about introverts." He bases these on his reading of the book The Introvert Advantage by Marti Olsen Laney and on his own experiences. As an introvert myself, I found that the list struck a chord with me, but I was especially interested in his third myth. He says:

Myth #3 – Introverts are rude.
Introverts often don’t see a reason for beating around the bush with social pleasantries. They want everyone to just be real and honest. Unfortunately, this is not acceptable in most settings, so Introverts can feel a lot of pressure to fit in, which they find exhausting.

This one stands out to me because I am often thought of as rude, aloof, and any number of other horrible things. It's something about myself that I have hated and have made concerted efforts to change. But the fact is, I don't ever even know when I'm doing something that others think is rude. And I am, in fact, far from aloof; I am actually extremely emotional, though I am completely out of my element when trying to figure out how to express emotion in a socially acceptable way (so I simply don't).

With regard to my accidental rudeness, I recently had an experience that I found enlightening, as I figured out what my misstep was and why one particular person thought I was being rude when I, in fact, had no intention of being so.

Recently, while on a call, a coworker with whom I rarely work approached and asked, "Do you need anything?" I was extremely busy: I was on the phone with our crime information unit, I was having an IM conversation with my dispatcher, and I was talking on the radio to our helicopter, all at the same time. None of this, however, was anything she could do for me, so I said simply, "No." She said something I didn't really hear, in a tone of voice that was clearly perturbed, and drove away quickly. She then sent me a message that said, "Sorry I asked."

She apparently found my very terse answer rude. Of course, I had not intended to be rude. I was very busy, so I answered her question as directly and as quickly as I could. As a student of rhetoric, I was instantly interested in why my intention had miscarried. The fact is, and this is something I had not thought much about until this incident, there is an awful lot of hedging in our day-to-day communication.

"Hedging" is the process through which we qualify our language in order to soften the points we make or the things we say. Our modern American culture seems to prize hedging a great deal. We are expected to use softening expressions like "in my opinion," "I think," "but thank you anyway," and so on. It's a linguistic cue designed to be somewhat self-effacing ("this is what I believe, but it's only my opinion") and thus to exalt the other interlocutor.

Of course, these are dense, unspoken (and arbitrary) conventions designed to provide communicative context, especially with regard to how the speaker positions himself relative to the listener. In other words, by softening one's language, one in essence shows that he is not aggressive. Hedging is the linguistic equivalent of a dog's refusal to look her owner in the eye, in order to show that she is passive. And in a culture that presumes everyone to be equal, it's very important to us that we show through these cues that we are not trying to be the big dog.

It is this sort of dense communicative rule that is often lost on introverted folks like me. Perhaps this is because we tend to be fiercely interior and thus it does not occur to us that we will not be interpreted by our language alone, but also by these linguistic and contextual cues that we don't quite get--because we don't quite get other people.

Of course, while it has been a fun intellectual exercise for me to think about the importance of hedging, I don't expect that I'll start doing more of it. That is because I will continually fail to realize that I'm not doing it already.

Sunday, June 19, 2011

Fathers, Sons, and the Badge: Just in Time for Father's Day


The truth is, though my close friends will already know this, I have very little in common with my father. He and my mother divorced when I was very young, my mother remarried, and I have basically been raised by my step-father, whom I consider as legitimately titled "father" as my "real" (my childhood term) dad. As a child, I spent weekends with him, as is the case in many divorced families, but as I got older and had more and more things going on, I saw less and less of my father. We are, therefore, very different in many ways.

But none of this ever stopped me from idolizing my father, as all sons do. So I was always impressed with my dad, the police officer, as a young boy. I was impressed with the uniform, with watching him work on his patrol car so he could go to work, and with the relatively mild stories he would bring home. I was in fact so impressed that I began "arresting" neighborhood kids. My dad tells a story about hearing a loud commotion in front of the house once, and coming outside to find that I had a row of young Hispanic kids "jacked up" against his take-home car. It might come as no surprise, then, that I would grow up to become an officer myself (the fact that my step-father is a firefighter also meant that I had my choice of hero jobs to follow. . .I chose the one I thought looked more fun).

Nearly eight years after becoming an officer myself, I sat with my father today on Father's Day, eating grilled brats and talking about the job. Of course, I've "talked shop" with my dad countless times since joining the department, but not until today did it occur to me how interesting and unique these conversations were.

I came to this today because my in-laws were over as well and, in the middle of my conversation with my father, I grew suddenly aware and concerned that they were totally left out of the conversation. I don't know if they felt that they were, and I'm sure that they must have found some of our stories at least entertaining. But the fact is, educated and open as my in-laws are, they could never be anything more than entertained outsiders in our conversation. Indeed, I forgot they were there entirely, and wasn't even thinking of them as an audience. I was too busy talking about the family business.

This made me realize what a unique type of shared family experience these stories represent. Yet, as strange as this family dynamic must seem, it is one of the most often written-about aspects of police families. Edward Conlon (a third-generation cop) writes about his conversations with his father, an FBI agent, in his book Blue Blood. Brian McDonald, author of My Father's Gun, recounts watching his brother (also third generation) share stories with his father. The History Channel mini-series based on and named for his book references these conversations as well.

In the police family, all family gatherings become business lunches, group therapy sessions, water cooler talks, or any number of other things. Ultimately, though, these conversations represent an extraordinary and unique form of bonding. There are not very many sets of fathers and sons that work under the same incompetent structure, deal with the same types of "customers," and ultimately stare into the same abyss.

The police family is a rare type of family where father and son are also brothers--members of the same tight-knit fraternity. For me, my shared experiences with my father have given me much to talk about with a man who is otherwise hard to talk to, and with whom I might not have much to talk about. We've experienced the same kinds of fights, lonely midnight shifts, and frustrations with a sometimes apathetic, sometimes critical public; we've experienced the same fears, and seen many of the same horrors.

This gives us a bond unique even among fathers and sons. Perhaps this bond, as well as the pride the job brings, is why so often police work becomes a family affair.

Monday, May 02, 2011

Bin Laden's Death and the Discursive Use of Scripture in Social Media

One of the interesting things about being both a Christian and an academic, particularly one whose interests lie in digital rhetoric, is getting to watch the interesting discursive practices of lay Christians as they navigate life. For this reason, the conversations in response to the death of Osama Bin Laden have been particularly interesting to me. So, here, in the midst of the moment, I'm dropping in to make some notes about this rhetorical event even as it unfolds.

Because social media sites allow such instantaneous, unreflective, and unrevised communication, they are prime sites for watching the ways in which people react to and discuss events, like the death of our current greatest enemy. For this reason, I've been watching facebook posts from my Christian (and a few former-Christian) friends.

I've seen three different threads running through my friends' facebook posts. They are:

1) God does not delight in death, even of evil men, so celebrating Bin Laden's death is inappropriate (Ez 18:23, for example).

2) In this, I'm including variations on one theme: God is just and therefore it is right that the evil should die, and the emotions attached to retributive justice are human and natural and therefore cannot be judged (Psalm 137:8-9)

3) People shouldn't be using the scripture to make these arguments, since we must, by definition, argue these scriptures out of context.

I'll address the first two by critiquing the third. The view that we should not be using scripture to back our thinking on this issue seems to rely on a couple of assumptions. First, that the authors of the status updates falling into the first two categories intend to use scripture as evidence that their viewpoint is the correct view. And, second, that they shouldn't do so.

This first assumption may very well be correct. Certainly, I've been watching a lot of arguments and implications that the author sees his or her view as the correct one and that others must get on board. The authors of the first category seem to be the bigger offenders in this regard. Some (by no means all, and probably not even most) of these authors wish to prove that those who rejoice in Bin Laden's death do not show the love of Jesus, and that rejoicing over death, even of a man as evil as Bin Laden, is therefore sinful. Implicit in such arguments is the suggestion that because I do have this kind of love, my Christianity is superior to yours. Certainly, I understand the critique of this type of argument. Such presentations of Christian thought make us look narrow, and completely out of touch with human experience/emotion.

Authors of the second category often seem instantly defensive, expecting or responding to the arguments of the first category. I can likewise understand the objection to this category. Such presentations of ourselves may make us seem unrepentant, showing a lack of understanding for how bad people become bad, and perhaps downright hypocritical (I'm supposed to love my enemies after all).

So I understand the objections to people using scripture to justify their own feelings on what is an extremely important national event. But I do not share them. This is because I question assumption number two.

One colleague of mine noted that this is evidence of why people ought not use the Bible as evidence in an argument. She says this from the perspective of a composition teacher in the Bible belt, where we are constantly trying to convince our students that there are very few academic arguments that benefit from citing scripture. That's just not the way academic writing works.

Indeed, the Bible is not written in such a way that it can work for arguments of scientific (social or natural) proof. Instead, the Bible is about the stuff of life. This is why I critique authors in category three. Writers in this category understand that the Bible cannot be used to decide which side is right and thus they argue that it should not be.

I, however, think that this is exactly what scripture is for. Scripture is not helpful in arguing objective truth; it is, however, designed to help us navigate the subjective experience of human life.

The reason writers in both categories are able to find scripture that seems to back up their own feelings is because the Bible does in fact support both views. Jesus was the prince of peace, but he also came to bring division. As one friend of mine wrote "The Bible says not to get drunk. It also says God makes wine to make the heart glad. The Bible says God hates divorce and also says God divorced Israel. Go Figure." The Bible contains such contradictions (and they are contradictions) because life itself is contradictory. The Bible is complex because people are complex.

The purpose of these scriptures, then, is to help us make sense of and discuss the often contradictory emotions we experience in these events. My colleague mentioned to me that half the posts she saw used the Bible to support the idea that we should rejoice and the other half were used to support the idea that we shouldn't. My response was that this is because both were correct.

This is why I applaud the open conversations taking place on this issue. By discussing these events, and by struggling with our experiences and with how scripture relates to them, we gain a picture of the complexity of life and the complexity of God. Some in category three have expressed that these types of public debates are dangerous because they make Christians seem divided. But I have often thought that the church's role in a post-modern culture ought to be to show the world that we too are complex and disintegrated subjects. That we too struggle with how we ought to feel and act. That there is room for other people to struggle with us.

We ought to embrace our own multiplicities and wrestle with them. In this way we, like Jacob, wrestle with God.

Tuesday, March 29, 2011

Radiohead and Samuel Beckett

In a paper I wrote during my MA, I argued that Samuel Beckett's writing career can be described as a process of deconstructing the theatre. Though I still have work to do to fully flesh out this claim (if I ever get around to it), Beckett's work suggests this to me because his work, especially toward the end of his career, becomes more and more minimalistic.

Through their repetitions, Beckett's early plays already subvert cherished theatrical conventions with regard to the plot model which requires rising action, climax, and falling action. Conflict is also a tricky concept in Beckett's plays. As his career progressed, however, he abandons more and more theatrical conventions. His plays get increasingly shorter and he begins to dissect his characters--rather literally.

In "Happy Days," Winnie is buried in a mound, without use of her legs; body parts, then, are already disappearing in Beckett's plays. In "Not I," only a mouth is visible on an otherwise dark stage, and in "Breath" no character appears at all. There is only a stage covered in rubbish and the sound of breathing. At the end of his career, Beckett began writing radio plays in English, his characters finally completely disembodied and no longer physically present in the space of a theatre, existing only as sound waves.

Again, I'm not able to make the case fully that Beckett is trying to deconstruct the theatre in a linear and progressive way across his career. I'm not so sure the chronology of his plays allows this argument. But certainly his career trends this direction.

So this morning Radiohead's "The King of Limbs" hit stores, and I got to Best Buy thirty minutes after they opened to buy it (I still like CDs, or I could have bought it digitally on February 18th).

When I listened to it, I immediately thought of Samuel Beckett. It's perhaps a strange thing to compare a modernist playwright to a post-modern rock band. But I see Radiohead as doing to rock music what Beckett did to the theatre. As Radiohead moves forward in time, they are seemingly deconstructing rock music.

They began innocuously enough with their 1993 post-grunge "Pablo Honey" which some have taken to be a subversion of grunge music (using as evidence the eerie similarity between "My Iron Lung" and Nirvana's "Heart Shaped Box.")

Thom Yorke has expressed dissatisfaction with that first album, and their albums since have gotten progressively more ethereal and less musically unified. Their last two albums, "In Rainbows" and now "The King of Limbs," are particularly deconstructive. One blogger even calls "The King of Limbs" Radiohead's "least accessible album to date."

In both these albums, the music has become more and more electronic, increasing the level of mediation between audience and artist. The songs are also almost completely without hooks, making it easy to get lost in the music, rather than to sing along with the songs. Finally, it often seems that the melody (where one is recognizable) and the rhythm section are in two different meters, creating a disjointed feeling, as if one can never quite catch up.

The wonderful thing about Radiohead's dissection of music is that it questions the genre and indeed music theory in general. Radiohead seems to be asking just how important our Western notions of aesthetic actually are. Can music that doesn't conform to this aesthetic succeed, especially in a genre as flippant as Rock?

What Radiohead is doing is extremely interesting, artistic, and intelligent. I hyperbolically predict that, if Radiohead continues to make albums, their music will eventually consist only of screeches, feedback, and disembodied sounds. And I'm looking forward to buying these.

Friday, March 25, 2011

Memoirs of Spruill, a Plagiarist

In my composition classes, my students just read their rhetoric's chapters on plagiarism and how to avoid it. Of course, because of my own ideological perspectives, I have wanted to subvert common-sense, naturalized views of plagiarism and copyright as moral/ethical issues. So, my students are also researching the history of copyright and blogging about their thoughts.

Teaching these chapters and thinking about the issues from the perspective of an academic interested in new literacies (where issues of what constitutes plagiarism and copyright infringement are considered problematic) has made me recall the one time in my life that I myself plagiarized. What follows is my own confession along with a self-analysis of why I plagiarized.

As a sophomore at Moore High School, I was assigned a term paper in my Biology class. The way the assignment worked was as follows: the teacher had a list of possible topics which students were to sign up for, so that there was never more than one person writing about the same topic (probably to combat plagiarism, since students couldn't copy off of other students). I was one of the last students to get a chance to sign up, so I got stuck with the topic of Prostatitis.

The night before the assignment was due, I realized that I hadn't done a bit of research or writing. So, I sat down with an old medical reference book that my mother had bought second hand when I was a baby and she wanted to stop calling the pediatrician for everything. I performed the classical plagiarist move of copying the book but changing words that I didn't know so that they didn't seem out of place.

I was (and still am) the kind of kid who was scared to death to do anything bad, because I was always convinced that I would get caught. So when I got the paper back with the comment, "you should write for medical journals," I just knew that the teacher knew and that his response was a subtle way of saying "I know where you got this." But he didn't, and I got an A, both on the paper and in the class.

Now, years later, I am a writing teacher. So it's valuable to look back at my own crime and analyze why I did it. After all, I was not a lazy student. I was a very good student, which is actually probably at least partially what allowed me to get away with it. So, why do good students plagiarize? Obviously, I do not claim that my experience is representative, but here is what led me to commit plagiarism:

1) The topic. I was a good student, who loved to read. But this topic was a total dud. By the time I was able to pick a topic, I could only pick a topic in which I was not at all interested and a topic that was actually kind of creepy (high school boys do not want to think about or talk about inflammation of the prostate). This problem led to problem two.

2) I waited too long. Only hours before I was to turn the paper in, I still had not started. Many teachers agree that it is often the good students who plagiarize. This is a product of last-minute pressure on students who are afraid of failing an assignment that they waited until the last minute to begin. So there is pressure on good students to perform well. For this reason, a good student will not want to turn in some half-baked crap. And, for me, this problem was exacerbated by the fact that the assignment was a dud. I am and was a meticulous student, but I am and was also a bit ADD. So, the fact that the assignment was not the least bit interesting to me made it awfully easy for me to neglect.

So what, then, are the implications of my story? How do we see to it that our assignments discourage, rather than encourage, plagiarism? To finish off here, I will address what we can do to write assignments that will help our students avoid the pitfalls that I fell into.

First, design assignment topics that toe the line between too open and too narrow. Allowing a student to simply write about anything may keep a student from straying beyond his comfort zone, and would make it easier for a student to hand in papers from the fraternity file. So an assignment should be specific enough and closed enough that the student must write his own material in order to fulfill the assignment.

But there must also be enough wiggle room in the assignment that the student does have at least some input into the topic. Freedom within boundaries is the goal. This allows the student to make the assignment her own in such a way that it will hold her interest, at least enough to get the assignment done.

Secondly, revision should be built into the assignment sequence. Though, by and large, we all accept the importance of revision, a number of classes still do not incorporate revision into the class. A teacher should be viewing, and commenting on, multiple drafts of each paper. Doing so allows the teacher to watch the process of the student, which makes last-minute plagiarism much more difficult. It also provides milestones for the student so that the student must work on the paper bit by bit, draft by draft. This negates the possibility that the student is waiting until the last second to begin a paper.

I don't intend to argue that these techniques will solve the plagiarism problem. Though these are cornerstones of the writing process in my classes, I still do face about one plagiarism case a semester. But these do go a long way toward solving the problems that cause otherwise conscientious students to plagiarize.

Tuesday, February 15, 2011

Public Broadcasting, Democracy, and Market Economy

Since Plato, our belief has been that a well-ordered democracy requires a well-educated public. Part of the project of ensuring such a public in modern America has been public broadcasting. However, on Wednesday, House Republicans proposed a budget that would end funding to the Corporation for Public Broadcasting, which receives a tiny .0001% of the federal budget. When I posted a link on facebook to an online petition against the upcoming cut, I aroused the well-meaning and respectful ire of some of my conservative friends who believe that public broadcasting ought to be left to market forces. Their argument, to wit: if people want the educational programming offered on PBS and NPR, advertisers will pay for it. Therefore, let the market bear it out.

The problem with such a view is that broadcasting paid for by advertising (thus, "the market") inevitably follows the ideological perspectives of the advertisers. Thus, large advertisers with lots of money quickly hold a monopoly of information.

To find evidence of the problems with this, I need look no further than my home in Oklahoma City. During the debate surrounding a recent tax extension, the state's largest newspaper, according to a local republican campaign strategist, refused to report stories which held the tax proposal in a negative light because its largest advertiser (the Chamber of Commerce) was in favor of the proposal. The most interesting thing about this situation is that the newspaper is unabashedly conservative. So, a conservative newspaper refused to publish arguments against a tax because its advertisers wanted it. In this way, the market was able to make a newspaper violate its own ideology, and effectively quash dissenting views. Thus, the supposedly fair and democratic marketplace, which was controlled by a powerful few, effectively censored the press. Public Broadcasting, which hands money to LOCAL stations, allows the stations to base their programming on what its local viewers and producers feel is valuable, free from the pressure of the large advertiser who may not have the best interest of democracy at heart. (See this blog by Rep. Earl Blumenauer)

What's also true is that what the market wants and what the people need are not always the same thing. The public WANTS American Idol. So American Idol is what the market will bear, but it's hardly what is necessary to ensure a citizenry capable of voting on something other than which seventeen year old singer is the most dreamy.

The assumption that the "market" will always allow what's best to win out (which has oddly become the central tenet of the modern Republican party) is naive at best. This is because the market has constantly shown that it will, if it must, sacrifice the good of democracy for the strength of the bottom line. Though we insist on a system of checks and balances in the federal government, no such system is inherent in the market.

An extraordinary amount of our legislation exists to act as a check to this power. It has often been argued that the consumer serves as the check to the power of industry by choosing what to buy and what not to buy, but this is only partially true. Indeed, a very large percentage of my income is spent on things that I need. I have no choice but to buy them. I must put gas in my car to get home, and the industry that produces that product is controlled by a small handful of people who can and do exert extraordinary control. This is the power that comes with controlling a necessary commodity. So, though in theory, the market polices itself, in practice things happen differently. After all, if a handful of businessmen can do, on a national level, what the Chamber of Commerce did in OKC, what results is known as an oligarchy. Thus, a capitalist market has the same potential to become tyrannical as it does to remain democratic. This is why we write legislation to control business--to provide a check on the power of industry, just as congress checks the power of the president.

Public Broadcasting is a part of that tradition. It ensures that the people with the most money don't get to own information.

Finally, I find the connection between free market and democracy implicit in these arguments a little problematic. Capitalism is a market system; not a system of government. As I hope my earlier syllogistic demonstration of how capitalism can in fact promote tyranny and the anecdotal evidence of the OKC tax debate have shown, democracy and capitalism are not the same thing.

That being said, the institutions that are designed to protect democracy need not and ought not be judged according to their market value, but according to their importance in promoting and supporting democracy (Can you imagine if we subjected the US military to market forces? It hasn't turned a profit since WWII). I think people probably assume that the public school system exists to prepare students for the workforce. However, the pioneers of the public school movement (most notably John Dewey) never connected schooling with the market--such a connection is actually a much more recent phenomenon. Instead, they saw education as necessary in a vital democracy. Thus, the public school system is designed for no other purpose than to ensure an educated voter pool. Public Broadcasting was developed as a modern extension of that same project. Public schools exist to foster preparation for democracy through education; public broadcasting exists to foster participation in democracy through education. And if education is indeed vital to democracy, it must be protected and supported regardless of its market value.

This debate ultimately stems from the notion that to be "conservative" means that one supports saving tax payer money while prodding the market. But the word itself suggests that to be conservative is to hold on to our past ideals, one of the oldest and most important of which is that a well-educated people can govern itself. Public broadcasting is part of that tradition. It exists to educate those who would self-govern and it thus helps to protect democracy. And I happen to think that protecting democracy is a much more conservative ideal than protecting the market.

Saturday, February 12, 2011

Questioning Art: Who Gets to be Called Artist?

Today was date day for Charissa and me, the gloriously rare day when we leave our kids somewhere and spend the day running around doing things. Not having gotten many chances to go since the boys were born, we decided on a trip to the Oklahoma City Museum of Art, one of our favorite and most frequented places before the boys came.

The museum has been re-arranged since we were there last, as it has some new collections, and seeing some of the new items, along with my criticism of some of the old, has led me to some of the open theoretical questions concerning art, what it is, and who gets to make it.

These are famous modern questions with which art students and critical consumers of art will already be familiar. Indeed, as I suggested in my recent post regarding Duchamp, much of the work of modern artists has been to ask these questions as a way of critiquing their own artistic traditions. This has been a modern and post-modern concern in all the arts. Theatre has Grotowski, Brecht, and Artaud, who played with theatre conventions by disintegrating the fourth wall, deconstructing the spoken word, and abolishing elaborate sets in favor of warehouses and street corners. Literature has Pynchon who broke the conventions of space-time, Vonnegut who destroyed any illusion of the death of the author, and Saporta who destroyed the convention of linearity altogether. And visual art had of course Duchamp, Pollock, and so on. So the questions that I ask here are not new, but I am asking them anyway, in response to my own confrontation with these questions in my little hometown museum.

The centerpiece of the OKC-MOA is its large Dale Chihuly exhibit. Chihuly began his career in stained glass and accidentally discovered the ancient art of glass blowing while fooling around with some of his glass. Henceforth, he has become world renowned for his work in blown glass. The reason Chihuly fits into my questions here is that many years ago, just as his work was getting larger and more complex, he was involved in an accident which rendered him blind in one eye. Having lost his depth perception, he gave up entirely doing the hands-on work involved in creating his own work. Instead, his sculptures are actually formed by a large number of apprentices, based on rather abstract paintings that Chihuly paints. The paintings are not schematic at all, and therefore do not serve as any type of blueprint for the actual glass sculptures. Instead, they seem to act more as inspiration for the apprentices who will make the sculptures as Chihuly stands behind them pleasantly shouting orders.

For years I've thought of this as an interesting extension of the work of other 20th century artists. Modern artists have asked the question of "who is the artist" in interesting ways. Duchamp critiqued the notion of an inspired genius when he began arranging things found in everyday life and calling them found art sculptures. Others have done similar things in their art. But in each of these cases, artists put their hands on objects not typically considered artistic and asked, "does it make it art that I, an artist, touched it?" Chihuly extends this questioning even further because he doesn't touch anything. Instead, in his work, he is not the artist because he crafted the actual piece, but because it came from his mind. The actual building of the sculptures is performed by apprentices who are, we presume, craftsmen and not artists.

Of course, as I've grown more sympathetic to the Marxist complaint, this definition of the artist has come to bother me. Chihuly hires an extraordinary number of young artists, they create magnificent sculptures, and he gets to put his name on the work. And so, with little physical input of his own, he has grown to world renown on the backs of people we've never heard of. In this system, the artist is the one who gets credit as the "idea man," while his workers, the artistic proletariat if you will, languish in obscurity and in the heat of the glass kiln.

I continue this question of who gets to be called artist as I move on to the work of Alfonso Ossorio. His bizarre collage sculpture INXIT is the centerpiece of a new collection of his work at the OKC-MOA. The piece is a door and door frame with an extraordinary hodge-podge of strange stuff glued to it including animal bones, plastic birds, and a human skull, all of it creepily encrusted with glass eyeballs. The interesting thing about it is that it looks like every road-side oddity created by any local crazy man who ever donned greasy overalls. Upon seeing this piece, I turned to Charissa and asked, "do you suppose that artists sometimes trick us by saying 'I'm going to throw some bull-crap together and you have to take it seriously because I have an MFA'?"

This theme continued when we watched a film introducing the MOA's temporary exhibit of the constructivist sculptures of Jill Downen. In the film, she discusses how she became obsessed with texture while being placed in time-out by her mother. She noticed a crack on the wall and became so excited by it that her mother no longer used time-out as a punishment, making her wash dishes instead. The film then goes on to show her walking around an empty New York apartment with a small video camera, gushing over the textures of the interesting apartment. The film shows a clip of one of these videos, which she uses as inspiration. The shot is zoomed in unnaturally closely and the video is shaky, and the whole thing is dubbed over by Downen saying odd, artsy things. It's strangely reminiscent of "the Blair Witch Project."

It occurred to me that, in both the cases of Downen and Ossorio, this art is being produced by people who, if they were anything but artists, would be taken as mentally ill. I told Charissa at this point that the only difference between an artist and a lunatic is which side of the river the person went to school on. If she is studying at MIT, she is a maniac. If he is at Harvard, he is an artist.

This is, of course, a joke, but it makes a serious point. That is, the label of "artist" is perhaps a great deal more arbitrary than we have often assumed. The difference between Ossorio and the goofball in Memphis who painted his house pink and glued bizarre stuff to it is a Harvard degree. The difference between Downen and a troubled kid with a penchant for taking strange videos is a Danforth Scholarship at Washington. In the case of Duchamp's "Fountain," the difference between a urinal and a sculpture is whether it stands in an art gallery.

This is not to say that these are not talented artists. They are, without a doubt. This is to say that who gets to become an artist and who goes unnoticed forever has as much to do with the relatively arbitrary forces of access and educational opportunity as it does with artistic "genius." It bears repeating here that this idea is not new to me. Many artists themselves feel this tension and play with these concerns in their own work. But, having been confronted with these truths, I had to make something of them.

Friday, February 11, 2011

Writing Students (and Teachers), Take Heart; What We're Asking You to Do Is Really Hard

Writing terrifies students.

Many of us who have been writing for a long time, or are "talented" writers, see writing as a relatively natural process. It's only putting language down on paper, after all. Yet our students shriek, shake, and cry with fear when we give them even "simple" writing assignments. Indeed, the act of writing paralyzes our students.

Flower and Hayes have explained the cognitive processes involved in producing actual written language, and their explanation helps to show why the act of writing so befuddles our students. They explain that "the information generated in planning may be represented in a variety of symbol systems other than language." The ideas that generate writing often come in the form of images, sense memory, and emotions, and "even when the planning process represents one's thought in words, that representation is unlikely to be in the elaborate syntax of written English" (1981). The act of writing, to Flower and Hayes, is an act of translating ideas (which are non-linear and jumbled) into linear written English.

Writing, therefore, is an extremely complex cognitive process, one which requires our students to produce formal written English, for discourses with which they are still unfamiliar, out of the jumbled mess of human cognition.

It's no wonder then that many of our students are overwhelmed by the task of writing. What we are asking them to do is difficult stuff--difficult stuff that we ourselves have often taken for granted. This is something to remember when we are frustrated that our students "just aren't getting it." We are asking them to lift heavy weights.

Since this is true, we must be careful not to assume that good writing just happens. Good writing is a carefully developed skill. We should see ourselves less as gurus or shamans guiding students through the mysterious, spiritual process of writing, hoping that exposure to our gods will magically enlighten them, and more as physical trainers helping them learn to isolate their writing muscles. Though the act of writing is complex and recursive, the processes that make up this complex act can be isolated and trained. If we can help our students do this, then the act becomes easier and more natural (as it is for us) when they must put these processes back together to complete acts of meaningful writing.

But this requires careful work and patience on the part of both the teacher and the writer. So take heart, if it seems like this stuff is really hard, that's because it is. But just as you wouldn't give up working out the first time your muscles are sore, so you must not give up the first time your brain hurts. This includes both you students and you teachers.

Tuesday, February 01, 2011

The Right Hand is High School English; The Left Hand is College Composition

My freshmen often express disillusionment with their transition from high school English to college composition. Many of them feel (and I agree) that their high school English classes have left them unprepared for college writing in general, and for my composition classes in particular. As a teacher, it is easy for me to suggest that such culture shock is a natural part of the learning process. As students advance both educationally and cognitively, it makes sense that the pedagogies upon which they once relied and within which they once thrived will come to seem inadequate. My students, on the other hand, have no such benign perspective. Many of them feel cheated, even led astray, by high school teachers who would often justify hard or tedious assignments by claiming that they were "preparing them for college."

Thus, when my teaching points out the inadequacies of the five paragraph essay, or when I challenge often overly-pedantic views of first person writing and so on, my students respond bitterly about a secondary education that was supposed to "prepare them for college" but that has failed to do so. My students often hyperbolically express their desire to injure, maim, or kill their high school teachers for their false teaching. In fact, a student once told me that when he had gone home for a visit, he ran into his high school English teacher at Wal-Mart and he told her, right there in the store, that she owed him an apology.

On one hand, our students' feelings are natural to the educational experience and we need not worry about them; I remember having these same feelings as an undergraduate who had excelled in my high school English classes and had tested out of Composition I. But there is also a real and legitimate criticism couched in the responses of my students. At least from my perspective as a composition teacher, there is little curricular alignment between high school English programs and the college English departments their students are entering. There are, unfortunately, some unavoidable reasons for this.

Most high school English programs combine both the teaching of writing and of literature. And because most English teachers enter the field because of their love for literature, teaching about the history and interpretation of literature becomes the primary focus of the class. In fact, writing instruction in most high school English classes takes place while students are writing about the literary works they are reading. Thus, even in the English classroom, writing is treated as more of a skill set necessary for but peripheral to the real subject of the class. Writing is an activity that supports the teaching of literature, rather than being a subject of its own.

More significant, though, is that high school and college teachers don’t seem to know what the other is actually doing. This should not be particularly surprising. Many high school teachers do not have the educational credentials, and few have the time, to teach as adjuncts in college composition programs. At the same time, only a few of the composition teachers I know have taught in secondary English programs. Though it’s an ever-present mantra in high school English classes to say that the class is “preparing students for college,” teachers of high school English know very little about what we actually do in the composition classroom.

To our shame, many first year writing programs have done little in the way of supporting secondary English programs. There are many researchers in the universities who are studying the writing habits, rhetorical prowess, and language usage of high school writers and proposing pedagogies based on their findings. But the lessons we are learning from this research seem too often to be getting lost in the ether. High school teachers aren’t reading our journals, and we’re not visiting their classrooms.

It is my admittedly un-researched argument here that there is a palpable disconnect between first-year composition programs and secondary English programs. It is also my assertion that this is a problem that we ought to work toward fixing. Without a doubt, true curriculum alignment will not be possible. The two enterprises are different enough to prevent this. Though high schools do indeed make it their goal to prepare students for higher education, they also have the burden of universal education. So it is also their goal to prepare students who will not and, perhaps, cannot go to college. Furthermore, high schools do not simply send students to the nearest state university, but instead send students to the four winds. Therefore, high schools cannot hope to account for the numerous pedagogical approaches at different institutions.

But even without some form of specific curriculum alignment, we can improve our teaching in both high school and college by fostering better communication between college English departments and high school programs. We can help high school teachers better prepare their students for us by making clearer what we do and what types of writing we privilege. We can also find ways to make sure that the research we do about their students gets back to them, so that the latest research a teacher knows isn’t simply what they learned in their English education program while they themselves were in college. Furthermore, if we hate the five paragraph essay, we ought to be searching for and implementing new techniques to teach organization and invention. We are, after all, the research wing of the educational enterprise. And if teachers cling to these forms that we so hate because they work, it falls upon us to find something else that works and that isn’t antithetical to what we teach.

Ultimately, we must recognize that we are not involved in mutually exclusive projects, but rather we are indeed colleagues whose work can and should influence one another’s practices.

Thursday, January 27, 2011

Duchamp: Post-Modern Forerunner; Practical Jokester

In 1917, while in New York, French artist Marcel Duchamp bought a urinal from an ironworks company, signed it "R. Mutt," and called it "Fountain." The piece (if indeed it is one) has been considered a practical joke, but it has also been taken quite seriously. Duchamp added the piece to his collection of found-art objects, which he had titled "Readymades." Beginning in 1915, Duchamp displayed everyday objects in artful ways in order to critique the adoration of art and the artist. Though there is as much playfulness in his movement as there is philosophy, his concept of found art, and the statements behind it, prefigured important post-modern ideas a few decades before post-modernism became all the rage in Europe.

With his found art movement, Duchamp intended to question what defines art. Where does craftsmanship or utility end and where does art begin? What makes one piece of white earthenware featuring graceful geometric curves art, and what makes another a urinal?

Duchamp, who was critical of artist worship, played with the notion that by signing something and putting it in a gallery, one elevates it as art. This is particularly true if the signature is that of a well-respected artist. In fact, what allowed many 20th century artists to experiment with deconstructive styles was that they were already accomplished artists. Picasso's cubism might never have been accepted if not for his more traditional early work, his almost Classical Rose Period, and the aesthetically pleasing Blue Period. But, because he was who he was, he was able to experiment. People would accept the sometimes childish-looking work of cubism because it had his name on it.

Duchamp seemed to recognize this ability, and he purposefully questioned it by simply signing everyday objects and placing them in galleries, knowing that folks would therefore call the work art.

This suggests, then, that the label "art" is arbitrary. The fact that "Fountain" was critically received as art (it was, in fact, named the most influential artwork of the 20th century in 2004) suggests this. Its acclaim clearly had nothing to do with the graceful, arching lines of the urinal, or its pure white color, or the perfect classical geometry of its shape. If it had, it would be the original craftsman who made the first of these who deserves credit as "artist." Instead, Duchamp signed a pseudonym on it, included it in a gallery show, and it was art, simply because the right person (Duchamp) put it in the right place (a gallery).

This attack on the arbitrary nature of art is decidedly post-modern. Duchamp questions to what extent art is art because it has some intrinsic quality, and to what extent it is art simply because someone decided that it was. In this way Duchamp was, in 1917, already doing what post-modern thought would do decades later, just after World War II.