Sunday, September 01, 2024

Communication, Leadership, and Relationship

For the first time in a university setting, I have moved outside of first-year composition and into Business Communications. I am teaching a course as an adjunct at the University of Oklahoma called "Strategic Communication for Business Professionals." The material is incredibly fun for me to prepare and to teach because the class is an amalgam of my academic background in Rhetoric and Composition and the last decade of my professional experience as a leader and supervisor.

One really nice thing about the design of this class (the basic structure of which was provided by the department) is that our semester started with lessons on interpersonal communication, emotional intelligence, ethics, and credibility. The fact that the course recognizes that communication starts long before communicative acts actually occur is, for me, one of its great strengths. Communication does not happen independently of credibility and relationship, and so strengthening these is a vital precursor to effective communication.

Last week, the topic of our class was credibility. We discussed individual and organizational credibility, why it's important, and how to establish it. We interviewed a professional from a high-credibility organization and discussed case studies of organizations that had harmed their credibility.

As part of this process, we had a short set of group discussions and a debrief where students shared examples of the best leader they had ever had as well as the worst. None of the answers were surprising. The best leaders cared about those they led, led by example, and mentored followers. The worst leaders were rude, hyper-critical, or micromanaging. Again, not earth-shattering stuff to anyone who's ever worked anywhere, but what I was able to point out to my students after hearing their stories was that all of their examples were relational. Not one of their examples involved competency.

Certainly, competency is important too, but for one thing, it is less common for someone with no technical know-how to get promoted than it is for someone with sufficient job knowledge but little emotional intelligence. For another, problems with bosses who caused us emotional discomfort or harm are far more memorable. This conversation gave my students a salient example of why credibility and emotional intelligence are key foundations for communication.

In my previous assignment at my regular nine-to-five, I was working to develop the informal leadership capacity of our trainers and senior members. This had, in my opinion, been neglected in the past, and I did not feel that we had given our informal leaders the skills that we could have. As I spoke to them, I began trying to build a mantra out of the teaching that "it doesn't matter how right you are if you aren't also effective." The idea was that if you seek to influence others, you have to be attuned to what they are willing to be influenced by. You won't win them over by being right if they think your only interest is in being right. They must believe that you care about them. I've since added to this mantra (though I no longer have this particular group to repeat it to):

It doesn't matter how right you are if you are not also effective. And you cannot be effective without influence. And you cannot have influence without relationship.

Of course, this is too long to be a mantra, so probably my original should stand, and this should become a follow-up explanation, but I think the causal chain is important. Communication isn't just a matter of getting the facts right and building an unassailable logical connection between facts and your claims. If an audience doesn't accept you as credible, you will not communicate as effectively as you might have. And if your relationship with your audience is poor, you will have no credibility. If you want to influence others more effectively, you will need to invest in your relationships with them.

This is why relationship-building must be an integral part of any communication strategy. And the more difficult the conversations will be, the more this is true. In any organization, a good communicator should begin building sincere and positive relationships immediately, long before the communicative moment. Your effectiveness as a communicator will be directly correlated to the quality of your relationships.

Friday, July 17, 2020

Guide to Better Arguments, Step 2: Do Your Homework

In my first post for this series, I argued that step 1 to having better arguments was to assume the humanity of others. You may notice that this post follows the pattern of being about things that happen before or outside of an actual argument. This is on purpose, of course. Indeed, one of the big problems in contemporary arguments is that we have a tendency to simply enter into them. Or perhaps we are drawn into them. The fast-paced nature of modern information causes issues to seemingly appear instantly, and we have learned through experience that if we don't jump in fast, we'll be left behind.

Because of this, we have the unfortunate and problematic habit of entering conversations before we're really prepared to. There are things that need to happen before we begin arguing. The first, covered in the last post, was a matter of developing the right attitude. This one is a more practical matter. Step 2 is that we should resist the temptation to join an argument that we are not prepared for; instead, when an issue arises that we feel demands our response, we should begin by doing our homework.

In an interview with Princeton Press about his book On Bullshit, philosopher Harry Frankfurt suggests that:
It's a widespread view in a democratic society that a responsible citizen ought to have an opinion about everything. Well you can't know very much about everything, and so your opinions are likely to be based upon bullshit.
Frankfurt describes nicely a prevalent problem that leads to some very bad arguing. That is that we tend to think that we must enter every important conversation. We think this is true even if it turns out that we don't know very much about a topic. We skim a few news clips, perhaps follow a few links we find on social media, and we quickly form an opinion while the iron is still hot. If we want to avoid the problems this creates, we have to learn to do our homework. This seems obvious, but it requires some specific commitments.

Even before we begin, we must do our homework with the right attitude. We tend to form an opinion quickly, then begin looking for information that backs up what we already believe. This is a terrible way to research, but it's without question the most common way. To do this is to weaponize knowledge--it's gathering ammo to use against your opponents. Knowledge should be a peacemaker. Instead, when you research, research to learn. Instead of starting with an opinion that needs to be supported, start with a question. Start with a desire to understand a complex and difficult issue. Then research with the following commitments in mind.

1. Wait to form an opinion for as long as possible.


The British psychoanalyst Wilfred Bion adapted John Keats's concept of "negative capability," referring to it as "the ability to tolerate the pain and confusion of not knowing, rather than imposing ready-made or omnipotent certainties upon an ambiguous situation" (Meg Harris Williams, The Aesthetic Development). When teaching student writers, I described this as the ability to put off forming a solid opinion for as long as possible in order to evaluate all the available evidence and interpretations. This is indeed uncomfortable for us. We feel the need to know how we feel about something right now. This is especially true in contemporary culture, where we have a tendency to define ourselves by our politics. We think that our opinions are central to who we are. But if we are to argue better, we must begin thinking better. This means thinking longer.

My best real-world example of this is that in 2014, after the shooting of Michael Brown in Ferguson, Missouri, I was often asked for my opinion. I was, after all, a police supervisor. I was and am, arguably, an expert on police use of force. In fact, evaluating police uses of force and offering recommendations about them to police command staff is one of my primary responsibilities. On the other hand, I knew about this shooting only through news reports and media stories, just like everyone else. And I knew nothing about the offender, officer, department, or city where it happened. So when I was asked for my opinion, I would tell people that I didn't know much about it and therefore could not speak to this particular case. I would sometimes offer some philosophical and ethical thoughts on police use of force in general, but I would not speak on this case in particular until I felt like I had done a lot of homework. I read everything I could find. I read books about the so-called militarization of policing, and read both investigations into the event done by Eric Holder's Department of Justice. I did not publish a written opinion of the Ferguson incident for more than two years.

I don't suggest that everyone should refuse to have an opinion on any issue for two years. But I would suggest that people ought to have a meaningful commitment to evaluating an issue in long-form before forming an opinion and certainly before arguing with others about it.

This might mean, unfortunately, that when an important argument begins, you may have to decide to sit this one out. But fear not; if it's an important issue and a real problem, the subject will come up again (after all, here we are talking about police shootings again, six years after Ferguson). And next time, you'll speak as one with authority because you've actually done the work and taken the time to sit with the issue. In the meantime, it is okay to say that you don't know enough about something to speak on it.


2. Read deep and wide.


One very unfortunate consequence of our social media lives is that we are prone to build echo chambers for ourselves, where the other people in our feed tend to look, sound, and think like us. When we have those errant friends from our old lives whose opinions irritate us, we may put up with them for a while, but when we've had enough, we hide or unfriend them. Because of this habit, we can end up surrounded by opinions just like our own almost all of the time. We come to assume that most people, or at least most reasonable people, think like we do, and this makes views different from our own seem unnatural, foreign, and clearly wrong.

This causes us to naturalize our own views, and because our views seem self-evident, we begin to assume that people who argue views other than our own must be either politically motivated to knowingly lie, or they must be evil. This, of course, is a huge violation of Step 1.

This is only one of the reasons why you should read as widely as possible. Read multiple viewpoints, multiple presentations and interpretations of data, and read not just on the narrow issue you're interested in, but also on the issues surrounding it (for example, when researching inequities in public education, you may end up also reading about the history of urban sprawl and about school funding methods). This helps you to understand the complexity of the issue. Read people who you think you agree with and people you think you disagree with, and remember to assume their humanity. If they make arguments you think you disagree with, examine the ethical system that causes them to feel the way they do. You may find that their concerns are legitimate and in so doing find some common ground.

When you read a source, use this trick taught by Peter Elbow: read the source once as a believer and once as a doubter. This will help you both to read assuming the author's humanity and to read critically.


3. Evaluate your sources.


I suspect that people know that not all sources are created equal. But navigating these sources is not as easy as it may seem. People have a hard time knowing which sources are valuable and which are not. People are getting better (believe it or not) at recognizing sources that are trash. It may not always seem so, but we have come a long way since a fellow student in my freshman composition class thought that it was illegal to post untrue information on the internet (in fairness to her, in those days, most of us had never used the internet when we got to college except maybe to flirt in a chat room).

I'm afraid that many people evaluate sources based on whether or not those sources share their own political, religious, or other viewpoints. I hope that we can learn that this is not a valid way to judge a source. Instead, we need to learn to evaluate all sources critically and to decide how much value to place on each source. We can do this by evaluating several things about it. The first is a source's use of evidence. Sources handle evidence differently and for different purposes, and not everyone who presents data in a text understands how data works.

While it's not actually true that "you can make statistics mean whatever you want them to mean," it is also not true that any form of data speaks for itself. Data is tricky. It must be controlled and carefully interpreted. Knowing how to take raw data and make sense of it is a specialty, and it takes training. Though this is not entirely fool-proof, and within each of these categories there is a wide range of quality, for the most part this hierarchy will help guide you: a TV news story is better than a tweet by a politician; a print news article is better than a TV news story; a piece by an expert is better than a print news story; a peer-reviewed academic essay is better than a piece by an expert (although writing by an expert will usually rely on academic research). Again, this is not fool-proof, but it's based on each level of media's ability to give fuller detail, its commitment to verifiable data, and its understanding of how to interpret data. Understanding how these sources work will help you know how much weight to give each source. For example, it is common for newspaper articles to present data, often data that the paper has itself collected. This is generally well-meaning enough (no matter what media-haters think), but newspaper writers tend to present data as if it speaks for itself and as if it is self-evidently objective. This is not because they are trying to trick you or because they have an "agenda." It's simply because journalists are rarely trained as researchers. They are writers, and are not often trained to properly control data or to interpret it.

Beyond this, when you are evaluating a source you should ask whether the organization publishing it has specific goals in mind, who the author is and how they are trained, what the writer's motivations seem to be, and where the data the author uses came from (which may mean that you have to go evaluate yet another source). Be especially cautious when reading people you agree with. Just because you agree with them in principle does not mean that their argument is valid, or that they are treating data fairly. Be willing to disregard someone you agree with but who came to their conclusions unethically or illogically. Citing them will weaken your argument and make you seem less reasonable.


4. Consider all your opinions to be tentative.


While you should be very slow to form your opinions, you should be quick to notice when they need adjustment. Be willing to complicate your own opinions or even to change them entirely when you encounter new evidence that challenges what you thought you knew. I firmly believe that we need to normalize people changing their minds. Too often, when someone changes their mind, we accuse them of being fickle, or weak, or "saying whatever the audience wants to hear." Our ability to record and retain texts for much longer makes this problem worse. We love to confront people with old recordings or writings where they contradict themselves. We point it out as hypocrisy. This needs to change. We need to make it normal and honorable for people to change their minds.

What would help this is a commitment to transparency in our communication. When we change our mind, we shouldn't do so slyly or try to pretend that we've always felt this way. We should be willing to say "I used to believe this, but then I learned this, and that caused me to rethink the issue." Likewise, if you don't know something, be willing to say "I don't know, and so I don't have an opinion right now." I sometimes like to start sentences with, "I don't know if I believe myself, but I think _______."

Finally, our commitment to truth and to proper action must outweigh our desire to be right. It must outweigh our desire for political or social standing. It must be more important than getting our way, or teaching so and so a lesson, or any other personal, selfish goal. We have to stop seeing argument as a game to win, and start seeing it as a way to come to better understanding and better solutions.

Wednesday, July 15, 2020

Guide to Better Arguments, Step 1: Assume Humanity

After a couple months of social media hiatus (and a four year blog silence), I'm reintroducing myself by working to counter the negativity that led me to take a break in the first place. It's probably no secret that we have a social media problem. With all the great things it's given us, like the ability to stay in contact with old friends all over the world, get better connected to new friends, share images of our family and our meals, and so on, anyone with a heart and mind can see that it has created some serious problems. Perhaps the most obvious and insidious is that it has exposed, in a very real way, that we have an argument problem. The faceless, dehumanized nature of social media has magnified our inability to argue well and, perhaps among some, our unwillingness to. The fact that we can say whatever we want without the fear of getting punched in the nose has allowed us to be very nasty indeed, and I think has, in many ways, revealed who we really have been all along.

Part of the problem is that we are all convinced that we are not the problem. They are. And if they would just start seeing things like I do, there wouldn't be a problem. See the problem?

Furthermore, we've so thoroughly convinced ourselves that social media life isn't real life that we see it as an anything-goes, zero-sum political/social/philosophical game. But in fact we've done real damage in the actual world. This is because, as it turns out, internet life is real life after all. We've got social media presidents, social media news stories, and social media controversies. The problem is that we've learned to hold ideas about complex issues that are only 280 characters deep (but at least it's up from 140).

With all that in mind, I've decided that the least I can do is use my distant voice in the wilderness to try to help us be better. I would just as soon limit social media to cat videos, kid news, and food photos. But since people are going to use social media and the internet to argue, we might as well try to make it better.

When I was studying Rhetoric in graduate school, my sense of purpose was that we rhetoricians could be a guiding light in a culture that needs to learn to argue effectively and ethically. I thought we could really influence the world by influencing our students not to follow the patterns of cable news, sports talk shows, and political debates, but instead to follow a more Socratic path toward well-intentioned truth seeking. Ultimately, though, I started advancing in my civil service career and chose mammon over truth. Then I became disillusioned with the whole project: first doubting that my own field actually shared any of that ethical sensibility in the first place, and finally doubting that people were really able or willing to learn what we had to teach--we were, after all, just a gen-ed that students were trying to survive so they could get to their "real" classes. Nevertheless, in the hopes that there are some who want to be better, who want to argue better, I'm writing out this series of posts about how to argue effectively and humanely. There are other such works out there, most of which are better than mine. But here are mine anyway. Beginning with Step 1:

Assume Humanity

One of the things I would tell students early on in my classes is that in academic writing, they should assume that people who disagree with them are just as smart as they are. Our culture of political argument loves to posit the "other side" as stupid or evil. We love to set up straw men. We easily claim that people who disagree with us are dumb, or immoral, or communist, or fascist, or racist, or whatever other pejorative term fits the particular situation.

The problems with this approach are that 1) these claims about others are almost always false and 2) they immediately shut down any conversation with people in those groups. This is not a path to effective argument, at least not properly understood. Instead it's a refusal to argue at all. It's a way of saying about the other, "I don't have to listen to you because you are a _____." Because of this, we set up arguments where the only people willing to listen to us are people who already agree with us.

While such approaches may resonate well in our own self-selected echo chambers, they do nothing to actually advance the conversation or to help bring about a reasonable and humane conclusion. Mostly, they end up helping to maintain the status quo. After all, if all you're doing through your argument is railing against an opponent with more power, then they are likely just to shut you out and shut you up. . .and remain in power. If, on the other hand, you are able to build enough groundswell of support to replace old power with new power, history shows that you are almost always going to replace the old with an equally awful new (read Paulo Freire, or if that proves too hard to find, listen to the Who). The oppressed simply become the new oppressor.

But if the point is to actually be heard, to hear one another, and to build a better world (or school system, or criminal justice system, or roommate relationship) those types of arguments can only ever fail.

Instead, if we are to begin to argue better and more effectively (which I am defining as arguing toward substantive and humane understanding and action), then we need a different type of argument. And that begins by assuming one another's humanity. You must begin by assuming that the "other side" is as real as you are and as sincere in their beliefs and motives. You may have difficulty understanding how they could possibly see things the way they do, but you must assume that they do so for good reason. You must begin to ask what values they hold that would lead to their conclusions and you must be willing to accept that these values are as real and important to them as yours are to you. You must be willing to concede that yours might not be the only values that are valid and important and you must begin to wonder if there is middle ground that upholds both sets of values.

If you accept the humanity of people that you don't agree with, you cannot call them communist (unless they themselves say they are, in which case you have to accept that communists mean well enough and have a history of valid complaints that led them to where they are). You can't "teach them a lesson" by threatening them or releasing their addresses on the internet. You can't call them sexist, racist, or other names, accuse them of murder, or resort to any of these other straw-man tactics. Instead, effective and ethical argument starts with affirming their humanity. It means assuming that, like you, they want what's best and they think their beliefs achieve this. You must be willing to accept that they are just as real and just as valuable as you are.

This may seem elementary to you, but it's clear that this is the root of a lot of our problems, and so it's step one. Assume the humanity of others. Start here.

Thursday, July 07, 2016

Ferguson, Missouri and Leaving Academia

This is a piece that has waited nearly two years to be written, and even as I write it, I partially regret it, but it feels prophetic--a voice in the wilderness that needs to be heard.

I'm a police officer. I'm also a teacher. I've had a foot in each of these identities my entire adult life. I wanted to teach at a university from the time I first arrived at university (before that, I wanted to teach high school). But after a year working as a professional actor, I decided that I did not want to teach theatre and that I would need a "real job" if I wanted to re-tool in graduate school. So I took the only other job I had ever wanted to do, and I became a police officer in the same medium-large city as my father. While I worked as an officer full time at night, I also earned a Master of Arts degree in English, worked two years toward a PhD in English Composition, Rhetoric, and Literacy, and taught either as an adjunct or a graduate student for eight years. Two years ago, I took on more responsibility at my police department (and, full disclosure, a requisite increase in pay) and stepped away from the PhD, uncertain as to whether or not I would finish it, but certain that I would continue teaching. This August, the third academic year in which I have not taught will begin, and I will still not be teaching. What happened? Ferguson, Missouri.

A couple months after I stepped away from my PhD program, Michael Brown was shot and killed by Ferguson, MO police officer Darren Wilson. A version of events told by his friend held that Brown had been trying to surrender to Wilson, holding his hands in the air, when Officer Wilson murdered him in cold blood. This was aired in a television interview and immediately became the official version of events in the national media.

Around the same time, a Hispanic man in a neighboring jurisdiction died after a videotaped confrontation with police in which no weapons were involved. Two years later, two more high-profile cases have again brought the issues that these incidents laid bare back into the public conversation. At the time, I was asked often by friends, people at church, and my colleagues in the academy what I thought about these cases. I would say simply that I wasn't there; I'm not part of the investigation, and so it would be inappropriate and irresponsible for me to give an opinion at that time.

This is an attitude that, in large part, I held as an important academic habit. In fact, I saw this as an ethic that academics hold dear. My own discipline of Composition and Rhetoric preaches a thorough and sober approach to evidence. We have even adopted a term for this: "negative capability," or a rhetor's ability to withhold judgement until she has examined all the evidence, weighed it against other evidence and context, questioned the assumptions, biases, and motives of those providing the evidence (and indeed her own), and taken time to consider what all of it means.

Yet what I saw among many of my academic friends and colleagues was very much the opposite. It is not surprising to me that my colleagues would hold particular views on race and its role in structures like policing. After all, many academics in the humanities, myself included, hold fairly progressive ideas with respect to social issues. My own research involved the ways in which elements of the built environment act as signifiers of inclusion/exclusion (in other words, how the way we build stuff, the signs we post, etc. are read by people to tell them whether or not they belong in a place). What did surprise me was their extraordinary rush to believe and act on a version of events from a spurious source given long before any other evidence was even available. Incidentally, in the two years since Ferguson, I have read both DOJ investigations (Darren Wilson and the Ferguson Police Department were investigated separately), and I am convinced that 1) Wilson was legally justified in shooting Brown and that 2) the people of Ferguson were absolutely reasonable in believing Brown's friend instead of "their" horribly malfunctioning police department. I do not extend this privilege to professional academics whose job it is to wait for, then carefully weigh, evidence (two years later, this is the first time I've given a public opinion on the incident).

I saw from my colleagues in academe countless critiques against the police, accusations that American police are racist and violent, part of an abhorrent system of control, out of control themselves, and so on. I saw none of the things I had come to expect of my own field of Rhetoric. I saw no discussion of the mediators involved in video evidence--the role of camera angle, the role of the videographer and her decisions about when to start and stop filming, of what to include in the frame and exclude from it, the role of the news editor in determining where to cut the video and whether or not to include sound, and whose commentary will be included, whether that commentary will come before or after the video, etc. I saw no attempt to interrogate the market system at play in media coverage of the event--no question as to whether or not wealthy white owners of cable news networks may have something to gain through editorial choices that posit middle class, white police officers as murderers of poor black youths. Instead, I saw shouting in university courtyards, vicious name-calling, rushes to judgement, and a complete lack of respect for actual evidence. Certainly, as is the case in examining any community, it would not be fair to lump all academics into one box. There were people, usually close friends, who would ask me about my perspective as a police officer and had tremendous respect for whatever insights I had (evasive and tentative as they often were). There were probably many others whose prudence kept them out of the public conversation altogether, and so I never knew about them. But I didn't see them, not even to urge calm and deliberation, which is what I always felt we were supposed to do.

After the in-custody death in our neighboring jurisdiction, while the investigation was still on-going, a community literacy center with which I had worked for several months held a writing workshop for the family and friends of the subject (I don't mind calling him a victim) to write about what the experience was like for them. On the same day that my former colleagues were posting pictures of themselves with smiling members of the victim's family, we were assigning officers to sit on the home of one of the involved officers who lived in our jurisdiction, because he was receiving death threats serious enough that we deemed it necessary to post a guard on his house, something that usually only happens in the movies. That officer was, supposedly, under an open investigation and thus enjoyed, supposedly, the legal presumption of innocence. I am not aware that any writing workshop was ever considered for the officer, his coworkers, or his family (maybe there was and I never knew, but I would have been the natural person to contact to facilitate it, so I feel safe in assuming there was none). No one ever gave him the opportunity to write about what it was like to go from being afraid he was going to die, to being afraid he was going to be prosecuted, to being afraid he was going to die again. No one gave his wife and children an opportunity to write about what it was like to have their father vilified in the public forum--to hear that he should go to prison forever, that he should be killed. No one asked them to describe what it's like not to be allowed to go outside to play because there have been threats against your family. There was no attempt by my colleagues at the writing center, that I know of, to understand a fuller picture of those affected, or even to acknowledge, despite having worked with me for months, that there was one. Instead, that officer, a husband, father, and public servant, was relegated to the position of robot-police machine, an inhuman and inhumane instrument of a racist and evil system, that which all right-minded, socially progressive academics must stand against, then publish research about.

I had always seen my field as a last vestige of hope in a society that rewards rushing to the extremes (take, for instance, the people we've chosen as our presidential nominees). As teachers in universities, we were perhaps American society's last best hope to finally learn slow and respectful discourse on important topics--topics that involve the lives of police officers and of minorities, topics which deserve serious and deliberative discourse. Yet the speech and activities of my colleagues after Ferguson made me question if we had ever actually believed any of the things we had been teaching our students all along. Maybe we never actually meant it.

Policing is very imperfect. If not carefully managed, it can have dehumanizing effects, both on officers and on the people with whom they interact. Officers, if they are not careful, can easily develop very myopic views of the world, views which they fully believe are the only "real" world that only they are in a position to see. Police officers can fall prey easily to black and white views of their communities: us and them, good people and bad people, criminals and victims, and yes, even sometimes, black and white. These tendencies must be examined, and we have the obligation to determine what role these tendencies play in incidents like those in Ferguson (and the similar incidents since), and what we can do about them. But the profession of policing is dominated by people who truly believe in and orient their professional lives toward service. As problematic as this type of dualistic thinking can be, police officers overwhelmingly believe that good people need to be protected from bad people, and that this mission is worth risking our own lives. This is evident in the personal and professional sacrifices police officers make every day.

My wife and children are without me every night that I work, so that my wife and I have slept in the same bed at the same time less than half the days we've been married. I am 37 years old and I have arthritis in my right hip, where my gun presses into my side. I know what it's like to be bitten by a human and spend the next year having routine blood tests for blood-borne pathogens. I know what human brain matter looks like on a popcorn ceiling. I know what it's like to tell an African American man, a father like me, that his 17 year old daughter was killed by a stray bullet at a New Year's party, while at the same time a 30 year old Syracuse PhD candidate is publishing a paper claiming that I can't empathize with that man because I don't live in his neighborhood (a real paper, by the way, given at CCCC, 2014).

We keep doing it despite these sacrifices and despite the inherent danger not because we were picked on in high school and we want to get back at the world, or because we want to subjugate minorities and trap them in their lower class neighborhoods, or because we just like the power trip, or any of the other things we are accused of every day. We do it because we believe in it. Imperfect as we are, we believe deeply in the perfection of our cause.

I'm left to wonder now what we, as academics, believe. How did our reaction to Ferguson and similar incidents seem to betray the habits that I thought we most valued, the ethics that drew me to the field in the first place? Perhaps I am being too critical of academe. Perhaps it is well-meaning enough. Maybe we already knew what we believed, and so we thought we must speak out quickly because the value of human life trumps the academic, theoretical value of "negative capability." Maybe we made the very human error of failing to question our own biases when reacting to Ferguson, even as we teach students to carefully examine their own. Maybe we have such firm beliefs about systems in general that we failed to examine the particulars of these individual cases, instead prematurely assuming that the individual case was simply one more example of the problem of the system as a whole. Maybe we still hold our academic values firmly, we just forgot them a little in the heat of the very important moment.

But like any recently disillusioned person, I can't help but see something more insidious. Maybe, when Ferguson happened, what we saw was a bandwagon. Aboard that bandwagon we saw ready-made kairotic research questions, publishing opportunities, entryways for our literacy centers into communities usually reluctant to trust egg-heads with clipboards. Maybe it was nothing more than careerism, and a completely unreflective exploitation of my life as an officer and the lives of the minorities in the communities I serve.

I love the university. I love teaching, and I miss it often. And I hesitated to write these words for nearly two years because I wish to reserve the right to change my mind, but at the moment I cannot see myself returning to the classroom. Ferguson, Missouri forced me to confront these two roles I play: as an academic and as a police officer. I'm not the kind of person who feels an obligation to "choose sides." I am deeply suspicious of anything that feels like "us versus them," but my experiences watching, talking to, and arguing with these two separate communities, in both of which I am wholly a member, have forced me to confront what I believe.

I believe that, despite deep imperfections, perhaps even tragic ones, in American policing, the men and women I serve alongside do it because we believe in what we're trying to do. We hold these beliefs strongly enough to risk our lives for them.

As an academic, I'm not entirely sure we actually believe anything.

Monday, May 18, 2015

Oklahoma Meteorology and Digital Culture Debates

In Oklahoma, weather is big business. In the Spring, when the risk of severe weather is highest, it is not unusual to spend a Saturday night on the sofa eating Ben and Jerry's and watching wall-to-wall weather coverage. Weather coverage has spawned its own television entertainment culture in the Oklahoma City market. Everyone here knows about the Mike Morgan Severe Weather Tie (which has its own Facebook page) and the Gary England Drinking Game. Storm chasers like Reed Timmer, Val Castor, and Alan Broerse (the spelling of whose name I did not have to look up) are household names. Meteorologists are bona fide celebrities, consistently the first T.V. personalities in our market to earn verified Twitter status. Weather coverage is, therefore, incredibly competitive. This means that it is often very dramatic.

The competitive and dramatic nature of weather forecasting in Oklahoma, combined with the serendipitous co-occurrence of the busiest time for severe weather and Spring sweeps, means that local T.V. networks are vying for viewership through the use of high drama, employing scores of spotters, helicopters (one station brags that it has two helicopters chasing storms), and well-equipped weather centers. Locals, therefore, often accuse T.V. meteorologists of overdramatizing the weather in order to gain ratings. These accusations range from the everyday (any prediction that doesn't come true becomes evidence that meteorologists were "over-hyping" all along) to much more serious suggestions that meteorologists' overdramatized warnings have caused deaths. Because of this, it has become fashionable for jaded Okies to distrust the local network affiliates for their weather coverage (though we still find ourselves glued to the television every time it "feels" a certain way outside).

Of course, such distrust is not new. Just as concerns about "liberal media agendas" and the like led to the explosion of alternative media such as talk radio, blogs, social media, and so on, so too have these concerns led some to search for their weather coverage from alternative sources. (I myself prefer to stick to the National Weather Service's predictions when I start to sniff sensationalized weather forecasts.)

Into this mix came meteorologist Aaron Tuttle, a controversial and even divisive figure in Oklahoma City media. Tuttle is a former KOCO-TV meteorologist who "decided to retire" from television in 2007 to work as a meteorologist for the FAA. He is a celebrated, vilified, and parodied figure who runs a website, appears in auto dealership ads, and posts mirror selfies of his abdominal muscles. According to his social media sites, he still "works for the federal government," but he also charges $200 per appearance to speak in public and offers a "free" web app for which he accepts donations. He has a large following in OKC, mainly, it seems, because of distrust in local media, and his own critiques of local media for their sensational broadcasts.

It's ironic, then, that he stepped into serious controversy when he made this prediction on May 17th.

[Embedded screenshot of Tuttle's May 17th Facebook forecast post]

As I write this at the end of the day in question, I look out at clear skies and the setting sun. Obviously, it's not unusual for meteorologists to get it wrong. It happens so often, it is a cliche (not to mention, a Nicolas Cage movie). But this forecast was different for a couple of reasons. One was simply that it was SO wrong. Every other forecaster in the market, including the National Weather Service, forecasted a 20% chance of a few scattered thunderstorms, some of which could become "marginally severe." Indeed, a few popped up in Eastern Oklahoma in the early afternoon, then fizzled out as it cooled and they lost energy. There was never, according to any other meteorologist, any tornado threat. Apparently, when Tuttle made his forecast, he failed to account for a very obvious factor (something about low-level moisture, maybe?) that stabilized the atmosphere to the point that tornadoes were simply not a concern. It was an oversight so obvious to people who know what they're doing that meteorologists across the country questioned Tuttle's motives for making the prediction.

Even early in the day, before we could know for sure whether Tuttle was right or wrong, meteorologists were trying to fix the damage his forecast was causing as panicky Oklahomans shared, retweeted, and generally freaked out about his dire suggestion that tornadoes of the "EF4/5 kind" were possible. According to the alternative media website "The Lost Ogle," Emily Sutton was challenging his forecast on social media that morning. Marc Weinberg, an OU-trained meteorologist in Kentucky, went so far as to suggest that Tuttle should be held accountable, comparing his irresponsible forecast to yelling "fire" in a crowded theatre. The forecast was apparently so obviously flawed that even a meteorology undergrad studying at OU implored his Twitter followers not to trust Aaron Tuttle.

For these forecasters, Tuttle's forecast could only be explained by one other interesting feature of the above post. It is this phrase: "Link to donate to keep free." Aaron Tuttle, a meteorologist whose only public exposure is now through social media (he is called an "internet meteorologist" by his T.V. competitors), has nearly 94,000 Facebook followers. That number has jumped by more than 6,000 in the past two days, as followers shared his post out of concern for a tornado outbreak akin to the Moore tornadoes of May 3, 1999 and May 20, 2013.

My interest in this incident is that it is a clear, everyday example of the debate between established forms of media and new media forms, between what is referred to in the study of rhetoric as Bookspace/Cyberspace, or Web 1.0/Web 2.0. Cynthia Selfe discusses this debate as being between people who think of "technology as bane" and those who think of "technology as boon."

The proliferation of alternative media forms has been made possible in large part by the "technology as boon" mindset of much of contemporary culture. Technology, especially Web 2.0, has often been credited with democratizing knowledge. Those who are suspicious of "bookspace" see old forms of media as propping up old systems where knowledge was reserved for the few and powerful. They see old media as inherently biased, carefully crafting facts to support particular ideologies, or to pander to advertisers and editors.

Yet, interestingly, people who are so suspicious of these old media forms seem to often accept New Media as somehow value neutral, as if those engaged in New Media have no biases, ideologies, or motivations other than complete truth. We are often quick to accept that those engaged in digital or other forms of alternative rhetoric are benevolent revolutionaries, trying to make their prophetic voices heard against an oppressive establishment.

But this controversy should remind us that this is not necessarily true. Rhetoric in any form is never value neutral. We can never accept it uncritically. Here, in fact, we see a definite strength of established, old media. At KOCO, if Tuttle is this irresponsible, he is out of a job (which makes one wonder how truly voluntary his "retirement" was). After all, old media, with all its problems, has a reputation to maintain. There is a system of checks and balances in place to, at the very least, see that incorrect predictions are wrong because sometimes you simply get it wrong, and not because the person making the prediction was wholly irresponsible. After all, when it was discovered that Jayson Blair had plagiarized and fabricated stories in the New York Times, it was a huge controversy, and it tarnished the reputation of one of the most established and revered media establishments in the world. When it happens on some right wing blog (like, daily), no one bats an eye. Yet we continually distrust these established forms and accept uncritically new forms, much the same way we challenge our doctors with things we read on WebMD.

If we are to learn media literacy in the Digital Age, we must learn to have at least as much healthy skepticism for New Media forms as we had for the old forms when we started seeking out alternative sources. We must learn to question why Aaron Tuttle would make such predictions and in the same paragraph ask for donations. We must learn to ask why he is not working in television anymore in a market where good meteorologists stand to make a lot of money. We must learn not to so quickly accept that he "retired" because he was tired of fame (which seems to be the suggestion he makes in this explanation). We must ask what his motives are before we share his Facebook posts or buy his weather app. Or rather, accept it for "free" with a donation.

Monday, February 17, 2014

Long Overdue Name Change

I opened this blog on September 26th, 2004. I was a rookie cop, and I was entering the mourning stage of my break-up with my old life as a professional actor, playwright, and general literati, so I started a blog to help preserve that part of myself that I feared police work was already starting to kill.

Because I saw it primarily as a space to work out questions of literary aesthetics (films, novels, plays, etc.), I gave it a name that parodied literary journal titles, which always seemed to me to end in "Review" (Sewanee Review, Hudson Review, the Southern Review, and so on; a search on the Poets and Writers database lists 248 journals with "review" in the title). In an act of self-effacement, I named it the Brummagem Review, an early-modern era reference to counterfeit coins.


Of course, I misspelled it as "Brummegem," an error that lasted an embarrassingly long time in the banner title (not the only time I've done that: see the back cover of my self-published play, The Origin of Language, also the second act of my Master's thesis Murder/Rapture).

The blog has taken a number of forms throughout the years, from baseball column, to political opinion page (on both sides of the aisle), to culture site. The past several years, as I have landed more solidly in Composition and Rhetoric as a graduate student and teacher, it has mostly become my place to work out ideas about teaching, composing, and literacy.

With that in mind, I've decided to change the name to Composition Cop as a nod to my two lives, and as a play on "grammar police." This is already my handle on both Twitter and Instagram, so it seemed a natural move.

Apropos of blogs in general, I'll still feel free to let the subject matter stray when I feel like it (see my last post regarding Valentine's Day cards), but this name fits much better than a misspelled, antiquated self-offending word.

Friday, February 14, 2014

Valentine's Day Cards: when you're told [by Hallmark] that you need to say "I'm Sorry."


The process of buying Valentine's Day cards is one of the most frustrating experiences of the year for me. I find the writing in these cards terrible. It could be that, as it so happens, I'm a pretty decent writer myself, so I always feel like I could just buy a blank card (if it were possible to find these) and write something as well-written as any of these cards and far more personalized. This year, as I was searching for a card that didn't make me sick, I finally put my finger on the problem. Cards in the "For My Wife" section come in two basic genres:

1) Comical cards that make sex jokes, which for some reason, I find incredibly tacky, and

2) Cards couched in the language of apology. Every card I pull off the shelf contains phrases like "maybe I don't say it often enough. . ." This is a relatively explicit apology, but even those that aren't this direct contain this common trope. There are references to expressing love "just because it's Valentine's Day. . ." and other such suggestions that Valentine's Day is a day in which we husbands make up for a year's worth of neglecting to express love for our wives.

It's incredibly difficult to find cards that don't make such references. This is, I suspect, a product of an essentialist view of male behavior which assumes that we are all reluctant to tell our wives that we love them and will therefore need to apologize for this. But what about those of us who happen to be decent husbands? Whose wives do know that we love them?

I might be tempted to say that we don't need Valentine's Day, because we do this stuff every day. My wife would not agree with this. We don't do elaborate Valentine's Day stuff. This year, we're going to have a picnic with our boys. Later, we'll eat expensive chocolate and cheap champagne, and that's about it. It is, however, still important that we buy each other cards. So, can someone just make blank cards that aren't covered in pictures of kittens?*

*Charissa tells me that she has this same problem. Cards for husbands also come in two genres: "you're a jerk but I love you anyway," and, of course, sex jokes.

Sunday, February 09, 2014

What are Students Thinking? -- Quoting the Dictionary

See the explanation of the "What are Students Thinking" post series, where I explain what these posts are about and what they are for.

For this first "What are Students Thinking" post, I'm considering why students quote dictionaries. This is one of those freshman cliches that all teachers of first year writing know and abhor. Composition theorists have mentioned this problem in discussions of the positivist views of language many students hold. Our students, like many people outside of English departments, have an understanding of language as fixed and definite. To them, words have explicit, singular meanings, and these meanings may be found in dictionaries. Dictionaries, to our students, are catalogues of the factual, unassailable, exact, and correct meanings of particular words. There is, for them, a naturalized understanding of the way words work, as if their meanings come from God, who directly inspired the prophet Webster.

When I talk about definition arguments with my students, I begin by foregrounding where definitions come from. Most students have never considered that dictionaries are actually written by people. Thus, definitions are, after all, the opinions of a collection of human authors and their editors. To demonstrate this, I like to show students definitions from Samuel Johnson's dictionary, which includes such gems as:

oats: a grain, which in England is generally given to horses, but in Scotland supports the people.

Centuries removed from Dr. Johnson's dictionary, we can see the very obvious class structure implied in this definition. Yet we take our own modern dictionaries as purely un-ideological repositories of linguistic truth. We don't register (or even know about) the political aims of Noah Webster's spelling reforms, undertaken in order to differentiate (read: un-Frenchify) American English. Instead, these definitions are taken to be givens.

For our students, this means that a dictionary definition gives them solid ground. After all, in asking students to write argumentative papers, we are forcing them to confront the contingent nature of the conversations they join. They're finding out that truth is not as solid as they've perhaps been made to believe. The very process of argument emphasizes this. We all make contributions to a conversation, and no one is quite right, and maybe no one is quite wrong either. This is disconcerting, especially for students who just want the right answer, the one that's going to get them a good grade.

A dictionary definition, then, is at least one unassailable truth, a thing that cannot be attacked. A word, at least, is what it is and no one can count off because they disagree with it.

But even in the argument of definition papers I assign, where the contingent, unsettled nature of particular terms and concepts is actually the point, I see these dictionary definitions cropping up. In fact, in the unit my classes are finishing up, we emphasized the contingent nature of words, and talked about the histories of these dictionaries in order to denaturalize my students' views of language.

I even went so far as to just flat tell them, "you should not quote the dictionary in your papers for all the reasons we've been talking about." After all, the whole point was for them to build their own definitions and argue for their legitimacy.

Yet when the first drafts came in, I found dictionary definition after dictionary definition. What gives? Did students not believe any of the things I'd said about where these definitions come from? Did they just flat fail to listen? This seems like a pessimistic view of my impressively involved students. As I said in the intro to this series, I'm much more interested in the rhetorical choices students are making when they fail to meet expectations, especially those I've made pretty explicit. Students know that these definitions are not solid facts, but rather opinions. They know that the point of this assignment is to make definitions of their own. So why would they, nevertheless, default to these dictionary definitions?

A few of them actually cited the definition in order to problematize it. They quoted the definition, then went on to write about how deficient the definition, and this culture's understanding of the concept, actually are. This is actually a fairly sophisticated use of these definitions. If I react only to the fact that they broke my "rule" against quoting dictionaries, I fail to give them credit for doing something that is actually quite advanced. Instead, maybe I have to consider such moves as a fair exception to my rule. But what of students who make the more typical move of using the definitions as the final authority?

Nancy Sommers, David Bartholomae, and others have written about the difficulty of helping students develop a sense of agency and authority in their writing. One of our tasks as first year writing teachers is to enculturate students into the expert world of academe. Part of our job is to help them sound like experts, even before they really are. But our students seem to have a great deal of apprehension about this. After all, their role in education has always been as receptors of knowledge--vessels into which knowledge is poured by teachers who are assumed to know more than they do.

The kinds of writing most of them have done before coming to college illustrate this. Before college, students spend a great deal of time writing book reports in which their goal is to prove they've read the book, essay tests where their goal is to show they've internalized the course material so that they have become conversant with it, and other such performative writing tasks. So, even in their writing, their goal has always been to show that they have learned other people's knowledge.

Yet what we are asking them to do is to contribute knowledge--to create it. In asking them to formulate definitional arguments, I am asking them to assume the role of the expert. Students seem reluctant to do this. Again, their role has always been to learn from the experts, not to be them, and being asked to become experts has disoriented them. It may even feel presumptuous to assume such a role. They respond by defaulting to someone else's expertise, as if to say, "I may pose my definition, but in the end, I'd better provide the real definition by the real expert."

So how do I counter this? How do I encourage students to assume their own agency and expertise? How do I convince them that their definitions are as valid as those written by interns and editors at Webster's--and are in fact probably better, because they haven't been simplified into a single sentence? These are open questions, but they are questions we cannot answer unless we ask better ones than "weren't you listening when I said not to quote the dictionary?"

Saturday, February 08, 2014

What are Students Thinking?

Writing teachers are full of pet peeves. This is perhaps inevitable, since every time we encounter a writing habit we find distasteful, we do so while grading a huge stack of papers. We experience these things over and over, until it seems like language is infested with them. The common response of many who teach writing is simply to complain a lot. We create lists of freshman cliches, which we share with one another or sometimes with students (I once had one such list designed to help students find these in their own writing; it was tellingly entitled "Ugh Words"). We create elaborate jokes, send terrible student sentences back and forth to each other, and create snarky internet memes (ironically, about the terrible things the internet is supposed to be doing to student writing). I myself have written a series of five-paragraph essays about why the five-paragraph essay is so bad.

What we often forget to do, to our own shame, is to try to figure out why students do these things. Perhaps this is because we assume that these problems in our students' writing exhibit some kind of natural deficiency in their processes, in their work ethic, or in the students themselves. In other words, we assume that the problem is with our students, not with our ability to teach them. Students have these problems because they are lazy, waiting until the night before the paper is due and leaving no time to really revise or evaluate their own writing; because they simply aren't listening to us in class when we tell them what [not] to do; or because they just don't care that much.

I, on the other hand, prefer to eschew utterly pessimistic views of our students and the work they do, mainly because such views are not particularly helpful to my own teaching. It is much more useful to look at these common problems in our students' writing not as natural deficiencies or as personal attacks, but as rhetorical choices our students make. In other words, when students go awry, what were they trying to do, why did it fail, and how do we help them do what they were trying to do in more acceptable ways?

With that in mind, I'm beginning a new segment of this blog that deals with these peevish problems in student writing, with the goal of theorizing what students are doing when they make common mistakes--or rather, when they write in ways that seem totally acceptable and appropriate to them, but which we hate. Assuming that students aren't doing these things because they hate me and wish to attack me, or because they are idiots who can't be helped, or because the golden age of my university years has given way to the incoherence of net-speak, the goal of this series will be to develop pedagogies that respond to these problems in ways that are optimistic and constructive.

The first is already in the hopper; it will deal with students who quote dictionaries, and it will drop later tonight or tomorrow. So be watching for it. Now, I'm off to eat tacos.

Saturday, February 01, 2014

Space--Male Privilege--Gender Guilt(?)

Any jogger knows what to do when he sees a strange dog on the next block: cross to the other side of the street and slow to a walk. Similarly, when we are out at night, we know to stay under the streetlights and in well-traveled places. We know these things because we have a developed understanding of space and how we behave within it--so much so that it seems intuitive. We don't often, however, recognize that we have this awareness. When we interact with space, we generally do so on autopilot, without really noticing what we are doing, or even that we are doing anything.

My most recent research has centered on the ways in which we "read" space, and how space is thus textualized in ways that both encourage and constrict movement, empowering some people in particular spaces while restricting others.

When presenting this research recently, a fellow graduate student asked me about the gender differences in the particular space I was researching. How do women view the space, and do they move differently through it? Because my own research focused more on class (a different form of Other), I really had no data to answer this question. But it was a great question; no doubt women perceive the space differently than men, and it's reasonable to suspect that these perceptions influence the way women move through and interact with the particular space.

Then, a few days ago, I caught myself doing something interesting. I was leaving campus rather late on a cold day, which meant that there weren't many people walking around. As I walked across an empty parking lot toward my car, a woman got out of a car farther down the lot, directly in my path. Without really thinking about it in the moment, I changed my path.

As soon as I had done it, I realized what I had done. I had diverted my path to keep from walking too close to a woman in an empty parking lot in the twilight of a cold day. I had done this because I realized that crossing the path of a relatively large man in a black coat in an empty parking lot was going to make her very nervous. I would never have done this to avoid another man; to do so would have been a sign of weakness--the weaker primate making way for the stronger. This, on the other hand, was a curious bit of chivalry, one that had happened naturally and, I think, was born of my recent hyper-awareness of space.

The incident also made me realize that feeling comfortable in public spaces is yet another male privilege that I have enjoyed and completely naturalized. I've seen scores of news stories and public information campaigns advising women on how to protect themselves from predators. Included in all of these lists is an awareness of space: know who is around, know what kind of area you're in, look for escape routes, and on and on. I, however, have never been taught any of these things (except for some version of this in the police academy, and that was taught on very different terms).

What this means is that I have been allowed to move blithely through space with little awareness of how my presence in that space affects others. It means that everywhere I have ever gone where there has been a woman present, she has had to pay close attention to me and adjust her movements accordingly. I, on the other hand, have been free to disregard her presence, or even to fail to notice it at all.

I've asked my wife before what it's like to be a woman in public--does she live with the knowledge that every man she passes looks at her? Does this bother her? Does she eventually just get used to it? Again here, I am made aware that women have had to think about how to handle me, though I have never been taught to be aware of my effect on them.

I'm not exactly sure what I'm suggesting. Am I saying that men should stop staring at women in the mall? Absolutely! Am I saying that we should divert our paths in empty parking lots to stay away from women, thus protecting them from being uncomfortable around us? I'm not sure. I am aware that such extreme chivalry might be another form of sexism, based on some ingrained assumption that women are frail creatures I must protect, even from the psychological effects of my presence. At the very least, I'm asking to be made more aware, and for other men also to be more aware, of how we interact with and are part of space, and how that affects others--others of different races, classes, and genders. That doesn't mean we must carry guilt simply for being born as we were; it's not my fault that I was born a white male to a middle-class family. It does mean, though, that I should be aware of what it means to others that I am what I am, and how my presence in a space necessarily influences that space.