Meh.

According to Yahoo News, “meh” has just been added to the Collins English Dictionary. I support this.

You know, I do have to wonder about the usefulness of adding colloquial language to dictionaries when the phrase in question is merely a trendy, faddish kinda thing. It’s not like I go around saying that so and so is the bee’s knees or anything. (Okay, I’ll admit to using supposedly “outdated” phrases like that just for fun at times, but as a predominant form of communication? No.)

So, will “meh” actually make it to the stature that other words such as “cool” have? Of course, I don’t know for sure, but I’d say yeah. Even if it doesn’t remain “meh” specifically, but transforms into “eh” or “uh” or something like that, it’s still a form of common communication that’s being used more and more readily. It’s often instantly understood too. Even without having a specific definition in some fancy British dictionary, the gist of the meaning is understood. It just works well.

Even I’ve been susceptible to its influences:

[Facebook status update screenshots]

Oh, yeah. For Realz.

I mean, I could try to properly describe the kind of ambivalence and indifference that I was feeling, but “meh,” to me, is more of an expression of that indifference rather than a description of that same feeling. Ya feel me? “Meh” is like the actual tear, whereas saying “Kaitlin is indifferent” is like the word “crying.” Prospective readers understand so much about my particular state with just that one word without me going on and on about it.

Perfecto.

So, yeah. It deserves to be in the dictionary. And that’s not so “meh.”

More Trickster Rhetoric . . .

These quotes come from an excellent piece by Malea Powell, “Blood and Scholarship: One Mixed-Blood’s Story.”  I read the spirit of Harlot throughout these lines . . .

The only way for the mixed-blood to survive is by ‘developing a tolerance for contradictions, a tolerance for ambiguity,’ and by turning those contradictions and ambiguities into ‘something else’ (Anzaldua 79).  Anishinabe writer and theorist Gerald Vizenor would have Indian scholars/mixed-bloods play trickster, to use our knowledge of the language and structure that compose the narratives that bind us as instruments to cut away those same oppressive stories.

Vizenor celebrates the humor and play room that are made available to crossbloods (what I’ve been calling mixed-bloods) in the simultaneity of our positions on the margins of American culture combined with our iconographic centrality against which much ‘American-ness’ is imagined.  Sharp humor (yes, sharp like a weapon) and radical temporal figurings (we are always at the past and the future in the present, and vice versa) help Vizenor to posit the trickster as a space of liberation.

For me, the trickster is central to imagining a ‘mixed-blood rhetoric.’ The trickster is many things, and is no thing as well.  Ambivalent, androgynous, anti-definitional, the trickster is slippery and constantly mutable.

I find the trickster in every nook and cranny of daily life as a mixed-blood.  But, more important, I see the trickster at work outside of Indian-ness as well, in the contrarinesses that inhabit the stories that tell, and un-tell, America and the Academy.  The trickster isn’t really a person, it is a ‘communal sign,’ a ‘concordance of narrative voices’ that inhabits the ‘wild space over and between sounds, words, sentences, and narratives’ (Vizenor 196).

Trickster discourse does ‘play tricks,’ but they aren’t malicious tricks, not the hurtful pranks of an angry child; instead, the tricks reveal the deep irony that is always present in whatever way we choose to construct reality.  Trickster discourse is deflative; it exposes the lies we tell ourselves and, at the same time, exposes the necessity of those lies to our daily material existence.  Trickster discourse asks ‘Isn’t the world a crock of shit?,’ but also answers with, ‘Well, if we didn’t have this crock of shit, what would we do for a world?’ The trickster asks us to be fully conscious to the simple inconsistencies that inhabit our reality (9).

More trickster rhetoric to come . . .

A Little Plug (‘N Play)

Gauti Sigthorsson posted his Screen Studies Conference presentation, creatively titled “Home is Where My Archive Is.” It runs about 20 minutes and is most definitely worth the listen – not only for the actual complications Gauti brings up, but also for sentences like: “you’re functioning as my 3D PowerPoint presentation.”

Promiscuousness of Promiscuity

I just started going through a book called The Information Society Reader, a collection of foundational readings on the study of the Information Society, and a few pages into the editor’s introduction I had a déjà vu moment with this oddly familiar statement:

It can seem that the [concept of “Information Society”] is used with abandon, yet as such it is capable of accommodating all manner of definitions. Readers should look carefully for the definitional terms used, often tacitly, by commentators in what follows. Are they, for instance, emphasizing the economic, educational or cultural dimensions when they discuss the Information Society, or is it technology which is given the greatest weight in their accounts? One might then ask, if the conceptions are so very varied and even promiscuous, then what validity remains [. . . ]? (p. 10)

Webster, Frank (Ed.). (2004). The Information Society Reader. London: Routledge.
(Or click here to see the text in Google Books)

Is this warning not incredibly similar to those we hear about the study of rhetoric? Varying definitions, an undefined scope of study, questions of validity? Lately I’ve felt lulled into a (likely) false state of security. How many times do we hear of academic programs stating with pride that they are interdisciplinary, crossdisciplinary, transdisciplinary, multidisciplinary? Promiscuity is a characteristic more and more fields of study display with some pride. This crossing of borders has become something of an academic movement, but all movements have a beginning and usually an end; even if a movement moves in cycles and never entirely dies out, the times when it tapers off can be painful – as the history of rhetoric can attest.

What’s interesting to me about the study of the Information Society is that, from its inception, it has been some sort of uber-manifestation of interdisciplinarity. It’s flowed and found nodes of connection in the same manner as the Network Society itself. It’s almost as if the global community’s entry into the Age of Information is what has made this move toward interdisciplinarity possible in the first place (both in terms of technology and of an emerging climate that condones and even celebrates such behavior), and it’s quite fitting – if not problematic – that the field purporting to study this new age should mirror it as well.

But I can’t help but think there’s bound to be a tide building against such breadth and promiscuity. And if so, I wonder when its time will come, what it will look like, and what the alternatives may be.

Watching TV Makes you Smarter?

Yep.  At least that’s what Steven Johnson claims in his 2005 New York Times Magazine article with that title.  And…it’s an argument worth considering, especially given our penchant for dissing Americans in matters of intelligence. (Consider, for starters, Susan Jacoby’s recent book The Age of American Unreason, former Vice-President Al Gore’s The Assault on Reason, and Richard Shenkman’s new book Just How Stupid Are We?: Facing the Truth About the American Voter.)  H.L. Mencken wasn’t mistaken when he once said, “No one ever went broke underestimating the intelligence of the American public.”

So, maybe it’s worth overestimating the intelligence of the American public, or at least reconsidering some of our criticisms.

Here’s the gist of Johnson’s argument:  a number of contemporary television shows, including The Sopranos, 24, The West Wing, and ER (keep in mind this was published in 2005) are actually demanding of some of our mental faculties.  The mental faculties he’s referring to include attention, retention, the parsing of complex narrative threads, and the deciphering of quick dialogue filled with information most viewers won’t understand.

He uses The Sopranos to illustrate his point about complex narratives.  In one episode, the viewer has to untangle at least 3 different narrative threads with layered plots in just one scene.  And, he says, the narratives build from previous episodes and continue on in future episodes.  ER is an example of a show full of quick dialogue packed with complex terms and a vocabulary unfamiliar to most that the audience must wade through to follow the story.

All of this, Johnson argues, requires the audience to focus–exercising the parts of the brain that map social networks, work to fill in missing information, and help make sense of complex narrative threads.  What Johnson’s crediting here is the structure and design of the shows…not the content.  The content, he acknowledges, is probably more immoral and sensational than ever.  But that’s not the point in this examination.

So, is this a valid point?  Do others agree?  What shows on TV right now might be comparable to the ones Johnson cites to make his argument?

Speaking of interdisciplinarity…

I wanted to put in a plug for the upcoming “Expanding Literacy Studies” conference, the first international, interdisciplinary conference on literacy studies organized and hosted by graduate students. It will be held at The Ohio State University, April 3-5, 2009.

[Expanding Literacy Studies logo]

This conference is dedicated to exploring the broad range of literacies–alphabetic reading and writing, visual, digital, rhetorical, critical… and so on. If you compose or decode a text, that’s a literate act. And this conference offers an opportunity to contribute to a conversation that transcends the usual disciplinary borders… while chatting with a group of smart, fun people. Tell ’em Harlot sent you.

See http://literacystudies.osu.edu/conference for more info.

meant to be?


As I push and shove (or, rather, swing and duck) my way through my dissertation, I’ve been thinking lately about the topic I once promised myself I’d write my dissertation on: the rhetoric of fate in American culture. You see, there was a time in my life, about six or seven years ago, when I had a major philosophical shift in my thinking. Previously, I had been a faithful believer in fate and predestination. Everything was, of course, predestined—where I’d go to college, who I’d meet, what career I’d have, whom I’d marry, if I’d marry, etc. After some pretty heated discussions with several people I respect and admire, I toyed with the idea that maybe everything wasn’t based on fate, or wasn’t predestined.

To make a long story short (or, to spare you a personal story more interesting to me than to others, I’m sure), I’ll cut to the chase. In the process of shifting my thinking, I asked anyone and everyone what they believed about fate. Did they, too, believe that everything was predestined? What did people mean by fate? Predestination? Most profound to me, and pertinent to Harlot, is the contradiction I found over and over in what people believed about fate, and in what they said about it. Most didn’t really believe in fate, but I could easily catch them speaking as if they did.

For example, my mother firmly stated that she didn’t believe our lives were predestined—that we had independent thought and choice in what we did. She did, however, routinely utter such comforting statements as, “Don’t worry, Kelly, it wasn’t meant to be,” or “If it’s meant to be, it’ll work out.” My best friend confirmed that she, also, did not believe that our lives were predestined. However, she would often ask, “Where is Mr. Right?” followed by, “I guess I’m not meant to find him yet.”

What I’m still curious about is why many of us (not to mention popular culture) often speak as if things are meant to or not meant to happen when we don’t really believe it. Do we really believe, on some level, that things will work out? Do we need to believe that? Is it all just rhetoric we’ve heard and repeat out of habit?

Obviously, I, at least, wasn’t predestined to write that dissertation. Long way from there…

Gee-Speak

Click on the pic below to hear a quick interview with Gordon Gee by The Chronicle of Higher Education, in which he makes a glib remark that I found to be illuminating:

[Image: Gordon Gee]

In responding to how universities should deal with Washington’s increased scrutiny of how we function (from endowments to athletics), he is quick to point out that we need to stop reacting so defensively. But that isn’t as interesting to me as when he describes how we should go about it; specifically, in our communication, he says we need to “not be so damn academic about it: we’ve got to be able to communicate and communicate wisely and communicate well.”

The implied argument is rather scathing and obvious, so I’ll skip over fleshing out all the connections. But the call for a change in our communication patterns is well worth taking note of, I believe – how do we go about getting “academese” to mean “elegant argument”?

Or is it too late?

Introducing Glossa Technologia . . .

Check out our sister site – another rhetoricalcommons gem:

Glossa Technologia

[Image: Glossa Technologia]

Professor Ben McCorkle is a key player in this brilliant project, and I’m sure we’ll all be thanking him profusely a few years down the road as the site grows.

Although the link will take you to a more than adequate description of their project, I’ll just briefly mention here that it’s a wiki-bibliography on matters relating to digital technologies in rhetoric, composition, and literacy studies. Right up our alley, eh?

The site is still in its pre-launch-beefing-up stages. For those of us studying for comps (or for those who just beat the hell out of ’em – congrats, Vera), there’s plenty of space for contribution.

Get to work, Harlots . . .

“Growing Up Online”

And we’re back to my love of PBS, Frontline this time, with a special about technology’s effect on teenagers. Especially interesting is chapter two, “A Revolution in Classrooms and Social Life.” I have to admit to being a little miffed, as a technologically advanced young person who does indeed do her own reading, at the “everybody uses Sparknotes” or “nobody reads books” concepts. (Though, I will admit to being overly excited about online book services such as DailyLit.com, which sends a user books in short, easily consumable sections by email or RSS feed – for free if they’re in the public domain and for a minimal fee if they’re contemporary works.) Sure, it’s been a few years since I’ve been in high school (thank god), but it’s extremely disconcerting to me to think that the advance of technology has left someone in English studies thinking that they don’t have a place anymore. I mean, did mathematicians freak out at the advent of the calculator? I think not. They used that tool to their benefit (even those in love with the abacus), as other technologies can be used to benefit other areas as well.

It is ironic that I watched this online though, no?

Ze link…

http://www.pbs.org/wgbh/pages/frontline/kidsonline/