Jaye Davidson Has a Dong (Part 2)

So the CPGs and the SADs are ruining the discourse for those of us that would like to have critical conversations about things without them being turned into a contest1. At this point, it’s probably necessary to take a step back and say: I don’t think this is the product of intentional malice. Or, at least, not the way I’ve made it sound. I think it’s a failure of the discussion to evolve in the same way experiential habits have.

Much hay is made of the arrival of television that is taken seriously as a critical subject2. Even people that don’t whisper about the “golden age” are still fairly well-prepared to talk about David Chase or David Simon or Mitch Hurwitz or JJ Abrams or Amy Sherman-Palladino, and even people whose general television approach is relatively-uncoupled from the showrunner/writer/creator can probably find something to say about The Sopranos, The Wire, Arrested Development, Lost, or Gilmore Girls3, 4. And this is rightfully so: I’ve never been a particularly-close study of television, but the existence of stories that take on dramatic and emotional weight (and, honestly, it’s mostly emotional weight, about which more later) is a genuine development. I’m not here to take anything away from anyone’s achievement in the field of television, but I think that there’s a bigger force at play, and it also plays into some of the spoiler-aversion (and, to a lesser extent, the photo-gathering, at least in an abstract form).

Television is episodic and is, really, one of the last episodic media we have. This means that it is the last place where there is a scarcity: you only have as much of the story as has been produced, which means that the “experience” aspect that I talked about last time is the most in effect with television. But television’s business model hasn’t progressed as rapidly as storytelling developments, so we’re still left with this idea that television is a thing that happens once a week for the first few and last few months of the year. And so they’re still reliant on the “make people talk about it so they watch it when it airs” model.

Now, even I’m not so paranoid that I would believe that the SADs started as shills for the networks, but I’m not stupid enough to not see that the way shows are written is starting to reflect the constant clamor of the SADs (and also the CPGs, and we’ll get to those in a moment, I promise). It’s most notable in the cable drama (The Walking Dead, Game of Thrones, Mad Men), but it’s happening across the board that the incentive to watch every week is basically the fear of spoilers. Which is all well and good, except that it’s developed to a point where it seems that somewhere in the show bibles is the order to create something that could be spoilered every week – beyond the marketing/teaser stuff (the stuff that you see in the trailer every week, which is generally treated as “premise,” and not greeted with the howls of the recently-enspoilered5), there seemingly must be that thing that would, if mentioned, cause people to jump off bridges for finding out too soon.

This, then, is not damage done to the discourse around the art, it’s damage done to the art. An interesting (and, for the person writing this, somewhat heartbreaking) object lesson is that of Doctor Who. Over fifty years old, the Doctor has survived many changes in television habits (including several that left him benched for years at a time, and archival indifference that resulted in the loss of many hours of his own backstory), and for the first several decades, the show was a joyful, often-campy, sometimes-surprising entry in the adventure story tradition6. It was revived, and maintained the joy and the adventure for, oh, twenty episodes or so, before promptly jettisoning all of that to be a high-action space opera7. And sure, “Blink” and “The Doctor’s Wife” and “The Waters of Mars” are all fine examples of space opera done well. But they’re also crafted for an audience that expects a big ol’ wham every episode, so there’s (ostensibly) something to talk about and (practically for our purposes here) an incentive to watch so you can avoid hearing about it outside of its native environment.

This could be seen as regular evolution, except for the change in (and sameness of) tenor. There are no more “thought-provoking” episodes of Doctor Who. There are no more real worlds, there is no more intrigue. What there are, instead, are feelings, settings, and veiled surprises. Gone is the approach that made Logopolis8 feel like a place you wanted to see more of and think about. It’s also what made the Cybermen9 so terrifying: they weren’t interested in the free-thinking world that you were literally being entertained by. It made you feel for the worlds that the Daleks10 had destroyed, knowing that they were sketched and populated by people. And, ultimately, it’s what made “Warriors’ Gate”11 so effective – we cared, and we cared because we were allowed to fill in the blanks ourselves.

That has been traded for a dictatorial approach to emotional response (it’s actually a response that’s evoked in much the same way as films12) – later seasons of Doctor Who rely heavily on setting things up in a way that we imagine that we feel for them, because that’s what the show is telling us. And the stakes are ratcheted up so high, and the tension-inducing musical cues and camera swoops and serious close-ups and heavily emotional language are in place so that at the end of the episode we have what is known, lexically, as feels. And that’s appropriate – it’s not “feelings,” and it’s certainly not “emotion.” There’s nothing at stake, no part of the viewer is invested in it, there’s just the opportunity to go through some no-risk, high-reward entertainment.

It’s certainly nothing new that such things exist, and become popular: the soap opera is older than television, the melodrama is a couple hundred years old, and it was a seventeenth-century playwright who said that human life itself is “full of sound and fury, signifying nothing.” The whole idea of popular entertainment is to be able to have the experiences, mentally, emotionally or whatever, that one chooses in a safe environment. What has changed is the idea that it all needs to be given the same sort of weight, and this is because the “everything is ok” democratization has taken critical language with it.

A confession: I love American Horror Story. I also love Nashville, Helix, and, despite it all, Doctor Who (I know). But I’m not unwilling to consider them as artistic specimens, and I see no reason to defend loving Helix (which is pretty execrable in any realistic sense) because I want to see how The Rocketeer and his band of Merry Pretty Doctors (and Possibly-Evil Science Guy) escape the arctic slobber-zombie cult. Because I sign on to be taken on that ride. It’s exactly like getting on a roller coaster and also being willing to see how a roller coaster is engineered. The fact that I’m not actually hurtling toward my death does nothing to change my feelings about the experience, nor do I think that it should.

But when so much of what is discussed and agreed upon as “good” is actually just designed to provoke a reflexive response, and any attempt to say to people with similar television-viewing habits “hey wait this is actually kind of monotonous” is shut down by people hollering that if you say so much as the hair color of one of the characters you’re giving away too much and “spoiling” the viewing experience, then where is the ability to, conversationally, develop any kind of critical language? I’m not talking specifically about professional critics here: they’re doing fine, and a discussion of the state of professional criticism would be a fairly different thing13. I’m talking about the ability to, over cocktails or pie, talk about why something does or doesn’t work.

So you become a CPG, except this time instead of literal photos, it’s social media statuses and tweets and tumbls. And that was less-obtrusive (actually, liveblogging and, later, live-tweeting looked like they were going to be a pretty cool thing there for a while), until it, too, started requiring writers’ rooms to start picking out scenes and phrases to hang at the bottom of the screen as a recommended hashtag, or to bumper shows with announcements for following the show. And then you’re already spending your time documenting – I was there, I am here, I did this – instead of watching. And then you’re a speed-reader again: you’re watching because if you don’t you’ll be spoiled, but instead of watching you’re talking about watching (at the behest of the production company, even).

I’m making no argument for the purity of television, nor do I particularly blame people for doing something other than watch tv (seriously, my Threes score has much, much more to do with whatever I’m ignoring on television than it does with my devotion to the game itself). This was also not an attempt to pick on television, specifically. It’s actually wrestling with something that comic books have been dealing with for a couple of decades now (single-issue readers vs. trade paperback readers being the over-the-air vs. Netflix of the comic book world). But the fact that there is now a burgeoning trade in talking about it publicly means that it’s all coming to a head at once, and it’s beginning to reach a point where there is very little space for anything critical.

NEXT: Can you believe there’s more of this? There totally is! But this time it’s about individual opinions and the role of criticism therein. Doesn’t that sound like more fun? I bet it does! Stay tuned!

1 well, at least without turning it into that kind of contest. There is still the possibility that it could be a contest of the use of critical language, which is its own thing that, frankly, I’m not getting into here.
2 there has always been television criticism, at least in the form of reviews (although also in a great many “this is almost certainly terrible” tomes, the most famous of which is Amusing Ourselves to Death).
3 admittedly, people whose general television approach is relatively-uncoupled from the showrunner/writer/creator probably won’t find much to say about Deadwood, Treme, Running Wilde, Undercovers or Bunheads.
4 this list is by no means comprehensive, and was mostly picked so that I could make the point in FN 3 that shows are much more than their creators as revealed by their difficulty with second acts. So not present are: Jenji Kohan, Aaron Sorkin, Rob Thomas, Mike Schur, Larry David, etc. etc. etc. and on and on and on.
5 although if this continues to get worse before it gets better, the same mindset will be applied to television that’s applied to movies, and we’re going to have nothing but titles and air-times as a basis to decide whether or not to watch
6 the adventure story tradition itself is a discussion for another time, but is another victim of the amped-up nature of people’s demands on popular entertainment.
7 “space opera” is, here, a term of art – it’s the glorious descendant of “soap opera” and “horse opera”, and it refers to stories that are, well, more operatic in tone. Whether you think that’s a good thing or not is entirely up to you, but the easiest way to explain “science fiction” vs “space opera” is that it’s precisely the difference between Star Trek and Star Wars. Or, actually, Star Trek as it existed prior to J.J. Abrams and Star Trek since J.J. Abrams. In fact, if it helps and you’re not familiar with Doctor Who, you can replace every instance of it with “Star Trek” and you will have the exact same situation. It’s a rough time for science fiction franchises.
8 or the Mirror Universe, for you Star Trek people
9 or the Borg (or even The Romulans if you want me to stick to TOS, it’s the same thing: the pursuit of logic above actual creative thought)
10 Klingons
11 “The City on the Edge of Forever,” or, more similarly “The Inner Light”
12 perhaps most infuriating about the whole thing is the way that dramatic television has forgone the things that it actually can do better than any other medium – that is, exist in the same place with the same characters week after week – and given it up to be more like twelve-hour movies. When it works it’s really something to see, but when it wobbles it’s almost impossible to not be frustrated with.
13 albeit a more heartening one – non-academic criticism is one of the only areas of reportage that has actually gotten more even-handed and less emotionally charged over the last few decades. I have ideas why this is the case, but I’ll spare you another several-hundred-word footnote


Jaye Davidson Has a Dong (Part 1)

I have no stake in the arguments over whether the internet makes discourse better or worse – I tend to think that people argue the way that people argue, with whatever tools and basic language faculties they have, and that language is always, by its very nature, going to be inadequate to the task of expressing thoughts1 – but I do have a stake in the people who are doing everything in their power to reduce the discourse itself to “I’VE SEEN THAT.”

I’m talking about a couple of different groups of people here, the “SPOILER ALERT” Declaimers and the Constant Photo-Gatherers. The SADs are much more common in the online world than the real world, although they’re making their encroachment into real conversations (“oh my god you guys can’t talk about True Detective I have to weave a whole new wall-hanging before I can even start it”). The CPGs are much more of a real-world phenomenon (and are, in fact, pretty well-documented as being really annoying), but I want to say that these two behaviors are actually the same thing.

And I want to say it by talking about speed-reading. Recently, there’s been an enormous number of words given over to various speed-reading techniques, primarily because people have started figuring out that you can use apps and things to train yourself to speed-read. So people have been doing it, and claiming increases in reading speed of multiple orders of magnitude in some cases. And this is literally the whole problem.

There’s nothing wrong with not liking to read. It’s like any other way to spend time. I don’t particularly get much out of serious television (unless it’s funny, I’m almost impossible to engage when you’ve got a commercial break every eight minutes). I find the opera to be inscrutable. I hate football. I am extremely picky about which video games I spend time with. And I think everyone has basically the same set of reactions, jumbled up, to things. So it stands to reason that there are people who feel about reading (or reading fiction, or reading nonfiction, or reading non-comics, or whatever) the way I feel about the opera (or television, or football, or video games, or whatever). But it seems to me that if you don’t like doing something, then it’s probably best to just not do it, right? We’re all only here for a little while, after all. And if you do like something, why treat it like a chore?

This is, I posit, because much of the discussion has evolved so that the only way things can be discussed is in terms of achievement. Easily-swapped ebook files, entire seasons of television shows available to stream (or to download via torrent), and more-or-less constant access to any given thing you could possibly want to look at, hear, or read have meant that the scarcity that drove the discourse for a long time is essentially gone2, which means that the mere experience itself no longer matters very much – if it’s no longer difficult to find something, or costly to acquire it, finding it and acquiring it is cheap. This is, I hasten to add, exactly as it should be: I’m a pretty big proponent of “everything should be available all the time to everyone,” and give no weight to the idea that exclusivity is a mark of quality.

What that difficulty of acquisition did create, however, was a sign of dedication, and that sign is no longer a viable one. There has always been a large degree of fandom (NB: the “classical” use of “fandom” as an abstract noun rather than its current popular use as a mass noun. This is deliberate and if you skipped FN1, now is the time to go back and read it) that has been something like sumo-wrestling contests – showing off your level of intensity and devotion by sheer mass. This still lives on in popular culture as the idea of the collector as hoarder or minutiae-obsessive3, which is so pervasive I don’t even have to list examples here. Or, given that you’re reading this and thus have some familiarity with who I am (even if only through my work here), I don’t even have to tell you that I know these people for the same reason that I know people that breathe air and people that aren’t half-duck. And I’ve been having conversations about things that I was obsessed with/was baffled by/completely hated/was indifferent to with people who were equally obsessed/baffled/hating/indifferent for basically as long as I’ve been having conversations4. And so I’ve been a part of the sumo-wrestling. And I’ve won some, and I’ve lost some, and that’s a really shitty way to look at liking something, but that kind of jockeying for position is kind of what we’re left with5. But now that the experience itself is cheap, how to prove that you’ve had it?

Well, you can either maintain the idea that seeing something enters you into a secret club, or you can announce your intention to experience it fully just as soon as you have time. Handily, both of these things require that you scream at anyone who so much as mentions a single detail about the plot, and, lord forbid, the ending. A significant percentage of the SPOILER ALERT population are people who are literally trying to police the conversation to convert the essential purpose of the conversation from one of sharing information (“I can’t believe the yellow king was actually Alexandra Daddario’s boobs that whole time!”) to one of drawing attention to one’s obsession with a thing by 1) pointing out loudly an intention to see it and 2) making it known that one’s relationship with the thing is so intense that knowing any piece of information is liable to “spoil” the entire experience6.

The other “problem” presented by spoilers, of course, is the fact that once the information is available to people, there is nothing special about having the information. So you keep a tight rein on what information there is (by loudly shaming people that would share it), even down to costume designs or events that may, potentially, happen in a story7. Because then you maintain the illusion that there is some sort of exclusivity in having seen something.

Maintenance of that illusion, then, means that whatever meager currency there is to be gained by “having seen” undergoes downright Polish levels of hyperinflation. And so speed reading. And so second screening (watching one thing while doing something else on another screen, which is something I’ve been guilty of doing more-or-less since I was old enough to read). And so the mad race to “have finished” and to have the knowledge and to be able to be the person who looks to the other person at the party and says “oh my god you haven’t seen The Wire, well I guess we can’t talk about it so I don’t spoil it for you. You have to watch it. I binge-watched the whole series in a weekend8.” Thus derailing what could be another conversation about appreciation or the value of television or even the consequences of the drug trade or, hell, even the benefits of hiring excellent accent coaches and converting it into “I have done a thing,” and conflating the idea of critical speech with the idea of achievement, which then means that saying “I didn’t like that thing you did” becomes even harder to parse from “I don’t like you.”

If only these people had an easier way to prove that they actually did something. But that would require that books, and movies, and television shows, and what-have-you become events, where you can pay admission and then walk through the door and take pictures of yourself in front of the stage/screen/on the convention floor/touching the author. Oh, wait.

They’re the same people, and they’re doing an enormous amount of damage to the way we talk about things.

NEXT WEEK: So why is this bad for anything except human conversations? Well, it’s bad for television, and since that’s where a lot of cues are coming from, that means it’s bad, ultimately, for everything up to and including bears. Stay tuned!

1 This is the longest footnote I’ve ever written: words can only define things, and even then only define them for the subset of people who speak the same language, and even then the subset of those people who have that word in their vocabulary, and even then those people who agree with the usage. This isn’t actually going to be about language, but it’s important to establish up front that if you want to talk about why people are super-annoying, you have to obviate the bogeyman of usage that’s different from mine. Someone who talks differently than me (or you) doesn’t talk “wrong,” they talk “differently.” Dialects (defined for our quick and dirty needs here as “the variations in a common language because of group identification”) exist because of the need for specific communities to communicate metatextual information to each other in a hurry. Consider: “my bad” or “keep it hundred” or “thug” or “twerk”, each of which can be used by any number of groups, and commonly is, but communicates something different, even if the denotation is the same, depending on the user and his/her audience*. Internet-borne dialects are as old as the internet itself and have, as usage of the internet as a means of conversation and, indeed, socialization has risen, changed ever-more rapidly to continue to provide differentiation between one group and another. The primary modes of our communication have gone to a place that is so beyond even the question of formality that even such formerly-impregnable plateaus as “grammar” are bent by every facebook post or text message. Note again: this is by no means a bad thing – there’s very little to be gained by insisting that everyone talk like you were taught to when you were in third grade, especially since odds are you don’t talk like you insist everyone should talk. Dialectal speech is proper speech, unless dictated otherwise by context (that is to say: I don’t think you should say “I don’t even.” or “This made me feel all the feels” in a job interview or dinner with the president or even around the office). The point of this extremely-lengthy discursion is to say: for a long time critics spent their early careers formulating the language with which to discuss the things they were offering critiques of (the dialect of critics is what you’re making fun of when you make fun of words like “plangent” or “cognoscenti” or “mise en scene”), and a lot of people don’t put that kind of effort into that specific form of expression because that isn’t a dialect their milieu requires the use of, and so are left, in a lot of ways, without the means to conversationally discuss something beyond “I liked it” or “I didn’t like it.” Which, again, is not bad, it’s just part of why things are the way they are. Anyway. The article is back up there. Better go read the rest of it.
* we’re going to largely leave race out of this footnote, to avoid clouding the issue by adding emotionally-charged elements, but surely you’re all smart enough to see where it comes into play
2 not entirely gone, mind you. The fact that we’re keeping around an enormous amount of the ephemera – and especially the critical ephemera – of bygone days means that we have an enormous catalog of objects that are largely-unavailable. My personal favorite example is Don DeLillo’s (written as Cleo Birdwell) Amazons, which is out of print and largely ignored by its author, who has never taken official credit for it. On a side note, if anyone has a copy of Amazons, I am prepared to do a lot of things to get my hands on it.
3 actually, the “hoarder” vs. “minutiae-obsessive” is basically the difference between a glutton and a gourmand. They’re not mutually exclusive, but they are the same sort of thing. And so eating a cheeseburger because it’s a cheeseburger and eating a patty ground and mixed by Pat LaFrieda, cooked to your specific instruction and topped with aged cheddar cheese, red onion jam and tomato chutney on a brioche roll are, to people who are neither concerned with the number of cheeseburgers they eat nor the specific combination of ingredients that go into the cheeseburger, the exact same action.
4 actually, I’ve been an obsessive person for as long as I’ve had interests, and I can say with some authority that when I say “conversations” what I mean, at least 33% of the time, is “filibusters”
5 there are, I grant you, people who don’t have their conversations these ways. This is a generalization, and it governs an enormous part of the most visible parts of these kinds of discourses, so they’re metonyms, here.
6 this doesn’t make any sense to me either, but I didn’t choose the word “spoiler”. See FN1 again, I guess?
7 This happens even when the thing in question is an adaptation of another thing, in which case it’s probably useful to point out that the fucking movie is in color, and wonder how they deal with how Jonas comes to see color because it’s the turning point of the fucking book and jesus christ it’s been there for decades why are you being such weirdoes about this?
8 True story: the reason every single person you’ve ever met who has told you to watch The Wire has delivered this exact same monologue is because it’s actually what it tells you to say in the last frame of the last episode. Also, you should really watch The Wire.


Breakfasting Outside the Bun or: How I learned to stop breakfasting and live mas


So obviously Taco Bell’s marketing people have been entirely taken over by extremely stoned weirdoes. Extremely stoned because, well. I can’t imagine that someone that wanted the “research and development” job at Taco Bell would be anything else1. Weirdoes because this is a piece about breakfast at Taco Bell.


I’ll assume you’ve heard about it by now, and have gotten all the goggling and boggling out of your head, but in case you haven’t, go ahead and read that sentence again.

So, fast-food breakfast is, in many ways, just about the best excuse there is for fast food in general2. Subway entered the fray a few years ago, and that was inexplicable (and, also, terrible). Wendy’s had brief flirtations with breakfast for a few years there, and then abandoned them3. In fact, there hasn’t been a meaningfully penetrative entrée into the fast-food breakfast wars since, arguably, Starbucks started serving those wee sandwiches4.

But Taco Bell has an advantage over those places. There’s not really any such thing as a “breakfast sub,” and when your chain is known for its almighty Spicy Chicken sandwich, its dollar chili and its square burgers, it’s not really an easy transition into “breakfast.” But the breakfast burrito is a noble and honorable tradition. Beans, rice, egg, tortilla, salsa. Really, this should be a slam dunk and I should be reduced to writing this piece about how it’s impossible to get truly spicy food at Taco Bell.

But that’s not the direction they went with this. Ignoring the traditions even of such taco-friendly preparations as huevos rancheros and frijoles refritos con salsa, we instead get the same sort of tortilla-wrapped food-pile that Taco Bell has been rearranging for years, except now they’ve got, like, fast-food-breakfast items on them. I don’t know.

The breakfast items, then. Yes, there is a breakfast burrito (I mean, they’re also not stupid), although it’s just, like, eggs, cheese and meat. That is, actually, no different from the weird prefab “burritos” you can get at McDonald’s (although, presumably, Taco Bell’s can be made on-site). You’re already losing, Taco Bell. There’s a “grilled taco” thing, also with bacon or sausage, also with eggs and cheese. So it’s the same as their burrito, only folded differently and squished in that sandwich-press thing they use to make my beloved crunchwrap supremes5. Speaking of! There’s a breakfast version. It’s got a hash brown instead of the middle crunchy tortilla, which is… weird. Plus more sausage. It is, however, not the weirdest idea they’ve had.

There are cinnabon bites, which are apparently wee frosting-filled dough-balls. It’s been a long, long time since I consumed a cinnabon, but the appeal isn’t the frosting, and the frosting-focused cinnabon balls seem like the sort of thing that appeals to people that… don’t have tastebuds? Hate their lives? I don’t know. Spoiler alert: by the end of this article, I will know.

The marquee, headline act, the star of the marketing campaign and, seemingly, the raison d’être of the Taco Bell War on Breakfast, however, is the waffle taco. It’s a waffle, folded up like a taco, with sausage and egg in it, that you then dump syrup on. There is no way that’s a good idea. It doesn’t even look good in the ads. It is, of course, little more than a marketing hook, and that’s been Taco Bell’s thing for a while: come up with a foodstuff so attention-grabbing that people talk about it, and then they eat it. It’s why they keep coming up in this space.

And talk about it we have, but really, that’s not food. That’s a terrible idea that’s then covered in syrup, which is its own separate terrible idea.

So, down to the eating. These are the things I do for you people.

So I ordered a bunch of these things and cut them in half. Sharing is caring, and heart blockage is heart blockage.

The first course was the breakfast taco, which I got with bacon. There is a thought that goes with nearly all of the breakfast offerings, and it is this: these eggs need help. I realize that fast-food eggs aren’t really the fancy delicacies of, say, my own scrambled eggs, and they aren’t actually that bad, but they’re bland. Bland and kind of…chewy. The texture was really the weirdest thing to get over with the breakfast taco: it’s a layer of eggs, some bacon bits over the top, and some cheese to hold the whole thing together, smashed in the aforementioned taco-squisher that they use to make all their “Grilled” foods. It’s fine. It benefits greatly from being grilled, but the items are held in there pretty loosely, so this is the one that was most likely to spill apart.

Next up was the burrito. I tried the sausage version (I am assured by the website that it exists in bean form, but either the Taco Bell I went to doesn’t carry it that way, or their menu is doing its best to hide that option, so sausage it is). It was basically-indistinguishable from a McDonald’s breakfast burrito. Or, say, a cafeteria steam-tray breakfast burrito. Or really any other dollar breakfast burrito in the world. Which is to say: pretty satisfying. Being all held in by the tortilla instead of loosely sprinkled helps edibility, and there was a pretty good amount of cheese. The sausage was surprisingly-not-bad, the eggs had the same problems they had on the taco (they are, after all, the same eggs), but it mattered less. The sausage added moisture to them that the dry bacon bits really didn’t.

And finally, the pièce de résistance. The A.M. Crunchwrap was the biggest question mark. What are the odds that it deserves the name of crunchwrap? The original crunchwrap supreme is such a majestic example of the power of terrible fast food to heal. It’s a smallish pile of ground beef, a shot of sour cream from that gun-thing, a sprinkle of gross factory tomato, and a warm, fatty blanket of cheese sauce that’s all lying tostada-like on a crunchy tortilla until the whole thing is smothered in another tortilla so you can pick it up and stuff it in your mouth. It is not a thing of half-measures: it is terrible food that is terrible for you. There is no redeeming value. There is no “at least it’s only made with chicken” here. This is an acknowledgment that you are eating food from a place that deals in death, or colon cancer, or death by colon cancer. And you pick up the tortilla that’s hugging all the other ingredients close to its floury bosom, and you take a bite, and the power and majesty of science is brought crashing before you: from cheap manufactured ingredients comes forth absolute satisfaction.

The A.M. crunchwrap has a lot to live up to.

And it kind of doesn’t, but it’s admirable how close it gets. The idea is the same, except the middle, crunchy tortilla is replaced by a hash brown, and the meat in this case was “steak6”, which I’m comfortable with – it was texturally pretty good, had a better flavor than the sausage, and after encountering the brittle bacon-grit that was in the taco, I think I’m happy enough to join any “anti-Taco Bell-bacon” protests that may happen down the line. The middle hash brown did a pretty good job – it was wider and thinner than the average fast food hash brown. The cheese did the same job it always does in the crunchwrap arrangement. It’s not the crunchwrap supreme – it lacks magic. But as non-magical breakfasts go, you could do a lot worse. I can’t imagine eating the hash brown on its own, though. That thing was pretty clearly a grease trap.

So then, the cinnabon balls. They were fine. I mean, if you want a doughnut hole that has “the jizz” inside of it. It tasted more-or-less like I remember a cinnabon tasting. They come in 2, 4 or 12, which is kind of horrifying, but made it easy enough: I only had to negotiate two balls. The balls were oddly-sized – it was unclear if I was to stuff each ball whole into my mouth, or bite each ball in half and watch the jizz ooze out of the remaining half. I ended up just stuffing the balls into my mouth. The balls were covered in non-powdered sugar, which is slightly better than powdered sugar in the sense that I don’t have to worry about breathing it in, and slightly worse than not being covered in anything because I hate having ball residue on my hands.

Balls.

Anyway, Taco Bell breakfast. It’s fine. It’s cheap, predominantly, which is the appeal of anything at Taco Bell. And, if you avoid the bacon, it’s relatively tasty. But it’s not as good as Taco Bell lunch, so if you’re already committed to going to Taco Bell that day, just wait until 11:30 and get a proper crunchwrap supreme. It’s what I would do.

1 hell, they probably make sure that you’ve combined menu items in a fit of marijuana-induced stupor before they even let you in the door. Because, seriously, that is how new Taco Bell menu items are created
2 the other good excuses are: the almighty Wendy’s Spicy Chicken Sandwich, a hot Whopper, and, admittedly, the Crunchwrap Supreme, about which you’re going to hear more later.
3 if you don’t remember them, that’s probably perfectly normal: I live in a college town in Ohio (i.e. a test market), and as such our Wendy’s does all kinds of crazy shit that the rest of the country doesn’t hear much about.
4 and even then, they already had bagels and stuff, so you were already thinking “breakfast” when you looked at their wares. It’s just that the wee sandwiches are so much more fast-food-y.
5 crunchwraps supreme? cruncheswrap supspremes?
6 this is actually my first experience with Taco Bell’s “steak” – the only item from Taco Bell I get that has meat in it is the crunchwrap supreme (although, in the interest of full disclosure, if I want a giant Diet Pepsi and get the combo thingy, I also end up with a taco that I generally do eat). I think I tried something made with chicken in it a few years ago, and decided that it was better to not go to Taco Bell than to go to Taco Bell and economize on my heart-murder.


The 2014 MTV Movie Awards

Ah, the MTV Movie Awards. On the one hand, this is one of the awards-show highlights of the year, because it’s the one that’s the most likely to veer off the rails into “HUH?”sylvania1. And so the winners and things are generally pretty beside the point – nobody is really paying much attention, but it’s outside of the normal awards season and gives awards out to things like explosions, so it’s pretty good to have it, especially as it’s a flavorful burst of summer at a time when every single viewing of the sun seems like a cruel joke that will soon be yanked away from us heartlessly.


Anyway. You know how this goes.

Best Hero
Right. Well, Superman is a giant yawn who once decided to walk instead of flying across the country. He also, back when he was Cleveland’s own, couldn’t actually fly, and jumped instead, which is much cooler. There are not words in this column, nay in this blog in toto, to describe the “eh” I feel for Superman. So it’s not him. Thor has only been a “hero” for the last forty years or so, as for the millennium or so before that he was pretty much a spiteful doofus who, nonetheless, ate his goats every night and then revived them every morning. So he’s disqualified on the percentage game. Iron Man, though, right? Yeah, I just feel like for every good storyline (his alcoholism, say) there’s one where he’s got a bunch of extra brain tissue in his body and that’s why he’s so smart2, or, like, Civil War. So no. Not a hero. Luckily, someone is up for playing John Cale, and John Cale is basically the biggest hero I’ve got left IRL, so he wins.

THE RIGHTFUL WINNER: Wait. You mean this is just for their depiction in movies in the last year? And John Cale is the name of Channing Tatum’s character in White House Down? This is the dumbest category I’ve ever heard of. The biggest hero in cinema last year was Will Forte in Nebraska. Give it to him. Or actual John Cale, if you can get him. He deserves it.

Best Cameo Performance
Given that this category is pretty transparently an excuse to get Kanye into the ceremony so he can do some crazy-ass shit onstage, I feel ok in taking a pass on trying to figure out if any of these were actually good cameos.

THE RIGHTFUL WINNER: Still John Cale, actually.

Best Musical Moment
You know what people are really annoying? I’ve said this before, but hear me out. You know what people are really annoying? People that whine about MTV not playing music videos anymore. You can probably dig around on this site to find out why, or else just read the footnote3, but the point is: whatever the status of the “M” in their name, their awards shows are generally pretty good places to see musical performances. That said, I guess it would be unseemly to nominate yourself for this sort of thing, so I totally understand why they don’t. I mean, insofar as I understand “why” and not “why, specifically.” Anyway. I will say right now, any a cappella rendition of “Barracuda” gets my vote every single time. Dum digga dum digga dum digga DAH DAAAAAAAAAHH

THE RIGHTFUL WINNER: Melissa McCarthy, “Barracuda,” Identity Thief

Best On-Screen Transformation
How mad do you think Monica Brazelton would be about the fact that the actor gets the credit for their transformation instead of the costume designers and makeup artists? Man, she would be pissed. Since I don’t want an angry fictional wardrobe assistant all up in my fries, I think it’s best to stay out of this one. Oh, and also because I feel a little uncomfortable about the sociopolitics of Dallas Buyers Club. That, too.

THE RIGHTFUL WINNER: What about that Russian muppet that has to learn to pretend to be Kermit? That’s a pretty righteous on-screen transformation. Is he around?

Best Villain
On the one hand, actual Somali pirates like those played by Barkhad Abdi and actual slave owners like those played by Michael Fassbender are, while not in anything like the same league, both actual villains, but I feel that categorizing real people (or the fictional representatives thereof) as “villains” is reductive in a way that means we don’t actually think about things. On the other hand, I want to hate Michael Fassbender on principle4. On the other other hand, when you’re making up a fictional villain, you can make them sympathetic in a way that you can’t with real people which, paradoxically, means they’re less villainous. This sort of rules out Mila Kunis (who’s a victim of circumstance, really) and Donald Sutherland (who, after a fashion, was doing what he believed to be right). Ideally that would leave us with Benedict Cumberbatch, but he’s being nominated for his portrayal of Khan (ugh), rather than Smaug or the Necromancer/Sauron, either of which is ten times the villain that Khan (ugh) was. So.

THE RIGHTFUL WINNER: Michael Fassbender, but not as the slave-owner guy. Just actual, real-life Michael Fassbender. That handsome fuck.

#WTF Moment
Why did this category get a hashtag added to it? That’s really stupid. Anyway. They should have to limit it to films that someone actually tweeted (or tumbld) #wtf during the watching of. Don’t tweet in theaters kids. It means you’re a jerk. Anyway, the entire existence of The Counselor is so baffling to me that any moment from it counts.

THE RIGHTFUL WINNER: Car Sex (or anything else, seriously), The Counselor

Best Shirtless Performance
EQUAL OPPORTUNITY BOOB-OGLING. I’d like to say that’s what we’re about here at ONAT, but it really isn’t. Fewer awards for boob-oglers. Or from boob-oglers, as it were. Anyway. It’s obviously not Jennifer Aniston or Zac Efron, whose toplessness in a movie is basically a marketing stunt. Kudos to Leonardo DiCaprio and all that but he’s (sorry) outgunned (sorry) here. So Finnick or Thor? I think we know where this is going.

THE RIGHTFUL WINNER: Chris Hemsworth

Best On-Screen Duo
You know, if I weren’t such a classy blogger, I would make a joke about how the “best on-screen duo” award could also be about boobs. Turns out I’m a fucking gentleman. Anyway, still not into Dallas Buyers Club. Also not feeling American Hustle or Ride Along. So Leonardo DiCaprio and Jonah Hill or Paul Walker and Vin Diesel?

THE RIGHTFUL WINNER: I mean, it’s Jonah Hill and Leonardo DiCaprio. But since Paul Walker is dead and was affiliated with MTV pictures for a while, I suppose it would be unfair to not consider him. Unfortunately for his award-hungry ghost, I don’t work for MTV.

Best Scared-as-Shit Performance
Actually, it’s “Scared-as-S**t” on the website, and I’m sure it will be cleverly and coyly bleeped out. But the word is “shit” and it’s up there AND THERE’S NOTHING YOU CAN DO ABOUT IT. Anyway. World War Z was terrible. The Purge was terrible. Mama was pretty good. Vera Farmiga is pretty. Hey Rose Byrne!

THE RIGHTFUL WINNER: Jessica Chastain, Mama

Best Comedic Performance
Does it seem to anyone else that We’re the Millers came out, like, a million years ago, and not, in fact, last year? Anyway. Last year didn’t seem like a terrible year for comedies, but the MTV Movie Awards sure are making it seem like one. Why is Ride Along nominated for so many awards? This is all terrible. You people should all be ashamed of yourselves.

THE RIGHTFUL WINNER: Mrs. Coach’s hair. I don’t have a tie-in there, but it’s got to win something, and this seems like the category I have the least to say about.  

Best Fight
Wait. This Is the End is from the year all of this is nominated for. So is the confusingly-similarly-titled The World’s End. So was Anchorman 2. Why the hell is the comedic performance category so terrible? Anyway. Why is The World’s End not nominated in this category? The fight scene between Nick Frost and A Bar Full of Recently Revealed to Be Androids (the one where he beats them all up with a stool) is the best fight scene I’ve seen in years. This is why it is good to be the arbitrating power behind all awards bodies.

THE RIGHTFUL WINNER: Nick Frost, The World’s End.

Best Kiss
Much like the Best Fight category, these nominees are all pretty ridiculous. I mean, the category is pretty ridiculous, but I do remember that in years past, it was at least populated by kisses that were somewhat serious. And I guess that’s what The Spectacular Now is all about. The rest of this stuff is, contextually, all presenting sexuality (in the form of weirdly wet kissing) as some kind of weapon, and that’s, as evidenced by the number of nominees for whom it’s true here, so old hat at this point that what the hell is the point? If the point is “attractive people making smoochy motions at each others’ smoochy parts”, then I suppose Don Jon would have it in a walk. But let’s give out a statement award with this one, shall we? Let’s stop letting “women kissing up on each other” stand in for “this character is a tiny bit awful” (American Hustle, We’re the Millers and Spring Breakers, which is most of the nominees, use it this way, albeit in We’re the Millers’ case comedically). And, what the hell, let’s get excited about romantic protagonists smooching their love interests.

THE RIGHTFUL WINNER: Shailene Woodley and Miles Teller, The Spectacular Now

Breakthrough Performance
This isn’t a bad category, no. It’s actually kind of a good category. Hopefully Miles Teller and Liam James can be around to ply some troth for a while, and it wouldn’t be surprising if Will Poulter and Margot Robbie stayed around for a while either. No, all of that is fine. In fact, I’ll go ahead and say that Miles Teller should probably have it. But we need to talk about this Michael B. Jordan thing. The dude has already been on the finest of television programs ever made, as the quarterback for the East Dillon Lions on Friday Night Lights. And if you deny that that counts as “breakout,” consider, then, that his first major television role was on the goddamned Wire. He’s broken out. Seriously.

THE RIGHTFUL WINNER: Miles Teller, as promised. But seriously. Michael B. Jordan is the best.

Best Male Performance
Most of this is pretty by-the-numbers: at this point, among Bradley Cooper, Chiwetel Ejiofor, Leonardo DiCaprio and Matthew McConaughey, you know who you favor. You’ve at least seen the clips that they send to awards shows, and frankly, I don’t have much invested in telling you you’re wrong5. I would like to address, however, the presence of Josh Hutcherson here in this category. To wit: The hell?! Or, in the parlance of the MTV Movie Awards, #WTF? Josh Hutcherson is not the best male performer. He’s only nominally the latter. I don’t think André Breton would have nominated Josh Hutcherson here, and you don’t even know who that is.

THE RIGHTFUL WINNER: Chiwetel Ejiofor.

Best Female Performance
While it’s neat that they don’t separate out the “leading” and “supporting” categories, doesn’t that also make it confusing for Jennifer Lawrence? Probably not. Or, alternately, probably. She seems confused by lots of things. Anyway. Who are the people that made We’re the Millers bribing? Is someone on the nominating committee so enamored of Jennifer Aniston-as-stripper that the movie has to be nominated in every category? It’s not her. It’s also not Katniss, to ease up on the aforementioned confusion. At this point, we’re left with the same Amy Adams/Sandra Bullock/Lupita Nyong’o competition we have at every awards show. And it always comes out the same.

THE RIGHTFUL WINNER: Lupita Nyong’o

Movie of the Year
MOVIE OF THE YEAR. Movie of the year. Yes indeedy, movie of the year. I very nearly had a heart attack when I saw that We’re the Millers wasn’t on this list. I guess everything else is the consolation for not being tapped for the big one. So sad. I can’t give an award to the middle part of a story, especially since The Hunger Games and The Hobbit are exercises in exactly how many parts you can split something into in order to maximize profits6, so it’s not either of them. The Wolf of Wall Street needed more “Science Oven”, American Hustle needed more Jonah Hill’s Prosthetic Genitals. 12 Years a Slave seems kind of dour for an MTV Movie Award, yes?

THE RIGHTFUL WINNER: We’re the Millers. duh.

And that’s it! Summer awards season has begun, so expect things to only get wackier from here!

1 also people dress like clown prostitutes, and that’s always the funniest part.
2 I know that that sounds like a joke. I assure you it is not. He developed neural tissue all throughout his body, not just in his brain. This is what happens when you let Orson Scott Card* write your superhero.
* No, really. Still not making this up.
3 because when it did play music videos, most of them sucked, and nobody should be nostalgic for the days when you had to wait an hour to see the one good video in their rotation.
4 he’s handsomer than Jon Hamm, makes better acting decisions than Jon Hamm, is more talented than Jon Hamm, is funnier than Jon Hamm, and, well, we’ve all seen Shame, and, frankly, I think if someone doesn’t start hating him then life will continue to be the least fair thing in the universe. Maybe he’s just a tool created by Jon Hamm so that people don’t hate Jon Hamm out of jealousy of Jon Hamm. Which would make Michael Fassbender sympathetic in addition to all that. I DON’T NEED THIS KIND OF THING, PEOPLE.
5 rest assured, however, that you are wrong, simply by dint of not being me.
6 I haven’t said it in these parts: this is actually especially stupid in the case of The Hunger Games, which could, actually, have been four movies. The first book is two completely distinct stories: the setup bit where we see the world and its characters, and the actual games. But instead of taking the first story and splitting it up (like it should be), they’re splitting up the last book, which barely has enough plot in it to be a full story in and of itself. THIS IS DUMB.


On Rocks, Rolls and Halls of Fame, as well as associated isms

Guys! The Rock and Roll Hall of Fame inductions are happening! I know I already wrote about their chances for accurately reflecting true Rock and Roll greatness, but it turns out I have some more to say!


You see, like all Halls of Fame, the Rock and Roll Hall of Fame simultaneously exists in two states: the theoretical, where it represents a canonization, a beacon to future generations that this, the set of things in here, is what was Important, and the physical, where it represents a collection of artifacts – this is Sly Stone’s jumpsuit, these are Kurt Cobain’s guitars, this is James Brown’s microphone, these are Elvis Presley’s boots.

The Rock and Roll Hall of Fame and Museum is pretty much an unqualified success in the latter sense: as a museum, it’s great. Videos, ample access to the songs themselves, rolling exhibits delving deeply into individual acts (the Rolling Stones the last time I was there, a rather impressive Pink Floyd1 exhibit the time prior). It’s basically what you’d expect, with the centerpiece (literally, as the showroom is in the center of the middle floor of the building) being an hour-plus video showing the names of every inductee, with clips of as many as the video designers could manage to include2.

But the Rock and Roll Hall of Fame’s strength as a museum is precisely a product of the divorce of the two things. The museum end – the part with all the stuff – doesn’t have to worry about the long-term canonization, and it doesn’t have to hold up a narrative. So the high points of the museum are free to be, say, an analog synth played by Milan Williams3 or the hand-written first-draft lyrics of “Here Comes a Regular”4, without the HOF itself having to worry about the long-term canonization prospects/narrative role of The Commodores or The Replacements in and of themselves.

I keep using the word “narrative,” and that’s not by accident. The Rock and Roll Hall of Fame as a Hall of Fame exists to preserve the story that was largely invented by Rolling Stone, the magazine that underwrites much of the HOF’s existence. The story of Rolling Stone, due to its position as the most popular source of modern music criticism, is largely the story of modern music criticism. It’s also responsible for the currently-bubbling-back-under idea that there is “serious” music made for “serious” consideration and “not serious” music. In the case of Rolling Stone, the house organ for rock music, the “serious” music meant the album-length, non-radio-friendly material. This attitude was borne, at the time, out of truth (the album-oriented-rock stuff was better, really). The problem is that “the most worthy stuff at the time” became equated with “the most worthy stuff”, and the set of attributes that make rock music seem great were applied, unilaterally, to every song, with things that failed to meet the circumscribed qualifications that made, say, a good Led Zeppelin record, being deemed “unworthy.” This fostered a contempt for pop music (not to mention country music, dance music6, and, eventually, hip-hop, although more on all of these things in a second). Thus the circle of “things that it’s cool to write about critically, and therefore ensconce in the idea of ‘serious things’” went little further than “rock music.”

Over time, people (still Rock People, mind you) gave way to new sets of younger Rock People, and they began to make allowances: country was still right out, but Johnny Cash and Willie Nelson weren’t. Rap was still hardly allowed to be considered music, except for Run DMC and Public Enemy. The Motown and Stax sounds, that had led so seamlessly into disco’s firmament, became lauded as the thing to which “black music” should aspire7. The thing that made all of this baffling in hindsight is that instead of allowing for, say, Johnny Cash (for it is Johnny who is something of the king of “reappropriation by people who have absolutely no idea about the tradition or even genre in which the great man worked”) to be an exception in and of himself, you got the argument that, indeed, he was somehow a rock musician (simply put: if rock = good, then also anything that is good = rock, right?)

And this is how it was, until about ten years ago, when two things happened in tandem that fractured the hegemony. The first is more interesting, but less germane: by 2005 or so, there was an entire generation of kids who were getting into music for whom file-trading and the like wasn’t some novelty to be enjoyed, but a fact of existence. The biggest blow to the record industry was, in fact, file trading, just not in the way they thought: when every kid can be exposed to every record, and everything arrives via the same distribution channel, then everything gets evaluated not on how much money was spent, or how much credibility can be bought by the ownership of an artifact, but based on how much they liked it. The record-selling industry has still not figured out a way to re-mobilize to convince people that 1) there isn’t an entire world out there made of bands playing whatever type of music you could ever, conceivably, want to hear and 2) they should pay the artificially-inflated prices that came from the sale to a captive audience (see FN5) that could pretty easily be convinced that the music covered therein was all the music that mattered.

The second thing, more to our point here, is the coinage of the term “rockism,” and the debate surrounding it. To wit: all of those ideas a couple of paragraphs ago that are so counterintuitive and clearly exist only to prop up a wing of an industry that levered itself around for so long on its own press were called out as being reductive. “Of course,” they said, “a Destiny’s Child song isn’t going to be the same thing as a White Stripes album8, why should it?” What an excellent question! And, for a brief, shining time, it looked like things were going to be sane. And then there was a weird game of one-upsmanship to enjoy music of outré genres in an attempt to prove your poptimism. This led to things like people buying Ne-Yo records.

And so for years this debate bubbled slowly – it’s still never come to any kind of real head – with people being branded with a scarlet R for not showing enough love to pop music or whatever. Some good was accomplished (a decrease in people that believe that any genre in particular is “only for those, lesser, people over there”, the rise in prominence of the inestimable Maura Johnston), and some bad continued (country music remains something of an untouchable caste, except insofar as it’s the genre where people continue to buy records, which seems like a weird disconnect that I’m not going to say much about here).

It should surprise none of you to learn where I come down in the argument. My lists of the best songs and albums of every year are publicly available and you can use them to piece together a pretty good idea of where I stand. But the short version: rockists are silly people who deny themselves a great deal of pleasure by insisting on a very narrow definition of quality, even if rock music itself is capable of being some extremely exciting, extremely provocative music. I like these babies, I just don’t like that bathwater.

The most recent (and largest in some time) bubble was a week or so ago, when Saul Austerlitz wrote a long-ish piece about how music critics aren’t snobby enough. It’s a piece about which I am ambivalent: I’m all for snobby. If you don’t think there are things that are bad, if you don’t think that there are better ways to spend your time, if you don’t think that there are things that aren’t worthy of your consideration, and if you make those decisions in a reasoned9 fashion, then by gum, you are prepared to have a debate, and to debate an opinion is literally the best thing you can do with an opinion. On the other hand, S.A. does a lot of sniffing and looking down his nose at how much of what people listen to is “pop music,” and, well, that’s not actually a reason to be snobby. Anyway, Austerlitz’s piece was the second shot in a volley started by Ted Gioia, about which the less said, the better. Suffice it to say: if Ted Gioia’s stupid article leads to an examination of poptimism as the New Critical Orthodoxy, then that will be exactly one worthwhile result of that article10. If this wing of the argument gains any momentum, it would seem to be effecting a change toward an appreciation of “brain” music, or even just “ears” music (as opposed to “adrenaline” or “ass” music), which probably will mean dusting off the old rockism trenches once more.

So what does all of this have to do with the Rock and Roll Hall of Fame, a pretty pyramid-shaped building on Lake Erie?

Well, they’re the ones keeping the home fires burning for this whole thing. See, that narrative above, as confusing as even those paragraphs are in this context, is the shortest-possible version of the narrative (and the fallout of that narrative) that carries the rock and roll hall of fame with it. One of the things that has been increasingly amusing (and remains so now – all of the microdebates among squabbling critics and bloggers don’t change the fact that, fundamentally, most people will remain unaware of it) is that even as the magazine that pays to keep the spotlights shining on Rob Halford’s codpiece has, as per Mr. Gioia, become, essentially, a lifestyle magazine11, the rock and roll hall of fame stands somewhat ahistorically.

And so it’s hard not to see the ideological component of the HOF (i.e. the propping up of the attitudes and language of rockism as though 1) the debate never happened and 2) people haven’t, by and large, stopped listening to new rock music12) as necessary. If, in 2014, we’re still going to have to go out there and strap on our fightin’ boots and march against this nonsense, it’s nice to have what is, essentially, a willing strawman, representing a consensus opinion that no one could possibly hold about what is and is not “worthy” of being housed.

If we’re going to include a discussion of the trappings of a genre in our discussion of the genre itself (and why shouldn’t we? the way a performer presents him or herself is obviously important enough to be included as part of the package, so why wouldn’t it be important enough to consider in the analysis?), then we can say that the most “rock and roll” thing about the Rock and Roll Hall of Fame is that it is a big, ostentatious, instantly-identifiable, loud object that is very much about insisting on its own self-importance.

So the induction ceremony is going to happen, again, and there’ll be one next year, and the year after that, because the Rock and Roll Hall of Fame is always going to be there. So go in and look at Nancy Wilson’s dresses, they’re pretty cool! But try not to take any of this very seriously, because the attitudes that it represents have been pretty reductive and harmful to a lot of artists and the subcultures they represent.

Oh, and I made it through this whole piece without mentioning punk rock, which almost needs its own set of considerations for the amount that it has been turned inside out in that building. Maybe next year.

1 nowhere is the Rock and Roll Hall of Fame’s revisionist schizophrenia more apparent than in their treatment of Pink Floyd – not inducted in their first year of eligibility, they were inducted and then immediately saturated the Hall of Fame environment. Prog rock’s treatment is pretty baffling in and of itself, but if Rush, Yes, et al. get the same eventual reception that Pink Floyd received, there’s not going to be room for much else.
2 This video is the most interesting thing in the museum for several reasons. The one I’m going to mention here is that it’s an opportunity to get an idea for the overall musical shape of things. Another is that old performance footage shows just how divorced from the sound of the actual band things got in the seventies and, especially, the eighties. Oh, and performance footage of Flavor Flav from 1992 or so is just about the most depressing thing I’ve ever seen.
3 an instrument so beautiful I almost proposed to it.
4 it used to be even sadder, guys. The original lyrics are also more clearly about Bob Stinson, which does absolutely nothing to make it less miserable.
5 consider, if you will, the benefit to the record labels whose material Rolling Stone is covering: the critical notion that the music that is “worthy” is not only housed on the more-expensive album format, but unless you listen to the then-nascent FM radio band late at night to hear album-oriented rock music, you won’t even be able to hear it unless you purchase it yourself. There’s no way to make an industry profitable quite like a captive audience.
6 perhaps no one suffered from the “things that don’t rock aren’t any good” backlash quite as much as disco, a form that was, for a very long time, reviled on a knee-jerk basis simply for being disco. You can find any number of sources that will argue that the anti-disco backlash was homophobic in nature, and I wouldn’t disagree too strenuously (a lot of the early criticism certainly took on a lot of that tone), but I think the even more basic impulse was to deny the notion that the Rock People, who had, by the time disco came along, spent a lot of their effort setting themselves up as gatekeepers of Cool Stuff, could be circumvented, and that disco music, which is easier to make, easier to play, and completely outside of the rock milieu, represented a world outside of the purview of the Rock People. The backlash was so heavy that, unlike other genres that the HOF would eventually subsume, disco remains something of a pariah.
7 this particular idea would retain traction for decades – that whatever represented what the Rock People thought was black music at the time wasn’t as good as it had been twenty years ago. Eventually, the alternative rap thing caught up and Rock People could just shake their heads and mumble about The Roots.
8 this is the mid-oughts, remember.
9 NB: reasoned, not informed – you don’t have to know anything about Imagine Dragons to know that Imagine Dragons suck, provided that you know the reason that you think they suck.
10 should you be interested in a piece from that perspective, here’s a Flavorwire article that also draws a connection between Gioia and Austerlitz.
11 albeit a lifestyle magazine that usually has a couple of ace writers on the staff and, until recently, included National Treasure Matt Taibbi, so it’s still not lost any stock as a worthwhile periodical.
12 this is actually a result of the Rock People’s attitudes: if you’re told, as a kid, that the best rock music happened thirty or forty years before you were born, and that nothing will ever be as good as it was when the generation or two before you were kids, how likely are you to go out and support someone who’s doing something new, or even to do that yourself? This more than anything – this enshrinement of and absolute insistence upon the old stuff – can speak to the enormous loss of market share of rock music.


The Academy of Country Music Awards

The Academy of Country Music Awards! These are totally different, and totally unrelated, to the CMA awards that I wrote about back in November. No sir, these are uh…CBSier, certainly.


Actually, these awards are sort of the last vestige of the “Nashville vs. Not-Nashville” fight in country music – the Country Music Association is Nashville-based and, as a result, tended1 to ignore the western-based artists. Now, of course, that rivalry doesn’t particularly exist, but we still have two award-granting bodies, so of course we also have two awards shows. Part of ABC’s stable for decades, the awards moved to CBS eleven years ago, and managed to not get any more interesting as a telecast.

Oh, and they also have the most annoying website of any awards show. Ah, well. On to the awards!

Songwriter of the Year
Among the dumb-ass things that the website does is not listing the reason why the person was nominated – that is, here we have “songwriter of the year,” but not actually any of the songs these people wrote this year, and there aren’t really many good resources for that sort of thing.

THE RIGHTFUL WINNER: Well, it’s either Shane McAnally or Luke Laird, but I guess they’re going to have to share it because I DON’T KNOW WHAT’S BEING CONSIDERED, HERE.

Vocal Event of the Year
Oh my god these are the dumbest award categories. I don’t know what a “vocal event” is. I thought it must be a live performance, but one of these nominations is for a remix and that, by its nature, cannot be a live performance. It started off so well! Anyway. It must refer to the fact that these all have guest stars. I’m completely and totally baffled why the members of Lady Antebellum are credited individually on Hootie’s cover of “Wagon Wheel,” but they are, and it’s just one more reason I believe this awards show to be custom-designed to prevent me from making any sense of it.

THE RIGHTFUL WINNER: Wait…what was the question?

Video of the Year
So seriously, when you look at the list of nominees on the website there’s the same publicity paragraph next to their thumbnail picture every single time. So, for example, Tim McGraw is nominated in this category (and also in that last category) for “Highway Don’t Care,” but the text where you’d expect an explanation begins “Superstar Tim McGraw charged into 2013 with his Big Machine records debut…”2. That tells me nothing about anything. And they’re all similarly helpful.

THE RIGHTFUL WINNER: Slow-motion footage of Mrs. Coach’s hair, blowing around.

Song of the Year
K, so, same deal with the text, but I can live with it. Do you suppose all those Blake/Miranda divorce rumors are because they take this awards show very seriously, and they can’t stand competing against one another? I just can’t see either of them winning, just in case. Gary Allan’s publicity paragraph begins “Dictionary.com defines freedom as…” and that makes me so angry that every time I see it I punch a hole in my computer monitor, and that’s making this a very expensive writeup. So: Lee Brice or Darius Rucker?

THE RIGHTFUL WINNER: Darius Rucker, because I’m used to “the Year” that an award is “of” being a pretty nebulously-defined thing.

OK, so, literally in the time since I wrote the first part of this, something has happened to the ACM Awards website and it is much easier to use. I do not know why this is the case, but it is what happened. So at least now I can stop whining about how terrible the website is. Anyway. Onward.

Single Record of the Year
Again, the distinction is made between record and song, and again it turns out not to matter much, because despite measuring two very different and distinct things, the two categories, once again, turn out to be about the same, effectively. Not to mention the same as the CMA Awards. Basically, this is an exercise in redundancy. I considered doing the Teen Choice Awards instead (which aired last week), but I didn’t want to spend several hundred words explaining to children why they suck, and also the Teen Choice Awards make me deeply uncomfortable.

THE RIGHTFUL WINNER: You, the readers, for receiving my largesse in writing about this pile of boring nonsense.

Album of the Year
The face of mainstream country really is this weird divide between a very specific kind of good-time party music and a very specific kind of love ballad3. I don’t think it’s ever been more circumscribed, and it’s really annoying. It makes it hard to remember which thing is which, in addition to making it hard to care. Luckily for all of us, there’s Kacey Musgraves.

THE RIGHTFUL WINNER: Kacey Musgraves

New Artist of the Year Presented by Kohl’s
Wait…Kohl’s? This would be surprising except they also are responsible for the Best New Artist category at the American Music Awards. Kohl’s: a company that apparently cares deeply about novelty in pop music. What’s even weirder is that these awards are on CBS, rather than ABC, but were on ABC for many years. Is Kohl’s somehow a holdover, or are they really cross-network, cross-genre sponsors of new artists? Anyway. Kip Moore and Justin Moore are not related. Brett Eldredge doesn’t have to share either of his names. I think the choice is clear, here.

THE RIGHTFUL WINNER: Brett Eldredge

Vocal Group of the Year
It’s just “Group of the Year,” really. It’s not like they’re giving awards to instrumental groups. Actually, one of the most interesting things about awards shows, and one of the reasons I write about as many of them as I can stomach, is the little holdovers from industry practices that are no longer observed. So, presumably, this was to identify the vocal groups as opposed to the backing bands that are a rich part of country music’s recorded history4. Anyway. That’s more interesting than this category, even though this might actually be the best category in the program. The problem is, for all that the Eli Young Band, Little Big Town and The Band Perry are pleasant enough, I’ve never really wanted to go listen to them. I do like the Zac Brown Band and Lady Antebellum. I suppose there just can’t be enough awards going to Lady Antebellum. On the plus side: Nickel Creek just made a new record, so hopefully this category will be a lot easier next year.

THE RIGHTFUL WINNER: Lady Antebellum

Vocal Duo of the Year
Man, I take back my complaint about that last category being dull. I also honestly had no idea Big & Rich were still a going concern. Or, rather, that it would be possible for an awards-granting body to try to convince me that they were still a going concern. Florida Georgia Line are even worse. Thompson Square and Dan & Shay aren’t really worth considering. Wasn’t there a First Aid Kit record last year? Shouldn’t they be here? Ah, well. Stupid nominating body. I guess that leaves Love and Theft.

THE RIGHTFUL WINNER: Love and Theft

Female Vocalist of the Year
At least as the categories become better-defined, it becomes easier to actually choose a winner. Taylor Swift, Sheryl Crow, and Kacey Musgraves aren’t very talented vocalists, despite making the best records of any of these people. Miranda Lambert and Carrie Underwood both are gifted vocalists, and this is a vocalist category.

THE RIGHTFUL WINNER: Miranda Lambert, I guess

Male Vocalist of the Year
I am all but officially out of things to say about this particular group of people. These weird mid-year awards shows are always the hardest to work up: most of these records are past the tail end of their cycle, and none of them is really exciting enough that there’s much vigor left in arguing for it. On top of which, Blake Shelton and Keith Urban are basically constants here, and Luke Bryan and Jason Aldean certainly appear to be well on their way to joining them. I guess that’s how we arrive at our answer: Lee Brice, who should probably enjoy his chance while he has it.

THE RIGHTFUL WINNER: Lee Brice

Entertainer of the Year Presented by Ram
OK, so, disappointment of the day: I know that the presenter of this award is Dodge Ram. I do. But how much better would any awards show be if the winners had to snatch their award from an actual ram? Now, before you go protesting that Blake Shelton is the biggest, strongest dude in the bunch and can probably take down any goddamn sheep, I invite you to remember: he’s also always drunk at awards shows. So can he manage to chase down a ram while intoxicated? I don’t think he can. I also think that, out of worry that her spouse5 will be trampled to death, Miranda Lambert will find herself too distracted to be a worthwhile contender for the award. Taylor Swift appears, in general, to be at least some part woodland creature, and so she would have a better-than-even shot if she can get ahold of it early. The two problems there are: she doesn’t really have the stamina for an endurance race against a ram, and, since she’s part-gazelle, she spooks easily, so if things got aggressive, she’d be too gun-shy to really go for it. Luke Bryan has the stamina, but he lacks the aggression to really get in there and get after the award. Besides which, knowing that Blake Shelton is going to drink himself and Miranda right out, the extremely-crafty George Strait is going to do his part to interfere with Luke most of all. Oh, he may be older, but that just means he’s been on this mountain many times before. He knows exactly how to get that award, and he knows who his competition is. Using his wiles from decades around the metaphorical Matterhorn, he’ll have the ram snared up just as soon as his competition wears themselves out or runs off in a state of high panic.

The award was his to take all along.

THE RIGHTFUL WINNER: George Strait

Alright that’s it! I have no idea what the next awards show is, but I’m certain that it’s got to have a new crop of things to award with other things, and therefore must be better!

1 the past tense is because the country music industry itself is pretty split between Nashville and California, since while it’s true that most of the creators are in and around Nashville, the business end of the whole thing is generally where the rest of the record business is.
2 Big Machine Records is owned and operated by Scott Borchetta and is the source of most of the nominees for this awards show (because that’s how these things tend to go), as well as most of the major players in country music. Nearly all of these people are on some subsidiary or other of Big Machine Records.
3 there are occasional exceptions, but, really, look at the country charts for the last couple of years and you’ll see maybe a dozen songs that aren’t one of those two things, and of those dozen, probably ten of them are “slice of life” songs about country-livin’. Blergh.
4 I think I talked about it back at the CMA Awards, but the gist is: country singers used to use studio bands rather than touring bands, and neither band was really “their” band, they were just “a” band. It still kind of works that way for a lot of popular music, but it’s no longer treated the same way by the public face of the industry.
5 I mean, the rumor mill would have it that their marriage is basically on death’s door, but they do still share a house and stuff, I’m sure there’s still some concern.


Interested in Endings

So at the time of this writing, the internet is on fire because How I Met Your Mother more-or-less unilaterally failed to make its fanbase happy with an ending that, well, sucked. In entirely-unrelated (and considerably less timely) circumstances, I am also a person who has recently viewed the fifth and final Twilight film, which more-or-less unilaterally failed to make me happy with an ending that, well, sucked. This also comes after a two-year stretch in which most of the major players in the most recent “golden age” (30 Rock, The Office, Fringe, Breaking Bad, Eastbound and Down, and Mad Men have all either ended within the last eighteen months or will end in the next few) have dealt with their endings to varying degrees of success1.


At the same time, the shift in recent years in film properties has been to extend the life of the source material (because, obviously, there’s always source material anymore), even beyond the now-standard “splitting books into parts” (which includes not only the aforementioned Twilight movies, but also Harry Potter, The Hunger Games, the conceptually-ridiculous three-part Hobbit and essentially any other franchise that seems to have enough legs to warrant being split ridiculously2), thus making it so that you can go as long as possible without actually having to deal with an ending.

Endings are hard – you’ve got a story that, due to the nature of serialization, people consume over long periods of time. Even if you’re an interested and voracious reader in the target audience, The Lord of the Rings still takes, say, a few days to get through. So the ending, the point at which it all stops, has a lot to do – it has to lower you back out into the world, to make it so that the story feels complete without making it feel airless3. In the world of television, especially, the ending also has the added challenge of having to be literally iconic. That is: some portion of it must be able to serve as a symbol for the entirety of the show4.

Perhaps because so much in television rides on endings, and because even short-lived television shows are on for, say, a year, the ending tends to be the thing that is remembered above all else5.  And so, generally, there are high expectations for television endings (provided, of course, that the show is ending while it still has an active fanbase to expect things out of it – shows that sort of bump to a stop aren’t really included in this, as they tend to have endings that are more geared toward function than fulfillment), and when things don’t go well, there is a problem.

Television is especially in a pickle in this regard – the nature of the medium, itself, is to offer pleasure. For all that people are writing about serious-minded and weighty television, at the end it still comes down to satisfaction. Perhaps because films are so much more worked-over (and have fewer parts), they don’t seem to be as prone to unsatisfying endings (although it does happen), and books are especially forgiving6. But films and books are also things that have an ingrained cultural expectation that they are permitted to be difficult. They are seen as serious intellectual pursuits, for reasons too variegated and mundane to address in this post. Television is still the common man’s tonic, and there is none more lowly than the traditional sitcom – just look at how much critically-acclaimed television is critically-acclaimed for reasons having to do with it not being, essentially, Cheers7 or whatever.

So How I Met Your Mother, especially, was extra-doomed: a three-camera sitcom with a laugh track on a network generally associated with the lowest of lowest-common-denominator television shows, it was mainly praised in its early days for not being as conventional as the shows around it.

And that was the biggest tell: the whole time it was considered for contextual reasons, rather than intrinsic reasons. How I Met Your Mother was beloved by a section of its fanbase for the same reasons that its contemporaries The Big Bang Theory and 2 Broke Girls have become huge successes: strong (if broad) characters that are thrust into strong (if not always entirely original) situations and react accordingly. It’s a sitcom formula that has led to success in just about any set of terms throughout history – at its base it’s sort of the platonic model for the sitcom. But How I Met Your Mother seemed to gain most of its enduring critical respectability from the fact that people could point to it and say “see? I don’t hate everything on CBS, I like How I Met Your Mother.” And then eventually it built up enough of a body of good work (and had a cast with considerable charisma, except for the lead, who is a wet drip) that it became difficult to ignore simply in terms of scale.

And so How I Met Your Mother’s ending was bad, but it was also unsatisfying, and so people rose up against it. But it’s only the latter condition that still matters: in the world of television, they are scarcely two different concepts.

As television continues to consist of stories that are getting critical acclaim supported by series that are popular, the former renewed at the wishes of the show’s creators, the latter renewed because of the opinions of the great unwashed, and as “serious” begins to be entirely interchangeable with “gun-punching and emotional”, television begins seeking a state where endings are not required, but rather encouraged and perhaps, occasionally, even given: television is entering the realm occupied by comics.

Comics in the eighties and nineties, spurred by Frank Miller, Alan Moore et al., went through what is colloquially called the Dark Age – heroes got grim ’n’ gritty; stories that may have been about angst or moral upstandingness became stories about grief or despair and moral relativity and, as things heightened, an operatic sense of plot and stakes and, eventually, an almost self-parodic level of blood, guts, and ridiculous testosterone-pumping plotlines. This also coincides with comics at more-or-less the height of their popularity as a medium. So it’s obvious what to make of that: the entrée into critical respectability that costs some portion of the spirit of the medium comes with an increase in people talking about it, which also comes with an increase in people paying attention to it. The best advertising is people talking about it, after all. If it’s the subject of a conversation, people remember it, and people remembering it is well more than half the battle.

Films already went through their own Dark Age: the American Auteurs, and also the European new waves8. They’ve come out the other side of it. It’s actually not an accident that the rise of the comic-book movie is exactly in line with the obsession with closure: comic books don’t have closure. They rarely end. And so film companies, seeking the perpetual motion machine of an effectively-endless franchise, are looking to properties with literal decades of history to mine, not only for plot and story ideas, but also for a guide to what does and does not work for fans. And, best of all, you don’t have to leave people with a “last” impression, only a “more recent” impression.

So what’s the takeaway from all this? Well, ideally it would be to stop putting all of the weight for your feelings about something on its ending, but that’s disingenuous. See, I hated the ending of How I Met Your Mother, and I wasn’t even a fan in the first place. And it colored the way I look at the show in hindsight (all, like, ten hours of it at the time of this writing) – if that was where it was all going, why did you waste so much of everyone’s time? And so what I have to say is something quite different, and it’s a hope, not an instruction: the dark age leads to immense popularity, which leads to a drop in popularity, which leads to a brief span of time where there is a power vacuum where interesting things – Jason Reitman, Judd Apatow, Sam Kieth, Kurt Busiek, Radiohead – can all happen. And then after that the cycle starts again.

But in restarting, the medium loses something of its identity, because now it’s in the hands of the money people instead of the creatives. So maybe let television be television. I can’t encourage you to ignore the next show about a troubled white dude who treats his wife with contempt (and expects you to as well) and is actually the villain, even though it’s almost impossible to see him that way because of how the story is told, but it would be nice if you all did. And maybe when a pretty-good sitcom delivers a wet shit sandwich of an ending, we should hate it.

Because there are things worth hating, even unimportant things. Because saying “that wasn’t good and I won’t allow it” is as honorable a response as saying “I didn’t care about that show and I still don’t.”

So, y’know, enjoy the ending of How I Met Your Mother (and, most likely, Mad Men, and, eventually thereafter, The Walking Dead and, perhaps most apocalyptically, Game of Thrones9) for two reasons: one, because it’s terrible, and therefore you get all of the satisfaction of hating it. Really get in there and hate it, too. Because the more you think about it, the more you’ll be critically considering it, and when’s the last time you did any serious critical considering? But also hate it because that ending is made possible only on a television show, and we should celebrate each medium for its idiosyncrasies.

1 Television is on my mind in this space because of a thing that you’ll get to see sometime in the near future, actually. It is also somewhat-related to endings.
2 This might be slightly less maddening if it wasn’t so hidebound by a weirdly-insistent structure: no matter how the story is paced or placed, the last part of it is split in two. The Hunger Games got the worst of this, as it may have benefitted from a fourth film, but in the form of the first book yielding two films, instead of the last.
3 not to get too bogged down in it, but ultimately I think the thing that makes the ending to How I Met Your Mother, a show that, admittedly, I’ve not seen more than a dozen or so episodes of, such a drag is how hermetically it seals up the story – it all seems like a game of chess played by fate, and people haven’t liked that kind of story in several centuries, although that kind of thing did play well in Rome.
4 bonus points if it’s – ugh – a callback. Like, say, thrusting an oddly-colored musical instrument at an erstwhile agent of SHIELD.
5 for more on this idea, come back in a week or so!
6 although, interestingly, the book series with the most challenging** endings that have been adapted to film have not been adapted to completion – I’m thinking, specifically, of A Series of Unfortunate Events, The Chronicles of Narnia and, sadly, The Prydain Chronicles.
** which does not here mean “best”: the end of anything is better than “and they all go to heaven except for the Dwarves because Jews aren’t allowed in Heaven and also Susan because she was a tramp”, even if that ending is considerably more difficult because of its strident moral stance – there is no softening to make the ending seem happier or more inclusive (a scene, say, of a makeup-free Susan the Ho in Aslan’s Heaven)
7 to be clear, I would like more shows to be like Cheers. I’m not even kidding.
8 it’s worth pointing out that these eras were followed by the blockbuster era, and the unprecedented boom times for the film industry. Seriously, I’m not paranoid enough to believe that there are people in control of these things, but if there were, the trick would be to make everything darker, more depressing, more violent.
9 it’s pretty apparent how many people are entirely unfamiliar with the capacity for disappointment of a long-running adult high fantasy series by the number of people that are in no way girded for the end of Game of Thrones***, and, indeed, the ending of A Song of Ice and Fire****
*** that’s the tv show
**** those are the booksa
a this is a footnote embedded in a footnote embedded in a footnote.