
A deplorable year, in context.

Margit Carstensen plays the embittered Petra von Kant in Fassbinder’s 1972 film of his quasi-biographical play; pictured here during her final on-film meltdown in front of her family. © 1972, New Yorker Films.

It started with cocktails.

It was November 8th—Election night, 2016. My partner and I had dinner (nachos, I think) with a cocktail on the side, to try and wash away the bitter taste of the ugly year leading up to this occasion. We caught up on some pre-recorded programs in the DVR, and switched over to PBS for the occasional play-by-play of electoral returns. Of course, it was still “too early to tell” at this point; though the smugness of certain commentators—a less-than-subtle confidence in the already projected outcome (a Democratic “landslide”)—gave me pause.

In the months preceding this night, I endeavored to raise awareness of the complex and multi-faceted significance of this election—and the devastating ramifications if the Presidential seal were to go to the most corrupt, unqualified, and inexperienced candidate ever to campaign for this office (from foreign policy, to climate policy, to basic civil rights, to corporate privileges, to tax policy, to infrastructure, to cyber-security and net neutrality…). I had cautioned my Bernie-adoring friends that the so-called “lesser of two evils” was, after all, still “less evil.” I encouraged folks to consider the pragmatic perspective that many social workers (myself included) are forced to adopt on a day-to-day basis, as a consequence of living in an imperfect world with imperfect choices: while one can rarely take an action that will result in no harm whatsoever (with the notion of “no harm” being in direct opposition to the human experience), one can gather information and critically evaluate options in order to take the path of least harm.

As I sat in front of the television, sweaty glass of booze in hand, I saw the path opening in front of our nation: suffice it to say, it was not the path of least harm.

I would like to say that, in hindsight, I responded to this awareness with a proportionate level of disappointment. If I were to be perfectly sincere, I would admit that my disappointment and anxiety skyrocketed beyond any proportion I might’ve prepared myself for, and my subsequent display of emotion was probably on par with the most exhibitionist meltdown of a character in a Fassbinder film (think Petra von Kant screaming at her family, drunk on the carpet; or Elvira recounting her history of trauma from inside a slaughterhouse). After fifteen minutes of gazing in disbelief at the incredulity of the commentators on the TV screen, I wandered off to bed in a daze, and sobbed myself through a (seemingly endless) night without sleep.

Some time after, my partner wandered up and lay next to me—our dog Sam sprawled in between us: blissfully blind to the specifics of what was happening around him, but visibly aware that something was off. He rubbed his nose against my side and I scratched behind his ear, periodically reaching for my phone and checking the electoral map for signs of hope; none were forthcoming. At a certain point, I just stopped checking—painfully aware of the heightened anxiety provoked by these micro-updates. And then, the indigestion started. And the routine visits to the toilet to try and purge the queasiness swirling around in my stomach. And the hours spent in near-delirium, staring at the ceiling and waiting for the night to end, while simultaneously dreading the thought of having to survive the night and emerge into the reality awaiting me on the other side.

I was still lying awake when I heard the clicking of a computer—my partner having woken up before me (as per usual) to check the news feed on his desktop. I counted the seconds between the first few mouse clicks and the first audible, heaving sobs; I think it took about fifteen seconds. I turned my face into a pillow and cried.

* * *

I find myself reliving this fateful day as I embark on this effort to put my experience of 2017 in some sort of context (call it self-therapy). I can’t help but feel that the answer to many questions that have arisen out of this disastrous, unsettling, and disorienting year lies somewhere in the outcome of that night—and the collective reaction to an action taken by the smallest margin of our population ever to select a (proposed) “leader of the free world.” In the months immediately following the election, I was one of many whose engagement with social media intensified; and while I cannot attest to the motives of others, I will readily concede that my personal engagement was driven by a heightened awareness of the unprecedented impact social media had exerted throughout the course of the election. In reading the near-unbelievable, beyond-dystopian tale of Cambridge Analytica, and the well-documented strategies implemented by several shady figures in favor of a global right-wing coup, it became quite evident to me that we stood on the threshold of a deeper abyss than was projected by the most dour catastrophist during the election itself. I felt a compulsion to be more outspoken than I had been before (since, evidently, reserved circumspection, blind faith in objectivity, and trust in the collective conscience of mankind had not yielded any favorable results). Looking back over some of the insights and commentary I shared publicly via social media at the start of the year, I regret none of what I wrote—but I can now recognize the general insignificance of my commentary with a greater degree of intellectual clarity.

This isn’t to say I’ve adopted a defeatist perspective. Today, I can sincerely claim (give or take a little) the same level of investment in the plight of humankind as I claimed last November; and the year before that, and so forth. But as our global village (if McLuhan’s term can even be fairly applied to our present-day climate) advances towards ever-increasing levels of chaos, I’ve become painfully aware of how ineffectual and, in many cases, outright detrimental this twenty-first century drive to provide running commentary on the human experience has been to achieving any sort of actual progress. Retrospectively, in fact, one can trace the most recent phase of devolution (and devaluation) of the human species through a comprehensive anthology of our president’s impetuous Tweets—accompanied by the often-comparably impetuous retorts of commentators across the globe. If one were inclined to place these exchanges in context and illuminate the bigger picture for those in need of perspective, one could print this anthology of Tweets and comments and hang it on a wall in a museum; opposite this display, one could hang a display of climate data, pictures of the refugee crisis, profiles of newly-appointed right-wing judiciary representatives, annual hate crime statistics, research on hereditary trauma, world poverty statistics, annual gun violence statistics, opioid overdose statistics, and current nuclear arsenal statistics (with illustrations). The viewer of such an exhibit should be capable of drawing their own conclusions.

Suffice it to say, very little social progress has been achieved during the past year. One could go so far as to argue that we have taken such an enormous step back in our social evolution—that the trajectory of social progress has been scrambled to such an extent—that we have to redefine the very idea of social progress. For example: prior to the election, one could generally accept that, regardless of one’s economic status or party affiliation, sexual assault was a deplorable action. But something changed, somewhere along the course of the 2016 campaign trail. If one were to examine the Republican party’s response to the excavation of that infamous Access Hollywood tape, and compare it to the party’s response to revelations that then-candidate Bill Clinton had engaged in an extended affair years before the 1992 election, one would have to conclude that the Right has either lowered its standards for outrage, or only complains when its majority is on the line. In addition to this, we find the emergence of a new Right-wing chorus (which would go on to be adopted by many a libertarian, third-party voter, and Democrat as well): the now-familiar refrain of “fake news”: a magic potion for alleviating the symptoms of cognitive dissonance.

During the historic 1992 campaign, it didn’t take long for the revelation of candidate Bill Clinton’s 12-year affair with Gennifer Flowers to become a partisan weapon wielded by the Bush campaign to cast moral aspersions on his Democratic opponent.

In 1992, voters of all stripes wrestled with both the knowledge of Clinton’s affairs and an awareness that this information might be manipulated for partisan gain; in 2016, there appeared to be little-to-no wrestling at all. Polls at the time indicated that, by and large, 45’s base was actually strengthened by the revelation of the tape: casting the objective information of the tape aside, many of 45’s supporters voiced an opinion that their only concern lay with how this information might be skewed for partisan gain—and not with the implications of the information itself. In other words, the information of our then-candidate’s predatory behavior (in combination with all the other evidence accrued to support the case for his predatory business practices) was as good as irrelevant. And so began the trend of alternative facts, and the convenience of being able to reject information that conflicts with one’s pre-existing belief pattern by merely denying its existence. Viewed along the action-reaction continuum, “fake news” was both a reaction to the leftist obsession with investigative journalism, and a positive action on its own terms (using “positive” in the Skinnerian sense). For by achieving an unspoken consensus among themselves—that information adverse to the advancement of one’s own political goals cannot (and should not) be bothered with in the first place—45’s supporters have succeeded in establishing a level of intellectual disengagement not seen at any other point during the nation’s past century of political discourse.

If we now consider this new right-wing action (“just say ‘fake news’ whenever anything upsets you”), we must consider the subsequent leftist reaction (hyper-dramatically present the severity of upsetting developments, in an attempt to appeal to the emotional-spiritual side of right-wing fact-deniers). The leftist reaction can be seen throughout any number of impassioned Facebook and Twitter rants: that (somewhat-to-absolute) self-righteous outpouring of hysteria and concern, presented with all the pathos and drama of an argument in some generic TV courtroom drama. This brand of emotional reactivity has been, in some cases, strategically channeled to advance social issues (as in, most recently, Tarana Burke’s powerful #MeToo movement); on the flip side, the catharsis of social media engagement presents a stumbling block for individuals who have no conception of follow-through. For instance, the fanaticism of Howard Beale (Peter Finch) in Paddy Chayefsky’s Network (1976), which I see frequently shared (in the form of the “I’m as mad as hell” excerpt) by peers and acquaintances on social media, offers a prescient insight into the risks associated with commercializing outrage—though I fear some folks take the bit out of context and fail to apprehend the way it all falls apart.

In Sidney Lumet’s film of Chayefsky’s acclaimed script, Peter Finch convincingly plays a neurotic newsman who “flips his wig” after being let go from his job, and takes to the air to announce his own on-air suicide in advance. Instead of delivering on his promise, he launches into a sermon about how the world is going to shit, then beckons his viewers to run to their windows and yell into the streets with him: “I’m as mad as hell, and I’m not going to take it anymore!” His viewers comply, and the station executives hear of a boost in ratings: they investigate the situation further, and realize there’s big money to be made selling outrage to the deadened masses. In a relatively short period of time, the mentally unstable Howard Beale is asked to front his own television variety show—featuring his now-trademark impassioned rant as a sort of nightly act. Beale displays some resistance early in production, but by the end of the movie he has been brainwashed to the point of being putty in the station’s hands: a once genuine expression of repressed jouissance has become a weird sort of household name, and the television executives profiting from his mental health condition wind up having him killed when ratings eventually decline.

In Network, Peter Finch plays a hysterical newsman (named Howard Beale) whose psychosis is co-opted by his employers at the network for a spike in ratings. Once the act grows old and ratings decline, Beale is bumped off by his executives, who will continue accruing royalties from his downfall. © 1976, MGM Pictures.

While extreme and grotesque in its scope, and rendered for largely satirical purposes, Chayefsky’s work does seem to offer a cautionary tale for our time. To avoid becoming as pathologically shortsighted as Howard Beale, one must always ask oneself, when contemplating such catharsis: What purpose could this possibly serve? What’s the intended follow-up plan for one’s outrage—or is there one? Is it possible that one is just yelling words into a digitized vacuum, which then captures one’s words and capitalizes upon them, selling them off as part of a metadata package? Is this essay going to become just one more yell into the vacuum? One hopes not, but one never knows.

Since human beings have still failed to learn the lesson the universe endeavored so painfully to instill in us throughout last year’s election (the lesson: social media activity will not fix most things, or even anything; but it can readily make things worse if given the opportunity), we’ve apparently doubled down, and now find ourselves caught in the middle of a surreal and bizarre game of “who has the most sexual predators in their camp?” (As far as what this game is intended to prove or resolve, I suppose anyone’s guess is as good as the next person’s.) One by one—day by day—famous celebrities and political pundits continue to drop like flies in the ointment of this “to catch a predator” show; inappropriately enough, this surreal game has been (and continues to be) overseen by the predator who set this chain in motion last Fall (don’t worry: he’s not going anywhere anytime soon).

In keeping with the chosen leftist reaction to overstate one’s passion for a given issue—in a vain effort to “wake the deadbeats from their slumber”—we now find human folly achieving its highest (or lowest?) levels of stupidity. For starters, we have the borderline-comical leftist insistence on the morally “wrong” connotation of sexual assault: as if by insisting strongly enough, those who believe otherwise might instantaneously be converted. Furthermore, this juvenile proclivity for moral sermonizing has embedded itself as a point of division among proponents of liberal policy. Just as the more die-hard idealists who upheld the “purity” of Bernie Sanders against the “corruption” of Hillary Clinton drove a wedge into the otherwise-united front of liberal voters (aided and abetted by the Russian trolls who targeted third-party and Bernie supporters with strategically placed news stories to reinforce their disdain for Hillary), we now have liberal idealists thinning their own herd (yet again) by singling out anyone who fails to fall in line with the outspoken chants leveled against perpetrators of sexual assault.

I recently stumbled upon an article which provides a textbook illustration of the infantile thought process underlying this leftist penchant for “out-idealist-ing” one another. In an online Stereogum/Spin magazine article (filed under the “News” heading), a writer named Peter Helman takes issue with comments and views put forth by the ever-divisive Steven Morrissey in a recent Der Spiegel interview (yet again, I find myself stumbling upon the commentary before the news itself; which, in and of itself, isn’t news). Here’s a verbatim transcript of the opening paragraph, as printed in the article (whose writer acknowledges openly that he did not bother to pursue a proper translation of the interview, and relied upon Google Translate as arbiter of the interviewee’s meaning):

“Hey look, Morrissey said a stupid thing! It’s been a while since Moz has said something truly objectionable and not just, like, ‘Oh, Morrissey is kind of an asshole.’ But now, in an interview with the German news outlet Spiegel Online on the heels of his new solo album Low In High School, he’s come through with some genuinely terrible opinions.”

First, we find the distinctly liberal cocktail of snark and finger-wagging writ large in the opening statement: before we are even offered a glimpse at the musician’s controversial comments (let alone the chance to remind oneself, as hopefully all reasonable and grown adults do in such instances: “what do I care what some music journalist thinks of what some musician thinks of some matter with which he has no direct affiliation?”), we are instructed (seeing as how the reader cannot possibly be intelligent enough to reach their own conclusion) that the comments are objectively “stupid.” Then, we have the reinforcement of this admonishment, coupled with an insistence that one ought to consider these “stupid” statements even more offensive than the last thing the writer admonished the musician for. Then, as if the message had not yet been clearly conveyed (after all, we’re dealing with a reading audience that cannot be trusted with their own thoughts), the writer insists that this latest interview with the Moz reveals “some genuinely terrible opinions.” (Be still, my fluttering outrage odometer!)

I’m disinclined to even bother with an analysis of the article (let alone the comparably over-indignant commentary of those who shared the “story” on social media; except maybe Shirley Manson, who brought up a valid point in suggesting that Morrissey appeared to not have the latest updates on the “plot[s]” of Spacey and Weinstein), but I nevertheless feel compelled to provide some sort of a corrective to the borderline-toxic preachiness of these self-appointed messiahs of moral indignation. Not that Morrissey’s views, as quoted here, are even that noteworthy or idiosyncratic: if anything, they seem to echo the contrarian tone of similarly uneventful remarks delivered by Johnny Rotten earlier this year. But whereas Rotten and Morrissey are merely doing what they’ve been doing all along in their respective careers (namely, being abrasively provocative), Helman’s heavy-handed critique—along with any analysis bearing the imprint of such thoughtless indignation—inflicts the greatest damage of all on the integrity of an intelligent dialogue: for not only does it inherently reject the reader’s intelligence (something that neither Rotten nor the Moz, bluster aside, would ever dare try), it functions primarily as the byproduct of a profit-driven online press: a press which now feeds vampirically on the outrage of the web-surfing public, frequently leaning on the crutch of self-righteous indignation as a shortcut to increase clicks and shares. (Hm… that sounds familiar.)

And since “writers” (at least, the successful ones; the ones whose bread-and-butter is outrage-tinted click-bait) save the most upsetting/eyebrow-raising/scintillating bits for last (in order to maximize the advertisement space between the reader’s first click on the article and the long scroll to its disappointing finish), there must be some build-up to the exhibit of [insert celebrity’s name]’s horrifying remarks. Like an 18th-century freak show, in which true horror would have to be instilled in the imagination of the visitor, before being deflated by the banality of the exhibit itself. (Sure enough, cries of “shame!” and “how dare he?” were heaped upon the Moz within minutes of the article’s posting; after all, what’s one more pariah on the fire…) In keeping with every other un-news-worthy observation shared by Morrissey in an interview, a scandalous viewpoint has been tried and condemned for failing to align with the prevalent vernacular and perspective of the times, and persona non grata status has been duly granted to the offending party. From what we know about the artist in question, one ought to suspect this is what he wanted all along, anyway: win-win (I guess?).

Steven Morrissey’s 30+ year career has been consistently marked by stylized, overly dramatic outbursts, coupled with the artist’s vegan activism and often reactionary views. As a (by)product of the British punk era, Morrissey is to many a poster-boy for resisting conformity. Also renowned as a legendary pain in the arse.

My point here isn’t that Morrissey’s statements should be defended: he’s a grown man and should take ownership of whatever nonsense and/or half-sense pours out of his twisted mouth. Rather, my point is to ask: What purpose could this possibly serve? And moreover: What does all this exhibitionistic “journalism” imply about the state of social commentary? Have we truly devolved to the point that an individual needs to preface any commentary on the subject of sexual abuse (and the inherently complex psychology of victims and perpetrators) with an assertion that one does, in fact, disapprove of sexual abuse and predatory behavior? Are there popular articles out there that I’m not seeing, in which individuals go on record saying that they condone sexual abuse, and wish there was more of it? And if so, is the tone of such deplorable articles so indistinguishable from the tone of a level-headed writer’s that level-headed writers need fear their audience suspecting they might, in fact, be pro-sexual abuse? And if so, wouldn’t the abuser-shamers serve their purported mission more capably by tracking down those pro-abuse folks and chastising them? Regardless of the answers to any of these questions, nothing remotely edifying can come of such conversations if we cannot bring ourselves to respect (read: allow) the judgment and intellect of our reading audience—sans these forceful and belittling cues to trigger our moral outrage.

Which brings me back to the actual problem at hand, and the elephant in the room that remains perpetually sheltered from the storm of allegations swirling around him: the President of the United States. For unlike Morrissey (or Johnny Rotten), our president has made it clear time and again that he is pro-sexual abuse, and despite the skepticism of his supporters (who feared that their boy’s well-documented predatory behavior might be wielded by leftist commentators for partisan gain), he has displayed no compunction about turning allegations of abuse into political weapons—so long, of course, as the allegations are directed at individuals outside the Republican umbrella. Which renders it all the messier when individuals on the left allow themselves to get caught up in the hurricane of abuser-shaming (often with noble intentions, at least at the start), since this is exactly what the most powerful person in the country has been relying upon this entire year to advance a truly abusive agenda—not least of all, through his success in appointing an entire slate of unnerving judicial assignments: out-of-touch bigots and bloggers; unqualified lunatics who will shape our country’s legislation for decades following the inevitable demise of this administration. All the while, his White House continues to ignore and deny the allegations of 16 women who have confronted the public with their abuse stories, and the President remains… the President. As of this writing, there have been no formal inquiries proposed in Congress to investigate and pursue these claims further.

I suppose I should feel compelled here to state my own disavowal of sexual abuse, and to verbalize my support for the victims who have come forth with their alternately harrowing and unnerving stories. I’ve chosen to refrain from offering any commentary on the subject up until this writing for a combination of reasons; mainly, as someone (and more specifically, as a white man) who has not suffered sexual abuse firsthand, I feel it isn’t really my place to remark on a subject so close to others, yet so distant from my own lived experience. I’ve found that, in such cases, it’s best to just shut up and listen to those who know what they’re talking about.

* * *

The title of this essay is taken from a track on this year’s Sun Kil Moon/Jesu collaboration, 30 Seconds to the Decline of Planet Earth. The song takes as its subject the child abuse scandals that haunted Michael Jackson to his early grave: it appears to have been inspired by a conversation on a plane between the song’s writer (Mark Kozelek) and a young woman traveling to Greece to perform in a musical of Michael Jackson’s life. The song paraphrases a conversation between the two, in which Kozelek asserts (rather firmly) that the world is, undoubtedly, a better place without a pedophile R’n’B star living in it. At first listen, the lyrics to the song are more-than-slightly jarring: casual listeners might be inclined to interpret this perspective as the actual opinion of songwriter Mark Kozelek, whereas those who’ve spent time with Kozelek’s other recordings may recognize the sound of his (often) darkly satirical social commentary.

I can’t say for certain whether the lyrics to “He’s Bad” come from a place of sincere commentary or social satire, but I find it difficult to accept the former interpretation. In fact, the perspective of the song’s narrator is often so wince-inducing in its generalizations, one can only make sense of it when read in quotation marks:

“Is the latest on him true?
Well I don’t fuckin’ know
But if I had a son, would I let him get into a car with Michael Jackson?
Fuck no
I’m sorry for the bad things that his father did to him
But it doesn’t add up to building a Willie Wonka trap for kids
And changin’ the color of your God given skin
He made creepy videos that the popular kids liked back in the eighties
And once over a balcony he dangled a baby
And did the moon walk
And talked like a 9 year old girl
I don’t give a flying fuck what he meant to the mainstream world
Roman Polanski went down in flames and was incarcerated
But this young little kid addict will forever be celebrated
A hundred plastic surgeries and paid two hundred million to shut people up
Took someone’s child like it was nobody’s business and dragged him around on a tour bus

He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s dead and to me it ain’t that fuckin’ sad”

The song has stuck with me all throughout the ups and downs of 2017 (and it was, for the most part, a year of downs). A friend of mine, who suggested I check out the record, cautioned me in advance about the song’s “cringe-worthy” quality; at first listen, I shared in his assessment. But upon further listens, a space opened up in the longer instrumental stretches of the track, and I found myself strangely drawn to it. Presently, I find it to be a brilliant piece of songwriting—perhaps even more so, if these are, in fact, Kozelek’s verbatim opinions. The song capably highlights a common trend of generalization and oversimplification among present-day liberal pundits: one might as well call it the “make sure the baby goes out with the bathwater” syndrome. Because it’s easy (and more precisely, facile) to take a step back from the strange and unsettling case of Michael Jackson, and surmise that he was nothing more than a sick man who preyed on children—that consequently, the world is better off with him dead than alive, and he might as well have gone sooner. But had he never lived, this song (a highlight from the record, I think) would not exist: not just its lyrics, but its arrangement, structure, arpeggiation… all of which pay tribute to the late “King of Pop.” Which raises the question: Is it right for one human to judge the life of another and determine they ought not to exist—or have existed at all? It’s the same question that underlies the debates surrounding the death penalty, war, and abortion. Taken at face value, the perspective of Kozelek’s song sides with the affirmative answer to this question. But interpreted satirically, the question remains open-ended. Unlike the above-mentioned Stereogum article, Kozelek’s song actually gives the listener space to think for themselves and reach their own conclusions; so that, even if these are the songwriter’s dyed-in-the-wool beliefs, we don’t feel pressured into adopting them as our own (or, conversely, into rejecting them outright).

Michael Jackson was the subject of great public scrutiny throughout his short and strange life, which provides the subject for the recent Sun Kil Moon / Jesu track, “He’s Bad” (from 30 Seconds to the Decline of Planet Earth, available from Caldo Verde Records).

I refuse here to entertain the idiotic question that somehow goes on being debated in certain circles: can bad people make good art? (I will, however, quickly dissect the idiocy inherent to the question’s phrasing: first, there is no such thing as “good” people or “bad” people; and second, what do you think?) However, I do find it noteworthy that a lot of angst appears forthcoming in the public response to revelations that Louis CK, Charlie Rose, and Kevin Spacey—celebrities that, unlike the blowhards who preceded them (Roger Ailes, Bill O’Reilly, and Harvey Weinstein), were somewhat well-liked—have lived messy lives and done some deplorable things. Each time a new pariah gets added to the fire (all the while, the President shakes hands with Duterte on a visit to the Philippines, and swaths of Puerto Rico remain powerless), I’m reminded of the excellent documentary Happy Valley, directed by Amir Bar-Lev and released in 2014—a few years after the explosive child abuse scandal involving the once-respected Penn State coaches, Joe Paterno and Jerry Sandusky. In his film, Bar-Lev explores the American (and outright human) proclivity for placing a fallible person on a pedestal, and then shaking one’s head in disbelief when fallibility rears its ugly head. As I watched the reactions pour in on social media (friends who initially felt inclined to defend their heroes against the allegations, and as soon as proof was provided, reluctantly gave in to the evidence, tossing the baby out the window), I had a flashback to the confused street riots that followed the Sandusky trial—in which the townsfolk of Happy Valley alternately mourn and celebrate the dismantling of a statue once proudly erected to Joe Paterno (who was never charged with perpetrating abuse, but was found to have enabled Sandusky’s behavior after being informed of it).

The psychology of the townsfolk, which is smartly and respectfully explored by Bar-Lev in his documentary, can easily be transposed to the public psychology surrounding this irrational debate unfolding on our national stage; the key question in the debate is: Where do we store our dismantled idols? (As opposed to the far more proactive question, which everyone seems too afraid to pose: Why are we obsessed with erecting idols in the first place?) For some (and most specifically, for those employed in talk news) the answer to this question is “straight to hell.” Never in my adult life have I seen such a rabid drive—propelled primarily by pundits who appear to take more than a little schadenfreude in the exposure of (yet another) sexual predator—to excommunicate individuals from their professions (before their employers have even had a chance to evaluate each situation and weigh in on the matter; a scary social precedent, to be sure), and hold their mock-trial in the court of social media (for an especially disturbing case-in-point, see the response of so-called “progressives” to the revelation—forecast eerily by a Roger Stone tweet—that Al Franken did some things in poor taste on his USO tours).

happyvalley-4

A resident of Happy Valley, featured in Amir Bar-Lev’s 2014 film of the same name, takes a stand beside the statue erected to honor Joe Paterno’s heroic status in the community. His sign reads: “Paterno, the coverup artist !! Paterno, the liar !! Paterno, the pedophile enabler !!” © 2014, Music Box Films.

On the one hand, this sort of “cleaning house” could be defended as a corrective to the long-delayed response of Fox News executives to the litany of allegations leveled against Roger Ailes and Bill O’Reilly (et al.); on the other, the zealousness of this drive appears to be conspicuously overlooking the most powerful perpetrator in the room. And if we are not making an effort to prioritize the perpetrators who wield (and exploit) the largest amount of power, but gladly chase after those with relatively little power (wielding the digital equivalent of pitchforks and torches), then the only takeaway from this discussion is that humans are increasingly oblivious to their own addiction to the mechanisms of an exploitative society—to the extent that we routinely take private pleasure in exploiting other exploiters, all the while denying our own role in the circle of exploitation.

At the end of the day, it is this recognition of the multi-faceted power differential involved that appears to be the biggest stumbling block for most folks to navigate. Setting aside the heavy-handedness and gross over-simplification of Morrissey’s “controversial” remarks, he does appear to be clumsily pointing to a taboo truth that some people refuse to acknowledge: that in some (not all) instances, the prospective “victim” in the abuser-abused equation will find a way to subvert the power differential for their own gain. And before the reader reaches for their pitchfork, allow me to clarify that I am speaking of these matters in the specific context of exploitative behavior within an exploitative society (one cannot ignore the reality that individuals make desperate decisions under desperate circumstances). One thinks of the psychologically sound Lolita twist, for instance—in which an older man preys on a “nymphet,” only to find her turning the tables and exploiting his inappropriate adulation to achieve independence. Which isn’t to say that Nabokov lets Humbert Humbert—or his predatory leanings—off the hook; rather, he accepts the obvious element(s) in this equation, before pointing to the more taboo reality lived by more than just a few individuals in this power-driven society: a reality in which the exploited learns how to exploit, for lack of other identifiable options. (In her best-selling memoir, Reading Lolita in Tehran, author Azar Nafisi provides a feminist interpretation of Nabokov’s text—which she read as a metaphor for the often oppressive experience of life in the Islamic Republic of Iran; further highlighting the ongoing significance of discovering fresh social commentary in old texts.)

Alas, such nuanced observations can never see the light of day in our current debate surrounding sexual abuse (or any other issue, for that matter), seeing as how the very idea of truth has already been co-opted and distorted by opportunistic sycophants and sociopaths at Fox News (and elsewhere)—who continue making a killing, selling dumbed-down distortions and good old-fashioned lies as substitutes for insightful commentary. And on the other side of the political fence, a vehement denial of nuance in sex politics frequently appears as a thin disguise for some vaguely misogynistic impulse to deny the emotional, behavioral, and psychological complexity of the feminine experience (for it’s easier to brand every woman a victim for life, even after she has made peace with her offenders and politely invited her defenders to piss off). Rather than confront the messy psychology and uncomfortable truths inherent to the dynamic between victims and perpetrators of sexual abuse, talk news pundits (few of whom can be considered experts in social psychology) have apparently reached a consensus that the best way to talk about the issue is to: not actually talk about it; shame the perpetrators until they’re out of a job; and run interviews of victims forced to describe (in gritty detail) their abuse to the flashing lights and rolling cameras; over, and over, and over again… Similar to the way most pundits talk about gun violence (except that no one has actually lost their job for enabling mass gun violence through aggressive gun lobbying; not that I’m aware of, at least).

And again.
What purpose could this possibly serve?

What strikes me most about Kozelek’s song (which inspired, at least in part, this meandering diatribe) is how it, whether intentionally or unintentionally, highlights the banality of its own perspective. Every time I hear the song, I think to myself, “I would never go out of my way to listen to a song that presents such a perspective with utmost sincerity”: it would be like taking Randy Newman’s “Rednecks” at face value. A song that feels so stiltedly obliged to assert moral autonomy, while somewhat sadistically proposing a recommendation of death to criminal offenders… What purpose could this possibly serve? One hopes, the purpose of irony. For in making the listener consider words and deeds of such strict moral outrage—in confronting us with our own respective failures to accept some amount of gray in our black-and-white lives—one might then feel a little wiser, and consider the possibility of something else. Not unlike in the films of Fassbinder, who strove time and again to show the audience the need for change, while never spelling out what that change ought to be (after all, shouldn’t we be smart enough to figure it out on our own?)

In an essay from a book I’ve had my nose in lately, detailing the merits of RWF’s 1971 film masterpiece The Merchant of Four Seasons, author and former acquaintance Christian Braad Thomsen observes:

“Fassbinder […] shows the necessity of vigorous action on the part of the viewer. But he’s not a school teacher, who wants to raise his finger and tell the audience what they have to do, if they want to change the world. He is the Socratic artist, who uncovers how the existing possibilities of living have failed, and points out that change is necessary.”

Throughout his prolific and multi-faceted career, Fassbinder sought to demonstrate the mechanisms of his inherently flawed and power-driven society, clearly enough for any viewer—regardless of their education, intelligence, or station in life—to understand the mechanism and, in turn, recognize the need to rise above it. Three years after releasing The Merchant of Four Seasons, Fassbinder carried his vision of societal deconstruction to an even more poetic and empowered level with Ali: Fear Eats the Soul. In an interview given after the film’s release, Fassbinder observed that “I tend to think that if […] depressing circumstances are only reproduced in film, it simply strengthens them. Consequently, the dominant conditions should be presented with such transparency that one understands they can be overcome.” Rather than taking a sadistic pleasure in portraying the misery of those too enslaved by a social mechanism to recognize how breakable their chains might be, Fassbinder sought to show love for these strange creatures called “humans,” by perpetually revealing the existence of the chains—and the absence of a wizard behind the curtain. In film after film and play after play (and without any undue condescension or simplification), he succeeded in demonstrating that all individuals (regardless of race, gender, sexual orientation, age, or politics) are capable of breaking the chains of exploitation, if they only choose to live a pure existence—predicated upon the inherent human values that society continually distorts (by claiming them as its own, and often assigning them a capital/nominal value).

Put plainly: one cannot go on playing unelected judge to man’s folly while resigning oneself to a vacant culture of reactionary outrage. That would entail rejecting the possibility of finding a different way to live, and of demonstrating, by example, an alternative to such folly. (And if one rejects the need for an alternative to folly, one is simply a fool.)

201714438_1_IMG_FIX_700x700

In his five-part TV mini-series, Eight Hours Don’t Make a Day, Rainer Werner Fassbinder paints one of his most positive portrayals of people trapped inside a social mechanism they yearn to break free from. © 1972-1973, Westdeutscher Rundfunk. Renewed 2017, Arrow Home Video.

I’ve returned to Fassbinder many times over the past years, and have always left with an uncanny sense of premonitory relevance and an inspired momentum. A DVD set of the recently rediscovered (and beautifully restored) TV miniseries, Eight Hours Don’t Make a Day, has been a close companion these past few months; I’ve rarely seen a film so genuinely positive in its outlook, let alone one of his. His characters are shown (as usual) to be moving parts in a social machine, but all the parts in this machine move beautifully—and each on its own terms. Instead of falling back on jargon, party slogans, or naïve Marxist sentiment, Fassbinder shows characters from all throughout the power structure as somehow genuine and, just as importantly, capable of empathy (and change). He shows that action will forever speak louder than the most eloquent words—while simultaneously revealing how words and images can be employed to further the awareness of a need for action. Not just social (read: collective) action, but individual action. Presently, fleeting social movements (via trends, hashtags, and viral videos) demand widespread attention, and the individual is trapped between a biological drive toward self-actualization and a socially conditioned response to ignore or reject this drive: to follow the horde or disappear. Hence, the “individual” is celebrated, but only on the terms of the individual’s bond with society; if the individual does anything to sever this bond, they essentially (and in certain cases, actually) cease to exist.

Indeed, it would seem as though excommunication has been the only fear to consistently unify individuals, in societies across the world—and throughout the ages. The higher the threat of expulsion, the greater the anxiety in one’s life; the greater the anxiety in one’s life, the greater the relish in the expulsion of another. (Following this train of thought, one shudders to think of all the threats and anxieties our current President must have accrued in his lifetime.) I find it especially concerning that so many straight, white, and self-proclaimed “feminist” men appear to be foaming at the mouth to call out anyone who fails to speak the programmatic lingo they’ve conditioned themselves to communicate with; one wonders if some (or perhaps many) of these individuals might be protesting so loudly for fear of having their own past improprieties exposed. Either way, I imagine a casual time traveler would have a hard time distinguishing our media’s contemporary treatment of sexual abuse scandals from the House Un-American Activities Committee’s treatment of the “Red Scare”: so many people eager to see the lives of others demolished, for fear of being next in line…

In another section of Thomsen’s illuminating text (entitled Fassbinder: The Life and Work of a Provocative Genius; quotes taken from the English text, translated by Martin Chalmers), analyzing Fassbinder’s explosively (and justifiably) controversial play, Garbage, the City, and Death, the author reveals a theme parallel to the circle of exploitation: the corresponding closed circle of oppression—escape from which is a far more complicated matter. Thomsen writes:

“Throughout Fassbinder’s work we see the oppressed assuming the norms of the oppressors, whether out of a conscious need for revenge or whether they have more or less unconsciously internalized the dominant norms. Fassbinder never has ‘pure’ heroes. Rather, he demonstrates one of the most melancholy consequences of oppression, that the damage to the souls of the victims makes them unable to find alternative norms, so that the only possibility left to them is to recapitulate the norms that have led to their oppression.”

While it sometimes feels as though Fassbinder has artistically resigned himself to a closed box of self-fulfilling prophecies, his work continually reminds the reader/viewer of the complexity of human behavior; a phenomenon which, after decades of misrepresentation (or reductive representation), we appear to have grown somewhat culturally blind to. (So we keep building new idols, only to tear them down after human behavior—yet again—reveals its darker potential.)

19387416_303

In his 1981 feature film of Lili Marleen (pictured above: a resplendent Hanna Schygulla, in the titular role), Fassbinder reveals the cyclical nature of exploitation and oppression through the story of a Weimar cabaret singer in love with a Jewish man, at the start of World War II. As its despairing plot progresses against the oppressive backdrop of the Nazi regime, all of the protagonists catch themselves in the act of being exploited and exploiting others, to survive and to pursue the faint possibility of self-actualization in desperate times.

As I continue to revisit Kozelek’s song—from one month to the next, and one criminal celebrity exposé to another—I’ve decided that I don’t ever want to catch myself reveling in the demise of another human being. Those words, “He’s bad/He’s dead/and I’m glad,” ring hauntingly hollow; they don’t feel genuine… As a stand-alone thought, without corrective, they feel like a disservice to the vastly complex potential of our human nature. And yet, one is so very often stumped, when confronted with the death of someone who did truly terrible things.

This morning, the headlines read: “Charles Manson Dead at 83.” Later in the day, upon arriving at my office, I was made aware of a death in the immediate family of one of my co-workers. I felt (and still feel) a profound sadness for my friends and their family. I thought momentarily of Leslie Van Houten, and of the families of Manson’s victims, but apart from that I could muster little in the way of an emotional response to Manson’s death. The words of Kozelek’s song ran through my head again, and they rang false again; gladness is an emotion, and I couldn’t bring myself to fit emotion into the equation of Manson’s death. A custodian at the office made small talk with me about the news while changing trash liners, observing: “Someone who did so many awful murders… If I’d had my way, he would’ve been taken out back and put down. Saved the taxpayers some money.” I acknowledged his observation, and respected his right to view the situation in such plain terms; I clarified that the death penalty was (rather controversially) suspended in California around the time of Manson’s sentencing. After our brief exchange, and upon considering Kozelek’s song and the deaths of other infamous criminals (and criminal artists) throughout history, I decided to follow my gut. After all, he was somebody’s child, and somebody loved him. Gladness seems glib, even in the plainest of contexts.

* * *

Where does all this leave us?
And what are we to do with the pieces we have left?

In answer to the first question: we are left alive and awake in the United States of America. We have a Constitution that has up until now guaranteed a fairly open space for independent speech and individual commentary. We have great books written by great minds; illuminating films by directors who see (or at least saw) the potential for the medium to show the possibility of an alternative, and the accompanying need for change. Beautiful records by our favorite musicians; museums and galleries full of artwork to expand our horizons (unlike the talk shows, reality shows, and click-driven online journals that rely upon the shrinking horizons of their viewership, in order to sustain their traffic and ratings). Blank paper, canvas, web outlets (free while they last), on which we can project our visions of an alternative and our private and collective need to change.

alphaville

Anna Karina in Jean-Luc Godard’s dystopian masterpiece, Alphaville: channeling Alfred Hitchcock; channeled later by Ridley Scott. © 1965, Athos Films.

As for what we’re supposed to do with all this… well, that’s up to us: individually and collectively. I’m in no position to outline the course for an entire nation of people—each with their own individual views, ideas, convictions, and motives—but I do think we would be better off trying to learn something from the trials and tribulations we’re living through, rather than just repeating these tired tropes of scapegoating, public shaming, and language-policing (among other forms of dictatorial social conduct). As many of us grimly predicted when the results of last year’s election rolled in, this year has been a nightmare on many fronts; and short of an organized revolt or an awakening of Republican consciousness, we’ll have to endure at least one more year of the nightmare. By continuing on the course our society has traversed this past year (a course that has both recycled traditional socio-economic exploitation tropes and invented new ones, thanks to the willingness of millions to surrender their thoughts, ideas, photos, and identities to metadata collectors—free of charge—to be exploited by the highest online bidder), we are sentencing ourselves to an increasingly dystopian future in which individual thought verges on extinction, civil liberties become novelties, sex is reduced to a formal contract, and humor is no longer recognized. Like Godard’s Alphaville, only far less cool to look at.

As far as my own experience of 2017 is concerned, I like to believe that I’m leaving this year older and more tired, but wiser as well; less quick to jump to conclusions, more open to the ambiguity of life and the possibilities for change. I advance into the wilderness of a new year with the knowledge that no socially-imposed chain of exploitation can hijack my freedom to think and act in accordance with a greater wisdom. Unless, of course, I grant this chain the power to do so.

“Even Richard Nixon has got soul.”
– Neil Young
(from the 1977 song, “Campaigner,” recently reissued on his Hitchhiker LP)


An appreciation of the 12th annual Dayton LGBT Film Festival

It was a beautiful mid-October weekend in Southern Ohio, and a modest-but-dedicated crowd of midwesterners congregated in the lobby of Dayton’s Neon Movies for its annual LGBT Film Festival. Over the course of the weekend, a total of seven feature-length films and ten shorts would be screened for the festival’s attendees (Yours Truly made it to five of the features and nine of the shorts). The films ranged in subject matter: from high school rom-com, to maudlin countryside English drama, to a documentary about the world’s most renowned drag ballet troupe, to a family portrait set in a small Alaskan town. Collectively, the films seemed (to this viewer, at least) to represent the best and, on one or two occasions, the worst of LGBTQ culture in the 21st century. Which is a testament to the quality of the festival and its selection process: for the dregs only make the gems pop that much more; and as in every year prior, there were far more gems than dregs.

freakshow

Alex Lawther plays Billy Bloom: the frustrated (and frequently, frustrating) protagonist of Trudie Styler’s Freak Show. © 2017, IFC Films.

The festival opened on Friday the 13th with Trudie Styler’s independently produced teen comedy-drama, Freak Show. Wanting to move on and discuss some of the more worthwhile features showcased during the festival, I am tempted to fall back on the old adage “the less said about it, the better.” But of the few disappointing features this writer endured over the weekend, Freak Show actually presents a substantial number of worthwhile talking points. Sadly, the finished film appears mostly oblivious to its own potential; and when the filmmakers seize upon the opportunity to say something of substance in the picture, they either lack the vocabulary to communicate it effectively, or forfeit the opportunity altogether in order to fall back on easy clichés and grossly oversimplified (not to mention divisive) rhetoric. In fact, it is more than likely that anyone with anti-LGBT inclinations would find their fears not only reinforced, but emboldened by the film’s misguided perspective.

For starters, it is impossible to read Freak Show as anything but a direct descendant of the prolific American television entrepreneur Ryan Murphy—and more specifically, the zeitgeist-defining Glee franchise on Fox television. From the outset, Styler makes her stylistic template all-too-clear: from the upscale school environment, to the character (stereo)types (the hunky-jock-with-a-heart-of-gold; the Christian goody-two-shoes cheerleader; the loud-and-proud queer kids) to the generic, broadly stylized photography and editing, Freak Show lives and breathes the DNA of the cultural harbinger that preceded it. With one key difference—which the picture wears on its sleeve rather clumsily and cluelessly: that whereas Glee emerged during the Obama years of “hope and change,” Freak Show is presented as a product of desperation in “the age of 45.” Which makes it all the more disappointing that, rather than presenting alternatives and proposing solutions to the mean-spirited cynicism of the country’s cultural hurricane, Styler & co. seem to have gotten lost somewhere in the storm.

I find it especially interesting to note that Freak Show (a borderline cruel comedy) was helmed by a woman director: in my personal reading of the picture, the fundamental mistakes made by Styler’s production were the product of good intentions—yet they seem to echo an unhealthy trend permeating the country in 2017. Namely, the trend of going to bat for an identity/gender/ethnicity outside one’s own, but resorting to blindly aggressive (verging on plain mean) tactics that many of the persecuted individuals stuck in the limelight might well feel inclined to reject—if given a chance to speak. It is all-too-apparent that Styler has an emotional investment in her protagonist, the precociously flamboyant Billy Bloom: one questions, however, whether this same emotional investment has been applied towards any of the other characters in the picture. For it appears that Styler’s empathetic range is about as narrow as the picture’s screenplay (adapted from a book that I’ve never read, and am in no position to criticize), and her specific lack of empathy for one of the narrative’s primary antagonists—the goody-two-shoes cheerleader, Tiffany (played capably—perhaps too much so—by Willa Fitzgerald)—is telling. The narrative’s intentions backfire with each cringe-inducing line forced upon Fitzgerald’s caricatured cheerleader (an archetype one could surely recognize without the undue delineation granted here), espousing every bigoted stereotype of the religious Right, but without even a hint at the human fallibility that enables such nonsense. (For comparison, note that Billy is never painted as anything less than a victim, though his distinctly privileged and narrow worldview just as readily coincides with that of a bully.) Styler & co. have gone to such great lengths to mock and vilify their antagonist, that any viewer with a modicum of trained compassion might feel compelled to jump to Tiffany’s defense.

glee-episode2_pqs24a

Though criticized by some prominent figures of the religious Right as a distortion of adolescent norms, Ryan Murphy and the writers of Glee actually displayed a consistent effort to humanize their Christian characters and respect the broad range of belief systems among the show’s viewers. © Fox Television, 2015.

As for our protagonist, Billy Bloom represents pretty much all the negative stereotypes of queer youth, with few identifiable virtues. For instance, Billy is frequently seen quoting Oscar Wilde, yet in practice he represents none of Wilde’s resiliency, wisdom, or empathy for his peers. He bemoans his ostracization at school, yet intentionally exacerbates the problem by presenting increasingly rarefied and flamboyant incarnations of himself from day to day—simultaneously expecting and lamenting criticism. Looking back on the picture, I am reminded of an insight shared in the documentary Rebels on Pointe, screened Sunday afternoon: speaking in relation to the ethos of the film’s subject (a drag ballet troupe), one commentator insists that dancers “don’t have to fit in, but they have to be able to function.” When, in Freak Show, the blame for the protagonist’s inability to do either is foisted upon a cheerleader, I can only hope that no one buys the implication (particularly LGBTQ teenagers, for whom the picture was most clearly intended; what kind of message is this?)

Our protagonist (and the film he fronts, for that matter) waves a banner of blind acceptance and tolerance, but he routinely displays a lack of awareness, empathy, and respect for those outside his sphere of influence. In a particularly telling sequence, Billy decides to compete against Tiffany for the title of homecoming queen, and subsequently attempts to outshine his competition at a stadium pep rally. Tiffany, who proudly states she has been preparing for this occasion since 7th grade, presents herself on a predictably decorative float with a banner announcing her candidacy; Billy ostensibly one-ups her by riding in on a float shaped like an enormous high-heeled platform shoe—holding a guitar and playfully pantomiming the act of making music. Watching the broadly painted scene unfold, I found myself struck less by the grandiosity of the protagonist’s presentation, and more by the way the scene inadvertently highlights the empty ambition of Billy’s character, and the movie in general: for while they both offer an occasionally credible guise of substance—fragments of a message: an increased awareness and understanding of LGBTQ issues, perhaps; or some vague missive of empowerment—they evidently lack the ability to make any real music with the tools at their disposal. By the film’s long-awaited close, its creators have succeeded only in drawing our attention to the weakness of their own propositions; never having bothered to investigate (much less address) the source of the bigotry they purported to condemn. (On a more positive note, I will take a moment here to champion the never-ending talents of Bette Midler and Celia Weston: two beacons of on-screen light who never fail to shine brightly.)

But the night wasn’t a total wash: the short that preceded Freak Show, a 12-minute drama centered upon a young man of color who enters the world of drag and discovers his queer family (in the same vein mined by Jennie Livingston 27 years prior), presented us with an endearing portrait of queer family dynamics. The boy’s mother (played smartly by Yolonda Ross) convincingly represents the real-life struggle of mothers around the world—recognizing their own distance from the cultural orientation of their offspring, but ill-equipped to traverse the gap and (in some cases) reluctant to even try, for fear of challenging their own convictions (the dual meaning of the film’s title, “Walk For Me,” further highlights this theme). Driving home at the end of the night, I found myself regretting the disparity in runtimes between the two features.

walkforme

Brenda Holder makes herself up as Paris Continental in Elegance Bratton’s economical but effective short film, “Walk For Me.” (No major distributor attached.)

* * *

Saturday’s offerings proved much more rewarding—starting with a selection of “Top Drawer Shorts” (seven in total): three of which were forgettable, three of which were good, and one of which was outstanding. The first entry, “Something New,” assumes the form of a light-hearted romantic comedy (the writer and star, Ben Baur, was present for the screening and explained during a brief Q&A that he found inspiration in the romantic comedies of Meg Ryan: having never personally acquired a taste for Ms. Ryan’s white-bread brand of bourgeois lovesickness, I confess to having no horse in this race, and will temper my criticism accordingly). While essentially innocuous, the script is tepid at best, and outright callow in its lowest moments. Which isn’t to say that queer comedies haven’t traditionally been shaded in tones of callowness; but when no other qualities can be discerned, one wonders if this might be all the filmmakers have to offer.

The second short in the series, “The Devil is in the Details,” offered us something more substantial—though juxtaposed against its hollow predecessor, it almost felt over-compensatory. A period piece set in a 19th-century French boarding school for girls, the film centers on a young woman coming to the realization that she was born with hermaphroditic genitalia. As her testes painfully descend throughout the short’s exposition, the faculty grapples with the boundaries of gender identity and ultimately decides to transfer the student to an all-boys school. Beautifully shot and impeccably acted, “The Devil…” suffers only from its somewhat constrictive running length; which is, in film terms, a definitive compliment.

devilisinthedetails

Laure LeFort plays Alexina in Fabien Gorgeart’s noteworthy short, “The Devil is in the Details.” © 2016, Première Ligne Films.

Next up, the festival’s first “true story” offering: titled “Imago,” this quasi-documentary explores life through the eyes of a 15-year-old transgender girl, who decides to write a letter to her father outlining the reasons she cannot bring herself to spend time with him anymore (the end credits explain that the screenplay took this real-life letter as its source material/inspiration). The film is short, effective, and memorable: one gleans the distinct impression that the filmmakers bit off just as much as they could chew within the budgeted amount of screen time. The film was followed by what read to me like a failed Saturday Night Live skit (“Haygood Eats”), and then came the cream of this anthology’s crop—a short documentary entitled “Bootwmn.”

Somewhere between Christopher Guest and Louis Malle’s American documentaries from the 1980s (God’s Country; …And the Pursuit of Happiness), “Bootwmn” is a charmingly earnest, refreshingly non-abrasive portrait of a self-proclaimed Texan bulldyke named Deana McGuffin. Charting her journey from apprentice in her grandfather’s boot-making enterprise to visionary boot designer and boot-maker in her own right, the film toys thoughtfully and playfully with themes of authenticity, communication through creativity, and the objective value of a work ethic. Throughout the film we meet two of Deana’s employees, and join them as flies on the wall during their adventurous decision to enter a pair of queerly decorated boots (known as the “Gay State” boots) into a highly traditional Texan boot-making competition. For fear of spoiling the outcome of this altogether remarkable celebration of the human spirit, I will refrain from saying more.

bootwmn

Deana McGuffin (center) flanked by two of her workshop assistants in the delightful dark horse of a short, “Bootwmn”—directed by Sam McWilliams & Paige Gratland, and backed by a crowdfunding campaign. (No major distributor attached.)

The penultimate short, an Australian drag piece titled “Picking Up,” was fine but forgettable. And while not as forgettable, Danny DeVito’s cute and aptly titled “Curmudgeons” left me wanting (of what exactly, I’m not sure). My vote for Best Short is cast for “Bootwmn.”

Up next—and following immediately on the heels of the “top-drawer shorts”—one of two full-length documentaries included in this year’s line-up: The Untold Tales of Armistead Maupin. Comprehensive in scope and scintillating in detail (including the sultry anecdote of a three-way with Rock Hudson), Untold Tales is a delight, bound to win over fans and first-timers in equal measure. In classic documentary form, filmmaker Jennifer Kroot places Armistead’s first-person narrative of his own life’s story within a well-rounded framework of objective context from third-party sources. For example, when Maupin explains his defense for having outed other celebrities at the height of his own fame, Kroot quickly jumps to the perspective of other LGBT voices who alternately support and criticize his motives—with a pause added for the viewer to reach their own conclusion. At no point does Kroot’s focus stray far from her central subject, but the sheer range of perspectives, stories, and insights shared throughout presents a veritable kaleidoscope of 20th-century queer culture. Ultimately, Maupin emerges (like all great documentary subjects) a fascinating, admirable, and flawed character—whose life’s work (and story) raises as many questions as it provides answers. It went on to win this year’s Audience Favorite award.

* * *

While I regretfully missed the Saturday evening screening of Sensitivity Training (directed by Melissa Finell), I returned for the late-night showing of Shaz Bennett’s commendable feature-length debut, Alaska is a Drag. Filmed in rural Michigan but inspired by the filmmaker’s own experiences gutting fish for a living in a small Alaskan town (while dreaming of making it big in the movie industry), Alaska comes across as an honest, assured, and pretense-free family drama—raising issues of identity and conformity with all the wisdom and humor denied us by Friday night’s feature. The star of the film, Martin L. Washington, Jr., delivers an absorbing and memorable turn as Leo—the twenty-something Alaskan drag queen who dreams of making it big and moving to the big city, but is trapped gutting fish for a living and tending family wounds. At times reminiscent of a Jarmusch movie, the tangible rapport between Washington and his on-screen sibling (played by Maya Washington; no relation) gives the film life and frequently compensates for the frailties of its writing. The film is shot simply and effectively, and the photography is, at times, inspired—particularly during the sequences of the family RV at night, and the transitional sequences of the siblings strutting home down a dirt path. The exceptional supporting cast of Alaska is rounded out by Matt Dallas, Christopher O’Shea, and Kevin Daniels—with smart cameos by Jason Scott Lee (Dragon: the Bruce Lee Story) as Leo’s affable employer, and Margaret Cho as the town’s drag king bartender.

alaska

Martin L. Washington, Jr., and Maya Washington star as an endearing set of siblings in Shaz Bennett’s full-length feature debut, Alaska is a Drag—previously released in 2012 as a short with the same title. (No major distributor attached.)

Leaving the theater at the end of this second night, it struck me that Alaska is a Drag handled many of the same issues and themes marketed by the opening night’s misfire: the queering of masculinity and jock culture; interpersonal conflict and religious conviction; the tension between longing to fit in and wanting to stand out. What worked in the latter film, but not in the first? For starters, Bennett’s film leaves something to the imagination—a quality I can only speculate is closely linked to a filmmaker’s respect for the audience’s intelligence. More importantly, Bennett (who wrote the film as well as having directed it) insists upon an understanding of each character in her film’s tapestry; which isn’t to say she allocates equal screen time to each character, but simply that she refrains from taking any cheap shots, and commits herself to practicing the fundamental message queer culture has been striving to convey for well over a century. The message: that everyone deserves the dignity of their own personhood—and the plight (read: struggle) of humankind is to recognize and respect this universal dignity.

* * *

The third and final day of the Dayton LGBT Film Festival read like a victory lap. I missed the first feature (Pushing Dead, directed by Tom E. Brown), but made it for the two final screenings: Bobbi Jo Hart’s documentary on the (in)famous Ballets Trockadero de Monte Carlo, titled Rebels on Pointe; and this year’s heavily-hyped British import, God’s Own Country—touted as a more explicit Brokeback Mountain. Both films successfully live up to the hype surrounding them (a second screening of Rebels on Pointe was added, at the last minute, to accommodate the Dayton Ballet dancers who could not make it to the first screening), and it is authenticity that emerges as the weekend’s clear winner.

284554

Dancers of Les Ballets Trockadero de Monte Carlo waiting in the wings of Bobbi Jo Hart’s endearing feature-length documentary, Rebels on Pointe. (No major distributor attached.)

In Rebels on Pointe, the viewer is introduced to the world of drag ballet through an all-access pass into the real lives of dancers for the world-renowned Ballets Trockadero de Monte Carlo—the first and foremost all-male (and all-gay) ballet company, committed to rendering post-modern (and frequently comical) interpretations of historically celebrated ballet works. The film is gentle, intelligent, smartly pieced together, and irreverent in all the right places. As we get to know each of the dancers profiled by Hart & co., we discover an eclectic range of personalities, family backgrounds, dance résumés, and cultural origins. One dancer is a young Cuban émigré whose mother was a dancer of note in his homeland; another is a thirty-year-old American who struggled to fit in with the orthodox ballet company he had initially joined—finding himself more properly challenged by the more experimental director of the Trockadero; another is a forty-year-old man whose parents underwent a generational struggle to embrace their son’s life pursuit (they eventually came around, and are featured memorably among the filmed interviews); yet another has chosen to relocate from his native Italy in order to follow his dream and make his family proud. Hart expertly weaves the dancers’ stories together with selected snippets from live Trockadero performances, and the finished product emerges as something between a behind-the-scenes Madonna tour documentary and one of Jean Rouch’s sociological studies.

Speaking of studies, God’s Own Country wound the weekend down on a note of decided realism. Set in the stunningly photogenic Yorkshire countryside, this feature-length debut by director Francis Lee is likely to acquire a fair share of international accolades before the year is up: and rightly so. Filmed with the same grace regularly displayed by one of its two main protagonists, the Romanian heartthrob Gheorghe (played with quiet magnetism by Alec Secareanu), God’s Own Country tells the tragicomic tale of a young Englishman (played by Josh O’Connor) following in the footsteps of his father—a modest sheep farmer—and willfully suppressing his own dreams of finding romantic fulfillment with another man. As his repressed inclinations toward tenderness habitually transfer themselves into acts of rage and brutality, Johnny (O’Connor) embarks upon a gradual but believable journey of self-discovery; visually, his journey is matched by the characters’ endeavor to surmount the harsher elements of the stark, cold country.

There are many directions in which Lee’s film could easily have misstepped, but it is a testament to his skills as a budding filmmaker that he managed to avoid every opportunity to genericize (or scandalize) his subject matter. As with any film of note, the photography merges with the sound design and the chemistry of the actors’ performances to create a fully formed piece of moving poetry: a whole that can be read both as an eloquent sum of its parts and as an entity unto itself. O’Connor deserves special commendation for the complex definition of his lead performance, which successfully elicits every audience response imaginable over the course of the film’s roughly two-hour runtime: from disgust to sadness; from anger to empathy; from laughter to scrutiny. In Johnny, we find a protagonist with both the nuanced pathology of Terry Malloy or Jim Stark, and the primal force of Jake La Motta. Here’s looking forward to what Lee (and O’Connor, for that matter) have to offer us next.

godsowncountry

Alec Secareanu (left) and Josh O’Connor (right) play accidental lovers in Francis Lee’s confident and affecting debut feature, God’s Own Country. © 2017, BFI Films.

* * *

Seen together, the films selected for the 12th annual Dayton LGBT Film Festival effectively presented a sort of running dialogue between disparate perspectives and ideologies throughout the queer community: a dialogue that transcends time and identity, but occasionally gets hung up on one or the other (or both). In awarding victory to the notion of “authenticity,” I propose that the finest observations presented throughout this dialogue emerged from a place of genuine creative expression, whereas the weakest commentary appeared wrapped up in a shiny bow of commodified entertainment. It is a contrast that resonates most markedly in our contemporary cultural climate, in which these same forces of commodification and hollow entertainment—having regrettably (but nevertheless successfully) embedded themselves within our cultural and political landscapes—threaten daily to consume all forms of genuine interest in (and expression of) the human condition.

We see it in the contrast between Freak Show and Rebels on Pointe; or the chasm of perspective (and intention) separating “Something New” from “Bootwmn.” We also see it in the recurring appearance of negative gay stereotypes: the callow sex addict who treats his fellow humans like objects; the pompous and shallow histrionics of a young queer kid who expects the world to bow at his feet; the self-righteous rebukes directed at anyone and everyone whose politics conflict with, or simply stray (no matter how minutely) from the advancement of one’s own interests. Perhaps these stereotypes exist to remind us that these character flaws still exist; in which case, point taken. But one could just as easily argue that these character flaws persist to this day as a byproduct of perpetuated stereotypes; in which case, maybe we would all be better served by letting such vacuity go, once and for all. Maybe we would be better off by simply embracing the compassionate perspective outlined in the work of Shaz Bennett, Francis Lee, Bobbi Jo Hart, and Jennifer Kroot (and the works of Louis Malle and Jean Rouch before them): that everyone is entitled to the dignity of their own personhood—and it is our charge to recognize and respect this dignity in others, as much as it is our journey to discover it for ourselves. In the immortal words of St. Francis: to understand is to be understood.

Vice Principals is the show that every American adult—and more specifically, every racially disoriented white American—should probably be watching and talking about over dinner. As it becomes increasingly difficult to satirize reality (with reality itself having become an un-ironic satire of social indecency), the creators of this half-hour HBO comedy series (Jody Hill and Danny McBride) have somehow managed to pointedly encapsulate everything bad that is afflicting our country’s societal wellness—while at the same time saving a space for the remaining dregs of decency, which are routinely squeezed out of similar attempts at encapsulating our problems in dramatic form. It’s a program defined by its crass, cruel, grotesquely arch, and (often unexpectedly) black comedy. But while the ostensible victim of the show’s first season was a black woman climbing the ladder of upper management in a public school system, it is the show’s prime villain (her cold-blooded VP, Lee Russell) who undergoes the greatest scrutiny and, ultimately, comes across as the “biggest loser.” Unlike other programs (in both documentary and fiction realms), which consume themselves with endeavoring to paint the plight of the minority citizen in shades of self-pitying helplessness—with a frequently less-than-subtle nod to a (typically white) social justice warrior, who rides in like a knight in shining armor to save the damsel in distress—Vice Principals is a ruthless portrait of the victimizers; making no excuses, and taking no… well, maybe a few prisoners.

vp3

Dr. Belinda Brown (Kimberly Hebert Gregory) sizes up her two infantile and devious cohorts in HBO’s Vice Principals. © 2016, HBO Networks.

When Dr. Belinda Brown (played superbly by Kimberly Hebert Gregory)—the impossible-not-to-be-enamored-with principal who sets the show in motion—is brutally forced out of its equation at the end of the first season, one feels a profound sense of loss. But not only hers: ours, as well. (If anything, Dr. Brown likely considers herself released from the toxic environment of her fellow protagonists’ making.) It is our loss to not have her as a prominent part of the show’s perversely hysterical conversation anymore, being left to contend exclusively with the petty hooligans who have taken her place. In actuality, at the start of the second season’s premiere episode, we find Dr. Brown alive and well—reunited with her husband and two kids, and living a good distance from the deranged vice principals who attempted to ruin her life (and very nearly succeeded). When she begins to hint at her own departure from the show’s narrative, it not-so-subtly calls to mind the departure of our country’s previous commander-in-chief—whom we’ve since seen skydiving, vacationing with his family, and generally conducting himself like an all-around decent human being. All the while, total chaos and insanity loom in the place he used to sit, and a nation is left watching history replay itself like a warped VHS tape of white power rallies, devastating hurricanes, incredible White House leaks, presidential scandals, and arrogant white kids with bad haircuts and polo shirts, armed with tiki torches to defend poorly sculpted monuments of the Confederacy (just when you think you’ve seen it all…) It’s hard to watch this last season of Vice Principals and not blow a wish for Dr. Brown to come back and give one more inspirational pep rally in the North Jackson High School auditorium—just to feel a tingle of hope that all is not (yet) lost.

Looking back, the first season was a chore for many to sit through: it garnered justifiable criticism for subjecting viewers to an exhaustive, vicarious experience of racist/sexist intimidation and persecution—which so closely echoes the real-life experiences lived by millions of Americans. But while the show certainly has its fair share of “cover my eyes ’cause I can’t bear to see where this goes” moments, I would argue that it remains a rewarding, perhaps even necessary experience for white Americans (especially white men). It forces the viewer to witness the devastating outcomes of intolerance, but not from an easy “scared straight” perspective; instead, the viewer actually has to do some work—to connect the dots between the shallow instincts that compel a person to behave in such a hateful fashion, and the reality such a person must effectively disengage from in order to fulfill such absolute hate. For hate is, ultimately, an uninhabitable condition (something one needs constant reminding of at this point in time). To highlight this truth, there comes a moment in every episode during which VP Neal Gamby (our anti-hero-cum-protagonist, played by McBride) will catch himself in the middle of some atrociously mean-spirited act—typically provoked by his far more nihilistic partner-in-crime, VP Russell—and question his ability to follow through with his ruthless vows, eventually caving in to his own vulnerability. In these moments, the viewer recognizes that even the Scroogiest of conservative white men has a soft spot, somewhere deep down; and in this act of empathetic recognition, the viewer finds their own embers of hateful inclination slowly sizzling out. (In turn, viewers with an overtly racist and/or sexist inclination—who might, at first glance, align themselves with the diabolic intentions of Russell and Gamby—are bound to cave in by the first season’s conclusion, upon realizing the fruitless and dispiriting outcome of the protagonists’ hate.)

vp2

Vice principals Lee Russell (left, Walton Goggins) and Neal Gamby (right, Danny McBride) contend with the unsustainability of their own prejudices. © 2016, HBO Networks.

Since the inauguration of 45, I’ve been troubled by the response of many a despairing liberal to the ill-informed cocktail of bigotry and racial intimidation perpetuated by the president and his base. On the one hand, it seemed to me the reaction of liberals was disproportionately soft—compared to the out-and-out violence (verbal, physical, psychological) that we found ourselves up against; on the other, it seemed a pretty ill-advised approach to fight fire with fire: to attempt to wipe out hate by singling out and shaming the haters, many of whom are so blinded by their own misinformation that they fail to recognize their bigotry as hatred incarnate. I recall beating my head against a wall (literally), and exchanging a series of frustrated emails with friends, most of which culminated with a half-joking recommendation that we split the country in half, effectively separating the evolutionists from the devolutionists. In seeking a broader perspective, I found myself drawn to the brilliant and frequently sardonic songs of Randy Newman, which have—throughout the past four-plus decades—effectively charted the folly of the stupid white man in America; sans effigies, platitudes, or other common forms of creative scapegoating. And I asked myself: Where are the Randy Newmans of today? Where are the Gore Vidals, the James Baldwins, the Nina Simones? How come every visible attempt at protesting the ignorant insanity of 45’s America appears to swing toward the two outer extremes of timid sloganeering and destructive violence? (Fortunately, not long after I went through this line of questioning, it was announced that Mr. Newman would be releasing a new studio album later this year—providing a much-needed salve.
Far less fortunately, so many voices belonging to people of color have been effectively suppressed, repressed, depressed, or extinguished altogether—making it difficult for the full range of creative perspectives that ought to represent this country to truly flourish, and sentencing the fate of acceptable social protest to a kneel in a football stadium.)

Setting aside the apparent racial intolerance that has festered throughout the country (and the Russian interference that reinforced this intolerance through strategic interventions on social media), part of this dilemma likely stems from another root cause of 45’s presidency: the mindset underlying that lamentable term, “political correctness.” In hindsight, it is difficult to imagine 45’s candidacy gaining the kind of momentum it generated without the scapegoat of liberal hyper-sensitivity. Every slogan developed throughout his campaign served to highlight this critique: from “crooked Hillary,” to “bad hombres,” to “what a nasty woman,” to the cringe-inducing “he can grab my…,” to the swiftly appropriated “deplorable and proud of it,” the racial hatred permeating the campaign’s tone was matched only by its general disdain for pre-meditated and/or sensible syntax. And as with all false generalizations and stereotypes throughout history, there was, in fact, a justifiable criticism at the outset of this profane game of Chinese whispers. Namely, the criticism of the left’s increasingly rigid thinking on the subject of policing language: a well-intentioned effort to nip hate speech in the bud, but one that has frequently neglected to take into account the Quixotic nature of its own pursuit. For just like the idealist of Cervantes’ great novel, the “P.C. police” (as they’re commonly referred to by irritable right-wingers) often find themselves tilting at windmills and missing the forest for the trees: so wrapped up in the semantics of isolated incidents, they lose sight of the motivators behind the language they are policing—which might foreseeably range from absolute, vitriolic hatred; to an infantile desire to provoke or offend; to sheer ignorance of the meanings attached to the words one has chosen.

It is within this context that Vice Principals presents a sweeping breath of fresh, tension-splitting air. Although the premise of the show is itself a persistently tense exercise in caustic polarization, the manner in which it mirrors the real-life tensions surrounding its creation (considering that the first season’s airing coincided with the peak of the 2016 election) serves to deflate the pressure accompanying its subject matter. Here we find three character types that are frequently subject to the “politically correct” treatment—an effeminate, plausibly closeted gay man; a heavyweight divorcé; and a well-educated woman of color—released from the popular liberal’s cocoon of cultural suffocation, and allowed to live and breathe as characters that are every bit as nuanced as they are dense; almost like actual people. And if the show has a secret ingredient in the recipe of its greatness, it most likely lies within this astute recognition that vilification and deification are equally ineffectual tropes (both in narrative terms, and in lived reality). It would be easy—all too easy—to rewrite the show with Gamby and Russell (embodied by the relentlessly brilliant Walton Goggins) as dyed-in-the-wool hate-mongers, with a cheaply sketched-in backstory of how they came to be so hateful (e.g. childhood abuse; bullying; exposure to violent crime): the rest of the series—assuming the form of a prime-time melodrama—would essentially write itself, with the characters either achieving progress towards an awareness of the origins of their respective prejudices; or, conversely, digging their heels in deeper and, eventually, falling on the sword of their own bigotry. Not only would such a literal execution of the premise be uninteresting: it would render it increasingly difficult for the actors to bring any real pathos or complexity to their characters, since such a narrative is ultimately a glorified journey from point A to point B.
In other words, this more “sensitized” approach would present the antithesis of a real person’s life journey, which invariably presents a more complex trajectory through various stages of change and emotional/intellectual growth.

vp1

Gregory (right) provides the heart and soul, and Goggins (left) the diabolical thrust behind Vice Principals—the only great satire thus far broadcast on American television in the year 2017. © 2016, HBO Networks.

Rather than taking the easy way out of contending with bigoted protagonists, Hill and McBride have boldly chosen the more challenging, and far more rewarding narrative approach. In Gamby and Russell, they have created two strangely… lovable bigots. Not that one loves them because of their bigotry (the show is structured in such a way that such sympathies are unlikely, to say the least); one loves them in spite of the raging ignorance and intolerance that continually threatens to swallow them whole. Instead of being vilified and caricatured as two creatures from the black lagoon who’ve arisen to claim some distorted interpretation of supremacy, Gamby and Russell are just two stupid white boys with no real grasp on the concept of emotional maturity—and watching their psyches disintegrate from episode to episode is every bit as comical as it is maddening. Not unlike our current president, whose racist inclinations frequently appear to stem less from an inherent sense of racial superiority (I mean, just look at him) than from a cynically strategic approach to soliciting support from pockets of the U.S. voter base, which any seasoned politician with a modicum of decency would refuse to entertain (e.g. David Duke and his cohort, and at least half of our Presidential Cabinet). But the real masterstroke of Vice Principals is that, despite the uncanny parallels between our presidential administration and the admin of North Jackson High, the show succeeds precisely where the president’s administration has failed: by actually making us care about the fate of its ignorant protagonists.

It is safe to say, at this point, that hardly a person in the country—or, more broadly, on the face of the earth—can be bothered to care about the personal fate of the 45th president. It is, in fact, difficult to think of any figure in our nation’s history who has been so widely (and so justifiably) reviled, across the board of political identification and cultural affiliation. And true to form, 45 has surrounded himself with individuals who only serve to further dehumanize his public persona: compounding the reality television aesthetic of his own making, and continually escalating the threshold of public disdain. And I would here argue that it is this aesthetic of idiocy—this constant talking down to the citizens of a country who, by and large, know they deserve better—that presents the biggest hurdle for his detractors to surmount. The brilliantly monotonous condescension of Maxine Waters, in addressing one of the president’s multiple administrative chumps, Steve Mnuchin, provides a case study in the only appropriate way one can respond to such arrogant bluster: consistently raising the point (“reclaiming my time”) of our administration’s inadequacy, incompetency, and seemingly interminable disrespect towards the citizens whose interests it has been charged to uphold.

belinda-brown-1024

Dr. Belinda Brown: carrying on with conviction and humor. © 2016, HBO Networks.

Likewise, in Season 2 of Vice Principals, Dr. Brown brilliantly dismantles Neal Gamby’s initial hypothesis regarding his violent assault at the culmination of Season 1: upon being accused of Gamby’s attempted murder, the former North Jackson High principal scoffs at this suggestion, instead drawing Gamby’s attention to a tattoo she has had affixed to her back—depicting her two former vice principals actually eating shit, while smiling and amorously holding hands. It’s her own personal idea of revenge: a gesture that hilariously highlights the racial divide at the heart of Season 1’s tension. For whereas the white male testosterone pumping through Gamby’s and Russell’s systems repeatedly compels them to acts of childish violence and lashing out, the cool “been there, done that” attitude of Dr. Brown—whose past experiences with indignant white men can only be imagined by the viewer—empowers her to keep calm and carry on with humor and conviction: two things the country (if not the world) is in most dire need of now.

It has yet to be seen how the remainder of the series will play itself out. As Russell and Gamby delve deeper into their farcical investigation of Gamby’s shooting, one can’t help but think of the President’s own glorified wild goose chase: to single out his dissenters, and thereby satiate his acolytes with a gushing fountain of persecutory accusations directed at the liberals they all thumbed their noses at this last election (or, to expand upon this metaphor with an even more precise one, the noses they cut off to spite their own faces). Two well-played scenes in the most recently aired episode serve to highlight this real-life parallel: in one, Gamby enlists a black security guard from the school to search the lockers of multiple black students, all of whom he has targeted as prime suspects for his attempted assassination (without a shred of evidence, of course). After finding nothing but homework, textbooks, and a scientific calculator in one boy’s locker, the security guard observes in a disheartened tone: “Man, you actually made me think he was guilty!” The other scene in question entails Russell planting a hot mic in the teacher’s break room, in order to tune in to the gossip taking place behind his back (most of it directed at his gaudy wardrobe, social awkwardness, and apparently deadly halitosis): when he later proceeds to fire his entire faculty for subversion, one immediately thinks of Sean Spicer, Steve Bannon, Reince Priebus, Sally Yates, Michael Flynn; the Mooch.

For some prospective viewers, this will all prove a little too much too soon. And yet, in bringing ourselves to truly care about the fate(s) of Gamby and Russell—in wanting them to get at least a little woke; to stop being such selfish assholes, and to play a little bit nicer—there’s a chance we might bring ourselves to care a smidge more about the fate of this altogether asinine administration, along with the misguided minions who stubbornly refuse to withdraw their support for it. In turn, and for better or worse, it is they who now dictate the fate of our nation.

Indigestion’s a pain.

I found myself in the midst of an especially bad bout last night, tossing and turning in bed, struggling to fall back asleep. In such instances, I occasionally find myself achieving a heightened level of awareness and concentration: as if hyper-awareness of one’s natural (or unnatural, as the case might be) biological functions carried with it an increased sensitivity to other surrounding circumstances.

In this instance, I found myself dwelling upon a recent essay in-progress, which seems to be going nowhere slow. The subject of my reluctant essay is the suburban experience (more specifically: American films that have explored suburban themes in a mythical vein). It’s one of those frustrating instances where the writer knows what he wants to convey—even how he wants to convey it—but once all the pieces are lined up together, they no longer convey what was meant to be conveyed.

I’m reminded now of a startling incident that occurred earlier in my workday, as I was driving a client back to her residence—which was located in a somewhat run-down suburban neighborhood. As we drove past some smartly structured houses, I offered some casual observations to break the silence of the drive—small talk about some of the more striking residences, many of which featured alarmingly pointed rooftops. It was then that my client interjected a most unexpected anecdote: “Yeah… A lady shot her two kids in the head last night, over there by that school. I guess she had told the cops the world was a terrible place, and she didn’t want them living in it anymore.”

Understandably, I found myself at a loss to form a suitable response. I’m certain I said something nominal and insufficient, something along the lines of “that’s horrific,” or “how terrible.” It was a jolting reminder of just how fleeting and cruel this life can be. It also underscored the inadequacy of my writings on suburbia, which paled in comparison to this shocking anecdote—having failed to represent the surreal perversity of the suburban experience, in its full scope. A recently released Sun Kil Moon record came to mind, as well. In the opening track, “God Bless Ohio” (a follow-up, of sorts, to the preceding “Carry Me Ohio”), songwriter Mark Kozelek pays tribute to the Northern gothic elements of Midwestern living, touching upon a range of suburban issues: alcoholism; A.A. meetings; the loneliness of being a child; nursing homes; psychotherapy; human trafficking; mass killings.

Maybe I should just scrap my essay and let Kozelek’s song speak for me, instead.


Sleeplessness has been a recurring motif of 2017 for me. During the day, I frequently find myself struggling to concentrate on basic tasks—easily distracted by the latest development in the investigation of our president’s relationship with Russian oligarchs and government operatives, as well as the on-going toll of devastation mounted by a conscience-free Congress and an administrative agenda fueled by corporate greed, short-term private gain, and a stiff middle finger to the vast majority of our country’s population. I was struck by a recent episode of Bill Maher’s show on HBO, in which Dr. Cornel West and David Frum were guest panelists. In an exchange that was (admittedly) cringe-worthy at times, Maher and West sparred on the subject of the 2016 election: West, who was outspokenly opposed to another Clinton presidency, stood by his idealistic decision to not vote for either of the primary candidates; Maher challenged his decision with an itemization of some notable areas in which the two primary candidates differed from one another, with an emphasis on the compounded harm being inflicted upon minority groups by 45.

Hearing Cornel West’s voice rarely fails to bring me joy: his combination of humor, zeal, and intellect is unsurpassed among his few peers, and his perspective is fiery but reasonable. Watching him spar with Maher on this issue brought to light the deeply personal nature of his investment in politics, and I found myself torn between two equally impassioned points of view. As I think back on the debate, I’m struck by the awkward correlation between religion and politics in this country. Apart from the obvious investment of religious power in American politics, it strikes me that politicians in this country are frequently placed on a similar plane to religious leaders: they are often evaluated as much on abstract moral principles (or lack thereof), as they are on competencies and qualifications. West makes it clear during the debate that his opposition to Hillary Clinton was of a moral nature—a perceived “lack of integrity,” as he defined it. On the flip-side of the argument, we find the pragmatism of the vehemently atheistic Maher, who is able and willing to look past the character flaws of a given politician in order to hone in on the practical, real-life outcomes of their stances and actions.

Setting aside my love of Dr. West (and that tremendous laughter of his), I cannot help but feel a sense of exasperation at our country’s obsession with bringing religion into all facets of life. I’m reminded of an observation shared by a philosophy professor I had in college, who attended multiple symposia at home and abroad, only to find that European nations have little (if any) of the political hang-ups our country has developed in this regard. Theories of evolution and creationism coexist peaceably; women, atheists, and non-Christian theists are allowed to hold public office without controversy; and outside the Vatican (a unique religious outlier, if ever there was one), it’s broadly agreed that religion ought not to be a deciding factor in economic and social policy. I think of David Fincher’s American film masterpiece, Se7en, in which the seven deadly sins of Christian tradition provide the foundation for a rigorously coherent series of horrific murders. I think also of real-life horrors committed by the Ku Klux Klan (a white Christian organization); the so-called “conversion therapies” imposed upon gay people in Christian communities; the persecution of victims of rape, in an assortment of forms, under the alarming guise that their assaults may have been “God’s will”; the historical genocide of Native American people, performed in the name of a Christian God and country.

“God bless Ohio
God bless every man
Woman and child
God bless every bag of bones, six feet under the snow
God bless O
God bless O
God bless Ohio”

I think of the recent terrorist attack in Manchester, which stole the lives of 22 unsuspecting concertgoers and injured 120 others. (I will refrain from making mention here of the terrorists responsible for the attack, or the religion of which their organization is a perverse offshoot, seeing as how they have gathered sufficient negative publicity over the years—and it doesn’t seem to be helping any. Perhaps it is best to remove the plank from one’s own eye, first.) I think of all the different religions in the world that provide a foundation for the most appalling crimes against humanity, and I think of the unscrupulous support lent to our current administration by millions of American Christians. I think of that genius of early American cinema, Ernst Lubitsch—having just watched Trouble in Paradise for the first time the night prior. I think of the excruciating cleverness of Lubitsch’s characters; the hilariously amoral, yet totally functional relationships they foster and maintain with one another. I think of Jorge Luis Borges’s beautiful and unassuming essays, compiling assorted theories of eternity and ontology: the power of the human mind to overcome the self-inflicted impositions of religion—and the seeming refusal of the human spirit to embrace the assets of pragmatism. I think of Morrissey’s early song for The Smiths (“Suffer Little Children”) about two highly pragmatic, non-religious sociopaths from a separate, but equally dark chapter in Manchester’s history (Ian Brady and Myra Hindley). I think of the silence on the moors where their innocent victims were slaughtered; I think of the screams and explosions that jolted Manchester Arena on this god-forsaken Monday night. 
I think also of the solace offered during a non-religious vigil held in Manchester on Tuesday, to mourn lost lives and lost innocence; and the open gestures of solidarity extended by individuals and cities around the world—none of which required the pretense of religion to achieve their intended message.

Oh, human (t)error:
So much to answer for.


I’ve thought a lot (and continue to think) about the ways in which the jolt of last year’s election outcome sent shockwaves pulsing through every facet of the American experience—many of which we have yet to fully appreciate (or, in some cases, even to recognize). I’ve noticed tiny paradigm shifts taking place in areas of everyday life, some of which are so minute they might be disputed as misperceptions. For instance, there’s the weekly program CBS Sunday Morning, formerly hosted by Charles Osgood and currently hosted by Jane Pauley: previous segments on ecology and environmental issues have accentuated the well-documented, factual impact of climate change upon different parts of the planet (many of which provide source material for the show’s closing “moment of nature”). In the most recently aired episode, Pauley visits the city of Amsterdam, where she is forced (as commentator) to acknowledge certain obvious changes in the landscape—including a visible rise in the sea level, and subsequent changes in irrigation. A phrase she uses in this segment has been stuck in my mind all week: “whatever the cause.” As in, “whatever the cause of these changes…” As if the matter were still up for debate.

I think of the shifts in media coverage that have historically accompanied drastic regime changes in different countries throughout the world. I wonder to myself how long it might have taken for Mussolini’s state-operated propaganda machine to fully infiltrate popular Italian knowledge, or for Lenin to convince his minions of the evils of Western living.

I imagine this essay reading like a poor man’s attempt at a Mark Kozelek ramble. I’m reminded, again, of my meandering essay on the suburban experience—and how truly difficult it can be to write about something when you actually have some pre-existing knowledge of it (in contrast to the old adage). In a way, such a task is even more difficult than writing about the unfamiliar: at least then, one can quite easily acknowledge and convey the limitations of one’s lived experience. But in the case of a subject that lies close to home, the writer is expected to have some sort of preternatural grasp on the topic—a near-omniscient, no-stone-left-unturned level of understanding. Maybe this is why so many Americans are turned off when a politician fails to publicly answer a question with utmost knowledge and understanding of their personal interests: instead, politicians are expected to be godlike magicians, sauntering into town on the campaign trail and telling everyone exactly what they need (or, more commonly, want) to hear. God forbid a politician should ever be heard saying those three dreaded words: “I don’t know.” Far better to hear someone say: “I am your voice… I alone can fix it.”

* * *

I think of the recent return of David Lynch and Mark Frost’s much-beloved television series, Twin Peaks. I think of what a tremendous joy it was, watching those first two hours of this new 18-part series—momentarily forgetting about issues of popularized ignorance and man-made atrocity (of both the religious and the non-religious variety). I’m grateful for creators—true creators—like Lynch and Frost, who seemingly have made it their lot in life to build upon and restore popularity to Myth (the only human creation that continues to transcend pure reason and pure religion). It makes me feel lucky to be alive, to witness the brilliant and awe-inspiring fruits of their efforts. I hope these efforts—and the efforts of other keepers of the flame—are ample enough to keep the Myth alive, for all the atrocities that are coming down the pipeline.

And I continue trying to shake my hyper-awareness of how terrible things have gotten. I continue trying to just live life, for what it’s worth, and not let it bring me down. But damn: indigestion’s a pain.

© 2010 IFC Films

I missed Tiny Furniture during its brief run at my local art house theater, but I was intrigued by the advance trailers. When I finally caught it on Netflix (what a truly dreadful way to define one’s initial experience with a film), I felt simultaneously disappointed and exhilarated; to this day, I think both terms remain applicable when defining my feelings as regards Lena Dunham’s rising career.

For starters, I should explain the exhilarated half of my conundrum. As a twenty-six-year-old gay man, I cannot help but empathize with Dunham’s conflicted portrayals of aggravated modern existence, especially in Tiny Furniture, where she taps directly into this sense of built-in apathy that so thoroughly pervades her (and my) generation. The dilemma of severe desperation perceived as laziness—a condition whose authenticity I can readily vouch for—has rarely been captured so astutely. The fact that she is quick to confess the shred of truth inherent to this perceived laziness makes her portrayals all the more endearing, at first glance.

My disappointment with Dunham arises from a combination of things: her derivative references to her creative influences (especially Woody Allen), her sometimes overly abrasive characterization—which can be downright hateful on occasion—and her inclination towards framing problems so as to negate any possible solution. This last item is the most discouraging, and it carries a definite historical precedent in American film, perhaps warranting a brief overview for the purpose of better understanding the topic at hand.

During the early seventies, a phenomenon occurred in American film, which some characterized as the “Easy Rider syndrome.” It entailed an almost misanthropic obsession with the futility of good intentions—stemming in large part from the disappointment of the failed hippie movement, the multiple political assassinations, and the ongoing war in Vietnam. It did not take long for film studios to realize the profit potential of capitalizing on this prevailing hopelessness, and beginning with Easy Rider, many a downbeat premise was greenlit and financed for major distribution. Some filmmakers—like Sidney Lumet, Bob Rafelson, and Robert Altman—seized the opportunity to make lasting works of art on a large scale that would not have been possible if despair weren’t so culturally en vogue; the strength of films such as Network, The King of Marvin Gardens, and Nashville arises not so much from the time of their making as from the timelessness of their vision and insight. Many films of this subgenre, however (which might as well have been dubbed “futility chic”), were just self-indulgent portraits of a world devoid of hope, humanity, or any saving grace.

Needless to say, it did not take long for the popularity of this trend to wane; after all, an audience can only be told that life is shit so many times before it either buys the message and gives in to suicide, or grows weary of the helpless sermonizing and sets out in search of some light at the end of the tunnel (Star Wars?). The unfortunate side of this return to optimism was the disappearance of the likes of Lumet and Altman from the mainstream. The pendulum swung so heavily in the direction of vacuous entertainment that many filmgoers surrendered the prospect of having to think altogether; of course, it didn’t help that the new batch of film school “auteurs” was so eclectic in intent as to lack any cohesive drive: Bogdanovich wanted to make movies that belonged to a previous place and time (namely, America in the fifties), Scorsese was violently trying to bring the sensibilities of early American filmmaking up to date, while Spielberg and Lucas just wanted to entertain at (quite literally) all costs. It should come as no surprise which direction audiences were most drawn towards.

Presently, there are more options available for independent artists to create and distribute than there have ever been before; consequently, there is also an inundation of artists in nearly every medium that—coupled with the overwhelming resources of the Internet—is making it increasingly difficult for individual voices to stand out and be heard. (By comparison, the Bogdanovich-Scorsese-Spielberg splinter effect was a highly focused and calculated division of interests.) With this in mind, I could never have foreseen the magnitude of Lena Dunham’s success, and I would be lying if I said I wasn’t somewhat delighted initially—after all, this is a natural human response to seeing a “dark horse” climb ahead of the pack. But this initial delight only made my subsequent disappointment more profound.

As I watched the first season of Girls on a weekly basis, I felt much as I did watching Tiny Furniture for the first time: amused, startled, and somewhat exhilarated. There is an undeniable spark to much of Dunham’s writing, and at their best, episodes of the show play like well-tailored short stories. Tiny details emerge from the whole to form an unexpected outline, and the characters all seem blessed with a liveliness they are frequently stripped of in mainstream television. It also didn’t hurt that Dunham and her casting directors had assembled one of the most riveting and dynamic groups of young acting talent in recent memory—in particular, Adam Driver, Alex Karpovsky, Jemima Kirke, and Zosia Mamet. All the elements combined to make for a rare combination of sitcom-structured humor and self-critical social commentary. Even with the benefit of repeat viewing (on demand or on DVD), it is difficult to pinpoint where it all started to go sour; the more I think on it, the more I get the impression there was a fly in the ointment all along.

A large part of my excitement behind the show’s first season stemmed from the fact that such a young writer/producer had been granted such enormous creative privilege, both in terms of financing and of platform. One gathered the impression upon viewing Tiny Furniture that Dunham was still unsure as to what it was exactly she wanted to say but, like Fellini’s doppelgänger in 8 1/2, she was going to say it anyway. Four years have gone by since the debut of that first “major” feature, and I’m quite convinced that Dunham is still unsure as to what it is she’s trying to say. The shark-jumping moment (for me) occurred somewhere during this season’s subplot about the “e-book” her character is attempting to publish (though I’ll readily admit that the appearance of Gaby Hoffman as Adam Driver’s deranged sister hasn’t helped in restoring my estimation of the series, either).

A running joke of the series has been Hannah Horvath’s obliviousness to the ramifications of her privileged upbringing; though Dunham insists she is playing a character, the similarities with her own personal background are undeniable and, to an extent, admirable. After all, honesty is one of the most neglected traits in contemporary teleplays, and the brutality of Dunham’s analysis is oddly reassuring at times. “If many young folks are truly this devoid of personal values and good judgment, at least they are aware of it”—this is the message one might have taken away from the series. But as Ruth Gordon said to Bud Cort when he shared his love of a local junkyard with her in Harold and Maude, “is it enough?” And this is my question to the show’s ardent admirers: Is it enough to just admit one’s lack of sensibility, episode after episode? Is it enough to say, “yes, young people are this deplorable—but look at the world they live in: how can they be otherwise?” Isn’t this just an extension of the Easy Rider syndrome?

I am taking the time to pose these questions in writing because I still believe Dunham to be a very talented and creative individual, and I find it inspiring that she has obtained (and maintained) such high visibility in a culture riddled with illiteracy and rampant diagnoses of ADD. There is an almost literary quality to much of Dunham’s television work, and it is encouraging to see that young people are willing and able to respond to it; this, in and of itself, gives the lie to the show’s increasing cynicism. Although I was willing to overlook Dunham’s lack of focus at the start of the series, figuring she would find her way through the process of writing these episodes and ultimately find the words to express her obscured intentions, it’s all become more than a trifle tiring. Perhaps the truth of the matter is that Dunham is not merely unaware of what she’s trying to say; perhaps she’s not trying to say anything at all, in which case my disappointment is doubled. In a day and age when artists have to struggle against every sort of media configuration (Facebook, Twitter, YouTube, and other “likes”-driven outlets—all ruled by a mob mentality) to maintain their purity of voice and complexity of reasoning, it would now appear as though she climbed to the top of the media mountain without any camping gear.

It seems to me as though Dunham has at least two options at her disposal for recapturing the original appeal of Girls: she could attempt focusing more on the social and/or environmental circumstances that are shaping her characters’ behavior (Hannah in particular), or she could attempt to offer some solutions to their increasing nihilism. Although Adam is still the moral compass of the series—a fact which was once touchingly ironic, but is now nothing short of depressing—it isn’t helping that we see less and less of him. Even Ray and Jessa, the voices of reason complementing Adam’s sense of right and wrong, have become increasingly marginal and, like the other members of the core cast, much too confused. One gets the unfortunate impression that Dunham is striving for some trendily despairing form of artistic integrity here (much like those films of the early seventies), and if this is the case, the joke is on her: she appeared more in control, and demonstrated far greater clarity of artistic vision when she was just writing out of love for the characters.

This is my ultimate disappointment with Dunham’s award-winning series, really: it is starting to lose its natural sense of social consciousness. I firmly believe in the necessity of raising awareness of the problems faced by the youth of today, and Girls has succeeded in doing this at times. For example, many folks are unaware of the strain placed by the baby boomer generation’s postponed retirement on the employment opportunities of young people; many young folks are having to lean on their family resources, “sponging” off of their parents to make ends meet and struggling to maintain a shred of optimism about their own future career opportunities. Some become so numb to their own perceived ineptitude that they never really grow out of the cycle of filial dependency—a problem that, as shown with Hannah in the first episode of the series, can be compounded by their parents’ reluctance to cut the financial umbilical cord. In Tiny Furniture, the issue of an inadequate minimum wage was aptly satirized by a single close-up of Dunham’s first paycheck from her job as a restaurant hostess: we know from the financial logistics of New York City living that she would practically have to work an 80-hour week to make this wage meet the cost of living. In short, the problems Dunham examined early on in the show were problems that most of her audience could relate to directly; those who couldn’t, could at least appreciate their validity and significance.

In the latest season of Girls, there is not a trace of down-to-earth sensibility to be found. It truly seems as though the financial success of the series has gone to Dunham’s head: it can most clearly be seen in Hannah’s sociopathic insistence on having her e-book published, (deliberately) refusing to pause for an emotional breath when her agent is unexpectedly found dead. This particular development reads more like the rotten fruit of an over-indulged celebrity than the keen observation of a Midwestern twenty-something. I am not implying that she should be prohibited from showing despicable behavior among her characters (hasn’t she been doing that all along?), but the basic rules of storytelling dictate a certain minimum of respect for one’s characters and audience—and both are sorely missing as of now. The depth of Hannah’s misanthropic leanings is possibly starting to reveal more about Dunham’s displeasure with herself than it is about her character’s pathology, and I don’t think anyone wants to sit through this much self-disparaging analysis (no matter how big a fan of Woody Allen one might be). The show has effectively become an ugly caricature of its former self.

In closing, I would like to reference something I once read in a collection of Pauline Kael’s writings—something which has popped into my head on numerous occasions while watching the past few episodes of Girls: “Allowing for exceptions, there is still one basic difference between the traditional arts and the mass-media arts: in the traditional arts, the artist grows; in a mass medium, the artist decays profitably.” I am still holding out for the possibility of Lena Dunham proving herself a rare exception to this rule, but she certainly has her work cut out for her now.