a contemporary reading of The Marriage of Maria Braun


Hanna Schygulla confronts a history of patriarchy, racism, and genocide in the surprisingly comedic film masterpiece, The Marriage of Maria Braun. Now available in a new 4K restoration from Arrow Home Video (UK/Region B). © 1979, Westdeutscher Rundfunk.

The Marriage of Maria Braun ends with one of the most unforgettably jarring sequences in modern cinema. After attending a reading of the will of her deceased business partner (and one-time lover), the film’s titular character—a career-topping, bravura performance by Hanna Schygulla—steps into the kitchen to light a cigarette from a stove-top burner. Not realizing she had left the gas on the last time she lit up, Mr. and Mrs. Braun both go up in flames, inside the house she slaved away the film’s entire duration to finance and furnish. With barely a second to register the shock of this sudden and fairly calamitous conclusion to the film’s engrossing narrative, Fassbinder boldly stamps the end credits across the screen immediately after the explosion—with the film’s coda (the attorney and another beneficiary, having stepped out the front door just in time, quickly turn around and gasp in horror) playing out in the background.

Superficially, the finale constitutes a real shocker: one could read it as a perverse variation on the deus ex machina, in which the protagonists are “saved” from the slow and silent death of their bourgeois existence (or one could engage in heated debate about whether or not the move was intentional on Maria’s part). On second viewing, however, it becomes apparent that not only does Fassbinder plant seeds of foreshadowing throughout (e.g. the multitude of scenes in which we see Maria lighting her cigarette on the stove; or the slightly paradoxical image of her pouring cold water over her wrist, in an apparent prelude to suicide, mere minutes before the final explosion), but that furthermore, the entire picture makes clearest sense when read as a comedy. A profoundly irreverent and socially subversive one, at that; but aren’t all great comedies? And is it not possible that Fassbinder took inspiration for this diabolical denouement (which was not present in Peter Märthesheimer’s original scenario) from the greatest—and darkest—dark comedy yet projected on the big screen, Kubrick’s Dr. Strangelove, or: How I Learned to Stop Worrying and Love the Bomb?


Maria Braun, lighting one of her last cigarettes from the kitchen stove-top. © 1979, Westdeutscher Rundfunk.

Setting aside speculation, what has been made apparent to this viewer by the text of this truly remarkable film—which merits inclusion on any list of great films released in the 20th century, as well as topping the list of Fassbinder’s own greatest achievements (although there are many contenders to choose from)—is Fassbinder’s uncanny ability to manipulate dramatic forms and genres: conveying radical ideas within a widely accessible medium, and restoring purpose to dramatic forms that have been stripped of their social significance through decades of authorial misuse. In The Marriage of Maria Braun, Fassbinder places his usual emphasis on the absurdity of social conventions, in order to raise the viewer’s awareness of their own enslavement—all the while exaggerating the punchlines beyond the parameters of superficial amusement. It’s hard not to chuckle, for instance, at the sight of Fassbinder in a cameo role as a black marketeer; or to laugh knowingly at Schygulla’s response, when asked if she might be interested in a collection of the German dramatist Heinrich von Kleist’s writings: “Books burn too fast, and they don’t warm you up.” And it’s harder yet to distance ourselves from the pragmatism that underlies the absurdity of his characters’ situation, seeing as it is the same pragmatism that underlies the entirety of modern existence in the so-called “developed” world.

If Maria Braun constitutes an ideal comic representation of mid-twentieth century ennui, one can only scratch one’s head as to what the 21st century equivalent might be. In the medium of film, the genre appears to have been primarily co-opted by sketch comedians, and perpetrators of that most dreaded cinematic invention of all: the “high concept” movie. Certain contemporary filmmakers, including voices as eclectic as Mike White, Judd Apatow, Greta Gerwig, and Jordan Peele, have tackled the genre from a slightly more idiosyncratic angle; but much of our mainstream comedy fare remains grounded in a soundbite-oriented definition of comedy as a situational experience, as opposed to the broader definition of comedy as existential experience. White has proven an exception to this rule—with works such as the HBO series Enlightened and the Salma Hayek vehicle Beatriz at Dinner pushing the laughs aside, in favor of bleak desperation and post-post-modern angst; and Peele has garnered significant accolades and audience super-fandom for his depiction of racial tensions in his Oscar-winning directorial debut (Get Out), though one could pose the argument that he settles too easily for ideological clichés and formulaic horror tropes—as opposed to pushing the more radical undercurrents of the film’s subject matter. In both instances, we find artists consigning themselves to an either/or dilemma between hope and despair; comedy and horror; provoking thought and proselytizing.

By comparison, Maria Braun remains—in all its cinematographic luster (courtesy of the late, great Michael Ballhaus) and historical incisiveness (courtesy of Fassbinder’s commitment to doing his homework, at all times)—a viable alternative to the polemical standards of comedic storytelling currently trending. Asked to choose between converting the viewer’s attitudes and provoking the viewer’s thought process, Fassbinder inspires independent thought as a vehicle for behavioral conversion; torn between comedy and horror, Fassbinder settles on melodrama as the ultimate popular genre—painting everything in bold colors and brushstrokes, then letting the audience decide whether to laugh or shriek. Finally, at the crossroads of hope and despair, Fassbinder chooses anarchy; carving out the shortest path between two points, and revealing the roundabout nature of mankind’s often senseless travails. Like the Marx Brothers before him (arguably his closest and least frequently acknowledged cinematic relatives), RWF betrays no agenda for social change in his film texts: instead of telling us what we need to change (or how), he accepts that the ultimate purpose of comedy is to reveal society, as it is, to be little more than one big farce. Unlike the Marx Brothers, who seemed content with savaging social conventions only to end up reinforcing them, Fassbinder was fed up and ready for a bigger change. And while his work has noticeably inspired contemporary queer filmmakers, from John Waters to Todd Haynes to Wong Kar-wai, there’s an apparent scarcity of post-’70s film efforts dedicated to pushing radical liberal thought through popular genre forms (interestingly enough, the most successful efforts appear to have been in television: such as Norman Lear’s Mary Hartman, Mary Hartman and All in the Family, or more recently, Roseanne).


Roseanne Barr and John Goodman revive their widely beloved (and occasionally reviled) characters in the ABC sitcom, Roseanne. © 2018, ABC.

Though he frequently exhibited a hot-tempered impatience in his personal life, RWF displays a practically infinite patience throughout his work: always willing to break down the mechanisms of social oppression into ever-smaller moving parts (for ease of comprehension), he may well have been the 20th century filmmaker most dedicated to making the world a better place. (A reality that often gets lost in popular interpretations of his work; including the exhausting documentary The Story of Film: An Odyssey, in which Mark Cousins focuses primarily on allegations of misogyny in the artist’s personal relationships—before transposing these allegations onto his work; highlighting only The Bitter Tears of Petra von Kant, and ignoring the entirety of his remaining forty-odd films and television programmes). Realizing early on in his dramatic career that progressive cinema stood on the verge of atrophy via overly cerebral discourse (Godard and Pasolini) and increasingly esoteric forms (Antonioni and Resnais), Fassbinder took a prescient and decisive step back in the direction of a more universal film language. (Pasolini followed this move some years later, with his Trilogy of Life, only to feel betrayed by consumerist imitators and return to a more overtly radical cinema in his final epitaph.) And as our present-day cinema persists in reinforcing the divide between art and commerce, it’s a move that merits study and, quite possibly, repetition.

In order for the reader to fully appreciate the brilliance of The Marriage of Maria Braun, a preliminary viewing of Fassbinder’s recently re-discovered TV mini-series, Eight Hours Don’t Make a Day, may be in order. And considering how rarely this obscure work of Fassbinder’s has been screened outside of his homeland (up until this past year’s Arrow Home Video release), an international critical re-evaluation of Maria Braun—among other later works in the director’s filmography—could make for an interesting and illuminating dialogue. Frequently dismissed by contemporary critics as a self-hating homosexual pessimist, Fassbinder is seen at his most buoyant, hopeful, and resilient in Eight Hours. Even after one takes into account his proposed follow-up episodes to the five installments he produced, in which things were slated to take a darker (dare I say, more Fassbinderian) turn, it’s difficult to dismiss the radiant joy of Luise Ullrich’s Oma, Werner Finck’s Gregor, or Schygulla’s Marion. More than any of his other works—televisual or otherwise—Eight Hours Don’t Make a Day is the work of a filmmaker fully convinced of his characters’ ability to transcend the belittling dynamics of their circumstances.

In this context—and following a run of increasingly bleak portraits authored by the filmmaker mid-decade (Fox and His Friends; Fear of Fear; I Only Want You To Love Me; In a Year of 13 Moons)—The Marriage of Maria Braun presents one of the most resilient characters in the Fassbinder universe. Not unlike Erwin/Elvira before her (in 13 Moons), Maria Braun is a woman oppressed by generationally perpetuated societal constructs. But whereas the personal turmoil that lay beneath the surface of 13 Moons (whose premise was inspired by the suicide of Fassbinder’s lover, Armin Meier) contributed to an intensely impassioned work, in which the line separating individual villainy from broader mechanisms of oppression was often blurred beyond recognition, the formally mannered melodrama of Maria Braun allows for a more advanced level of intellectual and emotional clarity. Which isn’t to say that passion is withheld from Maria Braun; but instead of enmeshing his characters in the dark web of his own private passions and hang-ups, Fassbinder here permits the protagonist to revel in passions of her own. In this and other regards, The Marriage of Maria Braun can be read as the most explicitly feminist work he ever produced.


Maria Braun waits in the rubble of war-torn Germany for the unlikely return of her husband. © 1979, Westdeutscher Rundfunk.

Maria Braun is as passionate a character as you’re likely to find in the films of Fassbinder. Married to a German soldier (Hermann Braun, played by Klaus Löwitsch) for one day before he is sent off to fight in WWII, Maria spends the first half of the movie standing on the platform of the nearest train station, a sandwich board bearing her husband’s name and photo slung over her body, soliciting information from passersby who might have details of his death or survival. When she finally decides to throw in the towel, discarding the sign on the railway tracks and heading to the American G.I. bar for some action, no viewer can reasonably bring themselves to blame her for betraying her marital vows. (And a couple of scenes later, Maria is informed by a friend’s husband—just returned home from the front—that her beloved Hermann has, in fact, died in battle.) Confronted with the stark and powerful imagery of dilapidated streets and buildings—places where people once lived and raised families, turned to rubble by tanks and air raids—the viewer cannot help but recognize the tragicomic absurdity of Maria’s situation (let alone the absurdity of the institution of marriage, as perpetuated by the patriarchal lineage of Western lawmakers). This is the second major precipitating moment in the film’s comedic chain reaction—the first being its titular wedding.

Over the course of the picture’s two-hours-and-spare-change runtime—unfolding briskly and economically—Maria Braun finds herself (and her passions) repeatedly cornered by twists of fate that might never have occurred, if not for the man-made boundaries and expectations imposed upon her. First, she experiences forbidden love with a somewhat older, African-American G.I. (Bill, played gracefully by George Byrd), who teaches her English and loves her with an evident tenderness. Forsaken by the racist and ageist “civilization” by which she is surrounded, Maria is again befuddled when her thought-to-be-deceased husband returns home—alive and in one piece (apparently, her friend’s husband was privy to false information). True to comic form, his return coincides with a sequence of playful lovemaking between Maria and Bill. Pushing the absurdity of the scenario even further—until it reaches the fundamentally absurd parameters of credibility itself—Maria breaks a bottle over her lover’s head, knocking him dead to the bedroom floor. (Sped up and stripped of its synced sound and full-frontal nudity, it might’ve made for a memorable bit in a silent Chaplin comedy.)

Episodic dominoes continue to tumble, as Hermann chooses to take the blame for his wife’s crime, rather than endure a detailed spoken testimony from Maria on the subject of her inter-racial affair. With Hermann sentenced to an indefinite amount of time in prison, Maria finds herself back at home (“without a man”), with a yearning desire to make it up to her husband; a desire that is shown, in the unfolding drama, to be part social imposition, and part genuine passion (ultimately, is there a difference?). Studied dialecticians both, Schygulla and Fassbinder appear in this film to be more psychically and theoretically attuned than in any of their other collaborations—some of which were amateurish (Rio das Mortes, The Niklashausen Journey), most of which were good (Lili Marleen, Pioneers in Ingolstadt), and a number of which were spectacular (The Third Generation, Effi Briest, The Bitter Tears of Petra von Kant, and Eight Hours, to name but a few). But only in Maria Braun and Eight Hours do we find Schygulla and Fassbinder bending overtly towards love. For in both works, the creators seem to be banking—albeit obliquely; tongue occasionally planted in cheek—on the possibility of transcending the oppressive bullshit of humanity.


Irm Hermann (foreground) and Hanna Schygulla (background) work to transcend the bullshit of humanity in Eight Hours Don’t Make a Day. © 2017, Arrow Home Video.

We see it in the crucial penultimate scene, in which the will and testament of Schygulla’s deceased business partner (Karl Oswald, played by Ivan Desny) stipulates her as a shared beneficiary with her “wronged” husband. For while Oswald had only met Hermann once before (during a prison visit), he has projected his own unrequited love for Maria onto Hermann, subsequently choosing to subsidize Hermann’s existence upon his release. Recognizing in Maria’s husband a devotion that she, herself, reserves for Hermann in the final act (a devotion which will accompany her to the grave, unstated and unrewarded), Oswald appears driven by a combination of patriarchal impulses and personal pride to take Maria down a notch. And as they listen to his condemnation of Maria’s perceived coldness, the camera lingers on the couple’s faces (Maria’s in particular), revealing a shared response of sadness: sadness at the implication that one might have loved the other any more or less than they themselves were loved. (I feel compelled here to highlight the simple joy produced in this scene, as we are granted the frequently censored opportunity to watch a character think on-screen.) All told, Maria is shown to have been most persistently oppressed by a multitude of social institutions (including the very manner in which her husband feels compelled to express his “love”: possessively and apologetically). In this moment of clarity, we—viewers and protagonist alike—experience a genuine breakthrough; and while Maria is ultimately driven to despair by her circumstances (and by the life choices they have inspired), her inertia makes room for the viewer’s own emancipation.

Taken at face value, the ending to Maria Braun’s saga may seem an unwarranted afterthought. But in the realm of the Sirkian melodrama, nothing can be taken at face value—least of all the ending: for the more implausible and tacked-on the conclusion, the more urgent the viewer’s responsibility to read between the lines of its manufactured essence; to identify the reality beneath the facade—the truth that social convention will not allow to be spoken aloud in polite company (most commonly represented by the “happily ever after” motif, which masks the unlikeliness of utopia being achieved in real life). Under these conditions, the question then becomes: What truth is being withheld by the ending to Maria Braun? Multiple interpretations hold up to scrutiny; in this writer’s opinion, it is a somewhat shocking (considering the source) acceptance of the possibility that people might actually live happily ever after. At least, it’s as absurd—and therefore plausible—an outcome as the next.

Women like Maria Braun—who most certainly existed in the days of the Economic Miracle, by all historical (and hereditary) accounts—were precursors to the “women’s lib” and “free love” movements, and represent the more progressive side of re-education in the wake of WWII. This movement was prompted by a younger generation (the children of Maria Braun), compelled towards an understanding of the horror to which so many of their ancestors paid witness (and in which many were complicit), and by their longing for a world stripped of the factors that made the horrors of the Holocaust possible in the first place (namely: the dual evils of mass industry and mass ideology). The finest and most influential voices of this generation would go on to shape utopian cultural movements for decades to come.

For many of this new generation, Germany had re-experienced Year Zero. Things had to change; or else, what good were any of them? Radical trends, both superficial and profound, ensued in all areas of the New German youth culture. From music (Can, Kraftwerk, Cluster) to film (Wim Wenders, Werner Herzog, Volker Schlöndorff), from literature (Günter Grass and the Vergangenheitsbewältigung movement) to theater (Action-Theater and Anti-Theater, in both of which Fassbinder played a significant role), a sea change was palpably taking place. And as with most movements, politics would lag behind, eventually catching up out of necessity; for a government can only be as functional or as deplorable as the culture out of which it has formed. Indeed, there was plenty in the immediate post-war period that remained lamentable—both culturally and politically speaking. Inundated by so-called Schlager-rock, dumb B-movies masquerading as high art, and former Nazis being (re-)elected into office, the kids of the German New Wave collectively realized that something had to give before they could evolve as (a) people.


Legendary music innovators Can took Germany (and the world) by storm during the decade of the New German Cinema. Circa 1972; photo credit unknown.

This provocation for change would have its more violent expressions, such as the notorious Baader-Meinhof/RAF incidents and the emergence of neo-Nazi subcultures; but it would also yield such tender works of art as the film under discussion in this essay. A work that shares an equal love for mankind and womankind (and all in between), while simultaneously pointing to the oppressive mechanisms—instilled from one generation to the next; cycling through phases of industry, depression, and recovery—that render this utopian love such a challenging concept to maintain. After all, if it weren’t for war, Maria may never have thrown her husband’s picture onto the railroad tracks; or engaged in a romance with an American soldier; or adopted all the negative and aggressive (and predominantly male-generated) cultural traits of the corporate mentality. And if it weren’t for marriage, this whole soap opera wouldn’t even have existed.

While a casual survey might indicate a general distrust and disdain for the idea of anarchy, it is important for an interpreter of Fassbinder’s work to recognize that his is a romantic anarchy: meaning, an anarchy that accepts and embraces its own untenability, while refusing to hide or ignore the basic appeal of its tenets. His condemnation of social constructs stemmed from a genuine, dialectical longing to embrace the multitudinous forms of civilization; all the while dismantling the most rigid ideological molds, and making room for better (if not always entirely new) ideas to take center stage. For instance, this essay would argue that the central idea in The Marriage of Maria Braun is a belief in the unsung possibilities for people to love fully and unabashedly, free of obligation or socio-culturally imposed restraints. (Fassbinder would return to this thesis in Lola and Querelle, and reject it in his funereal-yet-magisterial opus Veronika Voss.)

Hanna Schygulla plays Maria Braun as a somewhat reserved small-town girl, who turns from a state of repression to a wild bout of hedonism, eventually settling for the upper-middle-class formula of the new German economy; a formula in which hedonism becomes a commodity, and relationships dissolve into missed connections. And while we all only have ourselves to blame for some things in life, Fassbinder (& Märthesheimer) and Schygulla proclaim here that sometimes, society needs to reorient itself in the mirror of its own history: to scrutinize the systems of its own disintegration, without pointing fingers or placing easy blame; and then, to actively decide upon the course of its own future. Instead of turning to despair, they employ the tools of film comedy (wit; mischief; crisis)—refined through the shiny machinery of Hollywood movie magic—to show that it’s all just a laugh, seen in the colorful stage-light of the American melodrama. And conversely, the laugh is on us, as storytellers, when we fail to account for this interpretation and start taking ourselves too seriously—or thinking too rigidly.


Left to right: Darius (Lakeith Stanfield), Alfred (Brian Tyree Henry), and Earn (Donald Glover) tap the more existential side of comedy in Atlanta. © 2018, FX Networks.

It would seem that now is the ideal time for an existential comedy of this nature. Maybe this is why so many Americans gravitate towards the novelty act of “comedy news shows”: a longing to find the humor in their situation; to either lighten the load of current events, or demystify the real struggle(s) of social progress. And while some of these programs may be satisfactory from a purely anecdotal standpoint, they tend to lose their universality and impact when they turn legitimate talking points into ideological wedges. Especially considering the unwanted (but entirely too real) threat of international cyberwarfare, we might well benefit from honing our models of universal communication and dialectical/critical thinking—rather than casting them aside in favor of jingoistic platitudes and passionate inaction. If for no other reason than that the humor that emerges from this climate of divisiveness is hardly ever humorous, nor does it serve the most noble purpose of comedy: that is, to bring people together in a shared understanding of their collective ridiculousness.

As far as this broader definition of existential comedy is concerned, this writer has been impressed by the cheeky work of writer/director Donald Glover in his original TV series, Atlanta (which manages, in its most brilliant and memorable episodes, to neatly extract the existential crisis at the core of its situational vignettes). The recently released The Death of Stalin—a theatrical film comedy by Armando Iannucci (In the Loop, Veep) about this very subject—alternates between moments of brilliant humor and morbid logic, though it occasionally seems overly aware of its own ominous timeliness. Greta Gerwig’s Lady Bird carries a distinct regional-and-therefore-universal flavor (as opposed to the more lamentable inverse), and Taika Waititi (What We Do in the Shadows) seems to be following in the humanist footsteps of Christopher Guest. By and large, however, American comedy appears to be adrift in a sea of ideological word-traps; monitored by cultural watchdogs who alternately attempt to foster a better society, or seek to contain that which they do not fully comprehend (and in many cases, a bit of both). Perhaps it is the existence of these very constraints that outlines the freedoms we find so appealing in comedy. Nonetheless, these constrictions have a way of asserting themselves possessively and repressively; dragging us back to primitive misunderstandings and oversimplifications, and enslaving us to a false notion of freedom that—while worded as a superficially different dogma within different social circles—is fundamentally redundant and divisive, and only serves to wreak havoc on our efforts to understand and to evolve.

Taken as a whole, it’s sort of hilarious.

* * *


Betti and Maria swap truistic insights from the corner of a house party. © 1979, Westdeutscher Rundfunk.

In a particularly memorable sequence, midway through The Marriage of Maria Braun, the protagonist finds herself one of only three passengers in the first-class car of a train bound for Berlin. While trying to win the affections of a wealthy businessman (and in need of a source of income, with her husband recently imprisoned), Maria is approached by the third passenger—a rowdy and lewd G.I. under the influence (played by Fassbinder’s on-screen crush, Günther Kaufmann), who appears convinced that Maria is a sex worker. The scene builds uncomfortably at first, as the dual themes of prostitution (selling one’s body to be part of a man’s business, vs. making one’s body a business) are brought into full relief, against our protagonist’s most noble intentions; but the tension breaks, as Schygulla pops off on the G.I. in filthy but grammatically coherent American slang—picked up from her previous affair with Bill. The viewer’s sympathies are then inspired to switch to the bright-eyed Kaufmann, who looks somewhat intimidated and offended by Maria’s words—before confidently offering Frau Braun a military salute, addressing her as his superior.

The flip-flopping of power dynamics that permeates the middle section of Fassbinder’s masterpiece serves to define Maria’s trajectory in epic narrative terms. Only instead of a “great white man” at the center of a white man’s narrative, we find a woman in command of her own narrative; collaborating with a mix of creative individuals from different backgrounds and ideologies, and confident in her own POV—unafraid of getting lost in the shuffle. It’s the portrait of a woman inspired by the power of love, the quest for fulfillment, and the possibility of redemption in untold places. When one takes into account the remainder of the film’s character cast, one finds a range of different individuals, with different and entirely credible perspectives that conflict with or concede to one another (particularly endearing are Maria’s mother, played by Gisela Uhlen, and her girlfriend Betti, played by Elisabeth Trissenaar). They all demonstrate the capacity for a transcendent love, but only some manage to shatter the barriers of social oppression; and of those, only some manage to maintain their radical perspective (while others, like Maria, drift away on an ocean of creature comforts). As certain critics interpreted her upon the film’s initial release, Maria was an allegory for post-Weimar Germany: “a character, that wears flashy and expensive clothes, but has lost her soul.”

Ultimately, Fassbinder and Schygulla seem to love all their characters in equal measure. They seem to be inviting us to love ourselves a little better: to demonstrate our self-love by actively confronting our surroundings, and dismantling the mechanisms of our own oppression (without exchanging them for a different set of chains). They seem here to remind us that ideas are great, but ideologies are tiresome. That we can get more done by just spelling the problem out—ensuring we all share in a deeper understanding of the human condition—instead of operating from a private assumption of how things work and how we ought to fix them. They seem to be telling us that if we truly understand, we’ll be able to laugh about it; and if we can laugh about it, we might be able to really do something about it. Because when we’re allowed access to this universal laugh (a laughter that bravely confronts the darkness, rather than riding along with it), the darkness is no longer too frightening to bear. And once fear is removed from the equation, the soul can begin to breathe again.


“Everybody looks so ill at ease
So distrustful, so displeased
Running down the table
I see a borderline
Like a barbed wire fence
Strung tight, strung tense
Prickling with pretense
A borderline”

So sang Joni Mitchell in one of her finest and most incisive songs—”Borderline,” from her mid-’90s (quasi-)masterpiece, Turbulent Indigo. In a subsequent verse, the artist paints a vivid portrait of those who “praise barbarity / in this illusory place / this scared, hard-edged rat-race.” In closing, she gracefully dismisses as futile every ideology mapped throughout the song, stating casually and assuredly that: “All you deface, all you defend / Is just a borderline.”


While there are many songs in the Joni canon that can repeatedly bring me to my knees or force me to eat crow, “Borderline” presents a rather singular fait accompli: the entirety of human ideology crumbles under the weight of this simple yet elegant text—including every indulgence in such borderline surveillance that Yours Truly has ever been myopic enough to commit to writing.

I can no longer count on the fingers of my two hands the number of times in recent history that I’ve repeated a particular platitude: “we are living in strange times.” Every time I repeat the phrase, I seem to betray a quasi-mystical hope—hope that some rational explanation for our circumstances might be drawn from the ether of such banal truisms. Perhaps the time has come for us to throw in the towel on searching for any explanation to any of this madness. Or perhaps, it is necessary for us to re-engage the power of the mind, and step outside the now-driven-into-the-ground parameters defining our socio-cultural dialogue. To examine this battleground from a different perspective altogether; to question the very validity of our failed parameters, and try—for a change—to actually understand where we are (vs. insisting on getting to where we want to be, as quickly as possible).


David Bowie, standing by the Wall in Berlin; circa 1987, Glass Spider Tour. On the day after his passing, the German Foreign Office responded to the news by thanking him for helping to bring the wall down.

In my most recent post, I reflected upon the ongoing relevance of David Bowie’s penultimate studio album, The Next Day. The post was titled after the album’s surprising lead single, “Where Are We Now?”—in which the artist reflects upon his days spent living in West Berlin, during the late 1970s, recording a trilogy of monumentally influential albums with Brian Eno and Tony Visconti (as well as producing and co-writing a pair of equally significant Iggy Pop solo albums). The visually startling music video produced to accompany this single, directed by installation artist Tony Oursler, presents a static portrait of the artist’s studio life during this time: “sitting in the Dschungel;” “walking the dead,” waiting for a train. Though I’ll admit to being mildly perplexed by the song (and its choice for lead single) at the time of its original release, I can hardly think of a more timely artistic expression of what it feels like to live in 45’s America. The feeling of being frozen in time—unable to move forward or back; waiting at a terminal for a train that may never arrive, but unwilling to step outside the station for fear of the horrors that surround you on all sides (not unlike the oppressive weight of the Berlin Wall—yet another futile borderline).

In such a climate, reflecting on the past will continue to prove a necessary task for planting the seeds of a better future. For there remain clues scattered throughout our history, which may well provide us with the guidance needed to prevent this uncertain future from becoming an endless, Nietzschean reiteration of our pre-apocalyptic present.

* * *

I was struck with the inspiration to pen this entry after reading a noteworthy academic journal article by Ringo Ossewaarde (a professor of sociology at the University of Twente, in the Netherlands). I stumbled upon the piece while searching for writings on dialectical reasoning in the 21st century, and I advise the reader to set aside the time needed to digest it in its entirety. If you only have time to read one long think piece today, please close out of this post and give Mr. Ossewaarde precedence; for while I may not personally deem some of the ideological fears he ruminates on quite as severe as he deems them to be (and while others, like the resurgence of fascism, seem more pertinent to our present-day situation), I’ve rarely read a piece of philosophical inquisitiveness as pointed, engaging, and nuanced as this one.

In struggling to make sense of the messy political conditions we presently find ourselves in, it dawned on me that many of the conversations taking place on the national front seem to suffer from an abuse of classical debate models. Rather than originating from a place of reason, observation, and friendly dispute, most of our conversations on contemporary subjects seem to originate from a position of deliberate antagonism. An eristic position, as opposed to a truth-seeking position. With antagonism as the norm, it ought to come as no surprise when individuals who have adopted a truth-seeking outlook are misunderstood by their detractors, and taken down a notch for daring to seek the most truthful common thread—as opposed to indulging in the more lucrative activity of professional hair-splitting. Ossewaarde captures this emergent dichotomy with great aptitude and precision:

“In Plato’s The Sophist (226a), Socrates identifies the Sophists with ‘the money making species’, thereby asserting that they do not dispute for the sake of the search for truth but instead engage in the dispute as professionals, to articulate their own truth claims for a reward or as a job. In other words, the dialectic turns eristic when friends come to depend (for their rewards) on their own truth claims, so that they become unwilling to negate their initial views (negation would make them lose their rewards). Since victory and not truth is the ultimate goal of the eristic discussion, the Sophists rarely change direction and hence are incapable of progressing towards truth.”

And with one paragraph, Ossewaarde successfully outlines the disease that prevents us, as a society, from making any progress towards alleviating the animosity of our conversation. And while specific changes in policy (such as the FCC’s repeal of the Fairness Doctrine under the Reagan administration—which effectively opened the doors for our ratings-and-sensationalism-driven news networks, and their rosters of theatrically impassioned talking heads who make a fortune by not bending to reason) have no doubt exacerbated our situation, it seems our culture—along with the cultures of our European forebears—had been drifting away from truth-driven dialectical reasoning for decades prior to any such official policy change.


Raphael’s The School of Athens, located in the Vatican, portrays the commingling of sophists and philosophers in a polis setting.

And while Ossewaarde’s essay singles out the dominant ideology of popular liberalism as one of the most toxic and limiting ideologies currently in vogue, one could double down on the same charges as they relate to neo-conservatism (beginning with the Nixonian era of American politics). Thus, the greater significance of this essay seems, to me, to lie in a broader awareness that “ideology is a disease of the mind.” A statement which, bold as it may seem, I find myself hesitant to counter. For have we not seen the rise and downfall of nearly every ideology known to man? And have we not witnessed the devastating effects such trends seem to have on the development of the human consciousness? From cults to religions; from feudalism to colonialism; from communism to fascism. Arguably, socialism provides the only notable exception to this rather overarching rule. And one could argue that this is due to socialism itself being rooted in liberalism—still an overriding force in global culture, precisely because it is the most rational of all the “ism”s currently on the table.

To clarify: Ossewaarde’s critique of popular liberalism (which incorporates multiple insights from thinkers like Mannheim and Mills) is not, explicitly, a critique of the founding principles upon which liberalism has been built. Rather, his critique aims to clarify how popular liberalism has managed to take noble concepts and distort them in such a way that has actually proven detrimental to their advancement. Take, for instance, three ideas central to the liberal ethos: civil rights, gun control, and public services (health, education, and unemployment benefits). On all three issues, liberals generally hold a more rational stance than their conservative counterparts—though fortunately, some conservatives seem to be shifting towards the light. But if one were to accept Ossewaarde’s critique of liberalism (as a positivist ideology), one would have to acknowledge that we might have found a better way to convey the truthfulness of the liberal position; at least, something above the blowhard tactics of a Chris Matthews or a Bernie Sanders.

Similarly, this critique holds up when one considers the splintering of liberals into increasingly small subsections: the pitting of one puritanical ideologue against another, perceived-to-be-less-than; resulting in a climate wherein a liberal feminist (HRC) who had engaged in some misguided commentary (much of it having been prepared by male advisors surrounding her) and used an insecure email server, could be deemed—by some—a greater threat to progressive talking points than a populist demagogue with a mafioso predisposition, actively espousing anti-progressive rhetoric, and willingly adding fuel to a raging fire of xenophobic sentiment. Whereas one individual may see the forest for the trees, another may only see the branches that don’t align with their personal vision of the forest. And it is this very failure to reconcile reality as-is with one’s personal interpretation of how reality ought to be, that results in the “mind that can no longer think well” (Ossewaarde, p. 408). (Just as this writer still finds himself struggling, on occasion, with the reality that 45 is still the President of this country, despite every indicator that he oughtn’t be at this time; and despite every bone in my body recoiling at every idiotic gesture of his idiotic and actively oppressive regime. There have been times, no doubt, that this mind has not been able to think well about all this; which, to some conspiracy theorists, could be deemed another objective of this administration’s strategy to divide, conquer, and deflect attention away from the puppet-masters.)

In turn, Ossewaarde succeeds in dismantling the most popular utopian “distortion” of reality perpetuated throughout the annals of sociological philosophy. Marxism is therefore seen as: “an eristic [read: an argument that aims to successfully dispute another’s argument, rather than searching for truth] pathology to dialectical sociology… Not only does Marxism make an illegitimate use of the dialectical method, but, in that use, it theorizes the historically determined transcendence of the contradiction in an historically fixated end state – the classless society.” It is, like all other utopias, a fantasy that cannot and will never be achieved; because the dialectic it seeks to suppress is, in this writer’s humble opinion at least, inherent to human nature itself.


Oprah Winfrey hosts a roundtable reunion of panelists who were first interviewed eight months into 45’s term as president. © 2018, CBS News.

For evidence of the dialectical urge in human nature, one need look no further than the number of U.S. voters who have lent their full support to the most mentally unstable president we may have ever entertained (or been shamefully entertained by). In televised interviews with some of the voters in question—including a recent Oprah panel reunion on 60 Minutes—one finds that these individuals are not, as is often portrayed by voices on the left, explicitly bigoted lunatics who wouldn’t know a book if it hit them in the face (though, to be certain, such people exist also). Rather, they seem to be predominantly marked by a quality of anti-liberalism; which is different from anti-intellectualism (though occasionally commingled), because its antagonism lies most heavily within the notion of a truth being thrust upon them with no identifiable choice—versus being engaged in a rational dialogue that might enable independent acknowledgement of evidence supporting the correctness of liberal views (which, admittedly, some people will refuse to acknowledge even after a Socratic tutorial). Sometimes, this desire for active engagement manifests itself intentionally (i.e. “I can’t get past the way liberals talk”); other times, with little to no awareness of this desire even existing. In viewing the above-mentioned 60 Minutes piece, one may well note that the liberal-leaning panelists in this segment rarely succeed in effectively conveying the objective facts supporting their views. Rather, their attempts to relay the righteousness of their perspective are more commonly rooted in an observable emotion, which (to someone who might not share in the emotion, having not yet grasped the information which provoked it) generally serves to cloud the objective merit of the information being conveyed.

At the close of Ossewaarde’s thought-provoking commentary, the writer asserts that the greatest hope for a more radical sociology lies in the pursuit of a modern-day Socratic polis (publics)—sans slavery, of course—“in which the paideia – high culture – is continued through radical sociology.” This requires a separation from the elitist (and racist) mindset that underwrote the Greek polis, and an active process of adaptation to the circumstances of 21st century life within society. Most essentially, such an ideal can only be met if, and when, radical sociology succeeds in implanting itself within the machine of global capitalism. As explained by Michael Burawoy (a public sociologist who is currently attempting to reconcile an interdisciplinary range of sciences with capitalist enterprise): “globalization is wreaking havoc with sociology’s basic unit of analyses – the nation-state – while compelling deparochialization of our discipline.”

In other words, having altered the basic unit upon which the study of sociology was established, the pursuit of a globalized industrial complex (and its residual effect of a globalized culture) has enabled powers on both the left and the right to call into question the very usefulness of a dialectical sociology (while, curiously, they still refuse to join forces in a single party of globalized fascism; for this is what modern life would resemble, if stripped entirely of dialectical thought. The debate perseveres; which means that some thought remains, however distorted it may have become). An inverse dilemma arises from this rejection of the dialectic: for if sociology has provided the backbone to social progress over the past millennia, can social progress stand a chance when sociology is removed from its original role in the conversation? (And where might the arts, as we know and love them, find a more relevant place in this conversation?)

Burawoy and, to a lesser extent, Ossewaarde both seem cautiously optimistic about our chances for transcending the “ism”s of our time. Ossewaarde writes: “Since the positivist sociologies no longer serve the victorious, radical sociologists should no longer aim at negating positivism (Burawoy 2005a: 261, 266). Liberalism is no longer the foe of radical sociology. In the current crises of global capitalism, the very possibility of Burawoy’s public sociology and its resistance to global capitalism depends on a co-operative partnership with the positivist sociologies. This partnership is not a dialogue or friendly dispute between sociologists or scientists in general. Instead, it is a new deal in which positivism provides sociology with the legitimacy of a scientific discipline, while public sociology makes sociological knowledge accessible to democratic citizens, enabling them to make public issues out of their private troubles [emph. added].”

This may not seem like utopia to the reader, but it may well be the closest approximation we can realistically hope for.

* * *

Where is hope
While you’re wondering what went wrong?
Why give me light and then this dark without a dawn?
Show your face!
Help me understand!
What is the reason for your heavy hand?
Was it the sins of my youth?
What have I done to you?
That you make everything I dread and everything I fear
Come true?

Following the heavy blow of “Borderline” and the softer darkness of “Yvette in English,” Joni’s Turbulent Indigo closes with her magisterial “The Sire of Sorrow (Job’s Sad Song)”—in which the singer boldly re-appropriates the narrative of Job from the perspective of a woman (which, one supposes, is no more bold a gesture than that time she re-appropriated Yeats’s “Second Coming,” for her marvelous and profoundly underrated Night Ride Home album). It is a song—and a record—for our times; with its aching plea for an end to the never-ending reaches of trauma, and its desperate yearning for some hopeful resolve. Juxtaposed against the radical sociology of Ossewaarde and Burawoy, Joni’s music provides the inevitable counterweight to pure objectivity: pains for which no cold, objectively delivered explanation will suffice; horrors which no reasoned debate can rescind. The pure subjectivity of human trauma and lived experience, seeking resiliency through artistic acumen.


As I write this entry, I find myself simultaneously reading about the role that Facebook has played in the escalation of the ethnic cleansing of the Rohingya people, underway in Myanmar. A report on the findings of a recent UN study, exploring the factors that have contributed to this genocide, indicates that: “[Facebook] has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly, of course, a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.” I recall the immediate aftershocks of 45’s election and subsequent inauguration; aftershocks which included a spike in U.S. hate crime, as well as the broadening revelation that Russian-funded social media propaganda had played a significant role in furthering homegrown acrimony. I think of how all these factors might’ve played out differently, had our culture not drifted so far afield from dialectical reasoning. Had radical sociologists played a more significant role in the evolution of social media technology, or had Ted Nelson been successful in launching his alternative proposal to the World Wide Web model (Project Xanadu). It is alarming, saddening, and somewhat humbling to see how a single century of human history can yield so many shortsighted turns; giving way to a chain reaction of negative consequences—some predicted, many unforeseen—and leading us to our present circumstances.


Ted Nelson, interviewed in the 2016 Werner Herzog documentary on the history of the internet, Lo and Behold: Reveries of the Connected World. © 2016, Magnolia Pictures.

I cannot seem to erase this longing, at the very core of my being, to (in New Testament fashion) turn over the tables in the marketplace of our status quo, and insist upon a return to some form of reasonable containment for dialectical contradictions. And while I am reluctant to overestimate the possibilities for art bringing about such a revolutionary change, I continue to be inspired by the surrealist ethos: to create such a shock to the nervous system of the established order, that it cannot help but question the sustainability of its own terms. As art continues being co-opted by the contemporary positivist movement, which seems intent on reforming the arts in an a priori manner (so as to make them redundant), perhaps that counter-reaction to the institutionalization of bourgeois sensibility—once referred to as the nouvelle vague—may have its day in court, once more.

For it is not a question of whether the dialectic will be recovered, or whether humankind will awaken to the benefits of its implementation in society. The dialectic is. If we do not take the necessary steps to accommodate its existence within the fabric of our reality, it will simply continue rearing its stubborn head in ways that further destabilize and undo the best laid plans of mice and men. Might we make room for it, instead?

Looking back on The Next Day, five years after.

“Where are we now?
Where are we now?
The moment you know
You know, you know”

It appears as though the work of David Bowie is only going to swell in significance as the years progress. The year is now 2018, and I’ve found myself drawn to his music as much as (perhaps more than) ever. As its five-year anniversary was fast approaching, I chose to stroll down the memory lane of 2013’s The Next Day—just a few days prior to Valentine’s Day; its songs still ringing in my ears when word of the Parkland shooting hit my news feed. “The rhythm of the crowd / Teddy and Judy down / Valentine sees it all…” I still shudder to think of the horrific events of that day, and every other day I’ve spent in this country learning of children slain in a schoolhouse. I am repelled by the terribly distant (yet still terrible) possibility that the person responsible for this most recent tragedy was, in any way, inspired by a song.


In reflecting upon this eerie bit of synchronicity, I found myself thinking of an anecdote shared by Robert Altman, in his commentary track for the comparably prescient film masterpiece, Nashville. Altman recounts having received a phone call from a Washington Post journalist, following the assassination of John Lennon—inquiring whether the director felt any responsibility for that terrible event of December 8th, 1980. Altman reports he was flabbergasted by the question, and the unnamed journalist clarified his line of questioning as a reference to the tragic culmination of Nashville: the first pop culture narrative to propose the possibility of a pop musician being assassinated—without any immediately recognizable motive, even. In his typically smug manner, Altman dismissed the query with a question of his own: “Why didn’t you heed my warning?”

When one considers the impossibility of calculating the value of a person’s life—much less, one’s premature death at the hands of a violent assassin—such questions are utterly irrelevant. The journalist who had the audacity to blame an artist for the devastating actions of a self-proclaimed born-again Christian (Mark David Chapman—who, in another strange bit of synchronicity, had previously considered David Bowie a possible target) betrayed as much futility in his line of questioning as Altman did in his retort. For how can a society—any society—effectively prevent the emergence of such sociopathic tendencies? Surely, legislative action can be taken to decrease the ease with which individuals access lethal weaponry for acting upon these tendencies; but if the tendencies remain, is it enough?


David Hayward plays Kenny Fraiser in Robert Altman’s Nashville—a murderous face in the crowd. © 1975, Paramount Pictures.

I’ve meditated upon a similar line of thought, in light of the 21st century civil rights movement: our evolution from a kaleidoscopically splintered society with a dominant white male culture, to a more broadly integrated society—bolstered by an emerging, diversely amalgamated mainstream. Whereas the divisive rhetoric and violent repercussions of such an amalgamation come as no surprise to this writer, they present a rather apparent obstacle to the notion of cultural integration: How can we achieve a semblance of unity, when the very notion is perceived by so many Americans (certain minorities included) as abhorrent, or somehow intimidating? And will the shifting of power from one identity demographic to another yield the sort of positive cultural changes that have been forecast by many a liberal optimist—or might it eclipse the more noble intentions of movements initiated within minority groups, once all are able to rest at ease on the laurels of economic power?

Is this the more profound reason behind the refusal of a majority of white American women to elect our first woman President: a subconscious fear that she might signal the end of a more radical feminism, instead perpetuating the already-established aura of centrist pragmatism? While these very words (“centrist” and “pragmatism”) have never struck me as particularly offensive, it does seem that many are put off by the notion of common ground. Perhaps some of these individuals perceive the long-standing tensions of identity-driven antagonism as a more fertile soil, in and of itself, for a more radical politics. (A deceptively shortsighted interpretation, as far as this writer is concerned; but an interpretation, nonetheless.)

I am here reminded of the final interview given by the radical Italian artist, Pier Paolo Pasolini, whose work I find myself returning to on a fairly frequent and compulsive basis. In this interview with Furio Colombo (published on November 8, 1975), Pasolini outlined his rather intricate philosophy of life in society using, arguably, the most simple (and possibly oversimplified) terms of his entire career:

“I miss the poor and genuine people who fought to abolish their master without turning into him. Since they were excluded from everything, nobody had managed to colonise them. I’m scared of these slaves in revolt because they behave exactly like their plunderers, desiring everything and wanting everything at any price. This dark obstination leading to total violence is not letting us see who we are. Whoever is taken dying to the hospital is more interested—if there is still some life left in them—in hearing what the doctors will tell them about their chances to live, than in what the police will tell them about the dynamics of the attempted murder perpetrated against them. I’m not putting intentions on trial and I’m not interested in the cause-effect chain, or in spotting who did this or that first and who is the guilty head of the gang […] If we have reached this point I would like to add let’s not waste time to label things, but let’s see how we can let water drain away before we drown.”


Pasolini, by Ernest Pignon-Ernest, 2015.

Not unlike Bowie’s, Pasolini’s work appears to become increasingly relevant with each passing day, and his words sound (to me) increasingly timely. For we are clearly adrift in the murky waters of the 21st century, and the risk of drowning is ever-present—both in literal and metaphorical terms. For as the threat of climate change advances, unfettered and unrestrained by our nation’s near-sighted economic stakeholders, we find ourselves drifting around in ever-smaller circles of us-vs-them rhetoric: cutting down as many crooked branches as we can single out and incriminate, until there is barely any forest left to inspire us. (And all the while, the waters keep rising…)

I fear the reader may take the message of this essay to imply a rather pessimistic view of our future. While I cannot rule out the possibility of a violent end to the experiment of global economics—and while recently published photos of a convocation ceremony for an AR-15 assault rifle (hosted by the curiously named World Peace and Unification Sanctuary in Newfoundland, PA) bear a rather uncanny resemblance to images from Pasolini’s hopeless critique of Western civilization (Salò)—I find myself increasingly drawn to the distant glow of hope. For all is not lost; at least, not yet. There are individuals among us who have dedicated, and continue dedicating, their life’s work to strengthening their communities and projecting goodness into the world; fostering the tenets of goodwill, service unto others, and an evermore precisely defined, optimistic view of the human potential.

Culturally speaking, I’ve found myself rejuvenated by Tracey Thorn’s latest solo album—Record. A straightforward, unabashed celebration of the feminist ethos and the power of shared experience, the songs on Record glisten with a wise, genuine optimism: a welcome antidote to the more heartlessly commercialized (and selfishly sensationalized) manifestations of liberal thought in the 21st century. Apart from the empowering anthem, “Sister”—in which the singer/songwriter assuredly and poetically states: “Oh little man, you’re such a baby / Put up your fists, nobody ever loved / Someone they were afraid of“—Thorn’s latest offering forgoes confrontational force. Instead, the songwriter finds power in the celebration of small joys—alternatives to the horrors of the big, scary picture which our society currently represents.


Tracey Thorn offers up rays of light and hope on her latest full-length studio outing, Record. © 2018, Merge Records.

Ranging in topic from the pursuit of romantic fulfillment, to the challenge of conforming to gender norms/expectations, to the bittersweet experience of watching one’s child emerge into their own person, to the joy of taking one’s sorrows onto the “Dancefloor” and casting them to the four winds (“Play me ‘Good Times’ / ‘Shame’ / ‘Golden Years’ / And let the music play“), Record repeatedly finds solace and hope in this wisdom: that there is much more uniting us than our sensationalism-driven media would have us believe. And considering how well-received the album has already proven, this is a wisdom that people may be thirsting for.

I’m not sure whether Pasolini would agree with this assessment (and I don’t especially care to verify; for as brilliant as he undoubtedly was, Pasolini was a man as flawed in his thinking as the next), but artists like Tracey Thorn—or Agnès Varda; Alison Moyet; Kate Bush; Mavis Staples; Wim Wenders; Todd Haynes; Richard Linklater; Wong Kar-Wai; Barry Jenkins; the list goes on…—represent, to me, this very notion of “fighting to abolish the master without turning into him.” They have each made the significant realization that the master is not the caricatured villain of “Brecht’s beautiful world” (to quote Pier Paolo once more). It is not—at least, not explicitly—45, or Putin, or Kim Jong-Un. Rather, “the master” is the oppressive cloud hovering above our respective pursuits of self-actualization: the negative forces, both external and internal, which collectively obscure our pursuit of happiness.

* * *

We are each of us presented, at some point in our lives, with a choice between leading a truthful existence, or giving in to the corrupting, enticing vices of power. In what is arguably the most noble of all vocations (that of the artist), this enticement presents itself most prominently in the form of one’s own ego. We see its corrupting influence in the more indulgent works of certain filmmakers and writers, or the vain posturing of many a pop singer/superstar. Artists who place the power of their own personality before the virtue of humility, a prerequisite for speaking truth. Perhaps it is that we too often misplace the Aristotelian definition of art: the realization, in external form, of a fundamentally true idea. The thoughtfully sculpted marriage of form and content—liberating the spectator from the suffocating constraints of social norms and taboos, and facilitating our access to truths that are routinely prohibited, suppressed, or distorted by these constraints.

“Here I am / Not quite dying
My body left to rot in a hollow tree
Its branches throwing shadows / On the gallows for me
And the next day
And the next
And another…”

Which brings me back to The Next Day—and its phenomenal title track; still simmering with all the rage of Dylan Thomas, or the beautifully obscene poetry of Rimbaud, five years on from its initial release. Following on the heels of “Where Are We Now?” and “The Stars (Are Out Tonight),” “The Next Day” was released as a multi-format single on June 17th, 2013; including a square 7″ record and a Pasolini-inspired video. Directed by Floria Sigismondi (also responsible for “The Stars” music video), with Gary Oldman and Marion Cotillard cast (respectively) as a reactionary zealot and a stigmata-struck saint—and with Bowie assuming the role of the rebel Christ-figure, obviously—“The Next Day” provoked a fairly impassioned response. Openly condemned by the Catholic League and the former Archbishop of Canterbury, and removed from YouTube just two hours following its debut after reports of inappropriate content, the video (and the song) demonstrated that it was still possible to shock people; despite the fact that there is little left to be shocked by in Western civilization, and even though the scandal was quickly replaced by the next piece of contemporary tabloid journalism.


Left to right: Tilda Swinton, Floria Sigismondi, and David Bowie on the set of “The Stars (Are Out Tonight).” Sigismondi was responsible for two of the most memorable music videos in one of the most memorable video anthologies a pop artist has ever produced. Courtesy of the artist’s official website.

Perhaps this is the very meaning of the song: the artist’s insight that we seem to advance, as a society, through a redundant series of primitive motions and corrupt gestures; repeating the same mistakes and miracles from one day to the next (and another…). As though the entirety of human history could be condensed into a single, reflexive ritual: the ritual of human dogma attempting (and failing) to conquer the temptations of hedonism and mystery (And the priest stiff in hate now demanding fun begin). Consider some of the song’s most intensely visceral, explicitly ceremonial lyrics:

“First they give you everything that you want
Then they take back everything that you have
They live upon their feet and they die upon their knees
They can work with satan while they dress like the saints
They know God exists for the devil told them so
They scream my name aloud
Down into the well below”

The religious imagery of the piece—and more specifically, the poetic tone with which this imagery is delivered—appears to borrow directly from Pasolini’s painterly, blasphemous, often trance-like interpretations of ancient myths. Many of Sigismondi’s set-ups, though executed with smartly calculated Steadicam moves (and other cinematographic devices more advanced than those available to Pasolini in his time), echo the frontal, two-dimensional approach of the Italian filmmaker’s Trilogy of Life. Likewise, Bowie’s own phrasings bear a strong resemblance to some of Pasolini’s later poetry and prose—to say nothing of the correlations in subject matter. And if The Next Day was to Bowie what Trilogy of Life was to Pasolini (considering both works were completed within the five years preceding each artist’s passing), then Blackstar can be seen as a distant parallel to Salò: both masterpieces of indescribable precision and prescience; both fully realized and self-contained coffins, incapable of letting anything else in, or giving anything else away.


Bowie offers the world one final formulation of truth, as explored in Francis Whately’s documentary David Bowie: The Last Five Years. © 2017, HBO Documentaries.

Such is the culmination of a great artist’s trajectory. And in between “The Laughing Gnome” and “Button Eyes,” all variety of characters (with their variety of faces) came and went. One minute he was a rock-enamored alien from Mars; the next minute, he was “Halloween Jack.” One day he was a Thin White Duke, and the next, a golden-haired opportunist. (And the next day, and the next…) In this regard, two artists as superficially dissimilar as David Bowie and Tracey Thorn (or Pasolini and Altman) separate themselves from the crowd in equal measure: not so much by refusing to conform, but by epitomizing what it means to exist and to embody (rather than to blend). In this regard, these artists have all succeeded in truthfully representing, throughout their life’s respective works, what it really means to be a face in the crowd.

Take, for instance, the recently aired BBC/HBO documentary, David Bowie: The Last Five Years. At the culmination of his simple and deeply moving film, director Francis Whately chooses to showcase live footage of the crowd assembled around Bowie’s birthplace, after the announcement of his passing. The film refuses to sentimentalize the footage through editorial trickery; the footage speaks for itself, and it is allowed to roll unfettered by schmaltzy scoring or slow-motion effects. As the camera passes over each face, painted with a lightning bolt or a flourish of glamorous makeup, the viewer is instantly made aware of the universal relevance—and relatability—of Bowie’s work. Like many artists, Bowie began his career on rather inauspicious terms, with works that betrayed a superficial drive for commercial success and recognition. By the time of his final masterpiece, Blackstar (whose title welcomes a variety of interpretations, but most directly seems to echo a lesser-known Elvis song, “Black Star,” released in 1960; which would make it a plausible distant relative to Scott Walker’s 2006 Elvis tribute, “Jesse”), Bowie had accumulated all the wealth and recognition an artist could possibly hope for—but his focus remained unerringly on the service of his craft: the need to make one final, truthful gesture, before moving on to the other side.


Bowie summons the ghost of Elvis on the title track, “Blackstar,” a title previously used by The King for a song recorded during his sessions for the 1960 Don Siegel picture, Flaming Star, in which he played the lead role. Photo taken at a performance in Tupelo, Mississippi, on September 26, 1956. © 1978 Roger Marshutz

Likewise, the songs of Tracey Thorn linger with me most endearingly. They present a truth that is unt(a)inted by the self-aggrandizing, self-martyring tones with which too many words have been shouted into too many microphones. They empower without belittling; inspire without condemning. They remind us that art is for everyone: it is not an elitist exercise, or a purely cerebral experience. Nonetheless, art demands a baseline of cognitive and/or spiritual engagement from the audience; a caveat which I fear gets lost in translation, when fledgling artists attempt to force an agenda into the mainstream (after all, agendas can only serve to preclude an audience’s engagement; and truth itself is never in want of an agenda).

I remain skeptical (at best) about this recent thirst to excommunicate artists who have led problematic lives: to dismount their work from the walls of museums, or disregard a lifetime of achievement because of a single accusation (If things aren’t suited / Then they’ll get diluted). For if the purpose of art has been to confront one’s own oppressive “master,” and emerge on the other side with a truthful resolve, it would follow that art has served as one of the most effective therapeutic devices for troubled souls to connect with the rest of the human race. (Consider the life and work of de Sade and Genet, if the reader is in doubt as to the veracity of this statement.) I worry that this latest strain of anti-intellectualism, veiled by dubiously righteous intentions to “purge” criminal—and perceived-to-be-criminal—artists, will merely discourage troubled individuals (like these young men driven to slaughter their schoolmates) from connecting with a viable alternative to violence and fascism. And if we are not successful at providing an alternative for those who are lost and disoriented in the back-channels of society, we are all guilty of negligence: of letting the water drown us out, while we stand in judgment of “who did this or that first and who is the guilty head of the gang.”

* * *

As dour as some of these affairs may seem to the reader, and as jarring as the following remark may come across, I presently feel a tremendous pull to believe in the general decency of humankind. The politics of our time are surely as toxic as they have ever been, but this has only rendered the search for reasons to be cheerful increasingly imperative. Put simply, one can no longer afford the luxury of lingering in the debris of a demolished civilization. One can only put forth a daily effort to start anew, with the acquired wisdom of our past failed experiments as a guiding light for what not to repeat.


“In any society, the artist has a responsibility. His effect is certainly limited and a writer or painter cannot change the world. But they can keep an essential margin of nonconformity alive. Thanks to them the powerful can never affirm that everyone agrees with their acts. That small difference is very important. When power feels itself totally justified and approved, it immediately destroys whatever freedoms we have left, and that is fascism… The final sense of my films is this: to repeat, over and over again, in case anyone forgets it or believes the contrary, that we do not live in the best of all possible worlds.”
– Luis Buñuel (from the critical essay, “The Discreet Charm of Luis Buñuel,” as translated in the English text The World of Luis Buñuel: Essays in Criticism)

“And I tell myself, I don’t know who I am
And I tell myself, I don’t know who I am
My father ran the prison
My father ran the prison
But I am a seer, I am a liar
I am a seer, but I am a liar
My father ran the prison
My father ran the prison”
– David Bowie (from “Heat“)

in the Home of the Brave.

“There’s no such thing as love, only proof of love.”
– Jean Cocteau


Is there such a thing as cinema? Do the images that flicker for us on that big screen—paired with foley effects, synced dialogue, and original scoring—compose something tangible and identifiable? Or is it all an illusion; a reproduction of a dream (that most intangible and abstract concoction of all)? More pressingly: is there still a place for cinema, in the age of social media (with its foremost byproducts: outrage and attention deficits), online dating, and reality TV presidents?

It’s a question that has been swirling around the toilet bowl of movie nerd-dom for several years now—fielded primarily by a circuit of twenty-something film school brats (I use the term endearingly; they all appear to be gainfully employed at IndieWire now, so it would seem they’ve landed on their feet), adjusting their glasses as they alternately defend the politics of streamable distribution formats, or decry the disappearance of that communal experience once known as going to the movies. As far as this writer is concerned, the debate can be rendered irrelevant with a simple understanding that where there is a will, there is a way; and regardless of the production/distribution methodology, we have a century-old addiction to recreating our dreams for projection on the big screen. This is unlikely to disappear outright—particularly if one considers that dreams are in greater demand than ever.

Last year saw the demise of many socio-cultural norms and institutions. It also bore witness to some awe-inspiring new works by our country’s foremost dream-makers, and the emergence of some powerful new voices in American cinema. In the former category, no achievement can match the awesome feat of Mr. David Lynch—whose 18-hour-long masterclass in film-making (Twin Peaks: The Return) has left viewers throughout the world kneeling in the dust of its tailspin; bowing to the shape of its receding genius. In addition to Lynch’s crowning achievement, there were strong showings from other established auteurs, including Paul Thomas Anderson, Noah Baumbach, and Todd Haynes. We were served a generous helping of the profoundly twisted, Hitchcockian meticulousness practiced by David Fincher (whose original miniseries, Mindhunter, gives long-form life to the investigative-cum-philosophical theorizing of Se7en and Zodiac); we were also granted a fresh dose from the perceptive, loving, and quintessentially American gaze of Richard Linklater. In the newcomer category, there was a powerful entry from Catherine Gund and Daresha Kyi (Chavela); a directorial debut by the fabulously deadpan Greta Gerwig (Lady Bird); and a wobbly but noteworthy second feature by Eliza Hittman (Beach Rats). There was also an imperative documentary on the late civil rights activist and prolific writer, James Baldwin (I Am Not Your Negro, directed by Raoul Peck; worth the price of admission, but regrettable for its failure to tackle the full scope of Baldwin’s contradictory existence), and the surrealistic late-night comedy flair of Jordan Peele—successfully channeled into big screen, feature-length form in the topical blockbuster Get Out.


Photographer JR paces a beach in Normandy, where he and Agnès Varda have just pasted one of many portraits taken throughout Faces Places on the base of a WWII bunker—which was pushed off the precipice of a nearby cliff. © 2017, Cohen Media Group.

On the international stage, we were blessed with offerings from the subtle genius of Ms. Agnès Varda (whose latest documentary, Faces Places, is a fountain of joys), the sensuous intellectualism of Luca Guadagnino (in the James Ivory-penned audience favorite, Call Me By Your Name), and the slick auteurism of Denis Villeneuve (whose eagerly anticipated sequel to Ridley Scott’s seminal masterwork—Blade Runner 2049—left me breathless and teary-eyed). We encountered the quietly mysterious spiritualism of Olivier Assayas (who brilliantly melded the mystical horror of Nic Roeg’s Don’t Look Now with the existential melodrama of Krzysztof Kieślowski, in his original film Personal Shopper), the stark realism of Francis Lee (God’s Own Country), and the smarter-than-average populism of Guillermo Del Toro (The Shape of Water). And while I could easily use this essay to sing praises to each of these international works, it seems to me—with all the tumult and unrest engulfing us on the national (and international) stage(s)—that a more pressing need may be met by attempting to highlight the fruits of my homeland: a country that has, since its very inception, provoked justifiable skepticism around its merits.

Much has already been written about ongoing struggles pertaining to inequality and sexual harassment within the American film industry (along with every other facet of our socioeconomic structure). The movement to systemically advance opportunities for marginalized individuals—and the parallel movement, to raise awareness for the plight of those experiencing institutionalized harassment and discrimination—is long overdue. Perhaps because of this delayed reform, it seems there may be an unfortunate residual effect emerging from this discourse (and more specifically, from the online social media factor; for while this technology has proven well-suited to a number of ends, social progress has scarcely been one of them). That is, the tendency to cynically lament the shortcomings of a given system—in 2016, the “swamp” of Washington, D.C.; in 2017, Hollywood—all the while forgetting that not every individual involved in said system represents said shortcomings.

For instance, if we are to examine the strengths and deficits of the United States, circa 2018, it would be easy—too easy—to highlight the deficit column, and disregard altogether the finer qualities we’ve represented more capably in the past. But would such emphasis prove these qualities to be nonexistent in the present? Or would it merely bring to light the fact that these merits are an integral part of the American fabric—that they have fallen on hard times, and may need some attention to flourish once more? I am hopeful that this new wave of social activism will contribute to the reignition of our country’s innovation and resiliency; qualities which have fallen by the wayside for some time now (at least as far back as our cultural shift in definition—from innovation: discovery and development, to innovation: app development). I am fearful that—within our climate of antagonistic communication patterns, totalitarian politics, and a general predisposition toward reactionary patterns of behavior—this form of activism may all-too-easily be thwarted by neo-conservative powers, intent on branding minority-status citizens as victims for life, and thereby curtailing their power to advance the causes of restorative justice. Regardless of my hopes and fears, I have always found the presentation of a viable alternative to be the most effective strategy for social change (as opposed to the incessant hounding of those already well-known for fostering inequality; lest we forget that all publicity is good publicity, for those with no dignity left to jeopardize).

In a similar vein, I don’t see much merit in harping on the immense miscarriage of finance that underlies the majority of Hollywood’s output (beyond pointing out that such a miscarriage exists). I’m a firm believer that, in a consumer society, we empower the type of work we want to see more of, whenever we make our selection at the box office ticket counter. Although the aggressive powers of marketing have escalated exponentially these past few decades—culminating in our present-day, tail-wagging-the-dog marketplace mentality—we are the ones who ultimately empower (or discourage) the makers of plastic cinema, when we hand them our attention and our money. Which is why most of us adopt a selective approach in our movie-going habits (let alone the absurd escalation in ticket and concession prices): just as in the world-at-large, one can have a positive impact on the future of cinema, by supporting the proofs of cinema which advance its more worthwhile attributes. And while each viewer has their preferences, I find it remarkable that so many of these attributes have long been shown to be universal. Consider the phenomenon—that a single film can be understood and lauded (or derided) by different nations of people, throughout every corner of the world. That we can each learn from the perceptions and experiences of perfect strangers, and in so doing develop a greater capacity for love and understanding. May this phenomenon never be taken for granted.

For the purpose of this entry (and for the cause of restoring some honor and dignity to a country that has little to champion in either department, as of this writing), I have chosen five of my favorite American films released in 2017: to hold them up as shining examples of our more worthy attributes; and to remind the reader (if one is in need of reminding) that there is still much worth championing in the American landscape. In times such as these, we may all need reminding.


Lady Bird
written & directed by Greta Gerwig; starring Saoirse Ronan, Laurie Metcalf, Tracy Letts, Lucas Hedges, and Timothée Chalamet
released by A24 and Universal Pictures 


Greta Gerwig directs Saoirse Ronan and Timothée Chalamet in a scene from her beloved directorial debut, Lady Bird. © 2017, A24 and Universal Pictures.

I was first made aware of Greta Gerwig when I saw the first of several Noah Baumbach vehicles in which she appeared—the under-valued (in this writer’s opinion) and surprisingly buoyant dark comedy, Greenberg. I immediately took note of the name. There was something in the way she brought her character—and, consequently, the film—to life; something I couldn’t quite put my finger on, and didn’t particularly care to. I hate to use the term “star quality,” seeing as how what passes for a star these days would make the likes of Bogey and Bacall roll in their graves. Suffice it to say, Gerwig has the sort of innate brilliance and affability that could inspire one to ask her out for a cocktail, and debate whether Gene Kelly or Fred Astaire was the better dancer (for no other reason than to hear the sound of her voice as it struggles to keep pace with the winding movements of her wit).

Gerwig has already had a terrific run (and she’s only just begun), appearing in a pair of films she has since co-written with Baumbach—her erstwhile paramour—as well as giving memorable turns in works by Todd Solondz (Wiener Dog) and Rebecca Miller (Maggie’s Plan). Watching her take the lead and walk away with every scene in Frances Ha fostered in this writer the sort of unabashed, film-loving glee that only comes around once in a blue moon; the film’s nouvelle vague aesthetic, rather than making it appear dated, actually served to highlight the confidence and strength of its content and delivery. A year before that, I was positively enchanted by her incarnation of Whit Stillman’s alter-ego, Violet, in his politely subversive and drier-than-a-communion-wafer gem of a film, Damsels in Distress: finding myself one of only two people in the theater (the other being my companion) to laugh hysterically at its tenderly acerbic take on the follies and neuroses of bourgeois young adults, I wondered if Gerwig’s particular (some may say peculiar) sensibility could ever connect with a broader audience. Half a decade later, as I sat in the packed house of that same theater for a screening of her Oscar-nominated directorial debut, I grinned and laughed uncontrollably; I thanked all of our lucky stars this moment had finally arrived.

While one is never in doubt as to the film’s author (one can practically visualize Gerwig acting out every part in the movie during script readings), the ensemble cast of Lady Bird deserves a standing ovation for their dedicated and cohesive effort to bring Gerwig’s writing to life. I was especially taken with Laurie Metcalf (who, in addition to Saoirse Ronan—the film’s protagonist—is now up for an Oscar) and Stephen Henderson, whose subtle performance as a theater instructor in the Catholic high school frequented by Lady Bird has lingered in my memory. Lady Bird’s rotation of friends and acquaintances is equally memorable: from the “shitty Pavement fan” (Gerwig’s verbatim direction) boyfriend played by Timothée Chalamet, to the helplessly perky ex-boyfriend played by Lucas Hedges (most immediately recognized as the kid in Manchester By the Sea), to her best friend and confidante, Julie (a beaming Beanie Feldstein).

Given time, Lady Bird is likely to be lumped in a basket with every other coming-of-age comedy to ever achieve critical acclaim (The Graduate, Clueless, Rushmore, The Breakfast Club, etc…). And while there would certainly be some fine company in this basket, it would be a disservice to the extraordinary nuance of Gerwig’s film—which, unlike The Graduate, with its stylish cynicism (or Rushmore, with its stylish stylism), happens to be an unexpectedly intricate and layered portrait of adolescence; above and beyond what most are accustomed to getting out of a Wednesday matinee. That such an unabashedly smart, disarmingly confident slice of American film-making could emerge from our current cultural climate—and in the process, achieve international acclaim—is a testament to the finer qualities of the American sensibility. It is also a testament to the (possibly boundless) potential of a strong, idiosyncratic voice in the latest chapter of our nation’s cinema.


Last Flag Flying
directed by Richard Linklater; written by Richard Linklater & Darryl Ponicsan; starring Steve Carell, Bryan Cranston, Laurence Fishburne, J. Quinton Johnson, and Cicely Tyson
released by Amazon Studios and Lionsgate


Left to right: Bryan Cranston, Laurence Fishburne, and Steve Carell play three Vietnam war veterans in Last Flag Flying, Richard Linklater & Darryl Ponicsan’s “spiritual sequel” to The Last Detail. © 2017, Amazon Studios & Lionsgate

It is probably no great secret, among my friends and fellow movie fanatics, that I have a strong affinity for the work of Richard Linklater. Ever since my first viewing of Waking Life, in the form of a DVD borrowed from my local library, I have followed every step of Linklater’s career—with a mixture of fascination and mild apathy (something tells me he would approve of this response; it’s mostly fascination, anyhow).

In Last Flag Flying, Linklater tenderly pays tribute to another great film love of mine—the late Hal Ashby, whose 1973 adaptation of the earlier Darryl Ponicsan novel, The Last Detail, provides much of the spirit for Linklater’s quasi-sequel. It’s an honest, considered, personalized adaptation of the sequel Ponicsan wrote three decades later (at the height of the second Gulf War): in many regards, the narratives run parallel to each other; but this later entry is more firmly rooted in the trenches of death, and the sorrow of survival. Their events seem to overlap: in both stories, for instance, the three protagonists share a night on the town in New York—and subsequently miss their train. The fact that in one they’re looking to get laid, while in the other they’re looking to buy some mobile phones, is entirely beside the point; the echo effect is palpable, and it is bound to resonate with fans of Ashby’s cult classic. A large part of what renders Last Flag Flying such a noteworthy feat (or proof) of American cinema is this sense of connectedness: with the histories of its characters; the histories of its authors; and with the most radically inspired, promising film era in our nation’s cinema (spanning ’68 to ’79, or thereabouts; also the timeline for Ashby’s career). Some may deride this sort of praise as high-handed, but as our connectivity to history becomes increasingly scarce—with sound bites and YouTube clips superseding context and formal analysis—I think it’s warranted.


Left to right: Otis Young, Randy Quaid, and Jack Nicholson play three Navy men in Hal Ashby’s 1973 adaptation of The Last Detail. © 1973, Columbia Pictures.

What is most notable about this picture, perhaps, are the thoughtful ways in which Linklater asserts his own personality and characterization throughout. For whereas both Ashby and Linklater linger on the spiritual questing of troubled characters, Linklater advances the quest through a far more directly pointed approach. In The Last Detail, Jack Nicholson’s “Badass” Buddusky rolls his eyes during a gathering of chanting practitioners; in Last Flag Flying, Bryan Cranston’s Sal embarks upon an incessant, often irritating (intentionally, at that; and effectively, kudos to Cranston) tirade against the perceived-as-indoctrinated rationale of his former buddy—now-Reverend—Richard Mueller (Laurence Fishburne). Which isn’t to say this confrontational perspective belongs to the director himself (though the viewer may pick up subtle shades of Ethan Hawke’s Jesse in Cranston’s Sal); Linklater merely had the wisdom and good faith to reveal, whenever possible, the changes that time has inflicted upon his characters—along with the changes time has withheld. That there is no direct connection between the three characters portrayed by the actors in each film is especially effective—and affecting: for by pointing to separate instances of similar life patterns, Linklater and Ponicsan achieve a far broader sense of connectivity with the human condition. It’s the sort of artistic gesture that reveals how, even though our behaviors are developed through a complex mixture of environmental and biological triggers, they frequently perpetuate themselves through stubborn repetition, and through subjugation to damaging social constructs (in this case, the construct of war). And if the complexities of human behavior can be perpetuated, it follows they must also be capable of change.

In keeping with this insight (which doesn’t emerge until farther along in the characters’ journey), Last Flag Flying closes on a dark but optimistic note. The resolution belongs to Steve Carell’s character—an ex-Navy corpsman known as “Doc” Shepherd; the heart of the film, in more ways than one (Carell’s performance being a quiet and inexorable force throughout). The film fades out as “Doc” achieves a sort of closure with the premature death of his only son; the song that fades in during the end credits is “Not Dark Yet,” from Bob Dylan’s beautiful late ’90s offering, Time Out of Mind. It provides the perfect post-script for the trajectory of these characters—a trio of Vietnam war veterans struggling to connect the dots of their scattered lives (“I can’t even remember what it was I came here to get away from“). It also manages to connect their struggle to the more imminent struggles faced by our country, at this specific juncture in history; for as we sit around, waiting for someone to step up and dethrone the lunatic who’s been given free rein to distort our country for private gain, many of us search for signs of hope—struggling to find some comfort in the paradox betrayed by Dylan’s song: it’s not dark yet, but it’s getting there.


Chavela
directed by Catherine Gund and Daresha Kyi; starring Chavela Vargas, Pedro Almodóvar, Elena Benarroch, Miguel Bosé, and Liliana Felipe
released by Aubin Pictures


Pedro Almodóvar and Chavela Vargas: two rebellious spirits, captured in Catherine Gund and Daresha Kyi’s exceptional documentary, Chavela. © 2017 Aubin Pictures.

I am so grateful that my local art house cinema (Neon Movies) picked up this very special and memorable documentary; it was particularly rewarding to have one of the film’s co-authors, Daresha Kyi, in attendance for a live Q&A post-screening. Her pensive and often comical commentary validated all of the finer presumptions this writer had gathered from the screening, but it also served to open up many of the complexities and contradictions scattered throughout the surface (and subtext) of Chavela.

According to Kyi, the process of making a documentary about the famed (and infamous) Mexican chanteuse, Chavela Vargas, began under different circumstances than what one sees in the finished product. The project actually originated with an in-person interview, conducted by Catherine Gund with Chavela at the start of the singer’s first major comeback in the early ’90s. Having gone through her personal archives and digitized all the decomposing film lying in canisters around her studio, Gund rediscovered the power of this twenty-some-year-old footage, and felt compelled to share it with Kyi. Upon viewing the footage together, and catching up on the later years of Chavela’s life story, the initial concept developed by Gund and Kyi involved having another Latina chanteuse narrate Chavela’s story through her own personal lens. Gund and Kyi assembled a rough promo edit of this approach, then screened the material for a group of potential investors. The consensus was clear: forget about the other singer (whom Kyi did not refer to by name during the Q&A); the story is Chavela’s, and she should be the star of her film.

Upon approval of an expanded budget, Gund and Kyi were able to license footage from different televised interviews and performances, conducted at various times throughout Chavela’s complicated (and at times, difficult to trace) career. They proceeded to film present-day interviews with persons of interest, spanning the course of Chavela’s professional and personal development: a former lover (and life-long private attorney); the Spanish filmmaker, Pedro Almodóvar (who was partly responsible for Chavela’s European comeback tour, along with Laura García-Lorca); and the accomplished singer and long-time admirer, Miguel Bosé. Weaving together the present-day interviews with archival materials, Gund and Kyi have achieved a seemingly well-rounded, often contradictory portrait of their subject—a character whose most prominent qualities arose from her own contradictions. Chavela’s story is alternately inspirational and tragic; outrageous and miraculous. It’s a story (and a voice) that resonates with the most profound notes on the human scale, triggering pulses and emotions that strike the viewer/listener on a multitude of levels. The film’s emotional power serves to eulogize the life of the film’s subject, but it also reminds us of the forest we sometimes fail to perceive—among the tangled trees of this modern existence.

It seems we have reached a point in our history where tensions have risen about as high as they could possibly rise: we see many of our fellow Americans running for cover from their perceived opponents, from one uncertain day to the next. In times such as these, there is greater pressure than ever to conform to some kind of an agenda; to restore some modicum of stability, or at least the illusion thereof. In the midst of all this pressure, Gund and Kyi gently remind us that many great figures in world history happen to be individuals who refused to conform: women like Chavela, who first made waves by refusing to wear a dress—and later, by rejecting the more limiting definitions of the contemporary LGBTQ vernacular; men like Pedro Almodóvar, who refused to make boring, run-of-the-mill, politically “sensitive” comedies—eventually finding his own niche audience through a celebration of the most outlandish and perverse attributes of outlandish and perverse characters (and narratives). Theirs are the sort of rebellious gestures that will retain their power and intrigue, long after the sediment of history has settled above them.

Gund and Kyi are smart enough to not impose an expected emotional response to the story of their film’s protagonist (unlike the makers of Amy, a film which Kyi admitted to being inspired by, but which she has visibly surpassed): the audience I was a part of responded to Chavela’s story in a variety of ways, and I found this reassuring. For it gives one hope that one day, our dominant culture may catch up with this time-earned awareness: that new possibilities can only arise when we allow our agendas to be challenged, and maybe even discarded (and conversely, possibilities will wither and fade away, whenever we permit an agenda to override a truth).


Phantom Thread
directed by Paul Thomas Anderson; starring Daniel Day-Lewis, Lesley Manville, and Vicky Krieps
released by Focus Features


The stunning power couple of Daniel Day-Lewis and Vicky Krieps share a New Year’s dance in Paul Thomas Anderson’s exquisite melodrama, Phantom Thread. © 2017, Focus Features.

Phantom Thread, the eighth film by American maverick Paul Thomas Anderson, is one of the finest pictures of 2017—and a powerful reminder of every quality that is unique to the tapestry of American cinema. Like Linklater, Anderson is an artist in touch with his film ancestry, unafraid to wear his influences on his sleeve; and much like Linklater, he refuses to cave in to the traps of plagiarism and self-aggrandizement. That his work often carries reverberations of Altman and Scorsese never implies an attempt to elevate his efforts beyond their given potential: rather, these reverberations serve to point the audience in the direction of a cinematic context—highlighting differences as much as similarities, and revealing the greatest common thread to be a stubborn adherence to one’s own dream logic.

Much like his previous Daniel Day-Lewis vehicle, the now-cult-worthy There Will Be Blood, Phantom Thread has the quality of a runaway fever dream. But whereas in the previous outing, this sensibility was carried to the extremes of emotional abstraction and narrative inscrutability, their most recent effort takes a more carefully deliberated and thoughtfully contained approach. When one revisits the bulk of Anderson’s output, one often finds an artist struggling to incorporate as many of his (often brilliant, sometimes baffling) ideas as possible into manageable feature-length form. In Phantom Thread, we find the same filmmaker who was responsible for the more quietly austere debut feature, Hard Eight (a.k.a. Sydney): an artist intent on chipping away at the excess—to sculpt a shape defined as much by its omissions as by its features. The resulting effort is ambiguous but precise; perversely comical (in a manner that would’ve made Buñuel blush) and intensely, convincingly melodramatic. It’s nothing short of a cinephile’s dream.

Although it is likely true that all great movies begin with a solid script, Anderson’s films often seem heavily predicated upon their casting (something that could just as easily be said of Robert Altman, of whom Anderson was an avowed admirer). A substantial part of the joy provided by witnessing Phantom Thread as it unfolds stems from the organic spark between the film’s three stars—each of them delivering Oscar-worthy turns—and the characters they’ve so adroitly given life. Lesley Manville, in particular, provides a sort of cornerstone for the elaboration of the film’s more subtle character constructions: in her own words (as quoted in a BFI interview), she embodies “this person who is quite rod-like, and can do so much with just one flicker of the eyes.” Around this immovable fixture, the heightened emotional volatility of Daniel Day-Lewis (as Reynolds Woodcock) and Vicky Krieps (as Alma Woodcock) swirls in varying degrees of pathological complexity: at times revealing itself to be an extension of the characters’ personal traumas—such as the chillingly gorgeous sequence in which Reynolds evokes the ghost of his mother—and at others, boiling out of the alchemy between their respective pathologies. Ultimately, all three characters emerge with the sort of understated depth and intricacy that has, up to this point in film history, only been afforded to the likes of Moira Shearer and Anton Walbrook (in the great British films of Michael Powell and Emeric Pressburger). Like all great American auteurs, Anderson knows to steal only from the best.

On the other side of the vaingloriously chauvinistic posturing of Day-Lewis, Krieps shines as a sly sort of antidote to the suffocating dogmatism of over-zealous social (media) activism. Quoted in the same BFI piece mentioned above, Krieps observes: “I respect Alma so much because she doesn’t really need the recognition or the approval, and this makes us strong… If a woman is not seeking this approval, this is a strength that’s stronger than anything, and you don’t then have to fight your ground, you just take your ground. What I like about the movie is that it’s about a dance between a man and a woman. It’s not about who’s stronger and it’s not about who will win. Once we get past this idea of ‘are the men stronger or the women?’ and just accept that men and women are ultimately completely different and completely opposite and will never be the same—until we understand and accept that—we can then have the conversation, the real conversation we really need. That’s when it will be interesting.”

Perhaps no writing on Phantom Thread captures my feelings about the film more capably than the review penned by A.O. Scott for the New York Times: “There are movies that satisfy the hunger for relevance, the need to see the urgent issues of the day reflected on screen. Paul Thomas Anderson’s eighth feature—which may also be Daniel Day-Lewis’s last movie—is emphatically and sublimely not one of them. It awakens other appetites, longings that are too often neglected: for beauty, for strangeness, for the delirious, heedless pursuit of perfection. I’ve only seen this film once […] and I’m sure it has its flaws. I will happily watch it another dozen times until I find them all.”


Wonderstruck
directed by Todd Haynes; starring Oakes Fegley, Julianne Moore, Michelle Williams, Millicent Simmonds, Jaden Michael, and Tom Noonan

released by Amazon Studios and Roadside Attractions


Todd Haynes looks down on the immersive New York City panorama—showcased unforgettably at the conclusion of his latest offering, Wonderstruck. © 2017, Amazon Studios & Roadside Attractions.

Todd Haynes is one of the finest American artists working today, and I hope the relatively poor performance of this latest offering (which left critics and audiences scratching their heads in unison) does nothing to dissuade him from following his gut—and venturing far into the wilderness of his boundless and brilliant imagination in the projects yet to come. (And dear god almighty: may the financing keep flowing.) If one reviews Haynes’s filmography to date, one may well identify a knack for engaging in meta-historical conversations with the history of art itself: from the inter-textual experimentalism of Poison (where Jean Genet, AIDS hysteria, the ’50s family melodrama, and the American B-movie collide in exquisitely strange unison), to the daring innovation of his pop music biopics, I’m Not There and Velvet Goldmine (both of which draw from a near-exhausting wealth of inspirations), to the so-far-ahead-of-its-time-it’s-frightening masterpiece, Safe (driven by the finest performance of Julianne Moore’s career to date, and an anti-aesthetic conviction that could have given Kubrick a run for his money—in its brutal, unrelenting aim to reveal the power of environment over character). And let us not forget the deceptively straightforward melodrama of Far From Heaven, a film so profoundly entangled in the yarn of its own history—which includes the melodramas of Douglas Sirk, the mythology of Rock Hudson (the reluctant Hollywood queer archetype), and the New German cinema of Rainer Werner Fassbinder—that most viewers barely begin scratching the surface of its possible interpretations.

I suppose any commentary on Haynes’s work is bound to invite accusations of cinephilic elitism and hyper-cerebral analysis. And while such accusations may be warranted, I will readily revert to the same defense offered for Last Flag Flying: with so many contemporary filmmakers disengaging from the quilt of film history, is it not acceptable for a handful of our remaining innovators to champion their roots and—more importantly—explore the remaining possibilities for cinematic evolution? If the reader is open to such a notion, Wonderstruck will likely prove a rewarding and thought-provoking experience. It’s the sort of children’s movie we used to excel at producing in this country, but have seemingly forgotten how to tackle in more recent years. Haynes taps into the unstated wisdom of childhood: namely, a child’s natural ability to withstand the unfathomable sadness of their own existence; a sadness which many of us, as adults, find ourselves less equipped to withstand. Beyond this insight, Haynes revels in the mystified, tangent-prone mindset of his characters. He is the proverbial “kid in a candy store,” and it shows with every frame: just as the children are inclined to impulsive flights of fancy, Haynes is prone to indulge in the occasional bit of cinematic homage (in this instance, a couple of clever, well-played nods to Being There) and self-referentiality (as in the use of stop-motion dolls to reconstruct his characters’ fading memories, calling to mind his now-iconic use of Mattel dolls in Superstar: The Karen Carpenter Story; or the use of David Bowie’s “Space Oddity,” calling to mind his thinly-veiled reconstruction of the Ziggy Stardust story in Velvet Goldmine).

What sets Haynes’s work apart from the mass of self-made auteurs (many of whom bask in the onanism of referencing their own work) is his commitment to conversing with the work of other filmmakers, as much as with his own. And to this end, Haynes betrays a rather singular proclivity for establishing context around his art. Not unlike David Lynch (perhaps his closest relative, in postmodern terms), Haynes provides all the necessary clues for the audience to engage in their own private dialogue with his work. As artists, they share in a recognition that their audience will bring their own plate to the table; and they both know better than to dictate which ingredients their audience should eat. From this perspective, all that matters is that the audience be granted sufficient information to trace the lineage of the food on the table, if they so desire. (Or, if they’re inclined towards a more immersive experience, they can ignore the trail of clues altogether and just savor the feast.)

As for the story of Wonderstruck, suffice it to say that it is every bit as simple and convoluted as a children’s book ought to be (it is adapted from a hefty novel by Brian Selznick, which I have not read). All of the actors deliver strong, convincing performances—particularly newcomer Millicent Simmonds, who has the capacity to break your heart before forcing a smile in the course of an instant—and Carter Burwell’s scoring is sublime throughout. Without a doubt, the best write-up the movie could ask for was provided by the amiable John Waters, who coyly suggested in his year-end top 10 list: “Want an IQ test for your cinephile children? Just take them to see this beautifully made, feel-good kids’ movie about the hearing-impaired, starring a little girl who looks exactly like Simone Signoret. If your small-fry like the film, they’re smart. If they don’t, they’re stupid.”

* * *

So there you have it. Five proofs of American cinema; five signs of hope—that there are still those among us with adequate wisdom, perseverance, and vision to point a way out of the darkness. May these bright lights among us continue to shine through the falling night, and may they inspire others to do the same.


Jaden Michael, Oakes Fegley, and Julianne Moore look up with wonder at a sky full of possibilities. Wonderstruck © 2017, Amazon Studios & Roadside Attractions.

or: An Open Appeal to a Sane Society

Meet the new houseguest who doesn’t intend to leave: the horror movie that doesn’t seem to end, and that you’re not allowed to look away from. Like a 21st century variant of Burgess’s Ludovico treatment—only worse, because you’re actually living with the images forced upon you by some diabolical overlord. Enter the age of 45: the Hotel California of the new millennium. Life confined to a locked, low-ventilation room; with a wild badger thrown in for companionship, and the expectation that you’ll keep cleaning up after the damage—while never being offered the option to expel the badger altogether. At least, not as long as the ratings are up.


Alex DeLarge undergoes the Ludovico Treatment in Stanley Kubrick’s adaptation of Anthony Burgess’s 1962 novel, A Clockwork Orange: the treatment entails forcing the patient to watch films of crimes and historic horrors, with the intent that exposure will prevent the patient from committing further crimes. Suffice it to say, the treatment is not entirely effective. © 1971, Warner Pictures.

As I sit here—wide awake, still a little stunned by the Senatorial victory of (Democrat) Doug Jones in the well-established Red terrain of Alabama—I realize just how much this bit of good news means to me: to my mental wellness, and my general sense of empathetic orientation with the human race. An orientation that has been shaken to its core since the traumatic national and international events of 2016.

Trauma changes people.

I realize tonight that this isn’t just about Doug Jones and Roy Moore, to me (and possibly, to many other American citizens as well). It’s not just about this shitshow of a presidential administration we find ourselves stuck living through—this wild badger thrown in the room, that we’re not allowed to remove for another three years (maybe less…). It’s about securing some fresh, statistical evidence that the people you’re sharing this country with (including your own self) are still capable of not being vicious, careless, misanthropic, narcissistic, misogynistic, racist ogres. Evidence that we still have something worth fighting for, hidden among the bushes of the outrage mongers in talk news and the trolls, bots, and clickbait mongers on the internet.

Just as we must remember that the profoundly traumatic realities of 45’s election, his inauguration, and his repulsive miscarriage of Federal power could have (and should have) been overshadowed by the 3 million-plus voters outnumbering his “victory,” we must take (and savor) this moment as a signal that the human race hasn’t entirely surrendered to its own plight—despite certain running indicators and unfortunate appearances. That contrary to Nick Cave’s misanthropic anthem (“People Ain’t No Good”), people ain’t entirely no good.

Above and beyond the effect this election portends for the state of Alabama itself (a state that hasn’t swung Blue in the Senate since the pre-Civil Rights Act days of LBJ), this event signals an anxiously awaited response from Republicans to the recent resignation of Democrat Al Franken (in light of the on-going denials put forth by 45’s administration, when confronted with the allegations of 19 women claiming past assault at the hands of our current president). Our nation’s sense of dread and anticipation was palpable, as Alabama faced the somewhat unreasonably challenging choice between a known, racist child predator, and a Democrat: would the state reflect the running trend in the GOP (deflecting attention from its own sins by playing an endless game of “pin the tail on the donkey”), or would they snap out of their Red state-induced coma long enough to recognize the hypocrisy that underlies every facet of their party’s current incarnation? Furthermore, would they perpetuate the mistake made by millions of Americans during the 2016 election—voters who somehow felt it wiser to support and elect the most morally defunct, greed-driven, and predatory Presidential candidate put forth in recent U.S. history, with the apparent delusion that they could return their purchase if it didn’t work out; Democrats, Republicans, and independents who apparently failed to recognize how much easier it is to prevent an elected demagogue’s abuse of power by not electing said demagogue in the first place—or would they prove to the rest of the country that they’d taken notes from that experience, and were willing to learn from past errors in judgment? 
And last (but certainly not least): would they demonstrate that the all-too-common social problem of sexual abuse (among other abuses of power) was identifiable as a human problem—a problem that transcends one’s party affiliation, or one’s like/dislike of the perpetrator—and not just some perverted political weapon, used to consolidate power and enable further abuses?

“There is more paradise in hell than we’ve been told.”
– Nick Cave (from One More Time With Feeling)

The trauma of waking up and having to see this horrendous failure of humanity (known by the acronym DJT) on every television screen, in every room (or checking out for a while, only to be haunted by fear and unease as to what might have transpired while you were sleeping), should never be downplayed or minimized. These are strange times, to be sure; but beyond the surrealism of it all, these are dangerous times. Dangerous for the fate of the planet; dangerous for the fate of children and adolescents, having to grow up out of the rubble of all this trauma. Dangerous for the fate of democracy: the right to free speech; the right to potable water; the right to our national monuments; the right to an affordable education; the right to affordable healthcare; the right to be a woman; the right to be a person of color; the right to a neutral internet. The right to not have the fragile egos of feeble leaders signing off on unnecessary wars and international conflicts—with the name of your country printed on the dotted line. The rights of veterans to access treatment and services, and to not be rendered homeless and helpless by the cruelty of weak men who sent them off to fight these unnecessary wars.

The right to love the person you choose to love. The right to vote for the candidates and issues you believe in and/or identify with—and the right to have your vote counted. The right to worship (or not worship) the deity of your choosing, and the right of others to do so in turn. The right to a fair trial in a court of law, overseen by a qualified judge who has undergone reasonable scrutiny before being entrusted with the fates of American citizens of all ages. The right to fight for environmentally-sound policy; functional infrastructure; fairer tax structures. The right to fight for the “little guy” (and gal), and a platform on which the underdog is allowed to speak and compete with the fastest runners.

The right to have all claims of sexual misconduct treated seriously, regardless of how much we may like or dislike the person whose reputation is on the line: the rights of the men and women who have experienced horrific personal traumas and abuses to have their stories heard—not exploited for the limelight, or an uptick in ratings, but actually listened to; respected; taken seriously. (Also, the right for the individual being prosecuted to speak on his own behalf and be heard, in the event some kind of foul play is in the works—or, in the event that the individual’s offenses are even worse than what was reported).


Jane Fonda plays a prostitute caught in a scheme of political intrigue, in Alan J. Pakula’s 1971 masterpiece of paranoia, Klute. The film was followed by two other entries in a “paranoia trilogy”: The Parallax View (1974) and All the President’s Men (1976). © 1971, Warner Pictures.

Over the past year, all of these rights have been (or are being) assaulted, defiled, defaced, or distorted beyond recognition. Many of us have turned to each other (or our respective deities) in desperation and confusion, hoping for solace and reassurance. Sometimes, we’ve been greeted with the terrifying vision of our neighbor’s desperation; other times—like tonight, after this small but somehow tremendous victory for the people of the United States—we are offered a ray of hope; a sign of life. A montage of baby steps towards a resolution, interjected at the end of the first chapter in some seemingly interminable (and poorly shot) blockbuster of political paranoia and international intrigue (think Pakula’s paranoia trilogy, or Polanski’s domestic horrors, as filmed by Jerry Bruckheimer; try not to vomit).

Trauma doesn’t usually leave when you ask it to: like that pesky houseguest (or that wild badger), it will linger and wreak as much havoc as allowed, and you may well find yourself on the verge of being evicted from your own home. And despite possible good intentions, lashing out in anger and aggression at the trauma you’re cohabitating with won’t do much good. I’m reminded of a scene in Noah Baumbach’s latest work—a straight-to-Netflix affair titled The Meyerowitz Stories (New and Selected): following their sister’s disclosure that she was molested by their uncle one summer in her childhood, two brothers decide to avenge her trauma by violently (albeit incompetently) trashing their uncle’s car in a hospital parking lot. They leave the scene of the crime giddy with pride at their perceived accomplishment; they feel somewhat less empowered after informing their sister, and hearing her disarming reaction: “it doesn’t change the fact that I’m fucked up for life.”


Elizabeth Marvel plays Jean Meyerowitz in Noah Baumbach’s The Meyerowitz Stories (New and Selected)—a Xerox executive who experienced sexual abuse during childhood at the hands of a relative, and explains flatly that there is nothing that can be done to remove the trauma from her personal history. © 2017, Netflix Pictures.

Hopefully, the trauma inflicted upon us by this deranged, dishonest, degraded, degrading, and possibly treasonous administration, will be survived by the good people of this country. Hopefully, the people who come out of this ordeal will look, think, and act a little more like the good people who turned out in droves for today’s vote in Alabama—people who chose to put principles above partisanship—as opposed to the people who enabled and supported this catastrophe back in its “preventable” stage. Hopefully, we will look back on this day as the day a nation came to its senses: the day we came to appreciate, collectively, just how much is at stake in this catastrophe; how much we have already lost, and how much more we have to lose if we don’t reject this putrescence—once and for all—and return to some core standards of intuition, decency, diplomacy, critical thought, self-awareness, and accountability.

There is still a long way to go, and a lot of work to be done: let’s not just rest on the laurels of a small step for man (however significant it may have been to the survival of mankind). Let’s keep moving ever-higher, up to the highest point on the curve of justice—outlining the arc of history in the most ambitious and humanistic shape possible. Let’s stay the course of sanity; for we should all be well aware by now, how easily we can be misled by the folly of ignorance, frail egos, and festering hatred.

Here’s looking to signs of life after trauma.

A deplorable year, in context.


Margit Carstensen plays the embittered Petra Von Kant in Fassbinder’s 1972 film of his quasi-biographical play; pictured here during her final on-film meltdown in front of her family. © 1972, New Yorker Films.

It started with cocktails.

It was November 8th—Election night, 2016. My partner and I had dinner (nachos, I think) with a cocktail on the side, to try and wash away the bitter taste of the ugly year leading up to this occasion. We caught up on some pre-recorded programs in the DVR, and switched over to PBS for the occasional play-by-play of electoral returns. Of course, it was still “too early to tell” at this point; though the smugness of certain commentators—a less-than-subtle confidence in the already projected outcome (a Democratic “landslide”)—gave me pause.

In the months preceding this night, I had endeavored to raise awareness of the complex and multi-faceted significance of this election—and the devastating ramifications if the Presidential seal were to go to the most corrupt, unqualified, and inexperienced candidate ever to campaign for this office (from foreign policy, to climate policy, to basic civil rights, to corporate privileges, to tax policy, to infrastructure, to cyber-security and net neutrality…). I had cautioned my Bernie-adoring friends that the so-called “lesser of two evils” was, after all, still “less evil.” I encouraged folks to consider the pragmatic perspective that many social workers (myself included) are forced to adopt on a day-to-day basis, as a consequence of living in an imperfect world with imperfect choices: while one can rarely take an action that will result in no harm whatsoever (the notion of “no harm” being in direct opposition to the human experience), one can gather information and critically evaluate options in order to take the path of least harm.

As I sat in front of the television, sweaty glass of booze in hand, I saw the path opening in front of our nation: suffice it to say, it was not the path of least harm.

I would like to say that, in hindsight, I responded to this awareness with a proportionate level of disappointment. If I were to be perfectly sincere, I would admit that my disappointment and anxiety skyrocketed beyond any proportion I might’ve prepared myself for, and my subsequent display of emotion was probably on par with the most exhibitionist meltdown of a character in a Fassbinder film (think Petra Von Kant screaming at her family, drunk on the carpet; or Elvira recounting her history of trauma from inside a slaughterhouse). After fifteen minutes of incredulously gazing at the incredulity of the commentators on the TV screen, I wandered off to bed in a daze, and sobbed myself through a (seemingly endless) night without sleep.

Some time after, my partner wandered up and lay next to me—our dog Sam sprawled in between us: blissfully blind to the specifics of what was happening around him, but visibly aware that something was off. He rubbed his nose against my side and I scratched behind his ear, periodically reaching for my phone and checking the electoral map for signs of hope; none were forthcoming. At a certain point, I just stopped checking—painfully aware of the heightened anxiety provoked by these micro-updates. And then, the indigestion started. And the routine visits to the toilet to try and purge the queasiness swirling around in my stomach. And the hours spent in near-delirium, staring at the ceiling and waiting for the night to end, while simultaneously dreading the thought of having to survive the night and emerge into the reality awaiting me on the other side.

I was still lying awake when I heard the clicking of a computer—my partner having woken up before me (as per usual), now checking the news feed on his desktop. I counted the seconds between the first few mouse clicks and the first audible, heaving sobs; I think it took about fifteen seconds. I turned my face into a pillow and cried.

* * *

I find myself reliving this fateful day, as I embark on this effort to put my experience of 2017 in some sort of context (call it self-therapy). I can’t help but feel that the answer to many questions that have arisen out of this disastrous, unsettling, and disorienting year lies somewhere in the outcome of that night—and the collective reaction to an action taken by the smallest margin of our population ever to select a (proposed) “leader of the free world.” In the months immediately following the election, I was one of many to identify a heightened level of engagement with social media; and while I cannot attest to the motives of others, I will readily concede that my personal engagement was driven by a heightened awareness of the unprecedented impact social media had yielded throughout the course of the election. In reading the near-unbelievable, beyond-dystopian tale of Cambridge Analytica, and the well-documented strategies implemented by several shady figures in favor of a global right-wing coup, it became quite evident to me that we stood on the threshold of a deeper abyss than was projected by the most dour catastrophist during the election itself. I felt a compulsion to be more outspoken than I had been before (since, evidently, reserved compunction, blind faith in objectivity, and trust in the collective conscience of mankind had not yielded any favorable results). Looking back over some of the insights and commentary I shared publicly via social media at the start of the year, I regret none of what I wrote—but I can now recognize the general insignificance of my commentary with a greater degree of intellectual clarity.

This isn’t to say I’ve adopted a defeatist perspective. Today, I can sincerely claim (give or take a little) the same level of investment in the plight of humankind as I claimed last November; and the year before that, and so forth. But as our global village (if McLuhan’s term can even be fairly applied to our present-day climate) advances towards ever-increasing levels of chaos, I’ve become painfully aware of how ineffectual and, in many cases, outright detrimental this twenty-first century drive to provide running commentary on the human experience has been to achieving any sort of actual progress. Retrospectively, in fact, one can trace the most recent phase of devolution (and devaluation) of the human species through a comprehensive anthology of our president’s impetuous Tweets—accompanied by the often-comparably impetuous retorts of commentators across the globe. If one were inclined to place these exchanges in context and illuminate the bigger picture for those in need of perspective, one could print this anthology of Tweets and comments and hang it on a wall in a museum; opposite this display, one could hang a display of climate data, pictures of the refugee crisis, profiles of newly-appointed right-wing judiciary representatives, annual hate crime statistics, research on hereditary trauma, world poverty statistics, annual gun violence statistics, opioid overdose statistics, and current nuclear arsenal statistics (with illustrations). The viewer of such an exhibit should be capable of drawing their own conclusions.

Suffice it to say, very little social progress has been achieved during the past year. One could go so far as to argue that we have taken an enormous step back in our social evolution—that the trajectory of social progress has been scrambled to such an extent that we have to redefine the very idea of social progress. For example: prior to the election, one could generally accept that, regardless of one’s economic status or party affiliation, sexual assault was a deplorable action. But something changed, somewhere along the course of the 2016 campaign trail. If one were to examine the Republican party’s response to the excavation of that infamous Access Hollywood tape, and compare it to their response to revelations that then-candidate Bill Clinton had engaged in an extended affair, years before the 1992 election, one would have to conclude that the Right has either lowered its standards for outrage, or only complains when its majority is on the line. In addition to this, we find the emergence of a new Right-wing chorus (which would go on to be adopted by many a libertarian, third-party voter, and Democrat as well): the now familiar refrain of “fake news;” a magic potion for alleviating the symptoms of cognitive dissonance.


During the historic 1992 campaign trail, it didn’t take long for the revelation of candidate Bill Clinton’s 12-year affair with Gennifer Flowers to become a partisan weapon wielded by the Bush campaign to cast moral aspersions on his Democratic opponent.

In 1992, voters of all stripes wrestled with both the knowledge of Clinton’s affairs and an awareness that this information might be manipulated for partisan gain; in 2016, there appeared to be little-to-no wrestling at all. Polls at the time indicated that, by and large, 45’s base was actually strengthened by the revelation of the tape: casting the objective information of the tape aside, many of 45’s supporters voiced an opinion that their only concern lay with how this information might be skewed for partisan gain—and not with the implications of the information itself. In other words, the information of our then-candidate’s predatory behavior (in combination with all the other evidence accrued to support the case for his predatory business practices) was as good as irrelevant. And so began the trend of alternative facts, and the convenience of being able to reject information that conflicts with one’s pre-existing belief pattern by merely denying its existence. Viewed along the action-reaction continuum, “fake news” was both a reaction to the leftist obsession with investigative journalism, and a positive action in its own terms (using “positive” in the Skinnerian sense). For by achieving an unspoken consensus among themselves—that information adverse to the advancement of one’s own political goals cannot (and should not) be bothered with in the first place—45’s supporters have succeeded in establishing a level of intellectual disengagement not seen at any other point during the nation’s past century of political discourse.

If we now consider this new right-wing action (“just say ‘fake news’ whenever anything upsets you”), we must consider the subsequent leftist reaction (hyper-dramatically presenting the severity of upsetting developments, in an attempt to appeal to the emotional-spiritual side of right-wing fact-deniers). The leftist reaction can be seen throughout any number of impassioned Facebook and Twitter rants: that (somewhat-to-absolute) self-righteous outpouring of hysteria and concern, presented with all the pathos and drama of an argument in some generic TV courtroom drama. This brand of emotional reactivity has, in some cases, been strategically channeled to advance social issues (as in, most recently, Tarana Burke’s powerful #MeToo movement); on the flip side, the catharsis of social media engagement presents a stumbling block for individuals who have no conception of follow-through. For instance, the fanaticism of Howard Beale (Peter Finch) in Paddy Chayefsky’s Network (1976), which I frequently see shared (in the form of the “I’m as mad as hell” excerpt) by peers and acquaintances on social media, offers a prescient insight into the risks associated with commercializing outrage—though I fear some folks take the bit out of context and fail to apprehend the way it all falls apart.

In Sidney Lumet’s film of Chayefsky’s acclaimed script, Peter Finch convincingly plays a neurotic newsman who “flips his wig” after being let go from his job, and takes to the air to announce his on-air suicide a night in advance. Instead of delivering on his promise, he launches into a sermon about how the world is going to shit, then beckons his viewers to run to their windows and yell into the streets with him: “I’m as mad as hell, and I’m not going to take it anymore!” His viewers comply, and the station executives take note of a boost in ratings: they investigate the situation further, and realize there’s big money to be made selling outrage to the deadened masses. In a relatively short period of time, the mentally unstable Howard Beale is asked to front his own television variety show—featuring his now-trademark impassioned rant as a sort of nightly act. Beale displays some resistance early in production, but by the end of the movie he has been brainwashed to the point of being putty in the station’s hands: a once-genuine expression of repressed jouissance has become a weird sort of household name, and the television executives profiting from his mental health condition wind up having him killed because of an eventual decline in ratings.


In Network, Peter Finch plays a hysterical newsman (named Howard Beale) whose psychosis is co-opted by his employers at the network for a spike in ratings. Once the act grows old and ratings decline, Beale is bumped off by his executives, who will continue accruing royalties from his downfall. © 1976, MGM Pictures.

While extreme and grotesque in its scope, and rendered for largely satirical purposes, Chayefsky’s work does seem to offer a cautionary tale for our time. In order to prevent becoming as pathologically shortsighted as Howard Beale, one must always ask oneself, when contemplating such catharsis: What purpose could this possibly serve? What’s the intended follow-up plan for one’s outrage—or is there one? Is it possible that one is just yelling words into a digitized vacuum, which then captures one’s words and capitalizes upon them, selling them off as part of a metadata package? Is this essay going to become just one more yell into the vacuum? One hopes not, but one never knows.

Since human beings have still failed to learn the lesson the universe endeavored so painfully to instill in us throughout last year’s election (the lesson: social media activity will not fix most things, or even anything; but it can readily make things worse if given the opportunity), we’ve apparently doubled down, and now find ourselves caught in the middle of a surreal and bizarre game of “who has the most sexual predators in their camp?” (As far as what this game is intended to prove or resolve, I suppose anyone’s guess is as adequate as the next person’s.) One by one—day by day—famous celebrities and political pundits continue to drop like flies in the ointment of this “to catch a predator” show; inappropriately enough, this surreal game has been (and continues to be) overseen by the predator who set this chain in motion last fall (don’t worry: he’s not going anywhere anytime soon).

In keeping with the chosen leftist reaction of overstating one’s passion for a given issue—in a vain effort to “wake the deadbeats from their slumber”—we now find human folly achieving exponential new heights (or depths?) of stupidity. For starters, we have the borderline-comical leftist insistence on the morally “wrong” connotation of sexual assault: as if by insisting strongly enough, those who believe otherwise might instantaneously be converted. Furthermore, this juvenile proclivity for moral sermonizing has embedded itself as a point of division among proponents of liberal policy. Just as the more die-hard idealists who upheld the “purity” of Bernie Sanders against the “corruption” of Hillary Clinton drove a wedge through the otherwise-united front of liberal voters (aided and abetted by the Russian trolls who targeted third-party and Bernie supporters with strategically placed news stories to reinforce their disdain for Hillary), we now have liberal idealists thinning their own herd (yet again) by singling out anyone who fails to fall in line with the outspoken chants leveled against perpetrators of sexual assault.

I recently stumbled upon an article which provides a textbook illustration of the infantile thought process underlying this leftist penchant for “out-idealist-ing” one another. In an online Stereogum/Spin magazine article (filed under the “News” heading), a writer named Peter Helman takes issue with comments and views put forth by the ever-divisive Steven Morrissey in a recent Der Spiegel interview (yet again, I find myself stumbling upon the commentary before the news itself; which, in and of itself, isn’t news). Here’s a verbatim transcript of the opening paragraph, as printed in the article (whose writer openly acknowledges that he did not bother to pursue a proper translation of the interview, and relied upon Google Translate as the arbiter of the interviewee’s meaning):

“Hey look, Morrissey said a stupid thing! It’s been a while since Moz has said something truly objectionable and not just, like, ‘Oh, Morrissey is kind of an asshole.’ But now, in an interview with the German news outlet Spiegel Online on the heels of his new solo album Low In High School, he’s come through with some genuinely terrible opinions.”

First, we find the distinctly liberal cocktail of snark and finger-wagging writ large in the opening statement: before we are even offered a glimpse of the musician’s controversial comments (let alone the chance to remind oneself, as hopefully all reasonable, grown adults do in such instances: “what do I care what some music journalist thinks of what some musician thinks of some matter with which he has no direct affiliation?”), we are instructed (seeing as how the reader cannot possibly be intelligent enough to reach their own conclusion) that the comments are objectively “stupid.” Then we have the reinforcement of this admonishment, coupled with an insistence that one ought to consider these “stupid” statements even more offensive than whatever the writer had last admonished the musician for. Then, as if the message had not yet been clearly conveyed (after all, we’re dealing with a reading audience that cannot be trusted with their own thoughts), the writer insists that this latest interview with the Moz reveals “some genuinely terrible opinions.” (Be still, my fluttering outrage odometer!)

I’m disinclined to even bother with an analysis of the article (let alone the comparably over-indignant commentary of those who shared the “story” on social media; except maybe for Shirley Manson, who brought up a valid point in suggesting that Morrissey appeared not to have the latest updates on the “plot[s]” of Spacey and Weinstein), but I nevertheless feel compelled to provide some sort of corrective to the borderline-toxic preachiness of these self-appointed messiahs of moral indignation. Not that Morrissey’s views, as quoted here, are even that noteworthy or idiosyncratic: if anything, they seem to echo the contrarian tone of similarly uneventful remarks delivered by Johnny Rotten earlier this year. But whereas Rotten and Morrissey are merely doing what they’ve been doing all along in their respective careers (namely, being abrasively provocative), Helman’s heavy-handed critique—along with any analysis bearing the imprint of such thoughtless indignation—inflicts the greatest damage of all on the integrity of an intelligent dialogue: for not only does it inherently reject the reader’s intelligence (something that neither Rotten nor the Moz, bluster aside, would ever dare try), it functions primarily as the byproduct of a profit-driven online press: a press which now feeds vampirically on the outrage of the web-surfing public, frequently leaning on the crutch of self-righteous indignation as a shortcut to increase clicks and shares. (Hm… that sounds familiar.)

And since “writers” (at least, the successful ones; the ones whose bread-and-butter is outrage-tinted click-bait) save the most upsetting/eyebrow-raising/scintillating bits for last (in order to maximize the advertisement space between the reader’s first click on the article and the long scroll to its disappointing finish), there must be some build-up to the exhibit of [insert celebrity’s name]’s horrifying remarks—like an 18th-century freak show, in which true horror had to be instilled in the imagination of the visitor before being deflated by the banality of the exhibit itself. (Sure enough, cries of “shame!” and “how dare he?” were heaped upon the Moz within minutes of the article’s posting; after all, what’s one more pariah on the fire…) In keeping with every other un-news-worthy observation shared by Morrissey in an interview, a scandalous viewpoint has been tried and condemned for failing to align with the prevalent vernacular and perspective of the times, and persona non grata status has been duly conferred upon the offending party. From what we know about the artist in question, one ought to suspect this is what he wanted all along, anyway: win-win (I guess?)


Steven Morrissey’s 30+ year career has been consistently marked by stylized, overly dramatic outbursts, coupled with the artist’s vegan activism and often reactionary views. As a (by)product of the British punk era, Morrissey is to many a poster-boy for resisting conformity. Also renowned as a legendary pain in the arse.

My point here isn’t that Morrissey’s statements should be defended: he’s a grown man and should take ownership of whatever nonsense and/or half-sense pours out of his twisted mouth. Rather, my point is to ask: What purpose could this possibly serve? And moreover: What does all this exhibitionistic “journalism” imply about the state of social commentary? Have we truly devolved to the point that an individual needs to preface any commentary on the subject of sexual abuse (and the inherently complex psychology of victims and perpetrators) with an assertion that one does, in fact, disapprove of sexual abuse and predatory behavior? Are there popular articles out there that I’m not seeing, in which individuals go on record saying that they condone sexual abuse and wish there was more of it? And if so, is the tone of such deplorable articles so indistinguishable from the tone of a level-headed writer’s that level-headed writers need fear their audience suspecting they might, in fact, be pro-sexual abuse? And if so, wouldn’t the abuser-shamers serve their purported mission more capably by tracking down those pro-abuse folks and chastising them? Regardless of the answers to any of these questions, nothing remotely edifying can come of such conversations if we cannot bring ourselves to respect (read: allow) the judgment and intellect of our reading audience—sans these forceful and belittling cues to trigger our moral outrage.

Which brings me back to the actual problem at hand, and the elephant in the room that remains perpetually sheltered from the storm of allegations swirling around him: the President of the United States. For unlike Morrissey (or Johnny Rotten), our president has made it clear time and again that he is pro-sexual abuse, and despite the skepticism of his supporters (who feared that their boy’s well-documented predatory behavior might be wielded by leftist commentators for partisan gain), he has displayed no compunction about turning allegations of abuse into political weapons—so long, of course, as the allegations are directed at individuals outside the Republican umbrella. Which renders it all the messier when individuals on the left allow themselves to get caught up in the hurricane of abuser-shaming (often with noble intentions, at least at the start), since this is exactly what the most powerful person in the country has been relying upon this entire year to advance a truly abusive agenda—not least of all, through his success in seating an entire slate of unnerving judicial appointees: out-of-touch bigots and bloggers; unqualified lunatics who will shape our country’s legislation for decades following the inevitable demise of this administration. All the while, his White House continues to ignore and deny the allegations of 16 women who have confronted the public with their abuse stories, and the President remains… the President. As of this writing, there have been no formal inquests proposed in Congress to investigate and pursue these claims further.

I suppose I should feel compelled here to state my own disavowal of sexual abuse, and to verbalize my support for the victims who have come forth with their alternately harrowing and unnerving stories. I’ve chosen to refrain from offering any commentary on the subject up until this writing for a combination of reasons; mainly, as someone (and more specifically, as a white man) who has not suffered sexual abuse firsthand, I feel it isn’t really my place to remark on a subject so close to others, yet so distant from my own lived experience. I’ve found that, in such cases, it’s best to just shut up and listen to those who know what they’re talking about.

* * *

The title of this essay is taken from a track on this year’s Sun Kil Moon/Jesu collaboration, 30 Seconds to the Decline of Planet Earth. The song takes as its subject the child abuse scandals that haunted Michael Jackson to his early grave: it appears to have been inspired by a conversation on a plane between the song’s writer (Mark Kozelek) and a young woman traveling to Greece to perform in a musical of Michael Jackson’s life. The song paraphrases a conversation between the two, in which Kozelek asserts (rather firmly) that the world is, undoubtedly, a better place without a pedophile R’n’B star living in it. At first listen, the lyrics to the song are more-than-slightly jarring: casual listeners might be inclined to interpret this perspective to be the actual opinion of songwriter Mark Kozelek, whereas those who’ve spent time with Kozelek’s other recordings may recognize the sound of his (often) darkly satirical social commentary.

I can’t say for certain whether the lyrics to “He’s Bad” come from a place of sincere commentary or social satire, but I find it difficult to accept the former interpretation. In fact, the perspective of the song’s narrator is often so wince-inducing in its generalizations, one can only make sense of it when read in quotation marks:

“Is the latest on him true?
Well I don’t fuckin’ know
But if I had a son, would I let him get into a car with Michael Jackson?
Fuck no
I’m sorry for the bad things that his father did to him
But it doesn’t add up to building a Willie Wonka trap for kids
And changin’ the color of your God given skin
He made creepy videos that the popular kids liked back in the eighties
And once over a balcony he dangled a baby
And did the moon walk
And talked like a 9 year old girl
I don’t give a flying fuck what he meant to the mainstream world
Roman Polanski went down in flames and was incarcerated
But this young little kid addict will forever be celebrated
A hundred plastic surgeries and paid two hundred million to shut people up
Took someone’s child like it was nobody’s business and dragged him around on a tour bus

He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s dead and to me it ain’t that fuckin’ sad”

The song has stuck with me throughout the ups and downs of 2017 (and it was, for the most part, a year of downs). A friend of mine, who suggested I check out the record, cautioned me in advance about the song’s “cringe-worthy” quality; at first listen, I shared in his assessment. But upon further listens, a space opened up in the longer instrumental stretches of the track, and I found myself strangely drawn to it. Presently, I find it to be a brilliant piece of songwriting—perhaps even more so, if these are, in fact, Kozelek’s verbatim opinions. The song capably highlights a common trend of generalization and oversimplification among present-day liberal pundits: one might as well call it the “make sure the baby goes out with the bathwater” syndrome. Because it’s easy (and more precisely, facile) to take a step back from the strange and unsettling case of Michael Jackson, and surmise that he was nothing more than a sick man who preyed on children—that consequently, the world is better off with him dead than alive, and he might as well have gone sooner. But had he never lived, this song (a highlight from the record, I think) would not exist: not just its lyrics, but its arrangement, structure, arpeggiation… all of which pay tribute to the late “King of Pop.” Which raises the question: Is it right for one human to judge the life of another and determine they ought not to exist—or have existed at all? It’s the same question that underlies the debate(s) surrounding the death penalty; war; abortion. Taken at face value, the perspective of Kozelek’s song sides with the affirmative answer to this question. But interpreted satirically, the question remains open-ended.
Unlike the above-mentioned Stereogum article, Kozelek’s song actually gives the listener space to think for themselves and reach their own conclusions; so that, even if these are the songwriter’s dyed-in-the-wool beliefs, we don’t feel pressured into adopting them as our own (or, conversely, into rejecting them outright).


Michael Jackson was the subject of great public scrutiny throughout his short and strange life, which provides the subject for the recent Sun Kil Moon / Jesu track, “He’s Bad” (from 30 Seconds to the Decline of Planet Earth, available from Caldo Verde records).

I refuse here to entertain the idiotic question that somehow goes on being debated in certain circles: can bad people make good art? (I will, however, quickly dissect the idiocy inherent to the question’s phrasing: firstly, there is no such thing as “good people” or “bad people”; and second, what do you think?) However, I do find it noteworthy that a lot of angst appears forthcoming in the public response to revelations that Louis CK, Charlie Rose, and Kevin Spacey—celebrities that, unlike the blowhards who preceded them (Roger Ailes, Bill O’Reilly, and Harvey Weinstein), were somewhat well-liked—have lived messy lives and done some deplorable things. Each time a new pariah gets added to the fire (all the while, the President shakes hands with Duterte on a visit to the Philippines, and swaths of Puerto Rico remain without power), I’m reminded of the excellent documentary Happy Valley, directed by Amir Bar-Lev and released in 2014, in the wake of the explosive child abuse scandal involving the once-respected Penn State coaches Joe Paterno and Jerry Sandusky. In his film, Bar-Lev explores the American (and outright human) proclivity for placing a fallible person on a pedestal, and then shaking one’s head in disbelief when fallibility rears its ugly head. As I watched the reactions pour in on social media (friends who initially felt inclined to defend their heroes against the allegations, and as soon as proof was provided, reluctantly gave in to the evidence, tossing the baby out the window), I had a flashback to the confused street riots that followed the Sandusky trial—in which the townsfolk of Happy Valley alternately mourn and celebrate the dismantling of a statue once proudly erected to Joe Paterno (who was never convicted of perpetrating abuse, but was found to have enabled Sandusky’s behavior after being informed of its existence).

The psychology of the townsfolk, which is smartly and respectfully explored by Bar-Lev in his documentary, can easily be transposed to the public psychology surrounding this irrational debate unfolding on our national stage; the key question in the debate is: Where do we store our dismantled idols? (As opposed to the far more proactive question, which everyone seems too afraid to pose: Why are we obsessed with erecting idols in the first place?) For some (and most specifically, for those employed in talk news) the answer to this question is “straight to hell.” Never in my adult life have I seen such a rabid drive—propelled primarily by pundits who appear to take more than a little schadenfreude in exposing (yet another) sexual predator—to excommunicate individuals from their professions (before their employers have even had a chance to evaluate each situation and weigh in on the matter; a scary social precedent, to be sure), and to hold their mock-trials in the court of social media (for an especially disturbing case-in-point, see the response of so-called “progressives” to the revelation—forecast eerily by a Roger Stone tweet—that Al Franken did some things in poor taste on his USO tours).


A resident of Happy Valley, featured in Amir Bar-Lev’s 2014 film of the same name, takes a stand by the statue erected to honor Joe Paterno’s heroic status among the community. His sign reads: “Paterno, the coverup artist !! Paterno, the liar !! Paterno, the pedophile enabler !!” © 2014, Music Box Films.

On the one hand, this sort of “cleaning house” could be argued as a corrective to the long-delayed response of Fox News executives to the litany of allegations leveled against Roger Ailes and Bill O’Reilly (et al.); on the other, the zealousness of this drive appears to be conspicuously overlooking the most powerful perpetrator in the room. And if we are not making an effort to prioritize the perpetrators who wield (and exploit) the largest amount of power, but gladly chase after those with relatively little power (wielding the digital equivalent of pitchforks and torches), then the only takeaway from this discussion is that humans are increasingly oblivious to their own addiction to the mechanisms of an exploitative society—to the extent that we routinely take private pleasure in exploiting other exploiters, all the while denying our own role in the circle of exploitation.

At the end of the day, it is this recognition of the multi-faceted power differential involved which appears to be the biggest stumbling block for most folks to navigate. Setting aside the heavy-handedness and gross over-simplification of Morrissey’s “controversial” remarks, he does appear to be clumsily pointing to a taboo truth that some people refuse to acknowledge: that in some (not all) instances, the prospective “victim” in the abuser-abused equation will find a way to subvert the power differential for their own gain. And before the reader reaches for their pitchfork, allow me to clarify that I am speaking of these matters in the specific context of exploitative behavior within an exploitative society (one cannot ignore the reality that individuals make desperate decisions under desperate circumstances). One thinks of the psychologically sound Lolita twist, for instance—in which an older man preys on a “nymphet,” only to find her turning the tables and exploiting his inappropriate adulation to achieve independence. Which isn’t to say that Nabokov lets Humbert Humbert—or his predatory leanings—off the hook; rather, he accepts the obvious element(s) in this equation before pointing to the more taboo reality lived by more than just a few individuals in this power-driven society: a reality in which the exploited learns how to exploit, for lack of other identifiable options. (In her best-selling memoir, Reading Lolita in Tehran, author Azar Nafisi provides a feminist interpretation of Nabokov’s text—which she read as a metaphor for the often oppressive experience of life in the Islamic Republic of Iran—further highlighting the on-going significance of discovering fresh social commentary in old texts.)

Alas, such nuanced observations can never see the light of day in our current debate surrounding sexual abuse (or any other issue, for that matter), seeing as how the very idea of truth has already been co-opted and distorted by opportunistic sycophants and sociopaths at Fox News (and elsewhere)—who continue making a killing, selling dumbed-down distortions and good old-fashioned lies as substitutes for insightful commentary. And on the other side of the political fence, a vehement denial of nuance in sex politics frequently appears as a thin disguise for some vaguely misogynistic impulse to deny the emotional, behavioral, and psychological complexity of the feminine experience (for it’s easier to brand every woman a victim for life, even after they’ve made peace with their offenders and politely invited their defenders to piss off). Rather than confront the messy psychology and uncomfortable truths inherent to the dynamic between victims and perpetrators of sexual abuse, talk news pundits (few of whom can be considered experts in social psychology) have apparently reached a consensus that the best way to talk about the issue is to: not actually talk about it; shame the perpetrators until they’re out of a job; and run the interviews of the victims being forced to describe (in gritty detail) their abuse to the flashing lights and rolling cameras; over, and over, and over again… Similar to the way most pundits talk about gun violence (except that no one has actually lost their job for enabling mass gun violence through aggressive gun lobbying; not that I’m aware of, at least).

And again.
What purpose could this possibly serve?

What strikes me most about Kozelek’s song (which inspired, at least in part, this meandering diatribe) is how it highlights—whether intentionally or unintentionally—the banality of its own perspective. Every time I hear the song, I think to myself, “I would never go out of my way to listen to a song that presents such a perspective with utmost sincerity”: it would be like taking Randy Newman’s “Rednecks” at face value. A song that feels so stiltedly obliged to assert moral autonomy, while somewhat sadistically proposing a recommendation of death for criminal offenders… What purpose could this possibly serve? One hopes, the purpose of irony. For in making the listener consider words and deeds of such strict moral outrage—in confronting us with our own respective failures to accept some amount of gray in our black-and-white lives—one might then feel a little wiser, considering the possibility of something else. Not unlike in the films of Fassbinder, who strove time and again to show the audience the need for change, while never spelling out what that change ought to be (after all, shouldn’t we be smart enough to figure it out on our own?)

In an essay from a book I’ve had my nose in lately, detailing the merits of RWF’s 1971 film masterpiece The Merchant of Four Seasons, author and former acquaintance Christian Braad Thomsen observes:

“Fassbinder […] shows the necessity of vigorous action on the part of the viewer. But he’s not a school teacher, who wants to raise his finger and tell the audience what they have to do, if they want to change the world. He is the Socratic artist, who uncovers how the existing possibilities of living have failed, and points out that change is necessary.”

Throughout his prolific and multi-faceted career, Fassbinder sought to demonstrate the mechanisms of his inherently flawed and power-driven society, clearly enough for any viewer—regardless of their education, intelligence, or station in life—to understand the mechanism and, in turn, recognize the need to rise above it. Three years after releasing The Merchant of Four Seasons, Fassbinder carried his vision of societal deconstruction to an even more poetic and empowered level with Ali: Fear Eats the Soul. In an interview given after the film’s release, Fassbinder observed that “I tend to think that if […] depressing circumstances are only reproduced in film, it simply strengthens them. Consequently, the dominant conditions should be presented with such transparency that one understands they can be overcome.” Rather than taking a sadistic pleasure in portraying the misery of those too enslaved by a social mechanism to recognize how breakable their chains might be, Fassbinder sought to show love for these strange creatures called “humans,” by perpetually revealing the existence of the chains—and the absence of a wizard behind the curtain. In film after film and play after play (and without any undue condescension or simplification), he succeeded in demonstrating that all individuals (regardless of race, gender, sexual orientation, age, or politics) are capable of breaking the chains of exploitation, if they only choose to live a pure existence—predicated upon the inherent human values that society continually distorts (by claiming them as its own, and often assigning them a capital/nominal value).

Put plainly: one cannot go on playing unelected judge to man’s folly while resigning oneself to a vacant culture of reactionary outrage. To do so would entail rejecting the possibility of finding a different way to live, and of demonstrating, by example, an alternative to such folly. (And if one rejects the need for an alternative to folly, one is simply a fool.)


In his five-part TV mini-series, Eight Hours Don’t Make a Day, Rainer Werner Fassbinder paints one of his most positive portrayals of people trapped inside a social mechanism they yearn to break free from. © 1972-1973, Westdeutscher Rundfunk. Renewed 2017, Arrow Home Video.

I’ve returned to Fassbinder many times over the past years, and have always come away with an uncanny sense of premonitory relevance and an inspired momentum. A DVD set of the recently rediscovered (and beautifully restored) TV miniseries, Eight Hours Don’t Make a Day, has been a close companion these past few months; I’ve rarely seen a film so genuinely positive in its outlook, let alone a film of his. His characters are shown (as usual) to be moving parts in a social machine, but all the parts in this machine move beautifully—and each on their own terms. Instead of falling back on jargon, party slogans, or naïve Marxist sentiment, Fassbinder shows characters from all throughout the power structure as somehow genuine and, just as importantly, capable of empathy (and change). He shows that action will forever speak louder than the most eloquent words—while simultaneously revealing how words and images can be employed to further the awareness of a need for action. Not just social (read: collective) action, but individual action. Presently, fleeting social movements (via trends, hashtags, and viral videos) demand wide-spread attention, and the individual finds himself trapped between a biological drive to engage with his own self-actualization and the socially conditioned response to ignore or reject this drive: to follow the horde or disappear. Hence, the “individual” is celebrated, but only on the terms of the individual’s bond with society; if the individual does anything to sever this bond, the individual essentially (and in certain cases, actually) will cease to exist.

Indeed, it would seem as though excommunication has been the only fear to consistently unify individuals, in societies across the world—and throughout the ages. The higher the threat of expulsion, the greater the anxiety in one’s life; the greater the anxiety in one’s life, the greater the relish in the expulsion of another. (Following this train of thought, one shudders to think of all the threats and anxieties our current President must have accrued in his lifetime.) I find it especially concerning that so many straight, white, and self-proclaimed “feminist” men appear to be foaming at the mouth to call out anyone who fails to speak the programmatic lingo they’ve conditioned themselves to communicate with; one wonders if some (or perhaps many) of these individuals might be protesting so loudly for fear of having their own past improprieties exposed. Either way, I imagine a casual time traveler would have a hard time distinguishing our media’s contemporary treatment of sexual abuse scandals from the House Un-American Activities Committee’s treatment of the “Red Scare”: so many people eager to see the lives of others demolished, for fear of being next in line…

In another section of Thomsen’s illuminating text (entitled Fassbinder: The Life and Work of a Provocative Genius; quotes taken from Martin Chalmers’s English translation), analyzing Fassbinder’s explosively controversial (and justifiably so) play, Garbage, the City, and Death, the author reveals a parallel theme to the circle of exploitation: the corresponding closed circle of oppression—escape from which is a far more complicated matter. Thomsen writes:

“Throughout Fassbinder’s work we see the oppressed assuming the norms of the oppressors, whether out of a conscious need for revenge or whether they have more or less unconsciously internalized the dominant norms. Fassbinder never has ‘pure’ heroes. Rather, he demonstrates one of the most melancholy consequences of oppression, that the damage to the souls of the victims makes them unable to find alternative norms, so that the only possibility left to them is to recapitulate the norms that have led to their oppression.”

While it sometimes feels as though Fassbinder has artistically resigned himself to a closed box of self-fulfilling prophecies, his work continually reminds the reader/viewer of the complexity of human behavior—a phenomenon to which, after decades of misrepresentation (or reductive representation), we appear to have grown somewhat culturally blind. (So we keep building new idols, only to tear them down after human behavior—yet again—reveals its darker potential.)


In his 1981 feature film Lili Marleen (pictured above: a resplendent Hanna Schygulla, in the titular role), Fassbinder reveals the cyclical nature of exploitation and oppression through the story of a Weimar-era cabaret singer in love with a Jewish man at the start of World War II. As its despairing plot progresses against the oppressive backdrop of the Nazi regime, all of the protagonists catch themselves in the act of being exploited and exploiting others, to survive and to pursue the faint possibility of self-actualization in desperate times.

As I continue to revisit Kozelek’s song—from one month to the next, and one criminal celebrity exposé to another—I’ve decided that I don’t ever want to catch myself reveling in the demise of another human being. Those words, “He’s bad/He’s dead/and I’m glad,” ring hauntingly hollow; they don’t feel genuine… As a stand-alone thought, without corrective, they feel like a disservice to the vastly complex potential of our human nature. And yet, one is so very often stumped, when confronted with the death of someone who did truly terrible things.

This morning, the headlines read: “Charles Manson Dead at 83.” Later in the day, upon arriving at my office, I was made aware of a death in the immediate family of one of my co-workers. I felt (and still feel) a profound sadness for my friends and their family. I thought momentarily of Leslie Van Houten, and of the families of Manson’s victims, but apart from that I could muster little in the way of an emotional response to Manson’s death. The words of Kozelek’s song ran through my head again, and again they rang false; gladness, after all, is an emotion, and I couldn’t bring myself to fit emotion into the equation of Manson’s death. A custodian at the office made small talk with me about the news while changing trash liners, observing: “Someone who did so many awful murders… If I’d had my way, he would’ve been taken out back and put down. Saved the taxpayers some money.” I acknowledged his observation, and respected his right to view the situation in such plain terms; I clarified that the death penalty was (rather controversially) suspended in California around the time of Manson’s sentencing. After our brief exchange, and upon considering Kozelek’s song and the deaths of other infamous criminals (and criminal artists) throughout history, I decided to follow my gut. After all, he was somebody’s child, and somebody loved him. Gladness seems glib, even in the plainest of contexts.

* * *

Where does all this leave us?
And what are we to do with the pieces we have left?

In answer to the first question: we are left alive and awake in the United States of America. We have a Constitution that has up until now guaranteed a fairly open space for independent speech and individual commentary. We have great books written by great minds; illuminating films by directors who see (or at least saw) the potential for the medium to show the possibility of an alternative, and the accompanying need for change. Beautiful records by our favorite musicians; museums and galleries full of artwork to expand our horizons (unlike the talk shows, reality shows, and click-driven online journals that rely upon the shrinking horizons of their viewership, in order to sustain their traffic and ratings). Blank paper, canvas, web outlets (free while they last), on which we can project our visions of an alternative and our private and collective need to change.


Anna Karina in Jean-Luc Godard’s dystopian masterpiece, Alphaville: channeling Alfred Hitchcock; channeled later by Ridley Scott. © 1965, Athos Films.

As for what we’re supposed to do with all this… well, that’s up to us: individually and collectively. I’m in no position to outline the course for an entire nation of people—each with their own individual views, ideas, convictions, and motives—but I do think we would be better off trying to learn something from the trials and tribulations we’re living through, rather than just repeating these tired tropes of scapegoating, public shaming, and language-policing (among other forms of dictatorial social conduct). As many of us grimly predicted when the results of last year’s election rolled in, this year has been a nightmare on many fronts; and short of an organized revolt or an awakening of Republican consciousness, we’ll have to endure at least one more year of the nightmare. By continuing on the course our society has traversed this past year (a course that has both recycled traditional socio-economic exploitation tropes and invented new ones, thanks to the willingness of millions to surrender their thoughts, ideas, photos, and identities to metadata collectors—free of charge—to be exploited by the highest online bidder), we are sentencing ourselves to an increasingly dystopian future in which individual thought verges on extinction, civil liberties become novelties, sex is reduced to a formal contract, and humor is no longer recognized. Like Godard’s Alphaville, only far less cool to look at.

As far as my own experience of 2017 is concerned, I like to believe that I’m leaving this year older and more tired, but wiser as well; less quick to jump to conclusions, more open to the ambiguity of life and the possibilities for change. I advance into the wilderness of a new year with the knowledge that no socially-imposed chain of exploitation can hijack my freedom to think and act in accordance with a greater wisdom. Unless, of course, I grant this chain the power to do so.

“Even Richard Nixon has got soul.”
– Neil Young
(from the 1977 song, “Campaigner,” recently reissued on his Hitchhiker LP)

Vice Principals is the show that every American adult—and more specifically, every racially disoriented white American—should probably be watching and talking about over dinner. As it becomes increasingly difficult to satirize reality (with reality itself having become an un-ironic satire of social indecency), the creators of this half-hour HBO comedy series (Jody Hill and Danny McBride) have somehow managed to pointedly encapsulate everything bad that is afflicting our country’s societal wellness—while at the same time saving a space for the remaining dregs of decency, which are routinely squeezed out of similar attempts at encapsulating our problems in dramatic form. It’s a program defined by its crass, cruel, grotesquely arch, and (often unexpectedly) black comedy. But while the ostensible victim of the show’s first season was a black woman climbing the ladder of upper management in a public school system, it is the show’s prime villain (her cold-blooded VP, Lee Russell) who undergoes the greatest scrutiny and, ultimately, comes across as the “biggest loser.” Unlike other programs (in both documentary and fiction realms), which consume themselves with endeavoring to paint the plight of the minority citizen in shades of self-pitying helplessness—with a frequently less-than-subtle nod to a (typically white) social justice warrior, who rides in like a knight in shining armor to save the damsel in distress—Vice Principals is a ruthless portrait of the victimizers; making no excuses, and taking no… well, maybe a few prisoners.


Dr. Belinda Brown (Kimberly Hebert Gregory) sizes up her two infantile and devious cohorts in HBO’s Vice Principals. © 2016, HBO Networks.

When Dr. Belinda Brown (played superbly by Kimberly Hebert Gregory)—the impossible-not-to-be-enamored-with principal who sets the show in motion—is brutally forced out of its equation at the end of the first season, one feels a profound sense of loss. But the loss is not hers: it is ours. (If anything, Dr. Brown likely considers herself released from the toxic environment of her fellow protagonists’ making.) It is our loss not to have her as a prominent part of the show’s perversely hysterical conversation anymore, being left to contend exclusively with the petty hooligans who have taken her place. In actuality, at the start of the second season’s premiere episode, we find Dr. Brown alive and well—reunited with her husband and two kids, and living a good distance from the deranged vice principals who attempted to ruin her life (and very nearly succeeded). When she begins to hint at her own departure from the show’s narrative, it not-so-subtly calls to mind the departure of our country’s previous commander-in-chief—whom we’ve since seen skydiving, vacationing with his family, and generally conducting himself like an all-around decent human being. All the while, total chaos and insanity loom in the place he used to sit, and a nation is left watching history replay itself like a warped VHS tape of white power rallies, devastating hurricanes, scarcely credible White House leaks, presidential scandals, and arrogant white kids with bad haircuts and polo shirts, armed with tiki torches to defend poorly sculpted monuments of the Confederacy (just when you think you’ve seen it all…) It’s hard to watch this last season of Vice Principals and not blow a wish for Dr. Brown to come back and give one more inspirational pep rally in the North Jackson High School auditorium—just to feel a tingle of hope, that all is not (yet) lost.

Looking back, the first season was a chore for many to sit through: it garnered justifiable criticism for subjecting viewers to an exhaustive, vicarious experience of racist/sexist intimidation and persecution—which so closely echoes the real-life experiences lived by millions of Americans. But while the show certainly has its fair share of “cover my eyes ’cause I can’t bear to see where this goes” moments, I would argue that it remains a rewarding, perhaps even necessary experience for white Americans (especially white men). It forces the viewer to witness the devastating outcomes of intolerance, but not from an easy “scared straight” perspective; instead, the viewer actually has to do some work—to connect the dots between the shallow instincts that compel a person to behave in such a hateful fashion, and the reality such a person must effectively disengage from in order to fulfill such absolute hate. For hate is, ultimately, an uninhabitable condition (something one needs constant reminding of at this point in time). To highlight this truth, there comes a moment in every episode during which VP Neal Gamby (our anti-hero-cum-protagonist, played by McBride) will catch himself in the middle of some atrociously mean-spirited act—typically provoked by his far more nihilistic partner-in-crime, VP Russell—and question his ability to follow through with his ruthless vows, eventually caving in to his own vulnerability. In these moments, the viewer recognizes that even the Scroogiest of conservative white men has a soft spot, somewhere deep down; and in this act of empathetic recognition, the viewer finds their own embers of hateful inclination slowly fizzling out. (In turn, viewers with an overtly racist and/or sexist inclination—who might, at first glance, align themselves with the diabolic intentions of Russell and Gamby—are bound to cave in by the first season’s conclusion, upon realizing the fruitless and dispiriting outcome of the protagonists’ hate.)


Vice principals Lee Russell (left, Walton Goggins) and Neal Gamby (right, Danny McBride) contend with the unsustainability of their own prejudices. © 2016, HBO Networks.

Since the inauguration of 45, I’ve been troubled by the response of many a despairing liberal to the ill-informed cocktail of bigotry and racial intimidation perpetuated by the president and his base. On the one hand, it seemed to me the reaction of liberals was disproportionately soft—compared to the out-and-out violence (verbal, physical, psychological) that we found ourselves up against; on the other, it seemed a pretty ill-advised approach to fight fire with fire: to attempt to wipe out hate by singling out and shaming the haters, many of whom are so blinded by their own misinformation that they fail to recognize their bigotry as hatred incarnate. I recall beating my head against a wall (literally), and exchanging a series of frustrated emails with friends, most of which culminated with a half-joking recommendation that we split the country in half, effectively separating the evolutionists from the devolutionists. In seeking a broader perspective, I found myself drawn to the brilliant and frequently sardonic songs of Randy Newman, which have—throughout the past four-plus decades—effectively charted the folly of the stupid white man in America; sans effigies, platitudes, or other common forms of creative scapegoating. And I asked myself: Where are the Randy Newmans of today? Where are the Gore Vidals, the James Baldwins, the Nina Simones? How come every visible attempt at protesting the ignorant insanity of 45’s America appears to swing toward the two outer extremes of timid sloganeering and destructive violence? (Fortunately, not long after I went through this line of questioning, it was announced that Mr. Newman would be releasing a new studio album later this year—providing a much-needed salve.
Far less fortunately, so many voices belonging to people of color have been effectively suppressed, repressed, depressed, or extinguished altogether; rendering it difficult for the range of creative perspectives the country ought to be represented by to truly flourish—and sentencing the fate of acceptable social protest to a kneel in a football stadium.)

Setting aside the apparent racial intolerance that has festered throughout the country (and the Russian interference that reinforced this intolerance through strategic interventions on social media), part of this dilemma likely stems from another root cause of 45’s presidency: the mindset underlying that lamentable term, “political correctness.” In hindsight, it is difficult to imagine 45’s candidacy gaining the kind of momentum it generated without the scapegoat of liberal hyper-sensitivity. Every slogan developed throughout his campaign served to highlight this critique: from “crooked Hillary,” to “bad hombres,” to “what a nasty woman,” to the cringe-inducing “he can grab my…,” to the swiftly appropriated “deplorable and proud of it,” the racial hatred permeating the campaign’s tone was matched only by its general disdain for premeditated and/or sensible syntax. And as with all false generalizations and stereotypes throughout history, there was, in fact, a justifiable criticism at the outset of this profane game of Chinese whispers. Namely, the criticism of the left’s increasingly rigid thinking on the subject of policing language: a well-intentioned effort to nip hate speech in the bud, but one that has frequently neglected to take into account the quixotic nature of its own pursuit. For just like the idealist of Cervantes’ great novel, the “P.C. police” (as they’re commonly referred to by irritable right-wingers) often find themselves tilting at windmills and missing the forest for the trees: so wrapped up in the semantics of isolated incidents, they lose sight of the motivators behind the language they are policing—which might foreseeably range from absolute, vitriolic hatred; to an infantile desire to provoke or offend; to sheer ignorance of the meanings attached to the words one has chosen.

It is within this context that Vice Principals presents a sweeping breath of fresh, tension-splitting air. Although the premise of the show is itself a persistently tense exercise in caustic polarization, the manner in which it mirrors the real-life tensions surrounding its creation (considering that the first season’s airing coincided with the peak of the 2016 election) serves to deflate the pressure accompanying its subject matter. Here we find three character types that are frequently subject to the “politically correct” treatment—an effeminate, plausibly closeted gay man; a heavyweight divorcé; and a well-educated woman of color—released from the popular liberal’s cocoon of cultural suffocation, and allowed to live and breathe as characters that are every bit as nuanced as they are dense; almost like actual people. And if the show has a secret ingredient in the recipe of its greatness, it most likely lies within this astute recognition that vilification and deification are equally ineffectual tropes (both in narrative terms, and in lived reality). It would be easy—all too easy—to rewrite the show with Gamby and Russell (embodied by the relentlessly brilliant Walton Goggins) as dyed-in-the-wool hate-mongers, with a cheaply sketched-in backstory of how they came to be so hateful (e.g. childhood abuse, bullying, exposure to violent crime): the rest of the series—assuming the form of a prime-time melodrama—would essentially write itself, with the characters either achieving progress towards an awareness of the origins of their respective prejudices; or, conversely, digging their heels in deeper and, eventually, falling on the sword of their own bigotry. Not only would such a literal execution of the premise be uninteresting: it would also render it increasingly difficult for the actors to bring any real pathos or complexity to their characters, since such a narrative is ultimately a glorified journey from point A to point B.
In other words, this more “sensitized” approach would be the antithesis of a real person’s life journey, which invariably traces a more complex trajectory through various stages of change and emotional/intellectual growth.


Gregory (right) provides the heart and soul, and Goggins (left) the diabolical thrust behind Vice Principals—the only great satire thus far broadcast on American television in the year 2017. © 2016, HBO Networks.

Rather than taking the easy way out of contending with bigoted protagonists, Hill and McBride have boldly chosen the more challenging, and far more rewarding, narrative approach. In Gamby and Russell, they have created two strangely… lovable bigots. Not that one loves them because of their bigotry (the show is structured in such a way as to render such sympathies unlikely); one loves them in spite of the raging ignorance and intolerance that continually threatens to swallow them whole. Instead of being vilified and caricatured as two creatures from the black lagoon who’ve arisen to claim some distorted interpretation of supremacy, Gamby and Russell are just two stupid white boys with no real grasp on the concept of emotional maturity—and watching their psyches disintegrate from episode to episode is every bit as comical as it is maddening. Not unlike our current president, whose racist inclinations frequently appear to stem less from an inherent sense of racial superiority (I mean, just look at him) than from a cynically strategic approach to soliciting support from pockets of the U.S. voter base that any seasoned politician with a modicum of decency would refuse to entertain (e.g. David Duke and his cohort, and at least half of our Presidential Cabinet). But the real masterstroke of Vice Principals is that, despite the uncanny parallels between our presidential administration and the admin of North Jackson High, the show succeeds precisely where the president’s administration has failed: by actually making us care about the fate of its ignorant protagonists.

It is safe to say, at this point, that hardly a person in the country—or, more broadly, on the face of the earth—can be bothered to care about the personal fate of the 45th president. It is, in fact, difficult to think of any figure in our nation’s history who has been so widely (and so justifiably) reviled, across the board of political identification and cultural affiliation. And true to form, 45 has surrounded himself with individuals who only serve to further dehumanize his public persona: compounding the reality television aesthetic of his own making, and continually escalating the threshold of public disdain. And I would here argue that it is this aesthetic of idiocy—this constant talking down to the citizens of a country who, by and large, know they deserve better—that presents the biggest hurdle for his detractors to surmount. The brilliantly monotonous condescension of Maxine Waters, in addressing one of the president’s multiple administrative chumps, Steve Mnuchin, provides a case study in the only appropriate way one can respond to such arrogant bluster: consistently raising the point (“reclaiming my time”) of our administration’s inadequacy, incompetence, and seemingly interminable disrespect towards the citizens whose interests it has been charged to uphold.


Dr. Belinda Brown: carrying on with conviction and humor. © 2016, HBO Networks.

Likewise, in Season 2 of Vice Principals, Dr. Brown brilliantly dismantles Neal Gamby’s initial hypothesis regarding his violent assault at the culmination of Season 1: upon being accused of Gamby’s attempted murder, the former North Jackson High principal scoffs at the suggestion, instead drawing Gamby’s attention to a tattoo across her back—depicting her two former vice principals actually eating shit, while smiling and amorously holding hands. It’s her own personal idea of revenge: a gesture that hilariously highlights the racial divide at the heart of Season 1’s tension. For whereas the white male testosterone pumping through Gamby’s and Russell’s systems repeatedly compels them to acts of childish violence and lashing out, the cool “been there, done that” attitude of Dr. Brown—whose past experiences with indignant white men can only be imagined by the viewer—empowers her to keep calm and carry on, with humor and conviction: two things the country (if not the world) is in most dire need of now.

It has yet to be seen how the remainder of the series will play itself out. As Russell and Gamby delve deeper into their farcical investigation of Gamby’s shooting, one can’t help but think of the President’s own glorified wild goose chase: to single out his dissenters, and thereby satiate his acolytes with a gushing fountain of persecutory accusations directed at the liberals they all thumbed their noses at this last election (or, to expand upon this metaphor with an even more precise one, the noses they cut off to spite their own faces). Two well-played scenes in the most recently aired episode serve to highlight this real-life parallel: in one, Gamby enlists a black security guard from the school to search the lockers of multiple black students, all of whom he has targeted as prime suspects for his attempted assassination (without a shred of evidence, of course). After finding nothing but homework, textbooks, and a scientific calculator in one boy’s locker, the security guard observes in a disheartened tone: “Man, you actually made me think he was guilty!” The other scene in question entails Russell planting a hot mic in the teachers’ break room, in order to tune in to the gossip taking place behind his back (most of it directed at his gaudy wardrobe, social awkwardness, and apparently deadly halitosis): when he later proceeds to fire his entire faculty for subversion, one immediately thinks of Sean Spicer, Steve Bannon, Reince Priebus, Sally Yates, Michael Flynn; the Mooch.

For some prospective viewers, this will all prove a little too much too soon. And yet, in bringing ourselves to truly care about the fate(s) of Gamby and Russell—in wanting them to get at least a little woke; to stop being such selfish assholes, and to play a little bit nicer—there’s a chance we might bring ourselves to care a smidge more about the fate of this altogether asinine administration, along with the misguided minions who stubbornly refuse to withdraw their support for it. In turn, and for better or worse, it is they who now dictate the fate of our nation.