History in the Home of the Brave.

“There’s no such thing as love, only proof of love.”
– Jean Cocteau

 

Is there such a thing as cinema? Do the images that flicker for us on that big screen—paired with foley effects, synced dialogue, and original scoring—compose something tangible and identifiable? Or is it all an illusion; a reproduction of a dream (that most intangible and abstract concoction of all)? More pressingly: is there still a place for cinema, in the age of social media (with its foremost byproducts: outrage and attention deficits), online dating, and reality TV presidents?

It’s a question that has been swirling around the toilet bowl of movie nerd-dom for several years now—fielded primarily by a circuit of twenty-something film school brats (I use the term endearingly; they all appear to be gainfully employed at IndieWire now, so it would seem they’ve landed on their feet), adjusting their glasses as they alternately defend the politics of streamable distribution formats, or decry the disappearance of that communal experience once known as going to the movies. As far as this writer is concerned, the debate can be rendered irrelevant with a simple understanding that where there is a will, there is a way; and regardless of the production/distribution methodology, we have a century-old addiction to recreating our dreams for projection on the big screen. This is unlikely to disappear outright—particularly if one considers that dreams are in greater demand than ever.

Last year saw the demise of many socio-cultural norms and institutions. It also bore witness to some awe-inspiring new works by our country’s foremost dream-makers, and the emergence of some powerful new voices in American cinema. In the former category, no achievement can match the awesome feat of Mr. David Lynch—whose 18-hour-long masterclass in film-making (Twin Peaks: The Return) has left viewers throughout the world kneeling in the dust of its tailspin; bowing to the shape of its receding genius. In addition to Lynch’s crowning achievement, there were strong showings from other established auteurs, including Paul Thomas Anderson, Noah Baumbach, and Todd Haynes. We were served a generous helping of the profoundly twisted, Hitchcockian meticulousness practiced by David Fincher (whose original miniseries, Mindhunter, gives long-form life to the investigative-cum-philosophical theorism of Se7en and Zodiac); we were also granted a fresh dose from the perceptive, loving, and quintessentially American gaze of Richard Linklater. In the newcomer category, there was a powerful entry from Catherine Gund and Daresha Kyi (Chavela); a directorial debut by the fabulously deadpan Greta Gerwig (Lady Bird); and a wobbly but noteworthy second feature by Eliza Hittman (Beach Rats). There was also an imperative documentary on the late civil rights activist and prolific writer, James Baldwin (I Am Not Your Negro, directed by Raoul Peck; worth the price of admission, but regrettable for its failure to tackle the full scope of Baldwin’s contradictory existence), and the surrealistic late-night comedy flair of Jordan Peele—successfully channeled into big screen, feature-length form in the topical blockbuster Get Out.


Photographer JR paces a beach in Normandy, where he and Agnès Varda have just pasted one of many portraits taken throughout Faces Places on the base of a WWII bunker—which was pushed off the precipice of a nearby cliff. © 2017, Cohen Media Group.

On the international stage, we were blessed with offerings from the subtle genius of Ms. Agnès Varda (whose latest documentary, Faces Places, is a fountain of joys), the sensuous intellectualism of Luca Guadagnino (in the James Ivory-penned audience favorite, Call Me By Your Name), and the slick auteurism of Denis Villeneuve (whose eagerly anticipated sequel to Ridley Scott’s seminal masterwork—Blade Runner 2049—left me breathless and teary-eyed). We encountered the quietly mysterious spiritualism of Olivier Assayas (who brilliantly melded the mystical horror of Nic Roeg’s Don’t Look Now with the existential melodrama of Krzysztof Kieślowski, in his original film Personal Shopper), the stark realism of Francis Lee (God’s Own Country), and the smarter-than-average populism of Guillermo Del Toro (The Shape of Water). And while I could easily use this essay to sing praises to each of these international works, it seems to me—with all the tumult and unrest engulfing us on the national (and international) stage(s)—that a more pressing need may be met by attempting to highlight the fruits of my homeland: a country that has, since its very inception, provoked justifiable skepticism around its merits.

Much has already been written about ongoing struggles pertaining to inequality and sexual harassment within the American film industry (along with every other facet of our socioeconomic structure). The movement to systemically advance opportunities for marginalized individuals—and the parallel movement to raise awareness of the plight of those experiencing institutionalized harassment and discrimination—is long overdue. Perhaps because of this delayed reform, it seems there may be an unfortunate residual effect emerging from this discourse (and more specifically, from the online social media factor; for while this technology has proven well-suited to a number of ends, social progress has scarcely been one of them). That is, the tendency to cynically lament the shortcomings of a given system—in 2016, the “swamp” of Washington, D.C.; in 2017, Hollywood—all the while forgetting that not every individual involved in said system represents said shortcomings.

For instance, if we are to examine the strengths and deficits of the United States, circa 2018, it would be easy—too easy—to highlight the deficit column, and disregard altogether the finer qualities we’ve represented more capably in the past. But would such emphasis prove these qualities to be nonexistent in the present? Or would it merely bring to light the fact that these merits are an integral part of the American fabric—that they have fallen on hard times, and may need some attention to flourish once more? I am hopeful that this new wave of social activism will contribute to the reignition of our country’s innovation and resiliency; qualities which have fallen by the wayside for some time now (at least as far back as our cultural shift in definition—from innovation: discovery and development, to innovation: app development). I am fearful that—within our climate of antagonistic communication patterns, totalitarian politics, and a general predisposition toward reactionary patterns of behavior—this form of activism may all-too-easily be thwarted by neo-conservative powers, intent on branding minority-status citizens as victims for life, and thereby curtailing their power to advance the causes of restorative justice. Regardless of my hopes and fears, I have always found the presentation of a viable alternative to be the most effective strategy for social change (as opposed to the incessant hounding of those already well-known for fostering inequality; lest we forget that all publicity is good publicity, for those with no dignity left to jeopardize).

In a similar vein, I don’t see much merit in harping on the immense miscarriage of finance that underlies the majority of Hollywood’s output (beyond pointing out that such a miscarriage exists). I’m a firm believer that, in a consumer society, we empower the type of work we want to see more of, whenever we make our selection at the box office ticket counter. Although the aggressive powers of marketing have escalated exponentially these past few decades—culminating in our present-day, tail-wagging-the-dog marketplace mentality—we are the ones who ultimately empower (or discourage) the makers of plastic cinema, when we hand them our attention and our money. Which is why most of us adopt a selective approach in our movie-going habits (especially given the absurd escalation in ticket and concession prices): just as in the world-at-large, one can have a positive impact on the future of cinema by supporting the proofs of cinema which advance its more worthwhile attributes. And while each viewer has their preferences, I find it remarkable that so many of these attributes have long been shown to be universal. Consider the phenomenon—that a single film can be understood and lauded (or derided) by different nations of people, throughout every corner of the world. That we can each learn from the perceptions and experiences of perfect strangers, and in so doing develop a greater capacity for love and understanding. May this phenomenon never be taken for granted.

For the purpose of this entry (and for the cause of restoring some honor and dignity to a country that has little to champion in either department, as of this writing), I have chosen five of my favorite American films released in 2017: to hold them up as shining examples of our more worthy attributes; and to remind the reader (if one is in need of reminding) that there is still much worth championing in the American landscape. In times such as these, we may all need reminding.

 

Lady Bird
written & directed by Greta Gerwig; starring Saoirse Ronan, Laurie Metcalf, Tracy Letts, Lucas Hedges, and Timothée Chalamet
released by A24 and Universal Pictures 


Greta Gerwig directs Saoirse Ronan and Timothée Chalamet in a scene from her beloved directorial debut, Lady Bird. © 2017, A24 and Universal Pictures.

I was first made aware of Greta Gerwig when I saw the first of several Noah Baumbach vehicles in which she appeared—the under-valued (in this writer’s opinion) and surprisingly buoyant dark comedy, Greenberg. I immediately took note of the name. There was something in the way she brought her character—and, consequently, the film—to life; something I couldn’t quite put my finger on, and didn’t particularly care to. I hate to use the term “star quality,” seeing as how what passes for a star these days would make the likes of Bogey and Bacall roll in their graves. Suffice it to say, Gerwig has the sort of innate brilliance and affability that could inspire one to ask her out for a cocktail, and debate whether Gene Kelly or Fred Astaire was the better dancer (for no other reason than to hear the sound of her voice as it struggles to keep pace with the winding movements of her wit).

Gerwig has already had a terrific run (and she’s only just begun), appearing in a pair of films she has since co-written with Baumbach—her erstwhile paramour—as well as giving memorable turns in works by Todd Solondz (Wiener-Dog) and Rebecca Miller (Maggie’s Plan). Watching her take the lead and walk away with every scene in Frances Ha fostered in this writer the sort of unabashed, film-loving glee that only comes around once in a blue moon; the film’s nouvelle vague aesthetic, rather than making it appear dated, actually served to highlight the confidence and strength of its content and delivery. A year before that, I was positively enchanted by her incarnation of Whit Stillman’s alter-ego, Violet, in his politely subversive and drier-than-a-communion-wafer gem of a film, Damsels in Distress: finding myself one of only two people in the theater (the other being my companion) to laugh hysterically at its tenderly acerbic take on the follies and neuroses of bourgeois young adults, I wondered if Gerwig’s particular (some may say peculiar) sensibility could ever connect with a broader audience. Half a decade later, as I sat in the packed house of that same theater for a screening of her Oscar-nominated directorial debut, I grinned and laughed uncontrollably; I thanked all of our lucky stars this moment had finally arrived.

While one is never in doubt as to the film’s author (one can practically visualize Gerwig acting out every part in the movie during script readings), the ensemble cast of Lady Bird deserves a standing ovation for their dedicated and cohesive effort to bring Gerwig’s writing to life. I was especially taken with Laurie Metcalf (who, in addition to Saoirse Ronan—the film’s protagonist—is now up for an Oscar) and Stephen Henderson, whose subtle performance as a theater instructor in the Catholic high school attended by Lady Bird has lingered in my memory. Lady Bird’s rotation of friends and acquaintances is equally memorable: from the “shitty Pavement fan” (Gerwig’s verbatim direction) boyfriend played by Timothée Chalamet, to the helplessly perky ex-boyfriend played by Lucas Hedges (most immediately recognized as the kid in Manchester by the Sea), to her best friend and confidante, Julie (a beaming Beanie Feldstein).

Given time, Lady Bird is likely to be lumped in a basket with every other coming-of-age comedy to ever achieve critical acclaim (The Graduate, Clueless, Rushmore, The Breakfast Club, etc.). And while there would certainly be some fine company in this basket, it would be a disservice to the extraordinary nuance of Gerwig’s film—which, unlike The Graduate, with its stylish cynicism (or Rushmore, with its stylish stylism), happens to be an unexpectedly intricate and layered portrait of adolescence; above and beyond what most are accustomed to getting out of a Wednesday matinee. That such an unabashedly smart, disarmingly confident slice of American film-making could emerge from our current cultural climate—and in the process, achieve international acclaim—is a testament to the finer qualities of the American sensibility. It is also a testament to the (possibly boundless) potential of a strong, idiosyncratic voice in the latest chapter of our nation’s cinema.

 

Last Flag Flying
directed by Richard Linklater; written by Richard Linklater & Darryl Ponicsan; starring Steve Carell, Bryan Cranston, Laurence Fishburne, J. Quinton Johnson, and Cicely Tyson
released by Amazon Studios and Lionsgate


Left to right: Bryan Cranston, Laurence Fishburne, and Steve Carell play three Vietnam war veterans in Last Flag Flying, Richard Linklater & Darryl Ponicsan’s “spiritual sequel” to The Last Detail. © 2017, Amazon Studios & Lionsgate.

It is probably no great secret, among my friends and fellow movie fanatics, that I have a strong affinity for the work of Richard Linklater. Ever since my first viewing of Waking Life, in the form of a DVD borrowed from my local library, I have followed every step of Linklater’s career—with a mixture of fascination and mild apathy (something tells me he would approve of this response; it’s mostly fascination, anyhow).

In Last Flag Flying, Linklater tenderly pays tribute to another great film love of mine—the late Hal Ashby; whose 1973 adaptation of the earlier Darryl Ponicsan novel, The Last Detail, provides much of the spirit for Linklater’s quasi-sequel. It’s an honest, considered, personalized adaptation of the sequel Ponicsan wrote three decades later (at the height of the second Gulf War): in many regards, the narratives run parallel to each other; but this later entry is more firmly rooted in the trenches of death, and the sorrow of survival. Their events seem to overlap: in both stories, for instance, the three protagonists share a night on the town in New York—and subsequently miss their train. The fact that in one they’re looking to get laid, while in the other they’re looking to buy some mobile phones, is entirely beside the point; the echo effect is palpable, and it is bound to resonate with fans of Ashby’s cult classic. A large part of what renders Last Flag Flying such a noteworthy feat (or proof) of American cinema is this sense of connectedness: with the histories of its characters; the histories of its authors; and with the most radically inspired, promising film era in our nation’s cinema (spanning ’68 to ’79, or thereabouts; also the timeline for Ashby’s career). Some may deride this sort of praise as high-handed, but as our connectivity to history becomes increasingly scarce—with sound bites and YouTube clips superseding context and formal analysis—I think it’s warranted.


Left to right: Otis Young, Randy Quaid, and Jack Nicholson play three Navy men in Hal Ashby’s 1973 adaptation of The Last Detail. © 1973, Columbia Pictures.

What is most notable about this picture, perhaps, are the thoughtful ways in which Linklater asserts his own personality and characterization throughout. For whereas both Ashby and Linklater linger on the spiritual questing of troubled characters, Linklater advances the quest through a far more directly pointed approach. In The Last Detail, Jack Nicholson’s “Badass” Buddusky rolls his eyes during a gathering of Buddhist chanting practitioners; in Last Flag Flying, Bryan Cranston’s Sal embarks upon an incessant, often irritating (intentionally, at that; and effectively, kudos to Cranston) tirade against the perceived-as-indoctrinated rationale of his former buddy—now-Reverend—Richard Mueller (Laurence Fishburne). Which isn’t to say this confrontational perspective belongs to the director himself (though the viewer may pick up subtle shades of Ethan Hawke’s Jesse in Cranston’s Sal); Linklater merely had the wisdom and good faith to reveal, whenever possible, the changes that time has inflicted upon his characters—along with the changes time has withheld. That there is no direct connection between the three characters portrayed by the actors in each film is especially effective—and affecting: for by pointing to separate instances of similar life patterns, Linklater and Ponicsan achieve a far broader sense of connectivity with the human condition. It’s the sort of artistic gesture that reveals how, even though our behaviors are developed through a complex mixture of environmental and biological triggers, they frequently perpetuate themselves through stubborn repetition, and through subjugation to damaging social constructs (in this case, the construct of war). And if the complexities of human behavior can be perpetuated, it follows they must also be capable of change.

In keeping with this insight (which doesn’t emerge until farther along in the characters’ journey), Last Flag Flying closes on a dark but optimistic note. The resolution belongs to Steve Carell’s character—an ex-Navy corpsman known as “Doc” Shepherd; the heart of the film, in more ways than one (Carell’s performance being a quiet and inexorable force throughout). The film fades out as “Doc” achieves a sort of closure with the premature death of his only son; the song that fades in during the end credits is “Not Dark Yet,” from Bob Dylan’s beautiful late ’90s offering, Time Out of Mind. It provides the perfect postscript for the trajectory of these characters—a trio of Vietnam war veterans struggling to connect the dots of their scattered lives (“I can’t even remember what it was I came here to get away from”). It also manages to connect their struggle to the more immediate struggles faced by our country, at this specific juncture in history; for as we sit around, waiting for someone to step up and dethrone the lunatic who’s been given free rein to distort our country for private gain, many of us search for signs of hope—struggling to find some comfort in the paradox betrayed by Dylan’s song: it’s not dark yet, but it’s getting there.

 

Chavela
directed by Catherine Gund and Daresha Kyi; starring Chavela Vargas, Pedro Almodóvar, Elena Benarroch,  Miguel Bosé, and Liliana Felipe
released by Aubin Pictures


Pedro Almodóvar and Chavela Vargas: two rebellious spirits, captured in Catherine Gund and Daresha Kyi’s exceptional documentary, Chavela. © 2017 Aubin Pictures.

I am so grateful that my local art house cinema (Neon Movies) picked up this very special and memorable documentary; it was particularly rewarding to have one of the film’s co-authors, Daresha Kyi, in attendance for a live Q&A post-screening. Her pensive and often comical commentary validated all of the finer presumptions this writer had gathered from the screening, but it also served to open up many of the complexities and contradictions scattered throughout the surface (and subtext) of Chavela.

According to Kyi, the process of making a documentary about the famed (and infamous) Mexican chanteuse, Chavela Vargas, began under different circumstances than what one sees in the finished product. The project actually originated with an in-person interview, conducted by Catherine Gund with Chavela at the start of the singer’s first major comeback in the early ’90s. Having gone through her personal archives and digitized all the decomposing film lying in canisters around her studio, Gund rediscovered the power of this twenty-some year old footage, and felt compelled to share it with Kyi. Upon viewing the footage together, and catching up on the later years of Chavela’s life story, the initial concept developed by Gund and Kyi involved having another Latina chanteuse narrate Chavela’s story through her own personal lens. Gund and Kyi assembled a rough promo edit of this approach, then screened the material for a group of potential investors. The consensus was clear: forget about the other singer (whom Kyi did not refer to by name during the Q&A); the story is Chavela’s, and she should be the star of her film.

Upon approval of an expanded budget, Gund and Kyi were able to license footage from different televised interviews and performances, conducted at various times throughout Chavela’s complicated (and at times, difficult to trace) career. They proceeded to film present-day interviews with persons of interest, spanning the course of Chavela’s professional and personal development: a former lover (and life-long private attorney); the Spanish filmmaker, Pedro Almodóvar (who was partly responsible for Chavela’s European comeback tour, along with Laura García-Lorca); and the singer and long-time admirer, Miguel Bosé. Weaving together the present-day interviews with archival materials, Gund and Kyi have achieved a seemingly well-rounded, often contradictory portrait of their subject—a character whose most prominent qualities arose from her own contradictions. Chavela’s story is alternately inspirational and tragic; outrageous and miraculous. It’s a story (and a voice) that resonates with the most profound notes on the human scale, triggering pulses and emotions that strike the viewer/listener on a multitude of levels. The film’s emotional power serves to eulogize the life of the film’s subject, but it also reminds us of the forest we sometimes fail to perceive—among the tangled trees of this modern existence.

It seems we have reached a point in our history, where tensions have risen about as high as they could possibly rise: we see many of our fellow Americans running for cover from their perceived opponents, from one uncertain day to the next. In times such as these, there is greater pressure than ever to conform to some kind of an agenda; to restore some modicum of stability, or at least the illusion thereof. In the midst of all this pressure, Gund and Kyi gently remind us that many great figures in world history happen to be individuals who refused to conform: women like Chavela, who first made waves by refusing to wear a dress—and later, by rejecting the more limiting definitions of the contemporary LGBTQ vernacular; men like Pedro Almodóvar, who refused to make boring, run-of-the-mill, politically “sensitive” comedies—eventually finding his own niche audience through a celebration of the most outlandish and perverse attributes of outlandish and perverse characters (and narratives). Theirs are the sort of rebellious gestures that will retain their power and intrigue, long after the sediment of history has settled above them.

Gund and Kyi are smart enough to not impose an expected emotional response to the story of their film’s protagonist (unlike the makers of Amy, a film which Kyi admitted to being inspired by, but which she has visibly surpassed): the audience I was a part of responded to Chavela’s story in a variety of ways, and I found this reassuring. For it gives one hope that one day, our dominant culture may catch up with this time-earned awareness: that new possibilities can only arise when we allow our agendas to be challenged, and maybe even discarded (and conversely, possibilities will wither and fade away, whenever we permit an agenda to override a truth).

 

Phantom Thread
directed by Paul Thomas Anderson; starring Daniel Day-Lewis, Lesley Manville, and Vicky Krieps
released by Focus Features


The stunning power couple of Daniel Day-Lewis and Vicky Krieps share a New Year’s dance in Paul Thomas Anderson’s exquisite melodrama, Phantom Thread. © 2017, Focus Features.

Phantom Thread, the eighth film by American maverick Paul Thomas Anderson, is one of the finest pictures of 2017—and a powerful reminder of every quality that is unique to the tapestry of American cinema. Like Linklater, Anderson is an artist in touch with his film ancestry, unafraid to wear his influences on his sleeve; and much like Linklater, he refuses to cave in to the traps of plagiarism and self-aggrandizement. That his work often carries reverberations of Altman and Scorsese never implies an attempt to elevate his efforts beyond their given potential: rather, these reverberations serve to point the audience in the direction of a cinematic context—highlighting differences as much as similarities, and revealing the greatest common thread to be a stubborn adherence to one’s own dream logic.

Much like his previous Daniel Day-Lewis vehicle, the now-cult-worthy There Will Be Blood, Phantom Thread has the quality of a runaway fever dream. But whereas in the previous outing, this sensibility was carried to the extremes of emotional abstraction and narrative inscrutability, their most recent effort takes a more carefully deliberated and thoughtfully contained approach. When one revisits the bulk of Anderson’s output, one often finds an artist struggling to incorporate as many of his (often brilliant, sometimes baffling) ideas as possible into manageable feature-length form. In Phantom Thread, we find the same filmmaker who was responsible for the more quietly austere debut feature, Hard Eight (a.k.a. Sydney): an artist intent on chipping away at the excess—to sculpt a shape defined as much by its omissions as by its features. The resulting effort is ambiguous but precise; perversely comical (in a manner that would’ve made Buñuel blush) and intensely, convincingly melodramatic. It’s nothing short of a cinephile’s dream.

Although it is likely true that all great movies begin with a solid script, Anderson’s films often seem heavily predicated upon their casting (something that could just as easily be applied to Robert Altman, of whom Anderson was an avowed admirer). A substantial part of the joy provided by witnessing Phantom Thread as it unfolds stems from the organic spark between the film’s three stars—each of them delivering Oscar-worthy turns—and the characters to whom they’ve so adroitly given life. Lesley Manville, in particular, provides a sort of cornerstone for the elaboration of the film’s more subtle character constructions: in her own words (as quoted in a BFI interview), she embodies “this person who is quite rod-like, and can do so much with just one flicker of the eyes.” Around this immovable fixture, the heightened emotional volatility of Daniel Day-Lewis (as Reynolds Woodcock) and Vicky Krieps (as Alma Woodcock) swirls in varying degrees of pathological complexity: at times revealing itself to be an extension of the characters’ personal traumas—such as the chillingly gorgeous sequence, in which Reynolds evokes the ghost of his mother—and at others, boiling out of the alchemy between their respective pathologies. Ultimately, all three characters emerge with the sort of understated depth and intricacy that has, up to this point in film history, only been afforded the likes of Moira Shearer and Anton Walbrook (in the great British films of Michael Powell and Emeric Pressburger). Like all great American auteurs, Anderson knows to steal only from the best.

On the other side of the vaingloriously chauvinistic posturing of Day-Lewis, Krieps shines as a sly sort of antidote to the suffocating dogmatism of over-zealous social (media) activism. Quoted in the same BFI piece mentioned above, Krieps observes: “I respect Alma so much because she doesn’t really need the recognition or the approval, and this makes us strong… If a woman is not seeking this approval, this is a strength that’s stronger than anything, and you don’t then have to fight your ground, you just take your ground. What I like about the movie is that it’s about a dance between a man and a woman. It’s not about who’s stronger and it’s not about who will win. Once we get past this idea of ‘are the men stronger or the women?’ and just accept that men and women are ultimately completely different and completely opposite and will never be the same—until we understand and accept that—we can then have the conversation, the real conversation we really need. That’s when it will be interesting.”

Perhaps no writing on Phantom Thread captures my feelings about the film more capably than the review penned by A.O. Scott for the New York Times: “There are movies that satisfy the hunger for relevance, the need to see the urgent issues of the day reflected on screen. Paul Thomas Anderson’s eighth feature—which may also be Daniel Day-Lewis’s last movie—is emphatically and sublimely not one of them. It awakens other appetites, longings that are too often neglected: for beauty, for strangeness, for the delirious, heedless pursuit of perfection. I’ve only seen this film once […] and I’m sure it has its flaws. I will happily watch it another dozen times until I find them all.”

 

Wonderstruck
directed by Todd Haynes; starring Oakes Fegley, Julianne Moore, Michelle Williams, Millicent Simmonds, Jaden Michael, and Tom Noonan

released by Amazon Studios and Roadside Attractions


Todd Haynes looks down on the immersive New York City panorama—showcased unforgettably at the conclusion of his latest offering, Wonderstruck. © 2017, Amazon Studios & Roadside Attractions.

Todd Haynes is one of the finest American artists working today, and I hope the relatively poor performance of this latest offering (which left critics and audiences scratching their heads in unison) does nothing to dissuade him from following his gut—and venturing far into the wilderness of his boundless and brilliant imagination in the projects yet to come. (And dear god almighty: may the financing keep flowing.) If one reviews Haynes’s filmography to date, one may well identify a knack for engaging in meta-historical conversations with the history of art itself: from the inter-textual experimentalism of Poison (where Jean Genet, AIDS hysteria, the ’50s family melodrama, and the American B-movie collide in exquisitely strange unison), to the daring innovation of his pop music biopics, I’m Not There and Velvet Goldmine (both of which draw from a near-exhausting wealth of inspirations), to the so-far-ahead-of-its-time-it’s-frightening masterpiece, Safe (driven by the finest performance in Julianne Moore’s career-to-date, and an anti-aesthetic conviction that could have given Kubrick a run for his money—in its brutal, unrelenting aim to reveal the power of environment-over-character). And let us not forget the deceptively straightforward melodrama of Far From Heaven, a film so profoundly entangled in the yarn of its own history—which includes the melodramas of Douglas Sirk, the mythology of Rock Hudson (the reluctant Hollywood queer archetype), and the New German cinema of Rainer Werner Fassbinder—that most viewers barely begin scratching the surface of its possible interpretations.

I suppose any commentary on Haynes’s work is bound to solicit accusations of cinephilic elitism and hyper-cerebral analysis. And while such accusations may be warranted, I will readily revert to the same defense offered Last Flag Flying: that with so many contemporary film-makers disengaging from the quilt of film history, is it not acceptable for a handful of our remaining innovators to champion their roots and—more importantly—explore the remaining possibilities for cinematic evolution? For if the reader is open to such a notion, Wonderstruck will likely prove a rewarding and thought-provoking experience. It’s the sort of children’s movie we used to excel at producing in this country, but have seemingly forgotten how to tackle in more recent years. Haynes taps into the unstated wisdom of childhood: namely, a child’s natural ability to withstand the unfathomable sadness of their own existence; a sadness which many of us, as adults, find ourselves less equipped to withstand. Beyond this insight, Haynes revels in the mystified, tangent-prone mindset of his characters. He is the proverbial “kid in a candy store,” and it shows with every frame: just as the children are inclined to impulsive flights of fancy, Haynes is prone to indulge in the occasional bit of cinematic homage (in this instance, a couple of clever, well-played nods to Being There) and self-referentiality (as in the use of stop-motion dolls to reconstruct his characters’ fading memories, calling to mind his now-iconic use of Mattel dolls in Superstar: The Karen Carpenter Story; or the use of David Bowie’s “Space Oddity,” calling to mind his thinly-veiled reconstruction of the Ziggy Stardust story in Velvet Goldmine).

What sets Haynes’s work apart from the mass of self-made auteurs (many of whom bask in the onanism of referencing their own work) is his commitment to conversing with the work of other filmmakers, as much as with his own. And to this end, Haynes betrays a rather singular proclivity for establishing context around his art. Not unlike David Lynch (perhaps his closest relative, in postmodern terms), Haynes provides all the necessary clues for the audience to engage in their own private dialogue with his work. As artists, they share in a recognition that their audience will bring their own plate to the table; and they both know better than to dictate which ingredients their audience should eat. From this perspective, all that matters is that the audience be granted sufficient information to trace the lineage of the food on the table, if they so desire. (Or, if they’re inclined towards a more immersive experience, they can ignore the trail of clues altogether and just savor the feast.)

As for the story of Wonderstruck, suffice it to say that it is every bit as simple and convoluted as a children’s book ought to be (it is adapted from a hefty novel by Brian Selznick, which I have not read). All of the actors deliver strong, convincing performances—particularly newcomer Millicent Simmonds, who has the capacity to break your heart before forcing a smile in the course of an instant—and Carter Burwell’s scoring is sublime throughout. Without a doubt, the best write-up the movie could ask for was provided by the amiable John Waters, who coyly suggested in his year-end top 10 list: “Want an IQ test for your cinephile children? Just take them to see this beautifully made, feel-good kids’ movie about the hearing-impaired, starring a little girl who looks exactly like Simone Signoret. If your small-fry like the film, they’re smart. If they don’t, they’re stupid.”

* * *

So there you have it. Five proofs of American cinema; five signs of hope—that there are still those among us with adequate wisdom, perseverance, and vision to point a way out of the darkness. May these bright lights among us continue to shine through the falling night, and may they inspire others to do the same.


Jaden Michael, Oakes Fegley, and Julianne Moore look up with wonder at a sky full of possibilities. Wonderstruck © 2017, Amazon Studios & Roadside Attractions.


or: An Open Appeal to a Sane Society

Meet the new houseguest who doesn’t intend to leave: the horror movie that doesn’t seem to end, and that you’re not allowed to look away from. Like a 21st century variant of Burgess’s Ludovico treatment—only worse, because you’re actually living with the images forced upon you by some diabolical overlord. Enter the age of 45: the Hotel California of the new millennium. Life confined to a locked, low-ventilation room; with a wild badger thrown in for companionship, and the expectation that you’ll keep cleaning up after the damage—while never being offered the option to expel the badger altogether. At least, not as long as the ratings are up.


Alex DeLarge undergoes the Ludovico Treatment in Stanley Kubrick’s adaptation of Anthony Burgess’s 1962 novel, A Clockwork Orange: the treatment entails forcing the patient to watch films of crimes and historic horrors, with the intent that exposure will prevent the patient from committing further crimes. Suffice it to say, the treatment is not entirely effective. © 1971, Warner Pictures.

As I sit here—wide awake, still a little stunned by the Senatorial victory of (Democrat) Doug Jones in the well-established Red terrain of Alabama—I realize just how much this bit of good news means to me: to my mental wellness, and my general sense of empathetic orientation with the human race. An orientation that has been shaken to its core since the traumatic national and international events of 2016.

Trauma changes people.

I realize tonight that this isn’t just about Doug Jones and Roy Moore, to me (and possibly, to many other American citizens as well). It’s not just about this shitshow of a presidential administration we find ourselves stuck living through—this wild badger thrown in the room that we’re not allowed to remove for another three years (maybe less…). It’s about securing some fresh statistical evidence that the people you’re sharing this country with (including your own self) are still capable of not being vicious, careless, misanthropic, narcissistic, misogynistic, racist ogres. Evidence that we still have something worth fighting for, hidden among the bushes of the outrage mongers in talk news and the trolls, bots, and clickbait mongers on the internet.

Just as we must remember that the profoundly traumatic realities of 45’s election, his inauguration, and his repulsive miscarriage of Federal power, could have (and should have) been overshadowed by the 3 million plus voters outnumbering his “victory,” we must take (and savor) this moment as a signal that the human race hasn’t entirely surrendered its own plight—despite certain running indicators and unfortunate appearances. That contrary to Nick Cave’s misanthropic anthem (“People Ain’t No Good”), people ain’t entirely no good.

Above and beyond the effect this election portends for the state of Alabama itself (a state that hasn’t sent a Democrat to the Senate in a quarter century), this event signals an anxiously awaited response from Republicans to the recent resignation of Democrat Al Franken (in light of the ongoing denials put forth by 45’s administration, when confronted with the allegations of 19 women claiming past assault at the hands of our current president). Our nation’s sense of dread and anticipation was palpable, as Alabama faced the somewhat unreasonably challenging choice between a known racist child predator and a Democrat: would the state reflect the running trend in the GOP (deflecting attention from its own sins by playing an endless game of “pin the tail on the donkey”), or would its voters snap out of their Red state-induced coma long enough to recognize the hypocrisy that underlies every facet of their party’s current incarnation? Furthermore, would they perpetuate the mistake made by millions of Americans during the 2016 election—voters who somehow felt it wiser to support and elect the most morally defunct, greed-driven, and predatory Presidential candidate put forth in recent U.S. history, with the apparent delusion that they could return their purchase if it didn’t work out; Democrats, Republicans, and independents who apparently failed to recognize how much easier it is to prevent an elected demagogue’s abuse of power by not electing said demagogue in the first place—or would they prove to the rest of the country that they’d taken notes from that experience, and were willing to learn from past errors in judgment? And last (but certainly not least): would they demonstrate that the all-too-common social problem of sexual abuse (among other abuses of power) was identifiable as a human problem—a problem that transcends one’s party affiliation, or one’s like/dislike of the perpetrator—and not just some perverted political weapon, used to consolidate power and enable further abuses?

“There is more paradise in hell than we’ve been told.”
– Nick Cave (from One More Time With Feeling)

The trauma of waking up and having to see this horrendous failure of humanity (known by the initials DJT) on every television screen, in every room (or check out for a while, only to be haunted by fear and unease as to what might have transpired while you were sleeping), should never be downplayed or minimized. These are strange times, to be sure; but beyond the surrealism of it all, these are dangerous times. Dangerous for the fate of the planet; dangerous for the fate of children and adolescents, having to grow up out of the rubble of all this trauma. Dangerous for the fate of democracy: the right to free speech; the right to potable water; the right to our national monuments; the right to an affordable education; the right to affordable healthcare; the right to be a woman; the right to be a person of color; the right to a neutral internet. The right to not have the fragile egos of feeble leaders signing off on unnecessary wars and international conflicts—with the name of your country printed on the dotted line. The rights of veterans to access treatment and services, and to not be rendered homeless and helpless by the cruelty of weak men who sent them off to fight these unnecessary wars.

The right to love the person you choose to love. The right to vote for the candidates and issues you believe in and/or identify with—and the right to have your vote counted. The right to worship (or not worship) the deity of your choosing, and the right of others to do so in turn. The right to a fair trial in a court of law, overseen by a qualified judge who has undergone reasonable scrutiny before being entrusted with the fates of American citizens of all ages. The right to fight for environmentally sound policy; functional infrastructure; fairer tax structures. The right to fight for the “little guy” (and gal), and for a platform on which the underdog is allowed to speak and compete with the fastest runners.

The right to have all claims of sexual misconduct treated seriously, regardless of how much we may like or dislike the person whose reputation is on the line: the rights of the men and women who have experienced horrific personal traumas and abuses to have their stories heard—not exploited for the limelight, or an uptick in ratings, but actually listened to; respected; taken seriously. (Also, the right for the individual being prosecuted to speak on his own behalf and be heard, in the event some kind of foul play is in the works—or, in the event that the individual’s offenses are even worse than what was reported).


Jane Fonda plays a prostitute caught in a scheme of political intrigue, in Alan J. Pakula’s 1971 masterpiece of paranoia, Klute. The film was followed by two other entries in a “paranoia trilogy”: The Parallax View (1974) and All the President’s Men (1976). © 1971, Warner Pictures.

Over the past year, all of these rights have been (or are being) assaulted, defiled, defaced, or distorted beyond recognition. Many of us have turned to each other (or our respective deities) in desperation and confusion, hoping for solace and reassurance. Sometimes, we’ve been greeted with the terrifying vision of our neighbor’s desperation; other times—like tonight, after this small but somehow tremendous victory for the people of the United States—we are offered a ray of hope; a sign of life. A montage of baby steps towards a resolution, interjected at the end of the first chapter in some seemingly interminable (and poorly shot) blockbuster of political paranoia and international intrigue (think Pakula’s paranoia trilogy, or Polanski’s domestic horrors, as filmed by Jerry Bruckheimer; try not to vomit).

Trauma doesn’t usually leave when you ask it to: like that pesky houseguest (or that wild badger), it will linger and wreak as much havoc as allowed, and you may well find yourself on the verge of being evicted from your own home. And despite possible good intentions, lashing out in anger and aggression at the trauma you’re cohabitating with won’t do much good. I’m reminded of a scene in Noah Baumbach’s latest work—a straight-to-Netflix affair titled The Meyerowitz Stories (New and Selected): following their sister’s disclosure that she was molested by their uncle one summer in her childhood, two brothers decide to avenge her trauma by violently (albeit incompetently) trashing their uncle’s car in a hospital parking lot. They leave the scene of the crime giddy with pride at their perceived accomplishment; they feel somewhat less empowered after informing their sister, and hearing her disarming reaction: “it doesn’t change the fact that I’m fucked up for life.”


Elizabeth Marvel plays Jean Meyerowitz in Noah Baumbach’s The Meyerowitz Stories (New and Selected)—a Xerox executive who experienced sexual abuse during childhood at the hands of a relative, and explains flatly that there is nothing that can be done to remove the trauma from her personal history. © 2017, Netflix Pictures.

Hopefully, the trauma inflicted upon us by this deranged, dishonest, degraded, degrading, and possibly treasonous administration, will be survived by the good people of this country. Hopefully, the people who come out of this ordeal will look, think, and act a little more like the good people who turned out in droves for today’s vote in Alabama—people who chose to put principles above partisanship—as opposed to the people who enabled and supported this catastrophe back in its “preventable” stage. Hopefully, we will look back on this day as the day a nation came to its senses: the day we came to appreciate, collectively, just how much is at stake in this catastrophe; how much we have already lost, and how much more we have to lose if we don’t reject this putrescence—once and for all—and return to some core standards of intuition, decency, diplomacy, critical thought, self-awareness, and accountability.

There is still a long way to go, and a lot of work to be done: let’s not just rest on the laurels of a small step for man (however significant it may have been to the survival of mankind). Let’s keep moving ever-higher, up to the highest point on the curve of justice—outlining the arc of history in the most ambitious and humanistic shape possible. Let’s stay the course of sanity; for we should all be well aware by now, how easily we can be misled by the folly of ignorance, frail egos, and festering hatred.

Here’s looking to signs of life after trauma.

A deplorable year, in context.


Margit Carstensen plays the embittered Petra von Kant in Fassbinder’s 1972 film of his quasi-autobiographical play; pictured here during her final on-film meltdown in front of her family. © 1972, New Yorker Films.

It started with cocktails.

It was November 8th—Election night, 2016. My partner and I had dinner (nachos, I think) with a cocktail on the side, to try and wash away the bitter taste of the ugly year leading up to this occasion. We caught up on some pre-recorded programs in the DVR, and switched over to PBS for the occasional play-by-play of electoral returns. Of course, it was still “too early to tell” at this point; though the smugness of certain commentators—a less-than-subtle confidence in the already projected outcome (a Democratic “landslide”)—gave me pause.

In the months preceding this night, I endeavored to raise awareness of the complex and multi-faceted significance of this election—and the devastating ramifications if the Presidential seal were to go to the most corrupt, unqualified, and inexperienced candidate ever to campaign for this office (from foreign policy, to climate policy, to basic civil rights, to corporate privileges, to tax policy, to infrastructure, to cyber-security and net neutrality…) I had cautioned my Bernie-adoring friends that the so-called “lesser of two evils” was, after all, still “less evil.” I encouraged folks to consider the pragmatic perspective that many social workers (myself included) are forced to adopt on a day-to-day basis, as a consequence of living in an imperfect world with imperfect choices: while one can rarely take an action that will result in no harm whatsoever (with the notion of “no harm” being in direct opposition to the human experience), one can gather information and critically evaluate options in order to take the path of least harm.

As I sat in front of the television, sweaty glass of booze in hand, I saw the path opening in front of our nation: suffice it to say, it was not the path of least harm.

I would like to say that, in hindsight, I responded to this awareness with a proportionate level of disappointment. If I were to be perfectly sincere, I would admit that my disappointment and anxiety skyrocketed beyond any proportion I might’ve prepared myself for, and my subsequent display of emotion was probably on par with the most exhibitionist meltdown of a character in a Fassbinder film (think Petra von Kant screaming at her family, drunk on the carpet; or Elvira recounting her history of trauma from inside a slaughterhouse). After fifteen minutes of incredulously gazing at the incredulity of the commentators on the TV screen, I wandered off to bed in a daze, and sobbed myself through a (seemingly endless) night without sleep.

Some time after, my partner wandered up and lay next to me—our dog Sam sprawled in between us: blissfully blind to the specifics of what was happening around him, but visibly aware that something was off. He rubbed his nose against my side and I scratched behind his ear, periodically reaching for my phone and checking the electoral map for signs of hope; none were forthcoming. At a certain point, I just stopped checking—painfully aware of the heightened anxiety provoked by these micro-updates. And then, the indigestion started. And the routine visits to the toilet to try and purge the queasiness swirling around in my stomach. And the hours spent in near-delirium, staring at the ceiling and waiting for the night to end, while simultaneously dreading the thought of having to survive the night and emerge into the reality awaiting me on the other side.

I was still lying awake when I heard the clicking of a computer—my partner having woken up before me (as per usual), now checking the news feed on his desktop. I counted the seconds between the first few mouse clicks and the first audible, heaving sobs; I think it took about fifteen seconds. I turned my face into a pillow and cried.

* * *

I find myself reliving this fateful day, as I embark on this effort to put my experience of 2017 in some sort of context (call it self-therapy). I can’t help but feel that the answer to many questions that have arisen out of this disastrous, unsettling, and disorienting year, lies somewhere in the outcome of that night—and the collective reaction to an action taken by the smallest margin of our population ever to select a (proposed) “leader of the free world.” In the months immediately following the election, I was one of many to identify a heightened level of engagement with social media; and while I cannot attest to the motives of others, I will readily concede that my personal engagement was driven by a heightened awareness of the unprecedented impact social media had yielded throughout the course of the election. In reading the near-unbelievable, beyond-dystopian tale of Cambridge Analytica, and the well-documented strategies implemented by several shady figures in favor of a global right-wing coup, it became quite evident to me that we stood on the threshold of a deeper abyss than was projected by the most dour catastrophist during the election itself. I felt a compulsion to be more outspoken than I had been before (since, evidently, reserved compunction, blind faith in objectivity, and trust in the collective conscience of mankind had not yielded any favorable results). Looking back over some of the insights and commentary I shared publicly via social media at the start of the year, I regret none of what I wrote—but I can now recognize the general insignificance of my commentary with a greater degree of intellectual clarity.

This isn’t to say I’ve adopted a defeatist perspective. Today, I can sincerely claim (give or take a little) the same level of investment in the plight of humankind as I claimed last November; and the year before that, and so forth. But as our global village (if McLuhan’s term can even be fairly applied to our present-day climate) advances towards ever-increasing levels of chaos, I’ve become painfully aware of how ineffectual and, in many cases, outright detrimental this twenty-first century drive to provide running commentary on the human experience has been to achieving any sort of actual progress. Retrospectively, in fact, one can trace the most recent phase of devolution (and devaluation) of the human species through a comprehensive anthology of our president’s impetuous Tweets—accompanied by the often comparably impetuous retorts of commentators across the globe. If one were inclined to place these exchanges in context and illuminate the bigger picture for those in need of perspective, one could print this anthology of Tweets and comments and hang it on a wall in a museum; opposite this display, one could hang a display of climate data, pictures of the refugee crisis, profiles of newly-appointed right-wing judiciary representatives, annual hate crime statistics, research on hereditary trauma, world poverty statistics, annual gun violence statistics, opioid overdose statistics, and current nuclear arsenal statistics (with illustrations). The viewer of such an exhibit should be capable of drawing their own conclusions.

Suffice it to say, very little social progress has been achieved during the past year. One could go so far as to argue that we have taken an enormous step back in our social evolution—that the trajectory of social progress has been scrambled to such an extent that we have to redefine the very idea of social progress. For example: prior to the election, one could generally accept that, regardless of one’s economic status or party affiliation, sexual assault was a deplorable action. But something changed, somewhere along the course of the 2016 campaign trail. If one were to examine the Republican party’s response to the excavation of that infamous Access Hollywood tape, and compare it to that party’s response to revelations that then-candidate Bill Clinton had engaged in an extended affair in the years before the 1992 election, one would have to conclude that the Right has either lowered its standards for outrage, or only complains when its majority is on the line. In addition to this, we find the emergence of a new Right-wing chorus (which will go on to be adopted by many a libertarian, third-party voter, and Democrat as well): the now familiar refrain of “fake news”: a magic potion for alleviating the symptoms of cognitive dissonance.


During the historic 1992 campaign, it didn’t take long for the revelation of candidate Bill Clinton’s alleged 12-year affair with Gennifer Flowers to become a partisan weapon wielded by the Bush campaign to cast moral aspersions on his Democratic opponent.

In 1992, voters of all stripes wrestled with both the knowledge of Clinton’s affairs and an awareness that this information might be manipulated for partisan gain; in 2016, there appeared to be little to no wrestling at all. Polls at the time indicated that, by and large, 45’s base was actually strengthened by the revelation of the tape: casting the objective information of the tape aside, many of 45’s supporters voiced an opinion that their only concern lay with how this information might be skewed for partisan gain—and not with the implications of the information itself. In other words, the evidence of our then-candidate’s predatory behavior (in combination with all the other evidence accrued to support the case for his predatory business practices) was as good as irrelevant. And so began the trend of alternative facts, and the convenience of being able to reject information that conflicts with one’s pre-existing belief pattern by merely denying its existence. Viewed along the action-reaction continuum, “fake news” was both a reaction to the leftist obsession with investigative journalism, and a positive action on its own terms (using “positive” in the Skinnerian sense). For by achieving an unspoken consensus among themselves—that information adverse to the advancement of one’s own political goals cannot (and should not) be bothered with in the first place—45’s supporters have succeeded in establishing a level of intellectual disengagement not seen at any other point during the nation’s past century of political discourse.

If we now consider this new right-wing action ("just say 'fake news' whenever anything upsets you"), we must consider the subsequent leftist reaction (hyper-dramatically presenting the severity of upsetting developments, in an attempt to appeal to the emotional-spiritual side of right-wing fact-deniers). The leftist reaction can be seen throughout any number of impassioned Facebook and Twitter rants: that (somewhat-to-absolute) self-righteous outpouring of hysteria and concern, presented with all the pathos and drama of an argument in some generic TV courtroom drama. This brand of emotional reactivity has been, in some cases, strategically channeled to advance social issues (as in, most recently, Tarana Burke's powerful #MeToo movement); on the flip side, the catharsis of social media engagement presents a stumbling block for individuals who have no conception of follow-through. For instance, the fanaticism of Howard Beale (Peter Finch) in Paddy Chayefsky's Network (1976), which I frequently see shared (in the form of the "I'm as mad as hell" excerpt) by peers and acquaintances on social media, offers a prescient insight into the risks of commercializing outrage—though I fear some folks take the bit out of context and fail to apprehend the way it all falls apart.

In Sidney Lumet's film of Chayefsky's acclaimed script, Peter Finch convincingly plays a neurotic newsman who "flips his wig" after being let go from his job, and takes to the air to announce his forthcoming on-air suicide. Instead of delivering on his promise, he launches into a sermon about how the world is going to shit, then beckons his viewers to run to their windows and yell into the streets with him: "I'm as mad as hell, and I'm not going to take it anymore!" His viewers comply, and the station executives hear of a boost in ratings: they investigate the situation further, and realize there's big money to be made selling outrage to the deadened masses. In relatively short order, the mentally unstable Howard Beale is asked to front his own television variety show—with his now-trademark impassioned rant featured as a sort of nightly act. Beale displays some resistance early in production, but by the end of the movie has been brainwashed to the point of becoming putty in the station's hands: a once genuine expression of repressed jouissance has become a weird sort of household name, and the television executives profiting from his mental health condition wind up having him killed, because of an eventual decline in ratings.


In Network, Peter Finch plays a hysterical newsman (named Howard Beale) whose psychosis is co-opted by his employers at the network for a spike in ratings. Once the act grows old and ratings decline, Beale is bumped off by his executives, who will continue accruing royalties from his downfall. © 1976, MGM Pictures.

While extreme and grotesque in its scope, and rendered for largely satirical purposes, Chayefsky's work does seem to offer a cautionary tale for our time. To avoid becoming as pathologically shortsighted as Howard Beale, one must always ask oneself, when contemplating such catharsis: What purpose could this possibly serve? What's the intended follow-up plan for one's outrage—or is there one? Is it possible that one is just yelling words into a digitized vacuum, which then captures one's words and capitalizes upon them, selling them off as part of a metadata package? Is this essay going to become just one more yell into the vacuum? One hopes not, but one never knows.

Since human beings have still failed to learn the lesson the universe endeavored so painfully to instill in us throughout last year’s election (the lesson: social media activity will not fix most things, or even anything; but it can readily make things worse if given the opportunity), we’ve apparently doubled down, and now find ourselves caught in the middle of a surreal and bizarre game of “who has the most sexual predators in their camp?” (As far as what this game is intended to prove or resolve, I suppose anyone’s guess is as adequate as the next person’s.) One by one—day by day—famous celebrities and political pundits continue to drop like flies in the ointment of this “to catch a predator” show; inappropriately enough, this surreal game has been (and continues being) overseen by the predator who set this chain in motion last Fall (don’t worry: he’s not going anywhere anytime soon).

In keeping with the chosen leftist reaction of overstating one's passion for a given issue—in a vain effort to "wake the deadbeats from their slumber"—we now find human folly achieving exponential new heights (or depths?) of stupidity. For starters, we have the borderline-comical leftist insistence on the morally "wrong" connotation of sexual assault: as if by insisting strongly enough, those who believe otherwise might instantaneously be converted. Furthermore, this juvenile proclivity for moral sermonizing has embedded itself as a point of division among proponents of liberal policy. Just as the more die-hard idealists who upheld the "purity" of Bernie Sanders against the "corruption" of Hillary Clinton drove a wedge through the otherwise-united front of liberal voters (aided and abetted by the Russian trolls who targeted third-party and Bernie supporters with strategically placed news stories to reinforce their disdain for Hillary), we now have liberal idealists thinning their own herd (yet again) by singling out anyone who fails to fall in line with the outspoken chants leveled against perpetrators of sexual assault.

I recently stumbled upon an article which provides a textbook illustration of the infantile thought process underlying this leftist penchant for "out-idealist-ing" one another. In an online Stereogum/Spin magazine article (filed under the "News" heading), a writer named Peter Helman takes issue with comments and views put forth by the ever-divisive Steven Morrissey in a recent Der Spiegel interview (yet again, I find myself encountering the commentary before the news itself; which, in and of itself, isn't news). Here's a verbatim transcript of the opening paragraph, as printed in the article (whose writer openly acknowledges that he did not bother to pursue a proper translation of the interview, relying instead upon Google Translate as arbiter of the interviewee's meaning):

“Hey look, Morrissey said a stupid thing! It’s been a while since Moz has said something truly objectionable and not just, like, ‘Oh, Morrissey is kind of an asshole.’ But now, in an interview with the German news outlet Spiegel Online on the heels of his new solo album Low In High School, he’s come through with some genuinely terrible opinions.”

First, we find the distinctly liberal cocktail of snark and finger-wagging writ large in the opening statement: before we are even offered a glimpse at the musician's controversial comments (let alone the chance to remind ourselves, as hopefully all reasonable and grown adults do in such instances: "what do I care what some music journalist thinks of what some musician thinks of some matter with which he has no direct affiliation?"), we are instructed (seeing as how the reader cannot possibly be intelligent enough to reach their own conclusion) that the comments are objectively "stupid." Then, we have the reinforcement of this admonishment, coupled with an insistence that one ought to consider these "stupid" statements even more offensive than whatever the writer last admonished the musician about. Then, as if the message had not yet been clearly conveyed (after all, we're dealing with a reading audience that cannot be trusted with their own thoughts), the writer insists that this latest interview with the Moz reveals "some genuinely terrible opinions." (Be still, my fluttering outrage odometer!)

I'm disinclined to even bother with an analysis of the article (let alone the comparably over-indignant commentary of those who shared the "story" on social media; excepting maybe Shirley Manson, who brought up a valid point in suggesting that Morrissey appeared to not have the latest updates on the "plot[s]" of Spacey and Weinstein), but I nevertheless feel compelled to provide some sort of corrective to the borderline-toxic preachiness of these self-appointed messiahs of moral indignation. Not that Morrissey's views, as quoted there, are even that noteworthy or idiosyncratic: if anything, they seem to echo the contrarian tone of similarly uneventful remarks delivered by Johnny Rotten earlier this year. But whereas Rotten and Morrissey are merely doing what they've been doing all along in their respective careers (namely, being abrasively provocative), Helman's heavy-handed critique—along with any analysis bearing the imprint of such thoughtless indignation—inflicts the greatest damage of all on the integrity of an intelligent dialogue: for not only does it inherently reject the reader's intelligence (something that neither Rotten nor the Moz, bluster aside, would ever dare try), it functions primarily as the byproduct of a profit-driven online press: a press which now feeds vampirically on the outrage of the web-surfing public, frequently leaning on the crutch of self-righteous indignation as a shortcut to increase clicks and shares. (Hm… that sounds familiar.)

And since “writers” (at least, the successful ones; the ones whose bread-and-butter is outrage-tinted click-bait) save the most upsetting/eyebrow-raising/scintillating bits for last (in order to maximize the advertisement space between the reader’s first click on the article and the long scroll to its disappointing finish), there must be some build-up to the exhibit of [insert celebrity’s name]’s horrifying remarks. Like an 18th century freak show, in which true horror would have to be instilled in the imagination of the visitor, before being deflated by the banality of the exhibit itself. (Sure enough, cries of “shame!” and “how dare he?” were heaped upon the Moz within minutes of the article’s posting; after all, what’s one more pariah on the fire…) In keeping with every other un-news-worthy observation shared by Morrissey in an interview, a scandalous viewpoint has been tried and condemned for failing to align with the prevalent vernacular and perspective of the times, and persona non grata status has been duly granted to the offending party. From what we know about the artist in question, one ought to suspect this is what he wanted all along, anyway: win-win (I guess?)


Steven Morrissey's 30+ year career has been consistently marked by stylized, overly dramatic outbursts, coupled with the artist's vegan activism and often reactionary views. As a (by)product of the British punk era, Morrissey is to many a poster-boy for resisting conformity. He is also renowned as a legendary pain in the arse.

My point here isn't that Morrissey's statements should be defended: he's a grown man and should take ownership of whatever nonsense and/or half-sense pours out of his twisted mouth. Rather, my point is to ask: What purpose could this possibly serve? And moreover: What does all this exhibitionistic "journalism" imply about the state of social commentary? Have we truly devolved to the point that an individual needs to preface any commentary on the subject of sexual abuse (and the inherently complex psychology of victims and perpetrators) with an assertion that one does, in fact, disapprove of sexual abuse and predatory behavior? Are there popular articles out there that I'm not seeing, in which individuals go on record saying that they condone sexual abuse, and wish there was more of it? And if so, is the tone of such deplorable articles so indistinguishable from the tone of a level-headed writer's, that level-headed writers need fear their audience suspecting they might, in fact, be pro-sexual abuse? And if so, wouldn't the abuser-shamers serve their purported mission more capably by tracking down those pro-abuse folks and chastising them? Regardless of the answers to any of these questions, nothing remotely edifying can come of such conversations, if we cannot bring ourselves to respect (read: allow) the judgment and intellect of our reading audience—sans these forceful and belittling cues to trigger our moral outrage.

Which brings me back to the actual problem at hand, and the elephant in the room that remains perpetually sheltered from the storm of allegations swirling around him: the President of the United States. For unlike Morrissey (or Johnny Rotten), our president has made it clear time and again that he is pro-sexual abuse, and despite the skepticism of his supporters (who feared that their boy's well-documented predatory behavior might be wielded by leftist commentators for partisan gain), he has displayed no compunction about turning allegations of abuse into political weapons—so long, of course, as the allegations are directed at individuals outside the Republican umbrella. Which renders it all the messier when individuals on the left allow themselves to get caught up in the hurricane of abuser-shaming (often with noble intentions, at least at the start), since this is exactly what the most powerful person in the country has been relying upon this entire year to advance a truly abusive agenda—not least of all, through his success in pushing through an entire slate of unnerving judicial appointments: out-of-touch bigots and bloggers; unqualified lunatics who will shape our country's legislation for decades following the inevitable demise of this administration. All the while, his White House continues to ignore and deny the allegations of 16 women who have confronted the public with their abuse stories, and the President remains… the President. As of this writing, there have been no formal inquiries proposed in Congress to investigate and pursue these claims further.

I suppose I should feel compelled here to state my own disavowal of sexual abuse, and to verbalize my support for the victims who have come forth with their alternately harrowing and unnerving stories. I’ve chosen to refrain from offering any commentary on the subject up until this writing for a combination of reasons; mainly, as someone (and more specifically, as a white man) who has not suffered sexual abuse firsthand, I feel it isn’t really my place to remark on a subject so close to others, yet so distant from my own lived experience. I’ve found that, in such cases, it’s best to just shut up and listen to those who know what they’re talking about.

* * *

The title of this essay is taken from a track on this year's Sun Kil Moon/Jesu collaboration, 30 Seconds to the Decline of Planet Earth. The song takes as its subject the child abuse scandals that haunted Michael Jackson to his early grave: it appears to have been inspired by a conversation on a plane between its writer (Mark Kozelek) and a young woman traveling to Greece to perform in a musical of Michael Jackson's life. The lyrics paraphrase a conversation between the two, in which Kozelek asserts (rather firmly) that the world is, undoubtedly, a better place without a pedophile R'n'B star living in it. At first listen, they are more-than-slightly jarring: casual listeners might be inclined to interpret this perspective as the actual opinion of Kozelek himself, whereas those who've spent time with his other recordings may recognize the sound of his (often) darkly satirical social commentary.


I can’t say for certain whether the lyrics to “He’s Bad” come from a place of sincere commentary or social satire, but I find it difficult to accept the former interpretation. In fact, the perspective of the song’s narrator is often so wince-inducing in its generalizations, one can only make sense of it when read in quotation marks:

“Is the latest on him true?
Well I don’t fuckin’ know
But if I had a son, would I let him get into a car with Michael Jackson?
Fuck no
I’m sorry for the bad things that his father did to him
But it doesn’t add up to building a Willie Wonka trap for kids
And changin’ the color of your God given skin
He made creepy videos that the popular kids liked back in the eighties
And once over a balcony he dangled a baby
And did the moon walk
And talked like a 9 year old girl
I don’t give a flying fuck what he meant to the mainstream world
Roman Polanski went down in flames and was incarcerated
But this young little kid addict will forever be celebrated
A hundred plastic surgeries and paid two hundred million to shut people up
Took someone’s child like it was nobody’s business and dragged him around on a tour bus

He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s bad
And he’s dead and I’m glad
He’s dead and to me it ain’t that fuckin’ sad”

The song has stuck with me all throughout the ups and downs of 2017 (and it was, for the most part, a year of downs). A friend of mine, who suggested I check out the record, cautioned me in advance about the song's "cringe-worthy" quality; at first listen, I shared in his assessment. But upon further listens, a space opened up in the longer instrumental stretches of the track, and I found myself strangely drawn to it. Presently, I find it to be a brilliant piece of songwriting—perhaps even more so, if these are, in fact, Kozelek's verbatim opinions. The song capably highlights a common trend of generalization and oversimplification among present-day liberal pundits: one might as well call it the "make sure the baby goes out with the bathwater" syndrome. Because it's easy (and more precisely, facile) to take a step back from the strange and unsettling case of Michael Jackson, and surmise that he was nothing more than a sick man who preyed on children—that consequently, the world is better off with him dead than alive, and he might as well have gone sooner. But had he never lived, this song (a highlight from the record, I think) would not exist: not just its lyrics, but its arrangement, structure, arpeggiation… all of which pay tribute to the late "King of Pop." Which raises the question: Is it right for one human to judge the life of another and determine they ought not to exist—or have existed at all? It's the same question that underlies the debate(s) surrounding the death penalty; war; abortion. Taken at face value, the perspective of Kozelek's song sides with the affirmative answer to this question. But interpreted satirically, the question remains open-ended. Unlike the above-mentioned Stereogum article, Kozelek's song actually gives its audience space to think for themselves and reach their own conclusions; so that, even if these are the songwriter's dyed-in-the-wool beliefs, we don't feel pressured into adopting them as our own (or, conversely, into rejecting them outright).


Michael Jackson was the subject of great public scrutiny throughout his short and strange life, which provides the material for the recent Sun Kil Moon / Jesu track, "He's Bad" (from 30 Seconds to the Decline of Planet Earth, available from Caldo Verde Records).

I refuse here to entertain the idiotic question that somehow goes on being debated in certain circles: can bad people make good art? (I will, however, quickly dissect the idiocy inherent to the question's phrasing: first, there is no such thing as "good" people or "bad" people; and second, what do you think?) However, I do find it noteworthy that a lot of angst appears forthcoming in the public response to revelations that Louis CK, Charlie Rose, and Kevin Spacey—celebrities that, unlike the blowhards who preceded them (Roger Ailes, Bill O'Reilly, and Harvey Weinstein), were somewhat well-liked—have lived messy lives and done some deplorable things. Each time a new pariah gets added to the fire (all the while, the President shakes hands with Duterte on a visit to the Philippines, and swaths of Puerto Rico remain powerless), I'm reminded of the excellent documentary Happy Valley, directed by Amir Bar-Lev and released in 2014—a few years after the explosive child abuse scandal involving the once-respected Penn State coaches, Joe Paterno and Jerry Sandusky. In his film, Bar-Lev explores the American (and outright human) proclivity for placing a fallible person on a pedestal, and then shaking one's head in disbelief when fallibility rears its ugly head. As I watched the reactions pour in on social media (friends who initially felt inclined to defend their heroes against the allegations, and, as soon as proof was provided, reluctantly gave in to the evidence, tossing the baby out the window), I had a flashback to the confused street riots that followed the Sandusky scandal—in which the townsfolk of Happy Valley alternately mourn and celebrate the dismantling of a statue once proudly erected to Joe Paterno (who was never charged with perpetrating abuse, but was found to have enabled Sandusky's behavior after being informed of it).

The psychology of the townsfolk, which is smartly and respectfully explored by Bar-Lev in his documentary, can easily be transposed to the public psychology surrounding this irrational debate unfolding on our national stage; the key question in the debate is: Where do we store our dismantled idols? (As opposed to the far more proactive question, which everyone seems too afraid to pose: Why are we obsessed with erecting idols in the first place?) For some (and most specifically, for those employed in talk news) the answer to this question is "straight to hell." Never in my adult life have I seen such a rabid drive—propelled primarily by pundits who appear to take more than a little schadenfreude in exposing (yet another) sexual predator—to excommunicate individuals from their professions (before their employers have even had a chance to evaluate each situation and weigh in on the matter; a scary social precedent, to be sure), and hold their mock-trial in the court of social media (for an especially disturbing case-in-point, see the response of so-called "progressives" to the revelation—forecast eerily by a Roger Stone tweet—that Al Franken did some things in poor taste on his USO tours).


A resident of Happy Valley, featured in Amir Bar-Lev’s 2014 film of the same name, takes a stand by the statue erected to honor Joe Paterno’s heroic status among the community. His sign reads: “Paterno, the coverup artist !! Paterno, the liar !! Paterno, the pedophile enabler !!” © 2014, Music Box Films.

On the one hand, this sort of "cleaning house" could be defended as a corrective to the long-delayed response of Fox News executives to the litany of allegations leveled against Roger Ailes and Bill O'Reilly (et al); on the other, the zealousness of this drive appears to be conspicuously overlooking the most powerful perpetrator in the room. And if we are not making an effort to prioritize the perpetrators who wield (and exploit) the largest amount of power, but gladly chase after those with relatively little power (wielding the digital equivalent of pitchforks and torches), then the only takeaway from this discussion is that humans are increasingly oblivious to their own addiction to the mechanisms of an exploitative society—to the extent that we routinely take private pleasure in exploiting other exploiters, all the while denying our own role in the circle of exploitation.

At the end of the day, it is this recognition of the multi-faceted power differential involved which appears to be the biggest stumbling block for most folks to navigate. Setting aside the heavy-handedness and gross over-simplification of Morrissey's "controversial" remarks, he does appear to be clumsily pointing to a taboo truth that some people refuse to acknowledge: that in some (not all) instances, the prospective "victim" in the abuser-abused equation will find a way to subvert the power differential for their own gain. And before the reader reaches for their pitchfork, allow me to clarify that I am speaking of these matters in the specific context of exploitative behavior within an exploitative society (one cannot ignore the reality that individuals make desperate decisions under desperate circumstances). One thinks of the psychologically astute Lolita twist, for instance—in which an older man preys on a "nymphet," only to find her turning the tables and exploiting his inappropriate adulation to achieve independence. Which isn't to say that Nabokov lets Humbert Humbert—or his predatory leanings—off the hook; rather, he accepts the obvious element(s) in this equation, before pointing to the more taboo reality lived by more than just a few individuals in this power-driven society: a reality in which the exploited learns how to exploit, for lack of other identifiable options. (In her best-selling memoir, Reading Lolita in Tehran, author Azar Nafisi provides a feminist interpretation of Nabokov's text—which she read as a metaphor for the often oppressive experience of life in the Islamic Republic of Iran; further highlighting the on-going significance of discovering fresh social commentary in old texts.)

Alas, such nuanced observations can never see the light of day in our current debate surrounding sexual abuse (or any other issue, for that matter), seeing as how the very idea of truth has already been co-opted and distorted by opportunistic sycophants and sociopaths at Fox News (and elsewhere)—who continue making a killing, selling dumbed-down distortions and good old-fashioned lies as substitutes for insightful commentary. And on the other side of the political fence, a vehement denial of nuance in sex politics frequently appears as a thin disguise for some vaguely misogynistic impulse to deny the emotional, behavioral, and psychological complexity of the feminine experience (for it's easier to brand every woman a victim for life, even after they've made peace with their offenders and politely invited their defenders to piss off). Rather than confront the messy psychology and uncomfortable truths inherent to the dynamic between victims and perpetrators of sexual abuse, talk news pundits (few of whom can be considered experts in social psychology) have apparently reached a consensus that the best way to talk about the issue is to: not actually talk about it; shame the perpetrators until they're out of a job; and run interviews in which victims are forced to describe (in gritty detail) their abuse to the flashing lights and rolling cameras; over, and over, and over again… Similar to the way most pundits talk about gun violence (except no one has actually lost their job for enabling mass gun violence through aggressive gun lobbying; not that I'm aware of, at least).

And again.
What purpose could this possibly serve?

What strikes me the most about Kozelek's song (which inspired, at least in part, this meandering diatribe) is how it, either intentionally or perhaps unintentionally, highlights the banality of its own perspective. Every time I hear the song, I think to myself: "I would never go out of my way to listen to a song that presents such a perspective with utmost sincerity"; it would be like taking Randy Newman's "Rednecks" at face value. A song that feels so stiltedly obliged to assert moral autonomy, while somewhat sadistically recommending death for criminal offenders… What purpose could this possibly serve? One hopes, the purpose of irony. For in making the listener consider words and deeds of such strict moral outrage—in confronting us with our own respective failures to accept some amount of gray in our black and white lives—one might then feel a little wiser, considering the possibility of something else. Not unlike in the films of Fassbinder, who strove time and again to show the audience the need for change, while never spelling out what that change ought to be (after all, shouldn't we be smart enough to figure it out on our own?)

In an essay from a book I’ve had my nose in lately, detailing the merits of RWF’s 1971 film masterpiece The Merchant of Four Seasons, author and former acquaintance Christian Braad Thomsen observes:

“Fassbinder […] shows the necessity of vigorous action on the part of the viewer. But he’s not a school teacher, who wants to raise his finger and tell the audience what they have to do, if they want to change the world. He is the Socratic artist, who uncovers how the existing possibilities of living have failed, and points out that change is necessary.”

Throughout his prolific and multi-faceted career, Fassbinder sought to demonstrate the mechanisms of his inherently flawed and power-driven society, clearly enough for any viewer—regardless of their education, intelligence, or station in life—to understand the mechanism and, in turn, recognize the need to rise above it. Three years after releasing The Merchant of Four Seasons, Fassbinder carried his vision of societal deconstruction to an even more poetic and empowered level with Ali: Fear Eats the Soul. In an interview given after the film’s release, Fassbinder observed that “I tend to think that if […] depressing circumstances are only reproduced in film, it simply strengthens them. Consequently, the dominant conditions should be presented with such transparency that one understands they can be overcome.” Rather than taking a sadistic pleasure in portraying the misery of those too enslaved by a social mechanism to recognize how breakable their chains might be, Fassbinder sought to show love for these strange creatures called “humans,” by perpetually revealing the existence of the chains—and the absence of a wizard behind the curtain. In film after film and play after play (and without any undue condescension or simplification), he succeeded in demonstrating that all individuals (regardless of race, gender, sexual orientation, age, or politics) are capable of breaking the chains of exploitation, if they only choose to live a pure existence—predicated upon the inherent human values that society continually distorts (by claiming them as its own, and often assigning them a capital/nominal value).

Put plainly: one cannot go on playing un-elected judge of man's folly while resigning oneself to a vacant culture of reactionary outrage. That would entail rejecting the possibility of finding a different way to live, and of demonstrating, by example, an alternative to such folly. (And if one rejects the need for an alternative to folly, one is simply a fool.)


In his five-part TV mini-series, Eight Hours Don’t Make a Day, Rainer Werner Fassbinder paints one of his most positive portrayals of people trapped inside a social mechanism they yearn to break free from. © 1972-1973, Westdeutscher Rundfunk. Renewed 2017, Arrow Home Video.

I've returned to Fassbinder many times over the past years, and have always left with an uncanny sense of premonitory relevance and an inspired momentum. A DVD set of the recently rediscovered (and beautifully restored) TV miniseries, Eight Hours Don't Make a Day, has been a close companion these past few months; I've rarely seen a film so genuinely positive in its outlook, let alone a film of his. His characters are shown (as usual) to be moving parts in a social machine, but all the parts in this machine move beautifully—and each on their own terms. Instead of falling back on jargon, party slogans, or naïve Marxist sentiment, Fassbinder shows characters from all throughout the power structure as somehow genuine and, just as importantly, capable of empathy (and change). He shows that action will forever speak louder than the most eloquent words—while simultaneously revealing how words and images can be employed to further the awareness of a need for action. Not just social (read: collective) action, but individual action. Presently, fleeting social movements (via trends, hashtags, and viral videos) demand wide-spread attention, and the individual finds himself trapped between a biological drive to engage with his own self-actualization, and the socially conditioned response to ignore or reject this drive: to follow the horde or disappear. Hence, the "individual" is celebrated, but only on the terms of the individual's bond with society; if the individual does anything to sever this bond, he essentially (and in certain cases, actually) will cease to exist.

Indeed, it would seem as though excommunication has been the only fear to consistently unify individuals, in societies across the world—and throughout the ages. The higher the threat of expulsion, the greater the anxiety in one's life; the greater the anxiety in one's life, the greater the relish in the expulsion of another. (Following this train of thought, one shudders to think of all the threats and anxieties our current President must have accrued in his lifetime.) I find it especially concerning that so many straight, white, and self-proclaimed "feminist" men appear to be foaming at the mouth to call out anyone who fails to speak the programmatic lingo they've conditioned themselves to communicate with; one wonders if some (or perhaps many) of these individuals might be protesting so loudly, for fear of having their own past improprieties exposed. Either way, I imagine a casual time traveler would have a hard time distinguishing our media's contemporary treatment of sexual abuse scandals, from the McCarthy hearings' treatment of the "Red Scare": so many people eager to see the lives of others demolished, for fear of being next in line…

In another section of Thomsen's illuminating text (entitled Fassbinder: The Life and Work of a Provocative Genius; quotes taken from the English text, translated by Martin Chalmers), analyzing Fassbinder's explosively controversial (and for good reason) play, Garbage, the City, and Death, the author reveals a parallel theme to the circle of exploitation: the corresponding closed circle of oppression—escape from which is a far more complicated matter. Thomsen writes:

“Throughout Fassbinder’s work we see the oppressed assuming the norms of the oppressors, whether out of a conscious need for revenge or whether they have more or less unconsciously internalized the dominant norms. Fassbinder never has ‘pure’ heroes. Rather, he demonstrates one of the most melancholy consequences of oppression, that the damage to the souls of the victims makes them unable to find alternative norms, so that the only possibility left to them is to recapitulate the norms that have led to their oppression.”

While it sometimes feels as though Fassbinder has artistically resigned himself to a closed box of self-fulfilling prophecies, his work continually reminds the reader/viewer of the complexity of human behavior; a phenomenon which, after decades of misrepresentation (or reductive representation), we appear to have grown somewhat culturally blind to. (So we keep building new idols, only to tear them down after human behavior—yet again—reveals its darker potential.)


In his 1981 feature film Lili Marleen (pictured above: a resplendent Hanna Schygulla, in the titular role), Fassbinder reveals the cyclical nature of exploitation and oppression through the story of a German cabaret singer in love with a Jewish man, at the start of World War II. As its despairing plot progresses against the oppressive backdrop of the Nazi regime, all of the protagonists catch themselves in the act of being exploited and exploiting others, to survive and to pursue the faint possibility of self-actualization in desperate times.

As I continue to revisit Kozelek's song—from one month to the next, and one criminal celebrity exposé to another—I've decided that I don't ever want to catch myself reveling in the demise of another human being. Those words, "He's bad/And he's dead and I'm glad," ring hauntingly hollow; they don't feel genuine… As a stand-alone thought, without corrective, they feel like a disservice to the vastly complex potential of our human nature. And yet, one is so very often stumped, when confronted with the death of someone who did truly terrible things.

This morning, the headlines read: "Charles Manson Dead at 83." Later in the day, upon arriving at my office, I was made aware of a death in the immediate family of one of my co-workers. I felt (and still feel) a profound sadness for my friends and their family. I thought of Leslie Van Houten momentarily, and of the families of Manson's victims, but apart from that I could muster little in the way of an emotional response to his death. The words of Kozelek's song ran through my head again, and they rang false again; seeing as how gladness is an emotion, and I couldn't bring myself to fit emotion into the equation of Manson's death. A custodian at the office made small talk with me about the news while changing trash liners, observing: "Someone who did so many awful murders… If I'd had my way, he would've been taken out back and put down. Saved the taxpayers some money." I acknowledged his observation, and respected his right to view the situation in such plain terms; I clarified that the death penalty was (rather controversially) suspended in California shortly after Manson's sentencing. After our brief exchange, and upon considering Kozelek's song and the deaths of other infamous criminals (and criminal artists) throughout history, I decided to follow my gut. After all, he was somebody's child, and somebody loved him. Gladness seems glib, even in the plainest of contexts.

* * *

Where does all this leave us?
And what are we to do with the pieces we have left?

In answer to the first question: we are left alive and awake in the United States of America. We have a Constitution that has up until now guaranteed a fairly open space for independent speech and individual commentary. We have great books written by great minds; illuminating films by directors who see (or at least saw) the potential for the medium to show the possibility of an alternative, and the accompanying need for change. Beautiful records by our favorite musicians; museums and galleries full of artwork to expand our horizons (unlike the talk shows, reality shows, and click-driven online journals that rely upon the shrinking horizons of their viewership, in order to sustain their traffic and ratings). Blank paper, canvas, web outlets (free while they last), on which we can project our visions of an alternative and our private and collective need to change.


Anna Karina in Jean-Luc Godard’s dystopian masterpiece, Alphaville: channeling Alfred Hitchcock; channeled later by Ridley Scott. © 1965, Athos Films.

As for what we're supposed to do with all this… well, that's up to us: individually and collectively. I'm in no position to outline the course for an entire nation of people—each with their own individual views, ideas, convictions, and motives—but I do think we would be better off trying to learn something from the trials and tribulations we're living through, rather than just repeating these tired tropes of scapegoating, public shaming, and language-policing (among other forms of dictatorial social conduct). As many of us grimly predicted when the results of last year's election rolled in, this year has been a nightmare on many fronts; and short of an organized revolt or an awakening of Republican consciousness, we'll have to endure at least one more year of the nightmare. By continuing on the course our society has traversed this past year (a course that has both recycled traditional socio-economic exploitation tropes and invented new ones, thanks to the willingness of millions to surrender their thoughts, ideas, photos, and identities to metadata collectors—free of charge—to be exploited by the highest online bidder), we are sentencing ourselves to an increasingly dystopian future in which individual thought verges on extinction, civil liberties become novelties, sex is reduced to a formal contract, and humor is no longer recognized. Like Godard's Alphaville, only far less cool to look at.

As far as my own experience of 2017 is concerned, I like to believe that I’m leaving this year older and more tired, but wiser as well; less quick to jump to conclusions, more open to the ambiguity of life and the possibilities for change. I advance into the wilderness of a new year with the knowledge that no socially-imposed chain of exploitation can hijack my freedom to think and act in accordance with a greater wisdom. Unless, of course, I grant this chain the power to do so.

“Even Richard Nixon has got soul.”
– Neil Young
(from the 1977 song, “Campaigner,” recently reissued on his Hitchhiker LP)

Vice Principals is the show that every American adult—and more specifically, every racially disoriented white American—should probably be watching and talking about over dinner. As it becomes increasingly difficult to satirize reality (with reality itself having become an un-ironic satire of social indecency), the creators of this half-hour HBO comedy series (Jody Hill and Danny McBride) have somehow managed to pointedly encapsulate everything that ails our country's social health—while at the same time saving a space for the remaining dregs of decency, which are routinely squeezed out of similar attempts at dramatizing our problems. It's a program defined by its crass, cruel, grotesquely arch, and (often unexpectedly) black comedy. But while the ostensible victim of the show's first season was a black woman climbing the ladder of upper management in a public school system, it is the show's prime villain (her cold-blooded VP, Lee Russell) who undergoes the greatest scrutiny and, ultimately, comes across as the "biggest loser." Unlike other programs (in both documentary and fiction realms), which consume themselves with endeavoring to paint the plight of the minority citizen in shades of self-pitying helplessness—with a frequently less-than-subtle nod to a (typically white) social justice warrior, who rides in like a knight in shining armor to save the damsel in distress—Vice Principals is a ruthless portrait of the victimizers; making no excuses, and taking no… well, maybe a few prisoners.


Dr. Belinda Brown (Kimberly Hebert Gregory) sizes up her two infantile and devious cohorts in HBO’s Vice Principals. © 2016, HBO Networks.

When Dr. Belinda Brown (played superbly by Kimberly Hebert Gregory)—the impossible-not-to-be-enamored-with principal who sets the show in motion—is brutally forced out of its equation at the end of the first season, one feels a profound sense of loss. But the loss isn't really hers: it's ours. (If anything, Dr. Brown likely considers herself released from the toxic environment of her fellow protagonists' making.) It is our loss to no longer have her as a prominent part of the show's perversely hysterical conversation, being left to contend exclusively with the petty hooligans who have taken her place. In actuality, at the start of the second season's premiere episode, we find Dr. Brown alive and well—reunited with her husband and two kids, and living a good distance from the deranged vice principals who attempted to ruin her life (and very nearly succeeded). When she begins to hint at her own departure from the show's narrative, it not-so-subtly calls to mind the departure of our country's previous commander-in-chief—whom we've since seen kitesurfing, vacationing with his family, and generally conducting himself like an all-around decent human being. All the while, total chaos and insanity looms in the place he used to sit, and a nation is left watching history re-play itself like a warped VHS tape of white power rallies, devastating hurricanes, scarcely believable White House leaks, presidential scandals, and arrogant white kids with bad haircuts and polo shirts, armed with tiki torches to defend poorly sculpted monuments to the Confederacy (just when you think you've seen it all…) It's hard to watch this last season of Vice Principals and not blow a wish for Dr. Brown to come back and give one more inspirational pep rally in the North Jackson High School auditorium—just to feel a tingle of hope, that all is not (yet) lost.

Looking back, the first season was a chore for many to sit through: it garnered justifiable criticism for subjecting viewers to an exhausting, vicarious experience of racist/sexist intimidation and persecution—which so closely echoes the real-life experiences lived by millions of Americans. But while the show certainly has its fair share of "cover my eyes 'cause I can't bear to see where this goes" moments, I would argue that it remains a rewarding, perhaps even necessary experience for white Americans (especially white men). It forces the viewer to witness the devastating outcomes of intolerance, but not from an easy "scared straight" perspective; instead, the viewer actually has to do some work—to connect the dots between the shallow instincts that compel a person to behave in such a hateful fashion, and the reality such a person must effectively disengage from in order to fulfill such absolute hate. For hate is, ultimately, an uninhabitable condition (something one needs constant reminding of at this point in time). To highlight this truth, there comes a moment in every episode during which VP Neal Gamby (our anti-hero-cum-protagonist, played by McBride) will catch himself in the middle of some atrociously mean-spirited act—typically provoked by his far more nihilistic partner-in-crime, VP Russell—and question his ability to follow through with his ruthless vows, eventually caving in to his own vulnerability. In these moments, the viewer recognizes that even the Scroogiest of conservative white men has a soft spot, somewhere deep down; and in this act of empathetic recognition, the viewer finds their own embers of hateful inclination slowly sizzling out. (In turn, viewers with an overtly racist and/or sexist inclination—who might, at first glance, align themselves with the diabolic intentions of Russell and Gamby—are bound to cave in by the first season's conclusion, upon realizing the fruitless and dispiriting outcome of the protagonists' hate.)


Vice principals Lee Russell (left, Walton Goggins) and Neal Gamby (right, Danny McBride) contend with the unsustainability of their own prejudices. © 2016, HBO Networks.

Since the inauguration of 45, I've been troubled by the response of many a despairing liberal to the ill-informed cocktail of bigotry and racial intimidation perpetuated by the president and his base. On the one hand, it seemed to me the reaction of liberals was disproportionately soft—compared to the out-and-out violence (verbal, physical, psychological) that we found ourselves up against; on the other, it seemed a pretty ill-advised approach to fight fire with fire: to attempt to wipe out hate by singling out and shaming the haters, many of whom are so blinded by their own misinformation that they fail to recognize their bigotry as hatred incarnate. I recall beating my head against a wall (literally), and exchanging a series of frustrated emails with friends, most of which culminated with a half-joking recommendation that we split the country in half, effectively separating the evolutionists from the devolutionists. In seeking a broader perspective, I found myself drawn to the brilliant and frequently sardonic songs of Randy Newman, which have—throughout the past four-plus decades—effectively charted the folly of the stupid white man in America; sans effigies, platitudes, or other common forms of creative scapegoating. And I asked myself: Where are the Randy Newmans of today? Where are the Gore Vidals, the James Baldwins, the Nina Simones? How come every visible attempt at protesting the ignorant insanity of 45's America appears to swing toward the two outer extremes of timid sloganeering and destructive violence? (Fortunately, not long after I went through this line of questioning, it was announced that Mr. Newman would be releasing a new studio album later this year—providing a much-needed salve. Far less fortunately, so many voices belonging to people of color have been effectively suppressed, repressed, depressed, or extinguished altogether; making it difficult for the full range of creative perspectives this country ought to be represented by to truly flourish—and sentencing the fate of acceptable social protest to a kneel in a football stadium.)

Setting aside the apparent racial intolerance that has festered throughout the country (and the Russian interference that reinforced this intolerance through strategic interventions on social media), part of this dilemma likely stems from another root cause of 45's presidency: the mindset underlying that lamentable term, "political correctness." In hindsight, it is difficult to imagine 45's candidacy gaining the kind of momentum it generated without the scapegoat of liberal hyper-sensitivity. Every slogan developed throughout his campaign served to highlight this critique: from "crooked Hillary," to "bad hombres," to "what a nasty woman," to the cringe-inducing "he can grab my…," to the swiftly appropriated "deplorable and proud of it," the racial hatred permeating the campaign's tone was matched only by its general disdain for pre-meditated and/or sensible syntax. And as with all false generalizations and stereotypes throughout history, there was, in fact, a kernel of justifiable criticism at the onset of this profane game of Chinese whispers. Namely, the criticism of the left's increasingly rigid thinking on the subject of policing language: a well-intentioned effort to nip hate speech in the bud, but one that has frequently neglected to take into account the quixotic nature of its own pursuit. For just like the idealist of Cervantes' great novel, the "P.C. police" (as they're commonly referred to by irritable right-wingers) often find themselves tilting at windmills and missing the forest for the trees: so wrapped up in the semantics of isolated incidents, they lose sight of the motivators behind the language they are policing—which might foreseeably range from absolute, vitriolic hatred; to an infantile desire to provoke or offend; to sheer ignorance of the meanings attached to the words one has chosen.

It is within this context that Vice Principals presents a sweeping breath of fresh, tension-splitting air. Although the premise of the show is itself a persistently tense exercise in caustic polarization, the manner in which it mirrors the real-life tensions surrounding its creation (considering that the first season's airing coincided with the peak of the 2016 election) serves to deflate the pressure accompanying its subject matter. Here we find three character types that are frequently subject to the "politically correct" treatment—an effeminate, plausibly closeted gay man; a heavyweight divorcé; and a well-educated woman of color—released from the popular liberal's cocoon of cultural suffocation, and allowed to live and breathe as characters that are every bit as nuanced as they are dense; almost like actual people. And if the show has a secret ingredient in the recipe of its greatness, it most likely lies within this astute recognition that vilification and deification are equally ineffectual tropes (both in narrative terms, and in lived reality). It would be easy—all too easy—to rewrite the show with Gamby and Russell (embodied by the relentlessly brilliant Walton Goggins) as dyed-in-the-wool hate-mongers, with a cheaply sketched-in backstory of how they came to be so hateful (e.g. childhood abuse; bullying; exposure to violent crime): the rest of the series—assuming the form of a prime-time melodrama—would essentially write itself, with the characters either achieving progress towards an awareness of the origins of their respective prejudices; or, conversely, digging their heels in deeper and, eventually, falling on the sword of their own bigotry. Not only would such a literal execution of the premise be uninteresting: it would render it increasingly difficult for the actors to bring any real pathos or complexity to their characters, since such a narrative is ultimately a glorified journey from point A to point B. In other words, this more "sensitized" approach would present the antithesis of a real person's life journey, which invariably follows a more complex trajectory through various stages of change and emotional/intellectual growth.


Gregory (right) provides the heart and soul, and Goggins (left) the diabolical thrust behind Vice Principals—the only great satire thus far broadcast on American television in the year 2017. © 2016, HBO Networks.

Rather than taking the easy way out of contending with bigoted protagonists, Hill and McBride have boldly chosen the more challenging, and far more rewarding, narrative approach. In Gamby and Russell, they have created two strangely… lovable bigots. Not that one loves them because of their bigotry (the show is structured in such a way as to render such sympathies unlikely); one loves them in spite of the raging ignorance and intolerance that continually threatens to swallow them whole. Instead of being vilified and caricatured as two creatures from the black lagoon who've arisen to claim some distorted interpretation of supremacy, Gamby and Russell are just two stupid white boys with no real grasp on the concept of emotional maturity—and watching their psyches disintegrate from episode to episode is every bit as comical as it is maddening. Not unlike our current president, whose racist inclinations frequently appear to stem less from an inherent sense of racial superiority (I mean, just look at him) than from a cynically strategic approach to soliciting support from pockets of the U.S. voter base which any seasoned politician with a modicum of decency would refuse to entertain (e.g. David Duke and his cohort, and at least half of our Presidential Cabinet). But the real masterstroke of Vice Principals is that, despite the uncanny parallels between our presidential administration and the admin of North Jackson High, the show succeeds precisely where the president's administration has failed: by actually making us care about the fate of its ignorant protagonists.

It is safe to say, at this point, that hardly a person in the country—or, more broadly, on the face of the earth—can be bothered to care about the personal fate of the 45th president. It is, in fact, difficult to think of any figure in our nation's history who has been so widely (and so justifiably) reviled, across the board of political identification and cultural affiliation. And true to form, 45 has surrounded himself with individuals who only serve to further dehumanize his public persona: compounding the reality television aesthetic of his own making, and continually escalating the threshold of public disdain. And I would here argue that it is this aesthetic of idiocy—this constant talking down to the citizens of a country who, by and large, know they deserve better—that presents the biggest hurdle for his detractors to surmount. The brilliantly monotonous condescension of Maxine Waters, in addressing one of the president's multiple administrative chumps, Steve Mnuchin, provides a case study in the only appropriate way one can respond to such arrogant bluster: consistently raising the point ("reclaiming my time") of our administration's inadequacy, incompetence, and seemingly interminable disrespect towards the citizens whose interests it has been charged to uphold.


Dr. Belinda Brown: carrying on with conviction and humor. © 2016, HBO Networks.

Likewise, in Season 2 of Vice Principals, Dr. Brown brilliantly dismantles Neal Gamby's initial hypothesis regarding his violent assault at the culmination of Season 1: upon being accused of Gamby's attempted murder, the former North Jackson High principal scoffs at the suggestion, instead drawing Gamby's attention to a tattoo she has had affixed to her back—depicting her two former vice principals actually eating shit, while smiling and amorously holding hands. It's her own personal idea of revenge: a gesture that hilariously highlights the racial divide at the heart of Season 1's tension. For whereas the white male testosterone pumping through Gamby's and Russell's systems repeatedly compels them to acts of childish violence and lashing out, the cool "been there, done that" attitude of Dr. Brown—whose past experiences with indignant white men can only be imagined by the viewer—empowers her to keep calm and carry on with humor and conviction: two things the country (if not the world) is in most dire need of now.

It remains to be seen how the rest of the series will play itself out. As Russell and Gamby delve deeper into their farcical investigation of Gamby's shooting, one can't help but think of the President's own glorified wild goose chase: to single out his dissenters, and thereby satiate his acolytes with a gushing fountain of persecutory accusations directed at the liberals they all thumbed their noses at this last election (or, to expand upon this metaphor with an even more precise one, the noses they cut off to spite their own faces). Two well-played scenes in the most recently aired episode serve to highlight this real-life parallel: in one, Gamby enlists a black security guard from the school to search the lockers of multiple black students, all of whom he has targeted as prime suspects for his attempted assassination (without a shred of evidence, of course). After finding nothing but homework, textbooks, and a scientific calculator in one boy's locker, the security guard observes in a disheartened tone: "Man, you actually made me think he was guilty!" The other scene in question entails Russell planting a hot mic in the teachers' break room, in order to tune in to the gossip taking place behind his back (most of it directed at his gaudy wardrobe, social awkwardness, and apparently deadly halitosis): when he later proceeds to fire his entire faculty for subversion, one immediately thinks of Sean Spicer, Steve Bannon, Reince Priebus, Sally Yates, Michael Flynn; the Mooch.

For some prospective viewers, this will all prove a little too much too soon. And yet, in bringing ourselves to truly care about the fate(s) of Gamby and Russell—in wanting them to get at least a little woke; to stop being such selfish assholes, and to play a little bit nicer—there’s a chance we might bring ourselves to care a smidge more about the fate of this altogether asinine administration, along with the misguided minions who stubbornly refuse to withdraw their support for it. In turn, and for better or worse, it is they who now dictate the fate of our nation.

Depeche Mode performing at the “DTE Energy Center” (formerly Pine Knob) in Clarkston/Detroit, MI, on August 27th, 2017.

I’m standing in a sea of people (most of them dressed in black, or something approximating it), bobbing my head in nonverbal agreement as Dave Gahan leaps about the stage at a large outdoor venue in Clarkston, about an hour north of Detroit: according to its Wikipedia entry, the venue was formerly known as Pine Knob, before the “Pine” was dropped from the name. (Presently, the amphitheater is referred to by the markedly less spirited name of the corporation leasing it for advertisement.) Gahan slowly scans the crowd as he melodiously observes—in that well-established, sensual growl we’ve all grown to know and love: “You’ve been kept down/You’ve been pushed ’round/You’ve been lied to/You’ve been fed truths.” The theater grows increasingly silent, as fans lean in to decipher the words to a song from the newest Depeche Mode album: “Who’s making your decisions?/You or your religion?/Your government, your countries/You patriotic junkies…

The crowd roars with something between consensus and confusion; as though torn between the pride of one’s own patriotic addiction, and the awareness that this rather mundane line of lyrical questioning may be too on-the-nose for comfort. The roar swells to a cry of total submission as Gahan and songwriter Martin L. Gore join in unison (an octave apart) to deliver one of their most downbeat-ly whip-smart choruses (“Where’s the revolution?/C’mon, people, you’re letting me down“), before lunging into a second verse of inquisitive befuddlement at the evident complacency among the masses they once dedicated an entire album to.

The performance was riveting on multiple levels, not the least of which was Gahan’s incredibly active on-stage presence. But beyond the acrobatic microphone twirling and hip-shaking, the timeliness of this tour couldn’t escape even the most oblivious of audience participants. In the previous week’s news cycle alone, the country learned of 45’s reversal of a ban on police departments purchasing military gear; the bafflingly inappropriate Presidential pardon of “America’s toughest sheriff,” Joe Arpaio; and the devastating wreckage being caused by Hurricane Harvey in the southernmost regions of the country—calling to memory the fiasco surrounding the Bush administration’s handling of Hurricane Katrina in 2005 (and not yet calling to mind the wreckage of Hurricane Irma, still only a blip around the corner in the minds of most citizens).

With this as the backdrop, one couldn’t help but pick up shades of their ingenious Rose Bowl concert in June of ’88, which provided source material for one of the most legendary and influential live albums of the decade—Depeche Mode 101. Nearing the end of Reagan’s second term in office, and coinciding with the start of the UK band’s crossover success with listeners in mainstream America, the event was a phenomenon of culturally relevant bombast: from the then-quite-shocking, counter-religious anthem, “Blasphemous Rumours,” to the anthemic-yet-poignant “Black Celebration” (simultaneously calling to mind the band’s gothic glory and the dark cloud of AIDS), to the heroin-streaked exhilaration of “Never Let Me Down Again,” to their brilliantly ambiguous tribute to the virtues of capitalism (“Everything Counts”), 101 was a bona fide, counter-cultural harbinger. It was only fitting that it should’ve been captured by the acclaimed documentarian D.A. Pennebaker—who previously lent his visionary perspective to Dont Look Back (chronicling Bob Dylan’s 1965 British tour), Monterey Pop, and the filmed record of Bowie’s final Ziggy Stardust concert with the Spiders From Mars (among others). To this day, Pennebaker’s 101 film carries a gravitas that few other filmed music documents of the decade can reasonably lay claim to: the fact that the band had yet to unleash their most enormously successful record and tour (Violator) merely serves to highlight the historical weight of this concert; and more broadly, the on-going significance of its performers.

* * *

If one were to search for a musical document of comparable relevance, one shouldn’t have to go far to stumble upon that other behemoth of ’80s alternative pop, U2—a marginally more commercial enterprise by this point in the decade, but one that shared more than a few key ingredients: both were imports from across the Atlantic (Depeche Mode out of Essex, U2 out of Dublin), their foreign origins a feature more proudly showcased by Bono & co., but an important element of both bands’ successes; both came from fairly inauspicious, working-class origins; and both shared a genuine love of American R&B—something that may be more apparent to U2’s bevy of American listeners, but is no less true of their more broodingly electronic counterpart (if in doubt, refer to the twangy riffs in “Personal Jesus” and “Pleasure Little Treasure;” or the surprising gospel ballad, “Condemnation”). They also shared a common visual design aesthetic, as seen through their respective work(s) with the acclaimed photographer/filmmaker, Anton Corbijn, and by their frequent reliance on highly polished, cinematic imagery.

Depeche Mode (from left to right: Martin L. Gore, Dave Gahan, Andy Fletcher) photographed by Anton Corbijn in 2017.

More significantly than their sonic and visual similarities, however, the two bands in question represent something far more macro and culturally meaningful: they both pointed—more adroitly at some times than others in their wide-spanning, lucrative careers—to the vastest possibilities of bombast in the still-blossoming arena of pop music; an arena that has arguably since dried up, reaching the dreaded point of ought-to-be extinction. Back in 1988, stage design aficionados had yet to see the likes of Madonna’s Blonde Ambition tour; Jumbotron technology was still in its formative stages; and holograms were simply cheap stickers on plastic rings found in Cracker Jack boxes. There was an air of possibility and experimentation surrounding the prospect of a commercial band doing an arena tour. Surely, financial dividends proved to be the overriding intent in such pursuits for many an interested party (as demonstrated in borderline-comical form at the end of Pennebaker’s film of 101, when the venue’s merchandising team—many of whom had never heard of Depeche Mode, and were clearly doubtful the band would be able to fill even a small portion of the rather sizable football stadium—scratch their heads in befuddlement as they wade in a sea of cash spent by loving fans on t-shirts, buttons, programs, pins, and posters); but the late ’80s represented a real pinnacle in the development of large-scale pop music performances, and it wasn’t all just about the dough.

A most telling example of this tug-of-war between commercial and artistic interests was the infamously overwrought tour in support of Bowie’s 1987 studio album, Never Let Me Down: christened the Glass Spider tour, after one of the album’s showcased tracks, the venture was simultaneously a success and a fiasco. Though it is estimated that six million people attended performances throughout the tour, raking in roughly $86 million for the parties involved (thanks in part to sponsorship by PepsiCo, a decidedly controversial move that would go on to provide a template for every large-scale touring act to follow), the Glass Spider tour was widely lamented by music critics as an overly indulgent display of pomposity. Conversely, more open-minded critics displayed a willingness to read between the broadly painted lines of the tour’s dated production, in order to recognize the artistic intent hidden beneath the permed hair-dos and expensive props. Bowie himself appeared to be questioning the very reasons for his own continuity as an artist—a process of disorientation that would follow him throughout his subsequent project as lead singer of Tin Machine, the hard-rock band he co-founded with guitarist Reeves Gabrels.

U2 (from left to right: The Edge, Bono, Adam Clayton), as the subject of the 1988 film Rattle and Hum, directed by Phil Joanou. © 1988, Paramount Pictures.

Within this context, the dual phenomena of U2’s Rattle and Hum and Depeche Mode 101 seem to represent a turning point in the history of pop music: a point at which the interests of art and commerce converged most neatly, just before parting ways most decisively—the interests of commerce having emerged victorious, once and for all. And while the past 30 years have seen tours of much greater scale and ambition, one is hard-pressed to find comparably widespread moments of cultural zeitgeist in the music history books. The skeptical reader should keep in mind here that both of these concert films (the former directed by Phil Joanou) were major theatrical releases, which—alongside Prince’s equally innovative Sign O’ the Times concert film—paved the way for pop music documentaries as diverse as Madonna: Truth or Dare, Dixie Chicks: Shut Up & Sing, and Peter Bogdanovich’s Tom Petty documentary, Running Down a Dream. Along with Demme’s acclaimed film of the Talking Heads Stop Making Sense tour, and Scorsese’s film of The Last Waltz (released a decade prior), the two features in question can be read as a sort of end-of-the-road signpost in the evolution of pop music narratives in mainstream film. For since then, there have been no mass-distributed music films of commercial note to take a pop music figure as their subject—apart from Justin Bieber: Never Say Never, Katy Perry: Part of Me, and Glee: The 3D Concert Movie (it is worth noting, however, that independently-produced documentaries on more cult-ish music figures—such as Rodriguez, Fela Kuti, Nina Simone, Conny Plank, and Death: the band—are currently on the rise in art houses and on Netflix).

With all of this taken into consideration, one would be forgiven for asking: what ever happened to meaningful bombast? Did Bob Geldof’s (debatably) miscalculated Live Aid events signal the end of an era once marked by pop-rock grandiosity—opening the door for a new generation of self-righteous pop stars, whose boastful passion for fundraising is outweighed only by their passion for the public’s attention/approval? Did the increasing involvement of corporate interests (signaled by Bowie’s Pepsi-endorsed Glass Spider tour, later culminating with Ticketmaster and major concert arenas—such as the aforementioned Pine Knob—mutating into vehicles for commercial advertisement) drown out the artistic interests that once sought to exert total creative control over such spectacles? Or is it just that, at the end of the day, a culture of cynicism has finally won out? I suppose that only time will tell; but an educated guess might well lean in the direction of the last hypothesis.

David Bowie once more set the template for pop music protocol when he accepted the sponsorship of PepsiCo during his 1987 tour in support of Never Let Me Down, christened the Glass Spider tour (May 30th to November 28th, 1987).

And this is (in part, at least) why moments such as a live rendition of the new Depeche Mode single, “Where’s the Revolution?”, carry such a startling resonance in 2017. For not only is the song itself perfectly suited for the socio-cultural themes defining our day and age; the mere fact of a major touring band resorting to such an earnest strain of cultural commentary presents a sound for sore ears. In hindsight one finds that, as the early post-Live Aid years gave way to the dawn of slacker-ism, grunge, and a newly commodified variety of hip-hop (frequently laced with lazy machismo and even lazier beat-programming), the notion of a singer-songwriter earnestly expressing concern about the state of the planet began to evaporate completely. Women in pop music became (even) more heavily fetishized, with the boy band phenomenon representing the homo-erotic counterpart of a plastic pop movement coming into full swing. In seeming retaliation against such vacuousness, “hard” pop bands (with acts like Green Day and Blink-182 on the softer side, and Slipknot/Limp Bizkit/Korn at the harder end of the spectrum) represented, in actuality, the other side of the same coin. The start of this cultural trajectory might arguably be traced back to the pop art movement—the formal separation of sincerity from artistic expression—but there have since been erratic flickers of endeavored sincerity; like the Green Day/American Idiot craze that swept the nation in the mid-aughts, or the hard/soft dynamic of Karen O and the Yeah Yeah Yeahs. Alas, the former example carried with it a distinct aroma of Hot Topic prefab-ness, while the latter has struggled to find stable footing between a drive for artistic integrity and an expectation of commercial success—resulting in a slew of overly eclectic records with several high points, but little in the way of textual consistency.

Compare this to Dave Gahan conducting his umpteenth live rendition of the hit Depeche Mode single, “Enjoy the Silence,” fully trusting the audience to sing the first run-through of the chorus (without missing a beat or a lyric) as he simply holds the microphone above the roar of the crowd. Other contemporary artists might lay claim to some catchy singles, but such cultural “events” seem harder to come by with each passing day; and while there is a greater wealth of brand new, quality music for us to consume than ever before, none of it carries the same conferral of greatness, which was only made possible through an unspoken agreement: that the forces of art and commerce should continually battle and work out their differences within the top 40. Case in point: the most recent, worldwide U2 concert series—supporting the 30th anniversary of their 1987 masterwork, The Joshua Tree.

“I want to run/I want to hide.” U2 performing “Where the Streets Have No Name” against an astonishingly widescreen backdrop of Anton Corbijn-directed cinematography, at the Lucas Oil Stadium in Indianapolis, IN (September 10th, 2017).

Among the litany of great studio recordings produced during the 20th century, few can lay claim to the sheer magnitude of factors that triggered the enormous success of this album: from the band’s on-going collaboration with acclaimed producers Daniel Lanois and Brian Eno, to the engineering work of Flood, to the great kaleidoscope of American songwriting influences permeating the album’s 11 tracks, to the promotional album photographs snapped at Zabriskie Point by Anton Corbijn—right on down through the one-two-three punch of hit singles: “With Or Without You,” “I Still Haven’t Found What I’m Looking For,” and “Where the Streets Have No Name”—it is a massive understatement to remark that all the right elements collided to form this behemoth of pop majesty. Building on the vast, open sound palette first patented by Eno and Lanois on The Unforgettable Fire, The Joshua Tree begins with a great fireworks display of sonic dynamism and never lets up, retaining a shimmer of splendor even in its quietest moments (“Running to Stand Still;” “Mothers of the Disappeared”). Performing the album live in its entirety, start to finish, may seem like a parlor trick or a novelty act to some; but for the millions who have attended a performance of this anniversary event (including myself) it likely represented so much more.

For how can you pin a reductive label on a cultural phenomenon that has captivated so many hearts and minds throughout the years: a record so overwhelmingly full of pathos and soaring melodies, that many (if not most) who attend its live performance find themselves spontaneously able to recall every note and lyric to every song—including such minutiae as the spoken word piece in “Bullet the Blue Sky,” or the staccato wails of “raining” that line the climactic resolve to “One Tree Hill”? Personally, the experience brought to mind a worn-out cassette tape that once resided long-term in the tape deck of my beat-up Ford Probe, having been lovingly transferred from a vinyl copy of the record I had pulled out of a crate in a thrift store. The sound of the record was brilliantly engineered so that, even in the most depreciated format, played on the most dilapidated of sound systems, those waves of synth and effected guitars couldn’t fail to wash over the listener, swallowing us up in the grandness of its enterprise. In the album’s official “Making of” documentary, Flood speaks of the production process in terms of it being “very different from anything I’d ever approached before. It was a first for so many things. The whole process was totally different… The type of sound they wanted for the record was very different from anything anybody had asked for: open, ambient, a real sense of space, of the environment you were in. Not normal requests.”

As it turned out, the sound of The Joshua Tree wound up being one of the most highly imitated sounds in the annals of ’80s pop: its reverberations can be traced directly through Flood’s later work with PJ Harvey, The Smashing Pumpkins, New Order, and—most pointedly—Depeche Mode, whose beyond-sensational 1990 breakthrough he would produce soon after (not to mention the sound of other arena-filling acts of the ’90s and aughts, such as Radiohead, Garbage, The Verve, and Coldplay). But in the case of U2 and The Joshua Tree, the decision to crack the band’s sound wide open—incorporating entirely new spaces and textures—seemed to reflect more than just an aesthetic choice: indeed, a parallel can be drawn between this newfound openness, and the utterly non-cynical, total sincerity and dedication of the band itself. Producer Brian Eno defined this level of dedication in the same “Making of” doc as follows:

“I had got a real sense that this band was capable of making… something that was self-consciously spiritual to the point of being uncool, and I thought uncool was a very important idea then, because people were being very, very cool. Coolness is a certain kind of detachment from yourself; a certain defensiveness—in not exposing something—because it’s too easy to be shot down if you’re exposed. Of course, everyone was in the process of shooting U2 down. They were not favoured, even though they had a big public following, but critically they were thought to be rather ‘heart on their sleeves.'”

In other interviews, Eno traced this disconnect between the band and the popular trends surrounding them back to their national origins. In a 1994 interview, for instance, the producer reflected: “When you think about it… cool isn’t a notion that you’d often want to apply to the Irish, a people who brilliantly and easily satirize, elaborate and haggle and generally make short stories very long but who rarely exhibit the appetite for cultural disdain—deliberate non-involvement—for which the English pride themselves… It is this reckless involvement that makes the Irish terminally uncool. Cool people stay around the edges and observe the mistakes and triumphs of uncool people (and then write about them)” (quoted in Noel McLaughlin’s essay, “Eno, Ireland, and U2”). Regardless of its roots, the “terminally uncool” demeanor of a band like U2 is bound to carry with it implications as complex as the demeanor itself; for instance, many music critics—bound to an arbitrary code of “cool”ness (read: aloofness)—tend to keep a calculated distance, whereas more non-critically oriented listeners may find themselves flocking to their enormous sound like moths to a flame.

U2 performing “Beautiful Day”—the first encore to follow their full live performance of The Joshua Tree at Lucas Oil Stadium.

Needless to say, the demographic makeup of a U2 concert audience is a mixed bag, with a marked contingent of “non-critically oriented listeners” (I commented in passing, just prior to the start of the show at the massive Lucas Oil Stadium in Indianapolis, that I’d never seen so many audience participants wearing the official tour shirt to the concert—a generally accepted faux pas among dedicated concert-goers). Just in front of us, two forty-something women clad in tight jeans and fancy blouses devoted a good half-hour of the show’s warm-up time to snapping a puzzling, unimaginative series of “selfie” photographs with their phones; now from the left angle, now from the right. As the headliner worked their way through a powerhouse of a set, I was further confounded by one of the two women’s insistence on standing perfectly still for the duration of the performance (including the slower numbers, which provoked more embittered attendees seated behind me to instruct “okay: it’s time to chill…”), occasionally raising a hesitant arm in an apparent attempt at emotional involvement—before finally deciding against it and returning to a stance of stoic semi-engagement. It dawned on me, during this shameless exercise in people-watching—a habit I’ve never been able to break totally free from at live concerts, despite my best intentions—that the band’s audience has likely grown more and more generic (and consequently, less and less musically informed) as the years have advanced. Strangely enough, it would appear that a band once renowned for its emotional over-zealousness has since become a huge draw for individuals wholly detached and removed from the pure, childlike love of music this band sought to foster from the very start. But here I digress…

As far as Yours Truly is concerned, the performance could hardly have been more emotionally involving, or more existentially absorbing. From the opening guitar lines of “Sunday Bloody Sunday,” to the final refrain of the downbeat Achtung Baby anthem “One,” the performance was a wholly riveting and visceral exercise in what one might call “meaningful bombast.” For there was hardly an insincere moment to be had throughout the evening (barring Beck’s more irony-laden—at least, one hopes—rap-centric performance that comprised the event’s entr’acte); and I gladly count myself among the many attendees who caught themselves singing along to every song on the album proper, along with the earlier-era numbers they chose to open with, including the stunningly powerful “Bad”—my personal favorite U2 song.


The band’s intro to the album’s explosive culmination, “Exit,” was smartly paired with an image well-known to movie lovers: a pair of clenched fists flanking the stage screen—with the letters “l-o-v-e” tattooed across one set of knuckles, and “h-a-t-e” across the other. A film clip preceding Corbijn’s re-imagined visual (inspired by Robert Mitchum’s malevolent preacher in the 1955 Charles Laughton film, Night of the Hunter) shows a beady-eyed huckster addressing a town on the subject of a great wall he plans to build to keep bad people off the streets. Earlier in the night, the band’s lead singer had subtly reconfigured a lyric in “Sunday Bloody Sunday”—from “when fact is fiction and TV reality” to “when fact is fiction and reality TV.” Contrasted with Bono’s plea throughout “Exit,” to want to “believe in the hands of love,” this early bit of foreshadowing presents one of many arrows throughout the evening pointing to the night’s emotionally pivotal close (“One”). (As for the Joshua Tree denouement, it lived up to its reputation as a truly epic showdown between Edge’s painterly guitar, Larry Mullen’s loud-soft percussion, and Adam Clayton’s deceptively versatile bass lines—weaving in and out of unison to form one of the band’s most dramatic/cinematic numbers in their entire repertoire.)

On more than one occasion, the event called to mind the Depeche Mode concert in Detroit just a couple weeks prior; not merely for the slew of music-cultural associations enumerated above, but because the pure sincerity (or sincere purity?) of both performances stands in such stark contrast to just about everything that remains of pop music. When Dave Gahan led the crowd in an a cappella sing-along to the contagiously hummable chorus of “Everything Counts” (in a goosebump-inducing reprise of the grand finale to 101), it seemed to have been drawn from the same well of energy that fueled Bono’s leading the crowd in Lucas Oil Stadium through the gospel-inflected chorus of “I Still Haven’t Found What I’m Looking For.” When Gahan and Gore introduced their setlist with the hauntingly topical themes of “Going Backwards” (a song about “turning back our history,” “piling on the miseries,” and “counting all the casualties”), it paralleled the tense, patriotically-tinted paranoia of “Bullet the Blue Sky” (“and through the walls you hear the city groan/and outside is America…“). Unlike certain younger, more precious and precocious performers (whose names I will refrain from mentioning here, for fear of this turning into a piece of disparagement, instead of a piece in praise of a lost art), these two remarkably active bands carry an age that serves to enhance the convincing power of the messages buried in the texts of their songs, or hiding in plain view across their surfaces. A song as majestic as “Red Hill Mining Town” is thereby rendered even more powerful through our awareness that there are few (if any) songwriters of Bono’s age, at the time the song was recorded (which, by my count, would be 27), writing anything in the vicinity of its stately elegance.

Arguably, it is this difference—more than any other outstanding aspect of these bands’ tremendously moving and awe-inspiring tours—which sets their achievements (past and present) apart from those of the up-and-comers (and-now-they’re-goners) numbered in the contemporary pop charts. For here we have two bands from the last days of an era we might as well refer to now as “pure pop:” an era that began with Sam Cooke and The Shirelles, but burned out around the time of the debut albums by The Stone Roses and Oasis. Which isn’t to say there are no sincere pop artists left standing; but rather that the medium itself has become so contaminated with self-conscious irony and advertising obligations, it can no longer embody the wholly innocent open-mindedness it once revolved around.

And yet, walking back to our car at the close of Depeche Mode’s Detroit performance, we spot (for the second time) a pair of twenty-something hair metal kids losing their shit to a perplexing setlist booming from their truck’s stereo system—a mix that betrays no critical discrimination between The Doobie Brothers and Def Leppard. The possibility of such open-mindedness can’t help but bring a smile to one’s face. Here, I could even present myself as a case in point: having turned 30 the same year as the U2 album I saw performed live the other night, my perspective is a generation removed from the folks who first came to know and love this music. Consequently, I can discern no insurmountable barriers between the oft-perceived coolness of Brian Eno’s solo work, and the loud vulnerability of U2’s arena-filling anthems. They both seem (to me, at least) possessed of the same innocent open-mindedness that gave birth to the vernacular of pop music. Along with the more darkly tinted vulnerability of Depeche Mode, they embody a sort of sensual integrity that seems consistently lost in the shuffle of our increasingly incidental, soundbite-streaming culture.

Depeche Mode performing David Bowie’s “Heroes,” as an encore to their Spirit tour setlist in Detroit.

Digging through the recent confines of my memory, I return to that stellar performance at the Pine Knob amphitheater—and that deceptively passive incitement to “snap out of it” couched within the new Depeche Mode single (“Where’s the Revolution?”). In hindsight, it seems to me less a call to arms than a call to re-awaken one’s emotional engagement with the human condition; just as Bono’s closing tributes to influential women throughout the annals of history (accompanied by the achingly beautiful high point in Achtung Baby, “Ultra Violet (Light My Way)”) read less as an act of political confrontation than as a genuine gesture of outward compassion toward the plight of humankind: something that we, so accustomed to the cynical overtones of 45’s America (and to the passivity that produced it), may feel challenged to accept at face value.

Nonetheless, such compassion is there for the taking, spread throughout the global tours of two monumental bands who refuse to give in to the temptations of self-effacing irony—insisting instead on the primal emotional forces that propelled them to crossover success in the first place. Like John, Paul, George, and Ringo; or Keith, Charlie, and Mick; or Bruce; or Prince. Or Mavis; Nina; Marvin; and Joni. Or Stevie, Christine, and Lindsey; or Chaka; or Whitney. Like the Starman/Blackstar of pop music himself, whose “Heroes” was so lovingly and movingly recited by Dave Gahan at the closure of the band’s Pine Knob setlist (easily the finest vocal performance the frontman delivered that night; as though he had set aside a special reserve of emotional energy for this tribute, set to the simple, startling image of a black flag waving against a gray sky). At one point, Bono inserted an unexpectedly moving tribute to the late heathen of pop, as well—remarking that “nothing has changed… everything has changed.” The phrase could hardly ring truer.

Lucas Oil Stadium fills up with expectant fans of that most successful Irish pop band, touring their most successful studio achievement.

Identifying the muses of Dirty/Clean’s ulter nation album and video project.

“Women of the world, take over
‘Cause if you don’t
The world
Will come
To an end
And it won’t take long.”
– Ivor Cutler, “Women of the World” (as covered by Jim O’Rourke on the LP Eureka)

In the following interview, Josh Egeland questions Josh England on the subject of the latest Dirty/Clean album (ulter nation), and the music videos that have been produced in support of it. The interview took place Saturday, August 12th, over coffee and muffins. Questions asked and answers given were transcribed as closely as possible, with punctuation and parenthetical notations added for editorial purposes.

* * *

Josh Egeland (je) interviews Josh England (JE) on the topic of Dirty/Clean’s ulter nation project.

je: So I guess we can start by reviewing the videos.

JE: Okay.

je: How would you respond to allegations of plagiarism, pillaging, or creative appropriation?

JE: That’s your leading question?

je: I think it’s a fair one.

JE: Well, when you put it that way, I guess the videos are kind of plagiaristic. They do pillage from films far greater than the music on the record, and therefore represent a form of creative appropriation. So I guess I would respond by pleading guilty.

je: So you don’t personally perceive a problem?

JE: I can understand why it might be perceived as ethically problematic by some… but no, I don’t have a problem with it. Have you been to the movies much lately?

je: Can’t say that I have…

JE: …It doesn’t appear that we’re missing much. I’ve seen a lot of contemporary film-makers who aren’t struggling hard enough to discover the possibilities their predecessors explored decades prior. Which wouldn’t be an issue, if they’d only discover possibilities of their own. But there just doesn’t seem to be a whole lot of possibility to take in at the box office… it’s all so pre-determined now, especially the CGI stuff. The way I see it, the movies I’m “quoting” in these videos—even the more well-known ones—aren’t as widely recognized or embraced by the upcoming generation as they were by my generation, and the generations before mine. I suppose, in a way, there’s a relief to be had in the notion that younger generations can discard the cultural baggage of their ancestors; in another way, it seems to reflect a broader trend of major attention deficits. I’m not delusional enough to convince myself that, by featuring these clips in my obscure little music videos, I’ll bring about some big revival of cinephilia. But I guess I see this less as pillaging, and more as showcasing: highlighting the possibilities of a craft, which currently appears addicted to its own degradation.

je: But there are still good movies being made, no?

JE: Absolutely! But as with any number of pursuits in our advanced technological age, the butter seems to be spread out rather thinly. It’s like this remark of Brian Eno’s, from an interview with some British magazine earlier this year: the problem isn’t that there aren’t good records being made anymore, but rather, there’s too much good music out there, and no honest distribution system in place to facilitate a genuine zeitgeist (as opposed to a strategized one). But with movies, I think we’re far worse off. It’s like we went from a generation of film brats, all scrambling to fill the director’s seat, to a generation that doesn’t appear to have any real perspective on the historical weight of the craft itself.

je: And you think you’re in some kind of position to address this perceived oversight?

JE: I don’t pretend to be an expert on the matter, no. But I’ve spent more hours digesting movies than most people spend digesting food in their lifetime. Maybe that’s what seems to be missing… true love of the craft, as opposed to love of one’s own style; there’s a lot of that going around now. Did you see La La Land?

je: Yes.

JE: Case in point.

je: It wasn’t a great movie, I’ll give you that. But the intention behind it seemed noble.

JE: And that’s the problem. There’s nothing more detrimental to a good movie than a self-imposed aura of nobility.

je: But how is what you’re doing here any different? I detect a hint of self-righteous nobility in your complaint…

JE: I’m not trying to reproduce the feel of a bygone era by running off a photocopy and filling it in with new faces.

je: But you did cover a rather early OMD song on this latest Dirty/Clean record, didn’t you?

JE: That was a very personal… a very important song to me. Not just as a musician, but as a person. If you listen, there’s nothing really stylized in what we did. Our cover is straightforward and fairly removed: I made a very deliberate, very mindful decision to not come across like I was cashing in on a classic. I hope I succeeded; I mean, if it had come across that way, I would’ve been embarrassed… Which is in part why it’s tacked on at the tail end of the record. At one time, it wasn’t even going to be on the record.

Official music video for “Souvenir”—a cover of the 1981 Orchestral Manoeuvres in the Dark single—directed by Jennifer Taylor.

je: So if you don’t view your project in line with stylistic homage, what category would you place it in? Or is there a category you feel comfortable with?

JE: I personally view our video experiment more in line with DJ culture, and other sorts of post-modern music and video production. When you think back on it, and despite its detractors, the early days of MTV saw the rise of several different approaches: straight-faced, lip-synced performance clips; “literal” music videos; and those experimental, sometimes disengaged montages of found footage. Have you seen Devo’s music video for their early song, “Mongoloid”?

je: I think so. It’s kind of literal, isn’t it?

JE: It is—but it’s also made of found footage, so it’s pretty abstract. And that’s what makes it work, as a video. It’s the surrealism behind it: the message beneath the surface. If something “found” can coincide so directly with the message in the song, then the message can’t be all that original in the first place, can it? It’s a concession of redundancy. It’s about not pretending that what you have to say is entirely original, but accepting that it’s been said before; and its strength lies in its repetition.

je: Let’s move on and talk about your selection process, in putting these videos together. How do you decide what clips are going to accompany each song?

JE: Mostly by intuition, which is how most of the songs were written. In fact, a lot of the films quoted throughout these videos provided fairly specific inspiration for the songs.

je: I imagine you’re referring to “Red Desert,” “Eclipse,” and “I.D. d’une Femme”?

JE: All of them, really. But yes—those all carry film titles in their name, so the influence of those movies could have been more prominent.

je: Looking at your videos, I can’t help but notice that the women in these films are showcased more prominently than the male protagonists. Was that deliberate?

JE: Yes and no.

je: [expectant pause]

JE: Well, to the filmmakers’ credit—all of whom, in reference to the clips selected, were men—women were showcased rather prominently in their movies. I mean, god: Monica Vitti and Antonioni… can you think of a more visually co-dependent relationship in the history of movies, between muse and director?

je: [pensive pause] Robert Altman and Shelley Duvall; Fassbinder and Schygulla; Godard and Anna Karina—and later, Anne Wiazemsky; John Cassavetes and Gena Rowlands; Lynch and Laura Dern…

David Lynch and his long-time muse, Laura Dern, appearing side-by-side in Twin Peaks: The Return. © 2017, Showtime Networks.

JE: Godard and Cassavetes both cast their wives, which is a different dynamic altogether. Altman utilized Duvall in supporting roles, often—strong ones, no doubt. And Fassbinder used an entire theater troupe’s worth of women actors, more or less as frequently as he used Hanna Schygulla; she just got paid more. Lynch has a fairly fetishistic, late-era Buñuel thing going on these days… Have you seen how he’s cast Chrysta Bell in the new Twin Peaks?

je: There is a bit of the proverbial dirty old man in him…

JE: But at least he’s upfront and transparent about it: like the Mael brothers. I’ll take that over these broad gestures of pseudo-feminist empowerment vis-a-vis male writers looking to get laid, which is what we appear to be seeing a lot of these days.

je: Let’s get back to Antonioni.

JE: Certainly. What was the question again?

je: Was it a deliberate choice, for you to showcase Monica Vitti more prominently than, say, Marcello Mastroianni or Gabriele Ferzetti?

JE: It was a deliberate choice insofar as my eye instinctively gravitated towards the scenes with Vitti, Moreau, Maria Schneider, and Daniela Silverio dominating the frame. When you watch those films—the alienation trilogy, The Passenger, and Identification of a Woman—you’re basically just waiting for the women to come back into the picture, whenever they’re not in the scene. It’s actually the entire premise in Identification of a Woman, just as it is in L’Avventura. Only Mastroianni and Jack Nicholson come anywhere close to competing with the women for our attention, as viewers. And they still fall short some of the time, in my opinion.

je: But Jack Nicholson is the protagonist in The Passenger, and Mastroianni and Moreau play the leads in La Notte. I mean, isn’t Monica Vitti only in that one party scene?

JE: Yes—the one that Pauline Kael lambasted, in multiple reviews. Have you read her take?

je: I think so…

JE: If I’m recalling correctly, she referred to Vitti’s performance as a failed parody of a Hollywood glamour girl.

je: Ouch. I take it you disagree?

JE: I don’t know that I disagree, so much as I never gave it much thought from that angle. I mean, Monica Vitti is so captivating as a performer… maybe what Kael responded to so negatively in her performances was the way that she routinely sabotages, or at least calls into question, Antonioni’s over-reaching authorship of those movies. I’ve never quite been able to determine whether she just wasn’t a very good actor, and couldn’t execute her character the way it was written, or if she was a really amazing actor, trafficking in deliberate obtuseness. I think that’s part of what makes those movies so intriguing to this day; because there are other ways in which they have not aged well.

je: I take it you’re referring to that one scene in L’Eclisse

JE: That’s certainly a prime example! And in a perverse sort of way, it’s a testament to the unstated brilliance of Vitti’s performance: you can’t quite tell whether she is personally oblivious to the culturally abhorrent implications of donning blackface, or if she’s doing a really spot-on parody of an oblivious, bougie white woman. Either way, the scene itself is lamentable, and it probably spoils an otherwise great movie for many viewers.

je: While we’re on the subject of racial representation, how would you respond if someone criticized your project as Euro-centric?

JE: I suppose I’d have to say that it is. But isn’t it sort of obvious? I mean, the CD packaging has more Italian text on the cover than it has English. But like I’ve already written and spoken about in previous interviews, that component of the project pertains very specifically to my experiences growing up in Europe, and not experiencing my homeland until many years later. I’m fairly certain that if I had reached out farther than what I’m familiar with, geographically speaking, it would’ve seemed about as forced and incoherent as one of Monica Vitti’s malapropisms.

Official music video for “Red Desert,” showcasing more of the muses who provided inspiration for the songs on ulter nation. (More muses featured in the videos for “Eclipse” and “Into the Night (Pt. I)”).

je: Let’s talk about the most recent music video, for “Red Desert.”

JE: Sure thing. What do you want to know?

je: For starters, I notice that your credits in the video description highlight all the women in the video, but you neglect to make mention of the men. And it does seem to me that Aleksandr Kaydanovskiy [in Tarkovsky’s Stalker] and Richard Harris [in Antonioni’s Red Desert] share quite a bit of screen time with the women in your video.

JE: True, but that’s beside the point. “Red Desert” is one video for which I would definitively answer “yes” to your previous question—about how deliberate my “casting” of these women might have been.

je: What are you trying to convey through this gesture?

JE: I’m not sure that I’m really trying to convey anything in particular. The video is less a statement than a summoning.

je: Not sure I follow you…

JE: It’s most obvious in the Marianne Faithfull clips from that odd little Kenneth Anger movie, Lucifer Rising. And the scenes with Monkey, Stalker’s daughter in the Tarkovsky film.

je: You’re referring to the supernatural, then?

JE: Not just the supernatural in general, but the supernatural power of women in particular, throughout the annals of history. While working on the songs for ulter nation, I was reading a lot—which I find to be very helpful, creatively—and I was struck by this chapter Marianne Faithfull had published about her experiences with Kenneth Anger. It was in her second autobiography, Memories, Dreams and Reflections. Have you read it?

je: I believe so.

JE: It’s a great read. I think I like it even better than the first one. There’s this chapter where she recounts the full story of how she was living on this wall in Soho, strung out on heroin, and Kenneth Anger showed up and invited her to fly with him to Egypt to play [mythical figure] Lilith in one of his experimental movies. She did the part, but then realized, as she was crawling through a Muslim graveyard with Max Factor blood dripping off of her, that maybe it wasn’t such a great idea. She paints a more broadly desecrating picture of Kenneth in that first biography, but enough time seems to have passed by the time she revisits the story in her second book… she seems a little less one-sided on the matter. But she still seems affected by the fact that he placed some lame little curse on her, after she published that first tell-all.

je: She has had an awfully challenging few decades since then…

JE: Yeah, but she’s survived, hasn’t she? I mean, tomorrow isn’t a given thing, and the reaper will eventually pay us all a visit. But getting back to my initial point, I think Marianne Faithfull is a testament to the resiliency of humankind—and of women, specifically. I wanted to highlight that in the video for “Red Desert.” It’s a song that takes, as inspiration, my perception of women as having been trapped, all throughout history, in a man-made machine fueled by this primal fear of what might happen if they were unleashed. Like in Red Desert, where this incredibly engaging woman lives out a perfectly unnecessary, meaningless existence—in a landscape that’s been depleted of natural resources and coated in smog. Looking back, I think a lot of really great critics, like Pauline Kael, voiced their anger and disdain for this movie out of an incredulity that such a premise could ever come to fruition. It may be one of the first truly convincing, fully-realized dystopian films… a sort of antidote to Buñuel’s utopian vision of Robinson Crusoe.

Monica Vitti rules the screen in Michelangelo Antonioni’s Red Desert (1964). © renewed 2010, Criterion Collection.

je: [pause] Yeah, I can’t think of anything made prior to it that is comparable, at least in that regard. There’s a lot of dystopian motifs at play in the works of German Expressionists, but few are convincing from the standpoint of realism. And in looking at the clips you used in the music video, it does seem as though Antonioni’s film carries a pretty startling visual resonance—considering our current cultural and ecological circumstances.

JE: It totally resonates today. Because here we are thinking, “how much worse will things get, if, or when the effects of climate change become irreversible and totally relentless?” The movie itself came out around the same time the worldwide ecological movement started gaining momentum. You know, those years following the ravages of World War II, when the costs of environmental disregard started showing. But it seems to me there was a lot of complacency at the time—even within the movement. Which isn’t to say people didn’t really care about the environment, only that folks couldn’t easily appreciate the full ramifications of what all was at stake. Not as easily as we can now.

je: But aren’t ecological issues universal? I mean, they affect men just as eminently as…

JE: …women, and children; and cats, dogs; bees and plants. Of course they do. But we seem to be perched at a point in history where progressive politics—if they actually are going to persevere, and don’t just crumble in on themselves—will face a self-imposed choice between identity politics and environmental politics. And I sense an inherent danger at this intersection: that by quarantining social issues in order to focus on the “bigger picture,” we may still lose the war, and our social problems will only have gotten worse.

je: …Having lost the battle and the war simultaneously.

JE: Exactly. I mean, if we can’t even bring ourselves to live together peaceably on this planet, why try to save it?

je: And conversely, if we can’t bring ourselves to save the planet, why bother living peaceably together?

JE: They’re mutually dependent clauses. I think that’s something Antonioni implied, intentionally or inadvertently, in the text of Red Desert. The implications of the dilemma are totally discomfiting, and I can appreciate why someone like Pauline Kael would be miffed by a premise this bleak. When you consider the potential for nurturing and painting the environment you want to live in through artistic expression, it’s as if Antonioni did the exact opposite, while at the same time displaying a sort of willingness to put up with this uninhabitable world he created. Like Monica Vitti, he leaves us wondering about the degree of intended irony in his performance, as director. But deep down, I believe he was rooting for humanity. I think if he had been a total cynic, he would have just filmed buildings and left the people out altogether.

je: I believe Fassbinder made the same argument, in response to those allegations of misogyny: that a true misogynist wouldn’t even feature women in their movie.

JE: Yeah… looking back on that one, it’s an over-simplified retort, but it still rings true. I mean, I think the most popular form of misogyny these days is of the “I want women to exist, but only as pregnancy vessels” variety; you know, the whole Handmaid’s Tale, Mike Pence sort of thing.

je: There’s also a troubled history within the gay community…

JE: Yes. Men seem to be a recurring problem in this picture, don’t they? I mean, there have been truly militant, men-hating women throughout history…

je: You mean Valerie Solanas?

JE: Yeah, that whole SCUM Manifesto clique. But historically, most of the world’s sexist rancor seems to come from the other side of the gender spectrum—the side with the most inherited economic power.

je: An interesting point, but I fear we’re getting side-tracked. Let’s get back to that bit about summoning…

JE: Okay, shoot.

je: What do you see as the relationship between Monkey, Marianne, Julianne Moore, Monica Vitti, and Jane Bowles (as played by Debra Winger)?

JE: Apart from the fact that they all acted as my muses during this project, I think they are all women whose presence on-screen seems to summon an other-ness, an untapped energy—something beyond everyday, superficial gestures of power.

je: Please explain.

JE: Take Marianne, for instance. I mean, she was at (or near) her very lowest in that Kenneth Anger film. But she steals the movie, when you look at it today. All the other expressions of mystical occultism in the picture seem pretty hokey now, but she was an outsider from the start, and she carries that with her throughout her scenes. Even as a homeless woman strung out on heroin, she was able to project something way more powerful than all the other kitschy, ponderous gestures of magic in Anger’s movies. When she sobered up and started putting out these wonderful records, I think it became apparent just how under-estimated she had been, creatively speaking, in her formative years. Back when Kenneth Anger could be held up as this great, subversive film-maker, but Marianne could only be seen as a rich, spoiled junkie. I mean, that was hardly ever the public’s perception of Mick, and he had far more auspicious beginnings…

Mick and Marianne, cotton candy in hand; photographed in the late 1960s by Jonathan Stone (date and location unknown).

je: And then there was the whole “Sister Morphine” debacle…

JE: Yeah. But they worked that one out eventually: I think there were some pretty pragmatic implications at play in her exclusion from the original songwriting credit—something to do with the Stones’ publishing arrangement. But the outcome didn’t reflect the nuances at play. She wasn’t really perceived to be a creative contributor to the Stones by most people, at the time.

je: So by featuring only her scenes from Lucifer Rising in the “Red Desert” video, are you attempting to restore some kind of artistic merit to her legacy?

JE: I don’t know that I would go that far… I mean, hasn’t she already done that for herself, several times over? She’s that rare sort of artist, whose records just seem to get better as years go by.

je: Good one.

JE: The pun wasn’t intentional. Horses and High Heels and Give My Love to London are truly amazing records.

je: And Before the Poison. And Kissin’ Time

JE: And Vagabond Ways: her reading of “Tower of Song”…

je: We’re getting side-tracked again.

JE: Rightly so.

je: Let’s talk about the other women in the video—Jane Bowles and Julianne Moore, for instance.

JE: Sure. Jane Bowles was this amazingly ahead-of-her-time fiction writer, whose work was largely eclipsed at the time by the popularity of her husband’s writing.

je: Paul Bowles.

JE: Yes. He hit it pretty big with The Sheltering Sky, but Jane had published her novel, Two Serious Ladies, some years prior. And Two Serious Ladies is arguably a much smarter novel, and maybe more prescient, in terms of literary evolution. It’s this wonderful, counter-hedonistic tale of two women vacationing together in Panama: they basically go searching for squalor, and then wind up in all these unnecessarily dangerous situations.

Jane Bowles, photographed for Vogue magazine in 1946.

je: I’ve read it. It’s a very different sort of book, I’ll give you that.

JE: I think it’s one of John Waters’ favorites.

je: That would make sense.

JE: As for Julianne Moore, the scenes featured in our video are from a movie she did with Todd Haynes in the ’90s, called Safe.

je: A deeply unsettling movie-going experience, if ever there was one.

JE: It’s a challenging movie, to be sure. But it’s brilliantly subversive.

je: As I recall, you never really find out what caused her character’s illness, or whether it was psycho-somatically induced.

JE: Exactly. Like Picnic at Hanging Rock; or those really abstract noirs, like Laura. But it’s also subversive in its portrayal of gender dynamics, and its dismantling of character stereotypes. For instance, there’s this therapist at the desert resort she goes to, played by Peter Friedman. When you first discover that he has HIV/AIDS, you’re naturally compelled to sympathize with him, as a character. I mean, Safe came out just two years after Jonathan Demme taught movie-goers that individuals living with AIDS are still people: at the time, that was a pretty radical idea to be conveyed through mainstream channels.

je: Through Tom Hanks, no less!

JE: Exactly! Even though he’d done Bosom Buddies and Bachelor Party, he’d earned a pretty straight-laced, non-delinquent reputation by the time of Philadelphia. And that performance set in motion a shift in public perception, in viewing people who live with HIV/AIDS. Hanks’s performance provoked viewers to sympathize, but in a really pitiful way; which I guess is the first step towards developing empathy for the plight of others, but it barely scratches the surface.

je: I think the proximity in time, between Demme’s film and the epidemic that wiped out the gay community in so many American cities, played a pretty significant role in the movie’s sentimentalized codes.

JE: I can only imagine how fresh those wounds must have been… But I also think there were some apparent detriments in the selection of Hanks, and in his subsequent characterization of Andrew Beckett. It wound up a little stilted in the direction of talking down to your audience. It also seems, in some ways, to echo that terrible phrase, “the deserving poor:” Hanks was seen by many at the time as “the deserving homo.” But this openly queer filmmaker [Todd Haynes] came along just two years later, subverting a fairly recently developed audience expectation with the character of Peter, who has the same illness but isn’t entirely sympathetic. Suddenly, the audience has to confront this culturally normalized, cognitive fallacy: the ridiculous idea that people living with illnesses—and specifically, individuals living with HIV/AIDS—are by default pitiful and apologetic.

Julianne Moore as Carol White, the confined protagonist of Todd Haynes’s early masterwork, Safe (1995). © Sony Pictures Classics.

je: Wouldn’t you say that Moore’s character comes across as pitiful at times?

JE: For sure! But it’s what you read into it; what you project, as a viewer. If you study her performance, which is a tour de force, you’ll notice she doesn’t really do a whole lot, in terms of positive character reinforcement. She’s just this slow-moving negative space, incapable of finding fulfillment within the shitty environment she’s entrapped by. And Peter winds up being this sort of oppressive male figure—flying in the face of what we’ve been conditioned to expect; especially when you consider that the author is a gay man.

je: What about Monkey, the daughter in Stalker?

JE: Like Marianne Faithfull in Lucifer Rising, she’s the real star of that movie, if you ask me.

je: Not a convincing assessment, if one were to judge by screen time. She appears in just a fraction of the movie’s three-hour running length.

JE: Screen time isn’t entirely relevant when considering who’s the star of a picture. Who do you see as the star in Blade Runner?

je: Harrison Ford[?]

JE: See, that’s where you’re wrong. It’s Rutger Hauer’s movie: Harrison Ford’s detective is only there—and I mean this narratively as well as interpretively—to lead you to Roy Batty. Who is, like Julianne and Monica’s characters, an entrapped outsider.

Official music video for “Into the Night (Pt. II),” featuring the entrapped outsider of Ridley Scott’s Blade Runner (1982): Roy Batty (Rutger Hauer).

je: As far as I can recall, however, Monkey isn’t much of an “entrapped” figure in Stalker.

JE: It is implied that she’s living with a physical disability. In this way, she’s entrapped by the limitations of her movement. Which she later succeeds in compensating for—or overcompensating for—through telepathy. I mean, if you really break it down, the girl who plays Monkey in Stalker makes the entire movie: visuals aside, I find the journey with the three men kind of tedious at times—which I’m sure was intentional on Tarkovsky’s part. But as far as entertainment goes, the movie succeeds because it saves the payoff for that very last scene. And Monkey is the payoff.

je: You certainly get a lot of mileage out of that scene in your video.

JE: It’s just an incredible piece of finished film, and I couldn’t pull myself away from it in the editing stage. And Natasha Abramova totally sells it: the magic of the scene; the mystery.

je: She looks kind of bored.

JE: Well, as with your reading of Julianne Moore, that’s just a projection. She doesn’t have to project a specific thought or idea in the scene, because all the scene seems to require is her presence—her aura. Like Marina Abramović, or Joan Crawford, Abramova’s presence far exceeds the limitations of the medium. I think a lot of men who are filmmakers scramble to bottle this essence within the vessel of their movie—not always malevolently, mind you—but so often we’re left wanting more than what they were able, or willing, to capture.

© 2010 Scott Rudd / www.scottruddphotography.com

Marina Abramović, being present (from her 2010 installation, The Artist Is Present).

je: So it sounds like this focus on women may have been more intentional than you led me to believe at first.

JE: Could be, I don’t know. Does it really matter?

je: In a sense, I think it does. I mean, don’t you think that restoring women’s perspectives within the arts is a job best done…

JE: By women? If we’re going to state the obvious, this entire project amounts to nothing more than a fledgling attempt at expressing my view of the world we live in.

je: Glad to hear you’re not posing as a provocateur. That would’ve been embarrassing for us both.

JE: If I’m trying to prove a point through this project, it’s how the history of women in film—which is chronically troubled by cases of women being sexualized and abused; having to adopt men’s names just to get the writing credits they’d earned; not getting to express their creative vision with the same sort of unrestricted leeway granted their male counterparts—is frequently a history of confinement. Which echoes the history of womankind. There’ve been all these great performances, and films made by women, throughout history; but we’re left wondering just how [emph. added] much more illuminating these works could’ve been if power in our society weren’t so unevenly distributed along gender lines.

je: Isn’t that a fairly broad statement, artistically speaking?

JE: It’s broad, because there’s a broader truth in it. But there is another, more specific truth that I’m trying to comprehend in all this: and that’s the growing absence of subversiveness in the arts. That seems, to me, a bona fide cultural problem right now.

je: How so?

JE: Well, for starters, it’s made for a pretty lame and increasingly confined reality, as of late. Nobody seems to be making any real waves, unless they engage in acts of brutal violence, or sacrifice themselves at the reality television altar.

je: Have you considered that it may just be the cost of contemporary comfort? I mean, with all the wealth and the luxury we’ve acquired in our society, there seems to be less and less of a call for subversiveness.

JE: That is a factor, no doubt about it. But it doesn’t seem to entirely account for the bigger problem, either. After all, income inequality is at an all-time high; increasingly consolidated corporations continue to own and buy up everything in sight. There’s plenty for people to be upset about in the socio-political arena, yet all of it—the instigators, the responders, the counter-attacks—seems trapped in this disorienting veneer of reality television. And all of our movies seem to be paraphrasing some kind of past, whether actual or non-existent: they’re either nostalgia pieces or superhero remakes, a lot of them taking place during the time of the “greatest generation.” And I’m not saying it’s all bad by default, but it’s getting kinda old; and the redundancy only serves to draw one’s attention to how much money they always feel compelled to spend, the second and third time around…

je: But doesn’t social unrest often breed nostalgia and escapism, as an alternative to dealing head-on with the real issues?

JE: For sure! And comfort is the antithesis of anarchy. But I think the level of complacency we’re seeing is basically a direct extension of our technological comfort, as opposed to reflecting our essential creature comforts. Which is fairly new, in evolutionary terms. I mean, I imagine there must be a lot of people out there who, if they were forced to choose between clothing or shelter, and having a smartphone—they’d take the phone.

je: That might provide the basis for an interesting study…

JE: It would, but I don’t think people really want to know the answer. We’re all afraid to admit how much we’ve been afflicted by technological addiction; and it’s been rapidly changing the way we all think, feel, and communicate with each other. It’s also changed the way we view one another—either strengthening or challenging our perceptions of each other. For instance, there was that moment of shock, when the breakdown of voters in the 2016 election came out, and we learned that a majority of white women voted for this disgusting, misogynistic caricature that we now have to live with for four years.

je: That was rather alarming.

JE: It was… But then I was equally alarmed by how quickly people turned around and criticized women for a tragedy that’s been playing itself out for centuries now: the tragedy of people being told not to be themselves, over and over, to the point where they start following the negative instruction. And it’s all kinds of people: women, gay people, transgender people, people of color… In a way, I think mainstream progressivism is frequently guilty of a similar offense—only from the more informed end of the spectrum, and in a more constructive fashion: they often tell people how to speak, how to act. Which isn’t the best approach, either.

je: A rose by any other name?

JE: Not really. I mean, there’s no comparing the fascistic, idiotic, and reactionary rhetoric of the present-day right wing, to the Lean Cuisine progressivism of the present-day left. But taking into account the advanced technology we’ve been armed and mobilized with, it’s become that much easier to convince millions of people to fall in line: to stop thinking for themselves and to silence their own subversive thoughts—which is even less arduous, for the powers that be, than forcing them into silence. It’s like that thing Pasolini said in one of those late interviews, around the time he made Salò: that bit about politicians displaying a tolerance as vast as it is false.

je: Like that picture—the one with 45 waving the rainbow flag…

JE: Exactly! And look how many gay men fell for it. I mean, it’s sad and disappointing, but it’s also a reminder of the overarching human problem at play here. I mean, identity politics are so prominent and so profoundly important right now, and there’s no reason to downplay them. But there’s also the broader consideration that human minds are being bought and sold every day by algorithms and advertisements: and most of the time, we’re totally oblivious to it.

je: Like all the people whose votes were bought by savvy researchers at Cambridge Analytica.

JE: …Or the consumers who only want to see movies or buy records—that is, if they still spend money on music—when they have a certain rating on Rotten Tomatoes, or have earned a certain baseline of shares and likes from their friends on social media. Which is so weird to me, because there’s this unprecedented access to the widest array of media on the internet, and yet the majority of consumers appear to be stuck inside the same handful of pre-determined pathways; whether it’s the Huffington Post, Breitbart, Vice, Marc Maron, or the guy with the big glasses who reviews music on YouTube. Not that I have a problem with Marc Maron; he seems like a really nice guy.

KoyaanisqatsiWeb1-3

Still from Godfrey Reggio’s 1982 film, Koyaanisqatsi. © renewed 2012, Criterion Collection.

je: But wouldn’t you say there’s a more eclectic range of content and feedback on the internet, than there used to be in print?

JE: In quantifiable terms, yes. But you wouldn’t guess it by glancing through the first dozen or so search results. We’ve gone from one extreme to the other—from not having enough options to having too many options. And as a society, we’ve failed to establish any kind of real balance in our information hierarchy. It’s the prophecy of Godfrey Reggio’s Qatsi trilogy, fulfilled: a “life out of balance.” We can all see how it’s resulted in a lot of lowest-common-denominator communication—along with millions of people rehashing the same ideas over and over, not recognizing how they’ve been outmoded or disproven on any number of prior occasions. It all seems so tedious. I can only hope the one-way internet model appears less enticing to those who developed it, now that the worst of its foreseen possibilities are being actualized on a minute-by-minute basis.

je: What would you say are the positive possibilities that aren’t being actualized, artistically speaking?

JE: Honestly, I think the best we can hope for within the Berners-Lee system—as opposed to the Ted Nelson system, which would’ve been two-way, and would’ve preserved context—is post-modernist pastiche. It’s the only school of contemporary art that’s ironic enough to match the confused, constrictive implications of the World Wide Web. I mean, post-modernists used to get criticized in a lot of art circles—maybe they still do—for closing themselves off to more “genuine” modes of communication, and behaving as though irony were the only viable tone of creative expression. Then there were filmmakers, like Lynch and Almodóvar, who started pushing the limits of post-modernism in their movies—channeling this fairly surreal, but not-totally-insincere sort of melodrama that nearly took the medium to a new level, artistically speaking. I mean, we still have yet to live up to the possibilities revealed by Godard and Kieślowski; even Ophüls. But considering the state of the arts in 2017—not to mention the state of arts criticism—I’d settle for a revival of post-modernist irony. Hell: I’d settle for just about any clearly stated artistic theory in the popular arts, at this point!

je: Let’s remember: Moonlight did win the Best Picture Academy Award this year.

JE: Yeah, that really was a beautiful thing… even though it probably wouldn’t have happened had 45 not been elected, which is a confoundingly sad thought. But you’re right: we must find hope somewhere.

je: Indeed. And besides, there’s nothing left to post-modernize.

JE: Touché.

* * *

stalker-scena-finale

Natasha Abramova plays Monkey, the Stalker’s daughter, in Tarkovsky’s acclaimed 1979 masterpiece. © renewed 2017, Criterion Collection.

ulter nation by Dirty/Clean is available to stream and purchase on Bandcamp.

Indigestion’s a pain.

I found myself in the midst of an especially bad bout last night, tossing and turning in bed, struggling to fall back asleep. In such instances, I occasionally find myself achieving a heightened level of awareness and concentration: as if hyper-awareness of one’s natural (or unnatural, as the case might be) biological functions carried with it an increased sensitivity to other surrounding circumstances.

In this instance, I found myself dwelling upon a recent essay-in-progress, which seems to be going nowhere slow. The subject of my reluctant essay is the suburban experience (more specifically: American films that have explored suburban themes in a Mythical vein). It’s one of those frustrating instances where the writer knows what he wants to convey—even how he wants to convey it—but once all the pieces are lined up together, they no longer say what was meant to be said.

I’m reminded now of a startling incident that occurred earlier in my workday, as I was driving a client back to her residence—which was located in a somewhat run-down suburban neighborhood. As we drove past some smartly structured houses, I offered some casual observations to break the silence of the drive—small talk about some of the more striking residences, many of which featured alarmingly pointed rooftops. It was then that my client interjected a most unexpected anecdote: “Yeah… A lady shot her two kids in the head last night, over there by that school. I guess she had told the cops the world was a terrible place, and she didn’t want them living in it anymore.”

Understandably, I found myself at a loss to form a suitable response. I’m certain I said something perfunctory and insufficient, something along the lines of “that’s horrific,” or “how terrible.” It was a jolting reminder of just how fleeting and cruel this life can be. It also underscored the inadequacy of my writings on suburbia, which paled in comparison to this shocking anecdote—they had failed to represent the surreal perversity of the suburban experience in its full scope. A recently released Sun Kil Moon record came to mind, as well. In the opening track, “God Bless Ohio” (a follow-up, of sorts, to the earlier “Carry Me Ohio”), songwriter Mark Kozelek pays tribute to the Northern gothic elements of Midwestern living, touching upon a range of suburban issues: alcoholism; A.A. meetings; the loneliness of being a child; nursing homes; psychotherapy; human trafficking; mass killings.

Maybe I should just scrap my essay and let Kozelek’s song speak for me, instead.


Sleeplessness has been a recurring motif of 2017 for me. During the day, I frequently find myself struggling to concentrate on basic tasks—easily distracted by the latest development in the investigation of our president’s relationship with Russian oligarchs and government operatives, as well as the ongoing devastation wrought by a conscience-free Congress and an administrative agenda fueled by corporate greed, short-term private gain, and a stiff middle finger to the vast majority of our country’s population. I was struck by a recent episode of Bill Maher’s show on HBO, in which Dr. Cornel West and David Frum were guest panelists. In an exchange that was (admittedly) cringe-worthy at times, Maher and West sparred on the subject of the 2016 election: West, who was outspokenly opposed to another Clinton presidency, stood by his idealistic decision not to vote for either of the major-party candidates; Maher challenged his decision with an itemization of some notable areas in which the two candidates differed from one another, with an emphasis on the compounded harm being inflicted upon minority groups by 45.

Hearing Cornel West’s voice rarely fails to bring me joy: his combination of humor, zeal, and intellect is unsurpassed by his few peers, and his perspective is fiery but reasonable. Watching him spar with Maher on this issue brought to light the deeply personal nature of his investment in politics, and I found myself torn between two equally impassioned points of view. As I think back on the debate, I’m struck by the awkward correlation between religion and politics in this country. Apart from the obvious investment of religious power in American politics, it strikes me that politicians in this country are frequently placed on a similar plane to religious leaders: they are often evaluated as much on abstract moral principles (or lack thereof) as they are on competencies and qualifications. West makes it clear during the debate that his opposition to Hillary Clinton was of a moral nature—a perceived “lack of integrity,” as he defined it. On the flip side of the argument, we find the pragmatism of the vehemently atheistic Maher, who is able and willing to look past the character flaws of a given politician in order to home in on the practical, real-life outcomes of their stances and actions.

Setting aside my love of Dr. West (and that tremendous laughter of his), I cannot help but feel a sense of exasperation at our country’s obsession with bringing religion into all facets of life. I’m reminded of an observation shared by a philosophy professor I had in college, who attended multiple symposiums at home and abroad, only to find that European nations have few (if any) of the political hang-ups our country has developed in this regard. Theories of evolution and creationism coexist peaceably; women, atheists, and non-Christian theists are allowed to hold public office without controversy; and outside the Vatican (a unique religious outlier, if ever there was one), it’s widely agreed that religion ought not to be a deciding factor in economic and social policy. I think of David Fincher’s American film masterpiece, Se7en, in which the seven deadly sins of Christian folklore provide the foundation for a rigorously coherent series of horrific murders. I think also of real-life horrors committed by the Ku Klux Klan (a white Christian organization); the so-called “conversion therapies” imposed upon gay people in Christian communities; the persecution of victims of rape, in an assortment of forms, under the alarming guise that their assaults may have been “God’s will”; the historical genocide of Native American people, performed in the name of a Christian God and country.

“God bless Ohio
God bless every man
Woman and child
God bless every bag of bones, six feet under the snow
God bless O
God bless O
God bless Ohio”

I think of the recent terrorist attack in Manchester, which stole the lives of 22 unsuspecting concertgoers and injured 120 others. (I will refrain from making mention here of the terrorists responsible for the attack, or the religion of which their organization is a perverse offshoot, seeing as how they have gathered sufficient negative publicity over the years—and it doesn’t seem to be helping any. Perhaps it is best to remove the plank from one’s own eye, first.) I think of all the different religions in the world that provide a foundation for the most appalling crimes against humanity, and I think of the unscrupulous support lent to our current administration by millions of American Christians. I think of that genius of early American cinema, Ernst Lubitsch—having just watched Trouble in Paradise for the first time the night prior. I think of the excruciating cleverness of Lubitsch’s characters; the hilariously amoral, yet totally functional relationships they foster and maintain with one another. I think of Jorge Luis Borges’s beautiful and unassuming essays, compiling assorted theories of eternity and ontology: the power of the human mind to overcome the self-inflicted impositions of religion—and the seeming refusal of the human spirit to embrace the assets of pragmatism. I think of Morrissey’s early song for The Smiths (“Suffer Little Children”) about two highly pragmatic, non-religious sociopaths from a separate, but equally dark chapter in Manchester’s history (Ian Brady and Myra Hindley). I think of the silence on the moors where their innocent victims were slaughtered; I think of the screams and explosions that jolted Manchester Arena on this god-forsaken Monday night. I think also of the solace offered during a non-religious vigil held in Manchester on Tuesday, to mourn lost lives and lost innocence; and the open gestures of solidarity extended by individuals and cities around the world—none of which required the pretense of religion to achieve their intended message.

Oh, human (t)error:
So much to answer for.


I’ve thought a lot (and continue to think) about the ways in which the jolt of last year’s election outcome sent shockwaves pulsing through every facet of the American experience—many of which we have yet to fully appreciate (or, in some cases, even to recognize). I’ve noticed tiny paradigm shifts taking place in areas of everyday life, some of which are so minute they might be disputed as misperceptions. For instance, there’s the weekly program CBS Sunday Morning, formerly hosted by Charles Osgood and currently hosted by Jane Pauley: previous segments on ecology and environmental issues have accentuated the well-documented, factual impact of climate change upon different parts of the planet (many of which provide source material for the show’s closing “moment of nature”). In the most recently aired episode, Jane visits the city of Amsterdam, where she is forced (as commentator) to acknowledge certain obvious changes in the landscape—including a visible rise in the sea level, and subsequent changes in irrigation. A phrase she uses in this segment has been stuck in my mind all week: “whatever the cause.” As in, “whatever the cause of these changes…” As if the matter were still up for debate.

I think of the shifts in media coverage that have historically accompanied drastic regime changes in different countries throughout the world. I wonder to myself how long it might have taken for Mussolini’s state-operated propaganda machine to fully infiltrate popular Italian knowledge, or for Lenin to convince his minions of the evils of Western living.

I imagine this essay reading like a poor man’s attempt at a Mark Kozelek ramble. I’m reminded, again, of my meandering essay on the suburban experience—and how truly difficult it can be to write about something when you actually have some pre-existing knowledge of it (in contrast to the old adage). In a way, such a task is even more difficult than writing about the unfamiliar: at least then, one can quite easily acknowledge and convey the limitations of one’s lived experience. But in the case of a subject that lies close to home, the writer is expected to have some sort of preternatural grasp on the topic—a near-omniscient, no-stone-left-unturned level of understanding. Maybe this is why so many Americans are turned off when a politician fails to publicly answer a question with utmost knowledge and understanding of their personal interests: politicians are expected to be godlike magicians, sauntering into town on the campaign trail and telling everyone exactly what they need (or, more commonly, want) to hear. God forbid a politician should ever be heard saying those three dreaded words: “I don’t know.” Far better to hear someone say: “I am your voice… I alone can fix it.”

* * *

I think of the recent return of David Lynch and Mark Frost’s much-beloved television series, Twin Peaks. I think of what a tremendous joy it was, watching those first two hours of this new 18-part series—momentarily forgetting about issues of popularized ignorance and man-made atrocity (of both the religious and the non-religious variety). I’m grateful for creators—true creators—like Lynch and Frost, who seemingly have made it their lot in life to build upon and restore popularity to Myth (the only human creation that continues to transcend pure reason and pure religion). It makes me feel lucky to be alive, to witness the brilliant and awe-inspiring fruits of their efforts. I hope these efforts—and the efforts of other keepers of the flame—are ample enough to keep the Myth alive, despite all the atrocities coming down the pipeline.

And I continue trying to shake my hyper-awareness of how terrible things have gotten. I continue trying to just live life, for what it’s worth, and not let it bring me down. But damn: indigestion’s a pain.