How Generation X Taught Me To Be A Free Speech Absolutist
Don't relinquish the victories of the '80s
Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it.
Thomas Paine, “The American Crisis”
As a free speech absolutist for most of my life, one marked by all the signifiers of Generation X membership, I blithely assumed mine was an uncontroversial opinion shared by most adults, save religious extremists. I was motivated less by solipsism than by the recognition that the carving out of free speech, along with the protecting, testing, and expanding of it, has been the work of a project that has occupied Western civilization for the last 500 years or so, from Martin Luther’s Ninety-five Theses, to the Marquis de Sade’s novels, to Thomas Paine’s pamphlets, to Lady Chatterley’s Lover, to Lenny Bruce’s stand-up. I’m not religious, nor a consumer of pornography, nor a fan of hip-hop, yet I thank the names on that list for upholding Enlightenment ideals thus far, and for passing the baton to Robert Mapplethorpe, 2 Live Crew, and Salman Rushdie to recalibrate the boundaries of free expression and dissent.
Absolutism aside, I’ve noticed that the notion of free speech itself has become not just controversial, but exotic. A paradigm shift had been in the making; I just hadn’t been paying attention. It seemed gradual at first: the trickle-down of grad school poststructuralism into the mainstream, a steady uptick in outrage and banishment for voicing discomforting opinions. The ferment accelerated and crescendoed with the backlash to a New York Times op-ed by a politician whose name is reminiscent of chattel slavery. However tired and decontextualized the “shouting fire in a crowded theater” analogy/litmus test has become, I searched for an equivalent predictor of danger in the fallout from the opinion piece. Yet I failed to find imminent physical threats to black journalists in the senator’s clamor for authoritarian protocols. Nevertheless, the op-ed was characterized as a Molotov cocktail thrown into a crowded theater, except it turned out to be empty, and the glass was actually cheap plastic.
Shortly thereafter a switch was flipped, and now Brooklyn-based intelligentsia proudly promote the Orwellian rebranding of restrictions on free expression, not as censorship, but as “curation.” And free speech is now “free speech,” the air quotes added to signify the irony of such a notion, as if the principles of the First Amendment were only so-called, on par with phenomena worthy of conspiracy theories. I’ve come across arguments that “free speech” is only permissible if used responsibly. Is speech, like grain alcohol, in need of moderation to mitigate its inherently lethal possibilities?
With the history of dissent that birthed it, I can’t reconcile the expectation that free speech only works, and is only permissible, when it is anodyne and genteel, the equivalent of a church social where like-minded attendees bite their tongues and exchange pleasantries over cake and ice cream. The roots of free speech, born out of long battles to challenge institutional power—monarchy and the church—are by nature unpleasant. It arose from the urgent need to call out corruption, greed, and abuses of power. It had to be wrangled from those who had hoarded it all. It was a revolutionary idea in the 16th century, so how has it become taboo 500 years later, when we’ve seen that it works? And after we’ve witnessed the horrors that arise when it’s suspended? And I especially cannot reconcile the expectation that free speech not traffic in the unpleasant and offensive, given the uncompromising attitudes and taboo-shattering cultural achievements of my brethren born between 1965 and 1980, that is, Generation X.
Maybe it’s because I was born and came of age in Chicago, in the haze of the Skokie Case of 1977—the infamous plan by the National Socialist Party of America to march in a suburb predominantly populated by Holocaust survivors—that I tend to view my generation as one framed by the wild dialectic of free speech debates. Citing the hurt and trauma that a parade of brownshirts adorned with swastikas would trigger for the suburban community, the Skokie municipality passed anti-protest laws so stringent that they “replicated the efforts of Southern segregationist communities to enjoin civil rights marches led by Martin Luther King during the 1960s.” After months of legal proceedings and filings by the ACLU on behalf of Frank Collin, the local National Socialist Party of America leader, the march was cleared. Though ultimately rerouted from Skokie to downtown Chicago after a series of negotiations with the organization, it nonetheless took place in the summer of 1978.
Frank Collin and his cohorts articulated vile beliefs, the apotheosis of hate speech, without the threat of imprisonment. I’m in no way sympathetic to any tenet of Nazi ideology, and neither is David Goldberger, the ACLU attorney who argued the case, but he realized the stakes were too great to ignore: “The village’s determination to block the Nazi demonstration was so intense that it had the effect of turning the Skokie case into a landmark example of the vitality of the First Amendment.” The horrors of the Third Reich did not manifest. The counter-protests championed tolerance and created a broad coalition of anti-Nazi, anti-racist sentiment. In response, the village of Skokie built a Holocaust museum. Chicago absorbed the shock and was stronger for it, as was free speech.
A few years later, in 1981, MTV launched into the airwaves with a slim roster of music videos by Pat Benatar, Rod Stewart, and others. That same year, six-year-old Adam Walsh was abducted and beheaded, ushering in an era of paranoia for latchkey kids like me and for working parents. Then the crack epidemic hit. In 1983, “The Day After” premiered a year before Ronald Reagan’s reelection, stoking fear of Cold War nuclear annihilation. The following year, “Red Dawn,” a relic of right-wing fear-mongering propaganda, premiered, featuring teen heartthrobs C. Thomas Howell, Charlie Sheen, and Patrick Swayze fighting off a Soviet invasion.
Amidst lingering post-Watergate skepticism of all things political, these were the formative cultural and political events against whose backdrop my generation came of age, electrifying our impulse to focus on culture and artistic expression. Then a bomb was detonated…though not an actual one.
In 1985, Tipper Gore, then-wife of Senator Al Gore, launched the Parents Music Resource Center (PMRC) with the cunning idea of cloaking censorship efforts in advisory labels, a move that presaged current efforts at “curation.” Her favorite targets, among them Judas Priest and Twisted Sister, were featured in the “Filthy Fifteen,” an official list of songs deemed problematic for allegedly promoting violence and sadomasochism. The list was eerily similar to the USSR’s list of banned cultural products. I guess the Kremlin hadn’t heard of 2 Live Crew—they were more concerned about Julio Iglesias as a purveyor of neo-fascism—but Tipper Gore certainly had. She might not have included 2 Live Crew on the “Filthy Fifteen” alongside Twisted Sister’s “We’re Not Gonna Take It” or Prince’s “Darling Nikki,” yet Luther Campbell (aka Luke Skyywalker) was nevertheless targeted by the movement she spawned and arrested for obscenity. His case was overturned, of course, on First Amendment grounds. The musicians didn’t stand down either. Twisted Sister’s Dee Snider turned the tables and framed Tipper Gore and the PMRC as the obscene ones. A metalhead rocker testifying before Congress is the kind of political theater and Americana one can never unsee. Having enshrined Dee Snider, Luther Campbell, and his booty-shakin’ oeuvre in the pantheon of free speech martyrs alongside Thomas Paine, Tipper Gore and her organization faded away.
Once the principle of free expression had quashed yet another cabal of uptight philistines, the mid-to-late ’80s witnessed a blossoming of cultural creation, including the ascension of Nirvana, Grace Jones, British New Wave, Public Enemy, and Basquiat, to name a few. “Full Metal Jacket,” “Platoon,” and “Angel Heart” all hit movie theaters in 1987. As the first integrated generation—in schools at least—we were in the midst of a renaissance of film, art, and several new genres of music made of, for, and by a culture of dissent, all made possible by the zeitgeist and the forward momentum of free speech. Good times.
This Golden Age was short-lived, though. 1989 introduced itself with a detonation…several of them, all real. Bookstores in the U.K. were bombed for selling Salman Rushdie’s The Satanic Verses. The Ayatollah had declared the novel blasphemous and its author punishable by death. Off the Booker Prize-winning novelist went, scurrying into hiding for nine years for daring to joke, in part, that maybe the illiterate shepherd credited with transmitting the Quran had experienced “version control” and copy-editing issues. The pendulum had swung and the tide had turned worldwide.
In the same year, in Chicago, the Dread Scott Affair made national news. A student at the School of the Art Institute, self-named after the slave whose suit for personhood prompted the Supreme Court to subvert the most democratic of values, had incorporated the American flag into an art installation. “What Is the Proper Way to Display a U.S. Flag?” invited participants to step on the patriotic symbol—the spirit of dissent was clear. Congress responded with anti-desecration legislation meant to supersede Scott’s First Amendment rights. To protest, Dread Scott went a step further and burned a flag. The Supreme Court ruled he was within his constitutional rights in both instances, yet another victory for challenging and testing the contours of free expression.
Later that summer, the world watched in horror as footage emerged from Tiananmen Square, where voices and bodies were crushed by tanks—a slice of history since curated out of official Chinese curricula. It was sobering. Dissent is sometimes a matter of life or death, and it is not guaranteed to succeed.
The Cold War was still on, though there were rumblings of the USSR’s imminent collapse. Then, one fine day, the Berlin Wall didn’t so much fall as the Germans, East and West, dismantled it brick by brick. Had the spirit of free speech, fought for and cultivated by Generation X, in an acceleration of the momentum of previous generations, figured into this milestone? Were East Germans and citizens of other Soviet republics and satellites inspired by our expressions of protest, yearning for the right to explore their own forms of dissent, to listen to the Ramones and Prince, and make their own music?
Perhaps a more authentically Generation X metaphor for free speech than a church ice cream social is a concert—grunge, hip-hop, metal, and/or electronic—where some concert-goers gravitate to the stage to mosh and crowd-surf in vociferous, intimate immersion, while others hang back, jamming on the periphery to their own groove, and those all the way in the back, too zonked out to hear much, dig the vibe enough to stick around. It’s as fluid as it is inclusive, cathartic and, more importantly, symbiotic.
But what of my dissent, my voice? For fear of alienating those in my orbit by articulating positions deemed offensive or even outré, I’d been walking the line. I’d come dangerously close to subverting my own free speech and betraying the lessons of my generation. As part of a larger effort to figure out where I, a non-identitarian progressive, adrift in a sea of agitprop and hysteria from several camps, fit in an ever-shifting Overton window, this—2022—has been my outing year. Because the worst thing to be as a Gen X-er is a poseur.
Gesha-Marie is an atheist, non-identitarian, humanist, Vanity Fair-published screenwriter learning to no longer keep her opinions to herself. She also loves shoes. She can be found exercising her right to free expression here and here as well as on Twitter.