One year ago, I launched a writing project on Patreon called Brisket. It was a home for essays I was writing that needed to be written “low and slow.” I recently sunset Brisket, and will be sharing the essays here on my blog throughout the coming months. This was the inaugural piece, published in April 2018.
I’ll begin with a story. In February of 2017, I was finishing a dissertation about the history of Jewish Community Centers in the United States. Also in February of 2017, Jewish Community Centers in the United States began receiving phoned-in bomb threats on a weekly, then a daily, basis. Local papers from San Francisco to Nashville to St. Paul to New York ran articles detailing the events of those frightening days: a call received, a Center evacuated, police summoned, the building searched, an all-clear, and a politician quoted condemning anti-Semitism and lauding their community for its tolerance.
It was the perfect opportunity to write an op-ed. To the New York Times and Washington Post I pitched the question: Why was the perpetrator targeting JCCs and not synagogues? I argued that it was because the Jewish Community Center, unlike the synagogue, served both Jews and non-Jews. As a pluralistic space, it represented the fullest threat of intermixing between white Christian Americans and the Other—an affront to the “alt-right” white nationalist groups that I assumed were perpetrating the calls.
The Grey Lady passed, as did the Post, and so did the Jewish Forward and some other regional newspapers. The online op-ed page of the Nashville Tennessean finally accepted it, but the editor published it almost a month after I submitted my final draft. By the time it appeared online, in late March, the facts of the story had fundamentally changed. A Jewish teenager in Israel, not an American white nationalist, had been arrested and charged with perpetrating the bomb threats. And now my name appeared above an untimely, outdated argument that made no sense but would nonetheless live forever on the internet.
Understandably, the experience left me with a sour taste in my mouth. No one could blame me for assuming that the threat was domestic, a product of the hatred that the Republican presidential candidate intentionally stirred up during the 2016 campaign. Moreover, the paper should have killed the op-ed rather than publish inaccuracies. But it led me to realize that I had unwittingly engaged in punditry.
When we think of pundits, it is predominantly television personalities who come to mind. Whether Glenn Beck and Sean Hannity on the right or Chris Matthews and Rachel Maddow on the left, we know that they are filtering their news analysis through a sieve of moral values and political positions. This behavior, however, is not limited to the studios of Fox News and MSNBC; plenty of average folks comment on current events during meetings at work or around the dinner table. To be a pundit, as I see it, is simply to speculate, extrapolate, and theorize about present events in the news based on personal experience and expertise. That’s where my op-ed went wrong. I took an activist stance in defense of JCCs and argued that democratic pluralism offended my assumed perpetrator, rather than asking, “How can the history of the JCC movement help us understand why it might be targeted?” I offered an interpretation of a current event and argued why I believed it to be true. That was not necessarily a bad thing, and it could have provided a useful service to the audience had the story not ended with a third-act plot twist. But had I asked that question instead, I would have done the work of an educator, foregrounding the context rather than my own (incorrect) analysis.
* * *
As of late, it has become fashionable to lament the death of the public intellectual—writers and commentators like Noam Chomsky, Susan Sontag, Gore Vidal, William F. Buckley Jr., Norman Mailer, and Christopher Hitchens—who used current events and trends as a launching point to discuss the larger questions of the human condition. First of all, this is argle-bargle. White, predominantly male philosophers are still given platforms from which they can monologue their ideas at the masses, though I believe that the era of their supremacy has, mercifully, passed. The fragmentation of the media and publishing industries and the declining market share of the legacy outlets have allowed new, diverse voices—women, persons of color, LGBTQIA individuals, immigrants, and religious minorities, among others—to find audiences interested in their ideas and arguments. Moreover, plenty of academics contribute op-eds, offer public lectures, serve on museum advisory boards, and even write New York Times bestselling romance-adventure series steeped in scholarly research.
“I want to make it clear that I am not saying that in order to write well, or think well, it is necessary to become unavailable to others, or to become a devouring ego. This has become the myth of the masculine artist and thinker; and I do not accept it.”
— Adrienne Rich, “When We Dead Awaken: Writing as Re-Vision,” 1971
Second of all, this narrative of decline is a response to the rise of pundits like Bill O’Reilly, who serves his political agenda with a small side of purported academic expertise—the cable equivalent of calling macaroni and cheese healthy because you mixed in a cup of frozen peas. O’Reilly is an egregious example of this phenomenon, but I think that the line between punditry and public intellectualism is not clear cut. In fact, according to the dictionary there is not much of a difference. A pundit is defined as an expert called on to share his opinion with the public. The term apparently derives from the Sanskrit word pandita, or learned man. And isn’t a public intellectual the epitome of the pandita?
The temptation might be to call those with PhDs the intellectuals and label everyone else as pundits, but plenty of PhDs engage in punditry and plenty of public intellectuals don’t have PhDs. That division also excludes artists, musicians, and novelists like Ai Weiwei, Beyoncé, and Chimamanda Ngozi Adichie, whose works regularly critique racism, gender inequity, and governmental and social neglect of vulnerable citizens. The temptation is even stronger to assume that these are static categories rather than descriptors of behavior. I want to suggest that we forgo labeling individuals as one or the other, and instead rethink “punditry” and “public intellectualism” as behaviors or approaches. To act as a public intellectual is to share ideas and arguments inspired by the present moment that provide the audience with context, helping individuals to better analyze and understand current events themselves. This is the work of the intellectual: to teach and inform and ask critical questions. It is not to advocate; it is to educate.
A public intellectual is someone whose opinions help to set the moral and aesthetic standards of her time; she draws fault lines, explains the stakes of present-day conflicts, interrogates collective intuitions. But more specifically — and more strangely — a public intellectual is someone who articulates alliances between seemingly disparate cultural and political opinions. It’s not self-evident that one’s stances on, say, abortion and what counts as a good movie should align, but they do, remarkably, again and again. To believe in enough of these correlations, and to convince others that you are right, is the role of the public intellectual. It is to possess Susan Sontag’s definition of intelligence as “a kind of taste: taste in ideas.”
— Alice Gregory, “Is it still possible to be a Public Intellectual?,” New York Times, November 24, 2015
This is righteous work, but punditry can also be righteous. Sometimes the best approach is to take a strong, unequivocal stand for what you believe in, even when your information is incomplete and you are forced into the realm of speculation and hypothesis. It’s a mistake to see public intellectualism as the only virtuous and high-minded contribution to public discourse.
There is a particular value to acting as a public intellectual, however, and that is to model informed citizenship and mold an informed citizenry. Recently a friend remarked to me that she only hears citizenship discussed in terms of passports and immigration these days, not in reference to voting and civic participation. Her point reminded me of the hundreds of pages of archival sources, written in the 1940s, that I read as part of my dissertation research. The frequent mentions of democracy and citizenship are noticeable, in contrast to their relative absence today, and signal that people valued these concepts (even if they could not agree on their meaning). Indeed, even an institution as specialized and exceptional as Jewish Community Centers claimed in the 1940s that their purpose was to promote “the development of Judaism and good citizenship” in their members.
Of course, this was during a period of war that pitted social democracy against authoritarian fascism; Americans have understandably become more jaded about democratic participation in the intervening years as political leadership ignored the will of the people and agreed to wage the Vietnam War, the Wars on Terror, and neoliberal capitalism’s war on the middle class. I cannot fault Americans for believing that there is little benefit to being an informed voter, when they have watched their elected representatives repeatedly do the bidding of moneyed interests rather than acting in their constituents’ best interests, and I cannot fault Americans for replicating the self-serving individualism modeled by those who wield their power over us. But the truth is that we are all intermeshed, that our decisions affect one another, and that the collective still matters. This is what humanistic and social science research bears out: none of us exists in a vacuum. Scientists have even found evidence of trees communicating with one another through networks of fungi. As citizens of a shared collective, a democratically governed republic, the onus rests on us to come together and make smart decisions.
So as a historian acting as a public intellectual, my role is to tell stories that show how decisions have cascading effects on others, and to explain how and why people made those decisions. As a historian acting as a pundit, I may even justify certain ideas and call for certain actions. The lesson from my unsavory op-ed is that we must make conscious choices about when and why to favor one approach over the other, and, if choosing punditry, to humbly admit when we are speculating or theorizing about present events in service of our interpretations and analysis.