Passover: A Holiday of Love and Liberation

Earlier this morning, my husband and my mother were in the kitchen strategizing about how to cook a brisket and bake almond macaroons at the same time. My husband patiently reviewed the detailed timeline he had outlined two days ago: the gefilte fish out of the oven by 10 so the macaroons can get in and out by 11:30, when the vegetables go in, followed by the brisket and then the meatballs at 4:30. A few minutes later a storm rolled in, knocking out the power and forcing them to figure out how to continue without a functioning oven.

Over the past week, Kevin has read through recipes, consulted with my mom, written and revised shopping lists, visited two grocery stores and Bed, Bath, and Beyond, helped switch out the normal dishes for the Passover dishes, cleaned out the fridge and pantry, and cooked. This is what true love is: taking on the most labor-intensive holiday of your wife’s religion despite being an atheist. My family does this to feel connected to an ancestral tradition and to a global community. Kevin does this to feel connected to us. 

* * *

Last night, we went out to Los Pollos for dinner with my parents. We sat at a picnic table on the restaurant’s porch, enjoying our last rice and beans before a week without bread, grains or legumes. My father fumbled with the ribs we had ordered, struggling to remove a serving for himself.

“Have you ever had ribs before?” Kevin asked.

“We had Chinese spareribs on our first date,” my mom recalled, “which almost didn’t happen because your father told me the wrong corner to meet him on, the southeast instead of the northeast. Then we saw Yentl, which your father hated.”

My father nodded in agreement.

“Abba,” I said, “tell me your version of how you and mom met. I’ve only heard the story from her perspective.”

“It was in December of 1983, at a Hanukkah party organized by and for Israelis at NYU. It was held at HUC [Hebrew Union College] in the West Village. I went with my friend Shaul…”

“Describe Shaul to them, Ido.”

“Shaul was also at NYU, writing a dissertation on dreams in the Hebrew bible. He was diminutive and dressed slickly.”

My mother interrupted. “I went with a friend, an occupational therapy student, who was short and I was tall. I convinced her that we should go over and dance and she started dancing with Shaul and I started dancing with your tall father.”

“Yes. We danced and then I asked her for her number. I didn’t have my cell phone so she couldn’t just type in her number. She had to write it down on paper.”

“Show them, Ido.”

I gasped. I had heard this story from my mother so many times and never once did she indicate that any evidence remained.


“It’s practically disintegrated.” Nevertheless, my father pulled out his wallet. Out of one of the smallest compartments he extracted the 35-year-old scrap of paper, in two pieces. In her familiar handwriting, but in Hebrew, she had written her name—Jodi Barkin—and two phone numbers, one for her apartment and one for her parents’ house on Long Island, where she often spent the weekends. I flipped it over and found course listings for HUC seminars. They must have picked up whatever was on hand in the room.

My father called a few days later and asked my mother out to a Chinese restaurant at 97th and Broadway. By New Year’s they were living together in my mom’s studio apartment at 71st and Columbus. In December of 1984, one year after they met, my father proposed.

* * *

A few years ago I invited a friend to have lunch with me and my parents in Georgetown—I can’t remember why we were in D.C., but this friend lived there. It was her first time meeting my folks, and as we were leaving she turned to me and said, “Your parents are so in love.” It took me by surprise, because although I knew my parents had a happy marriage and that they loved each other, I did not realize that it was evident (or of interest) to other people. It’s only now that I am older, having seen many unhappy and dysfunctional relationships, that I realize how lucky I am to have been able to take love like my parents’ for granted.

* * *

Love may seem like an awkward topic of reflection for a holiday that’s about freedom from the bondage of slavery, but I would argue that it’s the crux of freedom’s goodness. There is an anecdote in Ta-Nehisi Coates’ Atlantic article “The Case for Reparations,” which I assign to my undergraduate students, that describes an enslaved man watching his wife and children being sold away.


Every time I read this article, this passage makes my chest squeeze and my breath hiccup. The image is so vivid, and if the situation is not relatable, the sorrow of loss certainly is. Students always bring this anecdote up in discussion. Whereas the labor and economy of slavery are abstract to them, family and love are not. Liberation from slavery (whether in Egypt or the Americas) was not only motivated by love—the need for agency, autonomy, safety, equality, and power were also essential drivers—but it is my students’ visceral understanding of the former and their uneven experiences of the latter that provoke their empathy. Reading this anecdote, students come to understand that slavery is more than just a “bad” thing to do to other people; slavery is a personal terrorism of separation from, loss of, and grief for the people you love.

* * *

So I feel fortunate, on this holiday, for my freedom and the love that surrounds me. It is not to be taken for granted. It is a reminder to fight for a just world in which everyone shares the same freedom. For once we were slaves in Egypt.

You Can Love Cities without Hating the Suburban Shopping Mall

I am in Philadelphia for the annual conference of the Organization of American Historians. The meeting is being held at a hotel near the convention center and Reading Terminal Market, in the heart of downtown Philly. Here the buildings are tall, the sidewalks are crowded, and at 8 am this morning commuters poured out of the subway stations. 

My close friend Amanda and I are staying in a neighborhood about a mile away, at the home of my cousins-in-law. It’s on the border between Hawthorne and Queen Village, on a quiet street of row houses just off bustling 8th Street. Within walking distance are hundreds of bars and restaurants and unique local businesses. There is a park a block away, and on our way to the conference this morning we passed many neighbors on their morning rounds with their dogs.

This is the beauty of cities: the density, cosmopolitanism, and options. It’s why I love cities, and study cities, and travel to cities whenever I have a chance. Despite this, there are two suburban institutions that I (controversially) find superior to their urban counterparts: the grocery store and the mall.

There is something to be said for the limited selection and convenience of a small neighborhood grocery, but suburban grocery stores are cheaper, offer more choices, and have aisles three carts wide. You never get stuck behind someone trying to choose between pamplemousse and passion fruit La Croix. Yes, you have to drive there, but there is always ample parking. It’s a pleasant experience.

The mall is superior to the urban alternative of shopping at individual, non-contiguous retail locations. It is much more convenient to enter a single, climate-controlled edifice where all of your shopping needs are served. If you need an outfit for an event, or a new suit, you can walk around comparing options until you find stylish, well-fitting pieces at your price point. Then you can find shoes to go with it, eat deliciously shitty Chinese food for lunch, and walk out with a Cinnabon for dessert. It’s all right there! When it’s hot or cold or rainy, you’re protected from the elements. When you need to kill some time, you can window shop and exercise at the same time. If it’s your local mall, you’ll likely run into an acquaintance you haven’t seen in a while—though you could also run into your high school English teacher at Victoria’s Secret. Either way, it’s a quasi-civic space that brings people together.

I hear your arguments that the mall is a soul-sucking, anonymous, conformist, local-business-killing capitalistic hell hole. But sometimes you need something, and that something needs to be acquired quickly, or you need to try it on first, or you don’t know which store will have it. And that’s when the suburban mall beats urban retail. 

Photo by author at the Smithsonian National Museum of American History, Washington, D.C. March 2019.

Although the topic of this newsletter is light, I have one new and one new-ish essay on the internet this week that tackle weightier subjects. 

The first is The Historian's Craft: Thoughts on Reading and Making History in the Wake of Tree of Life, part of a series on the blog of the Political and Legal Anthropology Review entitled Speaking Justice to Power: Local Pittsburgh Scholars Respond to the Tree of Life Shooting. This is the third of three essays I wrote about the tragedy. The first, on teaching after the shooting, can be found on The Metropole. (The second is currently unavailable).

I also re-published the first essay from my Brisket Patreon project, The Pundit vs. The Public Intellectual, on my personal blog. Throughout the coming year I will be making these essays available, without a paywall, on my blog. I hope you find them thought-provoking reading!

The Pundit vs. The Public Intellectual

One year ago, I launched a writing project on Patreon called Brisket. It was a home for essays I was writing that needed to be written “low and slow.” I recently sunset Brisket, and will be sharing the essays here on my blog throughout the coming months. This was the inaugural piece, published in April 2018.

I’ll begin with a story. In February of 2017, I was finishing a dissertation about the history of Jewish Community Centers in the United States. Also in February of 2017, Jewish Community Centers in the United States began receiving phoned-in bomb threats on a weekly, then a daily, basis. Local papers from San Francisco to Nashville to St. Paul to New York ran articles detailing the events of those frightening days: a call received, a Center evacuated, police summoned, the building searched, an all-clear, and a politician quoted condemning anti-Semitism and lauding their community for its tolerance. 

It was the perfect opportunity to write an op-ed. To the New York Times and Washington Post I pitched the question: Why was the perpetrator targeting JCCs and not synagogues? I argued that it was because the Jewish Community Center, unlike synagogues, served both Jews and non-Jews. As a pluralistic space, it represented the fullest threat of intermixing between white Christian Americans and the Other—an affront to the “alt-right” white nationalist groups that I assumed were perpetrating the calls.

The Grey Lady passed, as did the Post, and so did the Jewish Forward and some other regional newspapers. The online op-ed page of the Nashville Tennessean finally accepted it, but the editor published it almost a month after I submitted my final draft. By the time it appeared online, in late March, the facts of the story had fundamentally changed. A Jewish teenager in Israel, not an American white nationalist, had been arrested and charged with perpetrating the bomb threats. And now my name appeared above an untimely, outdated argument that made no sense but would nonetheless live forever on the internet.

Understandably, the experience left me with a sour taste in my mouth. No one could blame me for assuming that the threat was domestic, a product of the hatred that the Republican presidential candidate intentionally stirred up during the 2016 campaign. Moreover, the paper should have killed the op-ed rather than publish inaccuracies. But it led me to realize that I had unwittingly engaged in punditry.

When we think of pundits, it is predominantly television personalities who come to mind. Whether Glenn Beck and Sean Hannity on the right or Chris Matthews and Rachel Maddow on the left, we know that they are filtering their news analysis through a sieve of moral values and political positions. This behavior, however, is not limited to the studios of Fox News and MSNBC, and many average folks comment on current events during meetings at work or around the dinner table. To be a pundit, as I see it, is simply to speculate, extrapolate, and theorize about present events in the news based on personal experience and expertise. That’s where my op-ed went wrong. I took an activist stance in defense of JCCs and argued that democratic pluralism offended my assumed perpetrator, rather than asking, “How can the history of the JCC movement help us understand why it might be targeted?” I offered an interpretation of a current event and argued why I believed it to be true. That was not necessarily a bad thing, and it could have provided a useful service to the audience if the story had not ended with a third-act plot twist. But had I taken the latter approach I would have done the work of an educator, foregrounding the context and not my own (incorrect) analysis.

* * *

As of late, it has become fashionable to lament the death of the public intellectual—writers and commentators like Noam Chomsky, Susan Sontag, Gore Vidal, William F. Buckley Jr., Norman Mailer, and Christopher Hitchens—who used current events and trends as a launching point to discuss the larger questions of the human condition. First of all, this is argle-bargle. White, predominantly male philosophers are still given platforms from which they can monologue their ideas at the masses, though I believe that the era of their supremacy has, mercifully, passed. The fragmentation of the media and publishing industries and the declining market share of the legacy outlets has allowed new, diverse voices—women, persons of color, LGBTQIA individuals, immigrants, and religious minorities, among others—to find audiences interested in their ideas and arguments. Moreover, plenty of academics contribute op-eds, offer public lectures, serve on museum advisory boards, and even write New York Times bestselling romance-adventure series steeped in scholarly research.

“I want to make it clear that I am not saying that in order to write well, or think well, it is necessary to become unavailable to others, or to become a devouring ego. This has become the myth of the masculine artist and thinker; and I do not accept it.”


— Adrienne Rich, “When We Dead Awaken: Writing as Re-Vision,” 1971

Second of all, this narrative of decline is a response to the rise of pundits like Bill O’Reilly, who serves his political agenda with a small side of purported academic expertise—the cable equivalent to being able to call macaroni and cheese healthy when you mix in a cup of frozen peas. O’Reilly is an egregious example of this phenomenon, but I think that the line between punditry and public intellectualism is not clear cut. In fact, according to the dictionary there is not much of a difference. A pundit is defined as an expert called on to share his opinion with the public. The term apparently derives from the Sanskrit word pandita, or learned man. And isn’t a public intellectual the epitomal pandita?

The temptation might be to call those with PhDs the intellectuals and label everyone else as pundits, but plenty of PhDs engage in punditry and plenty of public intellectuals don’t have PhDs. This also excludes artists, musicians, and novelists like Ai Weiwei, Beyoncé, and Chimamanda Ngozi Adichie, whose works regularly critique racism, gender inequity, and governmental and social neglect of vulnerable citizens. The temptation is even stronger to assume that these are static categories rather than descriptors of behavior. I want to suggest that we forgo labeling individuals as one or the other, and instead rethink “punditry” and “public intellectualism” as behaviors or approaches. To act as a public intellectual is to share ideas and arguments inspired by the present moment that provide the audience with context, helping individuals to better analyze and understand current events themselves. This is the work of the intellectual: to teach and inform and ask critical questions. It is not to advocate; it is to educate.

A public intellectual is someone whose opinions help to set the moral and aesthetic standards of her time; she draws fault lines, explains the stakes of present-day conflicts, interrogates collective intuitions. But more specifically — and more strangely — a public intellectual is someone who articulates alliances between seemingly disparate cultural and political opinions. It’s not self-evident that one’s stances on, say, abortion and what counts as a good movie should align, but they do, remarkably, again and again. To believe in enough of these correlations, and to convince others that you are right, is the role of the public intellectual. It is to possess Susan Sontag’s definition of intelligence as “a kind of taste: taste in ideas.”

— Alice Gregory, “Is it still possible to be a Public Intellectual?,” New York Times, November 24, 2015

This is righteous work, but punditry can also be righteous. Sometimes the best approach is to take a strong, unequivocal stand for what you believe in, even when your information is incomplete and you are forced into the realm of speculation and hypothesis. It’s a mistake to see public intellectualism as the only virtuous and high-minded contribution to public discourse. 

There is a particular value to acting as a public intellectual, however, and that is to model informed citizenship and mold an informed citizenry. Recently a friend remarked to me that she only hears citizenship discussed in terms of passports and immigration these days, not in reference to voting and civic participation. Her point reminded me of the hundreds of pages of archival sources, written in the 1940s, that I read as part of my dissertation research. The frequent mentions of democracy and citizenship are noticeable, in contrast to their relative absence today, and signal that people valued these concepts (even if they could not agree on their meaning). Indeed, even an institution as specialized and exceptional as Jewish Community Centers claimed in the 1940s that their purpose was to promote “the development of Judaism and good citizenship” in their members.

Of course, this was during a period of war that pitted social democracy against authoritarian fascism; Americans have understandably become more jaded about democratic participation in the intervening years as political leadership ignored the will of the people and agreed to wage the Vietnam War, the Wars on Terror, and neoliberal capitalism’s war on the middle class. I cannot fault Americans for believing that there is little benefit to being an informed voter, when they have watched their elected representatives repeatedly do the bidding of moneyed interests rather than acting in their constituents’ best interests, and I cannot fault Americans for replicating the self-serving individualism modeled by those who wield their power over us. But the truth is that we are all intermeshed, that our decisions affect one another, and that the collective still matters; this is what humanistic and social science research bears out: none of us exists in a vacuum. Scientists have even found evidence of trees communicating with one another through networks of fungi. As citizens of a shared collective, a democratically governed republic, the onus rests on us to come together and make smart decisions.

So as a historian acting as a public intellectual, my role is to tell stories that show how decisions have cascading effects on others, and to explain how and why people made those decisions. As a historian acting as a pundit, I may even justify certain ideas and call for certain actions. The lesson from my unsavory op-ed is that we must make conscious choices about when and why to favor one approach over the other, and, if choosing punditry, to humbly admit when we are speculating or theorizing about present events in service of our interpretations and analysis.    

My Week of Known Knowns and Known Unknowns

“The more you learn,
the more you know you don’t know!!”

I found myself nodding vigorously in agreement when I read this line on Tuesday morning. It was written in response to a question I asked a fellow historian in her Member of the Week post on The Metropole. One of my responsibilities as co-editor of the Urban History Association’s blog is to run the Member of the Week series. I send emails asking fellow urbanists to respond to a set of canned questions that apply to almost everyone, but I also write one personalized question for each member. This week I featured a PhD who left an architectural preservation consulting firm to start a whole-animal butchery business with her husband. “The more you learn, the more you know you don’t know” was her immediate reaction when I asked her what parallels she sees between academia and entrepreneurship.

My week has been defined by this uncomfortable tension between knowing and the unknown. Two weeks ago I began teaching a mini-course at Carnegie Mellon on the evolution of “the ghetto” from Venice in 1516 up to the segregated black neighborhoods of present-day American cities. The first four class sessions are focused on Jews in the sixteenth through nineteenth centuries, and the rest of the course focuses on black Americans in the twentieth century. There are a few Jewish students in the class, but the majority are not, and so I have spent a lot of time explaining everything from the definition of “a Jew” to the diaspora to the distinction between “Ashkenazi” and “Sephardi.” On Wednesday night I broke down the differences between Orthodox and Reform Judaism, and described why tensions ran high between the established German Jewish community that immigrated to the US in the mid-nineteenth century and the Eastern European Jewish immigrants who arrived at the turn of the twentieth century. Standing at the front of the room, in the role of professor and speaking as a Jew, I am, for most of my students, their foremost authority on am Yisrael, the Jewish people.

And then… I go home, and return to my real life and the recognition of how little I know. On Twitter this past weekend I noticed a stream of tweets directed at a non-Jewish woman who described herself as a shiksa and expressed a desire to “rage bake” Trumpentashen in a show of interfaith solidarity and political resistance. I thought her tweet was fine, and didn’t blink twice at either her use of shiksa or her baking hamentaschen. I appreciated the interest in our tradition! As I read through the replies, however, I learned about the origins of this derogatory term for non-Jews—and how hurtful it is for converts to Judaism, in particular. And I realized that other Jews on Twitter felt like she was appropriating Jewish culture.

It prompted me to think about the overlaps in the Venn diagram between myself and other American Jews. I grew up in a smaller city with a small Jewish community. I was one of only a handful of Jews in my high school. Back then I would burst with pride and happiness at any expression of interest in Jews and Judaism from my peers, because the norm was being invited to church or being overlooked. My senior prom was scheduled for the first night of Passover, forcing Jewish students to choose between attending seder and attending prom. That one of my peers would think to make hamentaschen would have thrilled me, not felt like appropriation.

And then yesterday on Hey Alma (which, by the way, I can’t recommend more highly) I read an article about Christian seders. I completely understand the argument that this is appropriation, but I also cannot tell you the number of times that I have been able to connect with non-Jews because their experience at their church seder familiarized them with Jewish traditions. I have hosted non-Jews at my own seders here in Pittsburgh who felt more comfortable joining in an interfaith celebration because they had a basic familiarity with how the meal would go. Do I think that this ultimately justifies the practice? Probably not. But I also don’t find myself mad about it.

Where I do find myself is stuck in the knowing/unknowing murk. I know enough to teach my non-Jewish students about my identity, religion, and culture, and I know enough to know how much I don’t know about Jews, Judaism, or Jewish culture. I think this is a point that often gets lost in our contemporary debates about identity politics. Many members of identity groups are still learning the contours of their histories and cultures. Tweets and articles like the ones I read this week are informative and important, but they also speak on behalf of the entire Jewish community. Speaking for myself, I find this behavior alienating—even when it’s educational and in our own defense.


In addition to shouting out Hey Alma, which even some of my non-Jewish millennial girlfriends enjoy reading, I want to recommend a few other things I've been enjoying on the internet lately.

I love watching @julierosealex do her bold, colorful eye makeup looks on Instagram Live. 

I'm obsessed with the podcast Erin and Aliee Hate Everything, particularly their feminist/queer hot takes on politics and pop culture. 

Apps are not really the internet, but we do get them from the internet, ergo I include Woody Puzzle--my current favorite mind-numbing game. 

The Week of Lambapalooza

Yesterday was the best day of the year. Our close friends have an annual tradition of throwing a big party for their wide circle of friends and colleagues; the centerpiece of the celebration is a roast lamb (and in recent years a pig as well). For the past five years I have helped with the preparation and roasting of the lamb. We start prepping early in the morning and, after we get the fires going, sit in the sunshine and read and jam to classic rock. And then people begin arriving around 3 PM and the party gets going and the drinks start flowing. 


Here's what captured my attention this week...

I'm reading: Rebecca Traister's All The Single Ladies, which I'm racing to finish before the Feminist Book Club meets on Tuesday evening at White Whale Bookstore to discuss it (plus I want to return my copy to its owner when I visit her next weekend in Atlanta). This morning I finally finished Too Fat, Too Slutty, Too Loud by Anne Helen Petersen. Petersen devotes each chapter to a prominent woman, from Serena Williams (Too Strong) to Lena Dunham (Too Naked). With each celebrity, Petersen identifies how the controversial aspects of their fame reveal a boundary line between what contemporary American society considers feminine and what it considers unruly. Rather than delivering a single thesis about how misogyny currently operates and what its effect is on women, these case studies show that there are multiple ongoing challenges to the norms and conventions of femininity:

  • What a woman's body should look like (Too Strong; Too Fat; Too Old; Too Pregnant)
  • What a woman's voice, both literal and figurative, should sound like (Too Gross; Too Shrill; Too Loud)
  • How a woman should behave (Too Slutty; Too Queer; Too Naked)

For example, in the chapter on too-pregnant Kim Kardashian, Petersen describes how magazines like Us Weekly and People broke the taboo of printing pictures of pregnant celebrities after their advertisers realized that there was a lot of money to be made by selling products to pregnant women. The way the magazines portrayed pregnancy, however, was as an easy, natural, and transcendent experience in every woman's life. When Kim had preeclampsia, rapid weight gain, and pain during her first pregnancy, Petersen argues, she shattered the myth of this "beautiful" life-cycle experience and faced immense public ridicule for not having the typical "cute" baby bump; furthermore, Petersen reminds readers that despite her discomfort Kim refused to wear dowdy maternity clothing and instead walked the Met Gala red carpet in a form-fitting Givenchy gown (albeit one with a bold floral pattern that overwhelmed and obscured her). Kim, in spite of (and because of) being known for her glam put-togetherness, challenged the social conventions of what a woman should look like while gestating a baby. As Petersen argues in the book's conclusion, Kim and the rest of the celebrities she writes about are fundamentally normative--they could not be deviant or unruly otherwise--but that does not minimize the significance of their challenges to the particular standards of femininity that they encounter.

The case studies are the book's greatest strength and also its primary weakness. Despite insightful analysis of media portrayals of Hillary Clinton as "too shrill," that chapter seemed to drag and I just wanted to move on to another topic. And my favorite chapters, on Nicki Minaj and Jennifer Weiner, ended too quickly and left me wanting a book-length investigation of the topics they explore. Yet the book succeeds, even in the moments when you find yourself disengaged, because Petersen consistently reveals fractures in the ways that men and women think about, discuss, perform, and value gender that you had never noticed or questioned before.

I'm listening to: Kanye West's new album, out of curiosity.

I'm watching: "Hard Knock Wife," the new comedy special from Ali Wong. If you enjoyed "Baby Cobra," Wong's first stand-up special on Netflix, you're sure to enjoy this one. The storytelling is less tight--there's no big reveal or punchline at the end of this one--but Wong is more explicitly feminist, calling for policies that protect and benefit new parents.

Enjoy posts like this one? Check out Brisket to read more about what's on my mind--just bring some bread to go with the meal! Right now I have a new video up where I introduce the theme for June: feminist entrepreneurship. I've made the post public, so you do not have to be a patron to view it.