
What Barry Jenkins Missed in His Adaptation of If Beale Street Could Talk


“I had no childhood,” James Baldwin informed a French journalist in 1974. “I was born dead.” The dark sentiment seemed to reflect Baldwin’s own despair over his childhood; his family was poor, his parents were restrictive, he felt torturously out of place in the narrow world of church and school his parents wished him to occupy, and he learnt, from early on, that his blackness made him both invisible to white Americans in certain situations and monstrously, dangerously present in others.

If childhood is often characterized (somewhat idealistically) in literature as a move from youthful innocence to adult experience, Baldwin appeared to be indicating that he had moved abruptly, sharply, from one to the other. The experience of the black child, he would write that same decade in a famous piece for The New York Times, was feared by white Americans, so black children had to be kept in the night-dungeon of ignorance and simplicity, feared for their experience of, and their refusal to “repudiate” their experience of, the iniquity and inequity of the world.

He said this to the journalist the same year he published what I consider his masterwork, If Beale Street Could Talk, a late novel about black Americans in love, systemic racism, and the injustice of the prison system. It was both a searing critique of American racism and a tender portrait of young black New Yorkers falling in love and sticking by each other, despite the oceanic weight of the world’s unfairness. Somewhat ironically, two years later, he would release Little Man, Little Man, a paean to the beauty and dangers of a black child’s experiences in Harlem, in which TJ, an African-American child based on Baldwin’s nephew, Tejan, provides a glimpse into a life as full of mundane fun as of threat—like the police officers killing black men that TJ describes like someone who has seen it all too often. A black child, the lesson seems to be, must often grow up quicker than other children. All of this, to me, made Beale Street astonishing. It had despair, but it also, importantly, had love. It lives resonantly today, with its explorations of police brutality, sexual assault, racist incarceration, and more.

Sadly, Barry Jenkins’ adaptation of If Beale Street Could Talk is a failure, at least compared to the majesty of its source material and to Jenkins’ extraordinary earlier venture, Moonlight. Jenkins’ film softens and often entirely removes the sharp edges of Baldwin’s novel. The most painful and shocking moments and messages are faint or don’t appear at all; instead, we have a film that must be praised for its gorgeous depiction of black Americans in love, but which could have been stronger with less glacial pacing and more faithfulness to the barbed, painful parts of Baldwin’s book. It has been defanged, declawed—and Baldwin’s novel works precisely because of those jagged, painful edges.

Baldwin deserves better.

Like the book, Jenkins’ film follows Tish and Fonny, a pair of young black Americans in Manhattan. Fonny has been accused of rape by a Puerto Rican woman, who simply claimed that a black man did it; despite the fact that Fonny seems to have been on the other side of the city when she was assaulted, he is put behind bars anyway by a racist white cop, Bell, who has long yearned to arrest him.

The film’s policeman is a model of grotesquerie, walking forward with a grimacing swagger and gargoyle-like features; he looks bad, and so we can believe he’s “one of the bad ones.” But the choice to portray Bell in so exaggerated a manner misses the point of why systemic racism is so pernicious: we cannot tell, just by looking at someone, if they will be humane or horrific, exemplars or executioners. Racism may appear grotesque, as it does in the film’s policeman, but bigotry can as easily come from someone who appears beautiful and gentle.

There is a long tradition in art of associating evil with ugliness, alongside a similar tendency to cast night and darkness as bad and dangerous; when these elements come together, they feed assumptions about the supposed dangerousness of a dark-skinned person walking toward you on a sidewalk at night. The more nuanced approach is to understand that good and evil inhabit all bodies, all forms. In the book, Bell is bad—“he’s already killed one black kid,” Tish reveals—but it’s clearer in the novel that he is simply a particularly vivid example of a racially unjust system.

Ironically, the officer’s most grotesque act in Baldwin’s novel is never mentioned in the film: when the Puerto Rican woman says it was a black man who assaulted her, Fonny is captured and made the sole black man in the lineup, making him the only possible choice. The absurd cruelty of this is clear in the novel, but the film removes this small plot thread altogether, thus softening the edges of the barbaric cop who arrests Fonny. We know, in the film, that the game was rigged against him; what we do not see is just how severely it was rigged.

It is not enough that we see a racist cop on screen—anti-black brutality from officers is nothing new in film, from the explicit murder of a young black man named Radio Raheem in Spike Lee’s Do the Right Thing (1989) to the casual racism of cops in more recent films like Green Book and BlacKkKlansman. In Do the Right Thing, worlds collide in Bedford-Stuyvesant—Italian-Americans, Korean immigrants, African-Americans, Afro-Latinx New Yorkers—and it results, in a horrific final scene, in a white cop choking a black man and even lifting him, sputtering, off the ground, keeping the pressure on his throat long after another cop yells at him to stop. The officer wanted to execute him, and did, and pretended he did not by poking Radio’s body, urging his corpse to get up. This officer does not look like a villain; his very banality is the point, for he is the murderer walking in day or night, eager for our blood, our screams. As a result, this racist cop is more relevant in today’s discourse than Beale Street’s, because he is more terrifying in his ordinariness.

Baldwin’s novel, too, hints at the horrors of the carceral system. Daniel, Fonny’s friend and later witness (albeit kept from testifying by being put in prison again), reveals, in a chilling moment, that he was raped in prison. (He was arrested for supposedly stealing a car, despite not even knowing how to drive, and because he had “a little grass” on him.) Prisoners rarely get a fair light shone on their world; Baldwin has Daniel reveal, sobbing the night Fonny is arrested, that “he had seen nine men rape one boy: and he had been raped. He would never, never again be the Daniel he had been.” He has become consumed and convulsed by trauma. Beyond this, Daniel, once arrested again, will be “beaten until he changed his testimony”—that is, white guards will maul him until he decides to lie and claim Fonny was, in fact, the man who assaulted the Puerto Rican woman. In the film, Daniel alludes, with horrified features, to what happened in prison; his expression speaks volumes by itself, as does his inability to articulate it. But Baldwin bravely chose to reveal Daniel’s trauma, shining light on abuse of many kinds. Jenkins’ film, by comparison, seems to skirt the issue.

Perhaps the movie’s ending differs most, leaving us on a more saccharine note than the novel’s uncertainty and funereal beauty. Fonny’s father commits suicide; shockingly, this is not present in the film. In the final sentence, Tish reveals that her child has been born, but its great wailing will not stop: “the baby cries and cries and cries and cries and cries and cries and cries and cries, cries like it means to wake the dead.” The child, then, is associated as much with life as with death; its howls call up the dead, and in a novel so deeply about chains on black bodies, the dead here perhaps refers to the vast stream of murdered black Americans through this country’s red centuries. Fonny is described as “working on the wood, on the stone, whistling, smiling,” yet the child’s lamentations “from far away” are “coming nearer” and nearer, until they swallow up both Tish and Fonny. Fonny may not even still be with Tish, if the child is so “far away”; it is implied he may be home again, but we never learn this for certain. All Baldwin reveals is that the child shrieks, banshee-wails, a strikingly morbid tone on which to end the book.

The sentiment seems to poignantly echo Baldwin’s declaration to the French journalist that he was born dead. Tish and Fonny’s child may be alive, but it is associated with death, its new bones calling to the old ones all around it. The ending, like the child’s future, is ambiguous, shifting from the image of a smiling Fonny to something more sepulchral and sinister, perhaps because Baldwin knew that the only appropriate ending for his novel of black love in a fiercely racist world was one of twilight shades, of star and night.

In the movie, however, the child, now old enough to talk and walk, goes with his mother to see Fonny, who is still in prison. In a cute but cloying scene, the boy says a prayer over the snacks Tish has brought for Fonny. To be sure, an ominous note lingers; Fonny remains in prison, after all. But the chilling ambiguity of the novel’s ending does not translate here, and I left the cinema feeling cheated, almost, and upset at myself for feeling so, given the film’s momentous importance, but knowing, all the same, my response was justified.

Jenkins meant well. Beale Street is magisterial to look at and, at times, moving. Yet Baldwin deserves something that hews closer to his intentions. Ironically, I imagine Jenkins could be the director who pulls just that off; he simply hasn’t done so yet, at least for me.

If streets could talk, the title of book and film alike appears to ask, would we be prepared to listen to the stories they have to tell? Their stories, after all, would be scarier than any horror tale. For all the richness and wonder in Jenkins’ film, we are left without knowing what, exactly, Beale Street might tell us; we have glimpses of beauty, joy, and pain, but none approaching the comprehensive, emotional chiaroscuro Baldwin paints for us in his novel.


31 Books in 30 Days: David Varno on Lacy M. Johnson


In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member David Varno offers an appreciation of criticism finalist Lacy M. Johnson’s The Reckonings (Scribner).

*

We live in a world full of both punishment and injustice. What if, rather than continue along with the belief that the former cures the latter, we were to recognize that punishment actually perpetuates injustice? Outside the bounds of the long-running (though glacially slow-moving) discussion around criminal justice reform and structural racism, this can become a radical proposition. After all, it is human nature to punish, to shame, to seek retribution, to destroy our perceived offenders.

Lacy M. Johnson’s essay collection The Reckonings, a follow-up to her unblinking, transcendent memoir The Other Side (NBCC finalist in 2015), works outward in response to a question she often faces: “What do you want to have happen to him, to the man who did this to you?” “This,” Johnson knows, refers not just to the nightmare she survived and wrote her way out of in her last book, but to “all of the therapy, the nightmares and panic attacks, the prescribed medication and self-medication, the healing and self-harm.” The women who ask her this question understand what she is carrying, and they assume she must want him dead, her ex-boyfriend with whom she was in love, and who raped her, kidnapped her, threatened her with murder, and now lives free in a foreign country with a new life. Johnson’s answer surprises: “I want him to admit all the things he did…and then to spend the rest of his life in service to other people’s joy.”

This vision of atonement, of our global supply of joy restored and sustained by those who have taken from it, forms Johnson’s definition of a proper reckoning in our search for justice “where the crime is not intimate and personal but massive and public.” The essays in The Reckonings address racism, rape culture and misogyny, gun violence, and violence against the environment, building from the core of her own experience and impulse toward self-reflection and growth. The book is a revelation after another dark year of endless blockbuster books that are quick to capitalize on the problems we face but fail to show a new way forward.

If you read The Reckonings, you will begin to contemplate what it might really take for us to reach a reckoning with any one of our massive devastations. In “What We Pay,” she looks at the aftermath of the Deepwater Horizon explosion and oil spill to show that we need more than a payout from BP. All of us who are complicit in the endurance of the fossil fuel industry, she writes, must learn to “give at least as much as we take [from the Earth], to repair all that we’ve harmed.” In “Against Whiteness,” Johnson tracks her initial reaction to Audre Lorde’s critique of white feminism in “The Master’s Tools Will Never Dismantle the Master’s House.” As an undergraduate poetry student who saw herself on the margins of whiteness rather than the peak of privilege, she bristled at the definition of whiteness that was meant to encompass her.

Johnson’s candor opens up a space to contemplate the scope of white identity that many whites would prefer to simply reject. Nothing is accomplished by denying that one is a racist, but something could come from recognizing how one is complicit in “violences, large and small, that I am asked to accept,” that whiteness also means “power and privilege [that] depend on this acceptance, and also on the condition that I keep silent about it all my life.” Rather than prompt readers to perform or reject wokeness, Johnson invites them to see the truth for themselves.

Johnson is a writing professor at Rice University, and the classroom often becomes ground zero in her essays’ desire for artists of all backgrounds to create a space for reckoning. In “Art in the Age of Apocalypse,” a student claims, after the 2016 presidential election results, “There is nothing we can do.”

“What is art for,” Johnson responds, “if not precisely this moment?” It can feel like an impossible order, but Johnson has heeded the call herself. “Speak Truth to Power” is one of the strongest works to emerge in the wake of recent college and juvenile sexual assault cases and #MeToo, and in “The Precarious,” she examines her own relationship with gun culture in the wake of post-Columbine mass shootings.

In all of these works, the writing is just the beginning. Art also means seeking opportunities for activism. The book is a work of artist activism of the highest order. It is fueled not by anger but by love, compassion, empathy, and self-examination, with the goal of empowering the thousands of young developing artists who come up in “suburban garages and church basements . . . under bridges and on street corners with spray cans, in after-school programs and on playgrounds . . . putting their hands and voices to work each day trying to remake the world.”

__________________________________

David Varno is the VP of Tech for the NBCC and Digital Editorial Associate at Publishers Weekly. He is a former Dispatches editor for Words Without Borders, and his writing has also appeared in BOMB, The Brooklyn Rail, The Cleveland Plain Dealer, Electric Literature, Minneapolis Star-Tribune, Newsday, Paste, Tin House, and other publications. 

Marriage Isn’t the Only Plot for Love


I was about a week into being engaged—not yet reflecting on my qualms over our culture’s tendency to reward marriage over other kinds of commitment—when I started reading Briallen Hopper’s Hard to Love: Essays and Confessions. In Hard to Love, Hopper explores love outside of romantic attachment, giving relationships with friends, siblings, even objects in a storage unit their due. Hopper’s essays make the case that our love for others doesn’t have to be circumscribed into one primary relationship (in my case, my soon-to-be marriage)—our loves can be multiple and multiply nurturing.

It’s a message that can be difficult to find in our culture, wherein the marriage plot is still the prescribed narrative for love. My fiancé Robby and I have dated for five years and lived together for four, but until we performatively say our vows, our relationship will not receive the stamp of seriousness that only marriage can bestow. Now we’re checking off items on a list written by our society, which will reward us by cradling us and lifting us up once we are wed. We will benefit financially when we can mark “married filing jointly” on IRS forms. As an adjunct professor and freelancer, I will benefit by joining Robby’s employer-sponsored health insurance rather than paying for an increasingly expensive marketplace plan. If one of us ends up in a hospital, the other will be able to visit and make decisions as next-of-kin.

As Robby and I plan our very Brooklyn millennial wedding, we are reminded at every turn of the breadth of the wedding industrial complex that our capitalist culture has built to further this narrative. Every day I get served dozens of online ads for gowns (so many gowns!), bands engraved with our wedding date, wedding venues to book (“Our sophisticated venue is the perfect backdrop for your wedding”), items for our registry (“Dinner in with your forever date is better with Lenox®”), stores to register at (“You’ve found the person of your dreams. Now create the home of your dreams when you register with Williams Sonoma”). These ads push a specific vision of the wedding dream in which you must spend more and more money (or encourage your guests to spend it on you) to achieve the perfect reflection of your love.

Hopper writes in an essay titled “On Spinsters”: “in our culture, marriage is a choice, but it also isn’t. It’s a rom-com ending and a party with a cake, but it’s also a systemic mechanism that separates the enfranchised from the disenfranchised.” The marriage-related American dream was legally out of reach for certain communities for far too long: interracial couples have only been federally allowed to marry since 1967, and same-sex couples since 2015. For the black community, Hopper writes, “marriage has been made simultaneously compulsory and unattainable,” from “the cruelty of slavery to the callous ‘marriage cure’ of the George W. Bush administration.” And marriage has increasingly become a mark of socioeconomic and educational privilege, declining since 1990 among poor and working-class people without college degrees; that year, “51 percent of poor adults, 57 percent of working-class adults and 65 percent of middle- and upper-class adults were married,” according to research from the think tanks American Enterprise Institute and Opportunity America.

In the midst of all this, our culture is starting to rewrite the marriage plot, making it less of a given—or, at least, a given that we’re pushing back against and pushing off. Older millennials like me are getting married later—the average age for marrying in 2017 was 29.2 for women and 30.2 for men, according to The Knot’s Real Weddings Study. We’re pushing marriage back for a variety of reasons: wanting to spend more time getting to know one another before the commitment, feeling financially unprepared for marriage (because of our record levels of student debt, our lower levels of assets and income than previous generations’, and the aftershocks of coming of age during the Great Recession), and wanting to focus on building our careers first.

In Hard to Love, Hopper side-steps the marriage plot entirely—it’s not for her. But that doesn’t mean she’s closing herself off to love, or even to the creation of a family. While other recent books like Kayleen Schaefer’s Text Me When You Get Home have spotlighted the value of female friendships in a new way, Hopper’s personal essays are love letters, not just to the women with whom she celebrates “Galentine’s Day,” but also to the chosen family she has created and to the series of commitments and responsibilities she has taken on to sustain it. Hopper has fostered a love-filled life that doesn’t include any romantic attachments, and Hard to Love is a testament to that life’s fullness.

*

In “Lean On,” the opening essay of Hard to Love, Hopper articulates the two options for supportive love that our culture presents: either you have it in marriage, or you’re on your own. “Sometimes it seems like there are two American creeds, self-reliance and marriage, and neither of them is mine,” she writes.

Self-reliance is the bootstrapping American way. Hopper points out that dependence is “despised in our culture,” down to our language: “‘Codependence’ is a beautiful word that could mean mutual support but instead means mutual harm . . . ’Depend’ is an adult diaper brand that provides an essential product but also reinforces the connections between dependence, weakness, and public shame.”

The only kinds of dependence that seem “socially sanctioned” are “romantic partnerships,” specifically marriage. Hopper, who attended Yale Divinity School, writes, “marriage, in the words of the Book of Common Prayer, ‘was ordained for the mutual society, help, and comfort, that the one ought to have of the other.’” That’s the way I see my relationship with Robby: we each listen so that we can unburden our swimming minds, we each take care of cooking, dishes, and dog walking so that the other doesn’t have to, we scratch each other’s backs and throw out each other’s snotty tissues. Of course, the “mutual society” of marriage was also established for reasons beyond love: furthering the patriarchy, guaranteeing heirs and the exchange and inheritance of property, essentially selling the bodies of women.

Hopper might not fit into either the marriage plot or the independence myth, but she—like me, like all of us—is a “leaner.” She leans unapologetically, eager to fit her life into the crooks of others’. I found myself nodding along as I read Hopper’s “declaration of dependence”: “I experience myself as someone formed and sustained by others’ love and patience, by student loans and stipends . . . I believe we are all obviously a part of one another, elements in one ecosystem, members of one body, all of us at the mercy of capitalism, weather, genes, and fate.”

For love, Hopper finds a third way, a choice outside of the two American creeds: spreading out her dependence across friendships. She puts in work to keep her friendships sturdy: “I bake gingerbread for my friends in the winter and shortcake in the summer . . . across years and oceans, my distant friends and I co-create structure of togetherness through group texts and phone dates and regular reunions.”

As the collection progresses and Hopper writes about those she leans on in essays that reference everything from Cheers to The Fault in Our Stars, she makes clear that these third-way relationships are not effortless or easy—they can be as vexed and vexing as any romantic love. Each essay sparked new reflections for me, making me reconsider the meaningfulness and centrality of commitment outside of marriage, and why those commitments are worth their cost.

*

“Dear Octopus,” in which Hopper discusses her relationship with her brother and J.D. Salinger’s Franny and Zooey, was the essay that had my mind firing off questions about commitment and what we owe to one another.

Hopper grew up in an evangelical, hippie household with “hordes of siblings” but just one brother, with whom arguing was a form of affection. As a teenager, Hopper saw her and her brother’s debates reflected in Franny and Zooey’s conversations: “I recognized our posture and purposes, our inflections and allusion,” especially in the “insistently Christian” language.

The problems came when Hopper left home and stopped sharing a religious language, or “a sense of the truth,” with her brother: “To engage with [my brother] open-mindedly would have required me to make my right to love and work and take communion into something precarious and contingent and up for debate.” Similarly, she now looks at Franny and Zooey as representative of “a particular kind of harsh language and gendered judgment I needed to guard against.” Still, Hopper acknowledges that “families of blood and law” have a “visceral centripetal force” that draws us back. She writes: “I don’t believe that the sibling soul-mingling of youth can ever be undone . . . But that doesn’t mean it can be sustained.”

My sibling soul-mingling must be sustained. Unlike Hopper, I grew up with one sibling: my brother, John, who is three years older than me. Reading “Dear Octopus,” I thought about how my relationship with John is one that needs to mean a lot: we’re the only immediate family either of us has left. Our parents died when I was a young teen and John was an older one, and that forged us together—our “sense of the truth” lies in an understanding of the depths of loss. After Mom and Dad died, John and I transitioned into taking care of one another—we feel that we owe it to each other to do so.

But after reading “Dear Octopus,” I thought about the times I haven’t paid enough mind to our relationship. I thought about when Robby and I first moved in together—how John calmed me down when I lost the keys to the apartment I was leaving and how he carried box upon box of books up flights of stairs for me and how I was so wrapped up in cementing my romantic love by signing a joint lease that I missed for months how deep John had been sinking into a depression. I thought about the conversation I had with Robby in which I confessed my guilt and shame over my blindness to my brother’s needs. I thought about the resolve I made to myself then to never let my side of the rope go slack, and how Robby has helped me pull it taut—they have forged their own strong friendship—and how that’s part of why I’m marrying Robby: he is merging into my pre-existing ecosystem and I into his.

*

Friendships are different than sibling-ships—they aren’t predicated on the same depth of shared experience that growing up under the same roof can provide—and they can be undone. I have lost many friends, not to animosity but to lack of effort. While it’s easy for me to make friends when thrown into shared circumstances, I can’t always bring myself to consistently reach out and respond to being reached out to once we’re thrown back into our “real lives.” I find it difficult to make the shift from talking in person every day in, say, an office to texting or calling when that common space is no longer uniting us.

Hopper is a much more effortful friend. Many of the friendships she writes about in Hard to Love are ones that have gone through trials more complicated than living apart. In “Young Adult Cancer Story,” Hopper writes about her friendship with Ash, who has Stage IV esophageal cancer; Hopper is part of her care team. In “Hoarding,” she writes in part about moving in with her friend Cathy, who was “a tenure-track professor with a husband and a kid and a four-bedroom house” when Hopper was “a single, childless, broke divinity student.”

Each of these friendships benefits from an empathy that requires pushing past feeling understood and validated in experiences we share. Though “most of Ash’s experiences with cancer can’t be shared,” Hopper finds ways—like reading and watching The Fault in Our Stars—to understand. In “Hoarding,” Hopper puts in the sort of work that can be uncomfortable in a friendship that no longer works the way it used to: reflecting on ourselves, and on what our actions must look and feel like to the people we love. We owe anyone we lean on that kind of empathy and care, and we are damn lucky if we can live in a robust network of love that doesn’t start and end at marriage. As I get ready to wed, I am resolving to keep the lines that tie my ecosystem together alive, to reach out and pull friends and family in, to offer “mutual society, help, and comfort” to my loves outside my spouse.

Sense or Sensibility… What if Jane Austen Had to Choose?


How do you know if you will love Sense and Sensibility? What if it doesn’t happen to you? First-time readers of Austen’s fiction, knowing its reputation for literary greatness, may approach this novel with Marianne-like expectations. You want to be bowled over, to find charms in every sentence, or to discover that all the novel’s beauties are entirely shared on a first reading. It could happen to you. There are certainly those who love the book in ways that might seem imprudent or excessive.

Readers who love Sense and Sensibility in its original form are rarely describing their first encounters with the book. Instead, they’re describing what it means to read and reread it or to revisit it through film, television, and stage. These devoted readers have developed, rather than discovered, a Marianne-like inability to love the novel by halves while internalizing Elinor’s more measured approach to its prose. To love Sense and Sensibility—if you seek to—it’s crucial to enter its pages with gusto, as well as with deliberative care.

You may come to this novel already knowing the rough outline of the plot, having seen a recent popular film or stage version. Whether or not you’re reading the novel knowing how it all turns out, you may find the language challenging. If you read this book more slowly, you’ll be more likely to notice the way its language and its themes come together, as well as to appreciate its understated, wry comedy. It’s both a profoundly serious and an amusingly comic novel.

Look, for instance, at how the novel introduces Elinor and Marianne’s elder half-brother, John Dashwood, in the first chapter: “He was not an ill-disposed young man, unless to be rather cold hearted, and rather selfish, is to be ill-disposed.” Reading just the first phrase, you’d expect that what follows the comma would be a compliment. To be “ill-disposed” is to be someone disposed to do ill—to be a bad person. But what follows the clause telling us that John is not a bad young man is another double negative. He’s not bad, unless we think being coldhearted and selfish make a person bad—as, of course, we should!

The way Austen backs us into that assessment of John is not just clever. It’s funny. It’s designed to make us think about how community determines a man’s respectability. By all outward appearances, he’s a “good man.” At the same time, his impulses and feelings are atrocious. To make matters worse, he’s married to a woman who brings out his worst coldhearted, selfish qualities. Together, John and Fanny’s actions (and inactions) produce real evils.

That sly, double-negative, ironic approach to storytelling means the novel’s lessons are often opaque. Austen’s fiction asks readers to consider truths and half-truths, omissions and lies, and generosity and greed. At its core, Sense and Sensibility deals with how individuals make meaningful lives in a world that is often deeply unfair. Its questions are set out by differences in its two sister-heroines’ opinions and styles.

The story also shows how the sisters have an impact on each other’s evolving judgments. Marianne’s line that she “requires so much” in a man might be seen through Elinor’s eyes. Elinor would surely substitute the word “too” for Marianne’s “so.” Which one of them gets it closer to right, in the world of the novel or beyond it? Perhaps, in love, so much is too much. But perhaps setting one’s sights on so much leads to opportunities for so much more.

*

Readers ought to channel their inner Elinors, as well as their inner Mariannes, to read Sense and Sensibility with greater sense and sensibility. We ought to pause to define these terms, because the words have changed in meaning over the past two centuries. The title’s two nouns are neither synonyms nor antonyms. The meaning of “sense” then was close to what it means in our day—rationality, wisdom, reasonableness. It suggested having well-regulated powers of mind. But “sensibility” in the Romantic era (the late 18th and early 19th centuries) had nothing to do with being sensible or wise. Sensibility signaled emotional sensitivity, sympathy, and susceptibility. It was a power of the senses, of perception or taste, and of the heart. To claim to feel more deeply, or to express stronger feelings, was a very fashionable form of sensibility. Dozens of studies have been published on what’s called the era’s cult of sensibility. Janet Todd’s Sensibility: An Introduction (1986) remains one of the best places to begin.

Histories of sensibility in literature often start with Laurence Sterne’s novel A Sentimental Journey through France and Italy (1768), Henry Mackenzie’s The Man of Feeling (1771), or Johann Wolfgang von Goethe’s The Sorrows of Young Werther (1774). It seems audiences encountered these novels in order to experience and share the powerful feelings that the stories called up. Whether these books were devoured in contemplative solitude or read aloud among family or friends, the point was to be so moved as to shed tears. Both men and women were invited to cry with these male characters, but over time, the concept of sensibility became further divided by gender. Men might cultivate admirable sensibility, but women were widely believed to have greater natural or biological access to it. The ramifications of these stereotypes about gender, thinking, and feeling still linger with us.

Sensibility was also a politically controversial concept. Some then argued that more “refined” (or higher-class) people had greater taste, sensitivity, and sensibility. Sensibility could be used to sort out the well-born from the low-born or the well-educated from the uneducated. Whether learned or inborn, the moral qualities of a privileged person were said to heighten his or her ability to recognize and respond to beauty.

Others, however, believed just the opposite. If sensibility were truly about sympathy, then anyone could claim to have it. It could be found in all walks of life. Indeed, recognizing sensibility in all sensitive people—regardless of class, race, gender, or nation—could also prove one’s own worth. If a privileged person could show sympathy for the plight of the sensitive downtrodden and wanted to better the lot of those who shared his fine sensibility, then he could claim to be a superior moral person. The problem, of course, was that sensibility could be feigned. Many sounded alarms about the dangers of false sensibility, too.

Austen’s fiction suggests that she had definite opinions about, and a healthy skepticism toward, fashionable sensibility. Her distrust of the excesses of sympathy is made especially clear in one humorous conversation in the novel. While walking on a beautiful fall afternoon, Elinor teases Marianne, “It is not every one who has your passion for dead leaves.” It’s both a throwaway joke and an implicit criticism. Do dead leaves deserve sympathy? Are there other things that deserve our passionate concern more than dead leaves?

Elinor’s snarky line implies that Marianne’s powerful feelings may be more excessive and indiscriminate than admirable. But Marianne shoots back at her, with seriousness and self-assurance, “No; my feelings are not often shared, not often understood. But sometimes they are.” Marianne defends the idea that anyone with her kind of fine sensibility is a superior person. To love dead leaves is to show an exquisite appreciation of nature, beauty, and life, from cradle to grave. Elinor’s response suggests she thinks that loving dead leaves is not a sure sign of anything admirable about a person. She would seem to prefer to evaluate people on their intentions, judgments, and behavior rather than on their heightened sensitivities, dramatic empathy, or professed taste.

The sisters’ disagreements over feelings and dead leaves point not just to their individual differences but to their family problems. Marianne follows in the emotional footsteps of her mother, with little regard for moderating her responses. Elinor serves not only as Marianne’s adviser but as her own mother’s counselor. To make matters worse, young Margaret is said to have Marianne’s romantic turn without her sense. (It’s important to note the implication in this line that Marianne, too, has sense.) Elinor’s worries for her mother and sisters are hardly selfish, but they are informed by her own self-interest. She must realize that the choices of one of them could affect all of them. One sister’s damaged reputation, outlandish behavior, or unconventional choices could have an impact on the marriage prospects of the rest. That fact ought to make any female character from this period who worries about finding love—for herself or others—appear the furthest thing from a frivolous person.

This was a culture in deep conflict about the roles of love and money in marriage. Public debates were still being staged in the 1790s, asking whether it was a greater evil to marry for love without money or to marry for money without love. As one satirical source put it, love without money is “like a Hive without Honey.” Another jibed, “If truth may be spoke / ’Tis but a mere joke, / For love without money / Will vanish like smoke.” Works of fiction usually privileged the love side of the equation. Money would miraculously fall into place by story’s end. Marianne’s ideal man is a figure ripped out of the pages of the day’s improbable novels. Like her fictional heroine, Austen must have considered long and hard what a worthy hero ought to look like in a work of realistic fiction. It was a literary problem, too.

*

To understand where Marianne’s perfect hero comes from, we might compare him to one from a bestselling novel in Austen’s day. Jane Porter’s The Scottish Chiefs (1810) uses as its hero the 13th-century warrior William Wallace. He’s described as having

married Marion Braidfoot the beautiful heiress of Lammington. Of the same age, and brought up from childhood together, reciprocal affection grew with their growth; and sympathy of taste, virtues, and mutual tenderness, made them so entirely one, that when at the age of twenty-two the enraptured lover was allowed . . . to pledge that faith publicly at the altar which he had so often vowed to his Marion in secret, he clasped her to his heart, and softly whispered,—“dearer than life! part of my being now and forever! blessed is this union, that mingles thy soul with mine to all eternity!”

Marianne Dashwood clearly wants a Wallace. She looks for entire sympathy of taste and virtues, a two-making-one, a soul mingling. Her man of great feeling delivers his poetic lines of love in tender whispers, with breathy exclamation points.

Marianne also thinks true love happens just once, doubting the heart’s ability to love fully a second time or more. It’s an odd belief for someone who is herself the product of a happy second marriage, as Elinor points out. But their tussle over this question is in keeping with the rest of the novel, because Sense and Sensibility repeatedly asks how its characters, and we, are supposed to place a value on what’s first, second, and even third. It explores not just first and second loves but first-, second-, or third-born children and first, second, and third generations. Siblings in the novel come in distinct pairs and triples. Names double up. The novel asks us to measure, compare, and contrast in ones, twos, or threes. Firsts are often prioritized, but we’re repeatedly prompted to ask whether first is really best.

Although Sense and Sensibility was probably the second full-length novel that Jane Austen began to write (and likely the third she “finished”), it was the one she published first. Begun, according to family legend, as an epistolary novel—a novel told in letters—in the late 1790s, the novel that became Sense and Sensibility would stake Austen’s literary claim. Her first readers had no way of knowing that, because she published it anonymously. She used the phrase “By a Lady,” instead of her name, on the book’s title page. She concealed her identity, but not her gender, from the public.

There remains a great deal of misunderstanding and myth about that move. It was neither an oddity nor a female requirement. Anonymity appears to have been a majority choice among all fiction writers in the period, male and female. More than 60 percent of novels published in the late 18th and early 19th centuries were not signed on their title pages. Even the authors who would become the most famous among these novelists—Sir Walter Scott, Maria Edgeworth, Frances Burney, and Mary Shelley—all published their fiction anonymously at first.

They only later signed their real names to their works. It would seem they waited to see whether their books would attract positive notice and fame before revealing their authorship. We tend to assume that Austen, had she lived, would always have preferred anonymity. These examples suggest another possible path. We don’t know what Austen would have chosen to reveal about her authorship had she lived a longer life. We don’t know what she might have chosen to do had she witnessed, as these other authors did, that her novels continued to attract positive attention.

Women writers were responsible for about half of the approximately 1,500 works of prose fiction published in Britain in the first two decades of the 19th century. That makes the novel of the time an equal opportunity genre, by the numbers at least. (Women were not so well represented in other genres, which surely made the novel seem more female-dominated by comparison.) As an author, Austen could have chosen to mask her gender, but she must have wanted to be identified as female. What we’ve been slow to notice is that, in making the choice for “By a Lady,” she was out of step with her contemporaries. There was a major explosion of “By a Lady” novels in the 1780s, when Austen was a little girl. Almost a quarter of the novels published in 1785 were signed that way. By the late 1790s, however, when Austen may have first started drafting Elinor and Marianne (the title she’s said to have first given the story that became Sense and Sensibility), the number of novels signed “By a Lady” dropped to just 5 percent. After 1800, it is said to have dwindled to a thin trickle.

That means Austen’s “By a Lady” signature was an out-of-fashion outlier in 1811. Was she deliberately marking herself out, not only by gender but with an antiquated or nostalgic phrase? It’s possible. It’s also possible that she was following a family-author tradition. Her cousin Cassandra Cooke—who was also her godfather’s wife—published her Battleridge: An Historical Tale, Founded on Facts (1799), anonymously as “By a Lady of Quality.” Austen, in choosing to go with “By a Lady” but omitting Cooke’s further class marker (“of quality”), might be said to have chosen a less aggrandizing, yet still polite, way of describing herself. Perhaps Austen was both echoing and bucking family tradition in referring to herself in print this way.

What’s notable is that Austen would directly identify as a “Lady” author on a title page only once. She published Pride and Prejudice (1813) as “By the Author of Sense and Sensibility.” Mansfield Park (1814) was published as by the author of Sense and Sensibility and Pride and Prejudice, still giving her first novel pride of place on the list. Either Austen or her publishers—or both—wanted readers to be able to track her authorship. Her second novel was attached to her first by its name. In linking Sense and Sensibility to Pride and Prejudice, on her second novel’s title page, Austen was creating her own literary brand, as we might put it today.

__________________________________

From Sense and Sensibility (Penguin Classics Deluxe Edition). Used with the permission of Penguin Classics. Introduction copyright © 2018 by Devoney Looser.

The Death of a Symbol: How Western Writers Exploit the Tiger


Tigers live secret lives, tucked away in forests and mountains, and avoid contact with humans at all costs. They hide when they need to, engage when they need to. They require large swaths of land to call their own, and they mark their territory by scratching trees with their claws, writing their names. The Siberian tiger needs the most land, almost 4,000 square miles. That’s more than 170 times the size of Manhattan.

You’d want to help a newborn. They’re born toothless and sightless, and when their milk teeth grow in, they’re thin, like pins. A mother becomes engaged in the ruthless conundrum of survival: if she leaves her cubs for too long, they’ll die, and if she doesn’t leave them at all, they will surely starve.

You can hear a tiger’s roar two miles away, but that’s for the best, because their bite is worse. Their canines can grow as long as your middle finger. One chomp of their 30 teeth exerts a pressure of 10,000 pounds per square inch. They could snap your back in half.

Do you want to kill them because you are afraid—or because you covet their power?

*

A few months ago, The New York Times’ South Asia bureau chief started writing about a tiger hunt in India where authorities ruled that T-1, a 5-year-old tiger who was proving to be a menace to village populations, could be shot on sight. For the purposes of this essay, let’s give her a name: Tara.

This bureau chief—a white man—bylined several breathless articles about the hunt, and in one quoted a hunter who said that “once a tiger encounters a person and kills, it may develop a taste for human flesh [which] is sweeter than other animal meat because of all the ginger, salt and spices people consume.” South Asian human flesh apparently tastes sweeter than that of other animals—a curious, trivial tidbit thrown in for good measure in an article amused with its own description of Calvin Klein’s Obsession as viable catnip. (In language reminiscent of an Axe advertisement, the cologne is described as “scientifically proven to make wild cats go gaga.”)

In the space of one article, Tara is called “psycho,” but at the same time “brilliant.” Another piece in The New York Times calls her “something out of a Rudyard Kipling tale.”

Tiger hunting may seem antiquated, but the practice only started with the Mughal emperor Akbar’s rule in the latter half of the 16th century. Earlier royals would engage in combat with tigers, usually with swords. It was, in other words, a dangerous venture. But the British turned shikar, the hunt, into a sport; royalty, both white and otherwise, would go out in large parties with as many as 50 elephants, oftentimes drugging and baiting tigers so hunters faced little danger. Historian Mahesh Rangarajan estimates that around 80,000 tigers were slaughtered in the half-century between 1875 and 1925.

The killing of a Malay tiger in 1852 was described in The New York Times in dramatic fashion:

Behold him in our presence! More beautifully striped than the zebra, snorting, astonished much more than frightened at our presence—immovable at first, putting forth deafening and profound roars, raising his furry eyelids, licking his half-opened lips with a rough and red tongue… A bullet is discharged, the tiger roars, attempts to spring, but falls to the ground like an aerolyte; the young girl advances and lances her harpoon, which penetrates his body, he attempts to retreat, but the more he moves the more deadly weapon enters his flesh. A general discharge of our rifles brought his end to a “dead certainty.”

Here, the tiger hunt is poetic: it is the stuff of literature, and any collateral damage makes it ever the more romantic. (The writer goes on to remark as an aside that the remaining two Malay people found part of the head and throat of their younger brother.)

Despite the history of this callous approach to living, breathing tigers, symbols of tigers are everywhere: embroidered jackets, restaurant logos, laptop stickers, book covers. Our visual culture is saturated with tiger substitutes that allow us to marvel at the species’ fierce beauty without the inconvenience of confronting their ferocity.

In Kipling’s The Jungle Book, it is curious that Shere Khan is the major antagonist in a jungle full of potentially villainous animals. Shere Khan, depicted as isolated and fixated on murdering a human, is alienated from the law of the jungle—“for the strength of the pack is the wolf, and the strength of the wolf is the pack.” In The Jungle Book, Mowgli is the native ready for taming; and Shere Khan, the wilderness that threatens to consume him.

The tiger hunt was legitimized through vilification, and not just of the literary variety. Tigers became evil, monstrous, murderous cats in the public view around the time the sport picked up in the British Raj—after the Crown took its vicious hold of the subcontinent in 1857—when more hunters showed up, less skilled than ever before, maiming tigers and slowing them down. Shaitans (devils). Unable to hunt their usual prey, injured as they were, tigers took to hunting humans. With tigers now perceived as cunning maneaters, the hunts became more urgent, more necessary.

The killing of tigers escalated in India after independence in 1947. The values of the colonizer endured the bloody rupture of the subcontinent. The Maharaja of Surguja told a biologist that he had killed over a thousand tigers. Everyone had guns and bloodlust. Everything was a trophy. It was a seizing of nature. Lives had value; so did tiger fur.

*

Kurtz, the wayward imperialist of Heart of Darkness, turns from taking up the “white man’s burden” to wanting to kill indigenous people; the book is often read as a seminal work on racism and colonialism. Chinua Achebe describes the common stereotype of “Africa as a metaphysical battlefield devoid of all recognizable humanity, into which the wandering European enters at his peril,” in An Image of Africa: Racism in Conrad’s Heart of Darkness. “Can nobody see the preposterous and perverse arrogance in thus reducing Africa to the role of props for the break-up of one petty European mind?” Achebe asks.

“Jungle” comes from the Sanskrit word meaning uncultivated, arid land, not some impenetrable thicket of the Orient, as Western literature pictures it. The “jungle” came to refer, in the colonial imagination, to undominated land: tropical rainforests, mangroves, the habitat of the “wild native.” Jungles were colonized lands before the white man’s arrival.

And along with the jungle, its creatures, especially sinewy marvels of evolution with massive jaws and impressive, though cryptic, abilities, became a vivid metaphor for the wild—and the colonial drive to conquer it. Jim Corbett, hailed as a savior-conservationist-hunter who only killed big cats that threatened human lives, couldn’t resist the urge to kill a leopard with the first rifle he received as a young boy.

Tipu Sultan, the Tiger of Mysore, ruled over a largely Hindu population in the south of India in the 18th century and hated the British, though at first he tried to ally with them against the Marathas. Around 1795, Tipu had his people—some historians say with the help of the French—build a mechanical tiger that roared as it bit the face off a prone, weeping British soldier. Tipu delighted in this imagery; the mechanical tiger was one of a series of artworks portraying the violent dismissal of the British, usually by tiger. Tipu weaponized the metaphor of the tiger against the British.

In response, the British were outraged at this portly man who sat on a golden tiger throne. Their army killed him and broke apart his throne to divide as loot—the “Prize Fund”—among the generals, though the Governor-General, who wanted to present the throne to the English King, did not approve. (A gold ornament from Tipu’s throne, bashed apart with a sledgehammer, sold for half a million dollars ten years ago.) The East India Company issued medals emblazoned with the British lion destroying the tiger in celebration.

The East India Company put Tipu’s Tiger on display in its own museum and library in 1808, where the English public could freely crank the mechanism. Within thirty years or so, reports of its state of disrepair spread. Transferred to the Victoria and Albert Museum in 1880, it had originally been intended for the Tower of London, famous for its prison.

“It is imagined that this memorial of the arrogance and barbarous cruelty of Tippoo Sultan may be thought deserving of a place in the Tower of London,” wrote the Marquess Wellesley to the Governor General after the British killed Tipu Sultan in 1799. India has not made a formal request for Tipu’s items, perhaps because Tipu’s popularity varies by population. Some think the Muslim ruler beheaded the Brahmins near the city of Mangalore, though some historians disagree with this narrative. Vijay Mallya, a billionaire who is facing extradition on allegations of stealing money from India, bought Tipu’s sword and other objects to bring them back to India. As troubling as it is to know that the British still profit from their loot, it’s almost more disturbing that Mallya “gave away” the valuable sword after his family said it was bringing him bad luck; patriotism, history, and capitalism make slippery bedfellows.

A study released last year by researchers at Western University and the University of California, Santa Cruz shows that large carnivores view humans as superpredators, and that they greatly fear us. Their avoidance of us disrupts ecosystems. There is only one superpredator. The narrative of human supremacy—loosely described as an implicit belief in the exceptionalism of our species and in our rights over everyone else’s—is ultimately what makes the hunt of a maneater so self-righteous, and thus morally unquestionable.

Threats to the tiger population still thrive, especially in the form of governments that turn a blind eye or grant outright sanction. China recently lifted its ban on tiger bone, but the supply comes from India: the South China tiger is “functionally extinct,” not having been sighted since the ban was put in place 25 years ago. The Chinese medicine trade ushered in what was called the “second tiger crisis” in the form of Chinese poachers in 1980s India, after Prime Minister Indira Gandhi’s assassination. There are more tigers in captivity in the U.S., maybe even in Texas alone, than in the wild.

*

I read this essay aloud at a popular venue and disappear to soothe my nerves. A number of people seek me out to tell me about the time they petted a tiger; they didn’t realize it was wrong, they say, until I said there is no honorable way for me to cohabit space with a tiger. Tara is shot dead two days later. I find out that animal rights advocates had been calling her Avni—the earth.

A white British man tries to caress my hand, earnestly venting about how Trump is the worst thing to have happened to him, personally. Trump, says this editor of a major magazine, is one of the worst disasters in human history. I laugh in my booming, warning way. My teeth are large, and they show.

31 Books in 30 Days: Michael Schaub on Jane Leavy


In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member Michael Schaub offers an appreciation of biography finalist Jane Leavy’s The Big Fella: Babe Ruth and the World He Created (Harper).

*

George Herman “Babe” Ruth Jr. “wasn't a baseball player,” argued the broadcasting legend Ernie Harwell. “He was a worldwide celebrity, an international star, the likes of which baseball has never seen since.” In the decades since, the sport has featured its share of celebrities, from Hammerin' Hank to A-Rod, but the Bambino remains the brightest star in the baseball firmament.

Jane Leavy's fascinating new biography of Ruth, The Big Fella, follows Ruth and his Murderers' Row teammate Lou Gehrig as the two Yankees went on a barnstorming tour across the country. The reader learns about Ruth's charisma, his sometimes hot temper, his lavish lifestyle (including a predilection for drinking), and his outsize personality, as well as his impressive skills at hitting and fielding.

Leavy's approach is not a straightforward one. She uses the 1927 barnstorming tour as something of a frame, which allows her to dive into Ruth's childhood, his baseball career, and his death from cancer at the age of 53. Her research is beyond impressive: she relies not simply on previous biographies but on varied primary sources, including newspaper archives, genealogical records, interviews with Ruth's family, legal documents, and more.

The result is a fuller picture of Ruth than other writers have managed to paint. But Leavy doesn't just do an admirable job tracing Ruth's life and baseball career; she also uses the athlete to ask fascinating questions about the nature of celebrity, both in the early twentieth century and in the present. By the end of the book, the reader has learned much about Ruth, but also about how his career shaped the American dream, and the American concept of fame, as we know it today.

Intelligent, absorbing, and written with a keen eye to detail, The Big Fella is a book worthy of Ruth’s remarkable life, as well as one of the best sports biographies to be published in quite a while.

__________________________________

Michael Schaub is a regular contributor to NPR and the Los Angeles Times. His journalism has appeared in the New York Times Book Review, The Washington Post, The San Francisco Chronicle, and the Guardian, among other publications. He lives in Austin, Texas.

The Challenges of Writing for White People

When I was a child, I thought there were no white people in Los Angeles.

There’s a story I sometimes tell to demonstrate how thoroughly my perspective was distorted by de facto segregation. It’s from a day when I was not yet ten years old and a big yellow bus delivered me and a bunch of my classmates to the zoo on a field trip. We pulled into a lot jammed with other big yellow buses and I stepped down into a world I had never dreamed existed.

There were little white kids everywhere.

In memory, they are all blond, though, of course, that’s highly unlikely. Wandering through the zoo, I remember being more fascinated by this odd species of human than by any of the apes, giraffes and elephants I had come to see. Having grown up in a Los Angeles where, outside of police officers and teachers, the only white people you ever saw lived in Mayberry or on the Ponderosa, I literally could not imagine where these children had come from. They were running eagerly all over the place and I remember being worried that when the day was over, their teachers would have trouble sorting and gathering the right kids for delivery to the right parents.

You see, they all looked alike to me.

One other thing I remember is that we were given box lunches of cold fried chicken and I didn’t eat.  Mind you, I loved fried chicken, but I had some vague sense that it was a food for which Negroes—we were still “Negroes” then—were reputed to have an unnatural lust. And I would give no one—no one white, at any rate—the satisfaction of seeing me tear hungrily into a drumstick. All around me, kids black and white were doing just that, but I made an excuse and left my lunch untouched.

Over 50 years later, I find it amazing—and a little sad—that I had such things in my head at that tender age. I have no idea where they came from. This was L.A., after all; I had never seen a Whites Only sign, and had no idea who Martin Luther King was. But somehow, I knew the thing about fried chicken and people like me.

That day, for the first time, I became conscious of myself in a different way. That is to say, I began to realize that I was not just me, a spindly-legged little boy in glasses whose idea of entertainment was digging up ant nests in the backyard. No, I was a small piece of something called “Negro” and it, in turn, was a piece of a larger something called America and the two were ever at odds, bonded on the one side by an abiding hostility and on the other by a simmering resentment.

I was already thinking of myself as a writer then, most of my stories centering on the adventures of a spindly-legged little boy in glasses who was secretly a superhero. But though I couldn’t know it and wouldn’t know it for decades, I had found my muse that day, found the subject that would perplex and fascinate me all my writing life. Meaning the challenge of explaining to those with no frame of reference what it means—how it feels—to be a piece of a piece.

From the vantage point of over half a century later, it occurs to me that I’ve spent most of my writing life writing for white people about race.

Well, let me tweak that observation a bit. As a matter of intention, I almost never write “for” any group more specific than people who read. If you’re a reader, reasonably versed in the English language, then you are my target audience.

But if that’s the ideal, the practical reality is that, yes, I write for white people.

It makes sense, if you think about it. I've been a columnist for The Miami Herald since 1991, with national syndication since 1997. The demographics of the newspaper industry being what they are—which is to say, dominated by white Baby Boomers—my readership could hardly be expected to be other than what it is. Still, when I go out to do a speaking engagement or a signing for one of my novels, it is often the case that white people are taken aback to find themselves in an audience full of white people. Many have said as much outright.

I joke sometimes that every white person who reads me thinks they’re the only white person who reads me. Occasionally, I’ll receive an email from one of them that says something like, “I’m pretty sure I’m not your average fan…”

Oh, I say, but you are.

I cannot imagine that I am an easy read for them. Race is not an easy subject and on it, I am not an easy writer.

The cliché, one I have indulged as much as anyone, goes that America needs to have “a conversation about race.”  But as President Obama noted in eulogizing the Rev. Clementa Pinckney, who was killed along with eight other members of his Charleston church when a white supremacist invaded their prayer meeting, “Every time something like this happens somebody says we have to have a conversation about race. We talk a lot about race.”

He’s right. It’s not that we don’t talk. It’s that we talk superficially. We talk around. We talk without knowledge—indeed, sometimes denying knowledge—of the history that makes talking necessary. We talk in ways designed to protect our fortresses of identity and to spare us from excavating those places where guilt and hurt, bitterness and anger, lie not at all deeply buried.

This is true of us all but it is especially true, in my long experience, of white people—a term, by the way, that one of my white readers once chided me for using. “Divisive,” she called it. She was not the first or the only “white” person to say that. The very fact that she finds that term problematic says a lot about why this subject is so often so difficult for people like her.

Race, as scientists on the Human Genome Project determined conclusively in 2000, is an idea with no basis in scientific fact. It is entirely a social, cultural and political construct. Indeed, as Nell Irvin Painter explains in her book, The History of White People, our modern conception of race, i.e., the idea that you can judge a person's worth by judging her melanin content and hair texture, has existed only for the last few centuries. Prior to that, it was common (and just as scientifically coherent) to judge that person by geography or climate.

“Black” and “white”—the concept and machinery of race—became convenient tools as the competition to exploit the riches of the so-called “New World” heated up. “White” conferred a superior form of humanity. “Black” denied any humanity at all. Taken together, they justified a system of kidnap, purchase, exploitation and extermination, allowed good Christians from Europe to engage in acts of barbarous sadism without troubling conscience.

In other words, black and white were useful lies. The truth, however, was that the Europeans who raided the coast of West Africa for human chattel thought of themselves not as “white” men united in some vast common undertaking, but as Dutch men, French men, English men and Portuguese men in competition for the riches of the North American continent. Similarly, the human beings they stole did not think of themselves as “black,” but rather as Tuareg, Mandinka, Mende and Songhay, thrust together in the reeking hold of those European ships.

But if “white” and “black” are equally false, “black” quickly came to take on the markings of a real and common culture by simple dint of the fact that, their varied tribal identities notwithstanding, the people to whom that word was applied were all forced to share the same oppression and denigration, the same rapes and child-stealing, the same unheated hovels and meager plates, the same torchlight violence. These things brought them together and held them together—and hold them together still.

“White,” on the other hand, has been held together only by the single condition of being not black. It has had little existence apart from that.

A simple thought experiment illustrates the point. If asked to define black literature, you would likely—and promptly—invoke Maya Angelou, James Baldwin, Ralph Ellison, Toni Morrison or some other dark-skinned giant of the written word. But what if you were asked to define “white” literature? The mind quails and resists, does it not?

Note that, if asked about English literature, you might mention Shakespeare or Dickens. French literature? Hugo or Flaubert. American literature? Twain or Melville.

But “white” literature? To whom would you be comfortable ascribing that categorization? What author would be comfortable accepting it? Would Stephen King agree to it? Would William Faulkner or Robert Frost? Would Jodi Picoult or F. Scott Fitzgerald?

Unlikely. They would realize, if only subliminally, that the word carries accusations. And those accusations are no less potent for being unspoken.

That’s one of the things—one of the many things—that makes writing about race so difficult to do. The very language of it conspires to defeat you.

And mind you, that's before you ever get down into the marrow of the thing, before, for example, you write word one about systems of oppression, what the sociologist Eduardo Bonilla-Silva famously dubbed Racism Without Racists. It is before you talk about the phenomenon of mass incarceration that Michelle Alexander explores in her book, The New Jim Crow. It is before you talk about housing and job discrimination, unequal access to health care, the educational achievement gap, barbecuing while black, affirmative action or reparations. It is before you invoke the names of Trayvon Martin, Philando Castile or Tamir Rice, diving into the hot and sticky emotion of unarmed black boys and men killed because of what some white person mistakenly, irrationally, believed.

If the mere word “white” makes some people recoil, imagine the response you court when you cut down into that marrow. People will call names they would be embarrassed to let their children hear them speak. They will, putting it mildly, say things that do not reflect their best selves.

Or, as my assistant once wrote to me at the end of a long day of sifting through my emails: “I don’t know why you don’t hate white people.”

I reminded her, as gently as I could, that she herself is white. And that hating people because of their paint job is the reason we are in this mess to begin with.

Besides, for all the rancor that boils out of some white people, other white people keep showing up for my lectures. They keep reading my columns and my novels. They keep listening to stories that are difficult to hear. They keep engaging.

I choose to believe that in the telling and the listening, they and I express an abiding belief that black and white—and all the other colors of the racial rainbow—can ultimately find their own humanity reflected in one another if they can only muster the courage to look for it, to see what is already there.

My four novels could not be more different from one another in settings and themes—from a love story unfolding in the aftermath of slavery to a political thriller set on the day of Barack Obama’s election. But each has at least one moment that speaks to that hope, one moment where characters from different so-called races find themselves brought face to face with their own common humanity. It’s not something I ever set out to do, not something I was even aware I was doing, but that moment recurs in every book.

In Freeman, it’s as grandly dramatic as a love scene between a former slave and a white abolitionist, both shattered by the aftermath of civil war.

In my latest, The Last Thing You Surrender, it’s as simple as a white man and a black woman sharing a segregated park bench at the end of World War Two.

Call it an act of faith. And yes, I am painfully aware of how often faith goes unrewarded in this life, how often we live down to the meanest and most wretched version of ourselves. But faith yet endures. Which is, I suppose, the thing that makes it faith.

More to the point, storytelling is how we explain ourselves to ourselves—and how we explain ourselves to one another. We sift and curate the disparate fragments of our experiences, the flotsam of our lives, the things we have seen and know—and hope. We hammer it together on a scaffolding of narrative and we present it to somebody else as a way of saying, This is how I see the world. This is what I believe.

If you say that to people who come from the same place you do and have seen and felt the same things, you give them validation, let them know they are not alone, give voice to things they may not be equipped to express. And that matters; that is important.

But if you say it to people who don’t know those things, whose passages have never intersected yours—if, in other words, you turn from the choir at your back and start preaching to the congregation in front of you—you challenge them, you upend their understanding of the world, you push them to see things they perhaps never thought to see before. You shatter paradigms.

It can be a tricky thing to do. I remember once, after a school shooting in a white San Diego suburb—one of a series of similar shootings around the country—I wrote a column noting that in talking about these tragedies, we were missing the elephant in the room. Namely, that virtually all of them involved white kids shooting other white kids in white rural or suburban areas. If a similar wave of violence broke out among black kids in the inner city, I argued, we would be sifting through the pathologies and challenges of their lives to determine what was going on. We would ask, What's wrong with black kids?

But it would never occur to white people to wonder—not even after the shootings in Littleton, Colorado, in San Diego, in Eugene, Oregon, in West Paducah, Kentucky, and on and on and on—what's wrong with white kids. Acutely conscious of who I was writing for, I made that observation as delicately as I could, even stopping at one point to say explicitly what should have been obvious: that I wasn't suggesting we ought not be concerned because the victims and perpetrators were white.

It felt silly having to say such an obvious thing. So I laughed out loud when I read an essay by a writer named Tim Wise, who made the same argument I'd made but, being white, felt no need to be delicate. “An awful lot of white folks need to pull our heads out of our collective ass,” he said.

There is, needless to say, a certain freedom that comes in writing for white people about race and being white yourself.

Doing so while being black, on the other hand, is a constant negotiation between the need to tell the truth, to make it as plain as words allow—and the knowledge that too much truth might be toxic, that too much truth, from you especially, runs the risk of turning people away because it challenges too sharply what two celebrated African-American authors, James Baldwin and Ta-Nehisi Coates, have described as the fiction of white racial “innocence.”

As Coates notes in his book Between the World and Me, written as an open letter to his son, “there exists, all around us, an apparatus urging us to accept American innocence at face value and not to inquire too much. And it is so easy to look away, to live with the fruits of our history and to ignore the great evil done in all of our names. But you and I have never truly had that luxury.”

Later, speculating about the possibility of his own death at the hands of police officers who would then be exonerated and returned to work, he writes that his demise would be framed not as “the fault of any human but the fault of some unfortunate but immutable fact of ‘race,’ imposed upon an innocent country by the inscrutable judgment of invisible gods. The earthquake cannot be subpoenaed. The typhoon will not bend under indictment.”

In The Fire Next Time, James Baldwin offered an indictment that was, if anything, more pointed and less forgiving:

And this is the crime of which I accuse my country and my countrymen, and for which neither I nor time nor history will ever forgive them, that they have destroyed and are destroying hundreds of thousands of lives and do not know it and do not want to know it. One can be, indeed one must strive to become, tough and philosophical concerning destruction and death, for this is what most of mankind has been best at since we have heard of man. (But remember: most of mankind is not all of mankind.) But it is not permissible that the authors of devastation should also be innocent. It is the innocence which constitutes the crime.

And if these are, indeed, painful truths, well, what other choice do you have as a black writer? You have the same choice Coates and Baldwin had, which is to say none. So you tell those truths. You try—or at least, I try—to do it in such a way that the people hearing you have room to consider themselves a potential part of the solution instead of an immutable and irredeemable part of the problem. But still, you must tell those truths fearlessly, forthrightly and clearly, because they demand no less.

You tell them and if you are lucky, people—“white” people—will keep trying to understand that which the whole weight of American mythmaking conspires to keep them from understanding. If you are lucky, they listen, maybe receptively, maybe with wariness, but they listen. If you are lucky, every once in a long while, one of them will sidle up to you, having read the column or the novels or having heard the speech and say something like, “I never thought about it that way” or, “You really changed my thinking.”  These things happen if you’re lucky.

And I’ve been lucky.

Indeed, if telling and hearing stories about race constitute acts of faith, then moments like that constitute faith’s redemption.  They validate my stubborn and abiding conviction that if I can ever just explain the thing clearly enough, somebody who didn’t live it will nevertheless get it. My humanity will touch theirs. And that will mean something, move something, make something different somehow.

That is, yes, what a writer always hopes.

But it’s what a black writer writing about race for white people desperately needs to believe.

The Books That Mattered Most to David Bowie, Bibliophile

David Bowie once recounted a story from the filming of The Man Who Fell to Earth in 1975. Relocating from Los Angeles to New Mexico for the shoot, he brought hundreds upon hundreds of books with him, a “traveling library” that he ported in cases large enough to hold an amplifier. His director, Nicolas Roeg, seeing Bowie sifting through piles of books, told him that “your great problem, David, is that you don’t read enough.” Bowie said he didn’t realize for months that Roeg was joking. Instead he berated himself, asking “What else should I read?”

Not the typical 1970s rock star anecdote. Throughout his life, Bowie was a colossal bibliophile, with books as the one habit he never relinquished. When he wasn't recording or touring, he appears to have spent much of his days reading. Late in life, he was a regular sight at his local bookstore, McNally Jackson in Soho, and a photograph taken of him in 2013 parallels his Man Who Fell to Earth story. Again, Bowie's sitting on the floor surrounded by stacks of his books, here including Geeta Dayal's Another Green World and Stephen Fry's The Ode Less Travelled.

Many rock songs have been hatched from books: “Sympathy for the Devil” (Bulgakov's The Master and Margarita); Kate Bush's “Wuthering Heights”; the Mekons' “Only Darkness Has the Power” (Paul Auster's The Locked Room) and so forth. But Bowie's catalog is particularly rife with songs indebted to fiction and nonfiction, to short stories and (laughing) gnomic philosophical treatises. He even wrote lyrics via a literary device—the William S. Burroughs and Brion Gysin-style cut-up, where he'd scissor up a page from, say, a novel he was reading and use randomly selected text strips to kick-start a verse. (By the 1990s, he had software do the grunt work for him.)

It's telling that among Bowie's final public statements was a list of his Top 100 books, offered as part of the David Bowie Is museum exhibit. As Bowie apparently left no memoir behind, the closest he ventured to autobiography is this list of books. Some he chose because he wanted his fans to read them, but many selections have a deeper resonance in his work.

The following are some books that influenced Bowie’s songwriting, from his early records in the 1960s to his last albums.

Alan Sillitoe, The Loneliness of the Long Distance Runner

Bowie's 1967 debut consists largely of third-person narrative songs: the most “realist” work, lyrically, he'd ever do. At age 19, he was consumed with the “Angry Young Men” British authors, with John Braine's Room at the Top, Keith Waterhouse's Billy Liar and There Is a Happy Land, and John Osborne's Look Back in Anger among his favorites (the latter two would title Bowie songs). But the most influential was Alan Sillitoe's 1959 short story collection, which Bowie raided for early songs. He used the plot of Sillitoe's “Uncle Ernest” for his “Little Bombardier,” while his own “Uncle Arthur” came from elsewhere in Sillitoe's collection, “The Disgrace of Jim Scarfedale.”

Heinrich Harrer, Seven Years in Tibet

Bowie discovered Buddhism in the mid-1960s, befriending exiled Tibetan lamas in London and devouring Heinrich Harrer’s 1952 memoir. Harrer had lived in Tibet at a time when few Westerners had ever ventured there, and documented its last days as an independent kingdom before the Chinese conquest. Harrer’s depictions of the Dalai Lama’s Potala palace would shape Bowie’s 1967 “Silly Boy Blue,” the song of a dreaming monk in Lhasa. And decades later, Bowie named a song on his Earthling album after Harrer’s book. In his own “Seven Years in Tibet,” Bowie returned in the mid-1990s to find a land still under the tyrant’s heel, still dreaming resistance.

Friedrich Nietzsche, Thus Spoke Zarathustra

Though he once joked that he’d only read the book jacket of Nietzsche’s ode to the overman, Bowie was being modest. Zarathustra would be central to shaping Bowie’s work in the late 1960s and early 1970s—he’d quarry images from it for songs including “All the Madmen,” “The Supermen,” “Quicksand,” and “Ashes to Ashes.”

Arthur C. Clarke, Childhood’s End

Clarke’s 1953 SF novel of a race of alien beings who come to Earth to midwife the next step in human evolution has echoes in Bowie’s generational “changing of the guard” songs of the early 1970s, particularly “Oh! You Pretty Things” and “Changes.”

Anthony Burgess, A Clockwork Orange

Though more inspired by Stanley Kubrick’s 1971 film of Anthony Burgess’ novel, with Droog uniforms as a template for Spiders From Mars stage outfits, Bowie was also fascinated by Burgess’ invented “Nadsat” language for his narrator’s voice. It inspired his own attempts in the 1970s to create a “fake world…that hadn’t happened yet.” Decades later, on his last album, Bowie wrote a funny, nasty song chock full of Nadsat (“Girl Loves Me”).

Nik Cohn, I Am Still the Greatest Says Johnny Angelo

“Violence and glamour and speed, gesture and combustion, these were the things he valued—none else.” Cohn’s 1967 portrait of a rock star/force of chaos, based in part on Elvis and P.J. Proby, heralds the plastic rocker “leper messiah” that Bowie created. Ziggy Stardust and Aladdin Sane are inconceivable without it.

George Orwell, Nineteen Eighty-Four

Up there with A Clockwork Orange in terms of books that Bowie couldn't leave alone. He tried to write a musical based on it and, though denied the rights by Sonia Orwell, basically did it anyway with Diamond Dogs. He later said that Orwell's world had felt like his own upbringing in postwar Brixton and Bromley, and throughout his life Bowie returned to images of people trapped in systems of control, whether those of religion or government.

Julian Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind

MIND YOUR BICAMERAL/ YOUR BICAMERAL MIND, Bowie etched into the runout groove of his 1979 “Boys Keep Swinging” single, along with a lyre symbol. Jaynes’ 1976 treatise argued that the brain’s two lobes were once isolated, so early humans “heard” thoughts from one lobe as the voices of gods—it was the sort of compelling crackpot theory that Bowie loved. “The speaker was an angel,” he sang in “Look Back in Anger,” a possible reference to Jaynes’ claim that the emergence of angels, ca. 1000 BC, was a sign the barriers between the brain’s halves were breaking down.

Otto Friedrich, Before the Deluge: A Portrait of Berlin in the 1920s

Bowie lived in West Berlin in the late 1970s and spent his time there as a literary reenactor. He yearned to be in the Berlin of Christopher Isherwood’s novels, to the point of even looking at times like Michael York’s character in Cabaret. One tourist guide was Friedrich’s portrait of Weimar Berlin, a doomed city of exiles, revolutionaries and artists. Bowie would later use a Vladimir Nabokov quote from Friedrich’s book in “I’d Rather Be High.”

Hans Richter, Dada: Art and Anti-Art

When Bowie was cutting Scary Monsters in 1980, he had Dadaist artists on the brain, telling an interviewer that since the Dadaists had declared art dead in 1924, “what the hell can we do with it from there on?” This sense of “lateness” and struggle pervaded the album. He was so taken with the Dadaist artist Richter's 1964 study that he nicked whole lines from it for “Up the Hill Backwards.”

Peter Ackroyd, Hawksmoor

Ackroyd's novel, with its interwoven narratives of an occultist architect in 18th-century London who consecrates his churches with human sacrifices and a 20th-century “rationalist” detective, is deep in the roots of Bowie's “art murder” mystery album, 1.Outside.

Yukio Mishima, Spring Snow (The Sea of Fertility Quartet)

Japanese author Yukio Mishima was among Bowie's favorites, with Mishima's The Sailor Who Fell From Grace With the Sea surfacing in everything from “Heroes” to Bowie's regular internet pseudonym, “sailor.” And “Heat,” the closing track of The Next Day, opens with a reference to Spring Snow, the first part of Mishima's last work, the Sea of Fertility quartet, whose thematic concerns of reincarnation, glamour, decadence, death and transformation run through Bowie's late albums as well.

Lawrence Weschler, Mr. Wilson’s Cabinet of Wonders

It opens with Weschler stumbling upon David Wilson's Museum of Jurassic Technology, a storefront museum in Culver City, California, whose exhibits include “piercing devil” bats and horned African “stink ants.” Intrigued, Weschler chronicles the bizarre holdings of Wilson's collection. He soon finds that the listed details have little connection with scientific fact. Is Wilson a fraud? Is the collection itself Weschler's fiction? Walking the thin line between magic and science, gifted con men and obsessive curators, strange truth and honest fakery, Mr. Wilson invokes the work of David Bowie as much as anything does.


31 Books in 30 Days: Laurie Hertzel on Richard Beard

In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member Laurie Hertzel offers an appreciation of biography finalist Richard Beard’s The Day That Went Missing (Little, Brown).

*

Richard Beard’s memoir, The Day That Went Missing, is an excruciating read in parts, but it is an enthralling one.  The book opens with Beard’s memory of his younger brother Nicky’s death, vivid in its terrible details. Beard was 11 when he and Nicky, 9, went to the Cornish coast with their family on summer holiday. The two boys found a private cove where they played, leaping over big waves as they crashed on the shore.

One powerful wave came with an undertow, and it dragged both boys out to sea. Beard’s description of what follows is unforgettable—his last glimpse of Nicky, mouth clamped tight against the water, ligaments in his neck straining to keep his head above the sea, a whimper coming from his throat. Beard turned away and thrashed toward shore, abandoning Nicky, desperate to save himself.

It is a terrible memory. And it is all that Beard can remember. Not what happened before, and almost nothing that happened after. Nicky’s death was something his family—stiff-upper-lip Brits that they were—simply never discussed. He did not even know the date his brother died.

And so in mid-life he set out to relearn the past and rediscover the facts and emotions of that time. He questions everyone he can think of who knew about the day—his mother, his brothers, the first responder, the schoolmaster at the boarding school he and Nicky attended. He reads coroner's reports and news articles; he roots around in the attic and finds Nicky's things—his school papers, his books, his sports jersey.

One story conflicts with another, the facts sometimes butt up against someone’s recollections, and piecing together what actually happened through the thicket of flawed memory and revisionist history is not an easy task.

The Day That Went Missing is a nearly perfect memoir, with a compelling story, deep introspection, fine writing and an unflinching quest for factual and emotional truth. Beard interrogates himself as fiercely as he interrogates others.

He unearths not just the facts of Nicky's death and its aftermath, but his own emotions as well: his pain at losing his brother, his terrible guilt at “goading” Nicky into the water in the first place and then abandoning him.

He also discovers a terrible rage at his late father, who had been diagnosed with terminal cancer at the time of the holiday and so, Beard reasons, had nothing to lose and should have tried to rescue Nicky himself. He is also furious at his father for forcing the family to carry on as usual—even to the point where, after Nicky's funeral, they returned to the Cornish coast to finish their holiday.

It was a lesson, he says, he learned well.

“I haven’t mourned him, and I did not cry at his funeral. … I knew when my dad died to carry on as if nothing awful had happened. A lesson he taught me himself.”

This haunting book is a profoundly moving study of remembering and forgetting, of denial and grief.

__________________________________

Laurie Hertzel is the senior editor for books at the Minneapolis Star Tribune and the autobiography committee chair for the NBCC.

On Kate Bush’s Radical Interpretation of Wuthering Heights

The first few seconds of the video are usually enough to stun the classroom into silence. Some students will shift uncomfortably in their seats. Some may be counting the minutes until this digression ends. Some will laugh. One or two may be transfixed. One might even be transformed.

On the screen in the front of the classroom, a teenage Kate Bush stares goggle-eyed, her arms like wings. She dances. She cartwheels. She sings, in a high trilling voice, “Heathcliff, it's me! I'm Cathy!”

I teach literature and creative writing at a small college in New England, and every time I've taught Emily Brontë's Wuthering Heights, I show the video for Kate Bush's 1978 song of the same name (there are two versions: one features Bush indoors in a white dress, another has her outdoors in a red dress). The video is more than a digression—and more, I hope, than a professor's attempt to impress his musical tastes on his justifiably skeptical students. But if every syllabus participates in the construction of a canon, and in doing so defines the terms of cultural literacy, then I want my students to participate in building a culture that highly values Emily Brontë and Kate Bush, and that recognizes the ways in which Bush has spent decades defining a space for women artists as visionary creators.

I'm not alone in hearing Kate Bush's voice echo across the culture. Though she has always been much better known in the UK, her American fans make up in ardor what they lack in numbers. To celebrate the November release of a career-spanning Rhino Records boxed set, Margaret Talbot unpacked her decades-long connection to the songs of Kate Bush, whom she identifies as a forerunner to Perfume Genius, St. Vincent, and Mitski. Earlier in the year, Wesley Morris lovingly deconstructed the use of “Running Up That Hill” in the FX series Pose.

So if a consideration of Bush’s “Wuthering Heights” briefly leads the class away from discussions of the uncanny in Wuthering Heights or the othering of Heathcliff or the tricks that Brontë plays with the reader’s complicity, the song is well positioned to open up a host of other, equally valuable, conversations that have nothing to do with my inclination toward post-punk British music. A close look at the video puts us only a cartwheel away from conversations about art, where it comes from, who it ignores, and who gets to make it.

According to the lore that surrounds the song, Kate Bush’s first encounter with Wuthering Heights came in 1977 when she caught the closing minutes of the BBC miniseries. She wrote the song in a single night, mining lyrics directly from the dialogue of Catherine Earnshaw Linton, one of the star-crossed lovers at the heart of the novel.

But as Bush borrowed from the dialogue, she made a crucial transposition in the point of view. When she sings, “You had a temper, like my jealousy / too hot, too greedy,” the my refers to Cathy and the you to Heathcliff, the novel's brooding protagonist/antagonist/antihero/villain (depending on your point of view). But the novel itself never inhabits Cathy's consciousness: she is seen and heard, her rages and threats vividly reported, but everything we know about her comes either from Nelly Dean, a longtime housekeeper for the Earnshaw and Linton families, or from Lockwood, a hapless visitor to the Yorkshire moorlands and the principal first-person narrator of the novel (most of the novel consists of Nelly's quoted speech to Lockwood, who is eager to hear the complete history of the inhabitants of Wuthering Heights and its neighboring property, Thrushcross Grange). Although the novel spans decades and multiple generations of Earnshaws and Lintons, Kate Bush's shift into Cathy's point of view centers the song entirely on Cathy and Heathcliff—which is fittingly how Cathy, in the novel, views the world. She and Heathcliff share one soul, she claims; everyone else, including her husband Edgar, is little more than scenery.

With this choice, Bush gives voice to a female character who—though an electric presence in the novel—is denied the agency of self-narrating, or even of being narrated through a close third person. Nelly may be presented to us by Lockwood as a simple, transparently objective narrator, but the novel is littered with moments where Nelly complicates the lives of those around her by revealing or concealing what she knows. Bush’s musical interpretation of the novel makes visible the questions that surround point of view: who does the telling? What is their agenda? Who can we really trust?

By opening up these questions, the song situates itself in the tradition of other so-called “parallel texts” that respond to or reinvent earlier, often canonical works of literature: think Jean Rhys’s Wide Sargasso Sea and Charlotte Brontë’s Jane Eyre, or Kamel Daoud’s The Meursault Investigation and Albert Camus’s The Stranger. In each pairing of “parallel” and “source” text, the later work privileges characters narrated about, but never before narrated from within.

Like the novels by Rhys and Daoud, Bush’s song demonstrates how art can respond to art, and points to the ways in which crucial reevaluations of past works take place not only in scholarly articles but in one artist grappling with the erasures and silences of an earlier age. Rhys and Daoud both insist on a voice for a silenced, maligned, or dismissed colonial subject. Their aim is not to create a work that merely amends (or acts as a footnote to) the earlier text, but to produce a narrative that calls into question the primacy, and even the authority, of the earlier text.

Kate Bush may not have been aiming to supplant Emily Brontë, but just as the song itself points to issues within the novel, Bush’s role as its creator exposes the straitened public personae of the Brontë sisters in 1840s England. Remember that the Brontës—Charlotte, Emily, and Anne—published their own work under vaguely male pseudonyms: their first joint publication, in 1846, was The Poems of Currer, Ellis, and Acton Bell. Jane Eyre appeared a year later, attributed to Currer Bell, and a year after that Ellis Bell’s name appeared on the title page of Wuthering Heights.

It was unthinkable at the time that young, unmarried women would circulate their names so freely on books that portrayed the love between a wealthy man and his hired governess, or the flare-ups of passion and cruelty that marked the relationship of Cathy and Heathcliff. The sisters also knew that women authors were routinely dismissed or pilloried by the all-male fraternity of critics, and they hoped that the Bell names would offer protection and a fair shake from reviewers. Still, one early review blasted the incidents in Wuthering Heights for being “too coarse and disagreeable to be attractive,” while even a more positive review called it “a strange book. It is not without evidences of considerable power: but, as a whole, it is wild, confused, disjointed, and improbable.” Two years after Emily’s death in 1848, an edition of Wuthering Heights was published under her own name, with a preface and biographical note by Charlotte defending her sister’s moral character against the aspersions cast on her.

Fast forward to the late 1970s, and Kate Bush finds herself a young female artist in a culture industry still dominated by men. Her record company, EMI, pushed for another song, “James and the Cold Gun,” to be her first single, but Bush insisted that her debut had to be “Wuthering Heights.” After winning that argument, she delayed the release of the single in a dispute over the cover art, and later referred to herself as “the shyest megalomaniac you'll ever meet.” When the single was finally released in early 1978, it needed only a few weeks and a performance by Bush on Top of the Pops to claim the #1 spot on the UK charts, displacing ABBA's “Take a Chance on Me.” Only 19, Bush became the first female singer to make it to #1 with a song that she herself had written. At a time when women were viewed primarily as interpreters of others' lyrics—as instruments rather than creators—Kate Bush upended the narrative with her first piercing notes. She would narrate from within, and in her own words.

The song’s connections to debates about cultural literacy, art-as-critique, and the fraught space of the female artist are enough to earn the video its place in the classroom. But I also count on “Wuthering Heights” to speak directly to my students about some of life’s other, bigger questions. My students, like teenagers everywhere, often wonder when their real lives will begin: when their ideas will matter to the wider world; when the art they make will feel like more than another assignment to be graded. But if high school students campaigning across the country against gun violence can illustrate the political power of the young, then Kate Bush argues that your artistic impulses also matter, that they’re valid, and that there’s no reason to wait.

And, it's important to add, Kate Bush doesn't care if you're laughing. Because she is all-in, all the time. To watch the “Wuthering Heights” video is to see an artist consumed by a sense of personal vision. She isn't aiming for the mainstream. She's singing about a 19th-century novel known to most of her peers from their A-level cram sessions. Her voice soars and rumbles, and her dancing is a mix of pirouettes, leaps, and contortions. She even mimes sleepwalking. The video also uses every trick in the 1970s A/V club handbook: gauzy filters, freeze frames, lighting gels, multiple exposures, a fog machine. It's completely over the top, but Bush seems to know that holding back or winking at the camera would break the spell and cause the entire project to collapse into cheap parody. If nothing else, Kate Bush is authentically herself.

I have to ask myself: was I that bold at 18, or at 28? Reader, I was not. But since then I’ve tried to shake off the timidity of my younger self—someone desperate to create but paralyzed by uncertainty about the leap into the unknown. So in the end, I hope that what my students are getting from Kate Bush—if they need it, if they haven’t already seized it for themselves—is permission. To make art at any age, to express it in whatever way feels authentic, and to let the culture catch up to them.

High Lonesome: A Dispatch from the National Cowboy Poetry Gathering

I've heard Kenny Rogers' “The Gambler” twice already since I landed in Elko, and I think I might be hearing it again in the Star Hotel. This place and its restaurant opened in 1910 to cater to a clientele of Basque immigrants who herded sheep in northern Nevada. Now, its big neon sign is a beacon on top of a two-story box of a building for folks in town for the National Cowboy Poetry Gathering.

There are a lot of us. I’ve only ever been here at this time of year, so the scene is just as I remember it: standing-room only and loud, with all red-vinyl bar stools occupied. There’s a lodge feel, with dark wood paneling, decades-old framed pictures, paintings of horses everywhere, and a tin ceiling in the dining room. Bartenders line up picon punches—the potent, unofficial drink of the Gathering—on the bar and hand them out as fast as they can. There are meaty handshakes and back slaps and everyone seems happy to have returned to this place for the weekend festivities.

Singer Ramblin' Jack Elliott waits for a table for at least an hour with everybody else. He's a New York surgeon's son who left the city at 15 in the 1940s to join a rodeo and learn to cowboy. Later, he became a student of Woody Guthrie and a legendary folk singer in his own right, as well as close friends with Johnny Cash. He's here for the Gathering too, one of thousands of people who make the annual trip to the remote Great Basin town every winter. Jack was not born a cowboy. But earlier in the day he sang a raw cover of Bob Dylan's “Don't Think Twice,” and nobody in the room doubted his authenticity. It was the kind of performance that draws people to Elko.

The event has been held here for 35 years, but it feels much older. At moments, it seems like 1884, when the trail drives were still happening, moving cattle across Texas towards railheads that took them to cities like Chicago. The genre was first published in that era, with Lysius Gough self-publishing 1,000 copies of his verse “composed on the trail in 1882.” Conventions established themselves soon after: rhyming couplets, colloquialisms, and heavy rhythm. (Think Robert Service's “The Cremation of Sam McGee.”) But cowboy poetry is more contemporary than it wants you to believe. Hal Cannon—folklorist, cowboy poetry anthologist, and one of the Gathering's founders—tells attendees in his keynote, “National Cowboy Gathering, here's to the next 35. Even as we count the years, you exist outside of time.”

*

At the Star, food courses come in predictable waves: cabbage soup with thick slices of bread, Caesar salad with so much garlic it stings, two kinds of beans, plus green beans, and spaghetti. You get it all, family style, usually followed by a massive cut of steak or lamb.

Back in the kitchen, there’s one table left open for the hotel’s boarders, Basque men, descendants of shepherds, or perhaps shepherds themselves. A waitress packs rice pudding into a styrofoam container. She’s taking it to one of the “old guys” on her way home, because he couldn’t make it in tonight.

There is a way of telling the history of Western poetry and its evolutions over the millennia that begins with sheep. It's based on the ancient Roman poet Virgil's personal path to success, in which a poet's career trajectory begins with the pastoral—shepherds singing in hilly landscapes while they tend their flocks and fall in love—passes through agrarian georgics, and lands on nationalist epic poetry, fully accomplished.

Much cowboy poetry could be called pastoral poetry, which is a slippery genre. The New Critic William Empson's influential 1935 reading held that these poems were “about the people” but not “by or for the people,” whether in Virgil, in Renaissance poetry like Spenser's Shepheardes Calender, or even in 20th-century proletarian literature. More recently, the contemporary experimental poet Lisa Robertson explained pastoral another way when she improbably resuscitated the seemingly antiquated genre for her own feminist work: “Let's pretend you ‘had' a land. Then you ‘lost' it. Now fondly describe it. That is pastoral.” This gets a little closer to cowboy poetry's kind of pastoral. Cowboy poetry is loaded with such fond descriptions of a West that no longer exists, that perhaps never really existed, at least not without the dark shadows of colonialism and genocide. At the Gathering, the work of Native American poets like Henry Real Bird or Métis musician Jamie Fox reminds us of other ways of having a relationship to the land and alternative histories of that land, while staying within the cowboy poetry genre.

*

In the Elko Convention Center, former New Mexico state slam poetry champion Olivia Romo performs poems about sheep herding, water rights, climate change, and corn seed. She weaves together Spanish and English. When she delivers the words “l’aigua es una frontera,” the audience, a mix of ranchers and cultural tourists, seems to know exactly what she means. She recalls the Mexican cowboy or “vaquero,” phonetically rendered into English as “buckaroo.” Vogue Robinson, another slam performer and poet laureate of Clark County, Nevada, is at her first Gathering and in the audience after performing on an earlier panel.

Romo is on a panel titled “New Voices,” joined by Forrest VanTuyl, a youthful, bearded Oregonian poet and songwriter who writes about losing horses to wolves and quotes lines from Alfred, Lord Tennyson. Joshua Dugat, from Texas, rounds out the panel. He can recite “Pied Beauty” (another 19th-century classic) from memory and writes formally complex meditations on rural environments and ranch life.

After several rounds of readings in the Ruby Mountain Ballroom, the poets give up the stage. The next panel, on “Rural Journalism,” features prose writers. They discuss the 1,000 layoffs in the media industry the week before, how to report stories on mining and resource extraction, and who gets to tell the story of contemporary rural America. They talk about how hard it is for most of the country to know what's going on in rural America.

Romo’s ecopoetics seem like one way to learn about what’s going on. Her pastoral is not a fantasy land, but the representation of a charged landscape where her community tries to survive climate change. The sheepherders in her poems are real.

*

The next day, I go to see Joel Nelson. He has been performing at the Gathering since the 1980s. He runs his hand and its swollen knuckles up and down the edge of the podium, measuring time deliberately between poems. He starts with the poem “Ambush,” which plays with the crowd’s expectations from its opening line: “Beside the trail in the A Sầu Valley / My team is lined up motionless.” The poem is set in Vietnam, where Nelson served in his mid-twenties, rather than a 19th-century trail drive or even Nelson’s home ranch in West Texas. This poem, and the ones that follow—others set in Vietnam and love poems to his wife—lack the couplets of his more traditional work, like “Sundown in the Cow Camp,” with its description of a camp cook:

His expression kinda clues you
That his memories have flown
To other camps at sundown
And the cowboys that he’s known.

Nelson doesn’t read “Breaker in the Pen,” about the solitary civilizing force of the horse breaker, but he does talk to the crowd about writing poems on alfalfa feed sacks while breaking horses in Hawaii. He’s self-referential in a poem about wind in West Texas “blowing like a bad poem that goes on way too long.”  He starts to wind down with a poem titled “Definitely Not Cowboy,” an ode to acetylene torch welding.

John Dofflemeyer, another fixture at the Gathering since the early days, tells me that some of the other poets in the 80s accused him of killing cowboy poetry with his free verse, his refusal to recite from memory, and his insistence on publishing cowboy poets in print in the journal Dry Crik Review. In the late 60s and early 70s, he listened to Leonard Cohen while writing poems. Dofflemeyer said he felt that Gary Snyder’s descriptions of the Sierras gave him permission to write about his own experience of that landscape as a fifth-generation cattleman. Reading Allen Ginsberg helped him develop his own free verse. Of course, this short list of influences obscures the women central to poetic communities of the same era, like Margaret Atwood, whose devastating poem “Backdrop addresses cowboy” appeared in her 1968 collection The Animals in That Country. Even so, the Gathering, whose future is being smartly charted by Kristen Windbigler, executive director of the Western Folklife Center, has long featured women poets and storytellers.

Both Nelson and Dofflemeyer write after the Beats, after Dylan and Cohen, after the Vietnam War, moving along a path that connects the cowboys and the 60s poetry and folk scenes of San Francisco, Los Angeles, and New York. This is where Ramblin' Jack Elliott fits in, too: the Jewish kid from Brooklyn who was born with the name Elliot Charles Adnopoz until he decided to become a cowboy troubadour.

*

After the full schedule of performances, and after the closing Saturday midnight dance, the Gathering breaks up. At the lone terminal of the Elko airport, before 5 a.m. on a Sunday, performers in a single-file line dutifully remove cowboy hats and boots and belt buckles to pass through security, headed for the first flight out of town. For a few more minutes, we are contemporaries, in the same time zone and the same poetic moment.

31 Books in 30 Days: David Varno on Denis Johnson

In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member David Varno offers an appreciation of fiction finalist Denis Johnson’s The Largesse of the Sea Maiden (Random House).

*

In They'll Love Me When I'm Dead, a documentary about Orson Welles' unfinished, posthumously released film The Other Side of the Wind, Peter Bogdanovich paraphrases Welles to suggest that his compulsion to continue working often shaped the themes of his projects: “No story has a happy ending unless you stop telling it before it's over.”

The stories by the late, great Denis Johnson are often unhappy, but they leave off with a trace of hope, an aftertaste of promise that refuses to turn bitter. In Johnson's immortal collection Jesus' Son, a man called Fuckhead holds true to his name through a journey of sad, strange, sometimes violent encounters, and even agrees when a friend tells him, “‘Fuckhead' is a name that will ride you to your grave.” Maybe so, but acceptance helps loosen the clouds overhead, and by the end, he discovers how it can lead to growth. “All these weirdos, and me getting a little better every day right in the midst of them. I had never known, never even imagined for a heartbeat, that there might be a place for people like us.”

Johnson's final book, The Largesse of the Sea Maiden, was published 26 years after Jesus' Son and nine months after his death at 67. It comprises five stories in 200 pages, composed over a long period and completed with the knowledge that death was close. Death is not defied in these stories (as if it could be), but it does inspire their protagonists to imagine something more. In the title story, a man who works in advertising notes, “I've lived longer in the past, now, than I can expect to live in the future.” He wanders through the scenes of his old life in New York City, a place he'd long ago left, claiming “It was never my town.” Alone on the street with the first flakes of a sticky snowfall, the town is finally his. But to what end, and for how long? What he needs is a little magic. Fortunately, Denis Johnson, like Orson at his own craft, was a magician of the highest order. “Once in a while,” the ad man tells us, back home in his snowless southern California bed, he “read[s] something wild and ancient . . . Then sometimes I get up and don my robe and go out into our quiet neighborhood looking for a magic thread, a magic sword, a magic horse.”

The desire for stories to be real sets the stage for a writer's desire that his stories endure. The theme runs central through Sea Maiden, a book that will ensure and deepen its author's continued resonance. Even after the masterful novella Train Dreams (2002) and the National Book Award-winning novel Tree of Smoke (2007), Sea Maiden feels like a return to form. This kind of statement is not usually made about posthumous books, but in this case it feels right because of how far Johnson was willing to push the stories to contain his emotional depth and poetic sensibility. Some of the themes and stylistic techniques are familiar, such as the use of the em dash to suspend an impending action with discursive reflection, so that a car crash is rendered in slow motion. Two of the stories in Sea Maiden, set in an addiction recovery center and a county lockup, could be coming from an old Fuckhead associate, someone who was left to cast about in the ether for some kind of relief, but only effects more damage. The other three, including the title story, offer something new altogether, and they constitute a collection that was widely hailed as an “instant classic” upon its release.

“Triumph Over the Grave” and “Doppelgänger, Poltergeist,” the other two in the glorious above-mentioned trio, are peopled with writers rather than barflies or criminals, and the register shifts from that of the raconteur to literary reference and apocrypha. In “Triumph,” after the death of a friend from a long illness, a writer reflects on his role as caretaker and inhabitant of the friend’s house. “It’s become my religion to carry things to and fro in the temple, and I find no reason to adjust right away to the demise of our god—here in this house not haunted, but saturated through-and-through with the life of its dead owner.” The space of death leads him to make something of his memories, and he circles back on another experience of losing a friend to illness, another successful writer named Darcy. Waiting to see Darcy at the hospital, he observes the visitors of the sick and wounded, “loved ones bent over mystifying paperwork or staring down at their hands, beaten at last not by life but by the refusal of their dramas to end in anything but this meaningless procedural quicksand.” With relief, he is invited to leave this horrific scene for the sacred space of Darcy’s room, where he is struck by his friend’s “stoic poise” after being told of his metastatic cancer. What did Darcy know that he didn’t? Does he know it now? Perhaps he does, because he is able to end the story with a note of his own striking stoicism: “It’s plain to you that by the time I write this, I’m not dead. But maybe by the time you read it.”

Characters in fiction, and in life, tell each other the same stories over and over, and as time passes, with the end in sight, the stories get sadder. But if the master of their dramas is Denis Johnson, they might be lucky enough to find some light, or even deliverance from the quicksand.

__________________________________

David Varno is the VP of Tech for the NBCC and Digital Editorial Associate at Publishers Weekly. He is a former Dispatches editor for Words Without Borders, and his writing has also appeared in BOMB, the Brooklyn Rail, the Cleveland Plain Dealer, Electric Literature, the Minneapolis Star Tribune, Newsday, Paste, Tin House, and other publications.

31 Books in 30 Days: Mark Athitakis on Robert Christgau


In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member Mark Athitakis offers an appreciation of criticism finalist Robert Christgau’s Is It Still Good to Ya? (Duke University Press).

*

For decades now, Robert Christgau has been known as the Dean of American Rock Critics. He’s embraced the title (it’s right there at the top of his website), but it’s one that perhaps hasn’t served him well, or at least mischaracterizes his value as a critic. That “dean” business suggests that Christgau serves as rock and pop’s lead tastemaker, and that all other critics are simply following his lead. It suggests an overly persnickety manner. Lastly, it suggests somebody who’s no fun (did you aspire to hang with your college dean?) in a genre that’s all but defined by joy and pleasure, licit and otherwise.

To be sure, Christgau has handed out a lot of letter grades over the years, and he’s been dinged for his fussiness and for the presumed authority of his assertions. “I dunno why / You wanna impress Christgau,” Sonic Youth ranted on its song “Kill Yr Idols.” (“I wasn’t flattered to hear my name pronounced right,” Christgau coolly retorted.) But if that “dean” title gets it right, it’s because of this: He does the work, the rigorous, sometimes scholarly work of understanding an artist as deeply as possible, and understanding as much music around the world as possible. Is It Still Good to Ya?, a career-spanning collection of his longform reviews, is a testament to vigorous, big-eared listening. You might have thought he’s been stingy about handing out A-pluses to records in his capsule reviews, but the book makes clear he’s trying to earn one himself.

The book’s subtitle, Fifty Years of Rock Criticism, 1967–2017, is somewhat misleading. Most of the book’s essays were written after 2002. At a time when most critics of his generation are either retired or failing to keep up, comfortable with covering warhorses, Christgau has remained an intrepid listener: his appreciations of Lil Wayne, Brad Paisley, M.I.A., and Eminem are rooted in genuine enthusiasm while remaining skeptical of how the winds of publicity, fandom, and critical consensus have moved perceptions of those artists. And yet, Christgau is also a consistently inviting and generous critic. (“It’s fine not to like almost anything, except maybe Al Green,” he writes.) Across the pages of this book, you can see him being open to different perspectives on musicians, wrestling with his understanding of Thelonious Monk and coming to terms with Sonic Youth, whom he once dismissed as “impotent bohos” but now “loves to pieces.”

But while the book speaks to the breadth of Christgau’s listening, it also has a thematic specificity: practically every piece is rooted in the notion that music is a prism through which we can better understand race, society, and politics, especially in America. For Christgau, artists old (Billie Holiday, Woody Guthrie, Frank Sinatra, Chuck Berry) and new (Gogol Bordello, Lady Gaga, Jay-Z) evoke a vision of American life that’s been embraced or attacked over time, and will continue to shift. “It was Chuck Berry who had the stones and the cultural ambition to sing as if the color of his skin wasn’t a thing,” he writes, not to deny Berry’s race but to comprehend the cross-cultural fusion he pushed listeners toward. Rock, however you define it, continues to speak to a political ideal we’re still working toward.

This is complicated work, but for a dean it’s plenty fun, and a joy to dip into and out of, both for full appreciations and single lines. Offering some tips for “growing better ears” on the book’s first page, he suggests you “spend a week listening to James Brown’s Star Time.” The ensuing pages will keep you listening and thinking for many, many more weeks besides.

__________________________________

Mark Athitakis’s reviews and essays have appeared in the Washington Post, LA Times, Humanities, Virginia Quarterly Review, and many other publications. He is the author of The New Midwest: A Guide to Contemporary Fiction of the Great Lakes, Great Plains, and Rust Belt. He lives in Arizona.

What Eight Missing Manuscript Pages Can Tell Us About a 20th-Century Genocide


Los Angeles. The J. Paul Getty Museum. Present day. The illuminated manuscript pages dwelled in the modern cabinet in the sterile storage room. A visage emerged from the carpet of gold on one of the parchment sheets. He had been restless ever since the great rupture. He stole a glance upward.

Some things had not changed. Above him the roosters strode confidently toward a jeweled vase. To his left his twin visage kept watch. Beneath them an arch opened up like a fan. The partridges and blue birds hidden in the gold leaf still pecked at tendrils. All this rested on three columns of painted porphyry and patterned gold. The column capitals, the blue ox heads, bore the weight, docile as ever. Beyond the frame birds alighted on the pomegranate trees and the outlandish plants Toros had devised. The visage could still feel the painter’s breath as he labored over every detail, holding his delicate brushes. The letters Toros had inscribed beneath the arch stood at attention at their appointed places in a grid outlined in gold. The visage could see all the way down to the base of the columns. Tiny red dots sprinkled along the base resonated with the red of the pomegranates, the roosters’ combs and wattles, even the visage’s own headgear.

Other things had changed, however. The visage remembered that in the beginning he had lived across from another page, a near-echo of his own, with similar roosters, oxen, pomegranate trees, the fanlike arch, even the red dots. His page and the original echo page featured myriad differences, too, that the visage delighted in finding over and over again. Since the great rupture, that echo page had moved away. The visage shared a bifolium with another page instead. Even though it too had an arch, a grid of letters, and even the same color scheme, its differences were jarring. Its trees were palms bearing owls, its partridges pecked at a silver vase, its carpet of gold bore distinct ornaments, its column capitals were twin birds, and its grid of letters was denser. It was not his echo. It was never meant to be seen across from him.

The visage resumed his silent vigil. He had survived the great rupture, but would he ever find his echo again?

*

Consider the Canon Tables of the Zeytun Gospels, preserved today at the J. Paul Getty Museum in Los Angeles. You are looking at four sheets of parchment. Each sheet of parchment is folded down the middle, turning into two connected leaves. In the art of bookmaking, a folded sheet is called a bifolium and yields four sides: four folios. Each folio measures 26.5 by 19 centimeters. The Canon Tables consists of a total of sixteen folios. Eight of the folios bear illuminations, while their eight backs are left blank. These folded sheets of parchment were once nested together in a gathering and bound with other gatherings of folded parchment.

In the resultant book, the pages appeared in a carefully ordered sequence. As you opened the book and looked at the Canon Tables, you saw the illuminated folios as matched sets, as four pairs of pages facing one another. The facing pages echoed each other’s decoration. As your eye traveled from one page to the other, you would notice their similarities as well as subtle differences, not unlike a refined “Spot the Difference” puzzle. Between each illuminated pair the blank folios allowed you to pause and cleanse your palate before turning to the next meticulously crafted pair of images. The makers of this artwork of great luxury used the most lavish materials, and they could afford to have only one side of a parchment page painted.

The illuminated pages feature decorated architectural frames. The frames shelter golden grids that contain series of letters written in the Armenian alphabet in bolorgir, a lowercase script. Around the frames many species of birds frolic; some hold fish in their beaks, and others drink from vessels or nibble at stylized plants and flowers. Within the frames you discern more birds and even human faces, nestled among ornamental fields in brilliant colors. The facing pairs of the Canon Tables feature the same layouts, yet each page looks unique. The painter Toros Roslin, working in 1256 in the Kingdom of Cilicia, unified them in design yet created subtle distinctions. He distilled the essence of medieval visual harmony into eight glorious painted pages.


The letters within the frames represent numbers. The grids of letters are thus numerical tables of a specific kind. The pages at the Getty depict canon tables—concordance lists of passages that relate the same events in two or more of the four Gospels. Designed by Eusebius of Caesarea in the 300s, canon tables had by the early Middle Ages come almost always to precede the Gospels, usually as columns of numbers assembled within painted architectural structures. The Canon Tables now at the Getty was once part of a manuscript copy of the four narratives of Christ’s life by Matthew, Mark, Luke, and John that make up the Christian New Testament. The manuscript is known as the Zeytun Gospels after the remote mountain town where it was once kept and revered for its mystical powers of blessing and protection. When the people of Zeytun were exiled from their homes and exterminated, the manuscript too was taken away and broken into fragments.

The gathering of illuminated Canon Tables that is now in Los Angeles was detached from the mother manuscript of the Zeytun Gospels. No longer part of a book, it now appears as component parts: four sheets of parchment, folded in the middle. You can still see the small holes in the vertical fold at the center of each bifolium, where the threads that bound the manuscript together into a codex would once have been. Perhaps the Canon Tables came loose from the binding over time. Or perhaps someone cut the thread. In any event, somehow the pages bearing the Canon Tables were removed from the Zeytun Gospels.


Viewing the Canon Tables displayed at the museum, you will also notice another feature that does not readily lend itself to photography. A crease extends horizontally across the two connected pages. It seems that no amount of careful conservation will smooth it out. This crease tells you something about the life story of the Canon Tables. It was likely caused when the gathering was removed from the mother manuscript and folded up. This crease enables you to imagine how, at some point, unknown hands removed the Canon Tables from the mother manuscript, how they folded it, perhaps tucked it in a pocket or in the folds of a fabric belt like the ones men wore in the waning days of the Ottoman Empire, and took it away. The crease shows us that the work of art bears the imprint of the actions it endured, and of its separation from the mother manuscript. This crease marks the moment when the work became a fragment; it is the trace of its loss.

At that moment the holy manuscript cleaved in two. Each piece acquired a new possessor and embarked on a distinct journey. The mother manuscript followed a twisted path that eventually took it to the Republic of Armenia. The Canon Tables left the Mediterranean littoral, moved across the Atlantic Ocean, and decades later made landfall on the Pacific shore, in one of the world’s greatest and wealthiest museums. In Los Angeles, descendants of the community that once revered the Zeytun Gospels as a devotional object brought out only on special religious occasions can now view its detached Canon Tables on exhibition, displayed alongside other works of art in a museum hall, open to the public.

A fortuitous chain of circumstances brought the Canon Tables to Los Angeles. The Zeytun Gospels is a remnant of a medieval world that is lost forever. It is also the only medieval relic that has come down to us from the once-rich treasuries of Zeytun’s churches. It is among the rare manuscripts to survive the unprecedented assault on Armenian cultural heritage that was part of what we now know as the Armenian Genocide. For every manuscript that endured, many more were lost forever, intentionally destroyed, burned, recycled for other uses, abandoned, or left to decay.

*

Provenance and Power

In 1995, shortly after acquiring the Canon Tables, the Getty Museum introduced it to the public with the following brief provenance:

Catholicos Constantine I (1221–67); bound into a Gospel book in Kahramanmaras, Turkey; Nazareth Atamian; private collection, U.S.

In 2016, the Getty amended the provenance:

1256, Catholicos Konstandin I, died 1267; by 1923–1994, in the possession of the Atamian Family; 1994, acquired by The J. Paul Getty Museum; 2016, gift of the Catholicosate of the Great House of Cilicia, by agreement.

These terse lists condense the biography of the Canon Tables since the creation of the Zeytun Gospels in 1256. In the twenty-odd years between the two versions of the provenance, the Canon Tables entered the manuscripts collection at the Getty Museum, appeared in scholarly exhibitions, became the subject of a contentious lawsuit, and saw that lawsuit resolved through a settlement. The Armenian Church stipulated the change in provenance as one of the conditions of the agreement; in turn, the Church donated the Canon Tables to the Getty. The changes in the provenance are telling, but so are its silences.

Provenance is a highly specific type of record: a chronological list of the successive owners of a work of art and the manner of its transfer among them. Provenance communicates the itineraries objects trace through space and time as they are sold, inherited, bartered, and transferred. Provenance can transform the significance and value of an object in varied ways. It can alter its meaning, impact, and visibility just as surely as it affects its location, state of preservation, and documentation. When provenance lists are known for objects of great antiquity, they may tell us much about the historic development of taste, the relationship between the present and the past, and the realities of war and economic exigency. Recently art historians have delved into provenance as a meaningful type of text, even as an “alternate history of art,” a window onto the social life of art since its creation.

Provenance often tells ordinary, perfectly legal stories of sale, purchase, or inheritance. The archives of auction houses record which artworks sold and which languished unwanted, the bidding wars that reveal the allure of a particular object, as well as the rise and fall of artists whose works are in demand at one point then utterly forgotten. Inventories of art collections disclose the movement of paintings and sculptures as they are inherited, bequeathed, gifted, or sold to pay debts. Sometimes the artworks themselves tell their own stories: successive owners place their stamp or monogram on a valued object or add a binding to a medieval manuscript or a new frame to a Baroque panel painting. All of these sources provide information about provenance but also about the contexts of each transfer of ownership, the human stories and historical circumstances.

When a new owner acquires an object, its provenance acquires an added item. Curators and scholars make changes in the official provenance of objects in museum collections when they discover new information or correct an error. Sometimes, however, changes in provenance are made in response to pressure from legal action or public opinion. Consider, for example, the first time the Getty Museum made public the provenance of the Canon Tables from the Zeytun Gospels, in 1995, shortly after acquisition. This brief list mentions the first owner, Catholicos Constantine, Nazaret Atamian, and the Getty. The list mentions no owners between the death of Constantine in 1267 and Nazaret Atamian’s acquisition of the Canon Tables almost seven centuries later.

In 2016 the Getty modified the Canon Tables’ provenance as part of its agreement with the Armenian Church. The new provenance includes additional information about the Canon Tables’ ownership history in the twentieth century, the period under contention in the lawsuit. Its careful wording stops short of attributing ownership to the Atamian family, with whom the Canon Tables remained between 1923 and 1994. The new provenance acknowledges in effect that the Canon Tables always belonged to the Church, even when it resided with the Atamians and when it was acquired by the Getty in 1994, until the Church gifted the Canon Tables to the Getty in 2016. In this case litigation and negotiation invited renewed attention to the artwork’s history and eventually led to a new provenance. In its concision the provenance reveals little of the great labor of scholarship and litigation that went into its production.

Art objects, especially those ancient and valued, often have convoluted lives. Their provenance is rarely seamless. Gaps appear when records go missing, research yields no information, or objects are excavated by accident or by looters and their findspot remains undocumented. Darkly, in some cases, provenance can also tell—or conceal—stories of theft, war, atrocity, colonialism, genocide, destruction, appropriation, or exploitation. Art objects with a fragmentary or sketchy provenance may have been the result of illegal excavation or looting. For instance, shadowy middlemen or unscrupulous owners may obscure less savory episodes of an object’s life, as when looters pilfered an antique bronze from an excavation site in Italy, or when acquisitive collectors forcibly seized a sacred object from a Native American community, or when a Nazi official usurped a painting from a Jewish art dealer.


In addition to telling tales, or obfuscating others, provenance itself can become contested terrain. In recent years attorneys, activists, indigenous communities, and some governments have challenged collectors and museum officials with renewed zeal on the provenance of certain objects. They brought restitution claims regarding pieces that had been stolen, seized, or illegally exported in violation of laws and international norms. After much resistance, the art world responded by emphasizing transparency in the provenance of objects in museum collections, and the art market grew more cautious in the sale of artworks with incomplete or questionable provenance. Despite greater awareness of such problems, the contest between communities and powerful institutions over the control of cultural patrimony continues—as in the case of the Canon Tables. The struggle for restitution, repatriation, and the reunification of art or sacred objects with their communities, and the fraught questions this raises, is one of the central issues of 21st-century art history.

These contests throw into sharp relief the relationships of power that structure the circulation of art objects and underlie every transaction that provenance chronicles. During conflicts, when art is liable to be looted, record-keeping becomes an act of power that can serve the interests of the powerful rather than the rightful claims of ownership that it purports to present. Thus, meticulous record-keeping was part and parcel of the Nazis’ organized looting of Europe’s art during World War II. Provenance can be a record of added value and prestige but it can also serve to obliterate traces of violence and injustice. Conversely, certain kinds of record-keeping can be acts of resistance, uncovering suppressed episodes of an object’s history. The curator Rose Valland, who witnessed the Nazis’ mass looting of art from French national institutions as well as private French Jewish collections, famously kept a secret inventory of looted objects, risking her life. Her notes proved invaluable for the recovery of art after the war.

The Canon Tables’ provenance, written and rewritten, chronicles the life of an artwork. It also speaks of the people who created it, worshipped it, traded it, and in some cases took terrible risks to save it. Violence, exile, separation, and migration mark the Zeytun Gospels’ trajectory over the last century. The silences in its provenance also tell tales about unequal struggles, seemingly hopeless causes, resistance, resilience, and contestation. The story that the provenance of the Zeytun Gospels tells matches the modern history of Armenians.

*

The Armenian Genocide

The Armenian Genocide and its many afterlives shape modern Armenian history, just as they determine the fate of the Zeytun Gospels. The Ottoman government carried out the systematic extermination of its own Armenian community during the Great War. On April 24, 1915, the symbolic beginning of the genocide, the Ottoman police rounded up Armenian community notables, politicians, intellectuals, and businessmen in the imperial capital of Istanbul. Most of them were murdered after varying periods of detention in central Anatolia. Parallel to this “decapitation” of the community’s leaders, the Ottoman state initiated the uprooting and extermination of ordinary Armenians in provinces throughout the empire. Zeytun was among the first localities targeted in early April 1915.

The pattern repeated itself with few variations: authorities separated out the Armenians from the rest of the population and ordered them into internal exile with little warning. Gendarmes and military officials led the civilians on long death marches away from cities, only to massacre them outright or allow them to perish from attacks by looters or bandits or from exposure. Survivors endured renewed ordeals, pushed ever eastward into the Syrian desert, away from the prying eyes of urban communities, journalists, or diplomats. The gendarmes herded Armenians into ill-equipped concentration camps in the desert where disease, starvation, and attacks ravaged them. While the last survivors were still dying, back in their hometowns neighbors and others were looting their houses and businesses; the state was confiscating their property through special laws; and the desecration, destruction, and plunder of religious and cultural sites met little opposition.

The active extermination phase of the Armenian Genocide concluded when the Allied Powers defeated the Ottomans. The empire gave way to new republics, like Turkey, or successor states under the sway of France, such as Syria and Lebanon, or of Great Britain, such as Iraq and Palestine. Yet there was no full reckoning for the Armenians. The Republic of Turkey adopted an official policy of denial. Leaders of the republic included unrepentant genocide perpetrators, while the economic elite derived their affluence in part from confiscated Armenian wealth. In addition to silencing the past, the Turkish state subjected the remaining Armenians and all non-Muslim minorities to discrimination and persecution, and condoned hate crimes against them, fostering a culture of impunity.

Decades after its founding, the Turkish state continued to confiscate Armenian property, including communal religious property. Denial, continued persecution, hatred, expropriation of wealth, destruction of cultural monuments, appropriation of cultural achievements: there has been no acknowledgment, let alone apology, atonement, or reparation of any degree or kind, even the most minimal, by state institutions. Indeed, genocide consists not only in active killing; it extends to persecution, oppression, and dispossession, both open and clandestine, to preventing a group from freely practicing its language and religion, and to creating conditions that make it impossible for a community to continue to exist physically, religiously, and culturally. From this perspective, the Armenian Genocide is not only unacknowledged and denied; it persists into the present.

__________________________________

 

From The Missing Pages. Used with permission of Stanford University Press. Copyright © 2019 by Heghnar Zeitlian Watenpaugh.

Transforming a Tiny Mexican Town into an Iconic Hollywood Backdrop


Whatever criticisms could be leveled at Sam Peckinpah, no one could question his dedication to a film project once it was under way. He labored away at it like a fiend. It became the thing; nothing else much mattered. He was too deep into his alcoholism to give up drinking altogether, but he cut way back. Compared to his consumption during the previous hunting trip in Ely, Nevada—when he took his nephew to the local whorehouses to get laid and Sam wound up dead drunk on wire spools in the back of a truck—he was almost a model of sobriety. He limited himself to drinking beer at night after work was complete. Contrary to the reputation he developed in the 1970s, he was never drunk on the set while he was working on The Wild Bunch. Too much was on the line for him, both professionally and artistically. His intensity was unmatched by anyone else’s. He was at his creative best as he created The Wild Bunch, the story that had obsessed him for more than a year now.

Likewise, William Holden swore off hard liquor. He was a beer sipper in Parras, carefully eschewing entry into the blackout zone. He generally avoided the after-hours liquor-soaked high jinks that other members of the company engaged in after shooting wrapped for the day. He likewise stayed away from the prostitutas—some imported from Mexico City—and spent his evenings quietly. Several years earlier, he’d been on safari with the goal of killing an elephant in Africa. Once the guides had led him into place and an elephant was an easy rifle shot away, Holden was unable to pull the trigger. In an epiphany it came to him that he should be working to protect African wildlife, not destroy it. Conservation of African wildlife was now his passion—and would remain so for the remainder of his life. Evenings in Parras, beer in hand, he loved nothing more than to while away the hours talking about Africa to anyone who would listen. One person who showed up at Holden’s table night after night was Billy Hart, the Texas-born stuntman and actor, who hung on to every word Holden uttered about elephants and lions, totally fascinated.

Holden’s career and personal life may have been in a slide, yet he was part of Hollywood’s royalty, at least in the eyes of many in the cast and crew. He had a regal air, but he also strived to be very much a regular guy, just Bill Beedle from South Pasadena. One day he went for a walk and encountered a large rattlesnake, which he shot, then brought back to show his Wild Bunch colleagues—the kind of thing that any guy might do, although Eddie O’Brien’s son, Brendan, staying with his dad in Parras, saw Holden with the dead snake and was scared of the movie star thereafter.

Following Peckinpah’s lead, the other members of the Wild Bunch company worked incredibly hard, Holden among them. He had his vanities, to be sure. Holden may have shown up looking like a fifty-year-old who’d aged more than his years—face heavily lined, gut soft. But when Sam asked him to wear a mustache as Pike Bishop, that was too much. Holden replied, “The hell I will.” But he didn’t hold out long. He was soon sporting a mustache in front of the camera. Sharp-eyed observers noted that Holden’s fake lip hair was similar to Peckinpah’s real mustache.


Holden, the veteran of the old studio system way of making films, had never been a part of something like Peckinpah’s free, improvisatory approach. It was nothing like the methods of, say, Alfred Hitchcock, with everything precisely storyboarded. Yet Sam stayed in control of everything, though it was a lot to manage. Day four of filming of The Wild Bunch was not atypical: 244 extras, 80 animals, 43 animal handlers. The caterer provided 372 lunches. Guns were everywhere, 239 of them as props. Hundreds more arrived in the hands of the Mexican Army troops, hired as extras for the film; they brought their service rifles with them. After the original supply of ammo ran out on the second day of filming, Phil Feldman ordered in more than ninety thousand rounds from Warners, a jaw-dropping amount by 1968 standards.

Film was disappearing faster than blank rifle cartridges. Peckinpah shot more than twenty-five thousand feet during just the first week from 131 camera setups. Each can of exposed film had to be transported from Parras to Torreón either by car or small aircraft. From there, it was moved past questioning Mexican and American government officials at a port of entry into the United States. Thence it went to L.A., where it was developed and printed. The dailies were then transported back to Mexico, where Peckinpah and others viewed them. The process was labor-intensive, to say the least. Though fraught with possibilities for mishaps, it worked. Records indicate just one batch of film was accidentally ruined during a border crossing.

Lucien Ballard was at his creative best, overseeing a crew of camera operators who filmed with a variety of cameras outfitted with different-size lenses. Typically, Ballard used six cameras for action sequences, each running at a different speed. To capture images in real time, a Mitchell or Panavision camera operated at twenty-four frames per second. To achieve slow motion, a cinematographer would run the film through the camera at a faster rate. Ballard had his cameras set at a variety of frames-per-second rates: 30, 60, 90, even 120. Cameras were sometimes modified to allow them to achieve fast run rates, but even then, they would have to crank through yards and yards of film before they hit the increased speed needed. (Peckinpah’s film editors would later monkey around further with the speed of shots using an optical printer.) The dailies that came back from L.A. were stunning. Ballard the cinematic alchemist and Peckinpah had conducted tests with film-stock exposures with the goal of achieving a slight sepia cast to the images without diminishing the color palette. Ballard had succeeded. The effect gave The Wild Bunch a hint of being something of a relic.
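A quick worked example may make the arithmetic concrete (the calculation is mine, not Stratton’s, and assumes the standard sound-film projection rate of twenty-four frames per second mentioned above):

\[
\text{slowdown factor} = \frac{\text{capture rate}}{\text{projection rate}},
\qquad
\frac{120\ \text{fps}}{24\ \text{fps}} = 5
\]

By that ratio, footage from Ballard’s fastest cameras stretched one second of action across five seconds of screen time, while the 30, 60, and 90 fps settings yielded 1.25×, 2.5×, and 3.75× slow motion.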

Miles and miles of film were shot each week. The best frames, the ones that ended up in the movie, were like miniature works of art, in ways akin to Frederic Remington’s paintings of rough and reckless cowboys. In Burbank, Warners vice president Edward S. Feldman was blown away: “Suddenly, the dailies came back with a yellow tinge on them. And the yellowness was fantastic. It was as if you could feel the heat coming off the film. But we’re sitting in dailies and the head of postproduction at Warners then was a Teutonic editor named Rudi Fehr, and he said, ‘Don’t worry, Ed, we’ll get it out in the lab.’ But I said, ‘You don’t understand. This is genius, whoever did this.’ He said, ‘But it’s not clear. It looks like the heat is coming off the ground.’ I said, ‘That’s what they’re trying to do.’”

Feldman realized that Warner Bros.-Seven Arts was achieving a new kind of realism. “What Peckinpah wanted to show, basically, was that dying was not glorious and people getting shot was not heroic. And he brought that quality to the picture.” With the action shot at all those different speeds, Ballard’s work created endless possibilities for Peckinpah’s film editors.


Peckinpah and his crew were providing Ballard plenty of gold to capture on film. Gordon Dawson was in full pit-bull attack mode as he and his crew churned out the costuming needed for each sequence. Who knew what Peckinpah would ask of him? James Dannaldson, who transported the ants on the plane while seated next to Coleman, stood around six feet eight inches. With Anglo faces in short supply in Parras, Sam decided he wanted Dannaldson to appear in front of the camera as—well, how about a banker? How was Dawson supposed to achieve that on the director’s whim when the guy was that tall? But Dawson sorted through all the clothes hanging on the hundreds of yards of piping and came up with something that would work.

Early on, Sam decided that he wanted Strother Martin, playing the bounty hunter Coffer, to look like a 1913 version of a Hells Angel. In typical Peckinpah fashion, he didn’t elucidate just what he meant by that. It was up to Dawson to figure it out. He rushed into a trailer and furiously began to dig around until he found a rosary, never paying any attention to the crowd of curious locals, most if not all of whom were devout Catholics, who had gathered to observe what the yanqui movie man was up to. Dawson grabbed some needle-nosed pliers and tore the tiny crucified Jesus from the rosary cross and tossed it onto the sidewalk outside the trailer door. He then wired a rifle cartridge onto the cross in place of Jesus. He turned to leave the trailer and saw the crowd of Mexicans, mouths agape, staring at him as if he were Lucifer himself. The bullet cross hanging from rosary beads became a signature prop for The Wild Bunch. I felt more than a twinge of emotion when I visited Dawson at his house in Woodland Hills decades later and he allowed me to slip that very bullet cross, which he’d carefully preserved as a souvenir from the shoot, around my neck.

Dawson’s challenges in dressing a cast representing people caught up in violent, changing times were not small. The costumes ran the gamut from then-contemporary knickers and newsboy hats for small boys to traditional cowboy attire. He had to outfit American soldiers in campaign hats with cavalry cords and Mexican federal troops as well. He had to come up with wardrobe for Anglo townspeople as well as Mexican villagers and revolutionaries. He was charged with telling a story through wardrobe. Martin’s and L. Q. Jones’s characters were the vilest to turn up in the movie, and they hardly looked like cowboys at all. Along with a newsboy cap, Jones wore an automobile driving coat that hung down to his knees, making it almost seem as if he were wearing a dress of some sort. Martin wore a filthy, battered narrow-brim hat with a wide hatband, a style with the twentieth century stamped all over it, as if to say that he represented the depravity of new times.


Peckinpah certainly wanted to push the theme of the evil that awaited mankind in the age of technology. There was no small irony here. He believed that technology suffocated the elemental humanity of people, that it was purely malevolent, trumpeted by conservatives of the ilk of Richard Nixon, whom Peckinpah loathed. I remember my college mentor, an English professor who embraced many left-wing tenets, telling us in class one day that we all had an ethical obligation to go home that day and use a shotgun to blast the screens of our TVs.

Peckinpah would have applauded that kind of sentiment even as he worked in cinema, the most technologically driven of all creative media of his day. He explained the contradiction: The movie camera was not the product of technology but rather a gift from the gods. In this he was not unlike the aging members of the Wild Bunch gang, cowboys ill at ease with the modern era who nonetheless seemed to value such technological advancements as the Colt M1911 semiautomatic pistol, pump shotguns, machine guns, and hand grenades. They no doubt likewise considered such things to be gifts from the gods.

In depicting the technological evils sprouting up in the early twentieth century, Peckinpah introduced a few anachronisms. Internet gun nerds have pointed out that Martin’s bolt-action rifle dates from the 1940s, not from the 1910s. The machine gun the gang acquires dates from late in World War I; it wasn’t in service in 1913. Other anachronisms beyond the guns appeared here and there. At the time, Peckinpah was making a picture he assumed would be seen only on the big screen, with few people taking it in more than once or twice. That someone would watch The Wild Bunch twenty or thirty times at home, pausing to examine details of firearms in high definition, was inconceivable in 1968 and ’69, and that led to small errors.

It was of no great consequence if Sam filmed Old Man Sykes being shot in the right leg and then he showed up at the end of the film with what might be a bandage on his left leg. The sore-thumb mistake that stood out the most was a shot of Pike Bishop’s gang fording what was supposed to be the Rio Grande, leaving Texas behind for Mexico. The water in the swollen river flows left to right. Anytime you leave Texas and enter Mexico, the Rio Grande below you will run from right to left as the river makes its way from New Mexico to the Gulf of Mexico.

One day early on in the production, Holden had a day off from shooting, but Sam invited him to show up anyway to observe that day’s filming. The sequences showed the aftermath of the San Rafael/Starbuck shoot-out, with Martin and Jones as the primary actors. Holden watched as Peckinpah directed the two actors, nudging them to put more and more into their performances. It clicked then with Holden that this was not going to be just another cowboy picture. He arose from his chair and began to walk away. Sam stopped him: “Wait a minute, where are you going, Bill?”

“I’m going back to my room.”

“Well, why are you going back to your room?”

“Is that the way you’re going to shoot the rest of the picture?”

“Yeah—”

“I’m going home and studying.”

No one on the cast saw Holden until it was time for him to appear in front of the camera again. Jones, who witnessed the event, said, “He went back and started working on his script because he saw this was what Sam was going to do, and this is what the actors that he was working with were prepared to do. So Bill was going to carry his end of the load. He’d obviously seen the picture slightly different, and then realized, ‘Wait a minute, this is what Peckinpah’s going to do with the supporting actors, he’s going to want the same intensity from me, so I better get my ass in gear.’ And that’s what he did.”

Holden was doing more than just studying his script. He also was studying his director. As filming continued, cast and crew noticed that Holden was developing Pike Bishop into a character very much like Peckinpah himself, right down to the vocal inflections and hand gestures. One of the crew members told Peckinpah that Holden was “doing” him. Peckinpah said, “Ah, you’re full of shit.” But the character of Pike Bishop began to have much in common with Peckinpah, a man who had lost his grip and was struggling to regain it. Holden captured the intensity of Peckinpah in Parras.

__________________________________

From The Wild Bunch. Used with permission of Bloomsbury Publishing. Copyright © 2019 by W.K. Stratton.


On the Iconic First Line of One Hundred Years of Solitude


I have come very late to the work of Gabriel García Márquez. I cannot quite explain why it has taken me so long to read one of his books: perhaps there was too much of a sense of duty about the endeavor, a Nobel-laureate-male-pillar-of-the-literary-canon kind of duty.

I remember One Hundred Years of Solitude on my parents’ bookshelf when I was a child: it was the “one hundred years” that put me off: it sounded like it must be something to do with history, very boring history; “solitude” didn’t sound like much fun either. I imagined it was about a man being alone for a hundred years, talking endlessly to himself in the manner of “To be or not to be?” There was also Love in the Time of Cholera, which I assumed must be about cholera. (There were many medical textbooks in the house, both my parents being doctors. I had often leafed through The Handbook of Tropical Infectious Diseases, and knew all about cholera.)

However, when I was in my twenties and happened to be browsing in the English-language section of a bookshop in Amsterdam, I picked up One Hundred Years of Solitude and read the first sentence. I read the rest of the paragraph, and then down to the end of the page, and then I went back and read the first sentence again. I put the book down and moved on, but as I wandered around the bookshop, I occasionally glanced back towards the table where the book lay. I left the bookshop empty-handed and went on with my life, but, over the next twenty or so years, the sentence kept returning to me, and, every time, I listened to the sequence of words, trying to put my finger on what was so intriguing about them. It was something to do with time, I felt—something that connected the “many years later” to the “distant afternoon,” and something about the surprising way in which the main verb had been rendered. The full sentence, in one English translation, is this:

Many years later, as he faced the firing squad, Colonel Aureliano Buendía was to remember that distant afternoon when his father took him to discover ice.

And, of course, we must have it in the original Spanish:

Muchos años después, frente al pelotón de fusilamiento, el coronel Aureliano Buendía había de recordar aquella tarde remota en que su padre lo llevó a conocer el hielo.

I will come back to this sentence, but, before I do, I would like to mention another little experience, something that happened more recently—only a couple of years ago. I was, as I am now, living in London, married, with children, and trying to write. I had been trying to write for some time, with mixed results. I went to a writing course and the teacher talked about time. “Time,” this teacher said, “goes in one direction. Forward, not back.”

She was a very good teacher; what she was saying was unarguably true, and I wondered how I had neglected to notice such an important and obvious fact; a greater awareness of this fact, I realized, would improve my writing considerably. “Indeed,” she continued, “we only have to look at the seasons, at the leaves falling in autumn, to be reminded of the passing of time.”

It was in this moment that a number of things occurred to me. First: that sentence and its unusual presentation of time. Second: that García Márquez was from the tropics, like me. Third: that in the tropics we do not have four seasons—in fact, it might be said that we simply do not have seasons at all. “What you are saying does not apply to us in the tropics,” is what I thought. “Time is different for us.”


You see it everywhere in fiction: the passing of time denoted by the change of the seasons—authors show us summer turning to autumn and winter turning to spring, each season with its accompanying symbolism. Now that I have lived in a temperate climate for such a long time, I understand the pervasiveness of it. As I write this, for example, it is winter: it is nearly four o’clock, already dark; I have been busy all day, and haven’t been out for a walk, and now the day is nearly gone! And yet the months of winter seem to drag by; we eagerly anticipate the return of spring.

Summer comes with a sense of permanence, as though it might go on for ever, but all too soon the leaves are turning, and then falling from the trees, and again as you walk down the street, you think, “Just like that! Another year gone!” In a temperate climate, the world is constantly shifting before your eyes: the environment presents a constant reminder of the passing of time.

But it is not so all around the world. In the tropics, there is no autumn. There is no widespread falling of leaves. There is no winter or spring. Consider the pattern in the country I come from, Trinidad & Tobago, which is a small, twin-island nation eleven degrees north of the equator, off the coast of Venezuela. It is, more or less, the same temperature all year round—80-90°F in the daytime. Every day of the year, the sun rises at around six in the morning; it is directly overhead at midday; and it sets at about half past six in the evening.

The fluctuation throughout the whole year is about twenty minutes, little enough to feel as if there is no difference at all, as if every day is pretty much the same as the one that went before. There is no lengthening and shortening: either it is day, or it is night, simple as that. The trees have their own rhythms, but they do not coordinate with each other. There is no widespread shedding of leaves, no sense of death or rebirth. Aside from the question of whether there is rain or no rain, every day is exactly the same as the one before.

In fact, the only real variation is the rain. For half of the year, it is very dry: we call this the dry season. For the other half of the year, it is very rainy: we call this the rainy season. Dry season, rainy season, dry season, rainy season: that is how it goes. Thus our rhythm is a simple one-two: dry-wet, or day-night. Dry-wet, day-night, dry-wet, day-night, a never-ending one-two one-two, like the endless ticking of a metronome. And then, of course, everywhere, at our perimeter, is the sea. Think of it. The wave comes in, then goes out. In, out. In, out. Does it ever end? Do you see any death and rebirth? Dry, wet, day, night, in, out. No, our landscape does not suggest the passing of time, as your landscape does to you in your temperate climate. Our landscape suggests something entirely different: the eternity of time, the never-endingness of time.


Let’s look at that sentence again.

Many years later, as he faced the firing squad, Colonel Aureliano Buendía was to remember that distant afternoon when his father took him to discover ice.

There are many wonderful things about this sentence, but the treatment of time is what I would like to talk about. Many years later connects two points in time. There’s a time, let us call it X, in the past, and the time many years later, call it Y. Many years later connects those two points, and hints at the path taken between them. Something that happened on day X is related to something that happened on day Y, and if you imagine making two dots on a piece of paper, one to represent day X and another to represent day Y, you can draw a line between them, and the line you draw is its own entity, an entity which is some function of X and Y. It’s as if it transcends the one-dimensionality of each of those coordinates, and it becomes a higher-order piece of information. And we can visualize this line, the line between X and Y, as any kind of line at all: it could be a straight line, or a wiggly line, or a spiral. Whatever it might be, it is described by “many years later.”

Now, of course, many writers may use such a phrase. When I was in my twenties, in a bookshop in Amsterdam, I read the first line of One Hundred Years of Solitude. Many years later, in my forties, living in London, I read it again. X is when I was in my twenties, in Amsterdam; Y is when I was in my forties, in London. Many years later represents the twenty or so years that have elapsed between the two events. But does the sentence give us any suggestion that there’s anything special about the period of twenty-or-so years? Could we substitute “the next day” or “some months later” or “one year later”? I think we could. I guess there’s a possibility that the author of these sentences (in this case, me) will come back and attach meaning to the “many years later,” but so far, there’s nothing to suggest that there’s anything special about that particular period of time. So all we have is:

Something happened.

Time passed.

Something else happened.

There’s nothing wrong with that; I write stuff like that all the time, and so do many other writers. The point I’m making is that to use the phrase “many years later” is not in itself a stroke of genius.

So what has García Márquez done differently? Consider the next phrase: as he faced the firing squad—here’s where it all begins to get a little crazy. First of all, here we are, in the opening line of a novel (a long novel at that), and we’re side by side with a character who’s facing a firing squad. One moment: cosy armchair in your living room, maybe with a cup of tea next to you, say a crackling fire, the cat at your feet; the next moment: firing squad. Secondly, we are presented with knowledge of something which is usually unknowable, the moment of one’s own death. Immediately, we move into the slightly surreal territory of omniscience: that is to say, we keep one foot in reality, because, sure, a skeptic might reasonably point out that this character’s death is imminent enough for him to see it coming; but the other foot is not so firmly rooted to reality, because to foresee the moment of one’s own death is not part of our normal existence at all.

Next, we come to the was to remember, sometimes translated as would remember (había de recordar). Where are we in time? Are we at X or Y? The moment when the character does the remembering happens at time Y, but was to remember seems to place us somewhere outside time Y, even to suspend us outside time altogether.

At the end of the sentence, we find the X, which fulfills the many years later we encountered at the beginning: that distant afternoon when his father took him to discover ice. Later in the chapter, we find out that the ice is actually ice carried around in a cool-box by a gypsy, but in this opening sentence, it could be any ice—it could be a huge and ancient glacier. The fact that it is the father taking the son to “discover” something hints at something ancient, something further past even than the “distant afternoon,” and gives the impression of knowledge of previous generations rippling through time. (The Spanish is “conocer”, which is sometimes translated as “take to see for the first time”, but it also means “to meet” as in to meet a person, or “to become acquainted with” or “to be acquainted with.” “Discover” is certainly a very beautiful choice in English.)

So, by the end of the sentence, those first three words, many years later, quietly reverberate with added meaning, because they contain almost the sum total of this man’s life, a life that has nearly reached its end. There’s a pathos in those words, depicting a character in his final moments, at once reaching back across this great expanse of time—and that, in itself, is a transcendence of time—and possessing complete knowledge of his future. All this is captured in this single elegant sentence, this brief moment when he is able to see the whole stretch of his life, from end to end.


I’ve since gone on to read a little more of García Márquez, and it’s only through close reading of some of his other works that I’ve gained a better understanding of why this particular sentence made such an impression on me. My hypothesis is that García Márquez presents an alternate conception of time, of time not simply as something linear, but also as something never-ending.

In one sense, my teacher was, of course, quite right to say that time moves in one direction, forward and not back; none of us can ever go back to a past moment to do or undo, to say or unsay. It is a law that governs all our lives, and if fiction in some way is a reflection of what it is to be human, then fiction too must obey this law. Indeed, now that my eyes have been opened to this fact, I notice again and again that good fiction observes this law, and bad fiction ignores this law. García Márquez obeys. He does more than obey: he stands to attention, and salutes—and then he gives a little wink.

Because, simultaneously, he also manages to present an alternative notion of time, the one which I think is suggested by the tropical climate: time as infinite. Everywhere in his stories, we see notions of circularity, of endlessly repeating cycles. I have been through some of his works with a fine-tooth comb, and marked sentences according to whether they refer to future or past, and I have been not entirely surprised by the pattern that emerged, of future-past, future-past, future-past . . . the forward-back, forward-back of a ticking metronome. Just as he does in that first sentence of One Hundred Years of Solitude, often García Márquez writes as if there were a kind of parity to past and future, almost as if there were no forward arrow of time at all.

As a developing writer, I find García Márquez’s treatment of time fascinating and exciting. There are, perhaps, “rules” to good writing—and yet here is someone who did not confine himself to the space defined by these rules. These rules may have—unintentionally, invisibly—been defined by a certain portion of the world—and here they have been challenged by someone from a different part of the world, a part of the world that I myself come from. It gives me a certain confidence. It makes me feel as if anything is possible.

31 Books in 30 Days: John McWhorter on Chris Bonanos


In the 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC board member John McWhorter offers his appreciation of biography finalist Chris Bonanos’ Flash: The Making of Weegee the Famous (Henry Holt & Company).

*

Christopher Bonanos’ Flash: The Making of Weegee the Famous offers all of the pleasures and benefits that a solid biography should. It is, for one, the first biography of a figure recognized by a great many more people from passing mentions than from detailed coverage. We are surprised that the job hadn’t been done before, and gratified that it now has been, by Bonanos, City Editor at New York and thus a kind of culture vulture about town well-placed to picture and recreate the city in the way that Weegee would have perceived it.

Moreover, this book gets us behind the eyes of a person who left only so many documents of his inner thoughts to posterity, and who would largely have preferred that we keep our distance from them. Here Bonanos peels away layers of mythology and reveals the truths underneath, often as intriguing as the longstanding distortions. Along the way, the book serves as a primer on the emergence of an art form, journalistic photography, while in the bargain giving us a richer sense of Weegee’s art than the usual smattering of grisly little pics we encounter in meeting him on the fly.

Finally, Weegee’s life turns out to have been a great tale in its way: beginning in an immigrant poverty he seems to have largely pretended never existed, cresting in a certain renown among the general public and fervent respect from fellow and aspiring photographers and artists, and ending in a slow decline in which Weegee never found a viable Beethovenian “late stage” third act and lapsed into mannerism. At the prime of his life, he resided in near-flophouse conditions voluntarily, spending his nights on calls chasing down opportunities for saleable photos. Predictably, settled romantic relationships and even true friendships were elusive, but this meant little to him amidst his artist’s obsession with his trade.

However, a special pleasure of the book is watching Weegee help pave the way to journalistic photos of the vivid sort we now consider normal, in contrast to the stilted or barely readable ones typical of the press before the 1930s. Far from being a mere archive of the gangland casualty shots that circulate most widely, Weegee’s oeuvre gives us living persons at work and play in his era, with a relatability that makes all but a few media photos before him look like daguerreotypes.

At the end of the day, a biography must be readable, whatever the importance of its subject. Bonanos has written a page-turner about, of all people, a grubby loner scrambling around Manhattan taking pictures of usually humble and often dirtyish goings-on, usually after dark, and with a focus bordering on the compulsive. Some would have trouble getting a magazine article out of such a man, but Bonanos neatly makes Weegee’s life more viscerally interesting than any full-length portrayal of Ulysses S. Grant or even Franklin Roosevelt. One takes up Flash grateful that someone finally got to Weegee, and finishes it, almost surprised, feeling even more grateful.

__________________________________

John McWhorter is Associate Professor of English and Comparative Literature at Columbia University, teaching linguistics, Western Civilization and music history. He has written extensively on issues related to linguistics, race, and other topics for Time, The New York Times, CNN, the Wall Street Journal, The New Republic and elsewhere, and is a Contributing Editor for The Atlantic. He is the author of The Power of Babel, Doing Our Own Thing, Our Magnificent Bastard Tongue, The Language Hoax, Words on the Move, Talking Back, Talking Black, Losing the Race, and twelve other books, including three academic monographs and two academic article anthologies. The Teaching Company has released five of his audiovisual courses. He spoke at the TED conference in 2013 and 2016, hosts the Lexicon Valley language podcast at Slate, and has appeared regularly on Bloggingheads.TV since 2006.

Pearl Harbor Was Not the Worst Thing to Happen to the U.S. on December 7, 1941


December 7, 1941. Japanese planes appear over a naval base on O‘ahu. They drop aerial torpedoes, which dive underwater, wending their way toward their targets. Four strike the USS Arizona, and the massive battleship heaves in the water. Steel, timber, diesel oil, and body parts fly through the air. The flaming Arizona tilts into the ocean, its crew diving into the oil-covered waters. For a country at peace, this is a violent awakening. It is, for the United States, the start of the Second World War.

There aren’t many historical episodes more firmly lodged in national memory than this one, the attack on Pearl Harbor. It’s one of the few events that most people can put a date to (December 7, the “date which will live in infamy,” as Franklin Delano Roosevelt put it). Hundreds of books have been written about it—the Library of Congress holds more than 350. And Hollywood has made movies, from the critically acclaimed From Here to Eternity (1953) starring Burt Lancaster to the critically derided Pearl Harbor (2001) starring Ben Affleck.

But what those films don’t show is what happened next. Nine hours after Japan attacked the territory of Hawai‘i, another set of Japanese planes came into view over another US territory, the Philippines. As at Pearl Harbor, they dropped their bombs, hitting several air bases, to devastating effect.

The army’s official history of the war judges the Philippine bombing to have been just as disastrous as the Hawaiian one. At Pearl Harbor, the Japanese hobbled the United States’ Pacific fleet, sinking four battleships and damaging four others. In the Philippines, the attackers laid waste to the largest concentration of US warplanes outside North America—the foundation of the Allies’ Pacific air defense.

The United States lost more than planes. The attack on Pearl Harbor was just that, an attack. Japan’s bombers struck, retreated, and never returned. Not so in the Philippines. There, the initial air raids were followed by more raids, then by invasion and conquest. Sixteen million Filipinos—US nationals who saluted the Stars and Stripes and looked to FDR as their commander in chief—fell under a foreign power. They had a very different war than the inhabitants of Hawai‘i did.

Nor did it stop there. The event familiarly known as “Pearl Harbor” was in fact an all-out lightning strike on US and British holdings throughout the Pacific. On a single day, the Japanese attacked the US territories of Hawai‘i, the Philippines, Guam, Midway Island, and Wake Island. They also attacked the British colonies of Malaya, Singapore, and Hong Kong, and they invaded Thailand.

It was a phenomenal success. Japan never conquered Hawai‘i, but within months Guam, the Philippines, Wake, Malaya, Singapore, and Hong Kong all fell under its flag. Japan even seized the westernmost tip of Alaska, which it held for more than a year.

Looking at the big picture, you start to wonder if “Pearl Harbor”—the name of one of the few targets Japan didn’t invade—is really the best shorthand for the events of that fateful day.

But though those embassies were outposts of the United States, there was little public sense that the country itself had been harmed.

*

“Pearl Harbor” wasn’t how people referred to the bombings, at least not at first. How to describe them, in fact, was far from clear. Should the focus be on Hawai‘i, the closest target to North America and the first bit of US soil Japan had struck? Or should it be the Philippines, the far larger and more vulnerable territory? Or Guam, the one that surrendered nearly immediately? Or all the Pacific holdings, including the uninhabited Wake and Midway, together?

“The facts of yesterday and today speak for themselves,” Roosevelt said in his address to Congress—his “Infamy” speech. But did they? “Japs Bomb Manila, Hawaii” was the headline of a New Mexico paper; “Japanese Planes Bomb Honolulu, Island of Guam” was that of one in South Carolina. Sumner Welles, FDR’s undersecretary of state, described the event as “an attack upon Hawaii and upon the Philippines.” Eleanor Roosevelt used a similar formulation in her radio address on the night of December 7, when she spoke of Japan “bombing our citizens in Hawaii and the Philippines.”

That was how the first draft of FDR’s speech went, too. It presented the event as a “bombing in Hawaii and the Philippines.” Yet Roosevelt toyed with that draft all day, adding things in pencil, crossing other bits out. At some point he deleted the prominent references to the Philippines and settled on a different description. The attack was, in his revised version, a “bombing in Oahu” or, later in the speech, “on the Hawaiian Islands.” He still mentioned the Philippines, but only as an item on a terse list of Japan’s other targets: Malaya, Hong Kong, Guam, the Philippines, Wake Island, and Midway—presented in that order. That list mingled U.S. and British territories together, giving no hint as to which was which.

Why did Roosevelt demote the Philippines? We don’t know, but it’s not hard to guess. Roosevelt was trying to tell a clear story: Japan had attacked the United States. But he faced a problem. Were Japan’s targets considered “the United States”? Legally, yes, they were indisputably U.S. territory. But would the public see them that way? What if Roosevelt’s audience didn’t care that Japan had attacked the Philippines or Guam? Polls taken shortly before the attack show that few in the continental United States supported a military defense of those remote territories.

Consider how similar events played out more recently. On August 7, 1998, al-Qaeda launched simultaneous attacks on U.S. embassies in Nairobi, Kenya, and Dar es Salaam, Tanzania. Hundreds died (mostly Africans), and thousands were wounded. But though those embassies were outposts of the United States, there was little public sense that the country itself had been harmed. It would take another set of simultaneous attacks three years later, on New York City and Washington, D.C., to provoke an all-out war.

An embassy is different from a territory, of course. Yet a similar logic held in 1941. Roosevelt no doubt noted that the Philippines and Guam, though technically part of the United States, seemed foreign to many. Hawai‘i, by contrast, was more plausibly “American.” Though it was a territory rather than a state, it was closer to North America and significantly whiter than the others. As a result, there was talk of eventual statehood (whereas the Philippines was provisionally on track for independence).

Yet even when it came to Hawai‘i, Roosevelt felt a need to massage the point. Though the territory had a substantial white population, nearly three-quarters of its inhabitants were Asians or Pacific Islanders. Roosevelt clearly worried that his audience might regard Hawai‘i as foreign. So on the morning of his speech, he made another edit. He changed it so that the Japanese squadrons had bombed not the “island of Oahu,” but the “American island of Oahu.” Damage there, Roosevelt continued, had been done to “American naval and military forces,” and “very many American lives” had been lost.

An American island, where American lives were lost—that was the point he was trying to make. If the Philippines was being rounded down to foreign, Hawai‘i was being rounded up to “American.”

“Yesterday, December 7, 1941—a date which will live in infamy—the United States of America was suddenly and deliberately attacked by naval and air forces of the Empire of Japan” is how Roosevelt’s speech began. Note that in this formulation Japan is an “empire,” but the United States is not. Note also the emphasis on the date. It was only at Hawai‘i and Midway, of all Japan’s targets, that the vagaries of the international date line put the event on December 7. Everywhere else, it occurred on December 8, the date the Japanese use to refer to the attack.

Did Roosevelt underscore the date in a calculated attempt to make it all about Hawai‘i? Almost certainly not. Still, his “date which will live in infamy” phrasing further encouraged a narrow understanding of the event, one that left little room for places like the Philippines.

For Filipinos, this could be exasperating. A reporter described the scene in Manila as the crowds listened to Roosevelt’s speech over the radio. The president spoke of Hawai‘i and the many lives lost there. Yet he only mentioned the Philippines, the reporter noted, “very much in passing.” Roosevelt made the war “seem to be something close to Washington and far from Manila.”

This was not how it looked from the Philippines, where air-raid sirens continued to wail. “To Manilans the war was here, now, happening to us,” the reporter wrote. “And we have no air-raid shelters.”

If the Philippines was being rounded down to foreign, Hawai‘i was being rounded up to “American.”

*

Hawai‘i, the Philippines, Guam—it wasn’t easy to know how to think about such places or even what to call them. At the turn of the twentieth century, when many were acquired (Puerto Rico, the Philippines, Guam, American Samoa, Hawai‘i, Wake), their status was clear. They were, as Theodore Roosevelt and Woodrow Wilson unabashedly called them, colonies.

Yet that spirit of forthright imperialism didn’t last. Within a decade or two, after passions had cooled, the c-word became taboo. “The word colony must not be used to express the relationship which exists between our government and its dependent peoples,” an official admonished in 1914. Better to stick with a gentler term, used for them all: territories.

It was gentler because the United States had had territories before, such as Arkansas and Montana. Their place in the national firmament was a happy one. The western territories were the frontier, the leading edge of the country’s growth. They might not have had all the rights that states did, but once they were “settled” (i.e., populated by whites), they were welcomed fully into the fold as states.

But if places like the Philippines and Puerto Rico were territories, they were territories of a different sort. Unlike the western territories, they weren’t obviously slated for statehood. Nor were they widely understood to be integral parts of the nation.

A striking feature, in fact, of the overseas territories was how rarely they were even discussed. The maps of the country that most people had in their heads didn’t include places like the Philippines. Those mental maps imagined the United States to be contiguous: a union of states bounded by the Atlantic, the Pacific, Mexico, and Canada.

That is how most people envision the United States today, possibly with the addition of Alaska and Hawai‘i. The political scientist Benedict Anderson called it the “logo map”: if the country had a logo, this shape would be it.

The problem with the logo map, however, is that it isn’t right. Its shape doesn’t match the country’s legal borders. Most obviously, the logo map excludes Hawai‘i and Alaska, which became states in 1959 and now appear on virtually all published maps of the country. But it’s also missing Puerto Rico, which, though not a state, has been part of the country since 1899. When have you ever seen a map of the United States that had Puerto Rico on it? Or American Samoa, Guam, the U.S. Virgin Islands, the Northern Marianas, or any of the other smaller islands the United States has annexed over the years?

In 1941, the year Japan attacked, a more accurate picture would have been this:

[Map: the inhabited territories of the Greater United States, 1941]

What this map shows is the country’s full territorial extent: the “Greater United States,” as some at the turn of the twentieth century called it. In this view, the place normally referred to as the United States—the logo map—forms only a part of the country. A large and privileged part, to be sure, yet still only a part. Residents of the territories often call it the “mainland.”

I’ve drawn this map to show the inhabited parts of the Greater United States at the same scale and with equal-area projections. So Alaska isn’t shrunken down to fit into a small inset, as it is on most maps. It’s the right size—i.e., it’s huge. The Philippines, too, looms large, and the Hawaiian island chain—the whole chain, not just the eight main islands shown on most maps—if superimposed on the mainland would stretch almost from Florida to California.

This map also shows territory at the other end of the size scale. In the century before 1940, the United States claimed nearly a hundred uninhabited islands in the Caribbean and the Pacific. Some claims were forgotten in time—Washington could be surprisingly lax about keeping tabs. The twenty-two islands I’ve included are the ones that appeared in official tallies (the census or other governmental reports) in the 1940s. I’ve represented them as clusters of dots in the bottom left and right corners, though they’re so small that were I to draw them to scale, they’d be invisible.

When it came to strategy, those dots mattered.

Why include them at all? Was it important that the United States possessed, to take one example, Howland Island, a bare plot of land in the middle of the Pacific, only slightly larger than Central Park? Yes, it was. Howland wasn’t large or populous, but in the age of aviation, it was useful. At considerable expense, the government hauled construction equipment out to Howland and built an airstrip there—it’s where Amelia Earhart was heading when her plane went down. The Japanese, fearing what the United States might do with such a well-positioned airstrip, bombed Howland the day after they struck Hawai‘i, Guam, Wake, Midway, and the Philippines.

When it came to strategy, those dots mattered.

The logo map excludes all that—large colonies and pinprick islands alike. And there is something else misleading about it. It suggests that the United States is a politically uniform space: a union, voluntarily entered into, of states standing on equal footing with one another. But that’s not true, and it’s never been true. From the day the treaty securing independence from Britain was ratified, right up to the present, it’s been a collection of states and territories. It’s been a partitioned country, divided into two sections, with different laws applying in each.

The United States of America has contained a union of American states, as its name suggests. But it has also contained another part: not a union, not states, and (for most of its history) not wholly in the Americas.

*

This is not a country that has kept its hands to itself.

The proposition that the United States is an empire is less controversial today. The leftist author Howard Zinn, in his immensely popular A People’s History of the United States, wrote of the “global American empire,” and his graphic-novel spin-off is called A People’s History of American Empire. On the far right, the politician Pat Buchanan has warned that the United States is “traveling the same path that was trod by the British Empire.” In the vast political distance between Zinn and Buchanan, there are millions who would readily agree that the United States is, in at least some sense, imperial.

The case can be made in a number of ways. The dispossession of Native Americans and relegation of many to reservations was pretty transparently imperialist. Then, in the 1840s, the United States fought a war with Mexico and seized a third of it. Fifty years later, it fought a war with Spain and claimed the bulk of Spain’s overseas territories.

Empire isn’t just landgrabs, though. What do you call the subordination of African Americans? In W.E.B. Du Bois’ eyes, black people in the United States looked more like colonized subjects than like citizens. Many other black thinkers, including Malcolm X and the leaders of the Black Panthers, have agreed.

Or what about the spread of U.S. economic power abroad? The United States might not have physically conquered Western Europe after World War II, but that didn’t stop the French from complaining of “coca-colonization.” Critics there felt swamped by U.S. commerce. Today, with the world’s business denominated in dollars and McDonald’s in more than a hundred countries, you can see they might have had a point.

Then there are the military interventions. The years since the Second World War have brought the U.S. military to country after country. The big wars are well-known: Korea, Vietnam, Iraq, Afghanistan. But there has also been a constant stream of smaller engagements. Since 1945, U.S. armed forces have been deployed abroad for conflicts or potential conflicts 211 times in 67 countries. Call it peacekeeping if you want, or call it imperialism. But clearly this is not a country that has kept its hands to itself.

Yet in all the talk of empire, one thing that often slips from view is actual territory. Yes, many would agree that the United States is or has been an empire, for all the reasons above. But how much can most people say about the colonies themselves? Not, I would wager, very much.

And why should they be able to? Textbooks and overviews of U.S. history invariably feature a chapter on the 1898 war with Spain that led to the acquisition of many of the territories and the Philippine War that followed it (“the worst chapter in almost any book,” one reviewer griped). Yet, after that, coverage trails off. Territorial empire is treated as an episode rather than a feature. The colonies, having been acquired, vanish.

It’s not as if the information isn’t out there. Scholars, many working from the sites of empire themselves, have assiduously researched this topic for decades. It’s just that when it comes time to zoom out and tell the story of the country as a whole, the territories tend to fall away. The confusion and shoulder-shrugging indifference that mainlanders displayed at the time of Pearl Harbor haven’t changed much at all.

Ultimately, the problem isn’t a lack of knowledge. The libraries contain literally thousands of books about U.S. overseas territory. The problem is that those books have been sidelined—filed, so to speak, on the wrong shelves. They’re there, but so long as we’ve got the logo map in our heads, they’ll seem irrelevant. They’ll seem like books about foreign countries.

__________________________________

From How to Hide an Empire: A History of the Greater United States. Used with permission of Farrar, Straus & Giroux. Copyright © 2019 by Daniel Immerwahr.

Eula Biss: “A book I can’t defend, a book I can’t renounce.”


It’s been ten years this month since my essay collection Notes from No Man’s Land was published. The book was never finished, it seems to me now, or maybe I was never finished with it. Still, I distinctly remember feeling that I had exhausted my abilities in those essays and could take them no further.

After spending my twenties asking myself what it meant to be white—in my family, among American followers of an African religion, as a teacher in New York City, as a reporter for an African American newspaper in San Diego, as a tourist in Mexico, as a graduate student in Iowa, as a new arrival to a gentrifying neighborhood in Chicago—I was in my thirties and pregnant with my first child. It was time for the book to be finished. When I stopped reworking the essays, I stopped tracking the movement of my mind on the page, but my mind kept moving and the world kept turning. White supremacy, once a term that I rarely saw applied to our contemporary American condition, has been making increasing appearances in major newspapers over the past ten years. And white privilege is all over the internet now. In this context, the book feels new again, and newly unfinished.

I remember holding the page proofs of the book in my lap while I watched Barack Obama’s speech at the Democratic National Convention in 2008. It seemed, in that moment, that I’d written the wrong book. Obama was not a legitimate citizen of this country, his critics were contending, and his wife, they would later say, was a monkey. They’re going to kill him, my husband murmured during the long applause after that speech. That applause was a preamble to the applause white liberals would give themselves after Obama’s election, as if the work of righting this country’s racial wrongs was a job well done. They did not kill him, as it turns out; they assassinated his policy. And this they, the they who did not kill Obama and the they who applauded him, were, like me, white.

Late that summer, while I was reviewing the copyeditor’s corrections to the final proofs, I was following a series of op-eds that argued Obama was either too black to be elected or too white to be a black president. I feared that I had written the wrong book because I wasn’t interested in whether Obama was too black or too white, but in what exactly it meant to be white.

In February 2009, the first month of Obama’s first term, I went into a bookstore to buy a copy of my newly released book. But after searching the table of new nonfiction and the shelves of essays and the shelves of autobiographies, I couldn’t find it, so I asked an employee for help. It’s in African American history, she told me, consulting the computer. I felt uneasy as I pulled my book out from in between Elijah Anderson’s Code of the Street and Douglas Blackmon’s Slavery by Another Name. African American history is where I went to learn about white America, but what I brought to the page of that history seemed too incomplete, and too much in service to my own project, to qualify me for a place on this shelf. I took my shelving issue, which was also something of an identity issue, to the white employee of the bookstore, which, as I now know, was about to go out of business. The book, I told her tentatively, having never put it this way before, was really about whiteness. “We don’t have a shelving category for that,” she said, in a tone of voice that added of course.

That whiteness was not widely considered a meaningful category was part of what made writing about it, from within it, so difficult. Whiteness was everywhere and nowhere. Nowhere in the sense that white people never seemed to talk about it. Everywhere in the sense that there was a shelf for African American history and the rest was just history. A friend of mine once joked, in response to the arrival of whiteness studies at her university, “Why do we need whiteness studies when we’ve already got art history?” My research into whiteness brought me to a study that haunts me still, a study that suggested every major life decision a white person makes—where to live, where to work, where to send her children to school—is likely to involve race as a deciding factor, though that white person is unlikely to recognize the role race plays in her life.

During the years immediately following the publication of Notes from No Man’s Land, I was very often asked some variation of the question, “What made you think you could say something about race?” I understood that I was being asked on what authority I had written this book. I had no authority, only my experience and the questions raised by that experience. “Essays about one person’s affective experience have, by their very nature, not a leg to stand on,” writes Zadie Smith. “All they have is their freedom. And the reader is likewise unusually free, because I have absolutely nothing over her, no authority.”

“More” remains the single most important word of editorial advice I have ever received.

In February 2009, I was one week from giving birth and still sweating from the effort of walking my heavy body down Navy Pier for an interview with Chicago Public Radio when I was asked, “What is the future of race in America?” I found that moment funny, as excruciating as it was. My interrogator was Richard Steele, an older black man, and as I looked across the studio at him I thought, “Fair enough. I deserve this question. But both you and I know that I can’t answer it.”

*

It would, if I had pitched it as a work of immersion journalism, have looked like an ambitious plan for a book. To move ten times in ten years, to live on the east coast, the west coast, and in the Midwest, working for a parks department and then a temp agency and then for an elite university, all the time asking myself, as I moved through one community after another, what my race meant—to me, to the people around me, and to our country.

But I didn’t pitch the book because I didn’t know I was writing a book. I was just writing essays as they came to me. When Jeff Shotts at Graywolf Press expressed interest in my work, I sent him every essay I’d ever written and he, in his gracious way, let me know that this was a miscellany, not a collection. “Find a theme,” he suggested. I shuffled the essays and sent them all back to him again, telling him the theme was “music, pain, and love.” He laughed and asked, “What else is there in life?” I still hadn’t found a theme. I removed a number of the essays, including one of my best, and read through them a few more times before it occurred to me that racial identity might be the theme of this unfinished collection. And then Jeff told me to write more. “More” remains the single most important word of editorial advice I have ever received.

I was not a journalist but an itinerant artist, though I was no less embedded in my work. I was writing to make sense of my life. The questions I was asking myself took the form of essays, the literary essay being an expansive genre that generously holds all kinds of quandaries, uncertainties, transgressions, and imperfect efforts. The essay is critical but not comprehensive. It is a refuge for unorthodox history and irregular philosophy. The essay is dedicated to the marginal. And, as Shamala Gallagher has suggested, it is the rightful domain of the marginalized.

I wasn’t marginalized by my race, but I wrote my essays from the margins. Wherever I moved, I lived at the end of the line, the last stop on the train. I had very little income for most of my twenties and no health insurance. Beyond my constant contact with other people who lived on the margins, I wasn’t well connected or particularly well informed. I followed the news, but it read like distant dispatches from a center of money and power that I didn’t inhabit. In Notes from No Man’s Land, the major news events of those years—9/11 and the war in Iraq and Hurricane Katrina—appear only briefly, and usually as the backdrop to a drama taking place outside the reach of the national news.

When I wrote in response to the news, I was often drawing from the back pages, from the story of a Long Island woman who, after a mix-up at a fertility clinic, had given birth to a black baby and a white baby who were technically twins but unrelated to each other. Or I was drawing from local newspapers like the Voice and Viewpoint, which ran a series of stories about black women whose children were taken from them by white social workers. I was drawing from documents in the Iowa Women’s Archive that told the history of Buxton, an integrated town in rural Iowa with a population that was just over half black in 1900. An entire book had already been written about Buxton by the time I did that research, but I had never heard of it. As I would discover, most of the history of race in America was history I had not heard. Last year, when I picked up Carol Anderson’s White Rage: The Unspoken Truth of Our Racial Divide, the first thought that occurred to me was that the truth of our racial divide is not unspoken, it’s just unheard.

The essays in Notes from No Man’s Land were an effort to write myself out of my own ignorance. If the narrator of those essays seems impossibly naïve at times, that’s because she was. True ignorance and false innocence were what I was writing against. I wanted to undo my ignorance and, in the process, refuse any claims to innocence that I might be tempted to make. I was trying to grow up, in other words. Those of us who are white are always in danger of remaining eternally children, in so far as we believe ourselves to be inherently innocent. And as long as we remain children, we live protected lives of helpless confusion while we burden other people with our care. “For black people,” Hilton Als writes, “being around white people is sometimes like taking care of babies you don’t like, babies who throw up on you again and again, but whom you cannot punish, because they’re babies.”

*

In the ten years between when Notes from No Man’s Land was first published and when it was reissued this month, I became a mother to a child who can now ride a bicycle and read Christopher Paul Curtis’s The Watsons Go to Birmingham—1963. I stopped moving, I held a steady teaching job, I wrote another book, I bought a house, and I planted a garden. I sat on the board of advisors for a nonprofit preschool and volunteered at the local elementary school. I attended city council meetings and school board meetings. I grew up, in a way. And I rejoined the middle class, which can serve as a blind for children who are hunting the kind of adulthood that is defined by financial security.

These past ten years were the years of the Charleston church shooting and the Ferguson demonstrations and the killing of a protestor at a white nationalist rally in Charlottesville, and these were the years of Black Lives Matter. A litany of murders made the national news, a list of names that is always incomplete and begins and ends with an ellipsis, as in… Trayvon Martin, Dontre Hamilton, Eric Garner, Michael Brown, Akai Gurley, Laquan McDonald, Tamir Rice, John Crawford, Freddie Gray, Samuel Dubose, Jamar Clark, Alton Sterling, Philando Castile, Sylville Smith, Keith Lamont Scott, Terrence Crutcher, Alfred Olango… Video footage replayed death after death as names were added to this list from every city I had ever lived in, and every city I had ever visited. The litany felt to me like a continuation of the list of lynchings in the first essay of Notes from No Man’s Land. As Angela Davis says, “There is an unbroken line of police violence in the United States that takes us all the way back to the days of slavery, the aftermath of slavery, the development of the Ku Klux Klan.”

I was no longer the person who had written this book. I could write a new book, but I couldn’t re-inhabit my younger mind, I couldn’t think backwards, and I couldn’t fill the gap between what I knew then and what I know now.

I did not fully appreciate how unbroken that violence was when I placed my essay about lynchings in a section of my book titled “Before.” But in 2014 I felt a nauseating sense of familiarity as Michael Brown’s murder was covered in the news, the particulars echoing the New York Times articles from between 1880 and 1920 that detailed 2,354 lynchings, articles that underscored the criminality of the hanged man, suggesting, without saying so, that the man was more criminal than the hanging. After Ferguson, when I opened my book to the first section, I knew that I had made a mistake. Lynching is our before and our now. Jim Crow, as Michelle Alexander reminds us, is not in the past. We have a new Jim Crow. And Obama’s presidency didn’t change that. Our current state of stasis was summed up by a man on the street in Chicago, who remarked to a local radio reporter the day after Trump was elected, “I was a black man in America yesterday and I’m a black man in America today.”

In my copy of Notes from No Man’s Land, the first essay is covered in marks and corrections accumulated over years of reading it at universities and conferences and bookstores. I do not, after all these years, find it any easier to read that list of lynchings aloud, to put that history in my mouth. My slashes and cross-outs and notations in the margins are evidence of my ongoing arguments with the essay, which, among other omissions, does not name the victims of the lynchings it recounts.

The possibility that I might correct all my mistakes for the tenth-anniversary reissue of Notes from No Man’s Land was an appealing fantasy that dissolved almost as soon as I opened the book. I was a few pages in, with my pencil ready, before I understood that I had to change everything or nothing. I was no longer the person who had written this book. I could write a new book, but I couldn’t re-inhabit my younger mind, I couldn’t think backwards, and I couldn’t fill the gap between what I knew then and what I know now. My writer’s impulse, ever impractical, was to tear up the essays and start over. But Jeff had advised me to avoid making major changes. So I erased a few sentences and corrected a few typos. I replaced the word problematic, which I’ve come to dislike, with the word false. As I read through the book from beginning to end for the first time in ten years, I flickered between mortification and pride. There were moments when I felt amused by my former self, and then frustrated, and then chilled, and then edified, having learned from my own writing something I had once known but needed to learn again. I did not feel, in the end, an uncomplicated sense of accomplishment. I felt instead that I had undergone an uncanny revisitation—a return to a place I had left behind but could never forget.

I still love the book. It’s a book I can’t defend, and a book I can’t renounce. It has its faults, I know. It discusses race, as critics have observed, almost entirely in terms of black and white. But the book has its reasons. The symbolic opposition embedded in the words black and white, the opposition harnessed by the Black Power activists of the 1960s who preferred black to the more polite negro, is the opposition I was trying to think through in my life. My repeated return to the lived experience of everyday politics between blacks and whites was not just an homage to black activism and its lasting legacy, it was my way of marking an ongoing economic relationship that is still at its most unequal and least reciprocal between black and white Americans.

Both Asian Americans and Hispanics—categories used by a study of income inequality recently covered in the New York Times—occupy a different economic reality than African Americans. That study traced the fortunes of 20 million children from my generation and found that Asian Americans and Hispanics enjoyed more upward mobility than African Americans. The only group with as little upward mobility as African Americans was Native Americans. But African American boys, specifically, grew up to face the widest income gap compared to white men. The white Americans in the study who were raised in the middle class, as I was, were more likely to move into the upper middle class than to fall into the lower middle class. After spending my twenties doing nothing that would seem to promise a place in the upper middle class, after writing three books and teaching for fifteen years, I’m uncomfortably aware of the possibility that I owe my current economic position more to my race than to my work.

The book isn’t finished. It’s not done. And neither am I. “It may stop,” William Carlos Williams wrote of the essay, “but if it stops that is surely the end and so it remains perfect, just as with an infant who fails to continue…. Whatever passes through it, it is never that thing. It remains itself and continues so, pure motion.” Unless an essay is moving and changing under my hand, it’s dead to me. I’ve always regarded publication as a process of petrification. But in my return to Notes from No Man’s Land I’ve found that time has changed the work without my hand. Time has opened new meanings, new frictions, new problems, and new questions in these essays. And so the book isn’t perfect, like an infant, but imperfect, like an adolescent still in the process of growing up.

*
Read an excerpt from Notes from No Man’s Land here.

John Freeman, Aminatta Forna, and Eula Biss will be in conversation at Georgetown on February 26.

31 Books in 30 Days: Kate Tuttle on Nora Krug


In this 31 Books in 30 Days series leading up to the March 14, 2019 announcement of the 2018 National Book Critics Circle award winners, NBCC board members review the thirty-one finalists. Today, NBCC president Kate Tuttle offers her appreciation of autobiography finalist Nora Krug’s Belonging: A German Reckons with History and Home (Scribner).

*

Both of Nora Krug’s parents were born in 1946 into a Germany still reeling from war; little wonder that they rarely spoke of it when Krug herself was growing up. Casting the Nazi years into a distant and disdained past was a typical German coping mechanism, so thoroughly ingrained that when Krug begins to wonder what role her own ancestors played in the regime, she’s brought up short to realize just how recent the history really is.

Krug and her classmates were trained to understand the Holocaust, to interrogate the evils of Hitlerism. For a generation steeped in self-examination, there was an awful lot, it turned out, that they didn’t look at. “We prepared questions for the old women who traveled from America to tell us about the camps,” she writes, “but we never thought to ask about one another’s grandparents.”

In Belonging, Krug blends text and images into a kind of roadmap taking her back to a homeland that both comforts and confuses her. The visual components of the book range from drawings to cartoons to archival letters and photographs; married to Krug’s words, they create a stunningly effective, often moving portrait of Krug’s memories and her exploration of the people who came before her. There’s her father’s older brother, whose name her father inherited after the brother’s death in Italy as a teenaged soldier in Hitler’s army. There are fractured families and hard stories on both sides. Most affectingly, Krug’s investigation into her ancestors leads her to cousins, contemporaries, with whom she can begin to repair the old rifts.

Always lurking is the question: were my ancestors evil? Were they complicit? Were they merely surviving? Were they, perhaps, even heroic? One is rumored to have helped local Jewish families; another reputedly spoke out against the Nazis. As Krug circles closer and closer to the central facts, she’s forced to face the likelihood that these comforting stories are yet another way of deflecting a reality too painful to live with.

Throughout, Krug wrestles with the German idea of “Heimat,” a loose and perhaps untranslatable term for the landscape from which a person springs. What if your Heimat is a site of collective guilt? What if you also, after moving away and marrying an American Jewish man, miss it terribly? Sprinkled throughout the book are “things German,” lovingly sketched “from the notebook of a homesick émigré”—a particular brand of bandage, a red rubber hot-water bottle—and it is in holding up these homely items that Belonging feels most tender of all. A place, and a people, can contain multitudes, from the most evil to the most beloved.

__________________________________

Kate Tuttle is president of the National Book Critics Circle. Her reviews, as well as profiles of literary figures ranging from Salman Rushdie to Leslie Jamison, have appeared in the Los Angeles Times, New York Times, Washington Post, and Newsday. She writes a weekly column about books and authors for the Boston Globe. Her essays on childhood, race, and politics have appeared in DAME, Salon, The Rumpus, and elsewhere. 
