theatlantic.com
Poem in Autumn
Illustrations by Miki Lowe

May Sarton was a novelist and an avid keeper of journals, but she considered herself a poet above all else. Novels and journals, she said in 1983, are concerned with growth over time, but “the poem is an essence … it captures perhaps a moment of violent change but it captures a moment.” In “Poem in Autumn,” she seizes just that: fall’s fleeting turning point between a memory of warmth and the cold’s inevitable creep. In that suspended instance, she sees the leaves, “touched by death,” take on a shining gold.

In the first stanza, we know death to mean the coming winter. The leaves won’t survive it—they’ll shrivel and fall—but they burst with vivid color on their way out, almost as if they’re aware that time is slipping away. In the second stanza, though, Sarton is no longer talking about foliage: Now it is we, human beings, who are touched by death. We know the end is coming, and that knowledge changes something in us—our senses are heightened, our heartbeats amplified, our grief transmuted into radiance. She is capturing a moment of change, yes—but a moment can last a minute, a season, or a lifetime.

— Faith Hill
Palestine Isn’t Ferguson
In the imagination of the Christian West, Jews have been forced to fill every role. For 2,000 years, they have been seen as the ultimate shape-shifters: craven, feeble, abject, weak, and humiliated, but also powerful, conspiratorial, and demonic. They are the prime, indeed fatal, danger to the societies in which they live: arch-capitalists and arch-revolutionaries. Jews are a symbol, a metaphor, an essence. So it should come as no surprise that the state of the Jewish people, where almost half of the world’s Jews live, is also viewed in this way. Israel is both an obsession and an abstraction—as the Jewish people have been for much of Western history.

Israel is unusual in that it existed as an idea before it existed as a nation-state. Today, it is also unusual, even remarkable, for lacking internationally recognized borders—an indispensable marker of sovereignty—and for decades it has been depriving Palestinians in the occupied territories of political rights and freedom. “After 1967, Israel stopped becoming a normal nation-state,” Arnon Degani, a Hebrew University history professor who is a member of the anti-occupation veterans’ group Breaking the Silence, told me recently. “Time passed on, and Israel becomes more and more abnormal.” Leftist Israelis—many of whom define themselves as Zionists—call the occupation criminal, atrocious, unbearable; their critique is broader, and deeper, than most of what you read or hear in the United States. As a result of the occupation, the literary critic Nissim Calderon told me, “Wider and wider circles of life, both for Israelis and Palestinians, become infected with cruelty.”

But the peculiar ways in which Israel has been historically viewed—and the ways in which, in the most recent Israel-Hamas war, it was depicted as an almost metaphysical evil—have deeper, and other, roots.
“The reality of Israel is, in large measure, a projection of fantasies, both by those who want to love the place and those who are consumed by hatred for it,” wrote the Israeli American writer Joel Schalit. Or, as Etan Nechin, an Israeli journalist who edits The Bare Life Review, a journal of immigrant and refugee literature, argues, “The left thinks that Israel exists only on a highly ideological-political level. There are no people in it. It’s only a tabula rasa.”

[Matti Friedman: Israel’s problems are not like America’s]

Any useful analysis of the Israeli-Palestinian conflict requires engaging with an unresolved, frustratingly complex, grievously resilient struggle between two national movements, each with a justified claim to the land. Once that effort is abandoned, a vacuum ensues. It is filled by the transformation of a country into a metaphor; by the rewriting (or ignoring) of history; by Manichean thinking; and by the conversion of language into a means of performance rather than a description of reality.

Leftist theorists have a long tradition of turning the Jewish people into an abstraction. In his 1843 essay “On the Jewish Question,” a very young Karl Marx wrote that, because Judaism’s essence was “practical need, selfishness … haggling and money,” a truly free world would entail “the emancipation of humanity from Judaism.” Some—including the Marxist philosopher with whom I live—argue that this essay isn’t anti-Semitic, because Marx wasn’t addressing actual Jews but rather Jews as the symbolic essence of capitalism. But this is precisely the point, and the problem: As many racial-justice theorists have pointed out, transforming a people into a concept is an act of dehumanization.

Since Zionism’s inception, the left—following Marx—has often projected its fixations onto Israel and the state’s political conflicts, and thereby sorely misunderstood them.

Hannah Arendt and Arthur Koestler were each, in their different ways, exemplars of this propensity.
Both traveled to the pre-state Yishuv (and then to Israel), both had extremely conflicted attitudes toward Zionism and Israel, both can be categorized as having been, at various times, Zionists and anti-Zionists. Arendt was a rhapsodic supporter of the Yishuv, though she opposed partition, hated David Ben-Gurion, and was a fierce critic of the Zionist movement. Her fears that Israel would devolve into ethnic nationalism, and would find itself in constant conflict with its Arab neighbors, proved astute—painfully so. But she rejected the prism of either colonialism or imperialism. Instead, she perceived that the early Zionists had created something new: History is not merely a series of repetitions. “The building of a Jewish national home was not a colonial enterprise in which Europeans came to exploit foreign riches … at the expense of native labor,” she wrote. The Yishuv “could not possibly fit into the political scheme of imperialism because it was neither a master nor a subject nation.”

Yet Arendt also tended to view the new state through the catastrophic lens of German history. Visiting Israel for Adolf Eichmann’s trial in 1961, she wrote, “The parallels are fatal, particularly in the details.” Actually, it’s hard to think of two nations that were—and are—less parallel than Germany and Israel. This mistaken identification led her to prophesy a series of disasters for Israel that were wide of the mark. Statehood, she insisted in 1948, would lead to the Yishuv’s collapse, though the opposite proved true; later she warned of a possible military dictatorship, isolation, cultural sterility, and domination by the Soviet Union. And her understanding of Palestinian politics—an essential part of the equation—was virtually nil.

Arthur Koestler—fervent Communist, fervent anti-Communist—became enamored of militaristic Revisionist Zionism as a university student, and in 1926 he had a brief, unhappy stint at a Zionist commune in Palestine.
(In fact, his comrades expelled him.) Like Arendt, Koestler transferred the traumatic European politics of the interwar period—especially its leftist politics—onto Israel. This meant that he misunderstood quite a lot. He viewed the strife between Labor Zionists and Revisionists as a replay of the deadly Stalinist-Trotskyist antagonism of the Spanish Civil War. He believed that Hebrew (which he failed to master) would separate Israelis from European culture and prove intellectually sterile. He charged Ben-Gurion with establishing a “totalitarian” regime, and compared what he called “Haganahism” to Nazism and Stalinism. In his view, interest in Israel would wane: He predicted that 50 years after its founding, “few will take an interest” in Israel’s birth or would dispute partition, and that Israel would ultimately “become an entirely ‘un-Jewish’ country”—a prospect of which he highly approved. He turned out to be far less prescient than Arendt.

In the late 1960s and early 1970s, a new theory emerged among leftist anti-Zionists: Jews who had fled their homes in the Arab world—Iraq, Yemen, Morocco, and elsewhere—would unite with Palestinians to overthrow the presumably oppressive Zionist state and establish … well, something else. These leftist activists assumed a natural—that is, ethnic—affinity between Palestinians and Jews from the Arab world. After all, both were apparently non-European (or, in today’s parlance, “people of color”). The theory proved catastrophically wrong, because it ignored the discrimination—and, sometimes, violence—that Jews had experienced in Arab countries, and the enmities that led many of their Muslim Arab neighbors to drive them out. Today, Arab countries have virtually no Jewish citizens, and Mizrahi Israelis constitute a key part of the Israeli right’s base.

[Micah Goodman: How to shrink the Israeli-Palestinian conflict]

When a country—or a people—is treated as a blank canvas, almost anything can be painted onto it.
Israel’s victory in the 1967 War—which birthed the occupation—transformed the country, in the eyes of the global left, into the colonialist, imperialist, racist, even fascist monster of the Middle East: “the new Shylock of the non-aligned world,” as the socialist-Zionist Simha Flapan wrote at the time. This was true in both Europe and America. Just two months after the war—when there was still free movement between the conquered territories and Israel, when there were virtually no settlements, and when the occupation was far from certain—the Student Nonviolent Coordinating Committee accused Israelis of “imitating their Nazi oppressors.”

This tendency to veer from vehement but rational political criticism to grotesquely engorged vilification was most extreme in the theory and practice of the West German New Left. The tormented descendants of the Auschwitz generation aligned with Palestinian terror groups, and—irony of ironies—designated Israel as the fascist, genocidal successor to the Third Reich. The Israeli German historian Dan Diner has termed this bizarre equation of Germans and Jews an “exonerating projection”: an attempt to normalize Nazism by transposing it onto its victims.

Something similar is happening with the delegitimizing charges of “imperialism” and “settler colonialism” that some members of today’s left in Europe and the U.S. hurl against Israel, the historian Benny Morris told me. “The liberal left feels guilty about its past crimes,” said Morris, whose book The Birth of the Palestinian Refugee Problem, 1947–1949 is a canonical work in revisionist Israeli history. “And this is projected onto current conflicts, especially the Israeli-Arab conflict.” He added, “There’s a basic anti-Semitism in the West and a basic obsession with the Holy Land in the Christian West.
And these two things make it impossible for anybody to look at Israel in a neutral way.” Seventy years after its founding, Israel is regarded (by Jews and non-Jews, right and left, West and East) as a cause, a tragedy, a miracle, a nightmare, a project—one that is highly provisional and should perhaps be canceled. Is there any other sovereign nation, from the most miserable failed states to those that are flourishing, of which the same can be said?

Once specificity vanishes, metaphors bloom. One of the left’s favorites is Israel–as–South Africa. The Boycott, Divestment, and Sanctions movement, for instance, is built on the insistence that Israel is a replica (or, in Arendtian terms, a “parallel”) of South Africa pre-1994. In this view, because an international boycott isolated South Africa and helped end apartheid, an international boycott will isolate Israel and help end the occupation (or, perhaps, end Israel itself, as many BDS supporters seem to hope). But the two countries aren’t really the same, and a strange thing has happened: In the years since the BDS movement was founded, Israel has become less isolated from other nations, its economy has flourished, and Arab Israelis have made impressive gains in education and employment—even as the occupation has become more entrenched. Something is wrong with the metaphor. Still, BDS soldiers on, routinely proclaiming its victories. It is doubtful, though, that a boycott of Israel—even by Sally Rooney!—will persuade most Israelis that the occupation should end, any more than a boycott of the United States would have convinced many Americans to dump President Donald Trump.

Accompanying obfuscating metaphors are profound distortions of history—or, rather, anachronistic readings of it.
The socialist magazine Jacobin, for instance, confidently states that Israel was “born out of nineteenth-century European imperialism.” The inhabitants of the Yishuv were a varied lot, but many, indeed most, were immiserated refugees fleeing oppression and then extermination: “Israel is the State of the displaced person,” the decidedly non-Zionist Isaac Deutscher, a Marxist historian, noted. The contemporary left has somehow transformed these refugees into wily, powerful, “non-indigenous” imperialists who sat in Kyiv and Vilnius, scheming to steal land from Arab peasants. (It is baffling to hear leftists, the great defenders of refugees and immigrants, divide the inhabitants of Israel and Palestine into those who deserve to build a life there—the “indigenous”—and those who don’t.)

[Read: A new word is defining the Israeli-Palestinian conflict in Washington]

Rather than imperialism, modern Zionism was rooted in the national-liberation and socialist movements of the late 19th and early 20th centuries. In particular, the kibbutzim were hailed by leftists as the purest form of anti-authoritarian communism—built, as a delighted Deutscher wrote, “by the self-sacrifice and courage of idealistic intellectuals and workers.” But apparently Deutscher was wrong; another Jacobin writer now informs us that the kibbutz movement embodied the “negation of socialism” and the sin of “ethnic separatism.”

The ethnic-racial lens is a particularly inapt frame through which to view the unique circumstances in which Zionism developed. The Hebrew Labor movement—the spine of the state—was based on the principle that Jews must earn the right to the land through their own self-sufficient labor and that they could not exploit Arab workers; they refused to become bosses of Arab workers or peasants.
Yet for Jacobin, this epitomized contempt for Arab labor as “a primitive mode of production unfit for the proletarian revolution” and “the total racialization of the class struggle,” an almost comic misreading. As the journalist and historian Bernard Avishai pointed out to me, Hebrew Labor was “in many ways the opposite of classical colonialism … The left never understood that the ‘colonial project’ [of Israel] was basically a desperate effort to create a Jewish cultural life that would be resilient enough to survive the modern world.” He adds that the Zionists “were afraid to become Arabic-speaking overseers of Arab labor. So by the time of the second Aliyah, there were collective institutions that excluded Arabs … That looked like a racist thing, but so does affirmative action look like a racist thing, if you don’t understand its purpose.”

Along with the misreading of history is its essentialization: Zionism is a project “of systemic, massive violence,” one recent BDS petition contends. The extraordinarily tangled history of the Zionist movement—which includes Marxists and capitalists, peacemakers and militarists, secularists and believers, humanists and racists—is actually a consistent record of being “inherently violent,” according to a student group called Michigan in Color. (Whether the Palestinian movement has an “inherent” character remains unexplored.) Zionism is depicted as a kind of iron cage—stamped from the beginning, so to speak—instead of a fluid political movement that developed in dialectical relation to world events. Such essentialist views have traditionally been expressed by historians who believed in the German concept (later embraced by the Nazis) of the Volksgeist; it is startling to hear them propounded by progressives.

And Zionism developed, especially, in relation to the national movement of its neighbors, the Palestinians.
Their agency, too, has been erased; instead, they are depicted in cartoonlike form as either mighty, unbowed anti-imperialist warriors or innocent, reactive victims. In fact, leftists seem as uninterested in the rich texture of Palestinian politics as they are in Israeli politics. A wide range of views exists among Arab Israelis and Palestinians in the occupied territories; the American left might at least notice that Arab Israeli leaders such as Ayman Odeh, the head of the Arab Joint List, and Palestinian leaders such as Marwan Barghouti, now imprisoned on multiple terrorism charges, both support a two-state solution.

In the Israel-Hamas war last May, the “racialization” of the Israeli-Palestinian conflict—the latest use of unilluminating metaphors and false symmetries—became widespread. “From Ferguson to Palestine!” appeared on posters and petitions and rang out at demonstrations. The Black Lives Matter movement—and African American oppression in general—was repeatedly likened to, or even conflated with, the Israeli-Palestinian conflict. According to the U.S. Campaign for Palestinian Rights, residents of Ferguson, Missouri, and those in Palestine both face “an occupying force.” An activist with the progressive American Jewish group IfNotNow confidently explained to The New York Times that racism in America and the conflict between Israelis and Palestinians amount to “the exact same system.” Zellie Thomas, a Black Lives Matter organizer in New Jersey, asserted at a pro-Palestinian demonstration, “We know occupation; we know colonization.” The contemporary American obsession with race and skin color, in which politics reduces to stark racial categories and racial categories reduce to even starker moral ones, was transposed to a country and a conflict in the midst of the Arab world.
But is the situation of a stateless Palestinian living under the corruption and ineptitude of the disempowered Palestinian Authority, or ruled by the jihadist authoritarians of Hamas in isolated, besieged Gaza, meaningfully analogous to that of a Black citizen in 21st-century America? If not, what do the words occupation and colonization signify, other than linguistic bravado? This is a form not of solidarity but of self-regard. And surely the Palestinians deserve far more than this—deserve, that is, to be seen within the political and moral context of their own society, movement, and history rather than as a projection or pawn of American preoccupations.

This transposition of a national conflict between two peoples into a racial one strikes many Israelis as, in the words of the historian and journalist Gershom Gorenberg, “insanely absurd” and “embarrassing.” In reality, Israel is one of the most multicultural societies on Earth, composed of immigrants from around the world; anyone standing on a Jerusalem street for half an hour will see Jewish Israelis, born in countries from Scandinavia to the Horn of Africa, who, naturally, range widely in appearance. It is estimated that a majority of Jewish Israelis are descendants of those who fled, or were kicked out of, the Arab or Muslim countries in which they had lived for centuries; they are more likely to hail from Morocco than from Germany. Arab Israelis and Palestinians also vary widely in appearance, which is why so many Jewish Israelis are indistinguishable from so many Palestinians.

The Ferguson metaphor is no more useful than the South African one, and it illustrates the great weakness—and the great temptation—of metaphorical thinking in general: It offers ready-made analyses and ready-made solutions.
“The problem with analogies,” Gorenberg told me, “is that they take something you don’t understand, equate it with something you do understand, and make you think you understand it.”

Once language is unmoored from reality, it can become unhinged, which may be why the old, ugliest eliminationist rhetoric that the Palestine Liberation Organization used before the Oslo Accords circulated widely among purported progressives during the last war. An organizer for Students for Justice in Palestine pithily explained at a rally, “Zionism is genocide. Zionism is racism. Zionism is violence.” In 2016, the Movement for Black Lives accused Israel not only of exploiting and oppressing the Palestinians, which it is, but also of committing “genocide.” The union of New Yorker workers tweeted its solidarity with Palestinians “from the river to the sea” without, apparently, understanding that the phrase has traditionally implied the elimination of Israel. (The union later deleted the tweet and apologized.)

In statements and petitions, the words racism, imperialism, colonialism, settler-colonialism, apartheid, capitalism, and genocide were clotted together into a smorgasbord of evil, as if the writers couldn’t decide which to choose. I received many of these petitions.
They reminded me of George Orwell’s warning, in “Politics and the English Language,” about the intimate connection between debased political language and debased political thought: “As soon as certain topics are raised, the concrete melts into the abstract and no one seems able to think of turns of speech that are not hackneyed: prose consists less and less of words chosen for the sake of their meaning, and more and more of phrases tacked together like the sections of a prefabricated hen-house.” The intent is not to make a political argument—to explain, to convince—but to elicit Pavlovian reactions of disgust, thereby bypassing actual thought.

The recent equation of African American oppression and the Israeli-Palestinian conflict has been hailed as a triumph of intersectionality, whose proponents aim to build international solidarity across barriers of class, race, gender, and nation. And sometimes, they do. But in the current case, the theory has been used (or, I would argue, misused) to occlude complex realities, negate history, prevent critical thinking, and foster juvenile simplifications.

Intersectionality’s original theorists were Black women who developed nuanced arguments about the tangled political, legal, social, historic, and structural factors that undergird inequalities. Thus, a truly intersectional approach to the Israeli-Palestinian conflict would, of necessity, incorporate the Jewish people’s torturous history of expulsion, pariahdom, statelessness, and genocide. A truly intersectional approach would incorporate the realization that, while Israel is far more powerful than the Palestinians, it is an often besieged minority within the larger Arab and Muslim worlds—something of which even the most left-wing Israelis are acutely aware. (As Nissim Calderon, who has been an anti-occupation activist for 50 years, explained to me, “In the reality of the Middle East, without a state, we will be murdered.
By Iran, Hezbollah, Hamas—everyone.”)

A truly intersectional approach would recognize Israelis’ need for, and right to, security. An intersectional left—or a simply honest one—would not delicately turn away from the religious sectarianism, violent repression, and anti-feminism of Hamas and Islamic Jihad. It certainly could not dismiss discussion of Hamas’s rockets as, in the words of Scholars for Palestinian Freedom, “stale talking points.” A truly intersectional left might notice that the recent Arab Lives Matter movement, organized by Israel’s Arab citizens, is angrily demanding more police protection in response to the alarming surge in crime, including murder, in Israel’s Arab-majority towns. Apparently, Taibeh and Minneapolis aren’t quite the same.

Instead, what we now have is a kind of deformed intersectionality—intersectionality lite—in which the theory has been robbed of its challenging nuances and flattened into a starkly reductionist insistence that the Israeli-Palestinian conflict is Manichean. Or even, by a sleight of hand, that it doesn’t exist at all; as a Canadian Green Party lawmaker recently tweeted, “There are no two sides to this conflict, only human rights abuses! #EndApartheid.” Of course, the right is no different from the left in finding something comforting, or at least comfortable, about this sort of dichotomous vision. Right-wing American supporters of Israel—including many members of AIPAC, for which the Jewish state is a perpetually innocent dream palace—are equally facile, and willfully blinkered, in their views.

There is another problem with intersectionality, at least in the way it is now being used. It, too, is a kind of conceit—an updated version of “We Are the World.” As the political theorist Michael Walzer told me, “Intersectionality is a genuinely useful idea. But there is no intersection between American Blacks and Palestinians.
The moral significance of solidarity is that it extends solidarity to people with whom you have no intersection. Intersectionality is an entirely different idea from internationalism.” The Israeli journalist Etan Nechin observed to me that the American left’s discourse on Israel is “an offshoot of identity politics, with emphasis on ‘me.’ But internationalism was never about that.” To support other peoples or movements because they are somehow “like” you—or because they “look like you”—betrays the traditional ethos of internationalism.

And in the Manichean imagination—and this, I think, is its greatest sin, if I can use that word—the democratic forces within Israel, both Jewish and Arab, are rendered literally invisible, as if by a perverse magic trick. In Haaretz, Nechin recently charged that those on the American left—and particularly the Jewish American left—“dismiss realities on the ground in Israel and Palestine entirely, and instead offer high-minded ideological critiques.” As for ending the occupation, American leftists “expect … if that day comes, [that] it won’t be because of the work of decades by the Israeli left, but because Americans boycotted SodaStream.” Gone missing are “the hundreds of thousands of union workers, writers, doctors, teachers, activists, and everyday people within the Green Line who protested the Jewish Nation State Bill, or go out on a Friday afternoon to stand in solidarity next to their Palestinian neighbors.”

***

Today’s left, and today’s liberals, are in a bit of a pickle—or at least in a state of moral and theoretical disarray. I don’t exempt myself from that. It is extremely hard to figure out how to extend solidarity—in real, not rhetorically grandiose terms—to Syrians and Afghans; to democracy activists in China, Nicaragua, and Hong Kong; to horrifically endangered peoples such as the Uyghurs and Yazidis and Rohingya.
Ending the occupation, and strengthening endangered democratic institutions in Israel, are goals that rank high on the list of political urgencies for some of us.

In the current, often bewildering international context, the venomous attacks on Israel qua Israel offer a seductively easy, morally antiseptic—and, I would add, appallingly self-absorbed—way to intervene in foreign affairs. The hysterical hyperbole, the self-referential projections, the lazy conflations, the warped histories that abound today: All substitute for solidarity. What is needed, I believe, is an entry into the world of political thought, whose foundation is the ability to make distinctions within the context of history rather than to crush them.

So no, Palestine isn’t Ferguson, Israel isn’t South Africa, and Zionism isn’t white supremacy. As Arendt wrote, the activity of thinking—the very basis of politics—begins with the knowledge that “A and B are not the same.”
What the Trump Books Teach Us
William Blake once proposed that John Milton was “of the Devil’s party without knowing it” because he evoked Satan in Paradise Lost with such gusto. By contrast, Blake observed, Milton seemed inhibited when he wrote of plodding, sanctimonious old God. Have Donald Trump’s recent chroniclers, most of whom quote the former president liberally and with relish, turned to the devil’s party?

Loathsome characters bring out zestful writing, and authors who represent Trump as perilous to democracy—that is, all writers with eyes and ears—could find that the danger the former president poses to America’s future is more cinematic than democracy itself.

Peril, the latest big book about the former president, is not the best book by Bob Woodward, or even his best about Trump. That would be Fear, which came out in 2018. But in Peril, Woodward and his co-author, Robert Costa, manage to pull off a singular trick. They don’t let Trump’s devilish ravings, tweets, and tantrums run roughshod over their own, more disciplined voices. Woodward and Costa flex their rhetorical muscles not by writing the hell out of the Trump character, but by smacking down their arch-villain, keeping a choke chain on his every utterance.

When writing about the appalling presidential debate of September 30, 2020, they skip Trump’s cruel and confounding yawps about Joe Biden and Biden’s son, Hunter. They also ignore the Proud Boys, whom Trump that night refused to condemn. Given that group’s participation in the attacks of January 6, Trump’s words—“stand back and stand by”—now seem stomach-churning and fateful. But in Peril, the sole line Woodward and Costa quote from that debate is Biden’s demand of Trump: “Will you shut up, man?” With this choice to not quote Trump at all, the book elegantly obliges Biden.

For years, Woodward has been accused of styling himself as “impartial” during a crisis that demands partiality. But this underestimates the old master’s ego. Woodward takes a side: his own.
His voice in Peril is imperious, swaggering, and territorial. He and Costa lock their subject in a narrative cage, where he remains mostly gagged.

[David Frum: Woodward missed everything that matters about the Trump presidency]

Other recent Trump books allow their subject more space to strut and fret. This has costs, but it also means they bring more brio to evoking the former president. These books are potboilers: Stephanie Grisham’s I’ll Take Your Questions Now, Michael C. Bender’s “Frankly, We Did Win This Election,” Carol Leonnig and Philip Rucker’s I Alone Can Fix It, and Michael Wolff’s Landslide. These Trump books align in that they keep the former president’s flamboyant psychopathy center stage, where readers can hate-watch it. They all read like airport thrillers.

But the books also play back Trump’s falsehoods, sometimes at top volume. Three draw their title from lies told by Trump, and two directly quote the so-called Big Lie. Trump didn’t win the 2020 election—neither “frankly” nor by a “landslide”—and he alone could not fix jack. But it’s not just the titles that replay Trump’s lies. At regular intervals, Grisham, Bender, Leonnig and Rucker, and Wolff quote or cite Trump’s horseshit, often letting it steam there, uncorrected.

This can have unnerving effects. About midway through Landslide, Wolff writes of “the president’s determination to sully Joe Biden,” a motivation for defamation and lies if ever there was one. (See: Trump’s first impeachment.) But hot on the heels of this statement, Wolff asserts that Trump has “absolute belief that the Bidens were among the most corrupt political families of all time.”

Does he? An absolute belief?
Wolff doesn’t mention that this is a ludicrous claim, and with Trump hardly anything is “absolute” or a “belief.” But to note any of this would break Wolff’s narrative flow; his talent is for free indirect discourse, which lets him enter the minds of his principals, and he’s never going to clutter his slick prose with allegedlys or weasel words chosen by lawyers. So rather than punish the character of Trump, as Woodward does, Wolff lets Trump run wild. In all of his books, including a new one out this month about, no joke, “the damned,” Wolff is inexorably drawn to the devil. (Unlike Milton, he always knows it.)

Another example of the difficulty of rendering Trump’s freaky deceptions comes in a chapter about his 2020 electoral defeat in I Alone Can Fix It. In describing Trump’s rejection of data, Leonnig and Rucker write, “Georgia was MAGA territory—or so Trump thought.” Georgia in 2020 was very much not MAGA territory. Biden beat Trump statewide to win the state’s 16 electoral votes, and both of its Senate seats flipped to Democrats. But the fact that Trump’s stubborn delusion—“Georgia was MAGA territory”—is allowed to air out like that means we’re in Trump’s head as he churns over the Big Lie. Once again: Does he really think he won Georgia, i.e., that it was MAGA land? Or did he simply want Georgia officials to pretend that he’d won so he could stay in the White House?

The title of “Frankly, We Did Win This Election”: The Inside Story of How Trump Lost does keep Trump’s Big Lie securely in quotation marks, and corrects the record with its subtitle. But elsewhere in the book, Bender prolifically recaps the inane banter among Trump and his cronies while also reproducing some of Trump’s most persistent lies about, for example, the size of his rallies. “Nobody has seen anything like it ever,” Bender quotes Trump saying.
“There has never, never been anything like it.” (Bender, to be fair, points out that Trump hurts himself when he imagines that his distorted apprehension of crowd size is more accurate than the polls that predicted he’d lose the election.)

During the 2016 campaign, cable news channels aired Trump’s rambunctious campaign rallies live, and did nothing to correct his lies. In those days, his whoppers seemed so self-refuting that they could pass as reality-TV bacchanalia. Like that of Alex Jones, whose lawyer has called him a “performance artist,” Trump’s Barnumism was left unchecked for years simply because nothing as appalling had ever been seen in presidential politics. After five years, we’ve become inured to Trump’s lies, and many of us can recite them as if they are an anthem-rock chorus. Fact-checking, by contrast, requires complexity and pedantry; no one chants Daniel Dale’s brilliant fact-checking live-tweets at Jones Beach.

[Read: Fact-checking the president in real time]

Trump is simply a narrative migraine. To write a monograph about a figure whose speech and actions don’t comport with identifiable beliefs—much less with reality—is to get in deep with a flailing, splintered, and antisocial mind. Grisham, Trump’s former press secretary, quotes several of Trump’s non sequiturs, including some trash talk about the mother of a prime minister. These choice quotes stop her story like a record scratch. And there’s always a reaction shot: Grisham agape at the audience, reflecting on her own WTF. She quotes Trump’s bunk less to correct or satirize him than to render her own chronic bafflement at the former president’s “batshit things.” It hits the spot.

Usually, depth psychology—the theory that there are distinct emotions, sensations, and needs somehow “under” one’s personality—is steady ground on which to build a portrait. But with Trump, it falters. Does he even have an interior life?
In 1997, in an astute profile of Trump in The New Yorker, Mark Singer concluded that his subject leads “an existence unmolested by the rumbling of a soul.” The British writer Nate White also defines Trump by absences: “He has no class, no charm, no coolness, no credibility, no compassion, no wit, no warmth, no wisdom, no subtlety, no sensitivity, no self-awareness, no humility, no honor, and no grace.”

If the afterwords and acknowledgments of all these books are any guide, the authors seem entirely spent by the effort. No wonder. The skull of Donald Trump, where delusions and desperation clamor for nourishment like hungry ghosts, is a grim place to spend time. Other readers may have chosen to leave these disturbing books on the shelf; me, I’m grateful that so many observers concluded, as Grisham did, “I have to get this all out so I can process, in my own mind, what the hell happened.”

In their various idioms, Bender, Grisham, Leonnig and Rucker, Wolff, and Woodward and Costa have shed collective light on what the hell happened. And they’ve done a supreme public service simply by etching the events of America’s bleak recent history into the record, where they will be more difficult for Trump and his heirs to lie about in the years ahead. When Condoleezza Rice recently urged Americans to “move on” from the January 6 insurrection, all I could think was, No, no, no, don’t move on; read these books. And when Trump runs again in 2024, remember that those who forget history are condemned—ah, but you know the rest.
theatlantic.com
Reclaiming Jesus From His Church
The election of the elders of an evangelical church is usually an uncontroversial, even unifying event. But this summer, at an influential megachurch in Northern Virginia, something went badly wrong. A trio of elders didn’t receive 75 percent of the vote, the threshold necessary to be installed. “A small group of people, inside and outside this church, coordinated a divisive effort to use disinformation in order to persuade others to vote these men down as part of a broader effort to take control of this church,” David Platt, a 43-year-old minister at McLean Bible Church and a best-selling author, charged in a July 4 sermon.

Platt said church members had been misled, having been told, among other things, that the three individuals nominated to be elders would advocate selling the church building to Muslims, who would convert it into a mosque. In a second vote on July 18, all three nominees cleared the threshold. But that hardly resolved the conflict. Members of the church filed a lawsuit, claiming that the conduct of the election violated the church’s constitution.

Platt, who is theologically conservative, had been accused in the months before the vote by a small but zealous group within his church of “wokeness” and being “left of center,” of pushing a “social justice” agenda and promoting critical race theory, and of attempting to “purge conservative members.” A Facebook page and a right-wing website have targeted Platt and his leadership. For his part, Platt, speaking to his congregation, described an email that was circulated claiming, “MBC is no longer McLean Bible Church, that it’s now Melanin Bible Church.”

What happened at McLean Bible Church is happening all over the evangelical world. Influential figures such as the theologian Russell Moore and the Bible teacher Beth Moore felt compelled to leave the Southern Baptist Convention; both were targeted by right-wing elements within the SBC.
The Christian Post, an online evangelical newspaper, published an op-ed by one of its contributors criticizing religious conservatives like Platt, Russell Moore, Beth Moore, and Ed Stetzer, the executive director of the Wheaton College Billy Graham Center, as “progressive Christian figures” who “commonly champion leftist ideology.” In a matter of months, four pastors resigned from Bethlehem Baptist Church, a flagship church in Minneapolis. One of those pastors, Bryan Pickering, cited mistreatment by elders, domineering leadership, bullying, and “spiritual abuse and a toxic culture.” Political conflicts are hardly the whole reason for the turmoil, but according to news accounts, they played a significant role, particularly on matters having to do with race.

“Nearly everyone tells me there is at the very least a small group in nearly every evangelical church complaining and agitating against teaching or policies that aren’t sufficiently conservative or anti-woke,” a pastor and prominent figure within the evangelical world told me. (Like others with whom I spoke about this topic, he requested anonymity in order to speak candidly.) “It’s everywhere.”

Michael O. Emerson, a sociology professor at the University of Illinois at Chicago, told me that he and his research team have spent the past three years studying race and Christianity. “The divisions and conflicts we found are intense, easily more intense than I have seen in my 25 years of studying the topic,” he told me. What this adds up to, he said, is “an emerging day of reckoning within churches.”

The aggressive, disruptive, and unforgiving mindset that characterizes so much of our politics has found a home in many American churches. As a person of the Christian faith who has spent most of my adult life attending evangelical churches, I wanted to understand the splintering of churches, communities, and relationships.
I reached out to dozens of pastors, theologians, academics, and historians, as well as a seminary president and people involved in campus ministry. All voiced concern.

The coronavirus pandemic, of course, has placed religious communities under extraordinary strain. Everyone in America has felt its effects; for many Christians, it’s been a bar to gathering and worshipping together, sharing communion and performing baptisms, and saying common prayers and participating in rituals and liturgy. Not being in community destabilized what has long been a core sense of Christian identity.

But there’s more to the fractures than just COVID-19. After all, many of the forces that are splitting churches were in motion well before the pandemic hit. The pandemic exposed and exacerbated weaknesses and vulnerabilities, habits of mind and heart, that already existed.

The root of the discord lies in the fact that many Christians have embraced the worst aspects of our culture and our politics. When the Christian faith is politicized, churches become repositories not of grace but of grievances, places where tribal identities are reinforced, where fears are nurtured, and where aggression and nastiness are sacralized. The result is not only wounding the nation; it’s having a devastating impact on the Christian faith.

How is it that evangelical Christianity has become, for too many of its adherents, a political religion? The historian George Marsden told me that political loyalties can sometimes be so strong that they create a religiouslike faith that overrides or even transforms a more traditional religious faith. The United States has largely avoided the most virulent expressions of such political religions. None has succeeded for very long—at least, until now.

The first step was the cultivation of the idea within the religious right that certain political positions were deeply Christian, according to Marsden. Still, such claims were not at all unprecedented in American history.
Through the 2000s, even though the religious right drew its energy from the culture wars—as it had for decades—it abided by some civil restraints. Then came Donald Trump.

“When Trump was able to add open hatred and resentments to the political-religious stance of ‘true believers,’ it crossed a line,” Marsden said. “Tribal instincts seem to have become overwhelming.” The dominance of political religion over professed religion is seen in how, for many, the loyalty to Trump became a blind allegiance. The result is that many Christian followers of Trump “have come to see a gospel of hatreds, resentments, vilifications, put-downs, and insults as expressions of their Christianity, for which they too should be willing to fight.”

Tim Schultz, the president of the 1st Amendment Partnership and an advocate for religious freedom, told me that evangelicalism was due a reckoning. “It has been held together by political orientation and sociology more than by common theology,” he said. The twin crises of the summer of 2020—COVID and a heightened awareness of enduring racial injustices—exposed this long-unnoticed truth.

Some of the most distinctive features of the evangelical movement may have left it particularly vulnerable to this form of politicization. Among religious believers, evangelicals are some of the most anti-institutional, Timothy J. Keller, the founding pastor of Redeemer Presbyterian Church, in Manhattan, told me. The evangelical movement flourished in this relatively anti-institutional country at a particularly anti-institutional time.
Evangelical ministries and churches fit the “spirit of the age,” growing rapidly in the 1970s, and retaining more of their members even as many mainline denominations declined.

At the same time, Keller argues, that anti-institutional tendency makes evangelical communities more prone than others to “insider abuse”—corruption committed by leaders who have almost no guardrails—and “outsider-ism,” in which evangelicals simply refuse to let their church form them or their beliefs. As a result, they are unrooted—and therefore susceptible to political idolization, fanatical ideas, and conspiracy theories.

“What we’re seeing is massive discipleship failure caused by massive catechesis failure,” James Ernest, the vice president and editor in chief at Eerdmans, a publisher of religious books, told me. Ernest was one of several figures I spoke with who pointed to catechesis, the process of instructing and informing people through teaching, as the source of the problem. “The evangelical Church in the U.S. over the last five decades has failed to form its adherents into disciples. So there is a great hollowness. All that was needed to cause the implosion that we have seen was a sufficiently provocative stimulus. And that stimulus came.”

“Culture catechizes,” Alan Jacobs, a distinguished professor of humanities in the honors program at Baylor University, told me. Culture teaches us what matters and what views we should take about what matters. Our current political culture, Jacobs argued, has multiple technologies and platforms for catechizing—television, radio, Facebook, Twitter, and podcasts among them. People who want to be connected to their political tribe—the people they think are like them, the people they think are on their side—subject themselves to its catechesis all day long, every single day, hour after hour after hour.

On the flip side, many churches aren’t interested in catechesis at all.
They focus instead on entertainment, because entertainment is what keeps people in their seats and coins in the offering plate. But as Jacobs points out, even those pastors who really are committed to catechesis get to spend, on average, less than an hour a week teaching their people. Sermons are short. Only some churchgoers attend adult-education classes, and even fewer attend Bible study and small groups. Cable news, however, is always on. “So if people are getting one kind of catechesis for half an hour per week,” Jacobs asked, “and another for dozens of hours per week, which one do you think will win out?”

That’s not a problem limited to the faithful on one side of the aisle. “This is true of both the Christian left and the Christian right,” Jacobs said. “People come to believe what they are most thoroughly and intensively catechized to believe, and that catechesis comes not from the churches but from the media they consume, or rather the media that consume them. The churches have barely better than a snowball’s chance in hell of shaping most people’s lives.”

But when people’s values are shaped by the media they consume, rather than by their religious leaders and communities, that has consequences. “What all those media want is engagement, and engagement is most reliably driven by anger and hatred,” Jacobs argued. “They make bank when we hate each other. And so that hatred migrates into the Church, which doesn’t have the resources to resist it. The real miracle here is that even so, in the mercy of God, many people do find their way to places of real love of God and neighbor.”

The way our sensibilities are shaped determines who we are, including the order of our loves. For many Christians, their politics has become more of an identity marker than their faith.
They might insist that they are interpreting their politics through the prism of scripture, with the former subordinate to the latter, but in fact scripture and biblical ethics are often distorted to fit their politics.

Scott Dudley, the senior pastor at Bellevue Presbyterian Church in Bellevue, Washington, refers to this as “our idolatry of politics.” He’s heard of many congregants leaving their church because it didn’t match their politics, he told me, but has never once heard of someone changing their politics because it didn’t match their church’s teaching. He often tells his congregation that if the Bible doesn’t challenge your politics at least occasionally, you’re not really paying attention to the Hebrew scriptures or the New Testament. The reality, however, is that a lot of people, especially in this era, will leave a church if their political views are ever challenged, even around the edges.

“Many people are much more committed to their politics than to what the Bible actually says,” Dudley said. “We have failed not only to teach people the whole of scripture, but we have also failed to help them think biblically. We have failed to teach them that sometimes scripture is most useful when it doesn’t say what we want it to say, because then it is correcting us.” Teaching people how to think biblically would help, Dudley added, as well as teaching people how to disagree with one another biblically. “There is a lot of disagreement in the New Testament, and it gives us a template for how to listen to each other to understand rather than to argue,” he said.

Many Christians, though, are disinclined to heed calls for civility. They feel that everything they value is under assault, and that they need to fight to protect it. “I understand that,” Dudley said. “I feel under assault sometimes too.
However, I also know that the early Christians transformed the Roman empire not by demanding but by loving, not by angrily shouting about their rights in the public square but by serving even the people who persecuted them, which is why Christianity grew so quickly and took over the empire. I also know that once Christians gained political power under Constantine, that beautiful loving, sacrificing, giving, transforming Church became the angry, persecuting, killing Church. We have forgotten the cross.”

Dudley, my high-school and college classmate, left me with this haunting question: How many people look at churches in America these days and see the face of Jesus? Too often, I fear, when Americans look at the Church, they see not the face of Jesus, but the style of Donald Trump.

The former president normalized a form of discourse that made the once-shocking seem routine. Russell Moore laments the “pugilism of the Trump era, in which anything short of cruelty is seen as weakness.” The problem facing the evangelical church, then, is not just that it has failed to inculcate adherents with its values—it’s that when it has succeeded in doing so, those values have not always been biblical.

But of course Trump did not appear ex nihilo. Kristin Kobes Du Mez, a history professor at Calvin University and the author of Jesus and John Wayne: How White Evangelicals Corrupted a Faith and Fractured a Nation, argues that Trump represents the fulfillment, rather than the betrayal, of many of white evangelicals’ most deeply held values. Her thesis is that American evangelicals have worked for decades to replace the Jesus of the Gospels with an idol of rugged masculinity and Christian nationalism.
(She defines Christian nationalism as “the belief that America is God’s chosen nation and must be defended as such,” which she says is a powerful predictor of attitudes toward non-Christians and on issues such as immigration, race, and guns.)

Du Mez told me it’s important to recognize that this “rugged warrior Jesus” is not the only Jesus many evangelicals encounter in their faith community. There is also the “Jesus is my friend” popular in many devotionals, for example. These representations might appear to be contradictory, she told me, but in practice they can be mutually reinforcing. Jesus is a friend, protector, savior—but according to one’s own understanding of what needs to be protected and saved, and not necessarily according to core biblical teachings.

“Evangelicals are quick to label their values ‘biblical,’” Du Mez told me. “But how they interpret the scriptures, which parts they decide to emphasize and which parts they decide to ignore, all this is informed by their historical and cultural circumstances.” That’s not simply true of this one community, she added, but of all people of faith. “More than most other Christians, however, conservative evangelicals insist that they are rejecting cultural influences,” she said, “when in fact their faith is profoundly shaped by cultural and political values, by their racial identity and their Christian nationalism.”

Gender plays a role here as well, according to Du Mez. Over the past half century, evangelicals have tended to depict men and women as opposites. “They believe God ordained men to be protectors and filled them with testosterone for this purpose,” she said. Women, on the other hand, are seen as nurturers. The fruits of the spirit—love, joy, peace, patience, kindness, gentleness, and self-control—are deemed appropriate feminine virtues. “Men, however, are to exhibit boldness, courage, even ruthlessness in order to fulfill their God-appointed role,” Du Mez explained.
“In this way, the warrior spirit and a kinder, gentler Christianity go hand in hand.”

Du Mez pointed out that even men who embrace a kinder, gentler version of masculinity—servant leadership, for example—may tip into a more rugged, ruthless version when they deem the situation sufficiently dire. And for more than half a century, she said, evangelical leaders have found reason to deem the situation sufficiently dire. They rallied their congregations against the threats of communism, secular humanism, feminism, gay rights, radical Islam, Democrats in the White House, demographic decline, and critical race theory, and in defense of religious liberty.

“Evangelical militancy is often depicted as a response to fear,” she told me. “But it’s important to recognize that in many cases evangelical leaders actively stoked fear in the hearts of their followers in order to consolidate their own power and advance their own interests.” Du Mez is somewhat more sympathetic toward ordinary evangelicals than she is toward powerful evangelical leaders. She acknowledges that many evangelicals have genuinely sought to follow God’s will; they were directed to believe what they do by pastors, Bible-study leaders, Christian publishers, and Christian radio and television programming. “Many have sought certainty in turbulent times,” she said, and they know that challenging these narratives may well involve the loss of meaningful communities.

Fear has played a central role in the explosion of conflict within American evangelical churches. “Dwelling on fear and outrage is spiritually deforming,” Cherie Harder, president of the Trinity Forum, told me. “Both biblical wisdom and a large body of research holds that fear and grace, or fear and gratitude, are incompatible.” She quoted from one of the New Testament epistles: “Perfect love drives out fear.”

There are moments, of course, when fear is an appropriate and necessary response, but there are risks when it becomes a constant presence.
“Fear and anger should presumably function as alarm systems—and an alarm is not supposed to stay perpetually on,” Harder said. It is not the onset of fear or anger that is most dangerous, she said, “but stoking it, cultivating it, and dwelling within it that distorts and deforms.”

And then there is a regional component to the crisis of evangelical Christianity. Claude Alexander, the senior pastor of the Park Church in Charlotte, North Carolina, told me we must come to terms with the “southernization of the Church.” Some of the distinctive cultural forms present in the American South—masculinity and male dominance, tribal loyalties, obedience and intolerance, and even the ideology of white supremacism—have spread to other parts of the country, he said. These cultural attitudes are hardly shared by every southerner or dominant throughout the South, but they do exist and they need to be named. “Southern culture has had a profound impact upon religion,” Alexander told me, “particularly evangelical religion.”

The conservative writer David French, who lives in Tennessee, has written about the South’s shame/honor culture and its focus on group reputation and identity. “What we’re watching right now in much of our nation’s Christian politics,” he wrote, “is an explosion not of godly Christian passion, but rather of ancient southern shame/honor rage.”

Pastors now find themselves on the front lines of this conflict, their congregations splitting into warring camps. I spoke with 15 of them, and what I heard was jarring. They told me that nothing else they’ve faced approaches what they’ve experienced in recent years, and that nothing had prepared them for it.

Scott Dudley of Bellevue Presbyterian Church said he knows of several pastors who have not just quit their churches but resigned from ministry, and that many others are actively seeking to switch careers.
“They have concluded that their church has become a hostile work environment where at any moment they may be blasted, slandered, and demeaned in disrespectful and angry ways,” he said, “or have organized groups of people within the church demand that they be fired.”

Several months ago, I spoke with one such pastor, who had not only resigned from his church, a congregation of the Presbyterian Church in America, but had also decided, at least for now, to leave the ministry altogether. He told me that he felt undermined by people in his congregation, including by some whom he had trusted but who, it turned out, were less animated by spiritual matters than by political agendas. This former pastor used the word betrayal in our conversation; he talked about the pain this episode has caused him and his wife. In his words, “The gentleness of Jesus was utterly discarded” by those who felt he wasn’t championing their cultural and political agendas aggressively enough.

“They don’t care about the relational collateral damage,” he said.

In a similar vein, I recently had a conversation with a senior pastor who is planning to leave his position soon; he’s not yet sure where he’ll land, or even whether he’ll stay in the ministry. He has simply been worn down by the divisions within his church. He has not been the target of outward hostility, but he can feel the ground shifting beneath his feet. He feels that he is growing apart from people in the congregation; there’s no longer the same sense of common purpose. He is watching the collapse of an evangelical movement to which he has devoted much of his life. At one point, as we talked about what is unfolding within American Christianity, his eyes welled with tears.

Bob Fryling, a former publisher of InterVarsity Press and the vice president of InterVarsity Christian Fellowship, an evangelical campus ministry, has been part of a weekly gathering of more than 150 individuals representing about 40 churches.
He’s heard of conflicts “in almost every church” and reports that pastors are exhausted. Earlier this year, the Christian polling firm Barna Group found that 29 percent of pastors said they had given “real, serious consideration to quitting being in full-time ministry within the last year.” David Kinnaman, president of Barna, described the past year as a “crucible” for pastors as churches fragmented.

The key issues in these conflicts are not doctrinal, Fryling told me, but political. They include the passions stirred up by the Trump presidency, the legitimacy of the 2020 election, and the January 6 insurrection; the murder of George Floyd, the Black Lives Matter movement, and critical race theory; and matters related to the pandemic, such as masking, vaccinations, and restrictions on in-person worship. I know of at least one large church in eastern Washington State, where I grew up, that has split over the refusal of some of its members to wear masks.

“There have always been mean people who cloak their unkindness in religious devotion,” one minister in a conservative denomination told me. “The New Testament itself is pretty clear about that.” But, he added, the conflicts have grown more widespread and more intense. “Without doubt you’ll see—you already are—a ton of pastors quitting,” he said. “Most pastors actually hate conflict. So if you’re going to pay me one-quarter of what I could make on the market, why put up with this?”

In his own church, some of the elders are devoted to culture-war politics. “These guys can be a special kind of relentless, and I don’t think I’ve had it as bad as many,” he said. “But when we’re stressed out, trying to be public-health experts without the training to do that, trying to keep our own families from blowing up with COVID stress, getting criticized from both sides at once, and then having folks doing whatever they can to ruin us and get us run out of town—we’d love to just be trusted as friends and shepherds.
I understand why many folks have just said, ‘I’m done.’ I’m not there yet, but I hardly think I’m above it or guaranteed not to. I just pray to Jesus to not let me throw in the towel.”

The historian Mark Noll’s 1994 book, The Scandal of the Evangelical Mind, will be rereleased next year. In the forthcoming preface, which Noll, himself an evangelical, shared with me, he argues that in various spheres—vaccinations, evolutionary science, anthropogenic global warming, and the 2020 elections, to name just a few—“white evangelicals appear as the group most easily captive to conspiratorial nonsense, in greater panic about their political opponents, or as most aggressively anti-intellectual.” He goes on to warn that “the broader evangelical population has increasingly heeded populist leaders who dismiss the results of modern learning from whatever source.” And he laments the “intellectual self-immolation of recent evangelical history.”

“Much of what is distinctive about American evangelicalism is not essential to Christianity,” Noll has written. And he is surely correct. I would add only that it isn’t simply the case that much of what is distinctive about American evangelicalism is not essential to Christianity; it is that now, in important respects, much of what is distinctive about American evangelicalism has become antithetical to authentic Christianity. What we’re dealing with—not in all cases, of course, but in far too many—is political identity and cultural anxieties, anti-intellectualism and ethnic nationalism, resentments and grievances, all dressed up as Christianity.

Jesus now has to be reclaimed from his Church, from those who pretend to speak most authoritatively in his name.

Too many Christians have “domesticated” Jesus by their resistance to his call to radically rethink our attitude toward power, ourselves, and others, Mark Labberton, the president of Fuller Theological Seminary, told me. We live in “an era of acute anxiety and great fear,” he said.
As a result, too often Christians end up wrapping Jesus into our angry and fearful distortions. We want Jesus to validate everything we believe, often as if he never walked the face of this Earth. What we’re witnessing can be explained “more by sociology than Christology,” he said.

Unlike in the Sermon on the Mount and the parable of the Good Samaritan—unlike Jesus’s barrier-breaking encounters with prostitutes and Roman collaborators, with the lowly and despised, with the unclean and those on the wrong side of the “holiness code,” with the wounded souls whom he healed on the Sabbath—many Christians today see the world divided between us and them, the children of light and the children of darkness. Blessed are the politically powerful, for theirs is the kingdom of God. Blessed are the culture warriors, for they will be called children of God.

For many of us who have made Christianity central to our lives, the pain of this moment is watching those who claim to follow Jesus do so much to distort who he really was. Those who deform his image may be doing so unwittingly—this isn’t an intentionally malicious enterprise they’re engaging in; they believe they’re being faithful—but it is nonetheless destructive and unsettling.

I believe the portrait I’ve painted in this essay is accurate, but it is also, and necessarily, incomplete. Countless acts of kindness, generosity, and self-giving love are performed every day by people precisely because they are Christians. Their lives have been changed, and in some cases transformed, by their faith. My own life has been immeasurably blessed by people of faith who have walked the journey with me, who have shown me grace and encouraged me in difficult moments. But I can recognize that while also recognizing the wreckage around us.

Something has gone amiss; pastors know it as well as anyone and better than most. The Jesus of the Gospels—the Jesus who won their hearts, and who long ago won mine—needs to be reclaimed.
‘Follow the Science’ Doesn’t Work When the Science Is This Bad
Ivermectin is an antiparasitic drug, and a very good one. If you are infected with the roundworms that cause river blindness or the parasitic mites that cause scabies, it is wonderfully effective. It is cheap; it is accessible; and its discoverers won the Nobel Prize in 2015. It has also been widely promoted as a coronavirus prophylactic and treatment.

This promotion has been broadly criticized as a fever dream conceived in the memetic bowels of the internet and as a convenient buttress for bad arguments against vaccination. This is not entirely fair. Perhaps 70 to 100 studies have been conducted on the use of ivermectin for treating or preventing COVID-19; several dozen of them support the hypothesis that the drug is a plague mitigant. Two meta-analyses, which looked at data aggregated across subsets of these studies, concluded that the drug has value in the fight against the pandemic.

So if you’re the sort of person who “follows the science,” it might seem perfectly rational to join the fervent supporters of ivermectin. It might even strike you as reasonable to suggest, as one physician and congressional witness did recently, that “people are dying because they don’t know about this medicine.”

The problem is, not all science is worth following.

I work on a small team of researchers who do what one might call “forensic peer review.” In the standard process for scientific publishing, peer reviewers take a manuscript mostly at face value: They ensure that the study makes sense as it’s described. We do something else: We check everything, and try to ferret out any potential biases in reported patterns of digits, statistical impossibilities, inconsistencies between what researchers said they’d do and what they actually did, and plagiarized sentences or paragraphs. And we often find fatal flaws hidden behind a veil of two-dollar words and statistical jargon.

The ivermectin literature has been no exception.
Over the past six months, we’ve examined about 30 studies of the drug’s use for treating or preventing COVID-19, focusing on randomized studies, or nonrandomized ones that have been influential, with at least 100 participants. We’ve reached out directly to the authors of these studies to discuss our findings, sometimes engaging in lengthy back-and-forths; when appropriate, we’ve sent messages to the journals in which studies have been published. In our opinion, a bare minimum of five ivermectin papers are either misconceived, inaccurate, or otherwise based on studies that cannot exist as described. One study has already been withdrawn on the basis of our work; the other four very much should be.

[Read: Scientific publishing is a joke]

In the withdrawn study, a team in Egypt compared outcomes among COVID-19 patients who did and did not receive ivermectin—but, for the latter group, they included deaths that had occurred before the study began. (According to the journal Nature, the lead author “defended the paper” in an email, and claimed that the withdrawal took place without his knowledge. He did not respond to an inquiry from The Atlantic.)

Other papers also have egregious flaws. Researchers in Argentina said they recruited participants from hospitals that had no record of having participated in the research, and then blamed mistakes on a statistician who claimed never to have been consulted. A few studies show clear evidence of severe data irregularities. In one from Lebanon, for example, the same section of patient records repeats over and over again in the data set, as if it had been copied and pasted. (An author on that paper conceded that the data were flawed, and claimed to have requested a retraction.)

All of the above may not sound that bad. If five out of 30 trials have serious problems, perhaps that means the other 25 are up to snuff. That’s 83 percent!
You might be tempted to think of these papers as being like cheaply made light bulbs: Once we’ve discarded the duds with broken filaments, we can just use the “good” ones.

That’s not how any of this works. We can locate obvious errors in a research paper only by reanalyzing the numbers on which the paper is based, so it’s likely that we’ve missed some other, more abstract problems. Also, we have only so much time in the day, and forensic peer review can take weeks or months per paper. We don’t pick papers to examine at random, so it’s possible that the data from the 30 papers we chose are somewhat more reliable, on average, than the rest. A better analogy would be to think of the papers as new cars: If five out of 30 were guaranteed to explode as soon as they entered a freeway on-ramp, you would prefer to take the bus.

Most problematic, the studies we are certain are unreliable happen to be the same ones that show ivermectin as most effective. In general, we’ve found that many of the inconclusive trials appear to have been adequately conducted. Those of reasonable size with spectacular results, implying the miraculous effects that have garnered so much public attention and digital notoriety, have not.

Given all the care that goes into maintaining the scientific literature, how did this house of cards acquire planning permission? The answer is that the pandemic has created a very difficult environment for scientific publishing. In early 2020, a hunger for high-quality information arose immediately. How scared of the coronavirus should we be, and how should we behave? How does the virus spread? How dangerous is it? What decisions should governments make? To answer those questions, scientific studies were produced at record pace, peer-reviewed almost immediately after they were submitted or else put into the public domain via preprint as soon as they had been completed.
Publishing science is slow; highly contagious diseases are fast.

It’s not that, under such conditions, a few bad studies were bound to slip through the net. Rather, there is no net. Peer review, especially when conducted at pandemic speed, does not apply the rather boring scientific scrutiny needed to identify the problems described above. Forensic work like ours is not organized by scientific journals. We do not get paid. We are not employed by universities, hired by governments, or supported by private money to do this. We do it because we feel it should be done.

[Read: A credibility crisis in food science]

As volunteers, we have no inherent authority. When we ask a research group for access to its original data, in accordance with a long-held standard for maintaining scientific integrity, our requests are commonly refused or ignored. And when we do find what we think are serious anomalies in a given paper, getting the authors, the institutions they work for, or their publication outlets to return our emails tends to be somewhere between challenging and impossible. When we looked at an ivermectin study published over the summer in the Asian Pacific Journal of Tropical Medicine, for example, we found a highly unusual pattern of numbers that implied a failure of randomization. We reported this issue to the journal more than three months ago and have heard nothing substantive back.
One of the journal’s executive editors in chief, Bo Cui, told The Atlantic that the study “represents the best available evidence at the time of publication and has undergone scientific peer review and editor review before its publication”; he also said that the journal has asked the authors to address “the randomization issue” and that any eventual retraction would come only after “due process, free from coercion or pressure.” The study’s lead author, Morteza Niaee, told The Atlantic via email that the randomization procedure was “completely acceptable and well-performed.”

This is a consistent theme in our work. We contact the authors of each paper, along with the journal or preprint service where their work is published, long before these issues are discussed in public. Sometimes, the authors or journals even reply, although these communications rarely result in any kind of investigation, let alone a serious consideration of the issues raised.

In this environment, no sinister conspiracy is needed to allow for the construction of an irreparably flawed body of literature. In fact, the suspect quality of the ivermectin/COVID-19 literature may be alarmingly commonplace. Remember, our low estimate is that about 17 percent of the major ivermectin trials are unreliable. John Carlisle, famous in metascientific circles for identifying the most prolific research fraud in the history of medical research—the case of Yoshitaka Fujii, an anaesthesiologist who managed to garner an astonishing 183 retractions—reviewed more than 500 trials submitted to the journal Anaesthesia in the three years leading up to the pandemic and concluded that 14 percent of them contained false data. A 2012 survey of researchers at five academic medical centers in Belgium reported that 1 percent admitted to having fabricated data in the prior three years, though 24 percent said they had observed a colleague doing so.
A meta-analysis on the same topic concluded, similarly, that 2 percent of researchers admit to having engaged in serious misconduct, while 14 percent say they have observed it in a colleague.

Richard Smith, the former editor of the British Medical Journal, suggested in July that the scientific community is long past due for a reckoning on the prevalence of false data in the literature. “We have now reached a point where those doing systematic reviews must start by assuming that a study is fraudulent until they can have some evidence to the contrary,” he wrote. This is less hostile than it sounds. Smith isn’t saying that everything is fraudulent, but rather that everything should be evaluated starting from a baseline of “I don’t believe you” until we definitely see otherwise. Think of the airport-security workers who assume that you may be carrying contraband or weapons until you prove that you aren’t. The point is not that everyone is armed but that everybody has to go through the machine.

Yet it has not yet sunk into the public consciousness that our system for building biomedical knowledge largely ignores any evidence of widespread misconduct. In other words, the literature on ivermectin may be quite bad—and in being so, it may also be quite unremarkable.

If this is the case, how does medical science manage to navigate all the bad research? How have we not returned to the ages of leeches and bloodletting?

The secret, again, is simple: Much research is simply ignored by other scientists because it either looks “off” or is published in the wrong place. A huge gray literature exists in parallel to reliable clinical research, including work published in low-quality or outright predatory journals that will publish almost anything for money. Likewise, the authors of fabricated or heavily distorted papers tend to have modest ambitions: The point is to get their work in print and added to their CV, not to make waves.
We often say these studies are designed to be “written but not read.”

Although some of the papers we examined may claim, for instance, that ivermectin is a perfect COVID-19 prophylactic, they do so based on a smallish study of a few hundred people—and the work is published in journals that during pre-pandemic times would have been deeply obscure. When a group claims to have reviewed, say, 100,000 patient records—and then publishes their dubious results in a high-profile journal—the risks are significant.

In a pandemic, when the stakes are highest, the somewhat porous boundary between these publication worlds has all but disappeared. There is no gray literature now: Everything is a magnet for immediate attention and misunderstanding. An unbelievable, inaccurate study no longer has to linger in obscurity; it may bubble over into the public consciousness as soon as it appears online, and get passed around the internet like a lost kitten in a preschool. An instantly forgettable preprint, which would once have been read by only a few pedantic experts, can now be widely shared among hundreds of thousands on social media.

And our work will begin all over again.

The fact that there is no true institutional vigilance around a research literature that affects the health of nations, that it is necessary for us to do this, is obscene. It is a testament to how badly the scientific commons are managed that their products are fact-checked for the first time by a group of weary volunteers.
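The copy-paste irregularity described in the Lebanese study above is one of the easier forensic checks to automate. The sketch below is my own illustration, not the team’s actual tooling; the “patient records” and the block size are invented for the example. It flags any block of consecutive rows that appears more than once in a data set:

```python
def find_repeated_blocks(rows, block_size=5):
    """Flag consecutive blocks of records that appear more than once
    in a data set -- a crude signal of copy-pasted data."""
    seen = {}      # block contents -> index where first observed
    repeats = []   # (first_index, duplicate_index) pairs
    for start in range(len(rows) - block_size + 1):
        block = tuple(rows[start:start + block_size])
        if block in seen:
            repeats.append((seen[block], start))
        else:
            seen[block] = start
    return repeats

# Synthetic "patient records" as (age, outcome) pairs; the first five
# rows have been pasted three times, mimicking the irregularity described.
records = [(34, 0), (61, 1), (45, 0), (52, 1), (29, 0)] * 3
records += [(70, 1), (38, 0)]
print(find_repeated_blocks(records))
# -> [(0, 5), (1, 6), (2, 7), (3, 8), (4, 9), (0, 10)]
```

In practice a check like this is only a starting point: the block size is a free parameter, legitimately identical records must be ruled out, and reviewers combine it with tests for impossible summary statistics and implausible digit patterns.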
Public Health Was Radical Once
There was a time, at the start of the 20th century, when the field of public health was stronger and more ambitious. A mixed group of physicians, scientists, industrialists, and social activists all saw themselves “as part of this giant social-reform effort that was going to transform the health of the nation,” David Rosner, a public-health historian at Columbia University, told me. They were united by a simple yet radical notion: that some people were more susceptible to disease because of social problems. And they worked to address those foundational ills—dilapidated neighborhoods, crowded housing, unsafe working conditions, poor sanitation—with a “moral certainty regarding the need to act,” Rosner and his colleagues wrote in a 2010 paper.

A century later, public health has succeeded marvelously by some measures, lengthening life spans and bringing many diseases to heel. But when the coronavirus pandemic reached the United States, it found a public-health system in disrepair. That system, with its overstretched staff, meager budgets, crumbling buildings, and archaic equipment, could barely cope with sickness as usual, let alone with a new, fast-spreading virus.

By one telling, public health was a victim of its own success, its value shrouded by the complacency of good health. By a different account, the competing field of medicine actively suppressed public health, which threatened the financial model of treating illness in (insured) individuals. But these underdog narratives don’t capture the full story of how public health’s strength faded. In fact, “public health has actively participated in its own marginalization,” Daniel Goldberg, a historian of medicine at the University of Colorado, told me. As the 20th century progressed, the field moved away from the idea that social reforms were a necessary part of preventing disease and willingly silenced its own political voice.
By swimming along with the changing currents of American ideology, it drowned many of the qualities that made it most effective.

Public health’s turning point, according to several historical accounts, came after the discovery that infectious illnesses are the work of microbes. Germ theory offered a seductive new vision for defeating disease: The old public health “sought the sources of infectious disease in the surroundings of man; the new finds them in man himself,” wrote Hibbert Hill in The New Public Health in 1913. Or, as William Thompson Sedgwick, a bacteriologist and a former president of the American Public Health Association, put it, “Before 1880 we knew nothing; after 1890 we knew it all.”

This revolution in thinking gave public health license to be less revolutionary. Many practitioners no longer felt compelled to deal with sticky, sweeping problems such as poverty, inequity, and racial segregation (or to consider their own role in maintaining the status quo). “They didn’t have to think of themselves as activists,” Rosner said. “It was so much easier to identify individual victims of disease and cure them than it was to rebuild a city.” Public-health leaders even mocked their predecessors’ efforts at social reform, which they saw as inefficient and misguided. Some dismissively billed the impressive work of the sanitarian movement, which had essentially plumbed entire cities, as “a matter of pipes.”

As public health moved into the laboratory, a narrow set of professionals associated with new academic schools began to dominate the once-broad field. “It was a way of consolidating power: If you don’t have a degree in public health, you’re not public health,” Amy Fairchild, a historian and the dean of the College of Public Health at Ohio State University, told me.
Mastering the new science of bacteriology “became an ideological marker,” sharply differentiating an old generation of amateurs from a new one of scientifically minded professionals, wrote the historian Elizabeth Fee.

Hospitals, meanwhile, were becoming the centerpieces of American health care, and medicine was quickly amassing money and prestige by reorienting toward biomedical research. Public-health practitioners thought that by cleaving to the same paradigm, “they could solidify and extend their authority and bring public health up to the same level of esteem and power that medicine was beginning to enjoy,” Fairchild told me.

Public health began to self-identify as a field of objective, outside observers of society instead of agents of social change. It assumed a narrower set of responsibilities that included data collection, diagnostic services for clinicians, disease tracing, and health education. Assuming that its science could speak for itself, the field pulled away from allies such as labor unions, housing reformers, and social-welfare organizations that had supported city-scale sanitation projects, workplace reforms, and other ambitious public-health projects. That left public health in a precarious position—still in medicine’s shadow, but without the political base “that had been the source of its power,” Fairchild told me.

After World War II, biomedicine lived up to its promise, and American ideology turned strongly toward individualism. Anti-communist sentiment made advocating for social reforms hard—even dangerous—while consumerism fostered the belief that everyone had access to the good life. Seeing poor health as a matter of personal irresponsibility rather than of societal rot became natural.

Even public health began to treat people as if they lived in a social vacuum.
Epidemiologists now searched for “risk factors,” such as inactivity and alcohol consumption, that made individuals more vulnerable to disease, and designed health-promotion campaigns that exhorted people to change their behaviors, tying health to willpower in a way that persists today.

This approach appealed, too, to powerful industries with an interest in highlighting individual failings rather than the dangers of their products. Tobacco companies donated to public-health schools at Duke University and other institutions. The lead industry funded lead research at Johns Hopkins and Harvard Universities. In this era, Rosner said, “epidemiology isn’t a field of activists saying, ‘God, asbestos is terrible,’ but of scientists calculating the statistical probability of someone’s death being due to this exposure or that one.”

In the late 20th century, some public-health leaders began calling for a change. In 1971, Paul Cornely, then the president of the APHA and the first Black American to earn a Ph.D. in public health, said that “if the health organizations of this country have any concern about the quality of life of its citizens, they would come out of their sterile and scientific atmosphere and jump in the polluted waters of the real world where action is the basis for survival.” Some of that change happened: AIDS activists forced the field to regain part of its crusading spirit, while a new wave of “social epidemiologists” once again turned their attention to racism, poverty, and other structural problems.

But, as COVID has revealed, the legacy of the past century has yet to release its hold on public health. The biomedical view of health still dominates, as evidenced by the Biden administration’s focus on vaccines at the expense of masks, rapid tests, and other “nonpharmaceutical interventions.”
Public health has often been represented by leaders with backgrounds primarily in clinical medicine, who have repeatedly cast the pandemic in individualist terms: “Your health is in your own hands,” said the CDC’s director, Rochelle Walensky, in May, after announcing that the vaccinated could abandon indoor masking. “Human behavior in this pandemic hasn’t served us very well,” she said this month.

If anything, the pandemic has proved what public health’s practitioners understood well in the late 19th and early 20th centuries: how important the social side of health is. People can’t isolate themselves if they work low-income jobs with no paid sick leave, or if they live in crowded housing or prisons. They can’t access vaccines if they have no nearby pharmacies, no public transportation, or no relationships with primary-care providers. They can’t benefit from effective new drugs if they have no insurance. In earlier incarnations, public health might have been in the thick of these problems, but in its current state, it lacks the resources, mandate, and sometimes even willingness to address them.

Public health is now trapped in an unenviable bind. “If it conceives of itself too narrowly, it will be accused of lacking vision … If it conceives of itself too expansively, it will be accused of overreaching,” wrote Lawrence Gostin, of Georgetown University, in 2008. “Public health gains credibility from its adherence to science, and if it strays too far into political advocacy, it may lose the appearance of objectivity,” he argued.

But others assert that public health’s attempts at being apolitical push it further toward irrelevance. In truth, public health is inescapably political, not least because it “has to make decisions in the face of rapidly evolving and contested evidence,” Fairchild told me. That evidence almost never speaks for itself, which means the decisions that arise from it must be grounded in values.
Those values, Fairchild said, should include equity and the prevention of harm to others, “but in our history, we lost the ability to claim these ethical principles.”

This tension has come up over and over again in my reporting. Although the medical establishment has remained an eager and influential participant in policy, public health has become easier than ever to silence. It need not continue in that vein. “Sick-leave policies, health-insurance coverage, the importance of housing … these things are outside the ability of public health to implement, but we should raise our voices about them,” said Mary Bassett, of Harvard, who was recently appointed as New York’s health commissioner. “I think we can get explicit.”

Public-health professionals sometimes contend that grand societal problems are beyond the remit of their field. Housing is an urban-planning issue. Poverty is a human-rights issue. The argument goes that “it’s not the job of public health to be leading the revolution,” Goldberg said. But he and others disagree. That attitude emerged because public health moved away from advocacy, and because the professionalization of higher education splintered it off from social work, sociology, and other disciplines. These fragmented fields can more easily treat everyone’s problems as someone else’s problem.

The future might lie in reviving the past, and reopening the umbrella of public health to encompass people without a formal degree or a job at a health department. Chronically overstretched workers who can barely deal with STDs or opioid addiction can’t be expected to tackle poverty and racism—but they don’t have to. What if, instead, we thought of the Black Lives Matter movement as a public-health movement, the American Rescue Plan as a public-health bill, or decarceration, as the APHA recently stated, as a public-health goal?
In this way of thinking, too, employers who institute policies that protect the health of their workers are themselves public-health advocates.

“We need to re-create alliances with others and help them to understand that what they are doing is public health,” Fairchild said. The field in the late 19th century was not a narrow scientific endeavor but one that stretched across much of society. Those same broad networks and wide ambitions are necessary now to deal with the problems that truly define the public’s health.
College Admissions Are Still Unfair
This week Amherst College announced that it was ending the use of legacy preferences in its admissions process. Its president, Biddy Martin, acknowledged that providing an advantage to applicants who are the children of alumni “inadvertently limits educational opportunity.” When incredibly wealthy, highly selective colleges such as Amherst (endowment: $3.7 billion; admission rate: 8 percent) make an announcement like this, it’s tempting to pour a bucket of cold water on the self-congratulatory fireworks they’re lighting for themselves. That should not happen this time.

Still, the temptation is real. Why congratulate a college for doing something it should have done a long time ago? Many large public universities, including the University of California, the University of Texas, and Texas A&M, dropped legacy preferences years ago, as did Johns Hopkins University two years ago and MIT and Caltech before that. Pomona College, a highly ranked liberal-arts college like Amherst, dropped legacy preference in 2017. What’s more, even at those places that give the offspring of alumni an edge, or what admissions offices call a “tip,” legacies usually account for a small share of the enrolled class, and many of those admits would likely have gotten in without the advantage, or so the admissions deans defending the practice of birthright advantage like to say.

Wouldn’t it be better to go after all those jocks on campus? Division III sports have been called “affirmative action for rich white students,” and almost a third of Amherst’s roughly 1,800 students play one of the college’s 27 varsity sports. That is nearly three times as many athletes as legacies.
Amherst plays in the New England Small College Athletic Conference, in which 77 percent of athletes are white, while the college’s student body is only 43 percent white, which makes you wonder whether Amherst’s obsession with sports is also “inadvertently limit[ing] educational opportunity,” maybe even more so than legacy preferences were.

Within a few hours of Amherst sharing its news, Catharine B. Hill, the managing director of the nonprofit consulting and research firm Ithaka S+R, dumped out her bucket. In a piece entitled “Ending Legacy Admissions Won’t End Inequity,” Hill stated bluntly, “Legacy admissions are bad from a public-relations perspective, but ending them would do almost nothing to improve socioeconomic diversity at these institutions or increase lower-income students’ likelihood of being admitted.” Splash! So much for your fireworks, Amherst.

[Read: College sports are affirmative action for rich white students]

It’s important to say that Hill is not some internet troll or the kind of economist who thinks the market should decide who gets to go to college or drink clean water. She is the former president of Vassar College and a serious champion of college access. Under her leadership, Vassar reinstated its “need-blind” admissions policy and significantly raised its enrollment rate of students with Pell Grants. Hill’s absolutely accurate point is that increased institutional spending on grant aid—not loans—for students with economic need will do much more to increase the enrollment of working-class and low-income students at wealthy colleges than getting rid of legacy admissions will.

The problem with Hill’s argument is that Amherst did precisely what Hill recommended: It increased its financial-aid investment to make an education there more affordable. Her critique poses a false choice. Amherst showed that it’s possible to do two good things at the same time.
There’s no need to decide between getting rid of the legacy tip and increasing need-based financial aid. Furthermore, no one who is fighting to end legacy admissions claims that doing so will “end inequity,” as Hill suggests. Straw men should be kept away from the fireworks.

Perhaps worst of all, arguing that legacy preferences have no impact (a claim that is also untrue) only gives further cover to the institutions that still think some children should inherit a leg up in the admissions process. Almost 400 University of Notre Dame freshmen, or 19 percent of the class, were legacies this year. At the University of Southern California, more than 500 freshmen legacies were enrolled last fall. Would they have gotten those spots without a legacy tip? Perhaps, but at Johns Hopkins, which is just as selective, the percentage of enrolled legacies declined from 12.5 to 3.5 percent, while Pell enrollment climbed from 9 to 19 percent.

[Ronald J. Daniels: Why we ended legacy admissions at Johns Hopkins]

There is also an important component of racial justice in dropping legacy preferences. The practice overwhelmingly benefits white applicants and harms first-generation, immigrant, low-income, and nonwhite students. A 2018 lawsuit against Harvard revealed that 77 percent of legacy admits were white, while just 5 percent were Black and 7 percent were Hispanic. At Notre Dame, the class of 2024 had five times as many legacies as Black students. The college-access advocate Akil Bello told me that “eliminating legacy preference at what I like to call highly rejective colleges matters because it ends the perpetuation of the generational head start and advantages that white people in this country have.” Colleges want to hold on to their institutional legacy, but discrimination is part of that legacy. And, as the legacy-enrollment rates of several highly ranked colleges versus their Black-enrollment rates in the chart below show, the failure to serve all students remains a part of their present.
[Chart omitted: legacy-enrollment rates versus Black-enrollment rates at several highly ranked colleges. Notes: ***share includes legacies and donors, from data released according to California legal requirements; ****data from trial documents, covering the Classes of 2015 to 2019.]

The fundamental reason to get rid of legacy preferences is that they are unethical; that conviction led a group of young, first-generation students to create the “Leave Your Legacy” campaign, which helps alumni contact their alma mater to say they will not donate any money until the legacy preference is eliminated. This kind of pressure, and the attention Amherst is rightly getting for dropping the tip for legacies, could help persuade more colleges to give up an ugly and unfair practice.

Eliminating legacy preferences is not the end of the fight for fairness in college admissions; it’s the beginning. Elite colleges and universities have a long way to go in making a meaningful commitment to diversity, but Amherst took one step closer to it this week. And for that, I’m happy to light a sparkler.
Where Have All the Workers Gone?
The U.S. economy right now is a little bit like Dune.

Not Frank Herbert’s magisterial sci-fi epic novel, or Denis Villeneuve’s new and reportedly sumptuous film adaptation. I mean David Lynch’s infamously bewildering 1984 movie version, which is remembered mostly for being a semi-glorious mess. Like that space oddity, today’s economy is too strange to neatly categorize as “clearly great” or “obviously terrible.” You keep waiting for it to just be normal. But it stays weird—big economic indicators point in conflicting directions—so you have to accept that nothing is going to make sense for a while, and maybe it’ll be okay.

Americans are buying more stuff than ever before. That’s good. But because of supply constraints, it can feel like there’s a painful shortage of just about everything. That’s bad. Economic growth is booming, but the president’s approval rating on the economy is falling, which is a historically odd juxtaposition. Businesses everywhere are struggling to fill jobs, which sounds bad, but employer pain is workers’ gain, and wages are rising, which is wonderful. But because prices are rising too, inflation-adjusted hourly-wage growth actually declined in September, which is not wonderful.

The strange October economy is a chapter within a broader saga of strangeness. Last year, COVID-19 put our economy in a time warp by forcing tens of millions of Americans to stay home, destroying millions of jobs, and accelerating the digitization of at-home shopping and entertainment. The pandemic thrust many people back into the homestead economy of the 1830s, while also re-creating the Depression-era economy of the 1930s and advancing into the virtual economy of the 2030s. Like the dreams of the Dune boy-hero Paul Atreides, the U.S. economy is experiencing the disorienting superposition of multiple timelines.

The great mystery of this moment is the labor shortage. America’s GDP is larger than it was in February 2020.
But the total economy is down about 7 million workers. That’s akin to the entire labor force of Pennsylvania sitting on the sidelines. In September, the number of people working or actively looking for work mysteriously declined, which is not what you would expect to see in a rapidly growing economy with simmering inflation. Wages are rising. Job openings are everywhere. But we’re running out of people who seem to want a job right now.

[Derek Thompson: America is running out of everything]

So what’s going on? Where are all the workers?

That might sound like a stupid question, given that we are still in a pandemic. More than 10,000 Americans are still dying of COVID-19 every week. Tens of thousands more are sick from recent infections or lingering symptoms. Millions more might be scared of throwing their bodies in front of the coronavirus by going back to work among rude customers who might refuse vaccines, masks, or any sense of human decency. Finally, more than 700,000 people have died from COVID-19, and although it’s ghoulish to treat these deaths predominantly as a loss for the labor force, that the virus has killed many American workers is nonetheless undeniable.

But when you look closely, the direct effects of COVID-19 don’t explain very much. Most pandemic deaths have been among elderly people, not Americans of prime working age. And COVID fears have lessened over the past few months. Even so, the number of Americans under 65 looking for work is still shrinking.

“What’s most puzzling to me is that the labor shortage is everywhere,” Jason Furman, the chairman of the Council of Economic Advisers under President Barack Obama, told me. “It’s everywhere, and it’s every industry. Every small-business person I talk to has a story. And this is coinciding with large increases in nominal wages. So what are people doing? How are they getting by?”

The most complete explanation is that the massive fiscal-policy response to the pandemic reduced the urgency of looking for work.
The United States has spent trillions of dollars to help families get through the economic deep freeze, via stimulus checks, expanded unemployment benefits, and the moratorium on student-loan interest payments. National eviction bans have taken pressure off renters. Then there’s the record-high surge in savings among families who haven’t gone on vacation or splurged on experiences in more than a year. Add to that the fact that job openings have hit record highs—which means people know that if they wait a month or three, there will still be jobs aplenty to apply to. Seeing this whole picture, more Americans clearly feel like they can take a more leisurely approach to going back to work.

Surveys bear this out. A monthly questionnaire by the hiring company Indeed found that the most common reasons given for not looking for work right now are “having an employed spouse” and having a “financial cushion,” followed by “care responsibilities” and then “COVID fear.” These might seem like distinct reasons, but we can knit them all together into one meta-explanation: People can afford to prioritize family care and avoid COVID-19, for now, because of savings and working partners.

The labor shortage fits into a broader picture of workplace turmoil. Widespread media reports assert that strikes are “sweeping the labor market,” although it’s a bit unclear whether the frequency of strikes or the volume of the media coverage is what’s increasing. What’s more certain is that Americans are quitting their jobs in record numbers, especially in the leisure and hospitality sector. The “Great Resignation” seems to be accelerating, alongside a remote-work revolution in the knowledge economy.

This raises a bigger question: Is this a new normal? For now, much of the labor force seems to be participating in a kind of distributed protest against the status quo of work in America.
As more people reject the office, spend more time with their family, or avoid returning to work entirely, this may prove a turning point in the relationship between labor and capital.

Or maybe not! Perhaps we are suspended in an air bubble in history, and perhaps it will pop in the next year. Eventually, Americans will go back to work, where bosses will still boss them around, employers can still fire them, unions are still rare, and real wage growth is still slow. President Joe Biden is stumping for a social-infrastructure bill that would include paid family leave, expanded child tax credits, and subsidized child care. But the fate of that bill is highly uncertain.

[Derek Thompson: The Great Resignation is accelerating]

Whether or not today’s worker revolt becomes tomorrow’s worker revolution, what’s abundantly clear is that America needs more workers. America’s prime-age population stopped growing more than a decade ago, and because of declining fertility rates, it’s unlikely to recover through natural growth alone. If the U.S. needs more workers, the arithmetic is straightforward: We need more immigrants.

Welcoming immigrants is more complicated than putting up a Help Wanted sign at the border. Democrats are looking for ways to expand legal immigration—a matter of moral and long-term economic urgency—while avoiding a xenophobic backlash from the right. One great way to do this would be to “recapture” surplus permanent-residency visas, or green cards, that went unclaimed in previous years. Since 1992, hundreds of thousands of green cards authorized by Congress have not been issued because of administrative hiccups; last year, unused green cards reached a record high. As a result, the U.S. could extend permanent-residency visas to more than 100,000 immigrants—essentially liberalizing immigration law without technically increasing the total number of visas already authorized by Congress.
This would be a clever first step in allowing more legal immigration without spooking Americans who are, for a variety of reasons, resistant to dramatic changes in the number of people the U.S. admits.

Eventually, Americans will spend down their savings, and millions of people will come back from the sidelines and start working again. When they do, America will still need more workers to build houses, staff restaurants, run hotels, and care for the elderly—fields that are now experiencing serious worker shortages and that, in the past, have provided many immigrants with their first foothold in the U.S. economy. More immigration would fill more vacancies, stimulate more demand, and lead to more new ideas, new companies, and new technologies. What stands in the way of this abundance agenda is little more than an irrational fear of new Americans’ contributions. In economic policy, as in interstellar psychological warfare, fear is the mind-killer.
The Atlantic Daily: January 6 Isn’t Going Away
Every weekday evening, our editors guide you through the biggest stories of the day, help you discover new ideas, and surprise you with moments of delight. Subscribe to get this delivered to your inbox.

Certain moments in history leave long shadows. The January 6 attack on the U.S. Capitol is sure to be one of them, even though the fallout is far from settled. Not even a year has passed, and already we are seeing glimpses of its disturbing cultural legacy. This is the New Lost Cause. David A. Graham argues: “This mythology has many of the trappings of its neo-Confederate predecessor, which Trump also employed for political gain: a martyr cult, claims of anti-liberty political persecution, and veneration of artifacts.”

In some ways, the events were not a riot, but a war. A new HBO documentary focuses on the clash between Capitol Police and the insurgents: “There is something striking in seeing people on two sides of a very recent conflict discuss the opposing roles they played in it,” our culture writer Sophie Gilbert notes.

Explore the week that was. It’s springtime in Sydney, Australia, where the surfers above are catching a morning wave. See more photos from around the world, as curated by our senior editor Alan Taylor.

Read. A new book offers a sweeping account of human history. Prefer a classic? There’s always Crime and Punishment—a novel Dostoyevsky had no choice but to write.

Watch. Dune is here. The highly anticipated adaptation of the sci-fi novel is epic—but that’s not what makes it great, our critic David Sims argues. If you’re counting down the days to Halloween, try a scary movie, as picked by David.

Listen. Calling Succession fans! On the latest episode of The Review, our critics discuss the show’s Season 3 premiere. On this week’s How to Build a Happy Life, we talk about loneliness—and how to recognize whether you’re feeling it.
Stop Shopping
Lately, news stories about the supply chain tend to start in similar ways. The reader is dropped into an American container port, maybe in Long Beach, California, or Savannah, Georgia, full to bursting with trailer-size steel boxes loaded with toilet paper and exercise bikes and future Christmas presents. Some of the containers have gone untouched for weeks or months, waiting for their contents to be trucked to distribution centers. On the horizon, dozens of additional vessels are anchored and idle, waiting for their turn in the port. More ships keep arriving. Everyone involved—sailors, longshoremen, customs clerks, truckers—works as fast and hard as they possibly can. It’s not fast or hard enough.

The supply chain, as you know, is having a bad time. That’s been true since the pandemic began. Shortages in consumer goods have persisted far beyond analysts’ initial expectations, then beyond their subsequent revisions. At the moment, for most types of goods, shelves aren’t exactly bare yet. For the relatively well-off Americans accustomed to the astonishing abundance of big-box retail and grocery stores and the near-instant gratification of online shopping, it’s more a matter of having to settle for your third-favorite brand of Greek yogurt or wait six weeks for back-ordered jeans. But what’s already a genuine crisis for people who work in the global supply chain could very well turn into one for all of us; the manufacturing and distribution of necessities such as food and medicine require many of the same resources as the consumer economy’s various conveniences and diversions.

What news stories generally don’t show you is where all of this stuff is going. At least anecdotally, much of it seems to be headed directly into the overflowing package room in my apartment building.
As Slate’s Jordan Weissmann recently pointed out, it’s not as though the volume of goods getting through this mess and to retailers has slowed to a trickle; imports last month were actually at an all-time high, eclipsing the same period in 2019 by 17 percent. Rather, Americans are buying an extraordinary amount of stuff. Especially in the past six months, the system has been rocked by explosive demand.

When you dig down into the numbers of just how much people are purchasing, the way that the supply-chain crisis gets talked about starts to feel a little uncanny. As the holidays approach, many people have begun to worry that shortages will worsen, not just for kids’ toys and other popular gifts, but for holiday decorations, seasonal clothing, and even food—the things that make the end of the year special. Despite those concerns, few seem willing to acknowledge that the record amount of stuff being brought into the country isn’t merely disappearing off store shelves. We know where it’s going, and we know who’s buying it all up. They—and maybe you—could simply knock it off.

I’m not proposing that you or anyone else boycott commerce on a conceptual level. That would be impossible, and it would ignore how human life in this country works. It would also be the sort of killjoy, self-righteous proposal that doesn’t gain much traction. Shopping is fun—novelty and possibility are fun—and it’s often how people access the tools and materials to do things that bring them genuine comfort or joy, which everyone needs. But even a quick glance at America’s credit-card statements begins to explain the mess we’re in. A lot of people buy things for the sake of it, stuff they don’t need or even particularly want and in many cases won’t use, as a salve for boredom or anxiety or insecurity.
On the whole, consumer expenditures, which encompass both necessity spending (rent, gas, groceries) and discretionary spending (whatever you ordered from an Instagram ad after three glasses of happy-hour wine last Friday), account for about 70 percent of the country’s economy, according to the Bureau of Labor Statistics. But that spending is not distributed equally. In a typical year, the most affluent 20 percent of people account for nearly 40 percent of the country’s consumer spending, and this wealthier group’s purchases are disproportionately discretionary. Through the course of the pandemic, the situation has become even more lopsided. The affluent group spent much of 2020 working from home, largely insulated from mass unemployment and socking away the lion’s share of what Bloomberg Economics estimates as $2.3 trillion in extra cash that this group’s members might have otherwise spent on vacations or restaurant meals. Population-wide gains in spending power largely haven’t accrued to people with the most quality of life to gain from buying a few more things—they’ve gone to people for whom shopping is already a way of life. After taking a dive in the first months of a pandemic, spending from this group began to rebound relatively quickly as fears of white-collar layoffs dissipated and people began sprucing up their houses and yards and wardrobes. Since this summer, the group’s shopping has escalated further, even as spending among people with lower incomes has fallen off. The relatively well-off have returned to stores with money burning a hole in their pockets, gobbling up designer handbags, fine champagne, new cars, teeth whitening, and pretty much anything else you can think of. The problem with the explosion of this kind of discretionary shopping is that the same logistical resources that make this spike possible are also needed in other parts of the economy. 
The goods necessary to make school lunches—a vitally important civic function—might not be available for reasons that have nothing to do with how much food is theoretically available. Experienced workers and truck space and loading docks and time itself are not limitless resources. In a system asked to function beyond its capacity, if the distributor of hundred-dollar throw pillows can pay more for access to trucking capacity than a local food distributor that serves schools, then the pillows go on the truck.

Currently, these resources get allocated according to little other than profit. Thinking about how necessary something is in the lives of everyday Americans, or how helpful its replenishment would be to people in genuine need, is the kind of resource triage that generally happens only after a natural disaster, and sometimes not even then. Somewhere along the line, powerful people in both business and government decided that the weaknesses that have caused the near-collapse of the supply chain are things Americans should just live with. For example, even before the pandemic, many truckers looked for work elsewhere instead of hauling goods out of container ports, because port trucking is particularly brutal and poorly compensated work. Instead of directly addressing this type of obvious problem in how goods are moved, America’s government and media have so often simply pleaded with Americans to spend more money—to create jobs, to revitalize the economy, to save the country.

It’s no surprise we’ve obliged. Shopping has been marketed as a civic responsibility in America for more than a century. According to Tim Kasser, a psychologist and professor emeritus at Knox College who has spent decades studying materialism, the word citizen has slowly come to be replaced by the word consumer in newspapers and books. “It’s become more and more a sort of a default, to think of people as consumers instead of the myriad other roles that they play,” he told me.
That’s also how people are socialized to think of themselves. For Americans, shopping isn’t just an activity about collecting the resources necessary for safe, happy lives. Over time, it’s become an expression of personal identity, a form of entertainment, and a way in which some believe they can effectively participate in politics—people rush to buy from or boycott companies on the basis of their public stances on social issues, and brands have begun to run extensive get-out-the-vote campaigns among their customers.

Kasser points out that a person’s propensity toward materialism—which his research defines as “a set of values and goals focused on wealth, possessions, image, and status”—tends to increase when they’re feeling threatened, insecure, or unsure of themselves. Research has shown that society-level threats can reproduce that effect at population scale. The pandemic threw people out of their normal routines; it severed people from the habits, settings, and relationships that undergird their self-conceptions; it made people fear for their lives. Of course those with resources responded by getting back to shopping for things they don’t need as quickly and voraciously as they could. The structure of American consumerism ensures that buying more of whatever sounds good in the moment is the primary way most people are able to cope with uncertainty. “The logic of the system requires people to come to believe that what’s important in life is to make a lot of money and to buy a lot of stuff,” Kasser told me. Once you do, “it’s very difficult to change your beliefs.”

Difficult—but not impossible. The shock of the pandemic can create at least one opportunity, Kasser said. It provides a relatively rare opening for people, shaken out of the day-to-day inertia of existence, to reevaluate their lives and their values en masse.
Kasser has found that an honest appraisal of those things generally leads people to less materialism and more investment in their families and communities. In recent months, many people have already done these reappraisals in their professional lives, as my colleague Derek Thompson has chronicled, quitting jobs in enormous numbers in pursuit of better wages or improved quality of life.

If you’re currently stewing in consumer hell, frustrated at shipping times and fearful of what holiday shopping will look like, it might be time to take a step back. You can stop. Not stop buying things entirely—you have to keep being a person, of course, and no one will begrudge you things that bring you joy, or begrudge your kids their Christmas presents. Some people will need to buy more or order more or get more deliveries than others, because the circumstances of their lives genuinely require it. But if you find yourself idly filling online shopping carts with mediocre sweaters or new golf equipment you won’t use until next spring anyway, you can just close the tab.

As America slogs through its protracted supply-chain woes, we can be honest with ourselves about what the need to constantly shop has done to the country and our own lives. Big-box retail (not to mention Amazon) was made possible by deregulating trucking and sending manufacturing overseas, which keeps the cost of consumer goods low but has replaced millions of opportunities for good, stable employment with customer-service jobs so crappy that workers are doing everything in their power to find another way to make a living. And the personal insult added to that societal injury? As a coping mechanism for the existential problems of American life, all that spending almost certainly doesn’t even make you happy.
Abundant research has shown that it doesn’t really make anyone happy, especially around the holidays.

As it stands, America’s central organizing principle is thoughtless consumption, acquiring things for yourself and letting everyone else pick over what you left behind on the shelves. You can decide you don’t like that. You can decide that people—your family, your friends, the people in your community, the port truckers and Amazon warehouse workers running themselves ragged—are more important to you than another box of miscellaneous stuff. You can take a bit of pressure, however tiny, off a system so overburdened that it threatens to grind everyone in it to dust. American shopping is a runaway train, gliding smooth and frictionless down the tracks toward God knows what over the horizon. Your brakes are small, but you can throw them whenever you want.
Democrats Stare Into the Abyss
Since mid-summer, Democrats have been trapped in a downward spiral of declining approval ratings for President Joe Biden, rising public anxiety about the country’s direction, and widening internal divisions over the party’s legislative agenda. The next few weeks will likely determine whether they have bottomed out and can begin to regain momentum before next year’s midterm elections.

Roughly since the rise of the Delta variant sent COVID-19 caseloads soaring again, the White House and congressional Democrats have faced a debilitating slog of dashed hopes and diminished expectations. Weeks of negotiation over the party’s massive economic-development and social-safety-net bill have mostly continued that story, with Democratic groups lamenting the loss of programs that are being lopped off to meet the objections primarily of two centrist Democratic senators, Joe Manchin of West Virginia and Kyrsten Sinema of Arizona—the same duo whose resistance to changing the Senate filibuster rule has so far stymied the party’s hopes of passing legislation establishing a nationwide floor for voting rights. Amid all of these reversals, anxiety is rising among Democrats about whether they can hold the governorship in next month’s election in Virginia—a state Biden carried last year by 10 points.

But after months of steady retreat, Biden and congressional Democrats are currently engaged in intense negotiations that will decide whether (and in what form) they can pass their sweeping economic and safety-net bill.
And after a Republican filibuster on Wednesday blocked the Democrats’ latest proposal to combat the voting-rights restrictions proliferating in red states, the party now squarely faces the choice that many activists consider an even more existential decision: whether it will reform the filibuster to pass that legislation.

[Read: The Democrats’ last best shot to kill the filibuster]

On both fronts, these deliberations provide the party a chance to finally begin posting legislative victories on significant priorities. For all that may be eliminated from the economic bill, which the party is seeking to pass under the reconciliation process that preempts a GOP filibuster, it could still encompass the biggest increase in both public investment and the social safety net since the 1960s, pumping money into programs for kids, health care, economic development, and climate change.

“The process has certainly been challenging, and we’ll still have far more to do to achieve economic and racial justice,” says Sharon Parrott, the president of the left-leaning Center on Budget and Policy Priorities. “But I think [this package] will be a very important set of significant policy advances that will be game-changing in a lot of ways.”

Passing the reconciliation bill would also clear the way for passage of the extensive bipartisan infrastructure package approved earlier this year in the Senate. And once reconciliation and infrastructure are completed, many hope Biden and other party leaders can intensify pressure on Manchin and Sinema to find some way to exempt voting-rights legislation from the filibuster.

“The fact that reconciliation has stretched this long has definitely been harmful to the efforts to move Manchin and Sinema on voting rights and the filibuster,” says Eli Zupnick, a spokesperson for the liberal advocacy group Fix Our Senate.
“My theory, and I think everyone’s theory throughout … is that once [the White House] got through reconciliation, they felt they could expend political capital with Manchin and Sinema in a way that they could not with reconciliation hanging out there.”

Democrats could still fall from this tightrope. Progressives could demand the inclusion of too many programs, even in truncated form, to realistically meet the spending ceiling Manchin and Sinema have set. Sinema’s resistance to higher tax rates, in turn, could make it impossible for the party to fund even a more modest version of its plans. And even if Democrats can solve the Rubik’s Cube of the reconciliation bill, nothing may move Manchin and Sinema from their defense of the filibuster, which on voting rights, as I’ve written, illogically gives Senate Republicans a veto over whether Washington responds to the restrictions that their Republican colleagues in the states are passing.

Moreover, the evidence of history is that legislative success in a president’s first year doesn’t guarantee electoral success in the midterm elections of his second year. Voter assessments of current conditions, on the economy and the country’s overall direction, have seemed to matter more. But while legislative success hasn’t been sufficient to ensure successful midterm contests, it may still be necessary to avoid the worst: The collapse of a party’s agenda can disillusion its core voters and send a signal of disarray to swing voters.

[David A. Graham: The Democrats’ greatest delusion]

A wide range of strategists from across the party’s ideological spectrum have escalated their calls in recent days for the party to arrive at a budget deal—almost any deal.
Simon Rosenberg, the president of NDN, a Democratic research and advocacy group, has argued for weeks that Democrats need to conclude the legislative wrangling so they can shift their focus back to the public’s top priority: containing the coronavirus pandemic and undoing the economic damage associated with it.

On Wednesday, the centrist group Third Way and the liberal polling organization Data for Progress held an unusual joint press conference to encourage Democrats to reach an agreement. “There are enormous substantive reasons why it’s important for individual components of this package to be included, but politically, what will matter most for Democrats is that the bills are done,” Sean McElwee, a co-founder and the executive director of Data for Progress, said during the event. “The sooner we can get these bills finalized … the sooner we can demonstrate, to both our base and the independents, that we are unified as a party and able to get things done. That’s why there is real urgency around getting this across the finish line.”

One reason for that urgency is the Virginia governor’s race on November 2. Democrats have been unnerved by former Governor Terry McAuliffe’s inability to establish a safe advantage over the Republican Glenn Youngkin in the race to succeed Democratic Governor Ralph Northam, who is term-limited.

A Youngkin victory would actually fit the state’s long tradition of pushing back against the president’s party: The party out of the White House has won every Virginia governor’s race since 1977 with just one exception—McAuliffe’s 2013 victory, the year after Barack Obama’s reelection.
But given the state’s blue tilt since then, a McAuliffe loss would still rattle Democrats, particularly because evidence suggests that Biden’s sagging popularity is exerting an undertow on the governor’s race: A Monmouth University poll released Wednesday found that a 52 percent majority of registered voters in the state now disapprove of Biden’s performance, and just over four-fifths of those disapprovers are backing Youngkin. McAuliffe is winning an even higher percentage of those who approve of Biden, but just 43 percent of voters express such positive views of him, the poll found. (A Fox News poll in Virginia last week showed Biden’s approval at 50 percent and McAuliffe narrowly leading.) McAuliffe has publicly pleaded for congressional Democrats to finish their work, particularly on the infrastructure bill.

Reaching agreement on the reconciliation bill (and the infrastructure package whose passage it would trigger) would hardly solve all of the Democrats’ problems. Economic unease, particularly over inflation, is rising, which some Democrats believe is the key reason Biden’s approval rating hasn’t recovered in most surveys (or has even continued falling) as the Delta wave has started to recede. No matter what happens on reconciliation, a long list of party priorities that passed the House appear doomed by the Senate filibuster, including immigration and police reform, LGBTQ equality, and gun control. And the final reconciliation bill, coming in at a price tag far below the original goal of $3.5 trillion, will inevitably be conspicuous for what it leaves out, including free community college and provisions pushing utilities to shift toward clean-energy sources. Depending on how talks with the unpredictable Sinema pan out, Biden could even be forced to retrench (or eliminate) a plan the party has discussed for 20 years to allow Medicare to negotiate lower prescription-drug prices.
Sinema’s resistance could also force Biden to accept little or no progress in reversing the reductions in corporate- and income-tax rates approved by Donald Trump and the Republican Congress—tax cuts that every Democrat in both chambers (including Sinema) voted against. Those would all be bitter pills for much of the party to swallow.

Voting rights, which is now proceeding on a completely separate path, may offer Democrats their best chance to heal those bruises and unite the party heading into 2022.

[Read: The Democrats’ dead end on voting rights]

For many party activists and strategists, the fate of the voting-rights bill is even more consequential than what happens to the reconciliation budget package. Amid all the red-state measures restricting access to the ballot and increasing Republican leverage over election administration, if Democrats cannot secure voting rights, “I think it would be a failed Congress,” Zupnick says, in a widely shared view. “It would be seen as the biggest missed opportunity and biggest political mistake in a generation at least. If they don’t take steps now to protect our democracy, the window could shut and there may not be another chance. This cannot be seen as a successful Congress no matter how strong the reconciliation bill is if they do not do something on democracy protection.”

For months, activists have complained that Biden and the White House have focused far more on passing the reconciliation bill than on passing the voting-rights legislation—an imbalance apparent in the president’s focus this week on the former even as the GOP blocked the party’s latest version of the latter. Biden, at a CNN town hall last night, said explicitly that he intended to complete his reconciliation bill before fully focusing on the voting-rights legislation—opening the door, for the first time, to supporting an exemption from the filibuster if necessary to pass it.

But the party’s best chance to solve both of these problems may be to link them.
It’s possible to imagine a grand bargain in which House and Senate progressives would accept the smaller reconciliation bill that Manchin and Sinema are demanding in return for the two senators creating some exemption from the filibuster for voting rights.

Manchin and Sinema, as some Democrats told me this week, may feel they already have enough leverage on both fronts that they don’t need to make any deals. But while Manchin is essentially immune to intra-party pressure in West Virginia, agreeing to advance the voting-rights bill would surely represent Sinema’s best opportunity to undo (or at least soften) the animus she’s generated among Democratic activists in Washington, D.C., and Arizona with her actions on issues such as the minimum wage and the reconciliation bill.

“It may be time for Dems to start thinking even more out of the box, given our thin majorities and struggles to get our agenda passed,” Rosenberg told me when I ran the idea of a grand bargain by him. “The endgame on reconciliation is going to be very hard, and perhaps something like this may be just the thing to get us to a good and smart final deal.”

In his floor speech after the latest GOP filibuster blocked the Democrats’ voting-rights bill, on Wednesday, Senate Democratic Leader Chuck Schumer noted that the Lincoln-era congressional Republican majorities passed the major Reconstruction civil-rights laws—including the Fourteenth and Fifteenth Amendments—on an entirely party-line basis, without a single vote from House or Senate Democrats (who were defending their allies in the former Confederate states). “To the patriots after the Civil War, this wasn’t partisan—it was patriotic, and American democracy is better off today because the patriots in this chamber at that time were undeterred by minority obstruction,” Schumer insisted.
A grand bargain among Democrats that simultaneously resolves their disputes over the spending bill and voting rights may be their best chance to uphold that tradition today—and reverse their own fading fortunes before 2022.
January 6 Wasn’t a Riot. It Was War.
In the days and weeks after the storming of the U.S. Capitol on January 6, 2021, commentators and media outlets grappled with the question of what to call that event. Language is sticky; it clarifies and obfuscates the truth depending on who’s wielding it. January 6 was described as or likened to a “riot,” a “tourist visit,” an “insurrection,” a “peaceful protest,” and a “coup attempt.” And yet, watching Four Hours at the Capitol, Jamie Roberts’s tight, unsettling new HBO documentary about that day, another word seemed more appropriate to me, one that most of the participants interviewed in the film might agree on. More than anything else, January 6 was war.

There have been a number of incisive breakdowns of that day, including “Day of Rage,” The New York Times’ 40-minute film detailing how the attack was strategized and executed, and how President Donald Trump and his allies fomented mass anger and even seemed to encourage the violence. Four Hours at the Capitol isn’t as analytical, or as thorough in its parsing of all the information that’s emerged. But its immersiveness offers something else. With his rigidly chronological framing and his interviews with people who were present at the Capitol that day, Roberts captures the extent to which both sides were engaging in combat. This dynamic emerges over and over again throughout different accounts and video clips. One clash between Capitol Police officers and pro-Trump extremists is referred to by a participant as “the battle for the tunnel.” Different interviewees describe fighting on “the front line,” engaging in “hand-to-hand combat,” and, in the case of one police officer, the strangeness of walking through his own colleagues’ blood. In a scene that seems ripped right out of a Bruce Willis movie, a police commander shouts, “We are not losing the U.S.
Capitol today, do you hear me?”

Like most people, I watched January 6 unfold from my couch, where the cognitive dissonance of seeing men in full tactical gear and Confederate Army cosplayers traipsing through the Capitol’s hallways was undercut by a genuine horror about what might happen next. TV news showed how easily the small number of Capitol Police officers present that day were overwhelmed. Matter-of-factly, Four Hours at the Capitol documents how fiercely they fought to keep the insurrectionists from overwhelming the building and reaching members of Congress. Roberts sweeps viewers quickly into the day, starting with an assembly of Proud Boys on the National Mall who seem disturbingly primed for violence even at 10:35 in the morning. Around noon, after Trump declares, “If you don’t fight like hell you’re not going to have a country anymore,” his followers start heading to the Capitol, a makeshift army equipped with flags, weapons, even a hangman’s platform.

Four Hours lets its subjects speak without interjection or correction, a decision that seems to respect its audience’s ability to reason out the logical gaps. Most of the people interviewed who stormed the Capitol that day seem either savvy enough to avoid self-incrimination or steeped in self-delusion. Roberts occasionally editorializes, following up a scene in which a Georgia car dealer recalls how proud he was that day “to see the American spirit that was on display” with footage of people smashing the windows of the Capitol with body shields stolen from cops. But there is something striking in seeing people on two sides of a very recent conflict discuss the opposing roles they played in it. “They were trying to kill us. There was no doubt in my mind,” says Michael Fanone, an officer who was dragged away from his colleagues by a crowd, beaten, and Tasered, resulting in a mild heart attack and a brain injury.
“There was a lot of fighting between patriotic people and Capitol Police” is how the Proud Boy Bobby Pickles puts it, likening January 6 to “1776, because it reminds us of revolting against our government.”

The breadth of people Four Hours includes adds emotional texture to its presentation of events. Roberts interviews both Democratic and Republican members of Congress, as well as the aides who hid in dark rooms, afraid they were going to be killed. Representative Ruben Gallego of Arizona describes the violent plan he made if he had to fight to survive, while Connecticut’s Rosa DeLauro recalls phoning her husband to tell him that she loved him, in case she didn’t make it out alive. Representative Buddy Carter of Georgia enthusiastically recalls needing to “fight” the certification of Joe Biden’s electoral victory, but seems frustrated that others took his words too literally. “How could y’all be so stupid? Guys, we were winning,” he says, exasperated. “We were winning the moral wars.”

[David A. Graham: The new lost cause]

What’s clear, watching the documentary, is how much worse things could have been—what might have happened if the hordes screaming Nancy Pelosi’s name had gotten to her, how bloody the day might have become had more police officers used their weapons, how many more cops and rioters might have died. As it was, one officer died the next day after suffering two strokes, while four died by suicide in the weeks after the battle. One pro-Trump extremist was fatally shot in the Capitol, one died of an amphetamine overdose, and two died of medical events related to heart conditions. The wife of Jeffrey Smith, a D.C. police officer who took his own life with his service weapon nine days after the attack, says that her husband was a “completely different person” when he arrived home that evening. “There was obviously something that happened that changed him.”

Capitol Police officers are equipped to deal with violence and threats to their lives.
They’re not trained for warfare, which is what must have made January 6 and their task of defending the U.S. Capitol seem so absurd. The last time anti-government forces stormed the building was in 1814, when British forces set fire to the Capitol, the White House, and the United States Treasury. Never before 2021 had the Confederate flag been paraded through the seat of the U.S. government. Even now, as my colleague David A. Graham wrote earlier this week, pro-Trump factions are trying to redefine January 6 as a mythic symbol, a New Lost Cause. But what Four Hours at the Capitol captures is impossible to deny: Pro-Trump forces went to war against the American officers charged with defending democracy.
Guns—Even Props—Are Not Toys
Alec Baldwin was involved in a tragic shooting on the set of his latest movie yesterday. One person was killed and another seriously wounded when a prop gun was discharged by the actor, according to the Santa Fe County Sheriff's Office. Early reports offered conflicting information. A spokesperson for Baldwin told the Associated Press that the gun in question was firing blanks. In an email to members of the International Alliance of Theatrical Stage Employees, the secretary-treasurer of IATSE Local 44 wrote that “a live single round was accidentally fired on set by the principal actor,” IndieWire reported.

It's impossible at this point to draw any hard conclusions about precisely what went wrong. But whatever the specifics, there’s a simple lesson to be learned: Guns aren't toys. Even props must be handled with respect for the harm they're capable of inflicting. Training is required to operate any firearm safely, whether on set, at the range, or at home. And following gun-safety rules is always imperative.

A variety of different guns are used in film productions. Those include rubber guns that don't function at all, airsoft guns with simulated blowback, blank-firing props, and even real functioning firearms. Blank-firing prop guns are designed to only work with blanks. Many have blocked barrels to prevent a projectile from being fired through them. Real guns used as props are sometimes modified in the same way. But live ammunition improperly loaded into either kind of gun could potentially overcome those precautions. Many productions employ trained safety experts to ensure that live ammunition is not brought on set, and that proper safety procedures are followed.

[Read: How should Hollywood respond to mass shootings?]

Even if a production does not make an obvious mistake, such as allowing live ammunition onto set, the use of blanks carries its own risks. Blanks are usually cartridges that are manufactured without the inclusion of the bullet.
They still feature a primer and powder charge, though, at about half the strength of a live round. That means they still expel a lot of hot gas at a high rate of speed and can still be dangerous. This is especially true if something is lodged in the prop gun's barrel that the charge can propel forward.

The military is acutely aware of these risks. If you've ever seen footage of soldiers training with a strange device on the end of their barrels, that was a blank-firing attachment. It's designed to both help the firearm cycle with the lighter powder load and block any potential projectiles—short of an actual bullet fired from a live round—from exiting the barrel.

There have been other tragic prop-gun-related accidents on set. The CBS Cover Up star Jon-Erik Hexum reportedly pointed a prop gun loaded with a blank at his head as a joke during a break on set back in 1984. The force of the blank going off so close to his head was enough to kill him, even without any bullet. Older-style blanks sometimes used a cotton wad, and if such a wad was propelled out of the barrel, it might have contributed to Hexum's death. Newer blanks that use crimped cartridges instead of cotton wads may provide an added layer of safety. But that doesn't mean they are perfectly safe either.

In 2008, a similar scene unfolded before a rendition of Oklahoma! at a Utah high school. A 15-year-old boy was killed when a gun firing blanks was pointed at his head, apparently at close range.

And in 1993, Brandon Lee, the son of Bruce Lee, was filming a movie titled The Crow. During a scene in which the character he played was supposed to be shot, something went wrong, and Lee was killed. His autopsy revealed that he'd been hit with a .44-caliber bullet.

The details in Lee’s case are, to some degree, still disputed.
We might never get a perfect explanation for what happened to him, and the same might be true for this latest tragedy.

[Read: Guns are a threat to the body politic]

This kind of negligence is exceedingly rare. Prop guns, blanks, and even real guns used as props are involved in entertainment productions every day without a problem. But that is only the case when everyone handling the firearms, including the actors, both properly understands the risks involved in using them and follows the rules designed to mitigate those risks.

Although the details of this accident remain unclear, it appears likely that somebody did not follow all of the rules necessary to keep everyone involved safe. We don't yet know exactly which safeguards failed. But the result is one person dead and another in the hospital—because guns are not toys.
Let the Booster Mixing Begin
Mixing and matching vaccine brands is officially on the table in the United States. But that option might soon be billed as the B-list choice.

Last night, CDC Director Rochelle Walensky gave the green light for Moderna and Johnson & Johnson booster shots, the long-awaited follow-up to a similar recommendation given to the Pfizer formulation last month. As the endorsement stands, all who are eligible for an additional jab—which now includes tens of millions more Americans—should be able to pick whatever booster brand they like. But discussions among a panel of experts who advised Walensky hinted at a catch: The agency has yet to issue its final clinical guidance on who, specifically, might want to boost with what—and an early draft of the recommendations suggests that Americans “should” stick with the same brand they got in their first go-round.

Switching to a different shot would be allowed, as was authorized by the FDA on Wednesday; per the draft CDC guidance, people may opt to mix and match based on availability or preference, after assessing their individual risks and benefits. (As a reminder, the FDA’s authorizations tell Americans what vaccines they’re allowed to get. The CDC follows that up with advice on what folks should do with those options.)

The CDC’s stance on mixing and matching, then, could end up being a relatively soft one, neither extolment nor excoriation. That might also be the most practical course of action for the agency, given the variables involved and the lack of clear-cut evidence that could untangle them. But the wishy-washiness of “Pick whatever” is confusing as hell.

Consider, first, the sheer number of choices now available to booster-eligible Americans (a limited set of mRNA recipients, and all folks who got J&J). With three approved or authorized vaccines, the simplest mix-and-match matrix has nine possible combos. But that’s an underestimate of the absolutely unmanageable number of variations therein.
Moderna’s third shots, for instance, come in full doses for immunocompromised people and half doses for everyone else. The timing of additional shots might matter, too: People who get a second injection of J&J half a year after their first seem to churn out more antibodies than people who wait just two months. Clearly, inoculation isn’t just about which vaccines you’re getting. It’s about which vaccines, when, how much, how often, in what order, on and on and on—an absolute multiverse of choices. Add to that the inevitable differences among individual immune systems, and just start to imagine the terror of the resulting flow chart. Against that chaotic-evil backdrop, the CDC’s interim preference for homogeneity has a certain appeal—even if it sets up a slightly judgy juxtaposition between what’s by the book and, essentially, what the mavericks might do, if they feel like it.

[Read: Should you mix and match your booster shot?]

Then again, maybe what the CDC says is, at this point, kind of moot. Millions of people have already boosted, some of them ahead of eligibility. Now, with even more choices available, “people who care will vote with their feet,” Céline Gounder, an infectious-disease physician at Bellevue Hospital, in New York, told me. That “may” in the CDC guidance is easy to grab and run with. For anyone who has made up their mind, in any direction, the agency’s relatively hands-off approach isn’t all that useful (or hard to ignore).

Cross-vaccine boosting can certainly come with perks. People won’t have to stress about matching brands across doses; individuals in at-risk groups might have the flexibility to avoid rare, shot-specific side effects. The strategy might even be more protective. None of that, though, makes actually selecting a booster any easier. As things stand, the decision requires a small leap of faith, or at least some immunological inference. Data on mixing and matching is still relatively scant, though the early evidence looks promising.
A recent National Institutes of Health study found that switching shots seems to juice out antibodies at least as well as, and in some cases quite a bit better than, staying the course with one brand. That seems especially true for the OG J&J crowd: mRNA boosters sent antibody levels soaring, compared with a second helping of J&J. (A caveat: The study boosted with the full Moderna dose, not the half dose that the FDA authorized for non-immunocompromised people.) If that pattern holds, J&J, already the least popular vaccine in the U.S., might become even more of an underdog.

That’s not for sure. Gounder is advising caution: The NIH study was small, tracking an imperfect proxy for protection in fewer than 500 people for a very limited period of time. Boghuma Kabisen Titanji, an infectious-disease physician and researcher at Emory University, in Atlanta, is a bit more optimistic, and told me that she finds the mix-and-match data compelling enough to offer the strategy. The trends in the NIH study, she pointed out, seem well in line with the months of data that have come out of places such as the United Kingdom, which adopted a hybrid approach early on, albeit for original doses and with a different set of brands (Pfizer and AstraZeneca).

Ideally, mixing and matching could blur the brand boundaries between vaccinated Americans, effectively collapsing more of us into the same pretty-well-protected pool. (Did you get Pfizer or Moderna? J&J? Who cares?) Or, it could splinter us into infinite subgroups that become ever more difficult to compare.

Collecting good data on vaccine responses is getting harder as inoculation becomes more bespoke. With so many Americans now poised to choose their own vaccine adventure—you know, as they may—the differences among regimens might get tougher to pin down. We need that data: What we learn now will—hopefully—help us design better, safer, more efficient vaccine regimens for future generations.
But if fewer people embark on similar trajectories, they could get more difficult to group together. Studies might have to be more limited in scope, or work harder to combine data from different parts of the country. That’s not impossible, Saad Omer, a vaccine expert and epidemiologist at Yale, told me. But it does make things “more challenging.”

Some of this beta-testing vibe harkens back to last winter, when experts heatedly debated the merits of skipping or delaying second doses of the Pfizer and Moderna vaccines. Certain countries, including the U.K., spaced out the shots; the U.S. and others stuck to the very trim gaps prescribed by trials. The delay was a gamble, since it left people partially protected for longer and sent mixed messages to a frustrated public. But now it looks beneficial. Really, we were all guinea pigs—and this mass round of boosting is slating us for a discombobulation redux.

We won’t all be winners; someone always has to be in the group that fares worse. Then again, “worse” is always relative. Anyone who’s playing the booster game is already, technically, fully vaccinated, putting them ahead of the billions around the globe who still are not. Titanji pointed out that more Americans have gotten boosters than people have received first doses in Nigeria, a country with some 200 million residents.

Even in the U.S., getting more first shots to people remains the bigger priority—that’s how we collectively contain the coronavirus. But the hyper-individualistic American approach to the pandemic is once again nudging each of us to chart our own course. The government has kind of shrugged about mix-and-match boosting, and punted the decision to us: Choose whichever path seems right to you; turn to page 7; hope for the best. Here’s the trick, though—no one’s sure where this chapter ends. Good luck, I guess.
Why Childhood Friendships Feel So Intoxicating
Earlier this month, a new novel by the late French writer Simone de Beauvoir was published. Written nearly 70 years ago by a woman who died 35 years ago, Inseparable follows the devoted, almost romantic friendship between fictionalized versions of de Beauvoir and her real-life childhood best friend, Zaza. De Beauvoir was besotted with Zaza. Her consuming infatuation with the girl seeps through every page—perhaps explaining why the author decided the book was too intimate to publish during her lifetime.

De Beauvoir’s intense adoration of Zaza may seem unique, but the experience of having such an intoxicating childhood bond is not unusual. As the journalist Lydia Denworth writes in an excerpt of her book Friendship published in The Atlantic, “The intensity of feelings generated by friendship—or loneliness—in childhood and adolescence is by design.” Literature is filled with tales about this type of youthful passion. Anne of Green Gables calls these bonds “bosom friends”—relationships one holds close to the heart. The author Elena Ferrante’s Neapolitan novels capture a complex and turbulent portrait of this kind of entanglement in the draw between its two protagonists, Elena and Lila. Elena is entirely in Lila’s thrall; she feeds off her vibrance, imitates her, and kicks off her writing career under Lila’s inspiration. At the same time, Elena is deeply jealous—a layer that only fuels their interdependence.

The writer Julie Buntin, who herself had an overwhelming and exhilarating childhood friendship, also considers these types of relationships in her debut novel, Marlena. The work both conjures the fervor of young crushes and critiques the tragic endings they tend to come to in literature. De Beauvoir’s friend died young, as does the titular character of Buntin’s book. In this context, Marlena prompts the reader to ask: Can homages to these fierce, entrancing young people exist without romanticizing the sad fates that seem to befall them?
Every Friday in the Books Briefing, we thread together Atlantic stories on books that share similar ideas.

What We’re Reading

The philosopher who took happiness seriously
“Unrequited love, embarrassing as it can be, is, in de Beauvoir’s thinking, a way of being unconstrained, because loving the other ‘genuinely’ is to love him ‘in that freedom by which he escapes. Love is then renunciation of all possession.’”
What Joe Biden Could Learn From Henry Kissinger
Last month, President Joe Biden went before the United Nations General Assembly in New York and declared the end of America’s forever wars in the Middle East. “As we close this period of relentless war,” he told the assembled representatives, “we’re opening a new era of relentless diplomacy.”

But Biden’s speech was accompanied by inauspicious diplomatic steps. First came the shambolic and ignominious withdrawal from Afghanistan, which left America’s allies feeling that the United States had failed to consult adequately with those who had fought beside it before it rushed for the exits. Then Biden announced a new Indo-Pacific defense pact with Australia and the United Kingdom. France, America’s oldest ally, was shafted in the process, its $60 billion contract to build diesel submarines for the Australian navy abruptly canceled, its role and interests in the Indo-Pacific rendered irrelevant to the Asian power equilibrium that Biden was striving to shore up in the face of a growing challenge from China. Relentless diplomacy was beginning to look like ruthless diplomacy. Indeed, if the art of diplomacy is to tell a person to go to hell and make him look forward to the trip, French President Emmanuel Macron’s outrage suggested that Biden had failed to meet that standard.

Perhaps Biden could learn something from America’s most accomplished diplomat, Henry Kissinger. At 98, Kissinger remains a controversial figure, his realpolitik brand of balance-of-power diplomacy reviled by some for its application in Laos, Cambodia, Chile, and Bangladesh, but revered by others for achieving the opening with China and détente with the Soviet Union. All of those events, however, took place while Kissinger was serving as Richard Nixon's national security adviser. Only when Kissinger became secretary of state in September 1973 and moved from the West Wing to Foggy Bottom were his diplomatic skills fully put to the test.
And that is when his relentless diplomacy in the Middle East sidelined the Soviet Union during the Cold War and produced four Arab-Israeli agreements, which established a new American-led order in that turbulent part of the world and laid the foundations for Arab-Israeli peace.

[From the December 2016 issue: The lessons of Henry Kissinger]

Like Biden after Afghanistan, Kissinger had to confront the limits to the use of force demonstrated by the United States’ defeat in Vietnam. And like Biden, he faced a period of domestic turmoil, as the Watergate scandal forced Nixon from office and raised questions abroad about the ability of the U.S. to sustain a coherent and reliable foreign policy. Kissinger executed a pivot in U.S. foreign policy away from Southeast Asia toward the Middle East. Ironically, almost five decades later, Biden is now executing a pivot away from the Middle East back toward Southeast Asia.

Recognizing the limits of coercive power and facing a growing isolationist trend at home, Kissinger, like Biden, understood that the United States could not afford to withdraw from the world. Instead, Kissinger depended on deft diplomacy to promote American interests at a time of intense geopolitical rivalry, when deploying ground forces was no longer an option.

Kissinger’s success was built with several key ingredients. He always began with a clear objective, at least in his own mind, and a strategic concept for how to achieve it. In the Middle East of the 1970s, his objective appeared to be peace between Israel and its Arab neighbors. But that obscured his real purpose, which was to build a new, American-led order in the region. For Kissinger, peacemaking diplomacy was a process designed to ameliorate conflicts between competing powers, not resolve them. He feared that pursuing peace as an idealistic end state would jeopardize the stability that his order was designed to generate.
Peace for Kissinger was a problem, not a solution. The desire for peace needed to be manipulated to produce something more reliable, a stable order in a highly volatile part of the world.

Kissinger’s diplomatic daring was informed by an innate conservatism. He was wary of the crusading impulses that drove many American leaders to overreach in their desire to remake the world in America’s image. He knew from his study of history that maintaining order was usually too prosaic an objective to inspire presidents, compared with the immortality they might hope to achieve by pursuing peace or democracy in far-off regions that knew little of either. Declarations like Biden’s—that democracy versus autocracy is the defining struggle of our time—were not for Kissinger. Rather, he pursued the more mundane but achievable idea of a balance of power between competing states to deter those who would seek revisions to the order. In his concept, that equilibrium would discourage war and create the conditions over time for peace and democratic change.

Once balance had been established, the United States, with its immense power, would play the role of the “indispensable balance wheel,” swinging back and forth between the contending regional powers, ideally positioning itself closer to all of them than they were to one another. That was the challenge for America then: using its power to deter states from disrupting the order and rewarding them for maintaining it. And that is the same challenge Biden faces today.

If Kissinger’s theory was clear, its practice was inevitably more complicated, especially in the Middle East. During his time in the White House, Kissinger agitated for a balance of power in which U.S. support for Israel, Saudi Arabia, and the Shah’s Iran would deter the revisionist impulses of Soviet-backed clients in Egypt, Syria, and Iraq. Détente with the Soviet Union buttressed this equilibrium because it committed Moscow to maintaining the regional status quo.
The order worked well enough for three years. But it collapsed when Egypt and Syria launched the Yom Kippur War on Israel in October 1973 and the Soviet Union, fearing the loss of its position of influence, swung behind them.

Kissinger was as surprised as the Israelis. He had become so confident in the prevailing equilibrium that he had overlooked a principle derived from his study of history: that for the order to be stable, a balance of power was insufficient; there also had to be a “moral consensus” among the powers that the existing arrangements were fair and just. The legitimacy of the Middle Eastern order Kissinger was creating in fact rested on shaky foundations because it failed to provide a sense of fairness or a modicum of justice to the Arab states that had lost territory to Israel in the 1967 Six-Day War.

By his own admission, Kissinger had underestimated Egyptian President Anwar Sadat, dismissing him as resembling a character from Verdi’s opera Aida (which is set in ancient Egypt). But once the Yom Kippur War broke out, Kissinger became determined to build a new Middle Eastern order based on working with Sadat to turn Egypt from a revolutionary power into a status quo power, moving it from one side of the balance to the other. In that way, he would remove the largest and most militarily powerful Arab state from the conflict with Israel, making it impossible for the others to contemplate going to war again. He learned this play from his study of the post-Napoleonic 19th-century European order, when Castlereagh and Metternich, the foreign ministers of Great Britain and Austria, respectively, brought France over to the side of the status quo powers.

[Read: In defense of Henry Kissinger]

Kissinger’s diplomatic mechanism for achieving this feat was an Arab-Israeli peace process in which the United States would persuade Israel to yield Arab territory in return for steps that would reduce the incentives for the Arab states to return to war.
But because he viewed peace with a jaundiced eye, his peace process would be cautious, gradual, and incremental. He labeled it “step-by-step diplomacy.”

“Territory for peace” became the legitimizing principle for Kissinger’s new Middle Eastern order. But how to convince Israel, which had just experienced the trauma of a war in which its survival seemed to hang in the balance, that yielding territory would make it more, not less, secure? Especially when Kissinger shared Israel’s skepticism about Arab countries’ peaceful intent.

Here’s where Kissinger’s manipulative skills became essential to his successful diplomacy. In a series of ferocious arguments with Golda Meir, Israel’s doughty prime minister, he did not try to sell her on peace. Instead, he persuaded her to give up territory for time: time for Israel to get over the trauma of the war, time to reduce its isolation and build its military and economic strength, and time for the Arabs to eventually accept Israel and make peace with the Jewish state.

Convincing the Israelis was painful, difficult, and frustrating, yet Kissinger was indefatigable. Hour after hour, in meeting after meeting in Washington and Jerusalem, he deployed all the arguments in his arsenal, at times with a sense of humor that disarmed his stiff-necked audience, at others with threats that only reinforced their resistance. In the end, though, he succeeded in persuading Israel to hand back to Egypt the Suez Canal and then the oil fields and strategic passes in Sinai. Two years later, after Kissinger had left office, President Jimmy Carter brought this process to completion in the Israel-Egypt peace treaty.

Israel became committed to trading territory for time. With American assistance, Israel used time to build up its military, economic, and technological capabilities, becoming the strongest power in the Middle East.
Meanwhile, over time, the Arab states gradually tired of the conflict, accepting the Jewish state in their midst and recognizing the benefits of cooperating with it—as recently evidenced by the Abraham Accords—just as Kissinger had predicted.

What he did not expect, however, was that Israel would also use time to consolidate its grip on the West Bank, as settlers continued to build and expand their communities with government support. At the end of Kissinger’s tenure as secretary of state, just 1,900 settlers lived on the West Bank; by 2020, that number had swelled to more than 466,000 in 131 settlements. That made it all the more difficult politically for Israel to withdraw from the territory even as its strength grew. Now, more than four decades later, the idea that Israel should relinquish the West Bank has become almost unimaginable. Kissinger understood the consequences of settlements for his legitimizing principle. He wrote in his memoirs that Israel had no choice in the end but to cede territory for peace, warning that “the Jewish state would consume its moral substance if it sought to rest its existence on naked force.”

Kissinger also applied his relentless diplomacy to Syria’s Hafez al-Assad. Syria did not have the same weight as Egypt in the Middle Eastern power balance, but it did have an important role to play in legitimizing Kissinger’s peace process. Syria prided itself on being the beating heart of pan-Arab nationalism, which utilized antagonism toward Israel as a unifying factor among disparate Arab states. By engaging in peacemaking with Israel, Sadat was breaking the mold. If Kissinger could entangle Assad in his diplomatic web, it would provide Arab cover for Sadat’s camp-shifting and undermine the Soviet Union’s ability to thwart Sadat’s endeavors.

Assad was shrewd enough to recognize that Kissinger’s purpose was to break up the united Arab front against Israel and that if he succeeded, Syria would be left weakened and isolated.
But he also understood that he could extract advantage from the fact that Kissinger needed him to provide cover for Sadat and reinforce the perception that only the United States could deliver for the Arabs. It was a match of wits and guile unlike any other in Kissinger’s experience as secretary of state. For 30 days, Kissinger shuttled between Jerusalem and Damascus, making 13 trips, with side excursions to Egypt and Saudi Arabia to secure the support of Sadat and King Faisal. It was a dispiriting, frustrating, and exhausting endeavor that put him out on the frontier of American diplomacy without any serious backing from President Richard Nixon—who was by that point completely preoccupied with fending off his imminent impeachment.

[From the January/February 2016 issue: The long history of leading from behind]

Back and forth the American diplomat went, patiently cajoling both sides closer to an agreement, threatening the Israelis, promising blandishments to the Syrians. The fruit of his labor, negotiated in 1974, was the Golan Heights disengagement agreement. It kept the peace in the Golan between Israel and Syria for more than four decades, with only a handful of minor violent incidents.

Even today, with Syria engulfed in civil war and Israel regularly striking Iranian targets, the agreement remains in force and the Golan Heights remains peaceful, notwithstanding efforts by Iranian-backed militias to approach the border and former President Donald Trump’s gratuitous recognition of Israeli sovereignty there. Kissinger’s relentless diplomacy had taken both Syria and Egypt out of the conflict with Israel. Henceforth, no Arab neighbor of Israel would contemplate waging war on the Jewish state, and all of them would seek to resolve their differences with Israel through American-led diplomacy.

It was by no means a flawless performance.
Kissinger underestimated the ability of lesser powers to disrupt the will of the great powers in the Middle Eastern order that he was tending, and his preference for order and skepticism about peace led him to miss several opportunities to advance the peace process that he had created. Nevertheless, the art in his diplomacy lay in his conception and achievement of an American-led regional order, in which the pursuit of peace was an essential mechanism.

What is the takeaway for the Biden administration as it shifts its focus from the Middle East to Asia to counter China’s rising, assertive power?

At a time of intense geopolitical competition, Kissinger’s first priority would be to establish an equilibrium in the Asian balance of power. Biden is attempting to achieve that by concerting the policies of the major powers in the region, bringing India into the fold, and building Australia’s force-projection capabilities. But America’s own military deployments in the Asian arena will need to be significantly strengthened, especially to deter a Chinese move against Taiwan. More attention clearly needs to be paid to the role of America’s European allies. They have less to contribute because of their geographic distance, but they can nevertheless add ballast to the enterprise, if only by building their own capacity to balance Russia in Europe, thereby helping relieve the burden on the United States as it shifts resources to Asia. Ignoring their interests, as Biden did recently with France, only advantages China, creates problems in Europe, and undermines the credibility of American diplomacy by revealing a gap between rhetoric and practice.

As Kissinger learned the hard way in the Middle East, an equilibrium in the balance of power is insufficient without a legitimizing principle that gives our partners a sense of fairness and justice. In Asia, the threat of a Chinese-dominated alternative order makes that easier to achieve.
America’s role as the “offshore balancer” in Asia, which it has been playing since its withdrawal from Vietnam, is broadly accepted there as desirable. Moreover, the threat from China concentrates the minds of leaders who might otherwise pursue grievances with their neighbors.

However, Biden’s idea of an alliance of democratic states to counter autocratic ones, far from generating a moral consensus in Asia, could make one more difficult to accomplish. Many of the powers that Biden needs on his side are autocratic or trending that way, including the Philippines, Malaysia, Vietnam, Singapore, and India. Biden would be better off developing a coherent trade policy that would benefit our regional partners. By joining the Trans-Pacific Partnership, for example, the United States could help legitimize fair-trade rules that China would have difficulty ignoring.

At the same time, Biden also needs to shore up the Middle Eastern order, lest our retrenchment from there tilt the balance of power in favor of Iran and outside powers such as Russia and China. An event like Iran attempting to cross the nuclear-weapons threshold, for example, could divert American attention away from Asia and require the United States to return to the Middle East with force yet again. The Arab-Israeli peace process that Kissinger initiated to legitimize that order has also stalled. If a sense of justice and fairness is to be restored, Kissinger’s approach of a gradual, incremental process needs to be applied to the Palestinian problem.
The small economic steps now being taken by the Israeli government should be tied to a peace process that begins with some territorial steps (such as restricting settlement building and ceding more territory to Palestinian control) and leads to an eventual Palestinian state in the West Bank and Gaza at peace with Israel.

Kissinger’s style of relentless diplomacy required a combination of caution, skepticism, agility, creativity, resoluteness, and guile in the service of a strategy that favored the pursuit of order over grandiose objectives and magical thinking. By those standards, Biden’s relentless diplomacy is falling short. But the learning curve is always steep in a new administration, no matter how professional the policy makers. They would do well to absorb the lessons from Kissinger’s experience.
Why Aren’t More Pregnant People Getting the Vaccine?
Across the U.S., vaccination numbers have been slowly climbing, protecting more and more of the population and bringing the country closer to getting the coronavirus under control. But despite this success, some high-risk groups have lagged behind. In particular, rates among pregnant people are discouragingly low.

Although more than three-quarters of all eligible adults have gotten at least one COVID-19 shot, only about 25 percent of mothers-to-be have gotten one during their pregnancy. Rates are even lower for Latina and Black expectant mothers, at 22 and 15 percent, respectively, compared with 27 percent of white and 35 percent of Asian expectant moms. The vaccines are safe for use during pregnancy—a CDC study on the Pfizer and Moderna mRNA shots found that they did not increase miscarriages, and the agency has urged pregnant people to get vaccinated. And though infants and small children are not yet able to get the immunizations themselves, nursing babies may be able to receive some protection from antibodies in breast milk.

The consequences of remaining unvaccinated can be dire. At least 200 pregnant people have died of COVID-19, including 22 in August alone; nearly 23,000 have been hospitalized. Newborns are suffering too. The American Academy of Pediatrics has reported links between infection during pregnancy and preterm birth, and according to the CDC, babies born to patients with COVID-19 are at increased risk of admission to the neonatal intensive-care unit.

[Read: America is getting unvaccinated people all wrong]

So why aren’t more expectant mothers getting shots that could be lifesaving for both them and their future children? Many assume that all unvaccinated people are conspiracy-minded anti-vaxxers, but as my colleague Ed Yong has written, the reasons for not getting COVID-19 shots are more complicated than that. Pregnancy adds another layer of complexity.
The vaccine-skeptical women I spoke with told me that they believe the pandemic is real and that they are pro-science, but they were also overwhelmingly concerned about their own and their baby’s safety because of what they saw as a dearth of research on long-term outcomes. Given the high stakes of protecting their unborn child, and amid an often confusing information landscape, many opted for what felt safe, rather than what was safe.

The doctors I interviewed also found that a perceived shortage of data is what concerned most of their unvaccinated pregnant patients. Some had been spooked by anecdotes they’d heard about family members and friends with kids on the way reporting negative vaccine side effects. Others were simply worried about putting something unfamiliar in their body. All were trying to make the best choice for themselves and their child. Jennifer Thompson, a maternal-fetal-medicine specialist at Vanderbilt University Medical Center, said that some of those who declined to get a COVID-19 shot asked her about other ways they could protect themselves from the virus. Regan Theiler, an obstetrician at the Mayo Clinic, in Minnesota, told me that her patients “routinely get their flu shots. These are women who are health-care workers vaccinated against hepatitis B. They get their Tdap boosters in pregnancy to protect their baby.” Indeed, pregnant people have been more likely to take more familiar vaccines: 61 percent got a flu shot during the 2019–20 flu season, for example.

[Read: The difference between feeling safe and being safe]

Convincing a historically marginalized population about the value of a new treatment is challenging in the best of times. Medical researchers have generally understudied how a lot of drugs affect pregnancy, and doctors too often dismiss pregnant patients’ worries—especially Black people’s.
In the case of COVID-19 vaccines, early mixed messaging about the shots’ safety during pregnancy created a lasting anxiety that some health-care providers fueled. (In Mississippi, some pregnant people were reportedly wrongly turned away from clinics.) In the absence of clear guidance, misinformation masquerading under the guise of “wellness” and coordinated anti-vax campaigns targeting expectant mothers took root. Even for those who didn’t subscribe to any conspiracy theories, the confusion may have felt overwhelming.

Kirsy Vasquez, a pregnant woman from outside Boston, is vaccinated only because her workplace mandated it. She told me she would have willingly gotten the shot after giving birth, but doing so while she was expecting terrified her. She hasn’t experienced any major side effects—just fatigue and a sore arm—but that hasn’t quelled her fear. “I might be okay right now, but nobody knows,” she said, sharing that she’s most worried about what this decision will mean for her baby. Indeed, vaccination is typically thought of as a medical decision, but during pregnancy it’s also a parenting one. For many, it’s the first big decision they’re making on behalf of a new child, Lynn O’Brien Hallstein, a professor studying motherhood at Boston University, told me. Vaccination then becomes a test of whether you’re a good parent; both those in favor and those against have loud, strong opinions, and the stakes of failure feel monumental.

Of course, a COVID-19 vaccine is recommended for anyone eligible, and the clear medical consensus is that getting vaccinated is in the best interest of pregnant women and their babies. The reasons that refraining still feels safer to so many are likely informed by historical theories about pregnancy that emphasized the danger a mother’s actions—and even thoughts—posed to an unborn baby.
Quill Kukla, a philosophy professor at Georgetown and the author of Mass Hysteria: Medicine, Culture, and Mothers’ Bodies, points to the early-modern theory of the maternal imagination, which posited that “if pregnant women so much as had a feeling or saw something that was disturbing, that would translate itself directly onto the body of the fetus.” The theory was often used for blatantly racist purposes—such as suggesting that a white woman lusting after a Black man might change the race of her baby—but its effects are still felt today.

Culturally, pregnant bodies are seen as fragile entities that must be kept pure from pollution. And while, practically, women do have to take into account that what they do affects the health of their fetus, assessing and weighing the risks can be confusing. They often have to sort through long lists of what to avoid in pregnancy, such as sushi and certain sleeping positions; while many of these things carry some risk, some advice books might lead new parents to believe the danger is greater than it is. “You can’t even take an ibuprofen when you’re pregnant, so it’s definitely scary to think about taking a brand-new vaccine,” Kristina, who works in finance in Dallas, told me. (The FDA recommends avoiding the drug if you’re pregnant, especially after 20 weeks.) It’s no wonder that expectant mothers have favored inaction on the vaccine, Kukla told me. “The whole history of pregnancy advice has been organized around pregnant women somehow keeping out outside influences.”

While some anti-vaxxers are actively spreading fear and misinformation, a lot of unvaccinated pregnant people are safety-minded, but have been understandably influenced by this overarching cultural attitude around pregnancy. “I do hear from lots of patients that ‘I just don’t like putting things in my body when I’m pregnant.
I want this to be as natural as possible, and maybe I’ll consider it after delivery,’” Thompson, the maternal-fetal-medicine doctor, told me.

Kristina, who asked to be referred to by first name only, given that she was discussing private medical information, decided to wait out her pregnancy. She said her doctor waffled when giving her advice, emphasizing that there were no safety guarantees and ultimately telling her the choice was hers. “Honestly, it’s probably not what a pregnant woman needs to hear, because obviously, they’re not gonna go forward if you tell them that,” she told me. Instead of getting a shot, she took strict precautions, because she knew the risks of COVID-19. She was working from home, getting groceries delivered, and rarely leaving her house. To her, this scenario was “the best of both worlds”—she could avoid both infection and the stress of getting a vaccine that felt scary to her. Still, she emphasized that she’s not an anti-vaxxer. She encouraged her relatives to get vaccinated, and eagerly got her shots too—after she gave birth over the summer.

[Read: America is now in the hands of the vaccine-hesitant]

Although Kristina made it through her pregnancy without getting COVID-19, this bias toward inaction can have wide-ranging harms, pandemic or no pandemic. When nonintervention is a default, patients with conditions that actually require treatment can be endangered. There’s also a lack of research on the effects of many medicines during pregnancy. Though pregnant people were excluded from the initial COVID-19 vaccine trials, researchers did study the shots’ effects on them later on. However, according to one report, almost 75 percent of drugs approved between 2000 and 2010 don’t have any data on how they influence pregnancy. Without adequate evidence, many feel stuck when making decisions about their health. “It’s not just my body now.
I’m thinking about my child’s body as well,” Jasmine Fortescue, an insurance representative from Florida who hasn’t gotten vaccinated, told me. “The information helps. But information can also be so overwhelming that sometimes you just have to stop and think about what you want to do and what you really believe.” As O’Brien Hallstein, the Boston University professor, said, while the power of choice can be empowering, it can also be a terrible burden, because it comes with being the target of blame if anything goes wrong.

An obvious exception to the preference for nonintervention is the racist history of researchers unethically testing treatments on pregnant women of color. The entire modern field of gynecology, for example, is indebted to experiments that a man named James Marion Sims performed on enslaved Black women without their consent and without anesthesia. In the 20th century, the influential obstetrician Fred Adair standardized a blueprint for prenatal care that was rooted in eugenics. As Dána-Ain Davis, a professor at the City University of New York studying Black maternal health and medical racism, told me, these “afterlives of slavery” permeate obstetric care to this day. “From that kind of relationship, you want to have trust that I’m supposed to take this vaccine?” Davis asked. “That is irrational. It is irrational on the part of the medical community and the public-health community, knowing what they know about how Black people’s bodies had been treated during reproduction and pregnancy. It is irrational for them to think that people are going to embrace all of a sudden the recommendations of somebody who they feel has not paid attention to them.”

To improve vaccination rates, the experts I spoke with all pointed to the importance of community-based care that both eases problems of access and builds trust. “When I have these conversations, I ...
acknowledge that you’re not crazy for feeling like this,” Ndidiamaka Amutah-Onukagha, a public-health professor at Tufts University School of Medicine and the director of the MOTHER Lab, which studies maternal-health disparities, told me. “Then you back it up with science.”

Erika R. Cheng, a researcher at the Indiana University School of Medicine who has studied communication between patients and obstetricians, says that in the exam room, the gold standard for provider-patient communications is a process called shared decision making. In it, the two work together to reach an informed medical decision based on both the available evidence and the patient’s own values. But that process works only when the provider takes the patient’s concerns seriously, devotes adequate time to the conversation, and encourages the patient’s autonomy—conditions that many medical conversations don’t meet. Her research has shown that many pregnant people don’t feel comfortable raising questions with their doctors at all.

If patients are uncomfortable with their doctors during normal times, it’s no wonder things go awry when information changes as quickly as it has during the pandemic. Confusion among patients was a natural response to uncertainty in earlier months among providers, who were still waiting on research. Many I spoke with were hopeful that more available data on the vaccines might convince the skeptical and get more shots in pregnant patients’ arms. Jennifer Thompson has started to notice this play out already. She says some women who were initially mistrustful have shared with her that they eventually chose to get vaccinated not in spite of their pregnancy, but because of it—making a choice that both felt and was safe.