The Atlantic
America’s Hands Are Full of Blood
Thoughts and prayers. It began as a cliché. It became a joke. It has putrefied into a national shame.

If tonight, Americans do turn heavenward in pain and grief for the lost children of Uvalde, Texas, they may hear the answer delivered in the Bible through the words of Isaiah: “And when ye spread forth your hands, I will hide mine eyes from you: yea, when ye make many prayers, I will not hear: your hands are full of blood.”

We will learn more about the 18-year-old killer of elementary-school children: his personality, his ideology, whatever confection of hate and cruelty drove him to his horrible crime. But we already know the answer to one question: Who put the weapon of mass murder into his hand? The answer to that question is that the public policy of this country armed him.

Every other democracy makes some considerable effort to keep guns away from dangerous people, and dangerous people away from guns. For many years—and especially since the massacre at Connecticut’s Sandy Hook elementary school almost a decade ago—the United States has put more and more guns into more and more hands: 120 guns per 100 people in this country. The years of the pandemic were the years of the greatest gun sales in U.S. history: almost 20 million guns sold in 2020; another 18.5 million sold in 2021. No surprise, those two years also witnessed a surge in gun violence: the spectacular human butchery of our recurring mass slaughters; the surge of one-on-one lethal criminality; the unceasing tragic toll of carelessness as American gun owners hurt and kill their loved ones and themselves.

Most of us are appalled. But not enough of us are appalled enough to cast our votes to halt it. And those to whom Americans entrust political power, at the state and federal level, seem determined to make things worse and bloodier. In the next few weeks, the U.S. Supreme Court will deliver its opinion in the case of New York State Rifle & Pistol Association Inc. v. Bruen, a decision that could strike down concealed-carry restrictions even in the few states that still have them. More guns, more places, fewer checks, fewer protections: Since Sandy Hook, this country has plunged backward and downward toward barbarism.

In his memoir of his career in the gun trade, the former gun executive Ryan Busse writes of the effect of mass shootings on gun sales. They are, to put it bluntly, good for business. People think that perhaps the authorities might do something, and race to the gun stores to buy weapons before the “something” happens. The gun in the gunman’s hand multiplies to more guns in more hands. Most of those hands do not mean to inflict harm. But the harm follows, even so.

In this magazine five years ago, I wrote a parable: A village has been built in the deepest gully of a floodplain. At regular intervals, flash floods wipe away houses, killing all inside. Less dramatic—but more lethal—is the steady toll as individual villagers slip and drown in the marshes around them. After especially deadly events, the villagers solemnly discuss what they might do to protect themselves. Perhaps they might raise their homes on stilts? But a powerful faction among the villagers is always at hand to explain why these ideas won’t work. “No law can keep our village safe! The answer is that our people must learn to be better swimmers—and oh by the way, you said ‘stilts’ when the proper term is ‘piles,’ so why should anybody listen to you?” So the argument rages, without result, year after year, decade after decade, fatalities mounting all the while. Nearby villages, built in the hills, marvel that the gully-dwellers persist in their seemingly reckless way of life. But the gully-dwellers counter that they are following the wishes of their Founders, whose decisions two centuries ago must always be upheld by their descendants. Since then, of course, things have only gotten worse. Can it be different this time?
Whether any particular killer proves to be a racist, a jihadist, a sexually frustrated incel, or a randomly malignant carrier of sorrow and grief, can Americans ever break the pattern of empty thoughts, meaningless prayers, and more and worse bloodshed to follow?

The lobby groups and politicians who enable these killers will dominate the federal courts and state governments, as they do today, until the mighty forces of decency and kindness in American life say to the enablers: “That’s enough! This must stop—and we will stop you.”
theatlantic.com
The Unique Challenge of Raising Teenagers Right Now
Sign up for Molly’s newsletter, Wait, What?, here.

The teenagers are not alright, but then again, neither are the adults. Pandemic life has been profoundly jarring, and every generation has felt it. I hear about people fighting on airplanes and an increase in violent crimes, then I attend my Alcoholics Anonymous meetings on Zoom and try to figure out why going back to “normal” is so hard. My 80-year-old mother never got COVID-19, but more than two years of sitting at home seems to have hastened her descent into dementia. Meanwhile, many young children are struggling to keep up with their education or even learn how to socialize.

Now imagine what this moment must be like for teenagers. In December, U.S. Surgeon General Vivek Murthy issued a warning: Pandemic-related death, fear, loneliness, and economic uncertainty have worsened “the unprecedented stresses young people already faced.” It makes sense. Between unfamiliar hormones and trying to figure out who you are in the world, being a teen has always been incredibly hard. Pandemic teenagehood is even worse. Recently, I was reading a story in The New Yorker about child suicide when I learned that a friend of a friend’s teenager had died by suicide. I felt sick.

[Derek Thompson: Why American teens are so sad]

Living through a pandemic that has claimed more than a million American lives is not the only thing that’s making young people miserable. They’re also very much not in denial about the likely coming climate disaster. In a survey of 10,000 people ages 16 to 25, more than 45 percent said that their feelings about climate change had “negatively affected their daily life and functioning.” Caroline Hickman, a lecturer at the University of Bath, in the United Kingdom, and the lead author of the study, told the BBC that “the young feel abandoned and betrayed by governments” for their inaction on climate change.
It must be hard for young adults to feel that grown-ups care about them when their lawmakers refuse to meaningfully address arguably the largest challenge facing the next generation.

How are you supposed to guide, to reassure, to parent teenagers in this situation? I have three children ranging in age from 14 to 18. Soon after coming home from covering the Conservative Political Action Conference in late February 2020, I got an email from the organizers saying that I had been exposed to a new coronavirus. I wrote an email to the nurse at my older son’s school. Almost immediately, the phone rang; it was her. “We obviously can’t tell you what to do, but it would be a huge help if you’d keep your son home for a few days just until …”

It quickly became clear that neither of us had any idea how that sentence should end. I went into my eldest son’s bedroom soon after to explain to him why he had to stay home from school. Then I said what I always say when telling my kids something kind of scary: “I’m sure this is not a big deal. I’m sure everything is going to be fine.” I suspected that I was lying, but I thought I was practicing a sort of normal parenting deception—the kind you do when you just need a kid to go to sleep or do their homework. Sure, you’ll use algebra in real life. Yes, skipping gym class is bad.

Then the unimaginable happened. Today, I can’t tell my teenagers that everything will be all right with a straight face. I don’t have answers for my kids, or for yours. As parents, we are tempted to pretend—to be brave for our kids—but I’m not sure that serves anyone anymore. I’m just trying to parent harder right now, whatever that means. Mostly, I’m attempting to be the annoying parent who’s always around. I come home from dinner early. I try not to take long trips.
In lieu of actually knowing what the future holds, I just aim to be ready to react when my kids decide they need me.

[Read: We’re all second-guessing ourselves now]

My grandfather, the vaudeville drummer turned importer Seymour Mann, never got over surviving the 1918 flu pandemic. He couldn’t figure out why so many of his peers had died and he hadn’t. Seymour walked to work every single day of his life until he was in his 90s. He thought that not taking public transportation had saved his life. It probably hadn’t. Yet, barring a better answer, that was the conclusion he could live with, and it protected him from the truth he couldn’t face—that perhaps he survived out of sheer dumb luck. I don’t want to lie to myself or to my family ​about the messiness of our current reality. But part of me wishes that I had just a fraction of Seymour’s certainty so that I could give it to my children.
How Joe Biden’s Asia Trip Shows Chinese Failure
If President Joe Biden’s trip to Asia—marked as it was by his comments on the defense of Taiwan, announcements on a proposed new regional trade pact, and meetings with leaders who exhibit similar levels of concern about a rising China—has shown the persistence of American global power, it has also revealed something of equal importance: Beijing’s failure to translate economic might into political dominance, even in its own backyard.

Biden today concluded a summit of the leaders of the Quad—a security partnership including Australia, India, Japan, and the United States—who issued a joint statement chock-a-block with references to promoting democracy, a rules-based global order, and peaceful resolution of disputes. That came a day after Biden announced the formation of the Indo-Pacific Economic Framework, a partnership with 13 countries as diverse as South Korea, Vietnam, and New Zealand. Notably absent from all this was China. Biden’s trip exhibited Washington’s continued ability to rally other nations behind its standard, and in initiatives overtly targeted against the region’s supposedly rising superpower.

The script wasn’t meant to read this way. As China grew in economic importance, its smaller neighbors would, the thinking went, inevitably and inexorably be drawn into its orbit, while U.S. power would correspondingly fade, ushered along by its own political divisions and percolating isolationism. Events of the past decade seemed to prove the assumption: As China acted more assertively in the region, Washington’s efforts to cling to primacy appeared to falter. President Barack Obama’s “pivot” to Asia concluded with a thud when Donald Trump pulled Washington out of the Trans-Pacific Partnership economic pact. (The other 11 members inked the deal anyway.) And Trump, beyond his bellicose and inchoate trade war against China, largely ignored the region, save for a couple of fancy dinners with Kim Jong Un.

Meanwhile, Beijing appeared to fill the void.
China is at the center of another Asia-wide trade pact, called the Regional Comprehensive Economic Partnership, which came into effect in January, while its infrastructure-building Belt and Road Initiative has funded railways, ports, and power plants from Pakistan to Laos. Beijing has also muscled aside its rivals in the South China Sea, steadily turning its contested claim to nearly the entire waterway into a fait accompli, and consolidated its hold over disputed territory also claimed by India. As the U.S. withdrew from Afghanistan last year, Beijing, having fostered sound relations with the Taliban, seemed poised to become the country’s new patron. Amid the COVID-19 pandemic, China also tried to win early plaudits through “vaccine diplomacy,” eagerly shipping its homemade jabs to neighbors. China was winning.

But, as China seeks to expand its power, it seems to become more isolated. Biden’s new economic framework has attracted countries across ideological lines (from Communist Vietnam to democratic Australia) and some nations that try to carefully balance the two powers, such as Singapore. Beijing hasn’t weakened American bonds to its chief allies in the region—Japan, South Korea, and Australia. If anything, Washington appears to be drawing more countries to its side of the table, such as India.

All of this exposes the abject failure of Chinese foreign policy. Despite their constant pledges of “peaceful development,” China’s leaders have scared many of their country’s neighbors. New Delhi, historically no fan of Washington, has felt threatened by Chinese hostility over disputed borders. Beijing’s intensifying intimidation of Taiwan—with Chinese jets buzzing dangerously close to the island—has alarmed the region. Politicians in Canberra and Seoul have certainly not forgotten the economic coercion Beijing employed against them to compel changes in their policy. China’s bullying in the South China Sea has irked those with competing maritime claims.
The Philippines, a longtime U.S. friend, has been trying in recent years to strengthen ties to China but, frustrated by Chinese shipping crowding into waters claimed by Manila as an exclusive economic zone, the Philippine foreign minister last year tweeted a very undiplomatic “GET THE FUCK OUT!”

Of course, to maintain its influence, Washington will have to follow through on its new initiatives. In that, Biden is already constrained by politics back home. The new economic framework is not a trade pact aimed at reducing tariffs, a sop to grumbling in the U.S. that trade with Asia costs American jobs. The deal’s focus on environmental and labor standards alone, critics contend, will water down its value and appeal. The U.S. Chamber of Commerce found it “disappointing.”

But that misses the geopolitical point. The goal of the framework is to ensure Washington is “writing the rules” on crucial economic issues such as digital business and climate change, a way to solidify American influence against Beijing’s efforts to refashion the norms of the global system in its own favor. The agreement will also focus on securing supply chains—bad news for China, which has been alienating foreign business with erratic “zero-COVID” shutdowns, support for Moscow in its invasion of Ukraine, and human-rights abuses. The framework just so happens to bring together most of China’s chief Asian competitors as bases for production (India, Indonesia, and Vietnam) with the countries that invest in and operate those bases (Japan, South Korea, and the U.S.). That can potentially further incentivize international companies to relocate their supply chains out of China. And then there is the sheer symbolic value of Biden rocking into town and attracting leaders from across what is supposed to be China’s home turf into a new economic initiative.

For China, the message is clear: Get a new foreign policy.
Beijing seems to believe that its economic weight will eventually compel the rest of the region to flock to its flag. But there is little sign of that happening. South Korea exports more to China than to the U.S., but that didn’t stop its new president, Yoon Suk Yeol, from hosting Biden on his Asia tour before any summit with Xi Jinping. Nor is China even offering all that much on the economic front these days. Beijing’s longtime policy direction, “reform and opening up,” offered the hope of greater cooperation, and thus profits for foreign investors and other countries. Xi has replaced that with the more insular and nationalistic “self-sufficiency,” a campaign to replace imports with Chinese alternatives.

Beijing will have to woo the world with more than money. Chinese leaders are attempting to promote their own values and norms—of the authoritarian persuasion—on the global stage. That’s won China some support in forums such as the United Nations. But its immediate neighbors seem far more concerned about the threat created by Beijing’s expanding power and aggressive use of it than they are about American finickiness over human rights.

There is little indication, however, that Xi and his foreign-policy team have any intention of softening their stance on key regional issues. Biden’s success may, if anything, prod them to lash out further. A commentary in Xinhua, the official news agency, was quick to deride Biden’s economic framework as a “big scam” based on “sinister intentions” meant to “undermine regional stability,” before complaining about China being left out.

This shift—of China’s neighbors opting for tighter ties with America—may only accelerate if Beijing doesn’t change course. Its neighbors would much rather be on good terms with Beijing than bad, and most governments in the region will attempt to balance their relations with both great powers.
At the same time, the message to Xi should be loud and clear: As in Europe, where Vladimir Putin’s aggression is uniting the rest of the region against him, so too in Asia is an aggressive China entrenching, not weakening, American power.
Requiem for the Viral Internet Challenge
Ten years ago this month, the Harvard men’s baseball team put a video on YouTube in which they danced and lip-synched to Carly Rae Jepsen’s No. 1 hit, “Call Me Maybe.” It was funny because, well, you know: They were muscle-y boys with serious jawlines, and they were doing choreography that involved punching the ceiling of a van; this was back when a lot of people thought that pop songs were really stupid and for girls. So the video got really popular. Then other groups of people started to film themselves doing their own versions of the song: college students in Idaho; the Miami Dolphins cheerleaders; the U.S. Olympic swim team. Maybe you, too, were inclined to dance and lip-synch to Carly Rae Jepsen’s No. 1 hit, “Call Me Maybe,” with your friends and post it to the internet. This is how one of the first super-viral “challenges” on social media was born.

Planking, where people filmed or photographed themselves lying flat—like a plank—in unexpected places, had already peaked, as a challenge, in the previous year. But the “Call Me Maybe” challenge turned out to be a lot less dangerous, and—as a group activity—a lot more fun. The Pittsburgh Steelers made a “Call Me Maybe” video in 2012. A class of kindergartners made one. Selena Gomez and Justin Bieber made one—this is when they were in love. And I’m sure you already know who else made one … I did, at the end of a closing shift at a coffee shop in the mall food court. (This was an amazing, boring, mostly unsupervised job. We also did the eat-a-spoonful-of-ground-cinnamon challenge, which was popular at about the same time.) I recently dug up our “Call Me Maybe” video from the depths of Facebook and watched it and was shocked.

Although it is always uncomfortable to see a video of yourself from your teen-goth era, what really set me back on my heels was how alien the clip seemed. I texted the link to my former coffee-serving colleagues and co-stars in the video.
“Was there choreography involved or is this freestyle?” I asked them. “I couldn’t even watch it, I need to be in the safety of my own home first,” one of them replied. This video from 10 summers ago was not just embarrassing—it was from another world. Viral challenges like this one used to have the power to unite the internet, bringing together mall-food-court kids and professional athletes and politicians and 4-year-olds. Then suddenly, they disappeared.

The challenge once embodied all that social media was meant to be: a forum for exchange; a source of fellowship; a way “to make the world more open and connected.” Our favorite truism about the internet today—that it divides us into warring tribes and makes everything terrible—simply wasn’t true back then, or at least it didn’t seem to be. In the early 2010s—the golden age of challenges—anyone could get involved in an online trend, and that would only make the whole thing better. I can’t even think of a person, circa 2012, whose decision to make a “Call Me Maybe” video would have killed the fun. Phil Spector? Sandra Bullock’s ex-husband who cheated on her? We even loved it when U.S. troops stationed in Afghanistan lip-synched next to their mortar shells and machine guns. (“Whatever your position on U.S. foreign policy, these are worth watching—they’re amazing,” The Atlantic argued at the time.) We loved it when Donald Trump made a video too.

Today, you can imagine how this would all play out. A right-wing pundit would spin the challenge in some awful way to “own the libs,” and then the libs would do the challenge, too, so as to make it both heavy-handed and smug. Then some dreadful bureaucrat would post a video, setting off a flame war, and someone else with a porch surveillance camera would harass their Amazon delivery person into joining in. If the viral challenge served to bring us all together—if it stood for online comity and fun—then we should acknowledge that it’s never, ever coming back.
The past five years have dumped a bucket of ice-cold water on the premise. One needn’t blame politics alone for the death of this cultural phenomenon. The challenge could also be a victim of our new self-consciousness online, and our more developed fears of looking stupid. The male lead in the original “Call Me Maybe” music video was a shirtless hunk with the words The sky is the limit tattooed in script across his entire chest—solid evidence that embarrassment was not a powerful force in 2012, and that “cringe culture” on the internet was still brand-new. But if cringe killed viral challenges, then what went wrong in 2020? During the early months of the pandemic, we were all invited to post whatever we wanted to, cringe or not. Instead of producing a great new challenge, though, this gave us only short-lived TikTok trends (mostly dances that looked cool but were too hard to do yourself) and a bunch of celebrities using hashtags sponsored by the CDC or the National Health Service. During the shutdowns of that spring, The New York Times tried to convince me that “social media challenges” were “helping keep boredom at bay,” yet the examples it provided were the most boring things I’d ever heard of: turning pillowcases into dresses, bouncing Ping-Pong balls off of pots, juggling toilet paper, doing push-ups. (Doing push-ups???)

[Read: Why the past 10 years of American life have been uniquely stupid]

I understand that people still film themselves dancing and put it on the internet. (They even film themselves dancing to “Call Me Maybe,” but in an upsetting way.) I realize that “videos of people lip-synching” continue to be a viable entertainment product. But it’s not the same—it’s a hot or talented or famous person’s game now. New “challenges” do emerge on the internet every week, but they’re not the kind that bring people together.
A challenge is not really a challenge, I would say, until aunts and uncles have tried it and babies are aware of it and it is not ridiculous to suggest that your “team” at work give it a go. A real challenge has to be fun, it has to be easy, and it has to become unavoidable … and then people have to get sick of it, because such is life. What happened to that?

Those sorts of challenges used to pop up all the time. In early 2013, just a few months after “Call Me Maybe,” we had the Harlem Shake. Each video began with one person dancing somberly, alone, usually wearing a mask. Then the beat dropped and they were joined by a bunch more people who danced sort of frantically and strangely. This wasn’t a TikTok star’s sterile presentation of one viral dance move after another on The Tonight Show; it was odd teenagers thrashing around in the drab-looking spaces that are usually available to odd teenagers. In 2014, you could hardly avoid the Ice Bucket Challenge, which wasn’t interesting in the slightest but went exceedingly viral anyway because the videos raised money for a good cause and each one ended with a shivering person shouting out the names of friends or family members who were therefore “nominated” to take a turn dumping ice on their own head. Refusing to participate would indicate that you were heartless, or—worse—not game. And everyone would know it, because you were tagged on your “Facebook Wall.”

All of these fads spread on Facebook, which was more or less the official platform of the viral challenge. (In many instances the videos were posted first on YouTube, but they had to be shared to Facebook or no one would see them.) That made sense: Facebook was, at the time, a cross-generational platform—a place where I could share content with my mother and my grandmother too. “Check out the Harlem Shake video I filmed in A.D. White Library today,” some kid I barely knew from the college paper posted in February 2013.
“Kaitlyn Tiffany … you have 24 hours!!!” my cousin wrote above a video of a bucket of ice water being flung at her face in August 2014. I don’t think I ended up doing either one? (I’m heartless and not game.) But my college roommates did, and so did the girls from my high-school soccer team, and so did the One Direction member Niall Horan, as well as everyone in between.

The final challenge of this golden age arrived a few years later, and its timing was no accident. In early November 2016, as the presidential campaign moved into its final days, the nation came together for one last run at community rapport. When the Mannequin Challenge spread around the internet, entire high schools, including teachers, froze in place, mid-action, to the background music of the rap duo Rae Sremmurd’s “Black Beatles.” It made no sense, which was perfect. One school in Canada filmed a very long tableau vivant with roughly 1,500 people—the camera panned over teens and staff paused as they pretended to sword fight, to lick a statue’s abs, to prepare the day’s lunch in a surprisingly clean and professional-looking cafeteria kitchen. The women’s gymnastics team at Brigham Young University participated, as did students at West Point, and factory workers, and librarians. People did the Mannequin Challenge on airplanes, and on the International Space Station, and on Sesame Street. I hate to bring this up … Hillary Clinton’s campaign did the Mannequin Challenge. They posted it on Election Day. (“Don’t stand still. Vote today.”)

“Wack as Hell Mannequin Challenge Could Cost Hillary Clinton the Election,” GQ suggested in a headline, but the rest of the post was sanguine: “It is unquestionably annoying. But you know what? I don’t care. It doesn’t matter.
By this time tomorrow, if we’re lucky, Hillary Clinton will officially be the next President of the United States.” Whoops! When I watch that video today, of Hillary and Bill and Huma Abedin and (for some reason) Jon Bon Jovi pretending to be frozen in an airplane cabin, I feel queasy. First of all, Bill Clinton is too good at freezing; he looks dead. Second, it’s a little too spot-on: On November 8, 2016, it really did feel as though the physical laws of the universe had changed. Time didn’t stop that night, but it did stretch out, and in the morning everything was different; we saw divides we hadn’t seen before, and no obvious way to bridge them. A lot of people didn’t even want to bridge them. Yet, for a little while longer, somehow, the Mannequin Challenge survived.

[Read: I made the world’s blandest Facebook profile, just to see what happens]

Kathryn Winn, the author of Memeforum on Substack, wrote about the Mannequin Challenge last year: “It required no special equipment, or learning anything, or editing. Tell grandma to stay still and record her. The whole family can enjoy it and it’s more fun than trying to do a family photo.” It was “a Thanksgiving meme,” she said. I agree that seemingly absolutely everyone asked their families to do the Mannequin Challenge that Thanksgiving. Or maybe I feel that way because my family did it. This is confusing, because after Trump was elected, a lot of people seemed afraid of talking with their own families—if your relatives loved Trump, what could you really talk with them about? I had wanted to skip Thanksgiving altogether that year, for just that reason. Yet we all did the Mannequin Challenge?

Winn described a “moment of silence” on the internet at the end of 2016, during which nobody was allowed to joke. The Mannequin Challenge was the lone exception: “Everyone was still allowed to post the mannequin challenge.
It was a reminder that life goes on.” Those videos would be the last exhalation of challenge culture: From then on, social media would not be understood as a place to come together but as a place to come apart. Also as a place to be serious, even while joking, to the point that everything became a bore. In those same few weeks of November 2016, media outlets covered a no-fun and not-real trend called the Trump’s Coming Challenge, in which someone yelled “Hey, Trump is coming!” and then recorded a bunch of people screaming and running away. (“The Trump’s Coming Challenge Is Why the Future’s Gonna Be Alright,” a writer for GQ … begged?)

In the early to mid-2010s, when viral challenges had their run, most people were still using a social-media platform that was explicitly designed to connect them to people they knew in real life—from work, from school, from hanging around town. I’m not trying to express some great nostalgia for the Facebook of this time—there was concern about political rancor on the platform then, too, and it was well on its way to becoming a fundamentally miserable website—but people did use it like a town square or a family-meeting place. In 2017, Facebook started bleeding younger users in a major way to Instagram. The year after: the wrecking ball of TikTok. The site is a wasteland now, known for corrupting the minds of Boomers.

Older people are stuck on Facebook, a website with more garbage content than ever, and lacking any grandkids’ prom-photo albums to click through. Meanwhile the Millennials and middle-aged are straddling the line between Instagram and Twitter. Viral challenges used to bubble up from college kids and teenagers before they crossed the generation gap; now the kids are all on TikTok, and the “challenges” they create (whether there or elsewhere) are either too insider-y and confusing to spread more widely, or else they’re kept behind the glass of moral panic.
The Tide Pod Challenge of 2018, for which young people were said to be consuming laundry detergent, didn’t turn out to be real; neither was the Momo Challenge from 2019, which allegedly invited self-harm. Parents’ eternal fear of youth culture has been exacerbated in the TikTok age—sometimes intentionally, as when Facebook paid a Republican consulting firm to plant “challenge” panic in local newspapers. Other challenges that make the news today are creepy and not cool, and seem dangerous to grown-ups. Clearly, Grandma is not going to participate in a trend she finds terrifying.

Looking back on the era of transcendent challenges, I’m talking about a time when I myself was young, which is what filming yourself dancing in socks in a mall is all about. But those challenges were also about being old, or being interesting, or being regular. They were about being anybody! With the Mannequin Challenge, we all froze, but time didn’t stop. Now we’re on the other side: Anybody can hold a pose, or pour water on their head, or do a silly dance with friends, but everybody will never do those things again.
What Philip Guston’s Cartoonish Paintings of Klansmen Urge You to See
Anybody who has seen one of Philip Guston’s representational paintings knows the rest of them. I mean that in a strictly literal sense: The visual universe that Guston began creating in the late 1960s, when he rejected the abstraction that was then dominating the New York art world, is impossible not to recognize. Guston painted in thick, fleshy pinks, commonly outlining his figures in red or black instead of filling them in. His commitment to this palette was such that, according to his daughter, Musa Mayer, in her memoir Night Studio, when Guston died in 1980, she and her mother inherited “hundreds of tubes of cadmium red medium, mars black, titanium white.”

Many of his pink canvases are self-portraits in which he appears as a giant, worried head, all forehead wrinkles and wide eyes; many show household objects—cherries, cigarettes, bottles, light bulbs—swollen to menacing proportions; and many are full of puffy, cartoonish Ku Klux Klansmen, typically doing activities that, per Mayer, came from the routines of their creator’s life: staring down a bottle of alcohol, smoking a cigarette, or, as in the 1969 work The Studio, painting a self-portrait while wearing a hooded white robe.

Guston’s 1969 painting The Studio (Artepics / Alamy)

Guston’s images of Klansmen, which he called “hoods,” are striking in their ability to allow both painter and audience to consider the proximity of evil. But in September 2020, months after the Black Lives Matter protests that followed the killing of George Floyd, four major museums postponed a retrospective of Guston’s work, citing a wish to “reframe our programming.” (The exhibit is now open at the Museum of Fine Arts, Boston, and will travel to three other venues over the next two years.) Plainly, their concern lay with Guston’s Klansmen being misinterpreted or not seen in sufficient context.
Many in the art world rebuked the museums for shying away from Guston’s willingness to look racist violence—and his own complicity with it—in the face. Guston put himself in the shoes of the Klansmen he painted to better understand the humans behind the hoods. (He once noted, “What do [Klansmen] do afterwards? Or before? Smoke, drink, sit around their rooms … patrol empty streets; dumb, melancholy, guilty, fearful, remorseful, reassuring one another?”) His paintings, which put bumbling hoods in anodyne settings, render their subjects disturbingly familiar.

In the newly rereleased book Guston in Time, the novelist and critic Ross Feld praised Guston’s capacity and willingness to imbue even “the most upsetting or disquieting imagery” with “a shaggy, even goofy friendliness.” He wasn’t wrong about the friendliness: The hoods look like Hershey’s kisses crossed with Moomins. Yet painting the Klansmen approachably doesn’t defang them. By depicting them so crudely that it can take a moment to identify them, Guston arguably tricked his viewers into lingering—and then urged them not to look away.

Neither Feld nor Mayer engages with this idea at length in their remembrances, but to me, it seems crucial that Guston tackled his hood paintings—which he debuted in a now-famous 1970 show at the Marlborough Gallery—from the vantage point of the white American Jew. Guston’s parents, Leib (later Louis) Goldstein and Rachel Ehrenleib, were Jews who fled anti-Semitism in Odesa, Ukraine. When Guston was a child, the family moved from Montreal to Los Angeles, where the Klan was both visible and powerful. As a young factory worker, Guston participated in a strike that Klansmen helped break.
Some of his earliest works were straightforwardly brutal illustrations of the KKK, he recalled in a 1978 lecture; when he exhibited them in the early 1930s, “some members of the Klan walked in, took the paintings off the wall and slashed them.” Guston knew firsthand the effect his art could have; he also knew fear of anti-Semitism, and of the Klan, firsthand.

[Read: The forgotten history of the Western Klan]

Yet by the time he resumed painting Klansmen in the years leading up to his Marlborough show, Guston’s racial position in the United States had shifted significantly. In the 1930s, when he set out to “illustrate, or do pictures of the KKK,” as he said in that lecture, Jews of European descent were rarely considered white. By the 1960s, they more widely were. After the Holocaust, blatant anti-Semitism seemed déclassé. Informal Jewish quotas seemed to vanish from college admissions, and intermarriage became more common.

Interestingly, in a moment that seemed to lend itself to assimilation, Guston turned in the opposite direction. He appeared, at that time, unconcerned about fitting in with any mainstream: He said that he began painting hoods in part as a horrified reaction to police violence at the 1968 Democratic National Convention, which he described as a “trigger” that “pushed me over.” And he began admitting later in life to his shame at having Anglicized his last name from Goldstein in his early 20s, especially given that, per Feld, he still peppered his speech with Yiddish and presented himself as a “doubt-ridden cerebral Jew painter.” He compared his choice to paint Klansmen to the great Jewish writer Isaac Babel’s decision to depict Russia’s Cossacks; Mayer writes in Night Studio that he left abstraction because he felt, as he put it, an urge to “create ‘golems,’” a reference to the animate clay giants of Jewish legend.
These reference points reflect the fact that, like his good friend Philip Roth, he was able to succeed as a Jew, not in spite of his Jewishness. Guston could have de-Judaized himself and vanished into whiteness. He made whiteness visible instead.

Philip Guston’s 1969 painting City Limits (Artepics / Alamy)

Although some critics, including the influential Harold Rosenberg, responded positively to Guston’s hood paintings, with their challenging subject matter and clumsy outlines, much of the art world despised and was perplexed by them. (Feld was one of the few viewers who reacted positively to the Marlborough show, and he remained an ardent fan.) In the years Guston spent working uneasily in the tradition of abstract expressionism (which Feld calls “one of the most deeply Protestant art-histories ever seen”), he painted in a variety of colors, often focusing on deep explorations of a single hue. When he began painting hoods, he picked up his signature pinks—which are, it seems to me, not just any pink. Guston worked in the streaky, pale-bellied tones of an unhealthy white man’s skin, which is to say of his own. He was a voracious eater, smoker, and drinker who frequently ignored doctors’ advice. In some photographs, his face is the color of his art. Spotting a Guston from across a room is, in a way, spotting the painter’s own variety of whiteness.

Pink evokes girliness, too. Guston’s late-career art overflows with cartoonish femininity, not only in his canvases’ color but in their domestic clutter and swollen curves. Occasionally, this tendency turns sexual—say, in the joyous “gluttony” of the fat red fruits in Cherries (1976), as Feld describes it—but more typically, Guston chooses drab household scenes. Many of his paintings bring viewers into what could be closets or storage rooms. In Flatlands (1970), two hoods stare across a dirty pinkish landscape littered with clocks, old shoes, a book, a basketball.
They could be standing in an attic, getting ready to sort through what Feld calls the “crap of life.”

[Read: The art movement that embraced the monstrous]

Feld celebrates Guston’s willingness to paint everyday objects, and to do so in what he implies is—and what I certainly see as—a stereotypically feminine way. Drawing from the theater scholar Jonas Barish, Feld argues that abstract expressionism asked artists to suppress their emotions in favor of creating “Works that testified to inner Faith.” Guston, though, embraced his “histrionic impulse,” imbuing household detritus with dignity and menace, and transforming it into the “ever-present still life that surrounds the embarrassingly, even tragically human.” In Flatlands, Guston’s hoods seem, if not embarrassed, then overwhelmed. They look lost and powerless in their sea of domestic junk—and yet their white robes, even smeared with pale pinks and ashy grays, warn viewers that it would be foolish not to fear them nonetheless.

While visiting the Baltimore Museum of Art’s “Guarding the Art” exhibition in April, I spotted Guston’s 1974 painting The Oracle from a full room away. It’s pink, of course. Half the canvas is littered with shoes heaped in a way that evokes the piles of personal effects found at concentration camps. The big head that tends to stand in for Guston stares at them; a light bulb dangles, closet- or dungeon-style, over his head. Behind him are a pair of hoods, one raising a whip to strike him. My instant interpretation of the painting was that it portrays a habit I recognize from my Jewish family and community: obsessing over the dangers of the past, not turning to look at the present, which is full of threats to Jews and non-Jews alike.

But then I remembered that Guston said he saw himself behind the hoods.
(Asked at a talk if the hooded figure could be himself, he said it had occurred to him, adding, “Well, it could be all of us.”) Once I read both the big head and the whip-wielding Klansman as Guston avatars, the painting transformed for me. It remained a portrait of Jewish fear; it also evoked the guilt of an American Jew unable to prevent the horrors of the Holocaust; and it became an acknowledgment of the painter’s own proximity to power and whiteness. Guston’s hood paintings conjure the discomfort and dissonance of this reality. They put viewers in a cramped car or attic not only with evil, but also with unresolved complicity, confusion, and shame.

The novelist Dara Horn has argued that Jewish literature tends to reject neat endings. She considers this inclination a sign of a “realism that comes from humility, from the knowledge that one cannot be true to the human experience while pretending to make sense of the world.” Guston’s hood paintings, perhaps, belong to this baffled tradition. Guston didn’t know what to say about the Klan or about racial violence, except that he knew to fear it as a Jew, and to both oppose and feel implicated in it as a white American. Nearly 50 years after he painted The Oracle, I can’t honestly say I know more.
How Party-Switching Can Reduce Polarization
Pundits and voters of all stripes lament just how extreme, polarized, and ideological American politics has become. But such grievances rarely come with advice for how ordinary people can address this problem, other than by voting for their preferred political party’s candidates in general elections. Even that advice isn’t very helpful: Voters in many parts of the country do not have the chance to participate in close electoral contests.

Yet Democrats in Alabama and Republicans in New York, say, still have the power to secure better representation in Congress and strike a blow against political polarization. In places where electoral competition is lacking, primary elections by and large decide political outcomes. Voters in those places are accustomed to participating in their own party’s primaries. But often the opposite party’s primary is more competitive and more consequential. So why not strategically vote in the other party’s primary?

[Nick Troiano: Party primaries must go]

To be clear, we’re not talking about trying to throw the opposite party’s primary to the least electable candidate, which can easily backfire. Our proposal applies to races that either Democrats or Republicans know they will likely lose. In these circumstances, voters should try to influence how they will lose. Their ultimate goal should be to pull the other side toward their preferred ideological position.

Consider the recent Senate primary in Ohio. Democrats had an entirely uncompetitive race, with Representative Tim Ryan winning nearly 70 percent of the vote, handily defeating Morgan Harper, whose previous claim to fame was losing a Democratic congressional primary in the Columbus area in 2020. On the Republican side, however, polls showed a very close race with a crowded field, and early indications suggested that turnout would be relatively low.
Ryan faces a tough general-election race; no Democrat not named Sherrod Brown has won a federal statewide election in Ohio in nearly a decade.

The most moderate candidate in the Republican field, state Senator and Cleveland Guardians part-owner Matt Dolan, who surged in the polls late in the race, came in third, trailing the Hillbilly Elegy author J. D. Vance, whom former President Donald Trump endorsed, by nearly 95,000 votes. At first, this seems like quite a large number of votes, and indeed it would have represented almost 40 percent of Dolan’s eventual vote total. But if just 20 percent of the more than 500,000 Democrats who voted in their primary had cast ballots for Dolan, a more moderate voice likely would have prevailed on the right, and Democrats would have increased their odds of being represented by a senator who shared at least some of their values. Instead, 500,000 Ohio Democrats cast a vote—more than 350,000 of them for Tim Ryan—in an election that wasn’t competitive at all.

Strategic, limited party-switching has some precedent. About 8,000 nonpartisan and Democratic voters reregistered as Republicans in Nebraska’s recent primary, likely in an attempt to prevent the Trump-endorsed candidate from winning the GOP gubernatorial contest. Or consider Mississippi in 2014. Incumbent GOP Senator Thad Cochran found himself forced into a runoff against the much more conservative Chris McDaniel. Cochran survived in part because he persuaded enough Black Democratic voters to turn out and support him. Conservatives cried foul, but what was the sin? Cochran had a reputation (at least for a Republican) for reaching out to Black voters, and whoever emerged from that primary would almost surely win the general election. By participating in the election that was very likely to determine the next senator from Mississippi, these voters helped ensure that their senator was the “least bad” option. Our reaction?
More, please.

[Read: What the primaries reveal about the future of Trumpism]

This calculus could just as easily apply to Republican voters in blue states such as Maryland, Massachusetts, and New York, or in liberal cities or congressional districts. The Republican presidential primary was effectively uncontested in 2020. It would have made more sense for Republican voters to try to prevent a progressive candidate like Bernie Sanders or Elizabeth Warren from becoming the Democratic standard-bearer than to validate Trump’s inevitable primary triumph. In Washington, D.C., Republican voters regularly participate in the city’s Democratic primary because they know that a Republican candidate is extremely unlikely to win a general election; the primary is where the election is decided.

Of course, this strategy has drawbacks. First, it doesn’t work in every state, because not all states have open primaries or Election Day registration (which allows voters to reregister with the opposing party for the general election with ease). Regardless, the hassles of reregistering as a Republican or Democrat might be too great to expect many voters to do it.

Second, there is a risk that with moderate Democrats opting to vote in a Republican primary, the results of the Democratic primary might skew toward a more extreme nominee (and vice versa on the Republican side). But our advice is for voters to selectively intervene only when their own party’s primary is all but a foregone conclusion and when a member of the opposite party is favored to win in the general election. Voters might not know to do this on their own, especially when they like their own party as much as or more than they dislike the opposing party.
So it is incumbent on political elites, civic leaders, and local and state-based journalists to offer guidance when the conditions are right.

The final drawback is perhaps the most serious, so let us reiterate: We are not encouraging voters to participate in the other party’s primary elections in an attempt to nominate someone unelectable. That practice, known in elections circles as “ratfucking,” has resulted in some notable successes. The most prominent was in the 2012 Senate election in Missouri, where then–Democratic Senator Claire McCaskill spent $1.7 million in the Republican primary in the hopes that the GOP nomination would go to Representative Todd Akin. Akin won the primary, and his campaign imploded shortly afterward, thanks to his infamous remark about “legitimate rape.”

But ratfucking is a risky endeavor. What if Akin’s campaign had not imploded? Missouri is Republican enough that a win in the general election would not have been out of the question, leaving the state with a far more radical senator than the other two potential Republican nominees. In the Republican gubernatorial primary in Pennsylvania just last week, Democrats aired TV advertisements as part of an intentional strategy to boost the chance that Doug Mastriano, a state senator who had sought to overturn the 2020 election results, would end up the Republican nominee. He did—and he might well win the general election. At the presidential level, many progressive commentators suggested in the early days of the 2016 Republican primary that Trump was the best GOP candidate—in the sense of the “easiest for Hillary Clinton to beat.” Oops. Going back further, Herbert Hoover worked behind the scenes at the 1932 Democratic convention to steer the nomination to Franklin D. Roosevelt. Oops again.

Of course, some voters don’t want less polarization; recent research suggests that the penalty for extremism in elections has declined over time.
But most Republicans still prefer centrist Democrats to hard-core progressives, and most Democrats still prefer centrist Republicans to hard-core conservatives. If more voters saw that primary party-switching could work in their interest—under certain conditions—the result in the aggregate would be to help restore a more centrist leadership in Washington, and perhaps even make government functional again.
Hacks Rewrites the Road-Trip Comedy
Deborah Vance, the caustic stand-up played by Jean Smart in HBO Max’s Hacks, likes to live large. Take her new ride, for instance: It’s a gussied-up tour bus with her initials emblazoned in hot pink on the side, and it’s equipped with a soda machine and a regenerative light-therapy bed. It also has a “much better master bedroom than the last time I was on one of these things,” Deborah coos as she ambles inside. It’s a vehicle fit for a celebrity of her stature.

The bus is also, figuratively speaking, the story engine for the Emmy-winning comedy’s second season. The new episodes follow Deborah and Ava (played by Hannah Einbinder), the Millennial she’d hired as her writer, on a trek across America to perform shows and sharpen the confessional material in Deborah’s set. Road-trip comedies have been effective since Clark Gable and Claudette Colbert met-cute on a Greyhound in It Happened One Night. Lucy and Desi piled into The Long, Long Trailer for their first film together. Robin Williams drove an RV to the top of the box office. Buses and customized vans make for excellent pressure cookers, and barreling through towns of varying sizes can lead to colorful encounters, unexpected detours, and maybe even profound observations. Mayhem and enlightenment, in road-trip comedies, go hand in hand.

[Read: What Hacks proves about Jean Smart]

Season 2 of Hacks boasts its fair share of travel-related hijinks, whether Deborah’s making a pit stop to purchase antiques or Ava’s holding up the schedule because she needs to buy a bigger water bottle. Still, the series, which streams new episodes every Thursday, uses the road not only to create dramatic tension but also to reveal how performers think.
There’s little contrast between the Deborah of Las Vegas and the Deborah on the road—and making her take her act across the country only emphasizes how her home has always been the stage, no matter where that stage happens to be.

Deborah, unlike other travelers, isn’t on a journey toward self-reflection. Her Las Vegas residency ended with her bombing at her final show, so she’s seeking different opportunities to perform for people who can tell her what they think of her and, more important, what they want from her now. These gigs are challenging, even for the comic who’s considered the funniest woman in the business, and as the season progresses, Deborah becomes chameleonic. On a lesbian cruise, she does what she thinks the crowd will like, dancing—or, as Ava puts it, “doing Ellen”—and switching up her jokes. Upstaged at a state fair by the birth of twin calves in one of this week’s episodes, Deborah tries to make cow-related jokes, but loses her composure. This isn’t a road trip about understanding oneself; Deborah has always known who she is. What she doesn’t know is whether she still has an audience. She’s not eating, praying, and loving. She’s taking notes, calculating moves, and working until her set is complete.

[Read: The marriage plot for the age of workism]

And that work, Hacks makes clear, is her pleasure. An odyssey of this type might provide escape for most people, but not for Deborah. Instead, it underlines how no real boundary exists between her personal and professional spheres, because her art relies on sharing opinions and stories from her own life. Even for those who don’t make money by riffing on their own relationships before the public, Deborah’s attitude may resonate, especially given how popular “workcations” have become. She can never fully turn off the part of her that cares about her career, so much so that she’ll commit to making a pit stop at a farm in the sweltering heat, to tape a segment for her home-shopping show.
Afterward, the camera follows Deborah as she walks away, looking at ease, while in the background her personal assistant struggles to break down the set. Deborah sees her utter devotion to her job as normal, not strange.

Ava, meanwhile, is on a self-reflective journey, but her story line captures how being on the road doesn’t always lead to self-betterment; for her, that idea is nothing more than a myth. After confessing to Deborah about the nasty email she wrote while drunk and high—an account of how horrible Deborah is to work for, sent in the first-season finale to the writers of a new TV show about a bitchy prime minister called Bitch PM—Ava feels so guilty, she decides to correct her worst habits. She declares she’s no longer drinking, and that she’ll get herself a flip phone to prevent further digital mishaps.

Neither attempt manages to turn Ava into a flawless person, of course: Aboard the lesbian cruise, she’s easily swayed into grabbing a cocktail, and her decision not to use a smartphone results in her nearly losing her dad’s ashes. Besides, even if Ava thinks she’s being a more well-rounded person, her efforts go unrewarded. Deborah remains her overbearing boss, and she’s even suing Ava for breaking her NDA with the email. The two of them might as well be back in Deborah’s mansion for all the headway Ava has made in gaining Deborah’s trust.

Yet the lack of epiphanies, life lessons, and intergenerational bonding is what makes the second season work so well. The show isn’t following the trajectory of other comedies that have forced their characters to hit the road. Hacks takes full advantage of getting to provide a new playground for its ensemble, but the structure allows, more than anything, the show to highlight the ways in which performers like Deborah and Ava don’t ever get to clock out.
They grow closer only when they’re writing jokes together—and even then, that closeness is by necessity, built in service of Deborah’s new routine, which both of them need to succeed. Every stop offers just another spotlight, every detour more material to turn into a bit. Perhaps that’s a bleak conclusion for a comedy. But then again, neither woman would have it any other way.
The People Who Hate People
Some propositions are so obvious that no one takes the time to defend them. A few such propositions are that human life is good, that people can and often do provide more benefits to the world than they take away, and that we should design society to support people in leading lives that are good for themselves and others.

These ideas came under attack, sometimes subtly and sometimes overtly, from environmentalists in the 20th century who were worried about overpopulation. Although major organizations have abandoned population management as an explicit policy goal, the underlying fear that too many people are running up against the limits of too few resources (Well, shouldn’t someone do something about that?) has never fully been rooted out of American political thought. It is alive and well among NIMBYs. Of all the objections people raise to new housing and infrastructure, perhaps the most risible is that their community is already too crowded. Some even suggest that municipalities should limit housing supply explicitly to combat population growth.

At a recent Palo Alto city-council meeting, one resident argued against pro-housing policies, saying, “Does it make sense to be planning for more people? … More people on the planet spells more consequential implications for climate change, loss of biodiversity, stress, war, famine, etc. At a time when humans are in major ecological overshoot, doesn’t it make more sense to plan for a reduced population, plan for reducing population, not increasing it?”

[Jerusalem Demsas: Community input is bad, actually]

Invariably, the problem is other people.
The man behind the organization that sued UC Berkeley to reduce its enrollment, Phil Bokovoy, told The New Yorker that he opposed building more housing in Berkeley in part because “I don’t think we’ll be able to tackle climate change unless we tackle population growth and rising living standards over a huge part of the world.”

Lest you worry that this is a California-specific brain disease, let me reassure you that this antihuman thinking has permeated discourses all over the nation—and the world.

But population growth is not the problem that so many people seem to think it is, not least because of the global decline in fertility; arguably, declining population growth is the real population-related concern of the century. And even if it were a concern, the policies that NIMBYs support not only fail to create a climate-conscious built environment but actually make fighting climate change more difficult.

Paul Ehrlich’s 1968 book, The Population Bomb, catalyzed overpopulation concerns among the American public. Ehrlich, a Stanford biologist, also served as the first president of the group Zero Population Growth (ZPG). As the historian Keith Woodhouse recounts in his book, The Ecocentrists, “The group’s goal was an end to population growth; the means, troublingly, were not yet specified. Within three years [of its founding in 1970], ZPG had thirty-two thousand members.”

The Population Bomb opens not with a depiction of overconsumption by high-income Westerners but with the author’s memory of a taxi ride with his wife and daughter “one stinking hot night in Delhi.” Ehrlich describes the “crowded slum area” and proceeds to detail, in prose dripping with disgust, the view from his cab window of people just living their lives: “People eating, people washing, people sleeping. People visiting, arguing, and screaming.
People thrusting their hands through the taxi window, begging.” This goes on and culminates in the almost-too-on-the-nose admission: “All three of us were, frankly, frightened.”

As Ehrlich’s family gawked at Delhiites, the U.S. was emitting 18.66 tons of carbon per capita to India’s 0.33, meaning that the average American was emitting 56 times more than the average Indian. If Ehrlich was genuinely concerned about overconsumption, why is the opening image of his book that of poor, brown people and not the suburban car-centric sprawl that characterizes his home country?

The book’s main argument is that an increasing population will run out of resources, and that if steps aren’t taken to reduce the population, scarcity will make the world poor. In particular, Ehrlich was concerned about the world running out of food, and he foretold that mass starvation events would mark the waning decades of the 20th century.

Ehrlich’s dire predictions have been wrong time and again. Hunger and undernourishment have declined since The Population Bomb was released. To emphasize how alarmist his predictions were, a brief aside: In 1969, the author predicted that “by the year 2000 the United Kingdom will simply be a small group of impoverished islands, inhabited by some 70 million hungry people, of little or no concern to the other 5-7 billion inhabitants of a sick world,” later adding, “I would take even money that England will not exist in the year 2000.”

As careful students of history know well, England still exists.

To say that the overpopulation alarmists were simply wrong is too generous. Twentieth-century critics on the left saw the movement for what it was: In the 1970s, New Left activists and Students for a Democratic Society both criticized ZPG and overpopulation mania.
The latter group, Woodhouse writes, “accused ZPG of reckless simplification, reasoning that by treating all people as a single flat category, population activists ignored not only human difference but also human value.” New Left activists specifically called out the underlying racism of ZPG’s project: “ZPG says that there are too many people, especially non-white people, in the world … that these people are terrifying and violent, and that their population growth must be stopped—by ‘coercion’ if necessary.”

[Read: The next century’s big demographic mystery]

One legacy of this intellectual tradition is the modern xenophobic anti-immigration movement. The Federation for American Immigration Reform (FAIR), founded by John Tanton, was instrumental in advocating for strict immigration controls. (To get a flavor for the callousness of this group, check their website, which tells visitors “How to Report Illegal Aliens.”)

Tanton was president of ZPG from 1975 to 1977 and, as the historian Sebastian Normandin and the philosopher Sean A. Valles wrote in a 2015 paper, FAIR “began as a 1979 offshoot of ZPG’s Immigration Committee, following ZPG approval of the proposal in 1978.” While the blend of “1960s ecology and neo-eugenics” seems “idiosyncratic or even fringe today … their influence remains.” The authors conclude that “today’s immigration restrictionist network was built and led by—and in some cases is still led by—a network of conservationists and population control activists.”

Tanton’s anti-immigration views were not just compatible with his concerns about overpopulation; they were born out of them: As Jason Riley wrote in 2019 for The Wall Street Journal, “Opposition to immigration, legal or illegal, was simply a means to that end.”

What should be obvious, but apparently is not, is that opposing immigration actually reduces the U.S.’s ability to cultivate and take advantage of brilliant people who could develop the technological advances to save the planet.
(Thirty-seven percent of American Nobel Prize winners in chemistry, medicine, and physics from 2000 to 2020 have been immigrants.)

A growing population means more people generating more ideas, but also more interactions among different people coming from different perspectives. These two effects can sound trivial but actually do lead to more and better ideas. The economist Hisakazu Kato argued in 2016 that “a large population will generate many ideas that could bring about rapid technological progress.”

Overpopulation concern-mongers not only underestimate the ability of people to help solve the problems of climate change; they also fail to accept that neither resources nor human needs are fixed. The idea that resources will “run out” implies that human ingenuity will remain stagnant. But it doesn’t. Norman E. Borlaug won the Nobel Peace Prize in 1970 (just two years after The Population Bomb was published) for helping Mexico become “self-sufficient in grain” by developing “a robust strain of wheat—dwarf wheat—that was adapted to Mexican conditions.” He then worked in India and Pakistan to introduce dwarf wheat to the countries’ agricultural landscape and became known as the father of the “Green Revolution.”

As Gregg Easterbrook noted some years ago in The Atlantic, Ehrlich had written in 1968 that it was a “fantasy” that India would “ever” feed itself. But “by 1974 India was self-sufficient in the production of all cereals.” Borlaug himself was concerned about population growth, but instead of pursuing an anti-humanist agenda, he turned to technological innovation to save countless lives.

The economist Julian Simon, a longtime critic of the overpopulation activists, bet Ehrlich that the price of five metals would fall from 1980 to 1990.
As The New York Times noted in Simon’s obituary, Ehrlich believed that “rising demand for raw materials by an exploding global populace would pare supplies of nonrenewable resources, driving up prices.” Simon won the bet.

If the overpopulation alarmists of the 1970s had really wanted, above all, to protect the environment, they should have promoted the development of dense, energy-efficient communities. The evidence is clear, and has been for some time, that density is good for the environment. As UC Berkeley researchers argued in a 2014 paper, “population-dense cities contribute less greenhouse-gas emissions per person than other areas of the country,” and “the average carbon footprint of households living in the center of large, population-dense urban cities is about 50 percent below average, while households in distant suburbs are up to twice the average.”

But the people worried about other people don’t have a pro-density history; quite the opposite. As the urban planner Greg Morrow detailed in his 2013 dissertation, overpopulation activists fought for the very legal frameworks that would keep cities low-density and worsen suburban sprawl. Morrow noted that in the early 1970s, the UCLA professor Fred Abraham, then the president of ZPG of Los Angeles, argued that “we need fewer people here—a quality of life, not a quantity of life. We must request a moratorium on growth and recognize that growth should be stopped.” Morrow added that “the Sierra Club (L. Douglas DeNike) agreed and suggested ‘limiting residential housing is one approach to lower birth rates’ and recommended ‘a freeze on zoning to limit new residential construction.’”

Half a century later, NIMBYs who cite overpopulation concerns when opposing new housing say they are afraid of overcrowding on their streets and in their parking lots. Some continue to invoke the global South as a dark warning for the future.
In an interview with Slate’s Henry Grabar, Bokovoy, the Berkeley anti-growth activist, warned that his city could end up like “Bangkok, Jakarta, Kuala Lumpur” if more students are allowed to attend the local UC campus. In Trussville, Alabama, the president of a local homeowners’ association played all the hits as he stated his opposition to the inclusion of multifamily housing in a new development in his area: “You’re bringing that many more people with that many more cars,” he complained. “We envision this side of town as spacious properties, higher home values … not having crowded streets.”

We have, of course, discovered an elusive technology that allows more people to live on less land: It’s called an apartment building. And if people would like fewer neighbors competing for parking spaces, they should rest assured that buses, trains, protected bike lanes, and maintained sidewalks are effective, cutting-edge inventions available to all.

Perhaps you’re pessimistic about technology’s ability to help solve the big environmental problems we’re facing. You may not trust that technology could ever be sufficient to reduce carbon emissions, or that our political systems could make that technology widespread. If that is the case, think for a second about what it would take to slash the population by several billion. (Ehrlich himself recently put the optimal number at 1.5 to 2 billion.) The only ways to do this are to kill people, limit the aid given to sick people, and/or stop new people from being born.

Some believe that the third approach could be adequate, achieved simply by providing people with contraceptives. But survey data show that women are actually having fewer children than they would like, so providing more family-planning services is not going to cut the population by four-fifths.

[Derek Thompson: Why U.S. population growth is collapsing]

NIMBYs and overpopulation skeptics share a sense that the world is too full, that their communities are for the people who already live there, and that new people—immigrants from abroad or transplants from the next state over—are simply burdens. In acting on that sense, they create the world they imagine: unacceptable rates of homelessness, a country lagging far behind its peers in building mass transit, and declining trust.

“I think once you get past this idea that NIMBYs are simply curmudgeons or busybodies then you start to realize why this attitude toward growth today – which in the context of the present housing crises and cost of living crises seems so ridiculous – actually has very firmly embedded ideological roots within…liberalism and thus why it’s so difficult to root out among people who consider themselves liberals,” the historian Jacob Anbinder explained to me.

Between a politics of scarcity that demands inhumane policy interventions and a politics of abundance, it’s not much of a choice, but it’s one that population skeptics have to make. Enough with the innuendo: If overpopulation is the hill you want to die on, then you’ve got to defend the implications.