Ageism never gets old

The prejudice is an ancient habit, but new forces—in Silicon Valley, Hollywood, and beyond—have restored its youthful vitality.

By Tad Friend

We tend to caricature the elderly as either raddled wretches or cuddly Yodas.

Illustration by Golden Cosmos

Early in his career, Paul Newman personified a young man in a hurry forced to wait his turn. His go-getter characters infiltrated the old-boy network, wore the gray flannel suit, and toiled away before finally, in midlife, grabbing the brass ring and coasting for home. In “The Young Philadelphians” (1959), for instance, Newman played Tony Lawrence, whose mother, over his cradle, gloats, “Someday, he’ll take the place in this city that belongs to him.” Young Philadelphians, it’s clear, are merely old Philadelphians in the making. While Tony is at Princeton, a silver-haired Philadelphia lawyer so venerable he has a British accent tells him, “I’m confident that in due time you’ll become a partner in Dickinson & Dawes.” As Tony shinnies up the greasy pole at an even more eminent firm, he grumbles when old man Clayton makes him work on Christmas and grouses that big clients are “reserved for the seniors” who wear homburgs and smoke pipes. Eventually, though, he makes partner and smokes a pipe of his own. Yay.

Times have changed. In “Disrupted: My Misadventure in the Startup Bubble” (Hachette), Dan Lyons, a fifty-one-year-old Newsweek reporter, gets his first shock when he’s laid off. “They can take your salary and hire five kids right out of college,” he’s told. His second shock occurs when he takes a lower-paying job at a startup called HubSpot, where his boss is a twentysomething named Zack who’s been there a month. Lyons arrives for work in the traditional uniform of a midlife achiever—“gray hair, unstylishly cut; horn-rimmed glasses; button-down shirt”—to find himself surrounded by brogrammers in flip-flops who nickname him Grandpa Buzz. His third shock is the realization that the tech sector usually tosses people aside at fifty. A few chapters later, he advances the expiration date to forty. A few chapters after that, he’s gone.

This sharp shift in the age of authority derives from increasingly rapid technological change. In the nineteen-twenties, an engineer’s “half-life of knowledge”—the time it took for half of his expertise to become obsolete—was thirty-five years. In the nineteen-sixties, it was a decade. Now it’s five years at most, and, for a software engineer, less than three. Traditionally, you needed decades in coding or engineering to launch a successful startup: William Shockley was forty-five when he established Shockley Semiconductor Laboratory, in 1955. But change begets faster change: Larry Page and Sergey Brin were twenty-five when they started Google, in 1998; Mark Zuckerberg was nineteen when he created Facebook, in 2004.
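
Those figures are just exponential decay at work. As a rough illustration—a minimal sketch, not from the article—here is how the cited half-lives compound over a single decade:

```python
# Toy illustration (not from the article): expertise decays
# exponentially under a given "half-life of knowledge."

def knowledge_remaining(years: float, half_life: float) -> float:
    """Fraction of expertise still current after `years`."""
    return 0.5 ** (years / half_life)

for era, half_life in [("1920s engineer", 35.0),
                       ("1960s engineer", 10.0),
                       ("software engineer today", 3.0)]:
    pct = 100 * knowledge_remaining(10, half_life)
    print(f"{era}: {pct:.0f}% of expertise still current after ten years")
```

After a decade, the nineteen-twenties engineer retains about eighty per cent of what he knows; today’s software engineer, about ten per cent.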

With the advent of the cloud and off-the-shelf A.P.I.s—the building blocks of sites and apps—all you really need to launch a startup is a bold idea. Silicon Valley believes that bold ideas are the province of the young. Zuckerberg once observed, “Young people are just smarter,” and the venture capitalist Vinod Khosla has said that “people over forty-five basically die in terms of new ideas.” Paul Graham, the co-founder of the Valley’s leading startup accelerator, Y Combinator, declared that the sweet spot is your mid-twenties: “The guys with kids and mortgages are at a real disadvantage.” The median age at tech titans such as Facebook and Google is under thirty; the standard job requirements in the Valley—which discourage a “stale degree” and demand a “digital native” who’s a “culture fit”—sift for youth.

That culture is becoming the culture. At Goldman Sachs—a century-and-a-half-old investment bank that is swiftly turning into a tech company—partners are encouraged to move on after five years or so, or risk being “de-partnered.” As one senior banker says, “There’s always somebody on your six”—military terminology for the guy right behind you. A recent A.A.R.P. study revealed that sixty-four per cent of Americans between forty-five and sixty had seen or experienced age discrimination at work. Accrued eminence still matters at law firms and universities (though tenured positions have fallen fifty per cent in the past forty years), but the rest of the culture has gone topsy-turvy. Even as Lycra and yoga make fifty the new thirty, tech is making thirty the new fifty. Middle age, formerly the highest-status phase of life around the world, has become a precarious crossing. The relatively new tech sector is generating enormous amounts of a very old product: ageism.

“Ageism” was coined in 1969, two years after the federal Age Discrimination in Employment Act set forty as the lower bound at which workers could complain of it. The upper bound continues to rise: the average life span grew more in the twentieth century than in all previous millennia. By 2020, for the first time, there will be more people on Earth over the age of sixty-five than under the age of five.

Like the racist and the sexist, the ageist rejects an Other based on a perceived difference. But ageism is singular, because it’s directed at a group that at one point wasn’t the Other—and at a group that the ageist will one day, if all goes well, join. The ageist thus insults his own future self. Karma’s a bitch: the Baltimore Longitudinal Study of Aging reports, “Those holding more negative age stereotypes earlier in life had significantly steeper hippocampal volume loss and significantly greater accumulation of neurofibrillary tangles and amyloid plaques.” Ageists become the senescent figures they once abhorred.

The baldest forms of ageism include addressing older people in “elderspeak”—high, loud tones and a simplified vocabulary—and tarring them with nouns like “coot” and “geezer” or adjectives like “decrepit.” The young can’t grasp that most older people don’t feel so different from their youthful selves. When Florida Scott-Maxwell was living in a nursing home, in 1968, she wrote in her journal (later published as “The Measure of My Days”), “Another secret we carry is that though drab outside—wreckage to the eye, mirrors a mortification—inside we flame with a wild life that is almost incommunicable.” She felt like the person she’d always been. Last year, Americans spent sixteen billion dollars on plastic surgery, most of it on fountain-of-youth treatments for wrinkles, trying to close the gap between interior vitality and exterior decay.

Eye tucks get an eye roll in two books that view the problem not as the elderly but as a culture that has forgotten how to value them. Ashton Applewhite’s “This Chair Rocks: A Manifesto Against Ageism” (Networked Books) and Margaret Morganroth Gullette’s “Ending Ageism, or How Not to Shoot Old People” (Rutgers) both grapple thoughtfully with how we got here. Yet each writer tends to see ageism lurking everywhere. Gullette, a resident scholar at the Brandeis Women’s Studies Research Center, is given to such pronouncements as “Typically, anonymous old people portrayed in art exhibits, websites, and journalism convey decline ideology.” That’s a lot of terrain to cover with a “typically.” Applewhite, an activist whose blog, “Yo, Is This Ageist?,” fields inquiries on the topic (the usual answer is yes), is the more grounded guide. She begins by suggesting that we call the elderly “olders.” Ordinarily, this sort of cream concealer—“aging” replaced by “saging” or “eldering”; Walmart greeters hailed for their “encore career”—deepens the frown lines it’s meant to erase. But Applewhite’s point is that older people may not be qualitatively different from “youngers.” She notes that only ten per cent of Americans who are at least eighty-five live in nursing homes, and that half of those in that cohort don’t have caregivers; for the most part, she maintains, they are cognitively robust, sexually active, and “enjoy better mental health than the young or middle-aged.” Her conclusion: “Clearly, hitting ninety was going to be different—and way better—than the inexorable slide toward depression, diapers, and puffy white shoes I’d once envisioned.”

Well, wait and see. Applewhite attacks those who carelessly attribute “decline to age rather than illness,” but the distinction lacks a real difference; age is the leading precondition for most of the decline-hastening diseases, such as cancer, heart disease, and Alzheimer’s. Ageism can be hard to disentangle from the stark facts of aging. Ursula K. Le Guin, who’s eighty-eight, remarks in her recent book of essays, “No Time to Spare” (Houghton Mifflin Harcourt), “If I’m ninety and I believe I’m forty-five, I’m headed for a very bad time trying to get out of the bathtub.” And that’s just the physical difficulties. A third of those over eighty-five have Alzheimer’s. Unsurprisingly, perhaps, the most virulent forms of ageism sprout in retirement communities: in some, if those in assisted living visit the independent-living dining room, they’re forbidden to bring in their walkers or wheelchairs. This often means that a couple married for fifty years can’t eat together.

Gullette argues that ageism stems from the perception that old people are irrelevant. She links the rise of ageism over the centuries to broad trends: the printing press and widespread literacy made the lore that elders carried in their heads available to all (a process hastened, and even finished off, by Google); the industrial revolution increasingly demanded younger, more mobile workers; and medical advances made so many people live so much longer.

Ageism is further fuelled, Gullette believes, by what she calls the “ideology of scarcity”—the trope that the elderly are locusts who swarm the earth consuming all our resources. The relevant economic terminology is indeed grimly suggestive: those over sixty-four are part of the “dependent” rather than the “productive” population; they are “the burden” that the young must carry. A Moody’s report suggests that the aging population—often apocalyptically referred to as “the gray horde” or “the silver tsunami”—will dampen global economic growth for two decades. The two biggest federal outlays, by far, are Social Security and Medicare, and the Bureau of Labor Statistics predicts that between 2016 and 2024 the five fastest-growing jobs (aside from wind-turbine service technicians) will be in health care and elder care.

Yet older people, increasingly, aren’t simply creeping off into a twilit world of shuffleboard and sudoku. In 2000, 12.8 per cent of those over sixty-five were working; in 2016, it was 18.8 per cent. Furthermore, old people have most of the money. Thirty years ago, households headed by those over sixty-five were ten times as wealthy as those under thirty-five; now they’re fifty times as wealthy. So the elderly are a huge market. Think how often you’ve seen ads selling the twin bathtubs of Cialis and the guy tossing the football through the tire of Levitra.

Gullette argues that pharma and cosmetic companies aren’t catering to the old so much as catering to the ageist idea that getting old is unbearable. Using similar reasoning, Allure decided this summer to drop the phrase “anti-aging” from all its copy. The magazine will now tell you only that retinol can smooth wrinkles and fade spots, which may make you look, um, different. The A.A.R.P. has proclaimed that “anti-aging” and its synonyms “serve no other purpose than to, well . . . make people feel bad about aging.” Dior, choosing its own way to show how vibrant a woman of a certain age can be, just made Cara Delevingne the face of its Capture line of wrinkle creams. (Delevingne is twenty-five.)

Gullette and Applewhite want you to feel great about aging. The path to that bliss is obscure, though, because they think everyone is doing aging wrong. Gullette warns against not only stereotypes of decline but also “the opposite homogenization: positive aging.” If you go skydiving, as George H. W. Bush did on his ninetieth birthday, you’re guilty of “competitive ableism.” Even if you simply murmur into your diary that you don’t feel eighty-one, Applewhite finds you guilty of “internalized ageism.” Comparing your state of mind to the number on your driver’s license, she says, “gives the number more power than it deserves, contributes to ageist assumptions about what age signifies and ageist stereotypes about what age looks like, and distances us from our cohorts.” Her way out of the aging pickle is “more examples in the media, many more, of olders living ordinary lives, neither drooling nor dazzling.” Here’s to the Meh Generation.

Applewhite contends that fear of aging is more Western than Eastern, and that it doesn’t exist in places that have escaped the reach of global capitalism. “In most prehistoric and agrarian societies,” she writes, “the few people who lived to old age were esteemed as teachers and custodians of culture.” This is a comforting idea: if ageism is a by-product of modernity, it should be relatively easy to reverse.

In truth, many nonindustrial societies—half of those that have been surveyed—forsake their elderly. The Marind Anim of New Guinea bury senescent elders alive. The Chukchee of Siberia stab them through the heart. And the Niue of Polynesia view impaired old people as “the nearly dead,” who threaten the barrier between worlds. For Niueans, the medical anthropologist Judith C. Barker writes, “To laugh at decrepit elders, to deride their feeble endeavors at being competent humans, to ridicule them, to neglect them, to be wary of and distant during interactions with them is not to disrespect an elder but to guard against foreign intrusion. These behaviors do not involve elders, but an entirely different category of being.” Namely, the Other.

A meta-analysis by the academics Michael S. North and Susan T. Fiske reveals that Eastern societies actually have more negative attitudes toward the elderly than Western ones do, and that the global ageism boom stems not from modernization or capitalism but from the increase in old people. North and Fiske also note that “efforts to intervene against age prejudice have yielded mixed results at best.” Having students simulate the experience of being old by donning weighted suits and vision-inhibiting goggles, or exposing them to “intergenerational contact”—actual old people—doesn’t lead to kumbaya moments. “Such approaches do not appear to incite a long-term desire among the young for interaction with elders,” they regretfully conclude, “and contact can backfire if older adults are particularly impaired.” Ageism, the slipperiest ism, is also the stickiest. What makes it so tenacious?

We don’t just caricature the elderly as raddled wretches. We also caricature them as cuddly Yodas. The anthropologist Jay Sokolovsky observed that “the ethnographic literature now abounds with this type of dramatic alternation between ‘Dear Old Thing’ and ‘Scheming Hag’ metaphors.” In 1862, Ralph Waldo Emerson situated the toggle point for these obverse perspectives on the outskirts of town:

Age is becoming in the country. But in the rush and uproar of Broadway, if you look into the faces of the passengers, there is dejection or indignation in the seniors, a certain concealed sense of injury.

Nowadays, this toggle point is situated in film and television, where elderly Native Americans and black men are portrayed as sages (Morgan Freeman has played the leader of each of the three branches of government, as well as God) but other elderly people are nearly invisible. “One of the worst things you can be in Hollywood is old,” Kathy Bates, who’s sixty-nine, remarked recently. A U.S.C. study of the films nominated for Best Picture between 2014 and 2016 showed that only 11.8 per cent of the actors were sixty or older, although that age group constitutes 18.5 per cent of the U.S. population. The same vanishing act occurs even earlier offscreen: one TV-writer friend of mine was warned when he got to Hollywood, “Don’t tell anyone you’re thirty, because you still look a little younger.” Older writers are sometimes called “grays,” as in, “We already have a gray.”

In the U.S.C. study, seventy-eight per cent of the films had no older female actors in leading or supporting roles. Actresses have always had a shorter runway; Jimmy Stewart was twice Kim Novak’s age in “Vertigo.” Economists call this phenomenon, in which older women’s looks are judged more harshly than older men’s, the “attractiveness penalty.” A Web site named GraphJoy analyzed the gender gap in studio films and found that Tom Cruise, for instance, was three years younger than his “Risky Business” co-star Rebecca De Mornay, in 1983, but that lately he’s been as much as twenty years older than his female co-stars. Two years ago, Maggie Gyllenhaal, at thirty-seven, was told she was “too old” to play the love interest of a fifty-five-year-old man. As Goldie Hawn’s aging-actress character observed in “The First Wives Club,” “There are only three ages for women in Hollywood: babe, district attorney, and ‘Driving Miss Daisy.’ ” Last year, California passed a law requiring sites such as IMDb, the movie and TV-show database, to remove people’s birth dates upon request.

A talent manager I know says that ageism in Hollywood has lately grown even more rampant because so much content is being viewed on younger-skewing platforms like Netflix and Amazon. Even as the number of broadcast-television viewers has dropped and the average age has risen, to fifty-four—well above the eighteen-to-forty-nine tranche coveted by advertisers—the four programs most watched by eighteen-to-twenty-four-year-olds last fall were on Netflix. This downward migration will only increase now that Apple and Facebook are rolling out programming of their own.

As with any form of social struggle, age warfare plays out metaphorically onscreen. The first modern zombie film, George Romero’s “Night of the Living Dead” (1968), begins with an aged mother sending her two grown children into a remote area to visit their father’s grave—where a graying zombie attacks and kills the son. A newscaster sombrely explains, “People who have recently died have been returning to life and committing acts of murder.” The taboo thrill that lifts “The Walking Dead” and its zombie ilk is watching mayhem unleashed on this lurching, teeming enemy—the “nearly dead” the Niueans loathe.

This generational combat also surfaced on “The Simpsons,” when Montgomery Burns told his assistant, “Look at those delightful children, Smithers—all those healthy organs ripe for the harvesting!” America’s most beloved show depicts the elderly in a remarkably raw light. (Yes, it’s an animated comedy, but, still.) Homer’s father, Abraham (Grampa) Simpson, is a senile galoot, consigned by Homer to a retirement home, prone to telling rambling stories, the butt of every joke. Montgomery Burns is a powerful tycoon given to underhanded schemes. But he, too, is both physically feeble and senile: not so much forgetful as lost in the past. At the post office, he declares, “I’d like to send this letter to the Prussian Consulate in Siam by aeromail. Am I too late for the four-thirty autogyro?”

Mash up Grampa Simpson and Montgomery Burns and you get Donald Trump. If, as Michael Kinsley once suggested, Al Gore is an old person’s idea of a young person, then Donald Trump is a young person’s idea of an old person. He and his aging, billionaire-laden Cabinet—the oldest since Reagan’s; the richest ever—embody the revenge of the old Philadelphians. The senile, reactionary elder who’s the target of Silicon Valley’s youth bias is a straw man. But that straw man will be hard to dispatch so long as he is running the country.

In the eighties, body-switch movies such as “Like Father, Like Son” and “Vice Versa” were told largely from the kid’s point of view: What would it be like to suddenly have all the perks and responsibilities of a grownup? With the cultural power now reversed, the frame is, too: What would it be like to suddenly have all the perks and responsibilities of a millennial?

In the pilot episode of the comedy “Younger,” which recently finished its fourth season on TV Land, forty-year-old Liza (Sutton Foster) tries to return to publishing after taking fifteen years off to raise a family. She tells the two snippy young women interviewing her at one publishing house, “Look, I know I’ve been out of play for a while, but I am a much smarter, more capable person than I was fifteen years ago!” They barrage her with all that she’s missed:

FIRST WOMAN: Facebook, Twitter, iPhones—

SECOND WOMAN: iPads, ebooks, YouTube—

FIRST WOMAN: Instagram, Snapchat, Skype—

SECOND WOMAN: Pinterest—

FIRST WOMAN: Bang with Friends.

Liza finally lands a job as an assistant at a house called Empirical—but only by pretending to be twenty-six. She dyes her hair, buys a flannel wardrobe, and bones up on such cultural touchstones as Katniss Everdeen and One Direction and on lingo like IRL, sorry/not sorry, truffle butter, and spit-roasting (which prove not to be the culinary terms they seem). Yet what makes Liza invaluable at Empirical—what propels the show past its central absurdity—is less her newfound facility with Krav Maga than her conscientiousness and wisdom. Because she solves everyone’s problems, her new friends tacitly agree to ignore the fact that she looks and acts forty.

Liza’s anxiety is not about keeping up; it’s about acting her supposed age—for instance, she can never quite get the hang of a meme. “Younger” deftly shows how the new ageism expresses itself as a question less of competence than of cultural fit. At one point, a twerpy Silicon Valley billionaire named Bryce becomes an investor in Empirical. He introduces “hot desking,” flies in cocktails by drone, and tells Liza, “I’m recommending we cut staff by forty-five per cent next quarter. Not you—just the old people.”

The Valley’s denizens, despite their sloganeering about worldwide empowerment, secretly believe that tech creates a series of moats in which digital immigrants eventually drown. Cord-nevers look down on cord-cutters, who look down on landliners, who look down on TV-setters, who look down on AOL-addressees like me. (I’m hoping it will eventually seem retro in a cool way, like blacksmithing.) Shortly after Google began, it marked its cultural boundary when Larry Page and Sergey Brin took a meeting with Barry Diller, the old-media tycoon. Brin arrived on Rollerblades, and Page kept staring at his P.D.A. Nettled, Diller asked if he was bored. “I’ll always do this,” Page said, continuing to stare at his P.D.A. Devices are divisive: they divide us from them.

Can the olds thrive among tech’s youngs? Earlier this year, Chip Conley recounted in the Harvard Business Review how he became a patriarch at Airbnb at fifty-two. “Many young people can read the face of their iPhone better than the face of the person sitting next to them,” he explained. Offering emotional intelligence in return for their digital intelligence, he styled himself as a “modern Elder,” “who serves and learns, as both mentor and intern, and relishes being both student and sage.”

If that sounds goopy but screenplay-ready, it’s because it’s essentially the plot of “The Intern” (2015), which was written and directed by Nancy Meyers, Hollywood’s leading impresario of later-life fantasies. Robert De Niro plays Ben, a widowed, menschy seventy-year-old who becomes a “senior intern” at a fashion startup run by Anne Hathaway’s character, Jules. Discriminated against in his job interview (“What was your major? Do you remember?”), and initially ill-adapted to this new world—he wears a suit, carries a briefcase, and uses a flip phone and a Casio calculator—Ben soon learns from Jules how to set up his Facebook profile. In return, he saves Jules’s company and her marriage, teaches the twentysomething interns how to man up, and even scores with the company masseuse. There may be snow on the roof, but there’s still a fire in the kitchen.

De Niro, at seventy-four, is too old to play a traditional leading man. But Hollywood has finally found a solution to the technology-hastened problem of stars aging out of the demo: better technology. In “The Irishman,” a forthcoming Martin Scorsese Mob film, the director will pair once more with De Niro, his favorite actor. The twist is that motion-capture technology and C.G.I. will enable the actor to look fifty in the film’s present day—and thirty in its flashback scenes. The time-honored Hollywood cry “Get me a thirty-year-old Robert De Niro!” is being answered by De Niro himself. Can the late Paul Newman be far behind?

The germ of ageism is age—what it brings and what it bodes. “Aging: An Apprenticeship” (Red Notebook Press), a collection of essays edited by Nan Narboe and written by a parliament of mature observers, is rife with gimcrack Zen. “The sound of the ocean is the sound of time passing, the sound of one moment giving way to the next,” one writer intones, and another imparts the axiom “Old age grounds us and from that grounded point of view, we can begin to attend to our inner and outer world in a way that we could not when we were speeding over the surface of things.” Children astonish; priorities change; wisdom accrues; readers nap.

The book’s flintier writers, all on the older end of the spectrum, scorn such piffle. Edward Hoagland observes, “By not expecting much, most of us age with considerable contentment—I’ve been noticing lately at senior-center lunches and church suppers—and even die with a bit of a smile, as I remember was often the case during a year I worked in a morgue in my twenties.” The poet Donald Hall casts a wintry eye at our circumlocutions for death—pass away, go home, cross over, etc.—and notes that “all euphemisms conceal how we gasp and choke turning blue.”

Timeless writers are ageists nonpareil. Shakespeare referred to life’s final scenes as “second childishness and mere oblivion, / Sans teeth, sans eyes, sans taste, sans everything.” Philip Larkin, in “The Old Fools,” wrote, “Their looks show that they’re for it: / Ash hair, toad hands, prune face dried into lines.” And Philip Roth, in one of his later novels, wrote that “old age isn’t a battle; old age is a massacre.”

How can we avoid this savage truth? Obviously, by shunning old people. In “Ageism: Stereotyping and Prejudice Against Older Persons,” a collection edited by Todd Nelson (M.I.T. Press), a chapter by the psychologists Jeff Greenberg, Peter Helm, Molly Maxfield, and Jeff Schimel points out that many people preserve themselves from “death thought accessibility” by shunning “senior citizen centers, bingo parlors, nursing homes, golf courses, Florida, and Rolling Stones concerts.” The authors dryly conclude, “Another way to avoid older adults is to keep them out of the workplace.”

Ageism is so hard to root out because it allows us to ward off a paralyzing fact with a pleasing fiction. It lets us fool ourselves, for a time, into believing that we’ll never die. It’s not a paradox that ageists are dissing their future selves—it’s the whole point of the exercise. The psychologists who developed “terror management theory” built on an insight of the cultural anthropologist Ernest Becker, who wrote, “The irony of man’s condition is that the deepest need is to be free of the anxiety of death and annihilation; but it is life itself which awakens it, and so we must shrink from being fully alive.”

If ageism is hardwired, how can we reprogram ourselves? Greenberg and Co. suggest three ways: having the elderly live among us and fostering respect for them; bolstering self-esteem throughout the culture to diminish the terror of aging; and calmly accepting our inevitable deaths. They note, however, that “all these directions for improvement are pie in the sky, particularly when we think of them at a society-wide or global level of change.” So ageism is probably inevitable “in this potentially lonely and horrifying universe.”

That took kind of a dark turn, didn’t it? The only way to eliminate the terror that animates ageism is to eliminate death. The good news, sort of, is that the eager beavers in Silicon Valley are working on that, too. ♦
