Rebecca Schuman | Longreads | December 2018 | 11 minutes (2,795 words)
The ’90s Are Old is a Longreads series by Rebecca Schuman, wherein she unpacks the cultural legacy of a decade that refuses to age gracefully.
* * *
The 1990s did not end on January 1, 2000. The monumental anti-climax of Y2K — a computer “bug” that was supposed to screech our Earth to a Scooby-Doo foot-cloud halt, but instead did bupkis — was a truly apt expression of the preceding decade. But even discounting Y2K, I’ve got some serious issues with the alleged “turn of the actual millennium” as the endpoint of the most intentionally underwhelming decade of the 20th century. And not just because 2000 (zero-zero) is so obvious and overplayed — though there is, of course, that.
The actual termination point of the ’90s required an attitudinal shift that would decentralize the role of Generation X as the admittedly petulant target of all culture and advertising — the thawing of the winter of the bong-ripping couch-slacker’s discontent; the disappearance of gin and juice from house-party bars; the centering of the hot tub on The Real World; the sobering realization that both men and women were from Earth and just sucked; the demise, for that matter, of Suck itself.
In point of incontrovertible fact, the 1990s would not end in the United States until the aughts’ resurgence of aggressive consumerism and even more aggressive vacuity came to dominate all aspects of sociopolitical and popular culture. So the only question is: When was that? There are more potential answers than squiggles on a Fido Dido sweatshirt.
Was it in 2001, when the original Fast and the Furious premiered? 1997, when Blur released that WOO-hoo song? Was it 2010 — you heard me, two thousand and ten — when enough grandparents had shuffled off the mortal coil to make the primary avenue of written news consumption digital rather than paper?
I have spent an unnecessarily and perhaps questionably extensive amount of time researching this subfield, and I present my findings to you now in a perplexing new format (I believe it is called a “list-cicle”?) that is apparently the only thing young people are able to read.
Sixth Place: September 11, 2001
Let’s get right to this colossal bummer so that nobody thinks I’m insensitive. I lived about 60 blocks north of the World Trade Center at the time and had just quit my job in the Financial District to focus on my very important career as a part-time MFA student and generalist dumbass. In the direct aftermath of the terror attack, I spent my newly unemployed days “producing” a play at the Tribeca Playhouse, located in the literal rubble of the Twin Towers, as well as the metaphorical rubble of my oldest, greatest friend: detached irony. Suddenly, after the murder of 2,600 of my fellow New Yorkers, not even I felt the need to filter every assertion through several layers of all-knowing spite.
It was bad enough — in a city that had perfected the art of minding one’s own business while simultaneously hearing one’s next-door neighbor’s every belch — that everyone had to share this horrible thing together, communally breathing in that inescapable stench. It was bad enough to have to feel a sincere wrenching emotion in front of other people. But what was worse was that it took about 14 hours before New York’s shock and grief was itself hijacked (pun intended) by the opportunistic fuckfaces the Supreme Court had just inexplicably ruled into office, who promptly declared war on an uninvolved country and simultaneously cultivated the aughts’ true Axis of Evil: megalomania, megachurches, and McMansions.
So, big congratulations, 9/11. You have ruined everything, even this fun ’90s series, which now must end — though not before I talk for at least a moderate amount of time about, among other important things, Britney Spears.
Fifth Place: January 8, 2001
As the new millennium celebrated its first birthday, Britney was 19 years old and already a veteran global superstar. The only thing that sold more records than her vocal fry was her handlers’ genius take on aughts purity culture. Spears’s early brand was abstinence, sold with as much glistening, barely legal midriff as possible.
The apex of this orgasmic sexual avoidance came (or didn’t) in Britney’s highly engineered relationship with a twerpy little kid from ’NSYNC named Justin Timberlake. Their finest moment as a couple transpired when they strode the red carpet of the 2001 American Music Awards in matching head-to-toe light-wash denim. Jeans (or, in Britney’s case, jean-bustier-gowns), just like cinema, music, and everything else, now belonged exclusively to the teens and the desperate grown-ups who would spend the next decade catering to their every whim and pretending to be them.
These days, Britney — a comeback veteran at 36, a Vegas mainstay, the mother of two middle schoolers — is a national treasure of the same caliber as your Betty Whites and Dwayne “The Rock” Johnsons. Nobody even cares that she lip-syncs. (I don’t.) Timberlake, meanwhile, successfully reinvented himself as a Serious Artiste, and who doesn’t kind of dig JT? (I do.)
Given how much we are wedded to their current personae, it feels a little uncomfortable to remember just how much Spears and Timberlake used to abjectly blow, and just how aggressively their particular brand of blowage blew up the remains of the ’90s. But they did. They blew a lot. (I mean, figuratively, probably. I’m not sure what is or isn’t covered in virginity pledges.) Against the tattered backdrop of the previous years’ lo-fi earnestness and anger, their meaningless parroting of lyrics mass-produced in a Swedish hit factory — as they unironically sold sex to promise-ring-wearing virgins — was cold-turkey dystopia to those of us still keening along to OK Computer.
And yet, the real Karma Police made sure I knew how unimportant it was that I considered Spears and Timberlake a humanitarian crisis — and this forced me to bid bye bye bye to any remaining trace of cultural relevance. For try as I might to stuff my ham-hock thighs into the toothpick-sized upper quadrants of ultra-low, whisker-washed Seven superflares, I would never be cool again.
Fourth Place: June 15, 2002 (or thereabouts)
In approximately the middle of the fine month of June in the Year of our Lord Aught-Two, the world order changed and nobody even noticed. This was the ignominious moment when the graduating class of ’02 entered the college-educated workforce. Hordes of junior analysts, associate investment bankers, paralegals, lab technicians, accountants, software engineers, and Second Assistants to the Senior Vice President had the nerve to put on grown-up outfits and strut around offices like they had any right to be there, despite the disqualifying fact that most of them were born in or later than 1980.
Couldn’t anyone see that these people were tiny baby toddlers? They were in the seventh grade when I was a senior in high school. If I had taken any of them to prom, I’d be in jail for pulling a Mary Kay LeTourneau, a phat reference that only people born earlier than 1980 will even get.
There was just something off about the early ’80s babies — that uncomfortable Crossroads of a generation that was no longer Gen-X, but not yet millennial. For unlike us, they were born at a time when it had officially become unacceptable for parents to spend the day chain-smoking and gabbing on the phone while their offspring were locked in lead-covered playpens with a shorted-out Lite Brite and a rusty can of Tang.
The coked-out overachievers of the Reagan era doted on their children with help from the What to Expect parenting-book craze — and when those children became alleged adults they were different than I was. They hadn’t been raised on standardized tests like their successors, so they still had a Gen-Xer’s ability to problem-solve without getting their hands held — but unlike us, their proudly unambitious and comfortably numb predecessors, they also had something totally foreign to me: goals.
Despite being technically too young to drink, vote, drive, or attend a PG-13 film, these children toddled into our places of work and did unacceptable things like reorganize their boss’s Rolodex without being asked to, leading our bosses to wonder why we weren’t reorganizing their Rolodexi when we apparently had time to email our appropriately aged colleagues with such pressing professional matters as RE:re:re:re: HR is full of skanks. Their parents were the real-life American Psychos, so it’s only fitting these alleged young adults would grow up to dismember the slacker ethos of an entire decade.
Third Place: July 1, 1997
When Sean Combs — who at the time went by “Puffy” or “Puff Daddy” — released No Way Out in the United States, it debuted at No. 1 on the Billboard chart, and then stayed on that chart for 28 weeks, scorching the earth of the East Coast/West Coast hip hop wars and, according to Vibe, creating a “seismic” change in the rap world forever.
The album’s first hit single, “I’ll Be Missing You,” an homage to slain artist Christopher Wallace a.k.a. The Notorious B.I.G., wasn’t just the first rap song ever to debut at number one. It was also — in stark contrast to the cocky and uncomfortable-to-white-people subversiveness of, say, “It Was a Good Day” — a treacly, middling karaoke version of a criminally overrated ode to stalking. The first time I heard it, I assumed it was a Weird Al parody. Entertainment Weekly gave the song a “D” and described it as proof positive that rappers were not above “self-serving sentimentality.”
Hip hop has always used sampling, of course, but in 1997, Puffy elevated the practice all the way past coolness and straight through hackdom without even stopping, and finally landed on the aughts’ most iconic overarching synecdoche: the empty Personal Brand. No Way Out was the best way into the real source of Benjamins: merchandising.
Hip hop, like all popular music, has also always been largely about fame and wealth — the gold chains Run-DMC wore were the real thing; they penned a whole amazing song to their shoes — but in the ’80s and Doggystyle-era ’90s, much of that money was happened upon by way of making legitimately great music. To Puffy’s immense credit as a business genius, in 1997 he made the important discovery that the music was largely incidental.
Today, one can purchase a Sean John necktie at Macy’s for approximately $70, and Combs holds a net worth of $820 million.
Runner-Up: December 12, 2001
In the criminal justice system of late ’01, the people were represented by two separate but equally important groups: the police who apprehended Winona Ryder as she grabbed $5,500 worth of overpriced crap and waltzed out of the Beverly Hills Saks, and the district attorneys who looked up from their towering stack of paperwork and went, “Wait, isn’t that the girl from Beetlejuice?” Apparently, playing a series of sharp-tongued, disaffected, capitalism-averse waifs did not directly correspond to real-life sharpness or aversion to luxury goods. And so, at the cusp of reality TV’s domination of the small screen, the grainy closed-circuit camerawork of Ryder’s spaced-out shoplifting became everyone’s favorite show for a few weeks.
Had Veronica from Heathers finally snapped? Was whatsername who wasn’t Cher or Christina Ricci from Mermaids on drugs? Was Lelaina Pierce having some sort of breakdown? Was Girl, Interrupted real????? (I mean, I know Girl, Interrupted is a memoir based on real things, but was Winona really losing it?) Ryder was sentenced to a few years’ probation and had to pay Saks restitution. And because it was the aughts and not yet the twenty-teens, where a criminal enterprise of this sort might garner one a Netflix special, the escapade ensured a quick-onset atrophy of Ryder’s acting career for the better part of two decades.
When Stranger Things premiered in 2016 with Ryder in a widely lauded comeback role as (what else?) a beleaguered middle-aged mom, she finally addressed the poorly crumpled, garment-bag-sized elephant in the room, sort of. She never divulged why she did it, but she did explain that the incident and arrest had saved her in a way, by forcing the Hollywood hiatus she needed — not merely to regain her physical and mental health, but to break out of all of those late-’80s and ’90s roles that typecast her to the very depths of her possibly-not-permanent grave (I don’t remember the plot of Beetlejuice that well).
In the aftermath of Ryder’s arrest, the immediate box-office void of films about flawed, giant-eyed young white chicks — the Morose Prozac Downer Girls — ushered in the era of such breakthrough female-driven cinema as Josie and the Pussycats. After all, given that advertisers had already long abandoned the ’90s woman — she was, to be perfectly honest, getting sort of old — it was inevitable that she also disappear from big-screen representation.
Ryder effectively killed the ’90s chick flick in the least-’90s way possible: with a materialistic crime that centered the exact kind of overpriced, ostentatious, label-forward crapola that ’90s women mocked. This crapola would max out the credit cards of the entire ensuing decade, as young women fell over themselves to buy hideous $1,000 bags and excruciating $600 shoes in emulation of the true murderer of the ’90s: Carrie Bradshaw.
The “Winner”: June 6, 1998
There were two types of people in the ’90s: losers, and people who didn’t know they were losers yet. As such, it was a challenge to choose a true decade-killer among these weak specimens. However, the dishonor 100 percent goes to Sex and the City, a television program and cultural phenomenon that premiered on HBO on a date that will live in infamy as the precise moment the ’90s were summarily impaled on a fur-covered Manolo Blahnik stiletto.
SATC had the subtext-bereft sincerity of both a Puff Daddy record and a post-9/11 second-person disaster narrative. It had the career ambitions of an ‘80s child and the unattainably expensive, cynically sexualized yet hideously ugly fashion of the Justin-and-Britney-era denim abomination. It had the incomprehensible acquisitiveness and brand-name obsession of Winona shoplifting in Saks, and the accompanying murder of the nuanced heroine who cared about anything besides pleasing men and cutting carbs.
Sure, early-era Carrie was vaguely, disingenuously ’90s: She smoked, and wore the same ratty fur coat with every outfit in a kind of upscale-Courtney Love homage. She didn’t have a real job (even at an extraordinary $1 a word, a weekly column at the New York Star would pay $3,000 per month, or $36,000 per annum before taxes). And she always accessorized with that dumbass nameplate necklace, itself an ironic and ill-advised co-option of black urban culture, rendered in what Carrie later described as, and I quote, “ghetto gold.” (A lovely example of the unapologetic and often unbearable whiteness of SATC’s being.)
But despite these half-assed vestiges of subversiveness and irony, and despite Carrie’s age at the show’s onset placing her directly in the middle of Generation X, the SATC universe — and the influence that universe would soon wield on an entire generation of suburban middle schoolers arguing over who in their group was The Samantha — espoused a shockingly superficial and materialistic ethos from its first infernally addictive episode.
The worst part of Sex and the City was that it existed in the liminal space between the ’90s’ reverence of anti-materialistic coolness and the total disappearance of cool as a concept in the 21st century. As a result, Carrie and her rich, vapid friends were cool, and the show’s nearly instantaneous status as appointment television meant that everyone else wanted that kind of cool, too. Unlike the cool of the ’90s, which depended upon the rejection of anything commercial and popular, you could purchase SATC’s cool at Barneys and the Patricia Field store on West Broadway. (As a sweet bonus, the show also ushered in the destruction of the island of Manhattan as a remotely interesting place to be.)
Now, suddenly, you had to be a size two and date investment bankers. Cool metamorphosed from an attitude into a commodity. By the time the openly craven marathon of product placement masquerading as the series’ feature film hit cinemas in 2008, the very concept had bought, sold, and processed itself out of existence.
Sex and the City may have started before the turn of the millennium, but there is nary another lone cultural entity that both caused and represented the aughts’ vile self in such dominant and equal measure. Carrie, Samantha, Charlotte, and Miranda may have come of age in the ’90s, but I Can’t Help But Wonder: Are they also the four horsewomen of its apocalypse?
* * *
Editor: Ben Huberman