BROOKLYN, NY - JUNE 18: PUMA sneakers on display at the PUMA Hoops HQ kickoff where Walt "Clyde" Frazier signs the first ever life long contract with PUMA on June 18, 2018 in Brooklyn. (Photo by Jamie McCarthy/Getty Images for PUMA)
There were few players as dominant in college basketball this past season as Deandre Ayton, a 7-foot-1 center who played his freshman year at the University of Arizona before declaring for the NBA draft. The native of the Bahamas was an imposing force and, as such, will likely be selected as the top pick in the 2018 NBA draft, which will be held at the Barclays Center this Thursday.
It’ll be a historic moment: If he is chosen by the Phoenix Suns with the first pick, Ayton will become the fourth international player in the past six years taken number one overall. But even if he goes second, Ayton will still make history: in a shocking turn, the center spurned Nike, Adidas, and Under Armour to sign a four-year, multimillion-dollar sneaker endorsement deal with Puma, a company that hasn’t been relevant in the sneaker game for decades.
When asked by Bleacher Report about the ramifications of signing with a company whose last NBA sneaker endorsement ended in arbitration (Vince Carter signed with Puma in 1998, only to back out of his contract a year later, claiming Puma failed to deliver a signature sneaker as well as a sneaker that fit properly; he had to pay $13.5 million after the arbitrator ruled Carter had indeed breached his contract), Ayton said, “That’s a problem. That’s going to catch everybody’s eyes. That’s a huge step for Puma, too.” Read more…
NEW YORK, NY - MARCH 14: Bruce Springsteen performs onstage during a special performance of "Springsteen on Broadway" in front of an audience of SiriusXM subscribers at Walter Kerr Theatre on March 14, 2018 in New York City. (Photo by Kevin Mazur/Getty Images for SiriusXM)
There is a moment at the end of Bruce Springsteen’s “Thunder Road,” the seminal track from his 1975 album Born to Run, in which New Jersey’s most famous son intones, “It’s a town full of losers, and I’m pulling out of here to win.”
The lyric is classic Springsteen, a nod to the most consistent theme of his biggest hits throughout his early catalog, which spans seven records over a decade from the mid ’70s to the mid ’80s. From “Born to Run” to “Atlantic City,” Born in the USA to The River, Springsteen is constantly searching for the open road and the fulfillment of some inherent promise and potential. Springsteen was 26 when he recorded “Thunder Road,” and it’s not surprising that the musician’s promise that “these two lanes will take us anywhere” would appeal to fellow baby boomers, those caught between quarter-life ennui and the search for something more.
But Springsteen’s appeal hasn’t been static. As fans age with the Boss, those same themes of entrapment and freedom have taken on new meaning while, at the same time, attracting new audiences, such as millennials and those who came of age during the recession. Born in New Jersey, Toniann Fernandez of The Paris Review grew up haunted by Springsteen’s specter:
The sound of “Born in the U.S.A.” used to conjure images of the muscular white boys of my high school years, drunk with testosterone and Natural Ice, clad in denim and American flags. They screamed along with E Street imitators in bars we were all too young to patronize. I had always found the Springsteen omnipresence in coastal New Jersey offensive.
That sentiment, though, changed recently, and Fernandez describes her quest to not only embrace the musical menace of her teenage years but to actually meet Springsteen during the Broadway run of Springsteen on Broadway.
I had exactly five hundred dollars in my savings account at the time, the last crumbs of my earnings from my days as a nine-to-fiver. He encouraged me to buy the ticket. I told him that he didn’t get it. The point was not just to see the show, the point was for the Boss to request my presence at the show, perhaps in the front row. I suppose I hadn’t been so clear to myself or to anyone else how much this was about me, not Bruce. When I went back to the ticket window, the clerk told me the ticket was in someone else’s cart on Ticketmaster and that I would have to wait three minutes to see if they released it. Of course, having the ticket withheld was all I needed to draw my debit card from my wallet. Three minutes of purgatory ended, and I paid for my ticket through tears.
Fernandez writes of finally understanding the Boss’s appeal once she left New Jersey, of realizing and appreciating what the open road feels like from the getaway car. What’s fascinating is how the escapism Springsteen represents, his hook for all these years, is an oft-repeated theme across various forms of music. Take EDM: as Emily Yoshida explains in her recent essay for Vulture about Avicii’s reported suicide, the musician’s massive hit “Levels” spoke of attaining a level of both personal and professional success that seemed (and still seems) unattainable to anyone who celebrated their 21st birthday in the mid-2000s.
Like every apocalyptic radio pop song of that era, asking us to live like tomorrow will never come, there was an overwhelming need for the music of the era to freeze time, both to stave off adulthood, but also to deny every feeling of doubt and sadness and confusion that had come before, to will it away in order to start our lifestyle brands or build our Twitter following. I had managed to convince myself in 2011 that I could still get what I wanted, but in reality I had a very small reservoir left, constantly one disaster away from moving back home again.
There is a connection between Springsteen and Avicii, of escaping and living like tomorrow will never come, and it’s why Springsteen’s catalog still sounds fresh after all these years. Yes, many of his tracks are bangers, but that’s beside the point: the Boss’s lyrics connect us to a future that we may never know.
Kanye West’s emergence from his self-imposed cocoon of social media silence last week has not been seamless. After proclaiming his support for Donald Trump and the president’s Make America Great Again plank, the musician and fashion designer took to TMZ Live on Tuesday for arguably the most bizarre of what has already been a bizarre fortnight of proclamations:
“When you hear about slavery for 400 years. For 400 years?! That sounds like a choice. You was there for 400 years and it’s all of y’all. It’s like we’re mentally in prison. I like the word ‘prison’ because ‘slavery’ goes too direct to the idea of blacks. Slavery is to blacks as the Holocaust is to Jews. Prison is something that unites as one race, blacks and whites, that we’re the human race.”
Kanye is well aware of the weight his words carry. As someone who has referred to himself as the “most impactful artist of our generation,” Kanye long ago realized that anything he says, no matter how inane and obviously ridiculous, will be incessantly discussed. For Kanye to then make such an ignorant proclamation is willfully disingenuous. And his follow-up tweets (now deleted) didn’t help to clarify his position:
The latter tweet references William Lynch, a purported 18th-century slave owner from the British West Indies who traveled to Virginia in 1712 to teach slave owners how to better control their property. His speech on the banks of the James River was first “discovered” in 1970 and began its life online in 1993, when a reference librarian at the University of Missouri-Kansas City uploaded the “Willie Lynch letter,” which detailed how Lynch psychologically and physically tortured slaves. The letter is also patently false.
Willie Lynch never existed, nor did anyone from the British West Indies organize such a summit to advise slave owners in the early 1700s. As the librarian mentions in an email to her superiors, “Prof. [William] Piersen of Fisk contacted us a few months back about its origins and provided me with a critique which points to the narrative being a much-latter-day document…assuming Prof. Pierson’s [sic] critique is on target, I think it likely that it’s a ’60s or ’70s document.”
I accessed this email via the Wayback Machine, which means it has existed to dispel the Lynch rumor for years. Yet the letter continues to be legitimized within the framework of pop culture. Kanye isn’t the only artist to name-drop Lynch: so have Talib Kweli, Lupe Fiasco, Kendrick Lamar, and Nas, among others.
And it’s not just rappers dotting bars with references to Lynch’s “letter”: Denzel Washington quotes it at length in the 2007 film The Great Debaters.
The letter, and its supposed relevance in explaining not only the slave experience but also the origins of “lynching,” has been disseminated enough times that in 2004 Jelani Cobb wrote an extended answer to the question, “Is Willie Lynch’s letter real?”
There are many problems with this document — not the least of which is the fact that it is absolutely fake…it has been cited by countless college students and a black member of the House of Representatives, along the way becoming the essential verbal footnote in barbershop analysis of what’s wrong with black people.
When Mark Adams of the Baltimore Sun contacted the publisher of the St. Louis Black Pages (the newspaper that first printed Lynch’s speech in the early 1990s) in 1998 to inquire about the provenance or authenticity of the letter, Adams was rebuffed. “I’ve never run a piece that got the response this one got. There’s something truly magical about it. Don’t ask me to explain it,” said publisher Howard Denson. “How else can you explain how whites kept control when they were outnumbered five, 10 or 20 to one?” he asked. “Blacks still carry the negative mental legacy of slavery. I think we really need to address the things that hold us back. Blacks spend $400 million annually, but they believe they’re poor and powerless because they’ve been conditioned to think that way.”
Even though the letter is fake and Willie Lynch is a conjuring from the civil rights era, that is not to say the character doesn’t have power. As Lupe Fiasco noted via Twitter:
It’s shouldn’t be suggested that because it lacks a verifiable provenance or even that if it is a work of modern literary fiction that it’s source material & subject matter should be downplayed. It has a power & is actually rather tame compared 2 REAL historical facts @Longreads
Lynch’s letter shouldn’t be downplayed, but neither should it be given the weight of a work of historical significance, and perhaps a trip down the Willie Lynch rabbit hole will lead to other sources that provide far greater historical context, whether that be Solomon Northup, When I Was A Slave, or Cudjo Lewis.
Willie Lynch is an urban myth, and while the internet is full of stories we know to be false, we’ve known for more than two decades that there was no Willie Lynch. So why keep spreading the lie only to fit a convenient narrative? For Kanye to willfully ignore what has been proven untrue is perhaps more dangerous than his support of MAGA and his “brother,” Donald Trump.
From left, writers Alice Crites, Stephanie McCrummen, Amy Gardner, and Beth Reinhard embrace in the newsroom after The Washington Post wins two Pulitzer Prizes. The Post shared a Pulitzer with the New York Times for their coverage of Russian meddling in the 2016 U.S. presidential election and contacts between President Donald Trump's campaign and Russian officials and won a second Pulitzer for uncovering the decades-old allegations of sexual misconduct against Senate candidate Roy Moore of Alabama. (AP Photo/Andrew Harnik)
As expected, the New York Times and The New Yorker dominated much of the 2018 Pulitzer Prize fanfare, and while it is necessary to honor the award-winning reporting undertaken by Jodi Kantor, Megan Twohey, and Ronan Farrow, some of the most-talked-about features from this past year were also celebrated. Among them: Rachel Kaadzi Ghansah, whose in-depth reporting on Dylann Roof for GQ won for feature writing (Ghansah also won a National Magazine Award for the story), and the staff of the Cincinnati Enquirer, which provided a brutal examination of the effects of heroin over a week-long period.
Evan Butcher of the Chippewa Tribe plays basketball near Cannon Ball, North Dakota, 2016. (Robyn Beck/AFP/Getty Images)
Last week, the New York Times Magazine featured the high school basketball team the Arlee Warriors on its cover. Hailing from the town of Arlee, home to about 600 people on Montana’s Flathead Indian Reservation, the Warriors are among the greatest Native American high school squads ever assembled, a group that blends a high-octane offense predicated on three-point field goals with a frantic, suffocating pressure defense.
The feature, written by Abe Streep, doesn’t just showcase the Warriors and its players — including Phillip Malatare, a six-foot guard who’ll be a preferred walk-on at the University of Montana next fall — it also profiles the town, the reservation (a sovereign nation comprising the Confederated Salish and Kootenai Tribes), and a wave of recent suicides in the community. It was these suicides that prompted the Warriors’ transformation: The team wasn’t just a winner of back-to-back state titles, but rather a beacon to those that viewed suicide as a solitary option. Read more…
David Chang with South Philly Barbacoa's Cristina Martinez in 'Ugly Delicious.'
There’s no denying that David Chang’s new Netflix docuseries, “Ugly Delicious,” is aesthetically gorgeous. The show’s underlying concept, that “ugly” foods like tacos, barbecue, and fried rice possess an intrinsic value transcending their birth out of necessity and a lowly legacy, is a sui generis angle for a well-worn genre that has long shifted to food porn rather than pursuing and examining the cultural and geopolitical value that food possesses.
In a recent interview with Grub Street, “Top Chef” judge and chef Tom Colicchio mentioned the rise of “unfussy” food on the program’s 15th season: “The chefs were doing more, I wouldn’t say rustic, but a much more conventional style of food.” Translation: This shift isn’t occurring in a vacuum.
As the New Yorker’s Helen Rosner explains in her review of the eight-part series, “What makes ‘Ugly Delicious’ compelling, ultimately, is Chang’s commitment to rejecting purity and piety within food culture…In food culture, particularly American food culture, the concept of authenticity is wielded like a hammer…[and] the problem with such rigid categorizations, according to ‘Ugly Delicious,’ is, for one thing, creative stagnation.”
This certainly makes for a thoroughly interesting viewing experience; before I realized it, I had binge-watched four episodes. This sort of programming is also refreshing: Chang has subverted a genre. For a generation bred on the gluttony of glossy networks and competitive cooking, “Ugly Delicious” throws up a middle finger and instead asks questions that are relevant to how we should be thinking about food (and not just consuming it for sheer shock value). Read more…
It started with a Lean Cuisine. After a night out catching up with a friend, I just made one of the last NJ Transit trains leaving Penn Station that April night, traveling the 20 or so minutes to the suburbs of New Jersey where my wife and I had moved two weeks prior in anticipation of welcoming our first child to what truly is NYC’s sixth borough (daycare and two-bedrooms aren’t cheap in the original five boroughs).
Dinner consumed and sleep completed, I awoke with one of the worst bouts of food poisoning I’ve ever encountered, a wasting that lasted through midway into the next week. Eventually, the illness subsided, but I wasn’t concerned when I woke the following Saturday with slight soreness in my right ankle and the inability to fully extend my left knee. Seemed odd, sure, but perhaps that was just a lingering side effect of the tainted Swedish meatballs. And there was a crib to be built along with final trips to Target and Buy Buy Baby.
On Sunday, though, I could barely put any pressure on my right leg, hugging walls as I walked in an effort to support myself. An ankle sprained overnight, I thought, but I still went to the ER on Monday while accompanying my wife to her final OBGYN appointment. The medical establishment’s consensus: must be a sprain. I spent four more days in sheer agony, unable to put any pressure on my right leg and unable to sleep because the pain was too intense. After our son was born, I was the one wheeled out of the hospital while my wife carried our nine-pound baby. By this point, my other ankle had started to tingle (both eventually swelled to the size of a grapefruit, and my left knee was significantly inflamed, looking as though a softball had lodged itself behind the kneecap).
Two days after our first child was born.
The first two weeks of my son’s life were a blur. Of course, there was no way I could help during feedings in the middle of the night — it took me 10 minutes just to navigate the short hallway between the bed and bathroom. And unless I was sitting on the couch RICEing my legs (for what I thought must be the two worst sprains in the history of orthopedics), I couldn’t hold my son. I was an invalid, completely useless to all around me at a time when those same people needed me the most. My brother had gotten married during this time period, and I recently looked at the photos from the ceremony, which was held at City Hall: shuffling along on crutches, my ankles are encased not only in compression wraps but also air casts.
The realization that my “sprains” had — HAD — to be something more significant arrived during my son’s third week on earth. My general practitioner was located in Manhattan, so I found a local doctor who, for the first time since this health scare began, had more than an inkling of what was plaguing my body: I had joined the roughly 20,000 Americans who suffer annually from reactive arthritis. This exceedingly rare form of arthritis, which shares symptoms with rheumatoid arthritis, was attracted to the genetic antigen I carried, passed down maternally, and had proceeded to attack all of my white blood cells. When I finally was able to see a rheumatologist, I was told that my CBC — complete blood count — was one of the lowest he’d ever seen.
As I read Molly Osberg’s harrowing essay for Splinter detailing her health catastrophe, the result of contracting a rare form of strep that began to wreak havoc on her internal organs, I found myself mentally transported to my own uncertainties this past summer. Her concerns about medical expenses, proxies, if her full-time job still exists, and whether or not any of the poking or prodding will work — all of those emotions became all the more real again. We even both made it into medical journals!
I encountered them all when I was admitted to a Brooklyn hospital, though I only remember a fraction of the people who tried to puzzle out what was sending my body into septic shock. According to my medical records I saw six specialists in 40 hours. There was the anesthesiologist who assuaged my terror when I was put under for a bronchoscopy (I was afraid I’d wake up with a camera down my larynx), the infectious disease doctors who asked me about my sex life, how often I got high, my last period, whether I’d been anywhere near livestock.
I also shared her feeling of invincibility. Before I contracted reactive arthritis, I rarely needed anything more than a hot shower and a few doses of Dayquil to right whatever was ailing me. I didn’t get sick. As Osberg writes,
As someone with a pack-a-day habit, I got a little sick every year, and my response was to sleep (or work, or drink) through it until the issue somehow resolved. Before 2017 I don’t think I’d been to a doctor in about five years—though as was later reiterated to me by one chagrined specialist after another, my abysmal life choices up to that point didn’t end up making much of a difference.
Like Osberg, I was fortunate to have insurance, but if I had been a freelancer, my bills would have been astronomical. One shot of Humira, the immunosuppressive drug I inject twice a month, costs upwards of $500 without coverage, and that doesn’t include the barrage of pills I take daily. As I was reading Osberg’s piece, I stumbled across the story of Donald Savastano, a 51-year-old from central New York who recently won the lottery—a $1 million prize. A construction contractor, he first decided to visit a doctor, something he hadn’t done in several years because he lacked insurance, before lining up a vacation. Following a diagnosis of stage 4 lung cancer, and just twenty-three days after winning the prize, Savastano died. Osberg conveys the sheer terror that accompanies such a health scare; the worry isn’t so much about how to survive — it’s about how to recover.
Never mind recovering physically or financially in any of these scenarios: I can’t imagine surviving emotionally, fielding calls from collections agents, facing eviction, waiting for the pain meds to hit so I can keep at a futile job search with an IV still dangling from my side. I am 29 years old, with no pre-existing conditions before this moment, and I am unemployed and exhausted and in pain all the time.
It took one month before I could walk a distance greater than a hallway. It took another month before I could walk back and forth to the elevator bank in our apartment building. And it took three months before I regained my walking gait and balance, and didn’t feel any debilitating pain for the first 90 minutes of my morning. It’s now been eight months since the diagnosis, and my health has somewhat returned to normal — although my right Achilles still aches every day, and I’ve been told to limit my exercise to stretching and spinning (so as to not potentially tear the tendon). But I can walk, and the normalcy of that simple act is something I will never again take for granted. It’s those actions that become the most significant.
Around mid-August they took the PICC line out. My white blood cell count went back to normal; I was cleared to drink by far the best beer of my life; the scabs left over from the drainage tube punctures in my torso fell off. I was pleased the surgical incision had missed my favorite tattoo, and less pleased when streaks of my hair turned gray and started coming out in clumps, or when my nails fell off—a months-delayed reminder of that time my body was preparing to die. I went back to work towards the end of the summer.
I wish I could say that my love of the Spice Girls as an 11-year-old was based on some innate wokeness, but really, when I first heard the group’s debut album Spice 22 years ago, all I cared about were the irresistible melodies, catchy hooks, and Mel B’s rapping (“Here’s the story from A to Z”) that carried “Wannabe” into its final chorus.
From the jump, I was a Spice Girls stan. The group, an all-female pop band cobbled together following a blind audition, made one of the first tapes I played non-stop, as I continuously transferred the record back and forth between my boombox and my Walkman. It got to the point where my younger brother also became an uber-fan, eventually receiving a copy of Spice World, the group’s 1997 biopic-slash-ode to the Beatles’ A Hard Day’s Night.
For Broadly, Sirin Kale perfectly illustrates the film’s appeal to a generation of adolescents who were struck by the Spice Girls’ inherent coolness and fun vibes:
In 1997, the Spice Girls were cresting the Girl Power wave. I, an eight-year-old weirdo in platform trainers with an imaginary boyfriend, revered the five-piece with a doglike devotion (except Geri—more on that later). The Spice Girls were my childhood soundtrack and the object of all my worldly ambitions. To quote Mel C’s well-received 1999 solo offering, they were my northern star.
Kale delves into the backstory of the film’s production, including an ever-changing script, persistent paparazzi (e.g. posing as a cow to snap a pic of the super-group), and an utterly absurd plot that involved bombs planted in the Spice Girls’ bus and an alien invasion.
Despite this, Tickner does have some fond memories of the shoot. Without him, the iconic alien invasion scene might have crash-landed. “For some reason, no one was addressing the problem of what the spaceship was going to be on set,” he explains. “Here was a very obvious prop in the script. An alien was going to come down in the spaceship. But the art department hadn’t been asked to make one.”
There’s a lot to like about The Post, a film that has drawn rave reviews even before its pre-holidays debut. The combination of Meryl Streep as Washington Post publisher Katharine Graham, and Tom Hanks as the paper’s editor-in-chief Ben Bradlee is the rare pairing of GOAT actors operating at their all-time peak.
The film covers the publication of the Pentagon Papers in the New York Times, the Washington Post’s attempt to obtain its own copy, and the ensuing battle against the Nixon administration, which led to the Supreme Court case over the Daniel Ellsberg-leaked documents. As Manohla Dargis of the New York Times described in her review of the film, “The pleasure of The Post is how it sweeps you up in how it all went down…Like many movies that turn the past into entertainment, The Post gently traces the arc of history, while also bending it for dramatic punch and narrative expediency.”
The Post is the ultimate click-bait film for our current moment: An all-star cast telling the story of righteous journalism while press freedoms are being threatened on a daily basis. There is a time-honored tradition of films that have functioned in a similar way, including Network, All the President’s Men, and most recently, Spotlight. Last month The Post published a compendium of the greatest journalism movies ever made, selected by the likes of Katy Tur, Jill Abramson, and Marty Baron (who, of course, chose Spotlight, where he’s played by Liev Schreiber). And on the heels of The Post’s rundown was a feature by Haley Mlotek on the 30th anniversary of Broadcast News, the 1987 drama that “predicted journalism as we know it.”
What’s most interesting isn’t the selection of films that have largely defined our notions of how journalism functions, including what reporters look like: bodies clad in beige clothing, drinking copious amounts of coffee. What I find fascinating is that most of these films deal with large-scale or long-form investigative reporting, the type of work that takes months and involves countless interview montages. What about a film that covers a day in the life of an average newspaper?
I’m talking about The Paper, in my opinion the best journalism film ever made, and one that almost never gets any credit. Starring Michael Keaton as the metro editor of the fictional Sun (a loose portrayal of the New York Post), the movie details the killing of two out-of-state businessmen in a pre-gentrified Williamsburg and the arrest of two black teenagers for the crime. The problem is that the charges are bogus: a mob hit made to look like murders with racial undertones at a time when New York, on the screen and in real life, had reached a tipping point. The Sun and its staff, including Glenn Close as the managing editor, Robert Duvall as the EIC, and Randy Quaid as a quasi-Mike McAlary-Pete Hamill-type columnist, have a day to both confirm and break the exclusive. When Close suggests during a staff meeting that the story can wait a day (“We taint them today, we make them look good on Saturday, everybody’s happy”), Keaton exclaims, “Not tomorrow, right fucking now, today!”
Co-written by Stephen Koepp, former executive editor of Time magazine, The Paper beautifully illustrates the lunacy and creativity of working under a deadline. The feeling one gets upon landing the perfect quote — “Don’t take the bat out of my hands, it’s the ninth inning, I got to get the quote, the guy’s not going to be there all night,” says Keaton — or confirming a previously deep background detail on the record. It’s a rush native only to journalists, the endorphins multiplying as you have only minutes to finish the article. Every reporter has experienced at least one editor snapping at them as Duvall does to Keaton, “You want to run the story? You have five hours until 8 o’clock — go get the story. Do your job!” And then it’s over, and you have to do it again the next day. That’s the inherent genius of The Paper. No other film conveys the madness of deadline journalism — or the fun.
Midway through the film, Quaid, who shines as the paper’s embattled columnist who believes people are plotting against him, fires a gun through a stack of newspapers to end an argument, which allows Keaton to finish a conversation with his wife (played by the brilliant Marisa Tomei).
At which point, Tomei, whose character works at the Sun and is at the beginning of her maternity leave, gushes, “God, I miss this place!”
The journalism practiced in All the President’s Men, The Post, and Spotlight is never going to cease — it’s the journalism that will always endure. It exposes deep-rooted injustices so outrageous that the abuses themselves are practically begging for someone to shine a light on them. Liev Schreiber, as The Boston Globe‘s editor-in-chief, makes this point in Spotlight: “Sometimes it is easy to forget that we spend most of our time stumbling around in the dark. Suddenly, a light gets turned on.” But what is being threatened is the journalism of The Paper: the daily local grind.
Following the dissolution of the uber-local sites DNAinfo and Gothamist, Danielle Tcholakian wrote about what happens when news outlets stop covering what immediately impacts the communities they serve:
That was a big part of what we were there to do: show people exactly how every action, big or small, impacted their daily lives in the neighborhoods they lived in and loved.
And that is what makes The Paper so special, and why Tomei’s quote is such a genius line. She underscores the heart of the film: forget the money, the fame, and the accolades, all that matters is getting the story right — for a moment, because as the 1010 Wins tagline blares throughout the film at various points, “Your whole world can change in 24 hours.”
The profile attempted to let ordinary details speak for themselves, and it opens with a description of a wedding registry: “On their list was a muffin pan, a four-drawer dresser and a pineapple slicer…Weddings are hard enough to plan for when your fiancé is not an avowed white nationalist.” But these ordinary details don’t contain meaning; they merely surround it. As Josephine Livingstone of The New Republic explains,
writers who simply represent (rather than report on) extremists leave rhetorical spaces open for Nazi ideology to flood in. You cannot let a Nazi hang himself, because he is the one left holding the rhetorical rope.
Fausset’s article wasn’t the Times‘ first attempt to transform racism into a personality quirk. From 1933, when Adolf Hitler was appointed chancellor of Germany, to his 1939 invasion of Poland, there was a significant movement both in the United States and worldwide to portray Hitler as a misunderstood genius whose everyday likability could better connect with the working class German people and lift the country from its post-war depression.
Magazines and newspapers like the Times of London, The New York Times, The Saturday Review (“Hitler at Home”), and even the American Kennel Gazette (“Hitler Says His Dogs are Real Friends“) were more interested in Hitler’s interior design sensibility, his gustatory preferences, and his love of German Shepherds than in his politics. In 1936, Vogue toured Hitler’s chalet as part of a package showcasing the interior design of the homes of foreign rulers. (Benito Mussolini’s villa was also included.) Such coverage successfully peddled themes of austerity, industriousness, and single-minded drive to masses eager to believe in Germany’s rebirth.
In her 2015 book Hitler at Home, Despina Stratigakos, a professor of architecture and history at the University at Buffalo, catalogued numerous attempts to normalize the dictator, starting with the publication of The Hitler that Nobody Knows, a 1932 photo album that doubled as a behind-the-scenes peek into Hitler’s private life. With more than a hundred photographs taken by Hitler’s personal photographer, the book, which sold more than 400,000 copies by 1942, was meant to serve as a beacon proclaiming Hitler the leader of the new Germany. But Stratigakos stresses that the effect was more insidious.
Until the turnabout in 1932, National Socialist publicists had diverted attention away from or suppressed stories about their leader’s private life. Yet even as they continued to fight reports that could harm Hitler’s reputation, the Nazis began to construct for public consumption their own version of the private individual. The image of “Hitler as private man” would now be reconfigured from a liability into an asset…Bildung and self-improvement, together with self-discipline, a strong work ethic, and modesty, formed the core moral values of the German middle classes. The components of the “good” Hitler were thus artfully assembled with an eye to courting this constituency of voters and persuading them to abandon their allegiance to [war hero and political opponent Paul von] Hindenburg.
Even The New York Times wasn’t exempt from indulging Hitler’s spin. Laurel Leff, a professor of journalism at Northeastern University, published Buried by the Times in 2005, examining the ways the Times either ignored or inadequately covered the Holocaust, partly due to a distaste among its editors for Zionism. In October 1935, the Times magazine ran a fawning profile of Hitler as an architect, featuring his remodel of a small Bavarian cottage and its transformation into the fortress of Berghof, which was shown completed on the cover of a May 1937 issue.
But perhaps the strangest Times article was “Herr Hitler at Home in the Clouds,” written by Hedwig Mauer Simpson, a frequent contributor to The Associated Press and The Daily Mail and the wife of Stanley Simpson, a British journalist and Munich-based correspondent for The New York Times and the Times of London. (He would later be the first to report on the Dachau concentration camp, in a piece ultimately turned down by the Times of London.) A journalistic power couple in Munich, the Simpsons were among the first reporters to have early access to Hitler, and she was known for her ability to file several stories at once under intense pressure.
In the article, Simpson rehashes worn tropes about Hitler’s vegetarianism, the long walks he enjoyed with his Alsatian dogs, and his love of the German people. The tick-tock of his daily routine is described down to the minute: breakfast is at 9 a.m., lunch is served by “white uniformed butlers,” and dinner is promptly at 8 p.m., with the ladies of the Berghof in evening dress and Hitler in English tweeds. In a rare step back from the festivities, Simpson writes that the setting contains “all the elements of exacting bureaucracy and secret-police efficiency.”
The Times article was published on August 20, 1939, 11 days before Hitler’s invasion of Poland. There is nothing in it to suggest that the chancellor, who “makes no secret of being fond of chocolate,” has anything on his mind except the promise of an afternoon nap; Simpson clearly feels pampered and privileged to be in his presence. She would take one of the last peacetime trains out of Munich to London, and appears to have given up writing after her departure from Germany. Whatever she felt on that last train out isn’t recorded here.
Longreads’ Catherine Cusick recently discussed why articles like Fausset’s and Simpson’s are dangerous: “Reporters and editors committed to covering this movement may not be able to feel their own hearts beating faster out of fear.”
Ordinary details can furnish a room, they can set a table, they can fill the time between hushed meetings of planned genocide or the quiet tapping at a computer, spreading hateful slurs to thousands of followers. If a writer can’t feel that fear, can’t show those feelings on a page, then all the reader is left with is Hitler at home.