We Could Have Had Electric Cars from the Very Beginning

An advertisement depicts a Baker Electric automobile, the Baker Queen Victoria, driven by a young woman, 1909. (Stock Montage/Getty Images)

Dan Albert | An excerpt adapted from Are We There Yet?: The American Automobile Past, Present, and Driverless | W. W. Norton & Co. | June 2019 | 25 minutes (6,750 words)

Most people reasonably expect the story of the evolution of the automobile to begin with the invention of the automobile itself. I’ve disappointed enough people in my life already, so I give you the Jesuit Rat Car of 1672. In that year, missionary Ferdinand Verbiest created a steam wagon to bring the Emperor of China to Jesus, but the car was only big enough to carry a rat.

If you don’t like the Jesuit Rat Car as an automotive first, you might consider Nicolas-Joseph Cugnot’s cannon hauler of 1769. A product of the French army’s skunk works, it was canceled in beta testing. In 1790, Nathan Read got the first American patent for a steam-powered wagon, a remarkable feat because the US Patent Office itself had yet to be invented. Perhaps that counts. In London, Richard Trevithick set a Georgian coach body atop a steam boiler and eight-foot wheels, creating the first giraffe-less carriage. In 1805, American Oliver Evans drove his harbor dredge, the Orukter Amphibolos, down the streets of Philadelphia in hopes of enticing investors for a car business. Philadelphia’s cobblestone paving gave horses purchase but shook the Orukter so violently that the wheels broke. Let’s call his the first amphibious car. Read more…

Father’s Little Helper

Illustration by Eric Peterson

Scott Korb | Longreads | June 2019 | 14 minutes (3,467 words)

I.

Some of what you’re reading I was writing a few hours after taking half a Valium, prescribed by my doctor, partly for anxiety and partly for general neck and shoulder pain, and also a tingle and numbness that I was then feeling down my left arm into my fingers. It began with a yoga pose. It’s hard to know now what exactly I wrote while under the drug’s influence, such as it was. When I took the Valium I was 39; now I’m 41.

These 40-odd years, if Schopenhauer is right, have given me the text of my life. “The next 30,” he says, will “supply the commentary,” of which this, I hope, is an early part.

The pharmacist, who was younger than me, with slick hair, and whom I’d gotten to know a little over the years since my wife was treated for breast cancer, used the word spasm when referring to the orders faxed over from my doctor’s office. I nodded, yes, muscle spasms, even though that didn’t seem right; maybe I don’t know what spasm means. I said nothing about the low-grade anxiety I’ve felt for much of my life, which has gotten worse since my wife’s treatments finished up. “Low and slow,” he recommended. So I took half a pill. I’d never taken one before, and I’m cautious.

While discussing the pain in my neck and shoulder, the facial tics I’ve had my whole life, I also told the doctor I’m reluctant to take drugs, even ibuprofen, though my wife has told me Valium can be fun. She recalls a day just before Father’s Day, 2014, wandering through New York City’s West Village, buying me expensive t-shirts in the late-spring heat, a week after major surgery, without a worry in the world.

I decided to take the Valium in advance of an MRI my doctor had prescribed to capture images of my cervical spine, hunting for disease. The pill would help get me through the test.
Read more…

Vacation Memories Marred by the Indelible Stain of Racism

Illustration by Olivia Waller

Shanna B. Tiayon | Longreads | June 2019 | 9 minutes (2,384 words)

As I looked out the bus window I was awestruck by the magnificence and vastness of the canyon that stretched farther than my eyes could see. I stared at the brown hues with hints of red, orange and blue, and the rock textures that were still visible even from a distance. The Grand Canyon was breathtaking and I was taking it all in for the last time as the bus drove by.

A loud voice disturbed the peace of my window gazing.

“There’s no eating on the bus,” it said. “The kid — she dropped the paper and there’s no eating on the bus.”

My eyes never left the window. When the disruption passed, I turned my thoughts to our trip. It was March 2018. Our family hails from the DC Metro area, and this was the first trip we had taken together since I completed my PhD program in May 2017.

My husband and I, with our four kids ranging in age from 2 to 20, had just finished the arduous but magnificent hike of the canyon’s Cedar Ridge Trail. The hike was challenging, but we made it. As I sat on the bus returning to the Visitors’ Center, I could already feel my muscles starting to tense up from navigating the trail at 6,120 feet of elevation.

“The paper,” the voice interrupted again, “somebody needs to pick that paper up.”

This time I turned my head towards the front of the bus, realizing that the person spoiling my daydreaming was the National Park Service bus driver. She glared at us through the rearview mirror, gesturing towards a Kind bar wrapper my 2-year-old had accidentally dropped on the floor. We weren’t the only ones eating on the bus, but we were the only ones being admonished for it. Also, we were the only Black family on the bus. In fact, our family made up six of the only eight Black people on the trail at all that day, among dozens of White visitors.

I bent down to pick up the paper just as we arrived at the second stop. The bus driver pressed the brake. Still partially out of my seat, I lunged forward with the momentum of the bus. When it came to a complete stop, my back slammed against the seat. I looked up and the bus driver was now out of her seat, coming towards us with her hands flailing. She was a thin-framed, older White woman, I’d guess in her early 60s, with long, straight, bleach-blond hair hanging down her back. Wide-framed, tinted glasses sat on her face. She had on dark jeans and a red puff vest, and she reeked of cigarette smoke.

She stopped within a foot of my family. “I need you all to get up and move to the back,” she said. “I need those seats so the passengers can board.”
Read more…

William S. Burroughs and the Cult of Rock ‘n’ Roll

Paul Natkin/WireImage

Casey Rae | William S. Burroughs and the Cult of Rock ‘n’ Roll | University of Texas Press | June 2019 | 28 minutes (4,637 words)


Naked Lunch is inseparable from its author, William S. Burroughs, which tends to happen with certain major works. The book may be the only Burroughs title many literature buffs can name. In terms of name recognition, Naked Lunch is a bit like Miles Davis’ Kind of Blue, which also arrived in 1959. Radical for its time, Kind of Blue now sounds quaint, though it is undeniably a masterwork.

Burroughs wrote the bulk of his famous novel Naked Lunch in Tangier, Morocco, between 1954 and 1957. During those years, he was strung out and unhappy, living off his parents’ allowance and getting deeper and deeper into addiction. He had friends but rarely saw them, preferring to spend days at a time staring at his shoes while ensorcelled in a narcotic haze.

Read more…

Caught Between Borders

Illustration by Eric Chow

Malia Politzer and Annie Hylton | Longreads | June 2019 | 25 minutes (6,991 words)

The first time his father tried to kill him, Ismail* was 15 years old. By the time he turned 19, he had escaped four attempts on his life: Once, he was outside an asylum center in South Africa, where he’d hoped to find safety; the other times he was in Somalia, the country from which he had fled. His father was intent on killing him to protect the family’s “honor.” No matter where he went, it seemed, his father had enlisted Somali immigrants to carry out his execution. Ismail’s crime? He is gay.

Slender and tall, Ismail dresses sharply, favoring bright colors and tight cuts. He wears a signature mixture of ladies’ perfumes, and carries a silver-chain necklace and anklet in his backpack that he longs to wear but is too afraid to put on. From a young age, Ismail displayed traits that he said were “woman things” — his walk, the way he spoke, how he moved his hands — mannerisms that were not “normal” and provoked his father’s ire. His father forbade him from attending school and kept him under house arrest.

Read more…

How the Toronto Raptors and the Vancouver Grizzlies Revived the NBA

Carlo Allegri / AFP / Getty

“There’s no character to the Toronto Raptors’ uniform anymore,” Tom O’Grady says. “It’s clean, yes, but not eye-catching. The logo doesn’t jump off the shelf.” He adds, “The uniform today might as well belong to an intramural basketball team.” Read more…

How the Cosby Story Finally Went Viral — And Why It Took So Long

Associated Press, Collage by Homestead

Nicole Weisensee Egan | An excerpt adapted from Chasing Cosby: The Downfall of America’s Dad | Seal Press | 14 minutes (3,614 words)

In October 2014 Bill Cosby was in the middle of a career resurgence. His biography by former Newsweek editor Mark Whitaker had just come out to rave reviews and was climbing the bestseller list. He had a comedy special coming up on Netflix and was in development with NBC to star in a family sitcom. He was about to embark on another comedy tour based on a special that had aired on Comedy Central the year before. The special, Far from Finished, was Cosby’s first stand-up TV special in three decades, and it attracted two million viewers.

It was as if the scandal in 2005 had never happened, as if fourteen women hadn’t accused him of heinous offenses. The book didn’t even mention Andrea Constand’s allegations, let alone her civil suit or any of the other accusers. And no one in the media was asking Whitaker or Cosby why.

The situation was clear: Cosby had successfully repaired what little damage there was to his reputation after Andrea’s case made the news. He slipped right back into his revered status as public moralist and children’s advocate, chalking up even more awards and honors, including his entrée into the NAACP’s Image Awards Hall of Fame in 2006 for being a “true humanitarian and role model.” Read more…

The Artificial Intelligence of the Public Intellectual

morkeman / Getty

Soraya Roberts | Longreads | May 2019 | 8 minutes (2,228 words)

“Well, that’s a really important thing to investigate.” While Naomi Wolf’s intellectual side failed her last week, her public side did not. That first line was her measured response when a BBC interviewer pointed out — on live radio — that cursory research had disproven a major thesis in her new book, Outrages: Sex, Censorship, and the Criminalization of Love (she misinterpreted a Victorian legal term, “death recorded,” to mean execution — the term actually meant the person was pardoned). Hearing this go down, journalists like me theorized how we would react in similar circumstances (defenestration) and decried the lack of fact-checkers in publishing (fact: Authors often have to pay for their own). The mistake did, however, ironically, offer one corrective: It turned Wolf from cerebral superhero into mere mortal. No longer was she an otherworldly intellect who could suddenly complete her Ph.D. — abandoned at Oxford when she was a Rhodes Scholar in the mid-’80s, Outrages is a reworking of her second, successful, attempt — while juggling columns for outlets like The Guardian, a speaking circuit, an institute for ethical leadership, and her own site, DailyClout, not to mention a new marriage. Something had to give, and it was the Victorians.

Once, the public intellectual had the deserved reputation of a scholarly individual who steered the public discourse: I always think of Oscar Wilde, the perfect dinner wit who could riff on any subject on command and always had the presence of mind to come up with an immortal line like, “One can survive everything nowadays except death.” The public intellectual now has no time for dinner. Wolf, for instance, parlayed the success of her 1991 book The Beauty Myth into an intellectual career that has spanned three decades, multiple books, and a couple of political advisory jobs, in which time her supposed expertise has spread far beyond third-wave feminism. She has become a symbol of intellectual rigor that spans everything from vaginas to dictatorships — a sort of lifestyle brand for the brain. Other thought leaders like her include Jordan Peterson, Fareed Zakaria, and Jill Abramson. Their minds have hijacked the public trust, each one acting as the pinnacle of intellect, an individual example of brilliance to cut through all the dullness, before sacrificing the very rigor that put them there in order to maintain the illusion floated by the media, by them, even by us. The public intellectual once meant public action, a voice from the outside shifting the inside, but then it became personal, populated by self-serving insiders. The public intellectual thus became an extension — rather than an indictment — of the American Dream, the idea that one person, on their own, can achieve anything, including being the smartest person in the room as well as the richest.

* * *

I accuse the Age of Enlightenment of being indirectly responsible for 12 Rules for Life. The increasingly literate population of the 18th century was primed to live up to the era’s ultimate aspiration: an increasingly informed public. This was a time of debates, public lectures, and publications, and of fame for the academics behind them. Ralph Waldo Emerson, for one. In his celebrated “The American Scholar” speech from 1837, Emerson provided a framework for an American cultural identity — distinct from Europe’s — which was composed of a multifaceted intellect (the One Man theory). “The scholar is that man who must take up into himself all the ability of the time, all the contributions of the past, all the hopes of the future,” he said. “In yourself slumbers the whole of Reason; it is for you to know all, it is for you to dare all.” While Emerson argued that the intellectual was bound to action, the “public intellectual” really arrived at the end of the 19th century, when French novelist Émile Zola publicly accused the French military of antisemitism over the Dreyfus Affair in an open letter published in the newspaper L’Aurore in 1898. With “J’Accuse…!,” the social commentary Zola spread through his naturalist novels was transformed into a direct appeal to the public: Observational wisdom became intellectual action. “I have but one passion: to enlighten those who have been kept in the dark, in the name of humanity which has suffered so much and is entitled to happiness,” he wrote. “My fiery protest is simply the cry of my very soul.”

The public intellectual thenceforth became the individual who used scholarship for social justice. But only briefly. After the Second World War, universities opened up to serve those who had served America, which led to a boost in educated citizens and a captive audience for philosophers and other scholars. By the end of the ’60s, television commanded our attention further with learned debates on The Dick Cavett Show — where autodidact James Baldwin famously dressed down Yale philosopher Paul Weiss — and Firing Line with William F. Buckley Jr. (also famously destroyed by Baldwin), which would go on to host academics like Camille Paglia in the ’90s. But Culture Trip editor Michael Barron dates the “splintering of televised American intellectualism” to a 1968 debate between Gore Vidal — “I want to make 200 million people change their minds,” the “writer-hero” once said — and Buckley, which devolved into playground insults. A decade later, the public intellectual reached its celebrity peak, with Susan Sontag introducing the branded brain in People magazine (“I’m a book junkie. … I buy special editions like other women shop for designer originals at Saks.”).

As television lost patience with Vidal’s verbose bravado, he was replaced with more telegenic — angrier, stupider, more right-wing — white men like Bill O’Reilly, who did not clarify nuance but blustered over the issues of the day; the public intellectual was now all public, no intellect. Which is to say, the celebrity pushed out the scholar, but the scholar was on its way out anyway. By the ’80s, the communal philosophical and political conversations of the post-war era slunk back to the confines of academia, which became increasingly professionalized, specialized, and insular, producing experts with less general and public-facing knowledge. “Anyone who engages in public debate as a scholar is at risk of being labelled not a serious scholar, someone who is diverting their attention and resources away from research and publicly seeking personal aggrandizement,” one professor told University Affairs in 2014. “It discourages people from participating at a time when public issues are more complicated and ethically fraught, more requiring of diverse voices than ever before.” Diversity rarely got past the ivy, with the towering brilliance of trespassers like Baldwin and Zora Neale Hurston, among other marginalized writers, limited by their circumstances. “The white audience does not seek out black public intellectuals to challenge their worldview,” wrote Mychal Denzel Smith in Harper’s last year, “instead they are meant to serve as tour guides through a foreign experience that the white audience wishes to keep at a comfortable distance.”

Speaking of white audiences … here’s where I mention the intellectual dark web even though I would rather not. It’s the place — online, outside the academy, in pseudo-intellectual “free thought” mag Quillette — where reactionary “intellectuals” flash their advanced degrees while claiming their views are too edgy for the schools that graduated them. These are your Petersons, your Sam Harrises, your Ben Shapiros, the white (non)thinkers, usually men, tied in some vague way to academia, which they use to validate their anti-intellectualism while passing their feelings off as philosophy and, worse, as (mis)guides for the misguided. Last month, a hyped debate between psychology professor Peterson and philosopher Slavoj Žižek had the former spending his opening remarks stumbling around Marxism, having only just read The Communist Manifesto for the first time since high school. As Andray Domise wrote in Maclean’s, “The good professor hadn’t done his homework.” But neither have his fans.

But it’s not just the conservative public intellectuals who are slacking off. Earlier this year, Jill Abramson, the former executive editor of The New York Times, published Merchants of Truth: The Business of News and the Fight for Facts. She was the foremost mind on journalism in the Trump era for roughly two seconds before being accused of plagiarizing parts of her book. Her response revealed that the authorship wasn’t exactly hers alone, a fact that came to light only as she blamed others for her mistakes. “I did have fact-checking, I did have assistants in research, and in some cases, the drafting of parts of the book,” she told NPR. “I certainly did spend money. But maybe it wasn’t enough.” Abramson’s explanation implied a tradition in which, if you are smart enough to be rich enough, you can pay to uphold your intellectual reputation, no matter how artificial it may be.

That certainly wasn’t the first time a public intellectual overrepresented their abilities. CNN host Fareed Zakaria, a specialist in foreign policy with a Ph.D. from Harvard — a marker of intelligence that can almost stand in for actual acumen these days — has been accused multiple times of plagiarism, despite “stripping down” his extensive workload (books, speeches, columns, tweets). Yet he continues to host his own show and to write a column for The Washington Post in the midst of a growing number of unemployed journalists and a dwindling number of outlets. Which is part of the problem. “What happens in the media is the cult of personality,” said Charles R. Eisendrath, director of the Livingston Awards and Knight-Wallace Fellowship, in the Times. “As long as it’s cheaper to brand individual personalities than to build staff and bolster their brand, they will do it.” Which is why Wolf, and even Abramson, are unlikely to be gone for good.

To be honest, we want them around. Media output hasn’t contracted along with the industry, so it’s easier to follow an individual than a sprawling media site, just like it’s easier to consult a YouTube beauty influencer than it is to browse an entire Sephora. With public intellectuals concealing the amount of work required of them, the pressure to live up to the myth we are all helping to maintain only increases, since the rest of us have given up on trying to keep pace with these superstars. They think better than we ever could, so why should we bother? Except that, like the human beings they are, they’re cutting corners and making errors and no longer have room to think the way they did when they first got noticed. It takes significant strength of character in this economy of nonstop (and precarious) work to bow out, but Ta-Nehisi Coates did when he stepped down last year from his columnist gig at The Atlantic, where he had worked long before he started writing books and comics. “I became the public face of the magazine in many ways and I don’t really want to be that,” he told The Washington Post. “I want to be a writer. I’m not a symbol of what The Atlantic wants to do or whatever.”

* * *

Of course a public intellectual saw this coming. In a 1968 discussion between Norman Mailer and Marshall McLuhan on identity in the technology age (which explains the rise in STEM-based public intellectuals), the latter said, “When you give people too much information, they resort to pattern recognition.” The individuals who have since become symbols of thought — from the right (Christina Hoff Sommers) to the left (Roxane Gay) — are overrepresented in the media, contravening the original definition of their role as outsiders who spur public action against the insiders. In a capitalist system that promotes branded individualism at the expense of collective action, the public intellectual becomes a myth of impossible aspiration that not even it can live up to, which is the point — to keep selling a dream that is easier to buy than to engage in reality. But an increasingly intelligent public is gaining ground.

The “Public Intellectual” entry in Urban Dictionary defines it as “A professor who spends too much time on Twitter,” citing Peterson as an example. Ha? The entry is by OrinKerr, who may or may not be (I am leaning toward the former) a legal scholar who writes for the conservative Volokh Conspiracy blog. His bad joke is facetious, but not entirely inaccurate — there’s a shift afoot, from the traditional individual public intellectual toward a collective model. That includes online activists and writers like Mikki Kendall, who regularly leads discussions about feminism and race on Twitter; Bill McKibben, who cofounded 350.org, an online community of climate change activists; and YouTubers like Natalie Wynn, whose ContraPoints video essays respond to real questions from alt-right men. In both models, complex thought does not reside solely with the individual, but engages the community. This is a reversion to one of the early definitions of public intellectualism by philosopher Antonio Gramsci. “The traditional and vulgarized type of the intellectual is given by the man of letters, the philosopher, the artist,” he wrote in his Prison Notebooks — first published in 1971. “The mode of being of the new intellectual can no longer consist in eloquence, which is an exterior and momentary mover of feelings and passions, but in active participation in practical life, as constructor, organizer, ‘permanent persuader’ and not just a simple orator.” It doesn’t matter if you’re the smartest person in the room, as long as you can make it move.

* * *

Soraya Roberts is a culture columnist at Longreads.

I’ve Done a Lot of Forgetting

Getty / Illustration by Homestead

Jordan Michael Smith | Longreads | May 2019 | 10 minutes (2,744 words)

If someone spits bigotry at you while you’re a kid, you’re unlikely to forget it. You’ll remember it not because it’s traumatic, though it can be. You’ll remember it not even because it’s degrading and excruciating, though it is certainly those things, too. No, you’ll remember it because it instills in you an understanding that people are capable of motiveless evil. That humans can be moved to hate because they are hateful. You aren’t given a reason for why people hate you, because they don’t need a reason. You’re you, through no fault of your own, even if you want desperately to be anyone else. And that’s enough.

I am a Canadian. I was born in Markham, which is a small city about 30 kilometers northeast of Toronto. That distance meant a great deal. Markham was a large town of middle- and working-class families when my newlywed parents moved there, in the late 1970s, with a population that hovered around 60,000. It was pretty mixed demographically, I recall, though containing a white majority. My older sister and I were the only Jews in our elementary school, except for one other family who arrived after we did and seemed not to attract much ire; I imagined it was because they were beautiful and popular (we were neither).

We were one of the minority of Canadian Jewish families living outside Toronto or Montreal. More than 71% of all Canadian Jews reside in these two cities, according to Allan Levine’s serviceable but unexceptional new book on the history of Jewish Canada, Seeking the Fabled City. Levine describes a familiar story of an immigrant group gradually gaining acceptance (and some power) in a once-largely white Christian country. For the first half of the 20th century, Jews in Canada were arguably detested to a greater degree than in America. By the 21st century, Canadian Jews felt as safe as Jews anywhere felt safe. Levine quotes a Toronto rabbi as saying, “Living in Toronto, my children don’t know that Jews are a minority.” Read more…

Born to Be Eaten

Illustration by Glenn Harvey

Eva Holland | Longreads | May 30, 2019 | 26 minutes (7,122 words)

Calving

The caribou cow gives birth on her feet. She stands with legs wide apart, or turns on the spot, shuffling in slow circles, craning her long neck to watch as her calf emerges inch by inch from below her tail, between her hips. It’s oddly calm, this process — a strange thing to witness for us two-legged mammals, more accustomed to the stirrups and the struggle and the white-knuckled screaming of a Hollywood birth scene.

The calf, when he comes, emerges hooves first. He climbs into the world fully extended, like a diver stretching toward the water. Out come the front pair of hooves, capping spindly legs, then the long narrow head, the lean, wet-furred body, and finally, another set of bony legs and sharp little hooves. His divergence from his mother leaves behind nothing but some strings of sticky fluid and a small patch of bloody fur. He doesn’t know it, but the land he is born on is one of the most contentious stretches of wilderness in North America.

Still slick with mucus, the calf takes his first steps within minutes, stumbling awkwardly to his feet as his mother licks him clean. Within 24 hours, he is able to walk a mile or more. Soon, if he survives long enough, he will be capable of swimming white-water rivers, outrunning wolves, and trotting overland for miles upon miles every day. His life will offer myriad dangers and only the rarest respite; for the caribou, staying alive means staying on the move.

Read more…