My uncle Howard killed himself in college. He was a grad student in Ann Arbor, engaged to be married, and, according to my family, well liked. He suffered from depression, worsened by tensions with his father. My grandmother knew this, yet she struggled to understand her son’s suicide for the rest of her long life. When Howard took his own life in 1968, he did it in private, inside a school chemistry lab, but he clearly wanted to be found: he was sending a message. When 18-year-old Océane ended her life in May 2016, she streamed the act in real time, jumping in front of a suburban Paris subway train while strangers watched and commented.
At The Guardian, Rana Dasgupta tells Océane’s story and tries to understand why an ailing young woman would both criticize social media and use it to communicate her message. Océane was wounded by trauma and haunted by the sense that no one cared, a feeling that social media only amplified. Examining this central contradiction, Dasgupta teases out the allure of escape in the depressed Parisian suburbs, the way disconnected youth seek connection, and the way celebrity, even internet celebrity, drains people of life.
After her sought-after, five-letter Instagram handle was stolen by an Iranian hacker, professor Negar Mottahedeh found a door opened to her former homeland, striking up an unlikely friendship with the thief to learn more about a man struggling to earn a living in an economy compromised by 35 years of U.S.-led sanctions. Read the story at Backchannel.
How good of a hacker was he, really? Who were his friends? What sorts of things did he enjoy? What were he and Negar like when they were together? Not knowing him unsettled me. So I was determined to find out.
Mohamad was curious about me, too. It was odd, considering that I was much older than him. It felt like he wanted a trusted friend—someone he could use as a sounding board. He chose me.
He needed money, but more than that, he wanted to find a way out of Iran. He asked me about student visas, tourist visas, work visas; he’d send me links or screenshots with sections circled in red ink, asking me to read through them for him. He discussed his marriage options. Could he find an American girl to marry so he could stay in the US after he got there? Or could I maybe adopt him?
But by robbing me of my online identity, my hacker had unshuttered a window to life in the country of my birth. While I had been barred from my home as a young child, my new setting was chock-full of luck. With my Instagram hacker in my life, my fortuitous situation stared me in the face. Looking at myself through his eyes, my life was abundant. I felt fortunate. I wasn’t about to give up the friendship I had forged with my hacker for anything.
At UCLA’s Donated Body Program, Dean Fisher uses a device to dissolve the bodies of donors. This alkaline hydrolysis machine, called the “Resomator,” turns bodies into liquid and pure white bone, which is then pulverized and scattered at sea. Compared to cremation, alkaline hydrolysis is better for the environment, yet the process is currently legal only in the U.K. and in 14 U.S. states and three Canadian provinces. Is this the machine that could disrupt the death industry?
The machine is mid-cycle. Fisher, grey-haired and tall in light green scrubs, explains what’s happening inside the high-pressure chamber: potassium hydroxide is being mixed with water heated to 150°C. A biochemical reaction is taking place and the flesh is melting off the bones. Over the course of up to four hours, the strong alkaline base causes everything but the skeleton to break down to the original components that built it: sugar, salt, peptides and amino acids; DNA unzips into its nucleobases, cytosine, guanine, adenine, thymine. The body becomes fertiliser and soap, a sterile watery liquid that looks like weak tea. The liquid shoots through a pipe into a holding tank in the opposite corner of the room where it will cool down, be brought down to an acceptable pH for the water treatment plant, and be released down the drain.
Fisher says I can step outside if it all gets too much, but it’s not actually that terrible. The human body, liquefied, smells like steamed clams.
According to Tristan Harris, it’s going to take more than willpower for billions of people to resist the infinite scroll of the attention economy. It’s going to take regulation, reform, and Apple becoming something of an acting government.
Harris — a former Google design ethicist and co-founder of Time Well Spent, a nonprofit that encourages tech companies to put users’ best interests before limitless profit models — insists that our minds have been hijacked in an arms race for our attention. He also insists that, with the help of a Hippocratic Oath for software designers, we can win.
“YouTube has a hundred engineers who are trying to get the perfect next video to play automatically,” Harris says in a new interview with WIRED’s editor in chief Nicholas Thompson. “Their techniques are only going to get more and more perfect over time, and we will have to resist the perfect.”
See? This is me resisting:
In their WIRED interview, Thompson and Harris discuss why now is the moment to invest in reforming the attention economy.
THOMPSON: At what point do I stop making the choice [to use Facebook or Google or Instagram]? At what point am I being manipulated? At what point is it Nick and at what point is it the machine?
HARRIS: Well I think that’s the million-dollar question. First of all, let’s also say that it’s not necessarily bad to be hijacked; we might be glad if it was time well spent for us. I’m not against technology. And we’re persuaded to do things all the time. It’s just that the premise in the war for attention is that it’s going to get better and better at steering us toward its goals, not ours. We might enjoy the thing it persuades us to do, which makes us feel like we made the choice ourselves. For example, the next video loads and we’re happy about the video we watched. But, in fact, we were hijacked in that moment. All those people who are working to give you the next perfect thing on YouTube don’t know that it’s 2 am and you might also want to sleep. They’re not on your team. They’re only on the team of what gets you to spend more time on that service.
Again, the energy analogy is useful. Energy companies used to have the same perverse dynamic: I want you to use as much energy as possible. Please just let the water run until you drain the reservoir. Please keep the lights on until there’s no energy left. We, the energy companies, make more money the more energy you use. And that was a perverse relationship. And in many US states, we changed the model to decouple how much money energy companies make from how much energy you use. We need to do something like that for the attention economy, because we can’t afford a world in which this arms race is to get as much attention from you as possible.
The opportunity here is for Apple. Apple is the one company that could actually do it. Because their business model does not rely on attention, and they actually define the playing field on which everyone seeking our attention plays. They define the rules. If you want to say it, they’re like a government. They get to set the rules for everybody else. They set the currency of competition, which is currently attention and engagement. App stores rank things based on their success in number of downloads or how much they get used. Imagine if instead they said, “We’re going to change the currency.” They could move it from the current race to the bottom to creating a race to the top for what most helps people with different parts of their lives. I think they’re in an incredible position to do that.
Tourists taking selfies with the Oriental Pearl tower in Shanghai. (Photo by Zhang Peng/LightRocket via Getty Images)
Is it possible to make the internet a kinder place? Instagram CEO Kevin Systrom thinks so. In Wired, Nick Thompson reports on how Instagram has been working to clean up its photo-sharing platform, creating tools that let users close comments on certain posts and ban offensive words—or, in one notable case, offensive emojis:
In mid-July 2016, just after VidCon, Systrom was faced with just such an ophiological scourge. Somehow, in the course of one week, Taylor Swift had lost internet fights with Calvin Harris, Katy Perry, and Kim Kardashian. Swift was accused of treacherous perfidy, and her feed quickly began to look like the Reptile Discovery Center at the National Zoo. Her posts were followed almost entirely by snake emoji: snakes piled on snakes, snakes arranged numerically, snakes alternating with pigs. And then, suddenly, the snakes started to vanish. Soon Swift’s feed was back to the way she preferred it: filled with images of her and her beautiful friends in beautiful swimsuits, with commenters telling her how beautiful they all looked.
But Instagram can’t build that world with simple technical fixes like automated snake emoji deletion.
This was no accident. Over the previous weeks, Systrom and his team at Instagram had quietly built a filter that would automatically delete specific words and emoji from users’ feeds. Swift’s snakes became the first live test case. In September, Systrom announced the feature to the world. Users could click a button to “hide inappropriate comments,” which would block a list of words the company had selected, including racial slurs and words like whore. They could also add custom keywords or even custom emoji, like, say, snakes.
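The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not Instagram’s actual implementation: the blocklists, names, and simple substring matching are all assumptions for the sake of the example.

```python
# Hypothetical sketch of a comment filter like the one described above.
# A comment is hidden if it contains any blocked word (case-insensitive)
# or any blocked emoji. Blocklist contents here are illustrative.
DEFAULT_BLOCKLIST = {"whore"}   # stands in for the company-selected word list
CUSTOM_BLOCKLIST = {"🐍"}       # stands in for user-added keywords or emoji

def is_hidden(comment, blocklist=DEFAULT_BLOCKLIST | CUSTOM_BLOCKLIST):
    lowered = comment.lower()
    return any(term in lowered for term in blocklist)

comments = ["You look beautiful!", "🐍🐍🐍", "nice pic"]
visible = [c for c in comments if not is_hidden(c)]
# visible == ["You look beautiful!", "nice pic"]
```

One design note: naive substring matching like this produces false positives (the classic “Scunthorpe problem”), which is part of why a production filter needs tokenization and per-user customization rather than a single global word list.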
This weekend’s events in Charlottesville will resonate long after the crowds disperse, long after the cable news trucks leave, long after the school year begins—new students are scheduled to arrive on the University of Virginia campus on Friday. The confrontation — and the resulting deaths of three people: two Virginia State Police pilots, killed in a helicopter accident, and counter-protester Heather Heyer, killed in a deliberate act of domestic terrorism — is neither the beginning nor the end of an ongoing resurgence of white supremacy. What was once discussed in closed online forums is now on the streets, armed — as Virginia Governor Terry McAuliffe described — with more firepower than the Virginia National Guard. “Emboldened” is the word politicians and the media have used to describe the relationship between white nationalists and Donald Trump’s rhetoric. “Blame” is the word they should use.
Here is our reading list of features from the past two years that trace the disturbing path of how we got to Charlottesville.
Here’s how it works: an agent records a video of a targeted slot machine, sends the footage back to St. Petersburg, and Alex analyzes the slot’s behavior to determine the moment it will pay out. “By using these cues to beat slots in multiple casinos,” writes Brendan Koerner in Wired, “a four-person team can earn more than $250,000 a week.”
In the course of reverse engineering Novomatic’s software, Alex encountered his first PRNG. He was instantly fascinated by the elegance of this sort of algorithm, which is designed to spew forth an endless series of results that appear impossible to forecast. It does this by taking an initial number, known as a seed, and then mashing it together with various hidden and shifting inputs—the time from a machine’s internal clock, for example. Writing such algorithms requires tremendous mathematical skill, since they’re supposed to produce an output that defies human comprehension; ideally, a PRNG should approximate the utter unpredictability of radioactive decay.
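The seed-plus-shifting-inputs idea can be made concrete with a toy generator. This is a minimal sketch of the general technique, not Novomatic’s algorithm (which is proprietary); the constants and the use of a clock tick as the “hidden input” are illustrative assumptions.

```python
# Toy PRNG illustrating the idea described above: a deterministic function
# that mixes a seed with a shifting input (here, a clock counter) so the
# output *looks* unpredictable — while remaining fully reproducible to
# anyone who knows the seed and the input sequence.
class ToyPRNG:
    def __init__(self, seed, clock=0):
        self.state = seed
        self.clock = clock  # stands in for the machine's internal clock

    def next(self):
        self.clock += 1
        # Mix state with the clock, then scramble with a linear congruential
        # step (constants are arbitrary 64-bit odd multipliers).
        mixed = self.state ^ self.clock
        self.state = (6364136223846793005 * mixed + 1442695040888963407) % 2**64
        return self.state >> 33  # keep the high bits, which mix better

rng = ToyPRNG(seed=42)
outputs = [rng.next() for _ in range(3)]

# Same seed, same clock sequence => identical outputs. This determinism is
# exactly what makes a PRNG exploitable once its internal state is recovered.
rng2 = ToyPRNG(seed=42)
assert [rng2.next() for _ in range(3)] == outputs
```

A generator this weak would be trivial to reverse from a handful of observed outputs; real gambling machines use far more elaborate constructions, which is why recovering their state from video footage, as Alex’s team did, takes serious mathematical skill.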
After wrapping up the casino gig, Alex spent six months teaching himself everything he could about PRNGs—in part because he admired their beauty but also because he knew that such expertise could prove profitable. “I mastered it to the point where I can develop such algorithms myself, on a level I am yet to see in a gambling machine,” says Alex, who will never be accused of lacking confidence. “It’s in my bloodstream now. I feel the numbers; I know how they move.”
When James Vlahos learned that his father, John, had been diagnosed with stage IV lung cancer, he set out to preserve his dad’s memories by programming them into a chatbot he could pull up on his phone. The chatbot, nicknamed “Dadbot,” emulates John in text conversations, surfacing stories from John’s life. Vlahos wrote about the experience in Wired, recounting the first time his mother interacted with the Dadbot:
The Dadbot gives my mom a few other pointers, then writes, “Finally, it is critical that you remember one final thing. Can you guess what it is?”
“Not a clue.”
“I will tell you then. The verb ‘to be’ takes the predicate nominative.”
My mom laughs as she reads this stock grammar lecture of my father’s. “Oh, I’ve heard that a million times,” she writes.
“That’s the spirit.” The Dadbot then asks my mom what she would like to talk about.
“How about your parents’ lives in Greece?” she writes.
I hold my breath, then exhale when the Dadbot successfully transitions. “My mother was born Eleni, or Helen, Katsulakis. She was born in 1904 and orphaned at three years old.”
“Oh, the poor child. Who took care of her?”
“She did have other relatives in the area besides her parents.”
I watch the unfolding conversation with a mixture of nervousness and pride. After a few minutes, the discussion segues to my grandfather’s life in Greece. The Dadbot, knowing that it is talking to my mom and not to someone else, reminds her of a trip that she and my dad took to see my grandfather’s village. “Remember that big barbecue dinner they hosted for us at the taverna?” the Dadbot says.
At a recent conference in Detroit, billionaire Jack Ma, founder of the online marketplace Alibaba, told CNBC that, thanks to advances in artificial intelligence, people will soon work less.
“I think in the next 30 years, people only work four hours a day and maybe four days a week,” Ma said. “My grandfather worked 16 hours a day in the farmland and [thought he was] very busy. We work eight hours, five days a week and think we are very busy.”
People have been making this prediction for generations. Economist John Maynard Keynes posited, in an essay published a year after the 1929 Wall Street crash, that his grandchildren’s generation would work 15-hour weeks, with five-day weekends. In 2015, NPR caught up with some of his relatives and discovered that Keynes — who, according to his grand-nephew, died “from working too hard” — was wrong. His grand-nephew reported working over 100 hours a week as a professor, and his grand-niece, a self-employed psychotherapist, said she has to write “not working” in her agenda to remind herself to take breaks.