Imagine you work in an industry where accuracy and precision are hugely important. Your work is scrutinized by an ever-growing field of critics eager to catch any misstep, and if you get something wrong it has the potential to do people serious harm.
Your job often requires making dozens, if not hundreds, of calls to obtain or even just verify a single fact. You spend your days wheedling information out of people who don’t want to provide it. You pore over mountains and mountains of documents that may include only one salient fact buried deep in a dense bog of data. Often these documents are difficult to find, or require the assistance of lawyers to access — lawyers you personally can’t afford and your higher-ups may not want to pay for.
Now imagine this industry is failing to stay viable. People in a different department from yours are supposed to be responsible for that aspect — business, finances, the bottom line — but your department creates the product being sold. When “innovators” are brought in to come up with dynamic ideas, they pin those ideas on you. There’s nothing to suggest the product is broken or failing, and everything to suggest that the means by which money is made from the product is the problem, but that doesn’t seem to matter to the innovators. They have figured out how to track how your product is consumed — do we have the metrics on that? — and so they are going to use that information to suggest changes to how you do what you do.
Writing in Real Life magazine, Juli Min explores the way WeChat, China’s most popular messaging app, has become a place to both mourn death and share graphic videos of the moment itself—a place where users post “viral videos of death as we create an endless stream of idle gossip.” What does this mean broadly, and what does it mean in a country where all data is subject to government monitoring?
Tencent WeChat accounts, like Facebook accounts, are technically leased to their users. The data and photos do not, in the end, belong solely to individuals, as Tencent retains the rights to copy, use, and forward whatever is shared on the platform. In turn, Tencent’s servers are themselves leased from the Chinese government, subjecting all messaging data to government monitoring and surveillance. A viral video of a mother’s death by escalator will happily make the rounds, whereas a video of a Tibetan monk burning himself in protest will be scrubbed by government monitors — “we” are allowed to gawk at the spectacle of death, but not the spectacle of resistance. In 1967’s The Society of the Spectacle, Guy Debord, prescient founder of the Situationist International, wrote: “The spectacle is not a collection of images; rather, it is a social relationship between people that is mediated by images.” Beyond that work of mediation, he wrote, spectacle allowed for the proliferation and control of the masses and degraded authentic life and experience.
Monitoring is both the source and the function of internet spectacle.
Also in Real Life, information accessibility and data use expert Zara Rahman explores the limits and coercive power of a ubiquitous internet interface: the location drop-down menu. Aside from forcing people to make artificial choices, location drop-downs assume a stable location, something many people don’t have, and never did.
Digital technologies seem to have ignored how people actually move through geographic space: it’s relatively new that some of us have fixed locations or even addresses at all, and in some regions nomadic cultures persist. In Somalia, over a quarter of the population is nomadic; in Mongolia, just under a third still move from place to place with their herds. For many, seasonal migration is a way of life, whether from rural areas to urban ones or from poorer countries to richer ones, as with Bangladeshi migrant workers who find jobs in the Gulf states. For millions, location is and always has been fluid and complex, dependent on myriad factors, from climate to the economy to geopolitics.
In 2011, Diederik Stapel, a bright social psychologist at Tilburg University in the Netherlands, was suspended for fabricating data on a study that brought him much praise. At the Guardian, Stephen Buranyi profiles the team of researchers from the university’s psychology department, Chris Hartgerink and Marcel van Assen, who have since focused their research on scientific fraud.
Stapel had a knack for devising and executing such clever studies, cutting through messy problems to extract clean data. Since becoming a professor a decade earlier, he had published more than 100 papers, showing, among other things, that beauty product advertisements, regardless of context, prompted women to think about themselves more negatively, and that judges who had been primed to think about concepts of impartial justice were less likely to make racially motivated decisions.
His findings regularly reached the public through the media. The idea that huge, intractable social issues such as sexism and racism could be affected in such simple ways had a powerful intuitive appeal, and hinted at the possibility of equally simple, elegant solutions. If anything united Stapel’s diverse interests, it was this Gladwellian bent. His studies were often featured in the popular press, including the Los Angeles Times and New York Times, and he was a regular guest on Dutch television programmes.
But as Stapel’s reputation skyrocketed, a small group of colleagues and students began to view him with suspicion. “It was too good to be true,” a professor who was working at Tilburg at the time told me. (The professor, who I will call Joseph Robin, asked to remain anonymous so that he could frankly discuss his role in exposing Stapel.) “All of his experiments worked. That just doesn’t happen.”
The Web dwells in a never-ending present. It is—elementally—ethereal, ephemeral, unstable, and unreliable. Sometimes when you try to visit a Web page what you see is an error message: “Page Not Found.” This is known as “link rot,” and it’s a drag, but it’s better than the alternative. More often, you see an updated Web page; most likely the original has been overwritten. (To overwrite, in computing, means to destroy old data by storing new data in their place; overwriting is an artifact of an era when computer storage was very expensive.) Or maybe the page has been moved and something else is where it used to be. This is known as “content drift,” and it’s more pernicious than an error message, because it’s impossible to tell that what you’re seeing isn’t what you went to look for: the overwriting, erasure, or moving of the original is invisible. For the law and for the courts, link rot and content drift, which are collectively known as “reference rot,” have been disastrous. In providing evidence, legal scholars, lawyers, and judges often cite Web pages in their footnotes; they expect that evidence to remain where they found it as their proof, the way that evidence on paper—in court records and books and law journals—remains where they found it, in libraries and courthouses. But a 2013 survey of law- and policy-related publications found that, at the end of six years, nearly fifty per cent of the URLs cited in those publications no longer worked. According to a 2014 study conducted at Harvard Law School, “more than 70% of the URLs within the Harvard Law Review and other journals, and 50% of the URLs within United States Supreme Court opinions, do not link to the originally cited information.” The overwriting, drifting, and rotting of the Web is no less catastrophic for engineers, scientists, and doctors. 
Last month, a team of digital library researchers based at Los Alamos National Laboratory reported the results of an exacting study of three and a half million scholarly articles published in science, technology, and medical journals between 1997 and 2012: one in five links provided in the notes suffers from reference rot. It’s like trying to stand on quicksand.
The footnote, a landmark in the history of civilization, took centuries to invent and to spread. It has taken mere years nearly to destroy. A footnote used to say, “Here is how I know this and where I found it.” A footnote that’s a link says, “Here is what I used to know and where I once found it, but chances are it’s not there anymore.” It doesn’t matter whether footnotes are your stock-in-trade. Everybody’s in a pinch. Citing a Web page as the source for something you know—using a URL as evidence—is ubiquitous. Many people find themselves doing it three or four times before breakfast and five times more before lunch. What happens when your evidence vanishes by dinnertime?
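The distinction the passage draws is worth making concrete: link rot announces itself with an error, while content drift is invisible unless you kept a record of what the page said when you cited it. Here is a minimal sketch of that idea, entirely my own illustration rather than anything from the article; the function names and the use of a SHA-256 fingerprint saved at citation time are assumptions about how such a check might work.

```python
import hashlib

def is_link_rot(status_code):
    # Link rot is the easy case to detect: the server answers
    # with an error status (e.g. 404) or not at all.
    return status_code >= 400

def fingerprint(page_bytes):
    # Record a digest of the page at the moment of citation.
    return hashlib.sha256(page_bytes).hexdigest()

def has_drifted(saved_digest, current_page_bytes):
    # Content drift is invisible to the reader: the URL still
    # resolves, but the bytes behind it have changed. Comparing
    # against a digest saved at citation time exposes the change.
    return fingerprint(current_page_bytes) != saved_digest

# At citation time, the author snapshots a fingerprint:
cited = b"<html>original evidence</html>"
saved = fingerprint(cited)

# Years later the page still returns 200, but has been overwritten:
fetched = b"<html>something else entirely</html>"
print(is_link_rot(200))         # False: no visible rot
print(has_drifted(saved, fetched))  # True: the evidence has drifted
```

This is why archiving services snapshot cited pages rather than merely checking that the links still resolve: a live URL proves nothing about whether the evidence behind it survives.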
When a problem exists in Philadelphia schools, it generally exists in other large urban schools across the nation. One of those problems—shared by districts in New York, D.C., Chicago, Los Angeles, and other major cities—is that many schools don’t have enough money to buy books. The School District of Philadelphia recently tweeted a photo of Mayor Michael Nutter handing out 200,000 donated books to K–3 students. Unfortunately, introducing children to classic works of literature won’t raise their abysmal test scores.
This is because standardized tests are not based on general knowledge. As I learned in the course of my investigation, they are based on specific knowledge contained in specific sets of books: the textbooks created by the test makers.
All of this has to do with the economics of testing. Across the nation, standardized tests come from one of three companies: CTB/McGraw-Hill, Houghton Mifflin Harcourt, or Pearson. These corporations write the tests, grade the tests, and publish the books that students use to prepare for the tests. Houghton Mifflin Harcourt has a 38 percent market share, according to its press materials. In 2013, the company brought in $1.38 billion in revenue.
-Meredith Broussard, in the Atlantic, on a data experiment revealing that public schools using the “wrong” textbooks may be hurting their students’ test performance.
“You should presume that someday, we will be able to make machines that can reason, think and do things better than we can,” Google co-founder Sergey Brin said in a conversation with Khosla Ventures founder Vinod Khosla. To someone as smart as Brin, that comment is as normal as sipping on his super-green juice, but to someone who is not from this landmass we call Silicon Valley or part of the tech-set, that comment is about the futility of their future.
Automation of our society is going to cause displacement, no different than mechanization of our society in the past. There were no protections then, but hopefully a century later we should be smarter about dealing with pending change. People look at Uber and the issues around it as specific to a single company. It is not true — drones, driverless cars, dynamic pricing of vital services, privatization of vital civic services are all part of the change driven by automation, and computer driven efficiencies. Just as computers made corporations efficient — euphemism for employed fewer people and made more money — our society is getting more “efficient,” thanks to the machines.
-Om Malik, on the conversations we are still not having about personal data collection and our automated future.
Could there be a more appropriate hero for our time than Nate Silver? We can quantify and track and poll and log almost everything—and so we usually do, even if we’re not sure how to make sense of it all. But Silver is—or at least, he can tell you exactly how likely it is that he’s right.
His nerd-god omniscience during the 2012 election cycle made him a blast to watch, read and retweet. He was consistent, and he was right, and it made a lot of people think a little differently about the relentlessness of our political pageantry and punditry.
Here, in the first chapter of his new book, he revisits the housing crash, and the failure of the ratings agencies to spot it. It’s not new criticism. Even so, the prediction game is Silver’s strength, and he makes the whole thing feel outrageous again. He takes the rating agencies to task for the errors in their models and in their psychology. There are charts, graphs, and 101 footnotes, and in the end, it’s reassuring: If Silver thinks we can avoid making the same mistakes again—well, even a skeptic like me wouldn’t bet against him. After all, he knows the odds better than I do.
The National Security Agency is building a “spy center” in Utah with the purpose of gaining intelligence by breaking codes. But the center will also collect massive amounts of private domestic data, including phone calls, emails and Google searches:
The NSA also has the ability to eavesdrop on phone calls directly and in real time. According to Adrienne J. Kinne, who worked both before and after 9/11 as a voice interceptor at the NSA facility in Georgia, in the wake of the World Trade Center attacks “basically all rules were thrown out the window, and they would use any excuse to justify a waiver to spy on Americans.” Even journalists calling home from overseas were included. “A lot of time you could tell they were calling their families,” she says, “incredibly intimate, personal conversations.” Kinne found the act of eavesdropping on innocent fellow citizens personally distressing. “It’s almost like going through and finding somebody’s diary,” she says.