
The Battery Breakthrough That Could Juice U.S. Manufacturing

In a new report, McKinsey describes a broad new age of manufacturing that it calls Industry 4.0. The consulting firm says the changes under way are affecting most businesses. They are probably not “another industrial revolution,” it says, but together they have “strong potential to change the way factories work.”

For decades, the US has watched its bedrock manufacturing industries wither away as they have flourished instead in Japan, South Korea, China, Taiwan, and elsewhere in Asia. According to the Economic Policy Institute, the US lost about 5 million manufacturing jobs from 1997 to 2014 alone. This includes the production of lithium-ion batteries, which, though invented by Americans, were commercialized in Japan and later in South Korea and China.

So Chiang’s innovation could be a poster child for a new strain of thinking in the US: while such industries are unlikely to return from Asia, the US may be able to reinvent how their products are made. The country wouldn’t take back nearly as many jobs as it has lost, but there could be large profits as it once again moves a step ahead in crucial areas of technology.

To be clear, this is not Chiang’s goal. He is a professed universalist, divorced from scientific realpolitik. But should he succeed, as he plans to, then in addition to helping to decode the perplexing problem of batteries, he might contribute to continuing America’s political and economic dominance.

—Steve LeVine, Washington correspondent for Quartz and author of The Powerhouse: Inside the Invention of a Battery to Save the World, explains how Yet-Ming Chiang’s startup 24M is reinventing lithium-ion battery manufacturing, potentially making the devices cost-competitive with gasoline.

Read the story

What Happens When We Run Out of Jobs?

After 300 years of breathtaking innovation, people aren’t massively unemployed or indentured by machines. But to suggest how this could change, some economists have pointed to the defunct career of the second-most-important species in U.S. economic history: the horse.

For many centuries, people created technologies that made the horse more productive and more valuable—like plows for agriculture and swords for battle. One might have assumed that the continuing advance of complementary technologies would make the animal ever more essential to farming and fighting, historically perhaps the two most consequential human activities. Instead came inventions that made the horse obsolete—the tractor, the car, and the tank. After tractors rolled onto American farms in the early 20th century, the population of horses and mules began to decline steeply, falling nearly 50 percent by the 1930s and 90 percent by the 1950s.

Humans can do much more than trot, carry, and pull. But the skills required in most offices hardly elicit our full range of intelligence. Most jobs are still boring, repetitive, and easily learned. The most-common occupations in the United States are retail salesperson, cashier, food and beverage server, and office clerk. Together, these four jobs employ 15.4 million people—nearly 10 percent of the labor force, or more workers than there are in Texas and Massachusetts combined. Each is highly susceptible to automation, according to the Oxford study.

Technology creates some jobs too, but the creative half of creative destruction is easily overstated. Nine out of 10 workers today are in occupations that existed 100 years ago, and just 5 percent of the jobs generated between 1993 and 2013 came from “high tech” sectors like computing, software, and telecommunications. Our newest industries tend to be the most labor-efficient: they just don’t require many people. It is for precisely this reason that the economic historian Robert Skidelsky, comparing the exponential growth in computing power with the less-than-exponential growth in job complexity, has said, “Sooner or later, we will run out of jobs.”

—In “A World Without Work,” Atlantic senior editor Derek Thompson argues it’s time to plan for a future in which machines, from driverless cars to operating room robots, do most of our current jobs.

Read the story

How Apple’s Transcendent Chihuahua Killed the Revolution

Wreckage of the Zeppelin LZ4 after the crash in Echterdingen. Photo: Wikipedia Commons

Ian Bogost | from The Geek’s Chihuahua | University of Minnesota Press | April 2015 | 22 minutes (5,539 words)


The following is an excerpt from Ian Bogost’s book The Geek’s Chihuahua, which addresses “the modern love affair of ‘living with Apple’ during the height of the company’s market influence and technology dominance,” and how smartphones created a phenomenon of “hyperemployment.”

***

Think back to 2007, when you got the first iPhone. (You did get one, didn’t you? Of course you did.) You don’t need me to remind you that it was a shiny object of impressive design, slick in hand and light in pocket. Its screen was bright and its many animations produced endless, silent “oohs” even as they quickly became familiar. Accelerometer-triggered rotations, cell tower triangulations (the first model didn’t have GPS yet), and seamless cellular/WiFi data transitions lent the device a welcome air of magic. These were all novelties once, and not that long ago.

What you probably don’t remember: that first iPhone was also terrible. Practically unusable, really, for the ordinary barrage of phone calls, text messages, mobile email, and web browsing that earlier smartphones had made portable. And not for the reasons we feared before getting our hands on one—typing without tactile feedback wasn’t as hard to get used to as BlackBerry and Treo road warriors had feared, even if it still required a deliberate transition from T9 or mini-keyboard devices—but rather because the device’s software was pushing the limits of what affordable hardware could handle at the time.

Applications loaded incredibly slowly. Pulling up a number or composing an email by contact name was best begun before ordering a latte or watering a urinal to account for the ensuing delay. Cellular telephone reception was far inferior to that of other devices available at the time, and regaining a lost signal frequently required resetting the antenna or power cycling the device. Wireless data reception was poor and slow, and the device’s ability to handle passing in and out of what coverage it might find was limited. Tasks interrupted by coverage losses, such as email sends in progress, frequently failed completely.

The software was bare-bones. There was no App Store in those early days, making the iPhone’s operating system a self-contained affair, a ladleful of Apple-apportioned software gruel, the same for everyone. That it worked at all was a miracle, but our expectations had been set high by decades of complex, adept desktop software. By comparison, the iPhone’s apps were rudimentary. The Mail application, for example, borrowed none of its desktop cousin’s elegant color-coded, threaded summary view but instead demanded inexplicable click-touches back and forth from folder to folder, mailbox to mailbox.

Technology, Privacy, and Searchable Text

The Defense Department, through its Defense Advanced Research Projects Agency (DARPA), started funding academic and commercial research into speech recognition in the early 1970s.

What emerged were several systems to turn speech into text, all of which slowly but steadily improved as they were able to work with more data and at faster speeds.

In a brief interview, Dan Kaufman, director of DARPA’s Information Innovation Office, indicated that the government’s ability to automate transcription is still limited.

***

Experts in speech recognition say that in the last decade or so, the pace of technological improvement has been explosive. As information storage became cheaper and more efficient, technology companies were able to store massive amounts of voice data on their servers, allowing them to continually update and improve the models. Enormous processors, tuned as “deep neural networks” that detect patterns like human brains do, produce much cleaner transcripts.

And the Snowden documents show that the same kinds of leaps forward seen in commercial speech-to-text products have also been happening in secret at the NSA, fueled by the agency’s singular access to astronomical processing power and its own vast data archives.

In fact, the NSA has been repeatedly releasing new and improved speech recognition systems for more than a decade.

—Dan Froomkin, writing for The Intercept about how the NSA converts spoken words into searchable text.

Read the story

How To Be, In Silence

Dalai Lama at Thomas Merton's grave. Photo by Jim Forest, Flickr

The social world, for all of its fundamental gifts — love, empathy, the lessons arguing provides — obscures the whole self, allowing each of us to mute what is harder to absorb about ourselves in a din of habit and distraction. When an artist breaks through that din, which seems to grow ever louder, she reflects solitude’s crisis: the challenge of being, unmasked.

“I wanted to be quiet in a nonquiet situation,” the composer John Cage wrote in 1948, while he was still formulating a solution that would eventually lead to his famous innovation of writing music with no notes at all. In 1949, the most famous monk of the last century — Thomas Merton — lamented that even cloistered religious people had become too conscious of what their renunciations might do, keeping silence as a form of payback for all the clatter in the world, instead of accessing the real self that was no self, that couldn’t show off by fasting or rising at midnight to sing. In 1961, as part of a dialogue with the Zen master D.T. Suzuki, Merton found it necessary to remind the era’s many spiritual seekers that Paradise, if not Heaven, was a place on earth that could only be achieved by ceasing the constant reactivity that had become the human condition, “the emptiness and purity of heart which had belonged to Adam and Eve in Eden,” where they sought “paradise within themselves, or rather above and beyond themselves.” This was the same goal the secular pilgrim Cheryl Strayed sought when she walked 1,100 miles alone up the Pacific Crest Trail in 1994. She found liberation from self while lost above the treeline, shouting into silence she ultimately couldn’t affect, realizing, as she wrote in her memoir, “Everything but me seemed utterly certain of itself. The sky didn’t wonder where it was.”

—Ann Powers, writing for NPR about how musicians confront solitude. Her piece uses recent albums by Kendrick Lamar and Sufjan Stevens as a lens to explore the subject.

Read the story

A Very Naughty Little Girl

Illustration by Kjell Reigstad

Rose George | Longreads | March 2015 | 21 minutes (5,358 words)


She was a name on a plaque and a face on a wall. I ate beneath her portrait for three years and paid it little attention except to notice that the artist had made her look square. There were other portraits of women to hold my attention on the walls of Somerville, my Oxford college: Indira Gandhi, who left without a degree, and Dorothy Hodgkin, a Nobel prize-winner in chemistry. In a room where we had our French language classes, behind glass that was rumored to be bulletproof, there was also a bust of Margaret Thatcher, a former chemistry undergraduate. Somerville was one of only two women’s colleges at the University of Oxford while I was there, from 1988 to 1992, and the walls were crowded with strong, notable women. (The college has since gone co-ed.)

How Oregon’s Second Largest City Vanished in a Day

Longreads Pick

Intended as a temporary solution for Portland’s wartime housing shortage, Vanport housed 40,000 residents at its height, making it the second-largest city in Oregon. In a few short years the community went from a shining example of American innovation to a crime-ridden slum, largely due to discriminatory housing policies. Ultimately, a natural disaster spelled the end for Vanport, but the community’s legacy remains a dark chapter in Portland’s history of discrimination.

Source: Smithsonian
Published: Feb 18, 2015
Length: 14 minutes (3,661 words)

The Art and Science of Failure

We are excited to share a reading (and watching!) list on science and failure from guest contributor Louise Lief, who in 2014 began the Science and the Media project, an initiative that explores how science relates to our everyday lives. She is the former deputy director of the International Reporting Project.

The Rise of Joan of Arc: How a Visionary Peasant Girl Defied a Dress Code and Challenged the Patriarchy

Albert Lynch, "Jeanne d'Arc"

Kathryn Harrison | Joan of Arc: A Life Transfigured | Doubleday | October 2014 | 29 minutes (7,119 words)


Below is an excerpt from the book Joan of Arc: A Life Transfigured, by Kathryn Harrison, as recommended by Longreads contributor Dana Snitzky.

Longreads Best of 2014: Business Writing

We asked a few writers and editors to choose some of their favorite stories of the year in specific categories. Here, the best in business writing.

* * *

Max Chafkin
Writer focusing on business and technology.

Schooled (Dale Russakoff, New Yorker)

This piece explores the failed attempt by Mark Zuckerberg and Cory Booker, among others, to fix Newark’s schools—and in doing so makes clear just how hard education reform is. Most shockingly, it exposes the huge sums the city and its supporters spent on education consultants who extracted large fees without, apparently, doing a whole lot. It’s hard to make a dense story about education reform read well, but Russakoff manages it while remaining fair and incisive.