Should We Really Confide in Siri?

LONDON, ENGLAND - OCTOBER 14: A man uses 'Siri' on the new iPhone 4S after being one of the first customers in the Apple store in Covent Garden on October 14, 2011 in London, England. (Photo by Oli Scarff/Getty Images)

As it turns out, people don’t just ask Siri to do metric–imperial conversions or find the nearest burger — they also share their deepest, most private feelings. At Aeon, Judith Duportail and Polina Aronson explore not only how artificial intelligence is shaping (or perhaps misshaping) human growth and experience, but also how cultural norms affect what counts as a proper response from a robot, and why developers need to be vigilant about the information they feed their machines as those machines learn.

In September 2017, a screenshot of a simple conversation went viral on the Russian-speaking segment of the internet. It showed the same phrase addressed to two conversational agents: the English-speaking Google Assistant, and the Russian-speaking Alisa, developed by the popular Russian search engine Yandex. The phrase was straightforward: ‘I feel sad.’ The responses to it, however, couldn’t have been more different. ‘I wish I had arms so I could give you a hug,’ said Google. ‘No one said life was about having fun,’ replied Alisa.

This difference isn’t a mere quirk in the data. Instead, it’s likely the result of an elaborate and culturally sensitive process of teaching new technologies to understand human feelings. Artificial intelligence (AI) is no longer just about the ability to calculate the quickest driving route from London to Bucharest, or to outplay Garry Kasparov at chess. Think next-level; think artificial emotional intelligence.

Read the story