Artificial intelligence is moving into the realm of helping us manage our feelings. Apparently some people are more comfortable talking to Alexa or Siri about how they feel than they are talking to a real person.
Interestingly enough, the responses of these AI assistants reflect the culture of the people who program them, as this article in Aeon explains:
“In September 2017, a screenshot of a simple conversation went viral on the Russian-speaking segment of the internet. It showed the same phrase addressed to two conversational agents: the English-speaking Google Assistant, and the Russian-speaking Alisa, developed by the popular Russian search engine Yandex. The phrase was straightforward: ‘I feel sad.’ The responses to it, however, couldn’t be more different. ‘I wish I had arms so I could give you a hug,’ said Google. ‘No one said life was about having fun,’ replied Alisa.”
Interesting, but not surprising. There isn’t a single human sensibility behind AI — there is simply an algorithm, and that algorithm reflects the sensibilities of the programmers.
“AI technologies do not just pick out the boundaries of different emotional regimes; they also push the people that engage with them to prioritise certain values over others. ‘Algorithms are opinions embedded in code,’ writes the data scientist Cathy O’Neil in Weapons of Math Destruction (2016). Everywhere in the world, tech elites – mostly white, mostly middle-class, and mostly male – are deciding which human feelings and forms of behaviour the algorithms should learn to replicate and promote.”
Algorithms are opinions embedded in code — opinions that are predominantly white, middle class, and male. Think about that the next time you’re tempted to share your innermost feelings with Siri.