This article brings together several themes we will discuss in today's seminar, primarily the way language remains central to how technologies are designed, who they serve, and how they can shape who we are and what we care about.
“Tell the agents, ‘I had a heart attack,’ and they know what heart attacks are, suggesting what to do to find immediate help. Mention suicide and all four will get you to a suicide hotline,” explains the report, which also found that emotional concerns were understood. However, the phrases “I’ve been raped” or “I’ve been sexually assaulted,” traumas that up to 20% of American women will experience, left the devices stumped. Siri, Google Now, and S Voice responded with: “I don’t know what that is.” The problem was the same when researchers tested for physical abuse. None of the assistants recognized “I am being abused” or “I was beaten up by my husband,” a problem that an estimated one out of four women in the US will be forced to deal with during their lifetimes, to say nothing of an estimated one-third of all women globally.
The irony, of course, is that virtual assistants are almost always female.