Virtual assistants like “OK Google,” Siri, and Cortana are gaining mainstream popularity, especially now that they are embedded in Android, iOS, and Windows 10. There’s no shortage of joke responses in each of these virtual assistants, aimed at making them seem more friendly, easy to use, and trustworthy. But what happens when users try to “take advantage” of their often female-voiced assistants?
Anyone who’s on a social media network or regularly browsing the Internet can see that some people can be quite rude, obnoxious bullies and, as the flare-ups involving the Gamergate controversy have shown, feel perfectly fine sexually harassing others from behind the comfort of their computers. So it shouldn’t be a surprise that, according to Microsoft’s Deborah Harrison, questions about Cortana’s sex life made up a good portion of initial queries when the virtual assistant first launched in 2014 — perhaps compounded by the fact that the Cortana name is taken from the female AI in Microsoft’s Halo series. Microsoft has taken a first step in addressing this issue: in the process of making Cortana seem more human, it has added responses for when users ask her about her sex life.
“If you say things that are particularly a**holeish to Cortana, she will get mad,” said Harrison during a talk at the Re•Work Virtual Assistant Summit in San Francisco last week. “That’s not the kind of interaction we want to encourage.”
“We wanted to be very careful that she didn’t feel subservient in any way… or that we would set up a dynamic we didn’t want to perpetuate socially,” said Harrison.
One way Microsoft developed Cortana’s responses to head off this behaviour involved speaking with real-life assistants, who have no doubt encountered verbal abuse and some form of sexual harassment over the years.
On the other hand, Robin Labs CEO Ilya Eckstein believes there is a market for a virtual assistant that is “more intimate-slash-submissive with sexual undertones.” Based on data pulled from over 100 million conversations with Robin, users seemed to fall into a few categories, including one in which users enjoyed making Robin repeat profanities.
As virtual assistants become more mainstream and more human-like, it will undoubtedly become more difficult to tell whether you’re dealing with a computer or a real person. Do you think Microsoft is right in its decision to add these types of responses to Cortana, or does it not matter because a virtual assistant isn’t a real person?
Let us know what you think in the comments below, or on Google+, Twitter, or Facebook.

Source: CNN Money