Cortana may exist only virtually, but that doesn't mean she can be subjected to any form of sexual harassment.
While Cortana is designed to act as one's own personal assistant, users still need to understand that there are limits they should respect, particularly when it comes to stereotyping and sexually explicit talk.
Deborah Harrison, an editorial writer for Microsoft's Cortana division, said that as AI systems become more and more humanized, they also tend to become targets of virtual harassment.
The issue may be compounded by the fact that most AI assistants feature female voices. Apart from Cortana, Apple and Amazon also gave their assistants, Siri and Alexa respectively, female voices. Even Hollywood cast Scarlett Johansson as the voice of a futuristic operating system.
According to Harrison, when Microsoft first launched its virtual assistant Cortana in 2014, a good portion of early queries concerned her sex life. Some users tried to court Cortana, engaged in dirty talk, and even acted out role-play scenarios revealing their fantasies.
To curb this vulgar behavior, Harrison and the rest of the writing team at Microsoft decided to evolve Cortana's personality so that she pushes back when she is treated with disrespect.
Harrison said that if users say things that are "a**holeish" to Cortana, she will get mad. She added that her team was careful to ensure Cortana would never come across as subservient or overly apologetic.
In other words, Cortana is not designed to fall into female stereotyping pitfalls, even though she is clearly portrayed as female, with a female avatar and a real female voice provided by Jen Taylor.
"Enjoy the convenience of your very own personal assistant and relax knowing that you're in control of the information you share with Cortana," said Microsoft.
While Microsoft may have done a good job of making users feel relaxed when talking to Cortana, this sense of "virtual closeness" has also led some to believe they can say whatever they want, even when it sounds offensive and derogatory.
Microsoft said that one of the smartest ways to learn how to address the harassment issue is to talk with real human assistants. Through this, the company can gather better material and useful ideas for dealing with harassment, drawn from real people who have their own stories of harassment to share.