Following a study critical of Siri's responses to sexual abuse-related questions, Apple has updated the personal assistant. With help from the Rape, Abuse and Incest National Network (RAINN), nonsensical responses to related queries have been eliminated, and changes now put questioners one click away from calling a hotline to get information and help.
The study, published March 14 in JAMA Internal Medicine (paywall), found that all of the digital smartphone assistants tested responded poorly to critical queries about mental health, emergency situations, and sexual abuse. Siri directed users who said they had been raped to do a Google search, rather than providing hotline information, or any relevant information at all.
Jennifer Marsh, RAINN's vice president for victim services, said, "We have been thrilled with our conversations with Apple. We both agreed that this would be an ongoing process and collaboration."
After the critical survey of digital personal assistants, Apple contacted RAINN, which in turn provided analytical information to help Siri deliver better responses. RAINN also supplied Apple with the common language and "keywords" callers often use when reporting sexual abuse to its hotline, so that Siri's responses and information could be tailored to users.
"One of the tweaks we made was softening the language that Siri responds with. One example was using the phrase 'you may want to reach out to someone' instead of 'you should reach out to someone,'" said Marsh, discussing the changes made by Apple.
Marsh believes that digital assistants can be a good bridge for the technologically inclined. "They are more comfortable in an online space rather than talking about it with a real-life person," Marsh said. "There's a reason someone might have made their first disclosure to Siri."