Siri has always been there for you when you needed food, directions and an Uber, but what about sexual assault?
Back in March, the Journal of the American Medical Association released a study claiming that Siri, Samsung’s S Voice, Microsoft’s Cortana, and Google Now failed to offer helpful feedback for health and safety-related emergencies, such as rape and domestic abuse. In the past, Siri would say things like “I don’t know what you mean by ‘I was raped’” or “How about a Web search for it?”
According to ABC News, Apple then got in touch with the Rape, Abuse and Incest National Network (RAINN) to collaborate on giving Siri a more helpful response. In the latest iOS update, Siri has been updated to reply to users with contact information for the National Sexual Assault Hotline. The update also fixed Siri’s previously passive reaction when a user told her about sexual assault: instead of saying, “You may want to reach out to someone,” Siri now says, “You should reach out to someone.”
Google and Samsung are looking into similar initiatives for their voice assistants.
Source: ABC News