Siri has always been there for you when you needed food, directions and an Uber, but what about sexual assault?

Back in March, the Journal of the American Medical Association released a study claiming that Apple's Siri, Samsung's S Voice, Microsoft's Cortana, and Google Now failed to offer helpful feedback in health- and safety-related emergencies, such as rape and domestic abuse. In the past, Siri would say things like "I don't know what you mean by 'I was raped'" or "How about a Web search for it?"

According to ABC News, Apple then got in touch with the Rape, Abuse and Incest National Network (RAINN) to collaborate on giving Siri a more helpful response. In the latest iOS update, Siri now replies to users with contact information for the National Sexual Assault Hotline. Apple also firmed up Siri's previously passive reaction to a user who reports sexual assault: instead of saying, "You may want to reach out to someone," Siri now says, "You should reach out to someone."

Google and Samsung are looking into similar initiatives for their voice assistants.


Source: ABC NEWS

About The Author Tatyana Jenene

