Privacy is dead. Ask Alexa!

Luis Ospina (lospina@ketchum.edu) is the Compliance Officer at Marshall B. Ketchum University-Ketchum Health in Anaheim, CA.

An individual needing assistance is excited about using Alexa to refill his medications. No more phone calls to the pharmacist, no more emails to his primary care provider (PCP), and best of all, no more trips to the pharmacy to pick up the medication he needs. He simply gives Alexa a command, asking “her” to place a refill of his prescription. This technology is impressive, yet nothing out of the ordinary given what technology can do today. The scary part comes with what Alexa could do next. Based on an algorithm Amazon engineers are testing, Alexa will eventually be able to identify patterns in the user’s voice and determine, with a high degree of accuracy, the user’s mood. This new algorithm pairs emotional intelligence with artificial intelligence (AI) models in its coding, empowering Alexa to make quick decisions on the user’s behalf.

Depending on how stressed or anxious she “feels” the user’s voice is, Alexa will connect the dots. In this particular case, she might call the user’s PCP to report that the patient is having an episode of anxiety, or contact 911 about a distressed individual who might be ready to commit suicide. All of this without the individual’s consent, and without the user even knowing it is happening, until the paramedics and police arrive at the door. By the time this article is published, this story may already be outdated, because another, more impressive feature in Alexa’s arsenal will have been developed and implemented.
