Digital Assistants: Friend or Foe?

Voice-activated digital assistants can seem like a dream come true for sole proprietors and small businesses. Found in smartphones and stand-alone smart speakers, they can perform tasks such as scheduling appointments and sending email and text messages to a client or customer. They can do this without the expense of a human assistant's wages, and while you're doing something else, such as driving to an appointment.

But in the perfect storm of word sequences or hardware malfunctions, these virtual personal assistants can turn into a nightmare for lawyers and other businesspeople entrusted with confidential information and personal data on multiple clients and customers.

Imagine you’re a lawyer, and during the course of a closed-door conversation in your office, the personal assistant on your smartphone or smart speaker somehow detects a command to record what’s being said and sends the recording to someone in your phone contacts, who could be, among others, a legal adversary.

It’s not such a far-fetched scenario, given the real-life events that have come out in the news in the past few years.

Last year, a woman in Portland, Oregon, later identified as Danielle, told Seattle TV news station KIRO 7 that her Amazon Echo recorded a conversation between her and her husband and sent it to her husband's employee in Seattle. Unaware that the device was recording their private conversation, the couple didn't learn of the mishap until the employee informed them.

Amazon explained that its voice-activated assistant, Alexa, embedded inside the couple’s Amazon Echo speaker, had erroneously detected a series of commands to send the recording as a voice message to the husband’s contact.

Researchers in the United States and China also have demonstrated, in a body of research spanning several years, that they can embed hidden commands, undetectable to the human ear, in messages and music and use them to control Apple's Siri, Amazon's Alexa and Google's Assistant. By secretly activating the artificial intelligence systems, the researchers were able to dial phone numbers and open websites, the New York Times reported last year.

To guard against such events, lawyers who use digital assistants should unplug or disable microphones on their smart devices during client meetings and phone calls, New York professional ethics experts Brenda Dorsett and Barry Temkin wrote in May in a Law360 post.

Beyond the hazards of artificial intelligence "going rogue," there is still little specificity about how the voice data collected by digital assistants is stored and used.

Amazon and Google have both acknowledged in terms of use for Alexa and Google Assistant that transcriptions, or content, of users’ voice commands may be shared with third-party service providers.

And, as noted in recent reporting from multiple media outlets, some devices have already proven true science fiction's ominous warnings that the technology can record more than users expect or are even aware of.

For example, technology blogger Artem Russakovskii wrote on tech website Android Police that after receiving a Google Home Mini at the gadget's launch event in San Francisco in 2017, he noticed that the device was "waking up thousands of times a day, recording, then sending those recordings to Google" without asking permission or receiving a command.

“My Google Home Mini was inadvertently spying on me 24/7 due to a hardware flaw,” he wrote.

As much as possible, lawyers and others handling sensitive information should sever the links between their digital assistants and sensitive databases.

“For example, attorneys might decide not to sync up their client databases with their digital assistants,” wrote Dorsett and Temkin in their Law360 post. “The Amazon Alexa app has privacy functions which permit users to block the transmission of recorded messages to Amazon employees.”

Danielle, the Portland woman, and her husband decided that the best way to defend their privacy was to disconnect their digital assistant devices entirely.

“I’m never plugging that device in again,” she told KIRO 7. “I can’t trust it.”
