Voice Privacy Experts: Careful, We’re Bugging Ourselves
A cybersecurity executive I talked to recently raised a scenario like this:
You and your cousin swap opinions about a standout basketball player one night. The next day you get an email from your cousin’s friend. He says your cousin told him you’d like this new article about your favorite point guard, and he attaches the link.
Later you find out your work laptop has been hacked, and some of your employer’s confidential data has been stolen. Your cousin’s “friend” was actually a cybercriminal in Eastern Europe, and by clicking the link he sent, you downloaded his malware. But how did the hacker know the details of your family banter about basketball? You were just shooting the breeze in the den with your cousin, not typing emails that could be intercepted.
That’s the kind of hacker exploit that could become more common with the rise of Internet of Things devices equipped with microphones that capture voices around them so they can respond to spoken commands, says Torsten George, an executive at the Sunnyvale, CA-based cybersecurity company RiskSense.
Although we’re used to the idea that smartphones have voice assistants like Apple’s Siri, an increasing number of gizmos such as thermostats, smart TVs, refrigerators, laptops, and voice-enabled hubs such as Amazon Echo and Google Home are also equipped with microphones so they can hear your orders or search queries.
Smartphone users may leave their phones on constantly now so they can give instant commands to Siri or Google Now, a voice assistant for Android devices. Plug-in home devices like Amazon Echo can sit unobtrusively, listening for the wake word that rouses them to capture speech, George says. In the scenario he raised above, your cousin himself may have forgotten he had one of those small voice-enabled assistants sitting on the desk in the den during your basketball chat.
“There’s not enough awareness that we’re surrounded by a forest of microphones,” George says. Home device networks can be easily hacked, he says, and voice data can come into the hands of cyberattackers.
Who’s most at risk?
At this point, the risk exposure of individuals is not as great as the danger for financial institutions, health care systems, and retailers such as Target, George says. Hackers like to break into networks where they can steal millions of records at one go. The individuals most at risk include corporate executives and military commanders, who become the cyber targets of industrial spies or nations seeking strategic information, he says.
But consumers should still be aware that when they bring voice-enabled devices into their living rooms, cars, and offices, they’re making a trade-off between convenience and the security and privacy of their households, George says.
“Twenty years ago, people were happy to have remote controls,” George says. “Now they don’t even want to push that button any longer.”
Although voice assistants are designed to wake up and listen only when the user says a trigger word or phrase such as “Alexa” or “OK Google,” a hacker can carry out a remote attack on the device and install malware that bypasses the need for a wake-up word, George says.
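The wake-word design George describes amounts to a simple gate: audio is discarded until the trigger phrase is heard, and only the speech that follows is forwarded for processing. Here is a minimal, purely illustrative Python sketch of that gating logic — the `WAKE_WORD` value and the text-based “audio” stream are assumptions for illustration; real devices run acoustic models on a rolling audio buffer, not string matching:

```python
# Illustrative wake-word gating: utterances are ignored until the
# trigger phrase is heard, and only the next utterance is "forwarded"
# (here, collected) as a command.

WAKE_WORD = "alexa"  # hypothetical trigger phrase


def process_stream(utterances):
    """Return only the utterances spoken right after the wake word."""
    commands = []
    awake = False
    for utterance in utterances:
        if awake:
            commands.append(utterance)  # forwarded for speech recognition
            awake = False               # return to passive listening
        elif WAKE_WORD in utterance.lower():
            awake = True                # trigger heard; capture next phrase
    return commands


stream = [
    "did you see that point guard last night",
    "Alexa",
    "play some music",
    "unbelievable crossover",
]
print(process_stream(stream))  # ['play some music']
```

In these terms, the malware attack George warns about is equivalent to forcing the `awake` flag to stay permanently on, so every utterance in the room is captured rather than just the commands.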
At RiskSense’s office, connected home hubs such as the Amazon Echo are off-limits.
“We are not allowing any Alexa in our work environment,” George says. Default settings are changed on the company’s office video equipment and apps such as Skype to keep microphones and cameras off until someone actively turns them on for a use such as a conference call, he says.
Voice recognition and transcription
Consumers may be lulled into thinking that voice-enabled assistants don’t pose much of a threat because their responses to questions can seem off-base, and even stupid, George says. Users may conclude that the voice recognition function of the device is poor, he says.
But those poor responses are due to flubs by the device’s search function, not its ability to recognize words, George says. Voice recognition accuracy can be up to 99 percent these days, he says. And a user’s commands, perhaps along with conversations, are also funneled into another process—real-time transcription.
“Your voice gets transcribed into written text,” George says. That plain-text data file is then easily shared, and it’s searchable by keyword, he says. The analysis adds another dimension to the value of each data point. Depending on the strength of privacy controls, it may help your device give you better answers, or help a marketer target a pitch to you.
But a transcript could also be useful for a hacker like your cousin’s “friend,” who needs intel about you to mount a social engineering attack via email.
Law enforcement agencies can search a voice transcript for the mention of terms such as bombs, hacking, or Social Security numbers, George says.
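Once speech becomes plain text, that kind of term search is trivial. A minimal sketch of scanning a transcript against a watch list — the transcript lines and the terms below are invented for illustration:

```python
# Once voice is transcribed to plain text, scanning it for terms of
# interest is a few lines of code. Both the transcript and the watch
# list here are invented examples.

transcript = [
    "my social security number is on that form",
    "we talked about the basketball game",
    "he said the server got hacked last week",
]

watch_terms = ["bomb", "hacked", "social security"]

# Collect (line, matched term) pairs via simple substring matching.
hits = [
    (line, term)
    for line in transcript
    for term in watch_terms
    if term in line.lower()
]

for line, term in hits:
    print(f"{term!r} found in: {line}")
```

The same property that makes a transcript useful to your device’s search function — cheap, exact keyword lookup — is what makes it valuable to investigators and attackers alike.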
Late last year, homicide investigators in Arkansas pressed Amazon to turn over voice recordings and transcripts from an Amazon Echo in a home where a murder victim was found, the New York Times reported. The law isn’t settled on the privacy rights of homeowners or the legal obligations of tech companies in such cases.
To help tech companies and consumers deal with such issues, and others related to voice interactivity, the Voice Privacy Alliance was founded last year by Alta Associates’ Executive Women’s Forum on Information Security, IT Risk Management and Privacy. The group tackles legal and policy issues, works to raise consumer awareness, and provides a toolkit for developers so they can incorporate security measures into voice-enabled products.
Artificial intelligence: the enhanced value of data points in a larger context
Devices are already getting to know us better and improving their responses through artificial intelligence analysis of our GPS locations, search histories, and other data we volunteer to …