Voice-activated digital technology poses a real cyber threat

Voice-activated digital assistants, which our homes increasingly depend on, pose a significant cyber risk.

Ever since Orwell’s eerily accurate speculative vision of Winston Smith being monitored and berated through his telescreen, the world has been aware of the security threats and intrusive impact that technology can have on our homes. However, experts now claim that the technology that should be making our lives easier is actually making the lives of cybercriminals a lot easier.

These devices are becoming commonplace in our homes. A recent American survey of 1,000 people found that 72% of respondents have used digital assistants like Apple’s Siri or Samsung’s Bixby. Some 59% of 18-to-24-year-olds use virtual assistants daily, a figure that rises to 65% among those aged 25 to 49.

The road of voice-assisted technology has by no means been a smooth one. In March this year, Alexa, our trusty Amazon buddy, started laughing maniacally without any user prompt or voice activation.

Despite Amazon’s claim that the glitch was Alexa’s sensitive microphone mishearing a verbal cue, many remain adamant it was an unprompted response. When big companies have access to your property and can manipulate it without your consent, and when analysts predict that most people will own a smart speaker by 2022, how safe are we when using this technology, and what are the real risks?

One fear is that the machines can hear things the human ear cannot. In 2017, six researchers from Zhejiang University demonstrated that they could command Siri using sound inaudible to humans. They used the technique to place phone calls, but warned that other, more sinister commands are possible.

The attack, dubbed DolphinAttack, could be used to do anything from controlling Apple Pay against the user’s wishes to visiting malicious sites or injecting fake or incriminating information.
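For readers curious about the trick itself, the sketch below shows the kind of signal processing involved: a voice command is amplitude-modulated onto an ultrasonic carrier that humans cannot hear, and the non-linearity of the device’s microphone demodulates it back into the audible band where the assistant picks it up. The 30 kHz carrier, 96 kHz sample rate, and the test tone standing in for a real recorded command are illustrative choices, not the researchers’ exact parameters.

```python
# Illustrative sketch of a DolphinAttack-style payload: amplitude-modulate
# a baseband "command" onto an ultrasonic carrier. The carrier is inaudible,
# but a microphone's squaring non-linearity recovers the command in the
# audible band. Parameters here are assumptions for demonstration only.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 96_000   # high enough to represent ultrasound digitally
CARRIER_HZ = 30_000    # above the ~20 kHz limit of human hearing

def modulate_command(command: np.ndarray) -> np.ndarray:
    """Classic AM: (1 + m(t)) * cos(2*pi*fc*t)."""
    t = np.arange(len(command)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    return (1.0 + 0.8 * command) * carrier

# Stand-in for a recorded voice command: a one-second 440 Hz test tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 440 * t)

signal = modulate_command(command)
signal /= np.abs(signal).max()  # normalise to avoid clipping
wavfile.write("ultrasonic_payload.wav", SAMPLE_RATE, (signal * 32767).astype(np.int16))
```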

The Zhejiang University researchers commented: “This serves as a wake-up call to reconsider what functionality and levels of human interaction shall be supported in voice controllable systems.”

A smart speaker, by design, is always listening to the area around it in your home. Even after we say “Alexa, off,” she is still waiting and listening for the next keyword prompt.
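Conceptually, the always-on loop looks something like the sketch below. Everything here is a stand-in (real devices run a small on-device neural network over raw audio, not text, and this is not Amazon’s implementation); the point is the control flow: the buffer never stops filling, whatever you last said to the device.

```python
# Simplified sketch of an "always listening" wake-word loop. The frame
# source and detect_wake_word are hypothetical stand-ins for on-device
# keyword spotting; only the control flow is the point.
from collections import deque

WAKE_WORD = "alexa"
BUFFER_FRAMES = 50  # rolling window of recent audio frames

def detect_wake_word(frames: deque) -> bool:
    """Hypothetical spotter; real devices use a compact neural model."""
    return WAKE_WORD in " ".join(frames)

def listen_forever(stream):
    buffer = deque(maxlen=BUFFER_FRAMES)
    for frame in stream:
        buffer.append(frame)  # the device never stops filling this buffer
        if detect_wake_word(buffer):
            print("Wake word heard; streaming what follows to the cloud.")
            buffer.clear()

# Simulated ambient audio, transcribed for readability.
listen_forever(iter(["rain", "tomorrow", "alexa", "order", "a", "dollhouse"]))
```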

Essentially, we are inviting big business into our homes and making ourselves vulnerable to leaks that expose our private data and private lives.

Less than a year ago, Alexa recorded a couple’s private conversation and sent it to one of their contacts via an Echo messaging feature. Again, incidents like this leave us open to malicious attacks and at risk of exposing private information. So, next time you want to rant about your friend’s dubious new boyfriend, think again, because Alexa could betray your thoughts!

Attackers have long used unsecured wireless networks and poorly protected routers as a way into a home’s technology. Smart speakers, connected to those same networks, can become another playground for the conniving hacker.

Here, the hacker uses a bridge attack: compromising one device in order to reach others. Audio is a perfect vehicle for this kind of manipulation, with televisions, car stereos, and smart speakers all able to issue commands to an unsuspecting device.

This attack was made famous by an American TV anchorman who told his audience, “I love the little girl saying, ‘Alexa ordered me a dollhouse.’” That sentence alone was enough of a verbal cue for Alexa to obediently start shopping for a dollhouse in many viewers’ homes.

Remember, Alexa does not discriminate. She will obey any voice that is able to use her keyword. If you can speak clearly enough for her to understand, she will try to help.

Although filters can be put in place so that Alexa and other voice-activated technologies only respond to sounds within the human hearing range, cyber criminals are always looking for fresh and inventive ways of exploiting the reliable and obliging digital assistant.
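The sketch below illustrates the idea of such a filter: a low-pass stage that discards everything above roughly 20 kHz before audio reaches the speech recogniser. The cutoff and filter order are illustrative assumptions, and it is worth noting this is purely conceptual as a defence against DolphinAttack, since there the demodulation happens in the microphone hardware before any digital filtering can run.

```python
# Conceptual sketch of a "human hearing range" filter: attenuate content
# above ~20 kHz with a low-pass Butterworth filter. Cutoff and order are
# illustrative assumptions, not a vendor's actual defence.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 96_000
CUTOFF_HZ = 20_000  # approximate upper limit of human hearing

def strip_ultrasound(audio: np.ndarray) -> np.ndarray:
    """Attenuate everything above CUTOFF_HZ with an 8th-order Butterworth."""
    sos = butter(8, CUTOFF_HZ, btype="low", fs=SAMPLE_RATE, output="sos")
    return sosfilt(sos, audio)

# Demo: a 1 kHz "voice" tone plus a 30 kHz ultrasonic component.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
audio = np.sin(2 * np.pi * 1_000 * t) + np.sin(2 * np.pi * 30_000 * t)
clean = strip_ultrasound(audio)  # the 30 kHz component is largely removed
```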

Now that we have successfully scared our readership: “Alexa, off.”
