Your smart speakers could be SPYING on you, especially now that you’re working from home
03/27/2020 // Franz Walker

As people shift to working from home under measures to stem the spread of the global coronavirus outbreak, a new security threat has reared its ugly head. Lawyers are now warning that smart speakers, such as Amazon's Alexa-powered devices, could be listening in on confidential meetings held at home.

U.K. law firm Mishcon de Reya issued advice to staff to mute or shut down listening devices like Amazon’s Echo or Google’s voice assistant when talking about client matters at home. The firm also suggested that staff not have any such devices near their workspace.

The warning from Mishcon covers any sort of visual-enabled or voice-enabled device, such as the aforementioned smart speakers from Amazon and Google. However, Joe Hancock, who heads Mishcon de Reya’s cybersecurity efforts, said that video products such as Ring, also owned by Amazon, as well as baby monitors and even closed-circuit TV, are also a concern.

“Perhaps we’re being slightly paranoid, but we need to have a lot of trust in these organizations and these devices,” said Hancock. “We’d rather not take those risks.”

He added that the firm is worried about these devices being compromised, especially cheap knock-offs.

Smart speakers pose a security risk

Law firms are currently facing challenges trying to create work-from-home arrangements for specific job functions while maintaining security. Alongside confidential discussions, critical documents and communications also need to be secured. This mirrors the situation faced by banks on Wall Street, where some traders are now being asked to work from alternative locations that banks keep on standby for disaster recovery, instead of from home, to maintain confidentiality.


Smart speakers have already become notorious for activating in error and making unintended purchases, or sending snippets of audio to Amazon or Google. A report from Consumer Intelligence Research Partners puts their installed base at 76 million units and growing, which has placed the devices under increasing scrutiny from cybersecurity experts.

Devices can start recording even by accident

For their part, Amazon and Google claim that their devices are designed to record and store audio only after they detect a keyword that wakes them up. Both companies say that instances of inadvertent activation are rare. However, a recent study by Northeastern University and Imperial College London found that such accidental activations can happen between 1.5 and 19 times a day.

“Anyone who has used voice assistants knows that they accidentally wake up and record when the ‘wake word’ isn’t spoken – for example, ‘seriously’ sounds like the wake word ‘Siri’ and often causes Apple’s Siri-enabled devices to start listening,” stated the study.

“There are many other anecdotal reports of everyday words in normal conversation being mistaken for wake words,” continued the report. “Our team has been conducting research to go beyond anecdotes through the use of repeatable, controlled experiments that shed light on what causes voice assistants to mistakenly wake up and record.”

Companies are listening in

One of the more concerning things about these devices is that the companies behind them are actively listening in. Last year, Amazon admitted not only that Alexa saved audio recordings even after users deleted them, but also that its employees were actively listening to some of those recordings.

“This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” Amazon said in a statement after the fact came to light.

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow.”

Despite this, Amazon does not explicitly state in its terms and conditions that employees review customer recordings. That said, the privacy settings for Alexa do offer users the chance to opt out of helping the firm "develop new features."

Sources include:

SeattleTimes.com

Independent.co.uk 1

Independent.co.uk 2
