According to a new report by market research firm Canalys, there will be more than 200 million smart speakers in homes across the world by the end of this year. Devices like the Amazon Echo, Apple HomePod and Google Home were extremely popular Christmas gifts last year – and probably will be this year too.
The ability to stream millions of songs, answer basic questions and build shopping lists has helped establish digital assistants as a useful addition to our homes. But questions about privacy and security continue to cause problems.
Always on, always listening?
Smart speakers are designed to respond to specific keywords, so they know when you are talking to them. Say the trigger word (“Hey Siri”, “Alexa”, “OK Google”) and the speaker knows that the words that follow are an instruction – which means the speaker really is always listening.
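To make that idea a little more concrete, here is a minimal Python sketch of how a wake-word loop works in principle. It is not code from any real smart speaker – the names are invented and plain text stands in for audio – but it shows the basic pattern: everything in the room is heard locally, while only the speech that follows the trigger word is kept for upload.

```python
# Illustrative sketch only: names and the text-based "audio" are made up.
# It shows the general idea of a wake-word loop: audio is checked
# continuously on the device, but nothing is buffered for upload until
# the trigger word is detected.

WAKE_WORD = "alexa"  # hypothetical trigger word for this sketch


def listen(audio_chunks):
    """Scan a stream of (already transcribed) audio chunks and return
    only the speech captured after the wake word."""
    captured = []
    triggered = False
    for chunk in audio_chunks:
        if triggered:
            # Only speech following the trigger word is kept for upload.
            captured.append(chunk)
        elif WAKE_WORD in chunk.lower():
            triggered = True  # wake word heard; start capturing

    return captured


if __name__ == "__main__":
    # Simulated conversation: everything before "Alexa" is heard locally
    # but discarded; everything after it is captured.
    stream = ["did you lock the door", "Alexa", "play some jazz"]
    print(listen(stream))  # -> ['play some jazz']
```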
The good news is that only the words spoken after the trigger word are uploaded and stored in the service provider’s cloud. This may not always be the case, however – Google already owns a patent for technology that monitors background sounds in order to serve advertisements to the device owner, and Amazon filed a similar patent just last year.
It would be relatively easy to integrate this feature into smart speakers at some point in the future – so privacy may be about to become much harder to protect.
Is your voice data already being misused?
Just last month, reports surfaced that Amazon employees have been accessing private recordings created by Alexa owners. Amazon insists that these recordings are being checked to ensure Alexa is responding to commands correctly.
But Amazon employees are also being asked to review accidental recordings captured when Alexa has been triggered incorrectly. Bloomberg claims that these recordings included a woman singing in the shower and a child screaming for help. And unlike Apple’s voice analysis services, Amazon recordings are not anonymous – workers listening to the recordings know the name of the person who owns the Echo speaker.
How to protect your Alexa recordings
Unfortunately there is no way to prevent your Alexa recordings from being reviewed by Amazon. You can, however, delete them – and if you delete them quickly enough, Amazon will not have an opportunity to listen to them.
You can access (and listen to) all of your recordings on the Alexa Privacy Settings webpage. All of the recordings have to be deleted manually – there is no way to set them to delete automatically – so you will need to remember to log in and delete recordings regularly. There are more instructions in our blog post, Your Virtual Assistant Knows Quite a Lot about You.
Stay safe
Whether you own an Amazon Echo or an Apple HomePod, you still need to keep yourself safe. Unfortunately, this means you will probably have to be more careful about what you say at home – and never say anything that sounds like the trigger word unless you do want to talk to your digital assistant.
Voice assistants are still an emerging technology, and tech giants like Amazon, Google and Apple are still trying to strike the right balance between functionality and privacy. Given the negative response to this latest Amazon data sharing incident, we should (hopefully) see these companies take action sooner rather than later.