Posted January 24, 2017

Arkansas police are hoping to use an Amazon Echo found at a murder scene, along with its recordings, to help with their investigation. An Echo only begins recording after it hears its wake word, but background noise or chatter could have activated the device.
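To see why background chatter matters, it helps to picture how wake-word activation works in principle: the device keeps only a short rolling buffer in memory, and nothing is recorded or sent anywhere until something in the audio resembles the wake word. The sketch below is a simplified illustration of that idea, not Amazon's actual implementation; the `sounds_like_wake_word` detector and the simulated microphone are hypothetical stand-ins, with an artificial false-trigger rate to mimic stray conversation setting the device off.

```python
import random
from collections import deque

def sounds_like_wake_word(frame) -> bool:
    """Hypothetical stand-in for an on-device keyword detector.

    The small random chance of returning True simulates background
    chatter occasionally being mistaken for the wake word.
    """
    return random.random() < 0.01

def microphone_frames(n=500):
    """Simulated audio frames; a real device would read from a microphone."""
    for i in range(n):
        yield f"frame-{i}"

def main():
    rolling_buffer = deque(maxlen=10)  # short pre-roll kept only in memory
    recording = []
    listening = False

    for frame in microphone_frames():
        rolling_buffer.append(frame)
        if not listening and sounds_like_wake_word(frame):
            # Activation: start recording, including the buffered pre-roll.
            listening = True
            recording = list(rolling_buffer)
        elif listening:
            recording.append(frame)
            if len(recording) >= 30:  # stop after a fixed window
                print(f"Would upload {len(recording)} frames to the cloud")
                listening = False
                recording = []

if __name__ == "__main__":
    main()
```

The point of the sketch is simply that anything captured after a trigger, intentional or accidental, is what ends up stored on the company's servers.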

Amazon stores the voice recordings from its devices on its servers. As a user, you can delete your personal voice data, but there is no way to prevent Amazon from saving that data to its servers in the first place. Amazon has said it does not release customer information without a "valid and binding legal demand."

While this might not sound like much to the average user, remember that the Echo could be picking up background conversations, including ones in which you mention personal information such as credit card numbers, addresses, Social Security numbers, or other identifying data.

Be aware of the voice recordings tied to your Echo and delete them regularly through the Alexa app or your Amazon account. While there have been no reported cases of mass hacking of these devices, leaving that information in place only makes it easier for criminals to get at your personal data. This is especially true if you use an Echo at your place of business.
