Hackers can use smart speakers with Alexa and Google Home to listen to users; That is the conclusion of the latest research published by experts from Security Research Labs (SRLabs), demonstrated with the development of spy apps for these platforms.
The study reveals possible problems in the application approval process for Google Home and Alexa. Specifically, we are talking about the “Actions” of Home and the “Skills” of Alexa, which allow us to develop new options for our smart speakers.
Thanks to this, it is possible to increase the possibilities of our Amazon Echo or Google Nest, with functions that were not originally available. However, according to German hackers, they can also be an important security hole.
Spy apps on Alexa and Google Home
To prove it, SRLabs developed eight applications that seem harmless to the naked eye, but once installed they are able to obtain information and audios without us noticing.
Four were “Skills” from Alexa and another four “Actions” from Google Home; and all were approved by Amazon and Google respectively. Seven of these programs were based on reciting our horoscope, and one was a random number generator. They were activated just by saying out loud questions like “Ask My Lucky Horoscope to tell me the horoscope for Taurus,” for example.
Initially the apps worked as expected, simply answering the question, and even saying goodbye with a “Goodbye”, indicating that it was over; But in reality, the app does not end, but still works in silent mode . In this way, it seems to the user that Alexa and Google Home have taken control again, when in reality the speaker is still running the app.
In this state, the app can continue using the microphone to listen to everything we say; These conversations are registered and stored on a server owned by the developer.
Another of the apps developed instead tries to deceive by showing an error, indicating that it is not available in the user’s country; A common mistake, especially with apps that serve content. Actually, the app continues to work throughout this time.
After a moment, the app uses a voice similar to that of Alexa and Google Home to warn that the device is going to be updated, and that it needs the user’s password to continue. Of course, once you get the password you can access the user’s account and steal it or get their data and information.
How did they get it
These apps are based on similar errors in Alexa and Google Home; specifically, in the way their speech synthesis engines work, whereby attendees can “talk.” Hackers indicated that he had to say the character ” .” As it cannot be pronounced, the assistant is silent, a period during which the app continues to function without the user’s knowledge.
The SRLabs are “whitehat” hackers, or white hat hackers; That means that they informed Amazon and Google before making this investigation public. In response, both companies expelled the apps created by SRLabs and changed the approval methodology , in order to avoid the entry of similar apps.
Amazon says it has implemented “mitigations” to prevent and detect this type of behavior in the “skills”. Similarly, Google promises that it has implemented “additional mechanisms” to prevent such errors.
The popularization of smart speakers has brought a certain “paranoia” about what they are capable of doing; The idea of having a microphone at home, which always listens to us, is not exactly pleasant for many people.
It has not helped that the sector has been immersed in more than one controversy in its short life. Some of these scandals are born out of a simple ignorance of how these devices work. In other cases they have forced these manufacturers to react; as with the internal programs that allowed employees to listen to audio fragments to improve the system.