Security researchers from Chinese tech giant Tencent recently discovered a serious vulnerability in the Amazon Echo. The flaw is considered serious because it allows attackers to covertly listen in on users’ conversations without their knowledge.
In a presentation at the DEF CON security conference, titled ‘Breaking Smart Speakers: We are Listening to You,’ the researchers explained precisely how they assembled a doctored Echo speaker and used it to gain access to other Echo devices.
‘After several months of research, we successfully break the Amazon Echo by using multiple vulnerabilities in the Amazon Echo system, and [achieve] remote eavesdropping. When the attack [succeeds], we can control Amazon Echo for eavesdropping and send the voice data through network to the attacker.’
The researchers exploited Amazon’s Home Audio Daemon, which the device uses to communicate with other Echo devices on the same wireless network, to ultimately take control of users’ speakers. Through it, they could silently record conversations or even play arbitrary sounds.
The attack, though, marks the first time researchers have identified a noteworthy security flaw in a popular smart speaker such as the Amazon Echo. The researchers have since informed Amazon of the issue, and the company said it issued a software patch to users in July. They also note that the attack requires physical access to an Echo device.
In any case, Amazon and the researchers both note that the technique identified is highly sophisticated and in all probability too difficult for an average hacker to carry out. ‘Customers do not need to take any action as their devices have been automatically updated with security fixes,’ says an Amazon spokesperson.
Yet, some have pointed out that the attack could still be carried out in places where multiple Echo devices share the same network, the simplest examples being hotels or restaurants.
Nonetheless, earlier this year, researchers from the University of California, Berkeley identified a flaw whereby hackers could not only control prominent voice assistants such as Alexa, Siri, and Google Assistant, but could also slip inaudible voice commands into audio recordings, directing a voice assistant to do a wide range of things, from taking pictures to launching websites and making phone calls.