Laser attack on microphones of voice control systems

Researchers from the University of Michigan and Osaka University have developed a new attack technique called Light Commands that makes it possible to remotely simulate voice commands with a laser on devices that support voice control, such as smart speakers, tablets, smartphones, and smart home systems built on Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri. In experiments, the researchers demonstrated covert injection of a voice command from a distance of 75 meters through window glass and 110 meters in open space.

The attack is based on the photoacoustic effect: absorption of changing (modulated) light by a material causes thermal excitation of the medium, a change in the material's density, and the appearance of sound waves that the microphone membrane picks up. By modulating the laser power and focusing the beam on the microphone port, an attacker can induce sound vibrations that are inaudible to bystanders but are registered by the microphone.
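The core of the technique is plain amplitude modulation: the light intensity tracks the waveform of the recorded command. Below is a minimal sketch of that modulation step, assuming a prerecorded mono command in a hypothetical file "command.wav" and a laser driver that accepts a normalized 0..1 intensity signal (the researchers used a dedicated laser driver fed by an audio amplifier; the file name and output format here are illustrative only).

```python
# Sketch: amplitude-modulate a recorded voice command onto laser intensity.
import numpy as np
from scipy.io import wavfile

rate, audio = wavfile.read("command.wav")   # hypothetical recording of a wake phrase + command
if audio.ndim > 1:
    audio = audio[:, 0]                     # keep one channel for simplicity
audio = audio.astype(np.float64)
audio /= np.max(np.abs(audio))              # normalize to [-1, 1]

bias = 0.5    # DC operating point: the laser stays on at half intensity
depth = 0.4   # modulation depth, chosen so the result stays within [0, 1]
intensity = bias + depth * audio            # AM: light power follows the audio waveform

# Via the photoacoustic effect, these intensity variations become pressure
# waves at the MEMS diaphragm, so the assistant "hears" the original command.
np.save("laser_intensity.npy", intensity)   # placeholder for feeding the driver
```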


The attack affects the MEMS (microelectromechanical systems) microphones used in modern devices.
Consumer devices tested for susceptibility include various models of Google Home, Google NEST, Amazon Echo, Echo Plus/Spot/Dot, Facebook Portal Mini, Fire TV Cube, Ecobee 4, iPhone XR, iPad 6th Gen, Samsung Galaxy S9, and Google Pixel 2, as well as smart locks and the voice control systems of Tesla and Ford cars. With the proposed attack method, an attacker can simulate a command to open a garage door, make online purchases, try to brute-force the PIN code of a smart lock, or start a car that supports voice control.

In most cases, a laser power of 50 mW is sufficient to carry out the attack at a distance of more than 60 meters. The demonstrated attack used a $14-$18 laser pointer, a $339 Wavelength Electronics LD5CHA laser driver, a $28 Neoteck NTK059 audio amplifier, and a $200 Opteka 650-1300mm telephoto lens. To focus the beam precisely on a distant device, the experimenters used a telescope as an optical sight. At close range, an unfocused bright light source, such as the Acebeam W30 flashlight, can be used instead of a laser.


The attack typically does not require simulating the owner's voice, since voice recognition is usually applied only at the device-access stage (matching the wake phrase "OK Google" or "Alexa", which can be recorded in advance and then used to modulate the signal during the attack). Voice characteristics can also be faked with modern machine-learning-based speech synthesis tools. To block the attack, manufacturers are advised to add extra user authentication channels, to cross-check data from two microphones, or to install a barrier in front of the microphone that blocks the direct passage of light.
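The two-microphone mitigation exploits the fact that a focused laser beam excites only one diaphragm, while real sound reaches both. A hedged sketch of one way such a check could look follows; the function name, frame inputs, and threshold are hypothetical and not taken from any vendor's firmware.

```python
# Sketch: reject commands that appear on only one of two microphones.
import numpy as np

def looks_like_real_sound(mic_a: np.ndarray, mic_b: np.ndarray,
                          min_correlation: float = 0.5) -> bool:
    """Return True when both synchronized channels carry a correlated signal."""
    a = mic_a - mic_a.mean()
    b = mic_b - mic_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False                      # one channel is silent: likely light injection
    correlation = float(np.dot(a, b) / denom)
    return correlation >= min_correlation

# Usage idea: accept a voice frame only if both microphones heard it.
# if not looks_like_real_sound(frame_a, frame_b):
#     discard_command()
```

A real implementation would compare signals per frequency band and tolerate normal inter-microphone delays, but the principle is the same: a command visible on a single channel is suspicious.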

Source: opennet.ru
