Can AI Steal Data Through Keyboard Sound? Researchers Shed Light on New Threat

AI technology continues to evolve at a rapid pace, capturing the imagination of researchers and enthusiasts alike. But alongside its exciting potential, AI also brings risks that need to be addressed. One such risk has recently been highlighted by researchers who trained a deep learning model to infer what a person is typing purely from the sound of their keystrokes.

According to reports, a team of researchers from a British university has developed an algorithm that can decipher keystrokes from audio captured by a laptop or PC's microphone. The model identifies keys with up to 95 percent accuracy. When the team tested it on audio transmitted over popular communication apps, accuracy dropped only slightly: to 93 percent for Zoom and 91.7 percent for Skype.

These findings raise serious concerns about how attackers could exploit such a technique. Imagine an AI model deciphering your keystrokes as you log into your banking account, recovering your password and enabling unauthorized access. The researchers acknowledge that publishing the technique could inadvertently assist attackers, and they stress the need for stronger security measures.

To mitigate this risk, experts suggest the use of virtual keyboards that prevent keystrokes from being deciphered through sound recognition. Implementing unique typing patterns or creating random complex passwords can also provide an additional layer of protection.

The researchers conducted their experiments by pressing each of 36 keys on a MacBook multiple times and recording the sound with an iPhone 13 Mini positioned 17 cm from the laptop. The recordings were converted into spectrogram images and used to train an image classifier called CoAtNet, which then predicted which key had been pressed from its sound alone.
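The pipeline described above — record a keystroke, turn the audio into a spectrogram, classify it — can be sketched in miniature. The example below is a toy illustration, not the researchers' actual method: it generates synthetic "keystroke" sounds with made-up per-key frequencies, and stands in a simple nearest-centroid classifier for CoAtNet. The sample rate, window sizes, and key frequencies are all assumptions for the demo.

```python
import numpy as np

SR = 44_100   # sample rate in Hz (assumption for this demo)
N_FFT = 1024  # STFT window size
HOP = 256     # hop between successive windows

def spectrogram(wave):
    """Log-magnitude spectrogram via a simple Hann-windowed STFT."""
    win = np.hanning(N_FFT)
    frames = [wave[i:i + N_FFT] * win
              for i in range(0, len(wave) - N_FFT, HOP)]
    mag = np.abs(np.fft.rfft(np.stack(frames), axis=1))
    return np.log1p(mag)

def fake_keystroke(freq, rng, dur=0.1):
    """Synthetic 'keystroke': a decaying tone plus noise (stand-in for real audio)."""
    t = np.arange(int(SR * dur)) / SR
    tone = np.exp(-30 * t) * np.sin(2 * np.pi * freq * t)
    return tone + 0.05 * rng.standard_normal(t.size)

rng = np.random.default_rng(0)
keys = {"a": 900.0, "s": 1400.0, "d": 2100.0}  # hypothetical per-key resonances

# "Training": the average spectrogram per key serves as a nearest-centroid template,
# a crude substitute for the CoAtNet classifier used in the study.
templates = {k: np.mean([spectrogram(fake_keystroke(f, rng)) for _ in range(5)], axis=0)
             for k, f in keys.items()}

def classify(wave):
    """Return the key whose template spectrogram is closest to this recording."""
    spec = spectrogram(wave)
    return min(templates, key=lambda k: np.linalg.norm(spec - templates[k]))

print(classify(fake_keystroke(1400.0, rng)))
```

Because each key's sound has a distinct spectral fingerprint, even this crude distance-based matcher separates them; the real attack applies the same idea with a far more powerful image classifier on recordings of physical keys.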

As AI technology continues to advance, it is vital for researchers, tech companies, and individuals to remain vigilant and take necessary precautions to safeguard sensitive information. Adapting to evolving threats and implementing robust security measures will be essential to ensuring a secure digital future.


Can AI really steal data through keyboard sound?

Yes, researchers have trained a deep learning model to decipher typing patterns by analyzing keyboard sound captured through a microphone.

What is the accuracy of this AI model?

The model recognizes keystrokes from their sound with up to 95 percent accuracy, dropping only slightly when the audio is captured over apps such as Zoom or Skype.

How can individuals protect themselves from this threat?

To protect against potential attacks, individuals can use virtual keyboards, create complex passwords, or modify their typing patterns to make it harder for AI algorithms to decipher keystrokes.

What are the implications of this research for cybersecurity?

This research highlights the need for increased security measures and awareness as AI technology evolves. It underscores the importance of staying informed and adopting proactive security measures to safeguard sensitive information.