A new study has drawn attention to an emerging privacy risk posed by artificial intelligence. In a paper published on arXiv, Cornell University's preprint repository, a team of researchers describes a technique that could put user passwords at risk: a deep-learning model that can covertly recover sensitive information such as passwords simply by listening to the sound of users typing. This kind of “acoustic side-channel attack” gives cybercriminals a stealthy new way to steal personal information.
The researchers began their experiment by recording the sounds of keystrokes and using those recordings to train a deep-learning classifier. The trained model proved able to accurately identify individual keystrokes from their sound alone, and its deductions become even stronger when its predictions are combined with a language model. “When trained on keystrokes recorded by a nearby phone, the [software] achieved an accuracy of 95%, the highest accuracy seen without the use of a language model,” the researchers wrote in the abstract of their paper. That success rate has caught the attention of the cybersecurity community, which is now grappling with the possibility of data theft that requires no physical intrusion at all.
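The paper's pipeline reportedly isolates each keystroke from a recording, converts it into a mel spectrogram, and feeds that image to a deep classifier. As a rough illustration of that idea only, here is a minimal sketch in PyTorch and torchaudio; the small CNN, the 36-key class count, the clip length, and the synthetic training batch are all illustrative assumptions, not the authors' actual code or model.

```python
import torch
import torch.nn as nn
import torchaudio

# Assumed setup: 36 key classes, 1-second keystroke clips at 16 kHz.
SAMPLE_RATE = 16_000
NUM_CLASSES = 36

# Turn a mono keystroke snippet into a mel spectrogram "image"
# that a vision-style classifier can consume.
to_mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)

class KeystrokeCNN(nn.Module):
    """Small stand-in for the paper's classifier (not the real model)."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        return self.net(x)

model = KeystrokeCNN(NUM_CLASSES)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch: random noise standing in for real keystroke audio,
# since the study's dataset is not reproduced here.
waveforms = torch.randn(8, 1, SAMPLE_RATE)      # (batch, channel, samples)
labels = torch.randint(0, NUM_CLASSES, (8,))

spectrograms = to_mel(waveforms)                # (batch, 1, n_mels, frames)
logits = model(spectrograms)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
print(f"one training step, loss = {loss.item():.3f}")
```

In a real attack, the random tensors would be replaced by labeled recordings of individual keystrokes, which is exactly why the technique needs nothing more exotic than a nearby microphone.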
It is essential to be aware of the threat and take precautions to protect your typing privacy. Varying your typing style and using randomized passwords are recommended ways to thwart these eavesdropping attempts. Beyond that, software that mimics keystroke sounds, generates white noise, or applies audio filters can further muddy the signal a would-be eavesdropper depends on. Stay vigilant and stay safe.
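As a toy example of the white-noise idea, the snippet below plays continuous masking noise through the speakers while you type. The use of the sounddevice library and the chosen amplitude are assumptions for illustration; this is a sketch of the general principle, not a vetted defense.

```python
import numpy as np
import sounddevice as sd  # assumed dependency: pip install sounddevice

SAMPLE_RATE = 44_100   # standard audio sample rate
AMPLITUDE = 0.05       # illustrative level: audible, but not deafening

def noise_callback(outdata, frames, time, status):
    """Fill each outgoing audio buffer with fresh Gaussian white noise."""
    outdata[:] = AMPLITUDE * np.random.randn(frames, 1).astype(np.float32)

# Stream noise until interrupted; run this while typing sensitive input.
with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1,
                     callback=noise_callback, dtype="float32"):
    print("Masking noise playing; press Ctrl+C to stop.")
    try:
        sd.sleep(int(1e9))
    except KeyboardInterrupt:
        pass
```

Broadband noise like this raises the floor an attacker's classifier has to work against, though a determined adversary with clean training data and filtering may still recover some keystrokes, so it complements rather than replaces strong, randomized passwords.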