Friday, 26 May 2017

Posted at 00:23 by admin
Researchers have used machine learning techniques to make a computer-fooling “masterprint”.
Fingerprints are supposed to be unique markers of a person’s identity. Detectives look for fingerprints in crime scenes. Your phone’s fingerprint sensor means only you can unlock the screen. The truth, however, is that fingerprints might not be as secure as you think – at least not in an age of machine learning.
A team of researchers has demonstrated that, with the help of neural networks, a “masterprint” can be used to fool verification systems. A masterprint, like a master key, is a fingerprint that can open many different doors. In the case of fingerprint identification, it does this by tricking a computer into thinking the print could belong to any of a number of different people.
The team trialled two methods for generating masterprints using Generative Adversarial Networks (GANs), a type of machine-learning algorithm. In both methods, the researchers trained a GAN to create partial fingerprint images based on a series of actual fingerprint images. The differences between the two methods then get quite technical: one is “based on evolutionary optimisation of latent variables”, while the other “uses gradient descent to find latent variables that maximize the number of activated outputs”.
“Our method is able to design a MasterPrint that a commercial fingerprint system matches to 22% of all users in a strict security setting, and 75% of all users at a looser security setting,” the researchers – Philip Bontrager, Julian Togelius and Nasir Memon – claim in a paper.
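To give a rough feel for the first approach, here is a minimal sketch of evolutionary search over a generator’s latent variables. Everything in it is a stand-in: the “generator” and the “matcher” are toy functions invented for illustration (the real work uses a trained GAN and a commercial fingerprint system), and the (1+λ) hill-climbing loop is only one simple flavour of evolutionary optimisation.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8   # size of the latent vector the search explores
N_USERS = 50     # number of enrolled users in the toy matcher

# Toy "generator": a fixed random linear map plus tanh, standing in for
# a trained GAN that turns a latent vector into a partial fingerprint.
W = rng.normal(size=(LATENT_DIM, 16))

def generate(z):
    return np.tanh(z @ W)

# Toy "matcher": each user is a random template; a print "matches" a user
# when its cosine similarity to that template exceeds a threshold.
templates = rng.normal(size=(N_USERS, 16))
templates /= np.linalg.norm(templates, axis=1, keepdims=True)

def match_count(print_vec, threshold=0.2):
    v = print_vec / (np.linalg.norm(print_vec) + 1e-9)
    return int(np.sum(templates @ v > threshold))

# (1+lambda) evolutionary search over latent variables: keep whichever
# latent vector generates the print that fools the matcher for the most users.
def evolve(generations=200, lam=20, sigma=0.3):
    best_z = rng.normal(size=LATENT_DIM)
    best_score = match_count(generate(best_z))
    for _ in range(generations):
        for _ in range(lam):
            candidate = best_z + sigma * rng.normal(size=LATENT_DIM)
            score = match_count(generate(candidate))
            if score > best_score:
                best_z, best_score = candidate, score
    return best_z, best_score

best_z, best_score = evolve()
print(f"masterprint matches {best_score} of {N_USERS} toy users")
```

The key point the sketch illustrates is that the attacker never needs a real fingerprint: the fitness function (how many users the matcher accepts) steers the search through latent space toward prints that coincidentally resemble many people at once.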
The outputs were then tested on three proxy recognition systems (convolutional neural networks) and two external fingerprint recognition systems. The generated prints were able to convince all of these, to varying degrees, that they belonged to many individuals. The implication is that biometric security systems could be breached with these masterprints, which could be printed onto – say – custom gloves. While this will no doubt be music to the ears of spies and international assassins, it also has the potential to undermine day-to-day security in a huge number of situations, from unlocking smartphones to passing through US border security.
The researchers say they want to continue testing “in the wild”, against smartphone fingerprint recognition and other types of real-world authentication systems. 