In the paper presented at the BTAS 2018 conference on biometric security, the researchers explain the two observations behind DeepMasterPrints. First, for ergonomic reasons many fingerprint sensors are very small (as in smartphones), so they capture and match only part of the user's fingerprint. Because reliably identifying someone from a small partial print is much harder than from a full print, the probability that a partial print spuriously matches part of a print from someone else's finger is comparatively high. Researcher Aditi Roy built on this observation and introduced the concept of MasterPrints: real or synthetic fingerprints that happen to match a large number of other fingerprints.
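To see why partial matching raises the false-match risk, consider a sensor that stores several partial templates per enrolled finger and accepts if any one of them matches. The sketch below illustrates the arithmetic only; the false-match rate and template count are hypothetical numbers, not figures from the paper.

```python
# Illustrative only: why partial-print matching raises false-match odds.
# The 0.01% per-comparison rate and the 12 stored partials are assumptions
# chosen for the example, not values reported by the researchers.
def false_match_prob(fmr_per_comparison: float, stored_partials: int) -> float:
    """Chance that an impostor print matches at least one of the partial
    templates a small sensor stores for an enrolled finger."""
    return 1 - (1 - fmr_per_comparison) ** stored_partials

# A single full-print comparison at a 0.01% false-match rate:
full = false_match_prob(0.0001, 1)
# A small sensor storing 12 partial templates per finger, each compared
# independently at the same per-comparison rate:
partial = false_match_prob(0.0001, 12)
print(f"{full:.6f} vs {partial:.6f}")  # the partial setup is roughly 12x riskier
```

The "accept if any partial matches" policy is what a MasterPrint exploits: each extra stored template is another chance for a well-chosen fake to slip through.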
The second observation is that some fingerprints share common characteristics with one another. A fake fingerprint that contains many of these common features therefore has a realistic chance of matching many other fingerprints.
Building on this, the researchers used a machine learning technique called a generative adversarial network (GAN) to synthesize new fingerprints that match as many real fingerprints as possible. In this way they built a library of artificial fingerprints that act as master keys against a given biometric identification system. Notably, the attack does not require a fingerprint sample from a specific person: it can be launched against anonymous targets and still succeed with some probability.
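A GAN trains two models against each other: a generator that produces fakes and a discriminator that tries to tell fakes from real samples. The real system trains a convolutional GAN on fingerprint images; the toy below reproduces only the adversarial training loop on 1-D data (real samples drawn from N(4, 0.5)), with all sizes, learning rates, and model shapes being illustrative assumptions.

```python
import numpy as np

# Toy GAN sketch, not the DeepMasterPrints implementation: the generator is
# a linear map g(z) = a*z + b, the discriminator a logistic classifier.
rng = np.random.default_rng(0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

a, b = 1.0, 0.0   # generator parameters (maps noise z ~ N(0,1) to samples)
w, c = 0.1, 0.0   # discriminator parameters: D(x) = sigmoid(w*x + c)
lr = 0.02

for step in range(3000):
    z = rng.normal(size=32)
    real = rng.normal(4.0, 0.5, size=32)   # "real fingerprints" stand-in
    fake = a * z + b

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * np.mean((1 - d_real) * real - d_fake * fake)
    c += lr * np.mean((1 - d_real) - d_fake)

    # Generator: gradient ascent on log D(fake) (non-saturating loss),
    # i.e. make the fakes look "real" to the discriminator.
    d_fake = sigmoid(w * fake + c)
    grad_g = (1 - d_fake) * w      # gradient of log D with respect to each fake
    a += lr * np.mean(grad_g * z)
    b += lr * np.mean(grad_g)

print(f"generated mean is about {b:.2f}; real data mean is 4.0")
```

In the published work the generator alone is not enough: after training, the authors also search the generator's latent space for inputs whose outputs match as many enrolled fingerprints as possible, which is how individual MasterPrints are selected for the library.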
Although it would be very difficult for an attacker to use something like DeepMasterPrints in practice, because tuning the models to a particular system takes a lot of work and every system is different, it is an example of what may become possible over time and is worth knowing about. Something similar was seen this year at the Black Hat security conference, where IBM researchers demonstrated a proof of concept for malicious software that uses artificial-intelligence-based face recognition in its attacks.