I was recently privy to a government project with a requirement to match "100 million fingerprints per second". Yes, you read that right. I had to do a double take as well, because outside FBI and Her Majesty's government circles it's a rare requirement. And no, I don't think anybody was, or is, trying to steal public funds; the process has been very transparent. Anyway, it turns out it's possible to do with fewer than two dozen HP servers. It's not easy, but it's very possible if you know how to deploy microservices in a highly parallel configuration.
The example I gave earlier applies to scenarios such as ID card verification, where you know who the assumed identity is, have probably done some first-level authentication, and just want to increase confidence with a biometric match. On a national scale, you don't want to do a 1-to-180-million check. Instead, you hash the profile into a bucket, which narrows the search to roughly 180,000 potential biometrics to match against, cutting your work by a factor of 1,000 (three orders of magnitude). With that, you can get a good match in less than a second on just two servers.
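To make the bucketing idea concrete, here's a minimal sketch in Python. The profile attributes (`region`, `birth_year`), the bucket count, and all the function names are my own illustrative assumptions, not details from the actual project; a real deployment would shard on whatever stable, coarse attributes the system captures.

```python
from collections import defaultdict

NUM_BUCKETS = 1000  # assumed: ~180M records / 1000 buckets ~= 180K per bucket

def profile_hash(profile, num_buckets=NUM_BUCKETS):
    # Hash coarse, stable profile attributes to pick a shard.
    # Attribute names here are hypothetical.
    key = (profile["region"], profile["birth_year"])
    return hash(key) % num_buckets

def build_index(records, num_buckets=NUM_BUCKETS):
    # One-time indexing pass: group enrolled records by bucket.
    index = defaultdict(list)
    for rec in records:
        index[profile_hash(rec["profile"], num_buckets)].append(rec)
    return index

def candidates(index, query_profile, num_buckets=NUM_BUCKETS):
    # Only this bucket's records need the expensive biometric match,
    # so the 1:N comparison shrinks from N to roughly N / num_buckets.
    return index[profile_hash(query_profile, num_buckets)]
```

The expensive minutiae-matching step then runs only over `candidates(...)`, which is what makes sub-second national-scale lookup plausible on a couple of servers.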
Regarding the quality of scanners, I agree with you. A lot of the optical scanners leave much to be desired but the capacitive ones are pretty good.