Earlier this year, we learned that Facebook had attained near-human facial-recognition capabilities through its DeepFace software. This was a valuable step toward a fully automated biometric identity-management regime, but it represents only one side of the process. Machines are doing their part to learn to recognize humans, but are humans doing their part to learn to recognize the value of being recognized by machines?

The New York Times explored the subject this past weekend. After presenting a certain amount of naysaying and warnings from one human pioneer of biometrics, the paper brought helpful balance by talking with Aharon Zeevi Farkash, head of the company FST Biometrics:

Although the company has residential, corporate and government clients, Mr. Farkash's larger motive is to convince average citizens that face identification is in their best interest. He hopes that people will agree to have their faces recognized while banking, attending school, having medical treatments and so on.

If all "the good guys" were to volunteer to be faceprinted, he theorizes, "the bad guys" would stand out as obvious outliers. Mass public surveillance, Mr. Farkash argues, should make us all safer.

Farkash's company is already controlling security at the Knickerbocker Village apartment complex in Manhattan, the Times reports, with camera systems that unlock the doors when they identify a resident or other trusted figure (such as a biometrics-company executive) approaching. But it is not merely the technology that makes Farkash's approach so promising. Plenty of companies can deploy facial recognition technology.

What Farkash brings to the task is the willingness to think like a computer himself. Human cognition can be sloppy and imprecise. Machine-mediated identity processing is clean and binary: authorized/unauthorized; unlocked/locked. For facial recognition to reach its fullest potential, human society must embrace that binary logic, too.

So: good guys or bad guys? This is the sort of question that an automated surveillance system is prepared to answer. Also: friend or not-friend? Orderly or disorderly? Typical or atypical? Compliant or noncompliant?

Farkash explained how his technology can clarify such questions:

A private high school in Los Angeles also has an FST system. The school uses the technology to recognize students when they arrive—a security measure intended to keep out unwanted interlopers. But it also serves to keep the students in line.

"If a girl will come to school at 8:05, the door will not open and she will be registered as late," Mr. Farkash explained. "So you can use the system not only for security but for education, for better discipline."

Indeed, the student arriving five minutes late for school will be receiving an education. On time is on time. Late is late. You are learning well, humans.

[Image by Jim Cooke, photo via Shutterstock]