At Safience, a Biometrica company, we believe protecting people and protecting privacy are not in conflict. We built our systems around the core belief that public safety should never require mass surveillance or the sacrifice of civil liberties.
Every solution we have — from our real-time threat and victim identification systems to our law enforcement database — is designed to work without recording, transmitting, accessing, or retaining biometric data. We do not, and will not, conduct mass surveillance.
Our mission is simple:
Biometric data refers to measurable biological characteristics — like facial geometry (faceprints), fingerprints, iris patterns, and voiceprints — that can be used to identify people. Biometric data is highly sensitive because it is inherently linked to a person and, unlike passwords, cannot easily be changed if misused.
This is why laws like the:
…treat biometric data as special and require stronger safeguards.
Most companies doing real-time surveillance rely on one or more of five common practices.
We don't do any of them.
Instead, Biometrica operates differently:
| Common Practice | Biometrica's Approach |
| --- | --- |
| Video surveillance of everyone | No video is ever recorded or stored |
| Faceprints / biometric templates are built and retained | We do not generate, access, store, or transmit biometric templates |
| Automated real-time facial recognition and alerts | Human-in-the-loop oversight for every alert |
| Mass surveillance by default | Relevance-based, lawful alerts only |
| Data processed directly on cameras or devices | All matching is done in an isolated third-party black-box environment |
Biometrica does not perform biometric matching internally.
All biometric comparisons (facial matching) are performed by an independent, NIST-evaluated and approved third-party provider operating in a fully isolated, secure black-box environment to which Biometrica has no access.
This means:
This model removes the risk of misuse or unauthorized access to biometric data by Biometrica staff, law enforcement, or any other third parties.
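To make the isolation model concrete, here is a minimal sketch of how a caller could use an external black-box matcher while never generating or retaining biometric templates itself. The endpoint transport, field names, and response shape are illustrative assumptions for this sketch, not Biometrica's actual API.

```python
# Hypothetical sketch: calling an isolated third-party matching service so that
# the caller only ever handles an opaque record ID and a score, never a
# biometric template. All names here are assumptions for illustration.
import json
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass(frozen=True)
class MatchResult:
    candidate_id: str  # opaque record identifier, not biometric data
    score: float       # similarity score returned by the isolated matcher


def request_match(image_bytes: bytes,
                  send: Callable[[bytes], str]) -> Optional[MatchResult]:
    """Send a probe image to the black-box matcher via `send` and return only
    an opaque candidate ID and score. The raw image is not stored here, and no
    template is ever created or seen on the caller's side."""
    response = send(image_bytes)  # transport to the isolated service
    payload = json.loads(response)
    candidates = payload.get("candidates", [])
    if not candidates:
        return None  # no match suggested; nothing is retained
    top = candidates[0]
    return MatchResult(candidate_id=top["id"], score=top["score"])
```

The design point is that everything biometric stays on the far side of `send`: the caller's code path contains no template extraction and no biometric storage.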
Even after a match is suggested by the algorithm:
This sharply reduces false positives and the risk of bias or error. Audit trails for every alert also ensure algorithmic accountability and guarantee that our technology is used ethically and legally.
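The human-in-the-loop flow described above can be sketched as a small review step with an append-only audit trail. The class and field names are hypothetical, chosen only to illustrate the pattern of "no action without a recorded human decision."

```python
# Illustrative sketch of human-in-the-loop alert review with an audit trail.
# Every structure and name here is an assumption for illustration.
import datetime


class AlertReview:
    def __init__(self) -> None:
        # Append-only log: every decision is recorded for accountability,
        # whether the reviewer confirms or rejects the suggested match.
        self.audit_log: list = []

    def review(self, alert_id: str, reviewer: str, confirmed: bool) -> bool:
        """An algorithm-suggested match becomes an actionable alert only if a
        trained human reviewer confirms it; the decision is always logged."""
        self.audit_log.append({
            "alert_id": alert_id,
            "reviewer": reviewer,
            "confirmed": confirmed,
            "timestamp": datetime.datetime.now(
                datetime.timezone.utc).isoformat(),
        })
        return confirmed
```

Because rejections are logged alongside confirmations, the trail supports after-the-fact audits of both the algorithm's suggestions and the reviewers' decisions.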
Not all matches lead to an alert.
Example:
We design with context and common sense, helping avoid scarlet-letter-style labeling and protecting people from being unfairly flagged.
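Relevance-based alerting can be sketched as a simple gate: a match only surfaces when it is tied to a current, lawful basis. The category names below are hypothetical examples, not Biometrica's actual relevance criteria.

```python
# Minimal sketch of relevance-based alerting: matches without an active,
# lawful basis are dropped rather than surfaced. The basis categories are
# assumptions for illustration only.
RELEVANT_BASES = {"missing_person", "active_warrant", "court_order"}


def should_alert(match_basis) -> bool:
    """A biometric match with no current, lawful basis is never turned into
    an alert, so people are not flagged over stale or irrelevant records."""
    return match_basis in RELEVANT_BASES
```

The filter runs after matching but before any alert is issued, which is what keeps a technically correct match from becoming an unfair flag.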
Our safeguards include:
In public safety, it is tempting to prioritize security over liberty. We refused to make that choice. Instead, we chose both.
Every child recovered, every potential violent offender identified, every missing person reunited with their family — all without spying on everyone else — proves that you do not need mass surveillance to protect people effectively.
We invite you to ask questions, challenge us, and learn how privacy, ethics, and public safety can work together.
For inquiries or training:
Email: privacy@biometrica.com or leo@biometrica.com