Greater App Security with Face Verification
A walkthrough of using and implementing a face verification solution.
Identity verification is among the primary contributors to mobile app security. Because face data is unique to each person, it has been used to develop a major branch of identity verification: face recognition.
Face recognition has been widely applied in services we use every day, such as unlocking a mobile device, face-scan payment, access control, and more. Undoubtedly, face recognition delivers a streamlined verification process for these services. However, this kind of security is not foolproof. Face recognition can only detect faces; it cannot tell whether a face belongs to a real person, making it vulnerable to presentation attacks (PAs), including the print attack, replay attack, and mask attack.
This highlights the need for greater security features, paving the way for face verification. Although face recognition and face verification sound similar, they are in fact quite different. For example, a user is unaware of face recognition being performed, whereas they are aware of face verification. Face recognition does not require user collaboration, while face verification is often initiated by a user. Face recognition cannot guarantee user privacy, whereas face verification can. These fundamental differences showcase the heightened security features of face verification.
Truth be told, I only learned about these differences recently, which sparked my interest in face verification. I wanted to know how the technology works and integrate this verification feature into my own app. After trying several solutions, I opted for the interactive biometric verification capability from HMS Core ML Kit.
Introduction to Interactive Biometric Verification
This capability performs verification in an interactive way. During verification, it prompts a user to perform three of the following actions: blink, open their mouth, turn their head left or right, stare at the device camera, and nod. Utilizing key facial point technology and face tracking technology, the capability calculates the ratio of a fixed distance to a changing distance using consecutive frames, and compares each frame with the one following it. This helps interactive biometric verification check whether a detected face is of a real person, helping apps defend against PAs. The whole verification procedure works as follows: the capability detects a face in the camera stream, checks whether it belongs to a real person, and returns the verification result to the app. If verification succeeds, the user is given permission to perform the subsequent actions.
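To make the ratio idea concrete, here is a simplified sketch of my own (purely illustrative, not the SDK's internal code): using key facial points around one eye, the ratio of the changing eyelid gap to the fixed eye width drops sharply between consecutive frames when a real user blinks, while a printed photo keeps the ratio constant.

```java
import java.awt.geom.Point2D;

public class BlinkRatioSketch {
    // Ratio of the changing eyelid gap to the fixed eye width for one frame.
    static double eyeOpenRatio(Point2D.Double left, Point2D.Double right,
                               Point2D.Double top, Point2D.Double bottom) {
        double width = left.distance(right);   // roughly constant across frames
        double gap = top.distance(bottom);     // shrinks when the eye closes
        return gap / width;
    }

    // A real person produces a sharp frame-to-frame drop; a static photo does not.
    // The 0.25 / 0.10 thresholds are illustrative values, not the SDK's.
    static boolean looksLikeBlink(double prevRatio, double currRatio) {
        return prevRatio > 0.25 && currRatio < 0.10;
    }

    public static void main(String[] args) {
        // Frame 1: eye open (gap 12 px, width 40 px); frame 2: eye nearly closed (gap 2 px).
        double open = eyeOpenRatio(new Point2D.Double(0, 0), new Point2D.Double(40, 0),
                                   new Point2D.Double(20, -6), new Point2D.Double(20, 6));
        double closed = eyeOpenRatio(new Point2D.Double(0, 0), new Point2D.Double(40, 0),
                                     new Point2D.Double(20, -1), new Point2D.Double(20, 1));
        System.out.println(looksLikeBlink(open, closed)); // prints "true"
    }
}
```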
Not only that, I also noticed the verification capability provides a lot of assistance when it is in use as it will prompt the user to make adjustments if the lighting is poor, the face image is blurred, the face is covered by a mask or sunglasses, the face is too close to or far from the device camera, and other issues. In this way, interactive biometric verification helps improve user interactivity.
The capability offers two call modes, which are the default view mode and customized view mode. The underlying difference between them is that the customized view mode requires the verification UI to be customized.
I've tried on a face mask to see whether the capability could tell if it was me, and here is the result I got.
Successful defense!
Now let's see how the verification function can be developed using the capability.
Development Procedure
Preparations
Before developing the verification function in an app, there are some things you need to do first. Make sure that the Maven repository address of the HMS Core SDK has been set up in your project and the SDK of interactive biometric verification has been integrated. Integration can be completed via the full SDK mode using the code below:
dependencies {
    // Import the package of interactive biometric verification.
    implementation 'com.huawei.hms:ml-computer-vision-interactive-livenessdetection:3.2.0.122'
}
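For reference, the Maven repository address mentioned above is typically configured in the project-level build.gradle along these lines (this is the standard HMS Core setup; adapt it to your Gradle version and plugin configuration):

```groovy
buildscript {
    repositories {
        google()
        mavenCentral()
        // Huawei Maven repository that hosts the HMS Core SDK.
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
```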
Function Development
Use either the default view mode or customized view mode to develop the verification function.
Default View Mode
1. Create a result callback to obtain the interactive biometric verification result.
private MLInteractiveLivenessCapture.Callback callback = new MLInteractiveLivenessCapture.Callback() {
    @Override
    public void onSuccess(MLInteractiveLivenessCaptureResult result) {
        // Callback when verification is successful. The returned result indicates whether the detected face is of a real person.
        switch (result.getStateCode()) {
            case InteractiveLivenessStateCode.ALL_ACTION_CORRECT:
                // Operation after verification is passed.
                break;
            case InteractiveLivenessStateCode.IN_PROGRESS:
                // Operation when verification is in progress.
                break;
            …
        }
    }

    @Override
    public void onFailure(int errorCode) {
        // Callback when verification fails. Possible reasons include that the camera is abnormal (CAMERA_ERROR). Add the processing logic after the failure.
    }
};
2. Create an instance of MLInteractiveLivenessConfig and start verification.
MLInteractiveLivenessConfig interactiveLivenessConfig = new MLInteractiveLivenessConfig.Builder().build();
// TIME_OUT_THRESHOLD is a timeout you define in milliseconds; about 10,000 ms is recommended.
MLInteractiveLivenessCaptureConfig captureConfig = new MLInteractiveLivenessCaptureConfig.Builder()
        .setOptions(MLInteractiveLivenessCaptureConfig.DETECT_MASK)
        .setActionConfig(interactiveLivenessConfig)
        .setDetectionTimeOut(TIME_OUT_THRESHOLD)
        .build();
MLInteractiveLivenessCapture capture = MLInteractiveLivenessCapture.getInstance();
capture.setConfig(captureConfig);
capture.startDetect(activity, callback);
Customized View Mode
1. Create an MLInteractiveLivenessDetectView object and load it to the activity layout.
/**
* i. Bind the camera preview screen to the remote view and configure the liveness detection area.
* In the camera preview stream, interactive biometric verification checks whether a face is in the middle of the face frame. To ensure a higher verification pass rate, it is recommended that the face frame be in the middle of the screen, and the verification area be slightly larger than the area covered by the face frame.
* ii. Set whether to detect the mask.
* iii. Set the result callback.
* iv. Load MLInteractiveLivenessDetectView to the activity.
*/
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_liveness_custom_detection);
    mPreviewContainer = findViewById(R.id.surface_layout);
    MLInteractiveLivenessConfig interactiveLivenessConfig = new MLInteractiveLivenessConfig.Builder().build();
    mlInteractiveLivenessDetectView = new MLInteractiveLivenessDetectView.Builder()
            .setContext(this)
            // Set whether to detect a mask.
            .setOptions(MLInteractiveLivenessCaptureConfig.DETECT_MASK)
            // Set the type of liveness detection. 0 indicates static biometric verification, and 1 indicates interactive biometric verification.
            .setType(1)
            // Set the position for the camera stream.
            .setFrameRect(new Rect(0, 0, 1080, 1440))
            // Set the configurations for interactive biometric verification.
            .setActionConfig(interactiveLivenessConfig)
            // Set the face frame position relative to the camera preview view. The upper-left and lower-right vertex coordinates are defined against an image with dimensions of 640 x 480 px. The face frame dimensions should match the proportions of a real face. This frame checks whether a face is too close to or too far from the camera, and whether the face deviates from the camera view.
            .setFaceRect(new Rect(84, 122, 396, 518))
            // Set the verification timeout interval. About 10,000 milliseconds is recommended.
            .setDetectionTimeOut(10000)
            // Set the result callback.
            .setDetectCallback(new OnMLInteractiveLivenessDetectCallback() {
                @Override
                public void onCompleted(MLInteractiveLivenessCaptureResult result) {
                    // Callback when verification is complete.
                    switch (result.getStateCode()) {
                        case InteractiveLivenessStateCode.ALL_ACTION_CORRECT:
                            // Operation when verification is passed.
                            break;
                        case InteractiveLivenessStateCode.IN_PROGRESS:
                            // Operation when verification is in progress.
                            break;
                        …
                    }
                }

                @Override
                public void onError(int error) {
                    // Callback when an error occurs during verification.
                }
            }).build();
    mPreviewContainer.addView(mlInteractiveLivenessDetectView);
    mlInteractiveLivenessDetectView.onCreate(savedInstanceState);
}
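Because setFaceRect takes coordinates defined against a 640 x 480 px reference image while the preview above is 1080 x 1440, you may want to translate a frame designed on the reference image to the actual preview size. The linear-scaling helper below is my own convenience sketch (the class and method names are hypothetical, and the SDK may map coordinates differently); it uses java.awt.Rectangle purely so the math is easy to run outside Android:

```java
import java.awt.Rectangle;

public class FaceRectScaler {
    // Reference dimensions the face-frame coordinates are defined against (portrait: 480 wide, 640 tall).
    static final int REF_WIDTH = 480;
    static final int REF_HEIGHT = 640;

    // Scale a rect designed on the 640 x 480 reference image to the real preview size.
    static Rectangle scaleToPreview(Rectangle refRect, int previewWidth, int previewHeight) {
        double sx = previewWidth / (double) REF_WIDTH;
        double sy = previewHeight / (double) REF_HEIGHT;
        return new Rectangle(
                (int) Math.round(refRect.x * sx),
                (int) Math.round(refRect.y * sy),
                (int) Math.round(refRect.width * sx),
                (int) Math.round(refRect.height * sy));
    }

    public static void main(String[] args) {
        // The article's face frame: upper-left (84, 122), lower-right (396, 518).
        Rectangle ref = new Rectangle(84, 122, 396 - 84, 518 - 122);
        // Scale to the 1080 x 1440 preview used in the builder above (both axes scale by 2.25).
        System.out.println(scaleToPreview(ref, 1080, 1440));
    }
}
```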
2. Set a listener for the lifecycle of MLInteractiveLivenessDetectView.
@Override
protected void onDestroy() {
    super.onDestroy();
    mlInteractiveLivenessDetectView.onDestroy();
}

@Override
protected void onPause() {
    super.onPause();
    mlInteractiveLivenessDetectView.onPause();
}

@Override
protected void onResume() {
    super.onResume();
    mlInteractiveLivenessDetectView.onResume();
}

@Override
protected void onStart() {
    super.onStart();
    mlInteractiveLivenessDetectView.onStart();
}

@Override
protected void onStop() {
    super.onStop();
    mlInteractiveLivenessDetectView.onStop();
}
And just like that, you've developed a robust face verification feature for your app.
Where to Use
I noticed that the interactive biometric verification capability is actually one of the sub-services of liveness detection in ML Kit, and the other one is called static biometric verification. After trying them myself, I found that interactive biometric verification is more suited for human-machine scenarios.
Take banking as an example. By integrating the capability, a banking app will allow a user to open an account from home, as long as they perform face verification according to the app prompts. The whole process is secure and saves the user from the hassle of going to a bank in person.
Shopping is also a field where the capability can play a crucial role. Before paying for an order, the user must first verify their identity, which safeguards the security of their account assets.
These are just some situations that best suit the use of this capability. How about you? What situations do you think this capability is ideal for? I look forward to seeing your ideas in the comments section.
Conclusion
For now, face recognition — though convenient and effective — is not enough on its own to implement identity verification, because it cannot verify the authenticity of a face.
The face verification solution helps overcome this issue, and the interactive biometric verification capability is critical to implementing it. This capability can ensure that the person in a selfie is real as it verifies authenticity by prompting the user to perform certain actions. Successfully completing the prompts will confirm that the person is indeed real.
What makes the capability stand out is that it prompts the user during the verification process to streamline authentication. In short, the capability is not only secure, but also very user-friendly.
Published at DZone with permission of Jackson Jiang. See the original article here.