Face Capture (Android)


Importing the Face Capture SDK to a Project

Including the Face Capture SDK in your application requires three steps:

  1. Update the Gradle dependencies and include the AARs in the application's libs directory.
    • Add or update the fileTree line to include searching for AARs, e.g. api fileTree(dir: 'libs', include: ['*.aar'])
    • The dependency section of your application's Gradle file may look like the following:

Listing 1 Gradle Dependency

   dependencies {
       api fileTree(dir: 'libs', include: ['*.jar', '*.aar'])

       androidTestImplementation('com.android.support.test.espresso:espresso-core:2.2.2', {
           exclude group: 'com.android.support', module: 'support-annotations'
       })
       api 'com.squareup.okhttp3:okhttp:3.10.0'
       api 'com.android.support:appcompat-v7:26.1.0'
       api 'com.android.support:design:26.1.0'
       api 'com.android.support.constraint:constraint-layout:1.1.3'
       testImplementation 'junit:junit:4.12'
   }
  2. Copy the Aware-FaceCaptureAwareId.aar from the lib directory to the libs directory of your application.
  3. Import Face Capture into your application code to start using the FaceCaptureAwareId classes.

Listing 2 Import Example

   import com.aware.facecaptureawareid.api.FaceCaptureAwareIdApi;

Running the Face Capture Demo

Our SDK package comes with two demos: FaceCaptureAwareIdDemo and FaceCaptureAwareIdDemoQR.

In the FaceCaptureAwareIdDemo demo, the initial launch shows the settings page. The user must enter the data strings before performing enrollment. The following data fields must be obtained from the AwareID admin server (see the Server documentation for reference):

  • Set Host URL
  • Set Customer Name
  • Set Apikey
  • Set Client Secret

The other data fields, such as Set User Name, Email, and Phone Number, can be filled with any sample data.

Once the data is entered, return to the Home screen; the app will prompt for permission to access the device camera. Grant this permission, then start enrollment by tapping the Enroll button. Follow the on-screen directions to perform the face capture. Once the capture succeeds, the app switches back to the Home screen. Tap the Verify button on the Home screen to initiate the verification workflow.

The FaceCaptureAwareIdDemoQR demo prompts for permission to access the device camera at initial launch. Granting the permission switches to the Home screen. The user can start enrollment by tapping the Scan QR Code button and scanning the QR code generated by the web app (see the AwareID server documentation for reference).

Follow the instructions on the capture page to perform the face capture. Once the capture completes successfully, the app switches back to the Home screen. To perform verification, either scan the QR code or trigger a push notification from the web app.

Description

The Face Capture AwareId SDK is the client-side library used with the Face Liveness Server product to determine the liveness of a subject. It captures face images and packages them according to the Face Liveness Server specifications. The Face Liveness Server then makes the determination of whether or not the subject is live. The library does not communicate directly with any servers; it produces a package for the application to deliver to the Face Liveness Server. It is up to the application developer to implement any server communication and security.

Facial images are captured based on the selected capture profile. Capture profiles specify the criteria the subject must meet in order to successfully capture images and create a server package. Profiles are provided with this installer to be used as a base for capturing faces that are optimized for liveness processing on the back-end. Profiles can be customized to meet any additional requirements for capture.
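Because the library leaves server communication entirely to the application, one common pattern is to wrap the server package in a plain HTTP POST. A minimal sketch using the JDK's java.net.http client follows; the /liveness/analyze path and JSON content type are assumptions for illustration, not part of the SDK or server contract:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;

public class PackageUploader {

    // Build an HTTP POST carrying the server package produced by the SDK.
    // The endpoint path and content type here are placeholders; use whatever
    // your Face Liveness Server deployment actually expects.
    public static HttpRequest buildRequest(String baseUrl, String serverPackage) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/liveness/analyze")) // hypothetical path
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(serverPackage, StandardCharsets.UTF_8))
                .build();
    }
}
```

The request would then be sent with java.net.http.HttpClient (or any HTTP stack your application already uses), applying whatever authentication and transport security your deployment requires.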

Design Features

  • Lightweight on-device footprint
  • Easy-to-integrate API
  • No direct UI activity to allow for a fully customizable user experience
  • Handles operation of device hardware
  • Optimized data collection for Knomi back-end analysis
  • No direct server interaction. Applications have full control over when and how they interact with the back-end services.

Platforms

  • Android 8.0 and higher

System Requirements

  • Android 8.0 or higher
  • Android NDK 20 or higher
  • API 24 or higher supported on device
  • Device supports Camera2 integration
  • Users must grant access to CAMERA permissions

Android Integration

Overview

The Face Capture AwareId SDK comes with a Java interface for Android integration. This chapter outlines the requirements for Android integration, how to operate the included developer demo, and which parts of the demo source code correspond to the integration tasks outlined in the Application Design chapter.

Integration Requirements

The Face Capture AwareId SDK requires Internet and Camera permissions.

Android Face Capture

This section provides details regarding the Face Capture Aware Id API and how it is used to implement an application.

Add needed permissions

Permissions for camera access and internet access should be requested in the manifest at /android/app/src/main.

Setup permissions

   <manifest xmlns:android="http://schemas.android.com/apk/res/android">
   <!-- ... -->

   <uses-permission android:name="android.permission.CAMERA" />
   <uses-permission android:name="android.permission.INTERNET" />

   </manifest>

Android Face Capture SDK

Create a Face Capture Object

The first step is to create a library object. This is done by creating a FaceCaptureJNI object.

Create a Face Capture Object

   private FaceCaptureJNI mFaceCapture;

   try {
       mFaceCapture = new FaceCaptureJNI();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Create a Workflow Object

Create a Workflow Object

   try {
       mWorkFlow = mFaceCapture.workflowCreate(FaceCaptureJNI.FOXTROT);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Adjust Workflow Settings

The Capture Profile is an XML file that must be read into your project as a UTF-8 String. This file is supplied in the sample project at assets/profiles/face_capture_foxtrot_client.xml.

Adjust Workflow Settings

   try {
       mWorkFlow.setPropertyString(WorkflowProperty.USERNAME, "TestUser");
       mWorkFlow.setPropertyDouble(WorkflowProperty.CAPTURE_TIMEOUT, 0.0);
       mWorkFlow.setPropertyString(WorkflowProperty.CAPTURE_PROFILE, mCaptureProfile);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }
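One way to obtain the mCaptureProfile string used above is to read the profile XML as UTF-8. The sketch below uses java.nio against a plain file path for simplicity; in an Android app you would more likely stream the same bytes from AssetManager.open(...). The ProfileLoader class name is illustrative, not part of the SDK:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ProfileLoader {

    // Read a capture profile XML file from disk as a UTF-8 string,
    // matching the encoding the Capture Profile property expects.
    public static String loadProfile(Path path) throws IOException {
        return new String(Files.readAllBytes(path), StandardCharsets.UTF_8);
    }
}
```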

Select a Camera

Select a camera

   try {
       mCameraList = mFaceCapture.getCameraList(CameraPosition.FRONT);
       mCurrentCamera = mCameraList[0];
       mCurrentCamera.setOrientation(CameraOrientation.PORTRAIT);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Begin a Capture Session

Begin a Capture Session

   try {
       mFaceCapture.startCaptureSession(mWorkFlow, mCurrentCamera);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Stop a Capture Session

Stop a Capture Session

   try {
       mFaceCapture.stopCaptureSession();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Get the Capture Region

Get Capture Region

   Rectangle mCurrentCaptureRegion;

   try {
       mCurrentCaptureRegion = mFaceCapture.captureSessionGetCaptureRegion();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }
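The capture region is typically used to position a face overlay in the application's UI, which usually means scaling it from capture-frame coordinates into view coordinates. A minimal sketch of that mapping follows; the Region record here is a plain stand-in for the SDK's Rectangle type, and the math assumes the preview fills the view without letterboxing:

```java
public class RegionMapper {

    // Plain stand-in for the SDK's Rectangle type.
    public record Region(int x, int y, int width, int height) {}

    // Scale a region from capture-frame coordinates to view coordinates.
    public static Region toViewCoordinates(Region r, int frameW, int frameH,
                                           int viewW, int viewH) {
        double sx = (double) viewW / frameW;
        double sy = (double) viewH / frameH;
        return new Region(
                (int) Math.round(r.x() * sx),
                (int) Math.round(r.y() * sy),
                (int) Math.round(r.width() * sx),
                (int) Math.round(r.height() * sy));
    }
}
```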

Get the Current Capture Session State

Current Capture Session State

   try {
       mCurrentCaptureState = mFaceCapture.getCaptureSessionState();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Get the Current Capture State’s Image

Current Capture State Image

   try {
       mCurrentCaptureSessionFrame = mCurrentCaptureState.getFrame();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Get the Capture State’s Feedback

Capture Session Feedback

   try {
       mCurrentCaptureSessionFeedback = mCurrentCaptureState.getFeedback();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Get the Capture State’s Status

Capture Session Status

   try {
       mCurrentCaptureSessionStatus = mCurrentCaptureState.getStatus();
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }
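In practice, the state, frame, feedback, and status accessors shown above are polled together on a UI timer until the capture finishes. The sketch below shows the shape of that loop against a stand-in interface: the real calls are getCaptureSessionState(), getFrame(), getFeedback(), and getStatus(), while the CaptureState interface and the feedback strings used in examples are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class CapturePoller {

    // Stand-in for the SDK's capture state: just the pieces this sketch uses.
    public interface CaptureState {
        String feedback();    // user guidance text for the current frame
        boolean completed();  // true once capture has finished
    }

    // Poll the state source until capture completes (or maxPolls is reached),
    // collecting the feedback strings that would drive on-screen guidance.
    public static List<String> pollUntilComplete(Supplier<CaptureState> source,
                                                 int maxPolls) {
        List<String> feedback = new ArrayList<>();
        for (int i = 0; i < maxPolls; i++) {
            CaptureState s = source.get();
            feedback.add(s.feedback());
            if (s.completed()) {
                break;
            }
        }
        return feedback;
    }
}
```

In a real integration the supplier would call getCaptureSessionState() and the loop body would also pull the frame for preview rendering; the polling interval is the application's choice.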

Get the Server Package (unencrypted)

Get the Server Package

   try {
       mCurrentCaptureServerPackage = mFaceCapture.getServerPackage(mWorkFlow, mPackageType);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Get the Encrypted Server Package

Get the Encrypted Server Package

   try {
       mCurrentCaptureServerPackage = mFaceCapture.getEncryptedServerPackage(mEncryptionType, mPublicKey, mWorkFlow, mPackageType);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Enable Autocapture

Enable Autocapture

   try {
       mFaceCapture.captureSessionEnableAutocapture(true);
   } catch (FaceCaptureException ex) {
       ex.printStackTrace();
   }

Android Complete Sample in Java

Complete Sample

   public class FaceCaptureInterface {

   private FaceCaptureJNI mFaceCapture;
   private IFaceCapture.IWorkflow mWorkFlow;
   private IFaceCapture.ICamera[] mCameraList;
   private IFaceCapture.ICamera mCurrentCamera;

   // Constructor for FaceCaptureInterface
   public FaceCaptureInterface(String workFlowFile) {
       try {
           mFaceCapture = new FaceCaptureJNI();
           mWorkFlow = mFaceCapture.workflowCreate("Charlie");
           mWorkFlow.setPropertyString(WorkflowProperty.CAPTURE_PROFILE, workFlowFile);
           mCameraList = mFaceCapture.getCameraList(CameraPosition.FRONT);
           mCurrentCamera = mCameraList[0];
           mCurrentCamera.setOrientation(CameraOrientation.PORTRAIT);
           mFaceCapture.captureSessionEnableAutocapture(true);
       } catch (FaceCaptureException e) {
           e.printStackTrace();
       }
   }

   public void setUpWorkFlow(String username, double timeout) throws FaceCaptureException {
       mWorkFlow.setPropertyString(WorkflowProperty.USERNAME, username);
       mWorkFlow.setPropertyDouble(WorkflowProperty.CAPTURE_TIMEOUT, timeout);
   }

   public void beginCapture(){
       try {
           mFaceCapture.startCaptureSession(mWorkFlow, mCurrentCamera);
       } catch (FaceCaptureException e) {
           e.printStackTrace();
       }
   }

   public void stopCapture(){
       try {
           mFaceCapture.stopCaptureSession();
       } catch (FaceCaptureException e) {
           e.printStackTrace();
       }
   }

   public IFaceCapture.ICamera returnCamera(){
       return mCurrentCamera;
   }

   public String getEncryptedServerPackage(EncryptionType encryptionType, String publicKey, PackageType packageType){
       try {
           return mFaceCapture.getEncryptedServerPackage(encryptionType, publicKey, mWorkFlow, packageType);
       } catch (FaceCaptureException e) {
           throw new RuntimeException(e);
       }
   }

   public String getServerPackage(PackageType packageType){
       try {
           return mFaceCapture.getServerPackage(mWorkFlow, packageType);
       } catch (FaceCaptureException e) {
           throw new RuntimeException(e);
       }
   }

   public IFaceCapture.ICaptureState returnCaptureState(){
       try {
           return mFaceCapture.getCaptureSessionState();
       } catch (FaceCaptureException e) {
           e.printStackTrace();
       }
       return null;
   }
   }
