Voice Capture (iOS)
Introduction
Description
The Voice Capture SDK is the client-side library used with the Voice Liveness Server product to determine the liveness of a subject. It captures voice samples and packages them according to the Voice Liveness Server specifications. The Voice Liveness Server then determines whether or not the subject is live. The library does not communicate with any servers directly; it produces a package for the application to deliver to the Voice Liveness Server. It is up to the application developer to implement any server communication and security.
Voice samples are captured based on the selected capture profile. Capture profiles specify the criteria the subject must meet in order to successfully capture samples and create a server package. Profiles are provided with this installer to be used as a base for capturing voice data optimized for liveness processing on the back end. Profiles can be customized to meet any additional capture requirements.
Design Features
- Lightweight on-device footprint
- Easy to integrate API
- No direct UI activity to allow for a fully customizable user experience
- Handles operation of device hardware
- Optimized data collection for Knomi back-end analysis
- No direct server interaction. Applications have full control over when and how they interact with the back-end services.
Platforms
- iOS 13.0 and higher
System Requirements
- iOS 13 or higher
- Users must grant access to the MICROPHONE permission
- Internet access is required
iOS Integration
Overview
The Voice Capture SDK comes with a Swift interface for iOS integration. This chapter will outline the requirements for iOS integration, how to operate the included developer demo, and which parts of the demo source code correspond to the integration tasks outlined in the Application Design chapter.
Integration Requirements
The Voice Capture AwareID SDK requires Internet and microphone permissions.
iOS Voice Capture
This section provides details regarding the Voice Capture AwareID API and how it is used to implement an application.
Add Needed Permissions
Permission for microphone access should be requested by adding the NSMicrophoneUsageDescription key to the Info.plist: NSMicrophoneUsageDescription : Some string explaining why you need to record audio
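A minimal Info.plist entry might look like the following (the usage string is only an example; supply wording appropriate for your application):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app records your voice to verify that you are a live person.</string>
```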
Info Plist
iOS Voice Capture SDK
Create a Voice Capture Library
The first step is to create a library object. This is done by creating a VoiceCapture object.
Create Voice Capture Object
// Create a voice capture object.
var voiceCapture : VoiceCapture!
do {
    voiceCapture = try VoiceCapture()
} catch let err {
    print("Error: " + err.localizedDescription)
}
Register a Capture Session Status Callback
The VoiceCapture library is designed around the concept of a capture session, a session in which voice samples are collected. During the collection process the application needs to listen for status updates from VoiceCapture via a CaptureSessionStatus callback. The application registers a listener for this callback using the API call setCaptureSessionStatusCallback.
Register Capture Session Status Callback
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var captureSessionStatusCallback : VoiceCapture.CaptureSessionStatusCallback // defined below
do {
    try voiceCapture.setCaptureSessionStatusCallback(captureSessionStatusCallback: captureSessionStatusCallback)
} catch let err {
    print("Error: " + err.localizedDescription)
}
Create a Workflow Object
A workflow object controls the parameters for a capture session.
Create a Workflow Object
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var workflow : Workflow!
do {
    workflow = try voiceCapture.workflowCreate(name: VoiceCapture.ALFA2)
} catch let err {
    print("Error: " + err.localizedDescription)
}
Adjust Workflow Settings
Change any settings for the workflow object before starting a capture session.
Adjust Workflow Settings
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var workflow : Workflow! // created by an earlier call to workflowCreate()
do {
    try workflow.setPropertyString(property: .USERNAME, value: "user_name")
    try workflow.setPropertyDouble(property: .CAPTURE_TIMEOUT, value: 8.0)
    try workflow.setPropertyDouble(property: .MINIMUM_RECORDING_LENGTH, value: 1.0)
    try workflow.setPropertyBool(property: .CAPTURE_ON_DEVICE, value: false)
    if let microphones = try? voiceCapture.getMicrophoneList() {
        try workflow.setPropertyString(property: .MICROPHONE, value: microphones[0])
    }
} catch let err {
    print("Error: " + err.localizedDescription)
}
Begin a Capture Session
Once the workflow object has been created and any necessary properties set, it is used to start the capture session. During the capture session the application will receive status messages via the CaptureSessionStatusCallback as described above.
Begin a Capture Session
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var workflow : Workflow! // created by an earlier call to workflowCreate()
do {
    try voiceCapture.startCaptureSession(workflow: workflow)
} catch let err {
    print("Error: " + err.localizedDescription)
}
Handle Capture Session Status
Each of the statuses communicated via the CaptureSessionStatusCallback should be handled by the application. The most important statuses to receive are those related to the end of the capture session, as they indicate when the application should move forward.
Capture Session Status
var captureSessionStatusCallback : VoiceCapture.CaptureSessionStatusCallback
captureSessionStatusCallback = { (status: VoiceCapture.CaptureSessionStatus) -> Void in
    print("[CVC | captureSessionStatusCallback] capture status is now \(status)")
    // ... handle callback here
}
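Building on the callback above, here is a sketch of dispatching on the status values. The case names below are taken from the complete example later in this chapter; treat them as assumptions if your SDK version exposes different cases.

```swift
let captureSessionStatusCallback: VoiceCapture.CaptureSessionStatusCallback = { status in
    switch status {
    case .CAPTURING:
        // Update the UI to show that recording is in progress.
        break
    case .POST_CAPTURE_PROCESSING:
        // Recording finished; the SDK is packaging the samples.
        break
    case .COMPLETED:
        // Safe to call getServerPackage() / getCapturedVoiceRecording().
        break
    case .ABORTED:
        // Capture failed or was cancelled; reset the UI.
        break
    default:
        break
    }
}
```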
Capture Session Completion - Retrieve JSON
After receiving the COMPLETED status, the application can retrieve the resulting JSON by calling getServerPackage. The resulting JSON can be sent to the server to obtain a liveness result.
Get Server Package
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var workflow : Workflow! // created by an earlier call to workflowCreate()
var serverPackage : String?
do {
    serverPackage = try voiceCapture.getServerPackage(workflow: workflow)
} catch let err {
    print("Error: " + err.localizedDescription)
}
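The SDK performs no networking itself, so the application must deliver the package to the back end. A minimal URLSession sketch, assuming a hypothetical endpoint URL and that serverPackage already holds the JSON string returned by getServerPackage:

```swift
import Foundation

func sendServerPackage(_ serverPackage: String) {
    // Hypothetical endpoint; replace with your Voice Liveness Server URL.
    let url = URL(string: "https://your-liveness-server.example.com/voice/analyze")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json; charset=UTF-8", forHTTPHeaderField: "Content-Type")
    request.httpBody = serverPackage.data(using: .utf8)

    URLSession.shared.dataTask(with: request) { data, response, error in
        if let error = error {
            print("Liveness request failed: \(error.localizedDescription)")
            return
        }
        if let data = data, let body = String(data: data, encoding: .utf8) {
            print("Liveness response: \(body)")
        }
    }.resume()
}
```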
Capture Session Completion - Retrieve Voice Samples
If the CAPTURE_ON_DEVICE setting was set to true, the getCapturedVoiceRecording function can be called to retrieve the byte data representing the collected voice samples.
Get Captured Voice Recording
var voiceCapture : VoiceCapture! // created by an earlier call to VoiceCapture()
var workflow : Workflow! // created by an earlier call to workflowCreate()
var voiceRecording : [UInt8]?
do {
    voiceRecording = try voiceCapture.getCapturedVoiceRecording(workflow: workflow)
} catch let err {
    print("Error: " + err.localizedDescription)
}
iOS Complete example
Example
// CaptureViewController.swift // Copyright © 2023 Aware Inc. All rights reserved. import UIKit import SwiftUI import AVFAudio import AVFoundation public class VoiceLivenessAppUIClientServerViewController: UIViewController { public override func viewDidLoad() { super.viewDidLoad() print("[CVC] viewDidLoad") if AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) == AVAuthorizationStatus.denied || AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) == AVAuthorizationStatus.notDetermined { AVCaptureDevice.requestAccess(for: .audio) { success in if success { print("Permission granted, starts recording") } else { print("Permission denied") self.returnToMainViewController(message: "return to home page") return } } } self.initUI() self.definesPresentationContext = true // Setup voice capture self.setupVoiceCapture() } public override func viewWillAppear( _ animated: Bool ) { if AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) == AVAuthorizationStatus.denied { return } if (!enrollMode) { authCancelButton.setTitle(Bundle.main.localizedString(forKey: "CANCEL", value: "Cancel", table: "Localizable"), for: .normal) authInstructionLabel.text = Bundle.main.localizedString(forKey: "AUTH INSTRUCTION", value: "Please speak the following phrase:", table: "Localizable") authAppBarTitle.text = Bundle.main.localizedString(forKey: "AUTH APP BAR TITLE", value: "Authenticate", table: "Localizable") self.statusLabel.isHidden = true } else { enrollCancelBtn.setTitle(Bundle.main.localizedString(forKey: "CANCEL", value: "Cancel", table: "Localizable"), for: .normal) instructionsLabel.text = Bundle.main.localizedString(forKey: "INSTRUCTIONS LABEL", value: "Repeat the phrase below:", table: "Localizable") appBarTitle.text = Bundle.main.localizedString(forKey: "APP BAR TITLE", value: "Enroll new account", table: "Localizable") self.enrollStatusLabel.isHidden = true } self.startRecording() super.viewWillAppear( animated ) do { let errorURL = URL(fileURLWithPath: 
Bundle.main.path(forResource: "ErrorSound", ofType: "wav")!) let validationURL = URL(fileURLWithPath: Bundle.main.path(forResource: "ValidationSound", ofType: "wav")!) let captureURL = URL(fileURLWithPath: Bundle.main.path(forResource: "CaptureSound", ofType: "wav")!) errorSound = try AVAudioPlayer(contentsOf: errorURL) validateSound = try AVAudioPlayer(contentsOf: validationURL) captureSound = try AVAudioPlayer(contentsOf: captureURL) } catch { print("Could not load sounds.") } print("[CVC] viewWillAppear | animated: \( animated )") } public override func viewDidAppear(_ animated: Bool ) { super.viewDidAppear( animated ) print("[CVC] viewDidAppear | animated: \( animated )") } public override func viewWillDisappear( _ animated: Bool ) { super.viewWillDisappear( animated ) print("[CVC] viewWillDisappear | animated: \( animated )") errorSound?.stop() validateSound?.stop() // Cleanup voice capture if self.workflow != nil { self.workflow!.dispose() } self.timer = nil } public override func viewDidDisappear( _ animated: Bool) { super.viewDidDisappear( animated ) print("[CVC] viewDidDisappear | animated: \( animated )") } // // MARK: Application State Methods // @objc func enterForeground(_ notification: Notification) { print("[CVC] enterForeground") } @objc func enterBackground(_ notification: Notification) { print("[CVC] enterBackground") } // enrollment public func setBiMaas(addVoiceUrl: String, apiKey: String, enrollmentToken: String, jwt: String) { self.biMaasAddVoiceUrl = addVoiceUrl self.biMaasApiKey = apiKey self.biMaasEnrollmentToken = enrollmentToken self.biMaasJwt = jwt setOnBoardingType(onBoardingType: .ENROLLMENT) } // verification public func setBiMaas(verifyVoiceUrl: String, apiKey: String, verifyAuthToken: String, jwt: String) { self.biMaasVerifyVoiceUrl = verifyVoiceUrl self.biMaasApiKey = apiKey self.biMaasVerifyAuthToken = verifyAuthToken self.biMaasJwt = jwt setOnBoardingType(onBoardingType: .VERIFICATION) } // re-enrollment public func 
setBiMaas(reEnrollmentVerifyVoiceUrl: String, apiKey: String, reEnrollmentToken: String, jwt: String) { self.biMaasReEnrollmentVerifyVoiceUrl = reEnrollmentVerifyVoiceUrl self.biMaasApiKey = apiKey self.biMaasReEnrollmentToken = reEnrollmentToken self.biMaasJwt = jwt setOnBoardingType(onBoardingType: .REENROLLMENT) } func setOnBoardingType(onBoardingType: OnBoardingType) { self.isEnroll = false self.isVerify = false self.isReEnroll = false switch onBoardingType { case .ENROLLMENT: self.isEnroll = true case .VERIFICATION: self.isVerify = true case .REENROLLMENT: self.isReEnroll = true } } public func setCaptureSettings(settings: VoiceCaptureSettings) { self.captureSettings = settings } // // MARK: Callback Handler Methods // private func captureSessionStatusCallback(captureStatus: VoiceCapture.CaptureSessionStatus) { print("[CVC] captureSessionStatusCallback | captureStatus: \(captureStatus)") // Check capture status if (self.currentStatus != captureStatus) { self.setStatusViewText(captureStatus.description) switch (captureStatus) { case .CAPTURING: self.handleCaptureSessionCapturing() break case .POST_CAPTURE_PROCESSING: self.handleCaptureSessionPostCaptureProcessing() break case .COMPLETED: self.handleCaptureSessionCompleted() break case .ABORTED: self.handleCaptureSessionAborted() break default: break } self.currentStatus = captureStatus } } private func handleCaptureSessionCapturing() { print("[CVC] handleCaptureSessionCapturing") // Set UI to capturing self.setModeCapturing() } private func handleCaptureSessionPostCaptureProcessing() { print("[CVC] handleCaptureSessionPostCaptureProcessing") // Set UI to not capturing self.setModeNotCapturing() } private func handleCaptureSessionCompleted() { print("[CVC] handleCaptureSessionCompleted") self.setModeNotCapturing(cancel: true) self.stopVoiceCapture() do { self.setStatusViewText("Retrieving capture data") let serverPackage = try self.voiceCapture.getServerPackage(workflow: self.workflow!) 
var dict = serverPackage.convertToDictionary()! as [String : Any] self.voiceSamples?.append(((dict["voice"]! as! [String: Any])["voiceSamples"]! as! [Any])[0]) // For enroll, record limit set to 3 and send package to server if (enrollMode) { if self.voiceSamples!.count <= self.recordLimit { runTimer() } if self.voiceSamples!.count < self.recordLimit && self.voiceSamples!.count != 4{ self.recordStep += 1 self.startRecording() } else { if var voice = (dict["voice"] as! [String: Any]?) { voice.updateValue(self.voiceSamples!, forKey: "voiceSamples") dict.updateValue(voice, forKey: "voice") } self.sendServerPackage(serverPackage: dict) } } // For auth, record only once, no circleProgressView else { if var voice = (dict["voice"] as! [String: Any]?) { voice.updateValue(self.voiceSamples!, forKey: "voiceSamples") dict.updateValue(voice, forKey: "voice") } self.sendServerPackage(serverPackage: dict) } } catch let err { let message = err.localizedDescription print("Error: " + message) self.reportCaptureError(message: message) } } private func handleCaptureSessionAborted() { print("[CVC] handleCaptureSessionAborted") self.setModeNotCapturing(cancel: true) self.reportCaptureFailed(reason: "VoiceLiveness Capture Aborted") } // // MARK: Capture Methods // private func setupVoiceCapture() { // Already Authorized print("[CVC] setupVoiceCapture") self.voiceSamples = [] if isEnroll != nil, isEnroll! { self.recordLimit = 3 } do { if (self.voiceCapture == nil) { try self.voiceCapture = VoiceCapture() } // Display version let voiceSdkVersion = try VoiceCapture.getVersionString() print("Voice SDK version: \(voiceSdkVersion)") try self.workflow = self.voiceCapture.workflowCreate(name: (self.captureSettings?.workflow)!) // Set callback try self.voiceCapture.setCaptureSessionStatusCallback(captureSessionStatusCallback: self.captureSessionStatusCallback) // Set username try self.workflow?.setPropertyString(property: .USERNAME, value: self.captureSettings?.username ?? 
"") // Set capture timeout // try self.workflow?.setPropertyDouble(property: .CAPTURE_TIMEOUT, value: (self.captureSettings?.captureTimeout)!) // Set min capture time try self.workflow?.setPropertyDouble(property: .MINIMUM_RECORDING_LENGTH, value: (self.captureSettings?.minRecordLength)!) // Set capture on device try self.workflow?.setPropertyBool(property: .CAPTURE_ON_DEVICE, value: self.captureSettings?.captureOnDevice ?? true) // Set microphone try self.workflow?.setPropertyString(property: .MICROPHONE, value: "iPhone Microphone") self.voiceRecordTimeConstant = (self.captureSettings?.captureTimeout)! self.timerCounter = self.voiceRecordTimeConstant } catch let err as VoiceCaptureErrorInfo.ErrorCode { print("VoiceCaptureException: " + err.description) self.reportCaptureError(message: err.description) } catch let err { print("Error: " + err.localizedDescription) } } private func startCaptureSession() { do { // Set workflow if let workflowObj = self.workflow { try self.voiceCapture.startCaptureSession(workflow: workflowObj) } } catch let err as VoiceCaptureErrorInfo.ErrorCode { print("VoiceCaptureException: " + err.description) self.reportCaptureError(message: err.description) } catch let err { print("Error: " + err.localizedDescription) } } public func stopVoiceCapture() { print("[CVC] stopVoiceCapture") self.finishRecording() // Stop capture session do { try self.voiceCapture.stopCaptureSession() } catch let err { print("Error: " + err.localizedDescription ) } } // // MARK: Server Package Methods // private func sendServerPackage(serverPackage: [String: Any]) { print("[CVC] sendServerPackage | serverPackage: \(serverPackage)") self.responseDidReceived?(false, serverPackage, nil) } // // MARK: UI Methods // public func setStatusViewText(_ text: String) { print("[CVC] setStatusViewText | text: \(text)") DispatchQueue.main.async { } } private func setModeCapturing() { print("[CVC] setModeCapturing") } public func setModeNotCapturing(cancel: Bool = false) { 
print("[CVC] setModeNotCapturing | cancel: \(cancel)") self.finishRecording() DispatchQueue.main.async { self.resetCircleProgress() } } private func alertModalDialog(title: String, message: String) { print("[CVC] alertModalDialog | title: \(title), message: \(message)") DispatchQueue.main.async { let alert = UIAlertController(title: title, message: message, preferredStyle: .alert) let reTryAction = UIAlertAction(title: NSLocalizedString("Retry", comment: "Default action"), style: .default) { (action: UIAlertAction) -> Void in self.startRecording() } let cancelAction = UIAlertAction(title: NSLocalizedString("Stop", comment: "Default action"), style: .cancel) { (action: UIAlertAction) -> Void in self.returnToMainViewController(message: message) } alert.addAction(reTryAction) alert.addAction(cancelAction) self.present(alert, animated: true, completion: nil) } } private func reportLivenessResult(_ score: Int) { print("[CVC] reportLivenessResult | score: \(score)") self.finishRecording() self.alertModalDialog(title: Bundle.main.localizedString(forKey: "VOICE TITLE", value: "VoiceLiveness", table: "Localizable"), message: "Liveness score: \(score)") } private func reportCaptureFailed(reason: String) { print("[CVC] reportCaptureFailed | reason: \(reason)") self.finishRecording() self.alertModalDialog(title: Bundle.main.localizedString(forKey: "VOICE CAPTURE FAILED", value: "Voice Capture Failed", table: "Localizable"), message: reason) } private func reportCaptureError(message: String) { print("[CVC] reportCaptureError | message: \(message)") self.finishRecording() self.alertModalDialog(title: Bundle.main.localizedString(forKey: "VOICE CAPTURE ERROR", value: "Voice Capture Error", table: "Localizable"), message: getErrorDescription(message)) } // // MARK: Misc Methods // private func returnToMainViewController(message: String) { print("[CVC] returnToMainViewController") DispatchQueue.main.async { self.dismiss(animated: true) { self.responseDidReceived?(true, nil, message) 
} } } private func getErrorDescription(_ error: String) -> String { print("[CVC] getErrorDescription | error: \(error)") switch error { case "AW_VOICE_CAPTURE_E_NO_ERRORS": return "No errors or warnings." case "AW_VOICE_CAPTURE_E_INTERNAL_ERROR": return "An internal error occurred." case "AW_VOICE_CAPTURE_E_NULL_VOICE_CAPTURE_OBJ": return "The Voice Liveness object was NULL." case "AW_VOICE_CAPTURE_E_TRIAL_EXPIRATION_PASSED": return "The trial expiration has passed." case "AW_VOICE_CAPTURE_E_OUT_OF_MEMORY": return "The library failed to allocate memory." case "AW_VOICE_CAPTURE_E_INITIALIZATION_FAILED": return "Could not initialize the Voice Liveness library or a required component." case "AW_VOICE_CAPTURE_E_INVALID_WORKFLOW_PTR": return "The given pointer does not reference a valid workflow." case "AW_VOICE_CAPTURE_E_UNKNOWN_WORKFLOW": return "No workflow exists with the given name." case "AW_VOICE_CAPTURE_E_WORKFLOW_IN_USE": return "Workflow cannot be modified while capture is in progress." case "AW_VOICE_CAPTURE_E_UNKNOWN_PROPERTY": return "The specified property does not exist." case "AW_VOICE_CAPTURE_E_PROPERTY_TYPE_MISMATCH": return "The specified value is the wrong type for the specified property." case "AW_VOICE_CAPTURE_E_INVALID_PROPERTY_VALUE": return "The specified value is not valid for the specified property." case "AW_VOICE_CAPTURE_E_MIN_RECORDING_LEN_TOO_SHORT": return "Minimum recording length must be at least 1 second." case "AW_VOICE_CAPTURE_E_MICROPHONE_NOT_FOUND": return "The requested microphone was not found on the system." case "AW_VOICE_CAPTURE_E_CAPTURE_TIMEOUT_TOO_SHORT": return "Please specify a capture timeout that is at least " + (VoiceCapture.CAPTURETIMEOUTOFFSET + 0.1).description + " seconds larger than the minimum recording time." case "AW_VOICE_CAPTURE_E_CAPTURE_IN_PROGRESS": return "An existing capture session is already in progress." 
case "AW_VOICE_CAPTURE_E_CAPTURE_SESSION_UNAVAILABLE": return "The last capture session for the specified Workflow was aborted." case "AW_VOICE_CAPTURE_E_UNKNOWN_ERROR": return "An unknown error has occurred." default: return error } } } extension VoiceLivenessAppUIClientServerViewController { public func setVoicePhrase(voicePhraseInput: String) { voicePhrase = voicePhraseInput } @objc func updateInstructionTimer() { self.instructionTimerCount -= 1 print("instruction timer: \(self.instructionTimerCount)") if (self.instructionTimerCount <= 0){ instructionTimer.invalidate() if (self.isCapturing == false) { startRecording() } } } // handle stopBtn @IBAction func handleStopBtn(_ sender: UIButton) { self.setModeNotCapturing(cancel: true) self.stopVoiceCapture() } // UI settings func initUI(){ if (enrollMode){ // only enroll need recording 3 times. }else { } if (staticMode){ phraseText = Bundle.main.localizedString(forKey: "VERIFY", value: "Hello, please verify my identity", table: "Localizable") voiceTitle = Bundle.main.localizedString(forKey: "SPEAK", value: "Speak your passphrase", table: "Localizable") voicePhraseLabel.text = Bundle.main.localizedString(forKey: "VERIFY", value: "Hello, please verify my identity", table: "Localizable") if let voicePhrase = self.voicePhrase { phraseText = voicePhrase voicePhraseLabel.text = "\"" + voicePhrase + "\"" } } else{ if(enrollMode){ phraseText = "zero one two three four five six seven eight nine" } else{ voicePhraseLabel.text = createVoiceText(size: 5) } } resetCircleProgress() } func setFinishCycleProgress(completion: ((Bool) -> Void)?) 
{ DispatchQueue.main.async { self.circleProgressView.animate(toAngle: 360, duration: 1, relativeDuration: true, completion: completion) } } func resetCircleProgress() { DispatchQueue.main.async { self.circleProgressView.startAngle = -90 self.circleProgressView.clockwise = true } } //Runs the circle during voice capture with self explanatory labels func runTimer() { print("runTimer invoked.") DispatchQueue.main.async { if self.currentCount == 1 { self.instructionsLabel.text = Bundle.main.localizedString(forKey: "REPEAT 1", value: "Repeat the phrase again:", table: "Localizable") self.circleProgressView.animate(fromAngle: 0, toAngle: 120, duration: 0.99, completion: nil) self.timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: (#selector(self.updateTimer)), userInfo: nil, repeats: false) } else if self.currentCount == 2 { self.instructionsLabel.text = Bundle.main.localizedString(forKey: "REPEAT 2", value: "Repeat the phrase one more time:", table: "Localizable") self.circleProgressView.animate(fromAngle: 120, toAngle: 240, duration: 0.99, completion: nil) self.timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: (#selector(self.updateTimer)), userInfo: nil, repeats: false) } else if self.currentCount == 3 { self.circleProgressView.animate(toAngle: 360, duration: 0.99, relativeDuration: true, completion: nil) self.instructionsLabel.text = Bundle.main.localizedString(forKey: "DONE", value: "Done!", table: "Localizable") self.timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: (#selector(self.updateTimer)), userInfo: nil, repeats: false) } self.currentCount += 1 } } @objc func updateTimer() { self.timerCounter! -= 0.1 if (self.timerCounter! 
< 0){ print("TimerStoped.") resetCircleProgress() timer!.invalidate() timerCounter = voiceRecordTimeConstant } } private func createVoiceText(size: Int8) ->String { return " " } public func setEnrollMode(enrollMode mode: Bool){ enrollMode = mode } public func setStaticMode(staticMode mode: Bool){ staticMode = mode } func startRecording(){ self.isCapturing = true self.startCaptureSession() } func finishRecording(){ if (self.voiceCapture == nil) { return } print("finishRecording invoked") self.isCapturing = false } public func circleColorGreen() { SoundManager.shared.playValidate() DispatchQueue.main.async { self.circleProgressView.set(colors: .green) if let enrollSpeaker = self.enrollSpeaker.image { let colorlessSpeaker = enrollSpeaker.withRenderingMode(.alwaysTemplate) self.enrollSpeaker.image = colorlessSpeaker self.enrollSpeaker.tintColor = .green } self.enrollStatusLabel.isHidden = false self.enrollStatusLabel.text = Bundle.main.localizedString(forKey: "VOICE SUCCESS", value: "Voice Enrollment Success", table: "Localizable") } } public func circleColorRed() { SoundManager.shared.playError() DispatchQueue.main.async { self.circleProgressView.set(colors: .red) if let enrollSpeaker = self.enrollSpeaker.image { let colorlessSpeaker = enrollSpeaker.withRenderingMode(.alwaysTemplate) self.enrollSpeaker.image = colorlessSpeaker self.enrollSpeaker.tintColor = .red } self.enrollStatusLabel.isHidden = false self.enrollStatusLabel.text = Bundle.main.localizedString(forKey: "VOICE FAILED", value: "Voice Enrollment Failed", table: "Localizable") } } public func authSpeakerGreen() { SoundManager.shared.playValidate() DispatchQueue.main.async { if let mic = self.micIcon.image { let colorlessMic = mic.withRenderingMode(.alwaysTemplate) self.micIcon.image = colorlessMic self.micIcon.tintColor = .green } self.statusLabel.isHidden = false self.statusLabel.text = Bundle.main.localizedString(forKey: "VOICE AUTH SUCCESS", value: "Voice Authentication Success", table: "Localizable") } 
} public func authSpeakerRed() { SoundManager.shared.playError() DispatchQueue.main.sync { if let mic = micIcon.image { let colorlessMic = mic.withRenderingMode(.alwaysTemplate) micIcon.image = colorlessMic micIcon.tintColor = .red } self.statusLabel.isHidden = false self.statusLabel.text = Bundle.main.localizedString(forKey: "VOICE AUTH FAILED", value: "Voice Authentication Failed", table: "Localizable") } } public func authClear() { self.statusLabel.isHidden = true } func showHourglass(message: String) -> Void { DispatchQueue.main.async { LoadingOverlay.showHourglass(view: self.view, message: message) } } func hideHourglass() -> Void { DispatchQueue.main.async { LoadingOverlay.hideHourglass() } } }
Enrollment Process Initiated from Client
This section provides details regarding the enrollment process when initiated from the client.
Base URL
www.awareid.aware-apis.com
There are two possible workflows for performing a successful enrollment using the Voice SDK and AwareID. The first, and most flexible, initiates the enrollment from the client. The second uses a secondary application to initiate the enrollment and generate a QR code encoded with the session token needed to proceed with the enrollment process. The client application scans the QR code and then proceeds with enrollment using the data encoded in it.
To enroll by initiating from the client, follow these four steps:
- Retrieve an access token. This token allows communication between the client application and the AwareID servers.
- Initiate an enrollment.
- Add a device.
- Enroll a voice.
Enrollment Initiated from Client Step 1 - Get Access Token
Our first step is to retrieve an “access_token”. This token will be used in our next API call to retrieve an enrollment token to proceed with enrollment.
Get Access Token
POST www.awareid.aware-apis.com/auth/realms/{{customer_name}}-consumers/protocol/openid-connect/token
Content-Type: 'application/x-www-form-urlencoded'

"client_id": client_id
"client_secret": client_secret
"scope": openid
"grant_type": client_credentials
Urlencoded
This is the only call whose content type is “application/x-www-form-urlencoded”.
Response - openid-connect
STATUS CODE 200
{
  "access_token": "eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICJCY2IxNXZJQkZsY2JYazVmQUdJZFZXV2pTUEtTaWpsazNydmFwMHp0ekN3In0.eyJleHAiOjE2NzM5OTExMjksImlhdCI6MTY3Mzk5MDgyOSwianRpIjoiN2MzYmY1MmItNjdlMC00ODNlLWFhZjAtYjlkNWJhODE3ZWJiIiwiaXNzIjoiaHR0cHM6Ly9hd2FyZWlkLWRldi5hd2FyZS1hcGlzLmNvbS9hdXRoL3JlYWxtcy9hbmRyYWUtY29uc3VtZXJzIiwic3ViIjoiOTU3ZWMyYmYtZTczOS00YjFjLWEyN2QtMTczMjQzMDIyYTE5IiwidHlwIjoiQmVhcmVyIiwiYXpwIjoiYmltYWFzLWIyYyIsImFjciI6IjEiLCJzY29wZSI6Im9wZW5pZCIsImNsaWVudElkIjoiYmltYWFzLWIyYyIsImNsaWVudEhvc3QiOiIzOC4xNDAuNTkuMjI2IiwiY2xpZW50QWRkcmVzcyI6IjM4LjE0MC41OS4yMjYifQ.OzggQ--Gl4w3NWZPg1BukkEg0fmsSyGgN-ag8eW0FARWl0Ic5fkrnrEdnIgsq5Molq0R52oe4Hy-8Tp4cOn9iCD51kPCPfTt15zVBIAYOvb5M5XZ0uPTygh02KjuFqsxIhbhH8CCUjHkpu3OhoWByc8bC8c9D_cFp3BFE-XIhNPaPxXdTLZOcJOqpdSVxsgxB66-xukI7AA8PWt10huO47l6TSBSnJIjUxNbEqR48ILfnkYY2bmyfoo-laKDv9XSSZ8hXU9sDkiGfpXOl112_f3L1sc6n1-UbRTJGFMd4fgntuanwEvN68TsyS5pz0izGlW-1T3fFJ3D2pGPefsWNA",
  "expires_in": 300,
  "refresh_expires_in": 0,
  "token_type": "Bearer",
  "id_token": "eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICJCY2IxNXZJQkZsY2JYazVmQUdJZFZXV2pTUEtTaWpsazNydmFwMHp0ekN3In0.eyJleHAiOjE2NzM5OTExMjksImlhdCI6MTY3Mzk5MDgyOSwiYXV0aF90aW1lIjowLCJqdGkiOiJkYWNiNTc1NS1jMGEyLTQxZTEtYjMwMi05ZGEzOWRiNGNiYmUiLCJpc3MiOiJodHRwczovL2F3YXJlaWQtZGV2LmF3YXJlLWFwaXMuY29tL2F1dGgvcmVhbG1zL2FuZHJhZS1jb25zdW1lcnMiLCJhdWQiOiJiaW1hYXMtYjJjIiwic3ViIjoiOTU3ZWMyYmYtZTczOS00YjFjLWEyN2QtMTczMjQzMDIyYTE5IiwidHlwIjoiSUQiLCJhenAiOiJiaW1hYXMtYjJjIiwiYXRfaGFzaCI6IlcwbXNUU05WQUo1MG9oQ2JOR3dlTmciLCJhY3IiOiIxIiwiY2xpZW50SWQiOiJiaW1hYXMtYjJjIiwiY2xpZW50SG9zdCI6IjM4LjE0MC41OS4yMjYiLCJjbGllbnRBZGRyZXNzIjoiMzguMTQwLjU5LjIyNiJ9.MOgJ3giF0ikQnUAOBgK6eHpC0Tz3pCjhTX4IjHSjh3kzxx0KCLiWd494Fl3JSHiyvnNP7Ty1SXl4Bhq19f7y_lpGp4yLkbV9I1xsfC7m2D-EIf73D1LEluf1y97ISbh8668VqnGRG8U1FtXuwQGPZb7cgMiTbprECwLFj44_vM2qmLxFpOkOuVaqPmpgjt6MAmUbcWV8GDMAdxVnlZDZuzFkwOlb6S_WypNSYKHA6TFIe_FsA2EoxMu_9MAP3OLX7LIwX3jYIsT4z-TnUmyKC5RFzx6oc9D9Fr2eSTRBxC6QKGJrFAPt40p9_U3YFFi6VpzaGK9YQvCvdw70CVBe5Q",
  "not-before-policy": 0,
  "scope": "openid"
}
Enrollment Initiated from Client Step 2 - Initiate An Enrollment
Using the access token retrieved in Step 1 and the apikey, we call the /enroll endpoint to start onboarding.
Initiate An Enrollment
POST /enroll
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "username": "username",
  "firstName": "first name",   //optional
  "lastName": "last name",     //optional
  "email": "user email",
  "phoneNumber": "user phonenumber"
}
Response - Initiate An Enrollment
STATUS CODE 200
{
  "enrollmentToken": "enrollmentToken",
  "userExistsAlready": false,
  "requiredChecks": [
    "addDevice",
    "addVoice"
  ]
}
Enrollment Initiated from Client Step 3 - Add Device
The device ID is checked when performing an authentication to confirm the device enrolled is the same as the current device attempting the authentication.
Device Id
This can be retrieved in iOS by using the following code:

var deviceId = UIDevice.current.identifierForVendor?.uuidString
Add Device
POST /adddevice
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "enrollmentToken": "enrollmentToken",
  "deviceId": "deviceID"
}
Response - Add Device
{ "enrollmentStatus": 1, "registrationCode": "" }
From here the response will include a registration code and enrollment status.
There are 3 enrollment statuses:
Enrollment Statuses
0 = Enrollment Failed
1 = Enrollment Pending
2 = Enrollment Complete
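These status codes can be modeled in Swift, for example (this enum is a hypothetical helper; the API itself returns the raw integer):

```swift
// Hypothetical helper type mapping the documented status codes.
enum EnrollmentStatus: Int {
    case failed = 0
    case pending = 1
    case complete = 2
}

let status = EnrollmentStatus(rawValue: 1)  // .pending
```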
Enrollment Initiated from Client Step 4 - Add a voice sample and check that the sample belongs to a live person
The add voice API call requires the JSON package generated by the Voice Capture SDK.
Add Voice Sample
POST /addVoice
Content-Type: 'application/json; charset=UTF-8'
Authorization: 'Bearer AccessToken'

{
  "enrollmentToken": "{{enrollment_token}}",
  "livenessData": {
    "voice": {
      "voiceSamples": [
        {
          "data": {{LiveVoice}},
          "Phrase": "\"Hello, Testing BIMaaS Application.\""
        }
      ],
      "workflow": "alfa2",
      "meta_data": {
        "client_device_brand": "Apple",
        "client_device_model": "iPhone 8",
        "client_os_version": "11.0.3",
        "client_version": "KnomiSLive_v:2.4.1_b:0.0.0_sdk_v:2.4.1_b:0.0.0",
        "localization": "en-US",
        "programming_language_version": "Swift 4.1",
        "username": "test"
      }
    }
  }
}
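Assembling this request body can be sketched as follows. `sdkPackage` stands in for the JSON package produced by the Voice Capture SDK, and only a subset of the meta_data fields from the example is shown:

```swift
import Foundation

// A sketch of building the /addVoice body. `sdkPackage` is whatever JSON value
// the Voice Capture SDK produced for the capture; the exact API for obtaining
// it depends on your SDK integration.
func addVoiceBody(enrollmentToken: String, sdkPackage: Any, phrase: String) -> Data? {
    let body: [String: Any] = [
        "enrollmentToken": enrollmentToken,
        "livenessData": [
            "voice": [
                "voiceSamples": [
                    ["data": sdkPackage, "Phrase": phrase]
                ],
                "workflow": "alfa2",
                "meta_data": [
                    // Trimmed; fill in the remaining fields per the example above.
                    "client_device_brand": "Apple",
                    "localization": "en-US"
                ]
            ]
        ]
    ]
    return try? JSONSerialization.data(withJSONObject: body)
}
```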
Response - Add Voice Sample
The response from the enrollment call returns:
- Liveness Result
  - This is a boolean value.
  - true if the sample is assessed to be live
  - false if the sample is assessed not to be live
- Enrollment Status
  - 0 = failed
  - 1 = pending
  - 2 = success
- Registration Code
  - This code is used in the re-enrollment process.
- Liveness Results
  - Broken down into several fields giving feedback on the liveness score.
Liveness Result
STATUS CODE 200

{
  "livenessResult": true,
  "enrollmentStatus": 2,
  "registrationCode": "LLXL2N",
  "livenessResults": {
    "voice": {
      "liveness_result": {
        "decision": "LIVE",
        "feedback": [],
        "score_frr": 1.1757098732441127
      }
    }
  }
}
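A hedged sketch of decoding this response with Codable. Only the top-level fields from the example are modeled; the nested livenessResults object is omitted for brevity:

```swift
import Foundation

// Hypothetical Codable type mirroring the response above.
struct AddVoiceResponse: Codable {
    let livenessResult: Bool
    let enrollmentStatus: Int
    let registrationCode: String
}

// Usage, assuming `responseData` holds the raw response body:
// let result = try JSONDecoder().decode(AddVoiceResponse.self, from: responseData)
// if result.livenessResult && result.enrollmentStatus == 2 { /* enrollment complete */ }
```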
Enrollment Process Initiated Through QR Code
To enroll by initiating through a QR code scan, we first take in the enrollment details (first name, last name, email address, and phone number) and make a REST call from a secondary application to generate the QR code. In our demo we use a web app for this process.
Generate QR Code
Web Portal
To start an enrollment using the QR code method we must first generate the QR code to be consumed by the client application (in our example we use a web app). This web application uses a UI to allow an end user to register using a username and email address, then uses these pieces of information in the API call to /triggerEnroll.
Base URL
www.awareid.aware-apis.com/onboarding/enrollment
Trigger Enroll
We initiate an enrollment by calling the /triggerEnroll endpoint with our received access token. The request accepts the following fields:
- username - required
- email - required
- notifyOptions - optional
Trigger Enroll
POST www.awareid.aware-apis.com/b2c/proxy/triggerEnroll
Content-Type: 'application/json; charset=UTF-8'

{
  "username": "[email protected]",
  "email": "[email protected]",
  "notifyOptions": {
    "notifyByEmail": false
  }
}
Response - Trigger Enroll
STATUS CODE 200 { status : "SUCCESS" sessionCallbackURL : "https://awareid-dev.aware-apis.com/onboarding?data=jgTW40dmoG6Hp_d6Rg7YaZ97vfGSlV5BcBJvLvqXVmhoQ2Hg2DcC2Kvr9AkTZ38ZkyIfiSj80QFxOWs1YeckYsp3D0D9vS46wppl1Zdt-tpiAdzlvBKA2DBfcj7rf0VePWUn1vKdIPgEoWAulqRxZ_mNakFB7FijLg0QJ8kYsB6w0Nk1A4m9QtLGIdHcuGn9XJnxooQHyr2yhtUsgfOo2FrRXYmFIF7ZNwxYd56miFCs-yuD6eZZcvZ1M01Wje7ji0NYUWVpdes-DA_P0cKgsLPX5sV7SyPSlf9kmoCQz7Ag20kAKkOf-LFFKQmgnJ3362nXIEovxS8vp4BCClu7vIfEVCE2s1zS7zNwrDuRfFdViVAQMMxDMe77LnbKbfvLqUhiv--wPFyV9Iier1EDSL9y5kikOw_PGSyuRzvbQKuoNdGj-IqVZYZ_5QivOFqq_OEt8jaX1zZxAiQ8uXRt3g" qrcodeImage : "iVBORw0KGgoAAAANSUhEUgAAAZAAAAGQAQAAAACoxAthAAAGgklEQVR42u2ca4okOQyEDb6WwVc36FoGr+ILZ9NZO7Dsj2VtqB7orsrKKPArFAopp6x//VO+kC/kC/lCvpAv5Av5TyCzlNJnH62u2SNG6yvyVX6Sl0bjsm+6AJIX8npe7ZE3r8gXK6F9VL3Mr8n783vmDZCmC+u5a9QVo5Q6av7O6dCUzNb2TRdA8u3IQY8cas0VbDlojVzX9cG4CaJ95/ty4HlTy99TG3FUrWD8zNHpELZlnjH97Tn8XL0hXGcqqpY45h928omQHHL7x5+/McyREH400tyGYgctX8udqOGbMYbvXudDRGylRkKSH7Qtpw5c39vSZyuhERdAtClzsFUHzDfnpwpQOQ35VVOzom17AySv5P1FxygUUrvpW4zX4XPF3ipOPB+i0YrAQ9c7E5HT0YlDXkHofPyivmMheQMiIbm6+BZ9hY7cgO+CtS513gDRkuXeU2jN9ZIEym3JJDjo7snp63xIDldSJ4hLU8vJ+4xMmgztxgy32pgXQNBoCqqKO5uyQ2vpYKt5CVP7BZDKyCukoDGLvwffkfB87+zhNfxTIYvX3prSb1o2Dl2yuoUbarvVGyBKdgYxJ+dgo8R7HYqvlnbaqhdAvOG8WKG/UIcUtcQooUlf+pv5j4UgFnSMFFa1qkoVYs/Jzg9GKVdAEJz6hHkIohB5gdJRWQRBdBoXQCbiJiDrHUvRCYXY9HBFrBeNHwqx8MmRwnadwJoLmbwnqwBXpygu9Rsg+qTVvaDcKLkzGrtTGQKKtf9e/VMhJG0kPdgESnRimSRQ2Q67s88bIHWSOFfGzN8aayvrhZ+mtLv8ziyOhXhwRfQnqhNhB/qUuKQdSmrXb4BgZbCMOILDHhRJdsW8tdnxXv0zIUoJQpaglq0KSsaGOSiigMFHq1dAhFEaSlSqj0cg/YY0NXnUl91xLEScNm0SYN3s99XpqI5XtZ97PgTnRltS9IBOGDi1yOvRt59rUXc6ZNkWgPds4izyOaXV4j1kD07u+RDWr9s9G1BGs2iY+DVkCfg5N0A4UjsMicPF2+g5CMPhtY5yA4QiIP/w0UyDaDo7OtuRem/LQyFyaIbtJ0lRqwKN/ImoyndEhBdAyBMaJjPOxiC3FvdJ1HUoEc2wLoAEZQGJHukd3LNpce3kh5Rovko2B0MYqZZQqgH7dtQ9K6hrMciHnX4ohNRteC9WzGfVN2yDSDfYWmtXQMQRBJxJwjasPJ2+UU1zx0PcAJGCI7BWCmfiCZ8vveASgbZeALFJw9rpmElnUyeEN6oz6vqWo8dCsJwG5Wfqm5Rpm7dlUIdyNveqWB0L0ZmqHDSUT8TjGljWyQBpH8M/FELwkfZ0QCo2PnClFul
bNY3XCyDk1U0OVOHD5Son5pQyOmaDLzofgsLBUqPVROy3XTWXnC0jPuyOUyF17NO1iY7Os8CRGs2OzofsORhCprkpgdSnWAPtuvNjU/XzIShni9GwUCNp++kSqpsZ+7oAglIY5DxUO5EJ9ekKEvWFCx0XQBRN6fwThY9d5FzhSfBm7bEugRBFN5haLcna8AZ1Z217l9JOhZCiifpg6uEyusuC5J8D1/Dd3XEqhJasSc+sKxs0AcvtoFnTDDLeztWxEDKCnVZTbhoUcNyHojf0a7UrIBFuMGueAla1EaGQccUL+dHZdShktMfUeEIrGpU2mt1G67aUCyC7BCWCQ65Vmwbbx227iWOUGyB2biaKFNtz9/6Q1E3rnYccD4dM5wfTq4iBC6zY6Vxu1yjvtPpQyB6bqhr8fbqbGs4B1bSn2fF8yCTntOdB2jYwA5FtaFQqn7/b5o+FoKG9jLRn6qO5S2jByOWvv1rOzoXQuYSXyWGi6TR4eIYsruzU+v1oxqEQBRwCD0n1CofSsHmwN2282ufOhTSiZ8GypUFLXu3urNk1gpj9N40fCyG3EWMPn7MqU8rPMvipoHgI8gKIVQHKE6vWCmhi57g9c1oRXQAZPl1UzG2o+7GZgjtFsPp5ivF0yH76h+fM+l476gMkbc2tW/PzUaYjIXMXMM3XcLrO2DCNuz/7p8J5OqS7sKlsYbo7q6MbKKDblf5DmeNMSMOe5WCR7CgxYMDUnx576qPl7GRIoW1GeVxzY7Yrt8VZT3meZrgEMvw8FmYhxqcfNCVguXvrAshu8e8PBXITzzPZfn5agm6AiC6oyrrdLObz3CK19e6H/+qKCyDf/73hC/lCvpAv5Av5Qv5nyF+w76Y2yWY7wwAAAABJRU5ErkJggg==" sessionToken : "aa73e547-0f1b-4235-a7b0-dd52fa4ab774" errorSummary : null }
The trigger enroll response includes five pieces of information, the most relevant being the base64-encoded QR code image, which is displayed to the user so they can continue the enrollment on their device.
With the QR code generated in our web application, we then scan it and complete the enrollment from the client-side application. The steps are:
- Scan QR code and decrypt data.
- Initiate an enrollment.
- Add device
- Enroll voice
The QR code returns a URL with an encrypted parameter named “data”.
This parameter must be decrypted using the available public key. Once decrypted, it contains three pieces of information separated by “&”:
- Host URL
  - All subsequent API calls are made against this URL.
- API Key
  - This API key is used in the header of API calls.
  - The key-value pair in the header is as follows: “apikey”:API_Key
- Session Token
  - The session token is used to validate the session.
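Assuming decryption has already been performed, splitting the payload is straightforward. This is a sketch; the helper name and the tuple return shape are ours:

```swift
// Splits the decrypted QR payload into its three components. Decryption with
// the available public key is assumed to have happened already; the component
// order follows the description above.
func parseQRPayload(_ decrypted: String) -> (hostURL: String, apiKey: String, sessionToken: String)? {
    let parts = decrypted.components(separatedBy: "&")
    guard parts.count == 3 else { return nil }
    return (hostURL: parts[0], apiKey: parts[1], sessionToken: parts[2])
}
```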
Enrollment Using QR Code Step 1 - Validate Session Token
The first API call necessary to enroll a user is /tokenVerify/validateSession.
Validate Session
POST /tokenVerify/validateSession
"apikey": apiKey

{
  "sessionToken": sessionToken
}
Urlencoded
This is the only call whose content type is “application/x-www-form-urlencoded”.
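A sketch of building the form-urlencoded validateSession request in Swift. The helper name is ours; the base URL and API key come from the decrypted QR payload:

```swift
import Foundation

// validateSession is the one call in the flow that uses a form-urlencoded
// body (see the note above); everything else is JSON.
func validateSessionRequest(baseURL: URL, apiKey: String, sessionToken: String) -> URLRequest {
    var request = URLRequest(url: baseURL.appendingPathComponent("tokenVerify/validateSession"))
    request.httpMethod = "POST"
    request.setValue(apiKey, forHTTPHeaderField: "apikey")
    request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
    var components = URLComponents()
    components.queryItems = [URLQueryItem(name: "sessionToken", value: sessionToken)]
    request.httpBody = components.percentEncodedQuery?.data(using: .utf8)
    return request
}
```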
Response - Validate Session Token
{
  "accessToken": "accessToken",
  "methodType": "enroll",
  "customerName": "customerName",
  "customerLogo": "",
  "userName": "customerUsername",
  "email": "customerEmail"
}
Enrollment Using QR Code Step 2 - Initiate An Enrollment
Using the method type returned by validateSession along with the access token, session token, and API key, we can initiate the enrollment.
Initiate An Enrollment
POST /enroll
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "username": "username",
  "firstName": "first name",    //optional
  "lastName": "last name",      //optional
  "email": "user email",
  "phoneNumber": "user phonenumber"
}
Response - Initiate An Enrollment
STATUS CODE 200

{
  "enrollmentToken": "enrollmentToken",
  "userExistsAlready": false,
  "requiredChecks": [
    "addDevice",
    "addVoice"
  ]
}
Enrollment Using QR Code Step 3 - Add Device
The device ID is checked when performing an authentication to confirm the device enrolled is the same as the device attempting the authentication.
Device Id
This can be retrieved in iOS with the following code:

let deviceId = UIDevice.current.identifierForVendor?.uuidString
Add Device
POST /adddevice
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "enrollmentToken": "enrollmentToken",
  "deviceId": "deviceID"
}
Response - Add Device
{
  "enrollmentStatus": 1,
  "registrationCode": ""
}
From here the response will include a registration code and enrollment status.
There are 3 enrollment statuses:
Enrollment Statuses
0 = Enrollment Failed
1 = Enrollment Pending
2 = Enrollment Complete
Enrollment Using QR Code Step 4 - Add voice sample and check whether the sample belongs to a live person.
The add voice API call requires the JSON package generated by the Voice Capture SDK.
Add Voice Sample
POST /addVoice
Content-Type: 'application/json; charset=UTF-8'
Authorization: 'Bearer AccessToken'

{
  "enrollmentToken": "{{enrollment_token}}",
  "livenessData": {
    "voice": {
      "voiceSamples": [
        {
          "data": {{LiveVoice}},
          "Phrase": "\"Hello, Testing BIMaaS Application.\""
        }
      ],
      "workflow": "alfa2",
      "meta_data": {
        "client_device_brand": "Apple",
        "client_device_model": "iPhone 8",
        "client_os_version": "11.0.3",
        "client_version": "KnomiSLive_v:2.4.1_b:0.0.0_sdk_v:2.4.1_b:0.0.0",
        "localization": "en-US",
        "programming_language_version": "Swift 4.1",
        "username": "test"
      }
    }
  }
}
Response - Add Voice Sample
The response from the enrollment call returns:
- Liveness Result
  - This is a boolean value.
  - true if the sample is assessed to be live
  - false if the sample is assessed not to be live
- Enrollment Status
  - 0 = failed
  - 1 = pending
  - 2 = success
- Registration Code
  - This code is used in the re-enrollment process.
- Liveness Results
  - Broken down into several fields giving feedback on the liveness score.
Liveness Result
STATUS CODE 200

{
  "livenessResult": true,
  "enrollmentStatus": 2,
  "registrationCode": "LLXL2N",
  "livenessResults": {
    "voice": {
      "liveness_result": {
        "decision": "LIVE",
        "feedback": [],
        "score_frr": 1.1757098732441127
      }
    }
  }
}
Authentication Workflow
Like the enrollment process, authentication can be completed in two variations. The first initiates the authentication from the client side; the second uses a QR code scan to initiate the authentication process and the client application to complete it. Below we explain how to achieve both options, beginning with client-initiated authentication.
Client Initiated Authentication
A client initiated authentication is performed in 3 steps:
- Initiate authentication
- Verify device
- Verify voice
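The three steps above chain together, each call consuming output from the previous one. A minimal sketch of the flow using Swift concurrency; the request helpers are hypothetical stand-ins for the endpoints documented in this section:

```swift
import Foundation

// Hypothetical helpers wrapping the three endpoints; each would build the
// corresponding URLRequest and decode the documented response.
func initiateAuthentication(registrationCode: String, deviceId: String) async throws -> String {
    // POST /onboarding/authentication/authenticate -> returns authToken
    fatalError("sketch only")
}
func verifyDevice(authToken: String, deviceId: String) async throws {
    // POST /onboarding/authentication/verifyDevice
    fatalError("sketch only")
}
func verifyVoice(authToken: String, sdkPackage: Data) async throws -> Bool {
    // POST /onboarding/authentication/verifyVoice -> liveness and match results
    fatalError("sketch only")
}

// The overall client-initiated authentication flow:
func authenticate(registrationCode: String, deviceId: String, sdkPackage: Data) async throws -> Bool {
    let authToken = try await initiateAuthentication(registrationCode: registrationCode,
                                                     deviceId: deviceId)
    try await verifyDevice(authToken: authToken, deviceId: deviceId)
    return try await verifyVoice(authToken: authToken, sdkPackage: sdkPackage)
}
```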
Base URL
www.awareid.aware-apis.com
Authentication Initiated from Client Step 1 - Initiate Authentication
Initiate Authentication
POST /onboarding/authentication/authenticate
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "registrationCode": "registrationToken",
  "deviceId": "deviceID"
}
Response - Initiate Authentication
STATUS CODE 200

{
  "authToken": "b8bf6f22-6f93-4bcb-a5b6-871b689c6750",
  "requiredChecks": [
    "verifyDevice",
    "verifyVoice"
  ]
}
Authentication Initiated from Client Step 2 - Verify Device
Verify Device
POST /onboarding/authentication/verifyDevice
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "authToken": "authToken",
  "signature": "signature",
  "deviceId": "deviceID"
}
Response - Verify Device
STATUS CODE 200

{
  "message": "Device verified.",
  "authStatus": 1
}
Authentication Initiated from Client Step 3 - Verify Voice
Verify Voice
POST /onboarding/authentication/verifyVoice

{
  "authToken": "auth_token",
  "livenessData": {
    "voice": {
      "voiceSamples": [
        {
          "data": "base64 voice sample package from SDK",
          "Phrase": "\"Hello, Testing BIMaaS Application.\""
        }
      ],
      "workflow": "alfa2",
      "meta_data": {
        "client_device_brand": "Apple",
        "client_device_model": "iPhone 8",
        "client_os_version": "11.0.3",
        "client_version": "KnomiSLive_v:2.4.1_b:0.0.0_sdk_v:2.4.1_b:0.0.0",
        "localization": "en-US",
        "programming_language_version": "Swift 4.1",
        "username": "test"
      }
    }
  }
}
Response - Verify Voice
{
  "livenessResult": true,
  "matchResult": true,
  "matchScore": 100,
  "authStatus": "2 (Complete)",
  "voiceLivenessResults": ""
}
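A hedged sketch of decoding and interpreting this response. Note that in the example authStatus is returned as a string ("2 (Complete)"), so it is modeled as String here:

```swift
import Foundation

// Hypothetical Codable type mirroring the verifyVoice response above.
struct VerifyVoiceResponse: Codable {
    let livenessResult: Bool
    let matchResult: Bool
    let matchScore: Int
    let authStatus: String
}

// Authentication succeeds only when both liveness and match pass:
// let response = try JSONDecoder().decode(VerifyVoiceResponse.self, from: responseData)
// let authenticated = response.livenessResult && response.matchResult
```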
Authentication Using QR Code
To authenticate by initiating through a QR code scan, we first take in the authentication details and make a REST call from a secondary application to generate the QR code. In our demo we use a web app for this process.
Generate QR Code
Web Portal
To start an authentication using the QR code method we must first generate the QR code to be consumed by the client application (in our example we use a web app). This web application uses a UI to allow an end user to start an authentication using their username, then uses this information in the API call to /triggerAuthenticate.
Base URL
www.awareid.aware-apis.com
Trigger Authenticate
We initiate an authentication by calling the /triggerAuthenticate endpoint. The request accepts the following fields:
- username - required
- notifyOptions - optional
Trigger Authenticate
POST baseUrl + /b2c/proxy/triggerAuthenticate
Content-Type: 'application/json; charset=UTF-8'

{
  "username": "[email protected]",
  "notifyOptions": {
    "notifyByPush": true
  }
}
Response - Trigger Authenticate
STATUS CODE 200 { status : "SUCCESS" sessionCallbackURL : "https://awareid-dev.aware-apis.com/onboarding?data=jgTW40dmoG6Hp_d6Rg7YaZ97vfGSlV5BcBJvLvqXVmhoQ2Hg2DcC2Kvr9AkTZ38ZkyIfiSj80QFxOWs1YeckYsp3D0D9vS46wppl1Zdt-tpiAdzlvBKA2DBfcj7rf0VePWUn1vKdIPgEoWAulqRxZ_mNakFB7FijLg0QJ8kYsB6w0Nk1A4m9QtLGIdHcuGn9XJnxooQHyr2yhtUsgfOo2FrRXYmFIF7ZNwxYd56miFCs-yuD6eZZcvZ1M01Wje7ji0NYUWVpdes-DA_P0cKgsLPX5sV7SyPSlf9kmoCQz7Ag20kAKkOf-LFFKQmgnJ3362nXIEovxS8vp4BCClu7vIfEVCE2s1zS7zNwrDuRfFdViVAQMMxDMe77LnbKbfvLqUhiv--wPFyV9Iier1EDSL9y5kikOw_PGSyuRzvbQKuoNdGj-IqVZYZ_5QivOFqq_OEt8jaX1zZxAiQ8uXRt3g" qrcodeImage : "iVBORw0KGgoAAAANSUhEUgAAAZAAAAGQAQAAAACoxAthAAAGgklEQVR42u2ca4okOQyEDb6WwVc36FoGr+ILZ9NZO7Dsj2VtqB7orsrKKPArFAopp6x//VO+kC/kC/lCvpAv5Av5TyCzlNJnH62u2SNG6yvyVX6Sl0bjsm+6AJIX8npe7ZE3r8gXK6F9VL3Mr8n783vmDZCmC+u5a9QVo5Q6av7O6dCUzNb2TRdA8u3IQY8cas0VbDlojVzX9cG4CaJ95/ty4HlTy99TG3FUrWD8zNHpELZlnjH97Tn8XL0hXGcqqpY45h928omQHHL7x5+/McyREH400tyGYgctX8udqOGbMYbvXudDRGylRkKSH7Qtpw5c39vSZyuhERdAtClzsFUHzDfnpwpQOQ35VVOzom17AySv5P1FxygUUrvpW4zX4XPF3ipOPB+i0YrAQ9c7E5HT0YlDXkHofPyivmMheQMiIbm6+BZ9hY7cgO+CtS513gDRkuXeU2jN9ZIEym3JJDjo7snp63xIDldSJ4hLU8vJ+4xMmgztxgy32pgXQNBoCqqKO5uyQ2vpYKt5CVP7BZDKyCukoDGLvwffkfB87+zhNfxTIYvX3prSb1o2Dl2yuoUbarvVGyBKdgYxJ+dgo8R7HYqvlnbaqhdAvOG8WKG/UIcUtcQooUlf+pv5j4UgFnSMFFa1qkoVYs/Jzg9GKVdAEJz6hHkIohB5gdJRWQRBdBoXQCbiJiDrHUvRCYXY9HBFrBeNHwqx8MmRwnadwJoLmbwnqwBXpygu9Rsg+qTVvaDcKLkzGrtTGQKKtf9e/VMhJG0kPdgESnRimSRQ2Q67s88bIHWSOFfGzN8aayvrhZ+mtLv8ziyOhXhwRfQnqhNhB/qUuKQdSmrXb4BgZbCMOILDHhRJdsW8tdnxXv0zIUoJQpaglq0KSsaGOSiigMFHq1dAhFEaSlSqj0cg/YY0NXnUl91xLEScNm0SYN3s99XpqI5XtZ97PgTnRltS9IBOGDi1yOvRt59rUXc6ZNkWgPds4izyOaXV4j1kD07u+RDWr9s9G1BGs2iY+DVkCfg5N0A4UjsMicPF2+g5CMPhtY5yA4QiIP/w0UyDaDo7OtuRem/LQyFyaIbtJ0lRqwKN/ImoyndEhBdAyBMaJjPOxiC3FvdJ1HUoEc2wLoAEZQGJHukd3LNpce3kh5Rovko2B0MYqZZQqgH7dtQ9K6hrMciHnX4ohNRteC9WzGfVN2yDSDfYWmtXQMQRBJxJwjasPJ2+UU1zx0PcAJGCI7BWCmfiCZ8vveASgbZeALFJw9rpmElnUyeEN6oz6vqWo8dCsJwG5Wfqm5Rpm7dlUIdyNveqWB0L0ZmqHDSUT8TjGljWyQBpH8M/FELwkfZ0QCo2PnClFul
bNY3XCyDk1U0OVOHD5Son5pQyOmaDLzofgsLBUqPVROy3XTWXnC0jPuyOUyF17NO1iY7Os8CRGs2OzofsORhCprkpgdSnWAPtuvNjU/XzIShni9GwUCNp++kSqpsZ+7oAglIY5DxUO5EJ9ekKEvWFCx0XQBRN6fwThY9d5FzhSfBm7bEugRBFN5haLcna8AZ1Z217l9JOhZCiifpg6uEyusuC5J8D1/Dd3XEqhJasSc+sKxs0AcvtoFnTDDLeztWxEDKCnVZTbhoUcNyHojf0a7UrIBFuMGueAla1EaGQccUL+dHZdShktMfUeEIrGpU2mt1G67aUCyC7BCWCQ65Vmwbbx227iWOUGyB2biaKFNtz9/6Q1E3rnYccD4dM5wfTq4iBC6zY6Vxu1yjvtPpQyB6bqhr8fbqbGs4B1bSn2fF8yCTntOdB2jYwA5FtaFQqn7/b5o+FoKG9jLRn6qO5S2jByOWvv1rOzoXQuYSXyWGi6TR4eIYsruzU+v1oxqEQBRwCD0n1CofSsHmwN2282ufOhTSiZ8GypUFLXu3urNk1gpj9N40fCyG3EWMPn7MqU8rPMvipoHgI8gKIVQHKE6vWCmhi57g9c1oRXQAZPl1UzG2o+7GZgjtFsPp5ivF0yH76h+fM+l476gMkbc2tW/PzUaYjIXMXMM3XcLrO2DCNuz/7p8J5OqS7sKlsYbo7q6MbKKDblf5DmeNMSMOe5WCR7CgxYMDUnx576qPl7GRIoW1GeVxzY7Yrt8VZT3meZrgEMvw8FmYhxqcfNCVguXvrAshu8e8PBXITzzPZfn5agm6AiC6oyrrdLObz3CK19e6H/+qKCyDf/73hC/lCvpAv5Av5Qv5nyF+w76Y2yWY7wwAAAABJRU5ErkJggg==" sessionToken : "aa73e547-0f1b-4235-a7b0-dd52fa4ab774" errorSummary : null }
The trigger authenticate response includes five pieces of information, the most relevant being the base64-encoded QR code image, which is displayed to the user so they can continue the authentication on their device.
With the QR code generated in our web application, we then scan it and complete the authentication. The authentication workflow involves 4 steps:
- Scan QR code and decrypt data.
- Initiate authentication
- Verify device
- Verify voice
The QR code returns a URL with an encrypted parameter named “data”.
This parameter must be decrypted using the available public key. Once decrypted, it contains three pieces of information separated by “&”:
- Host URL
  - All subsequent API calls are made against this URL.
- API Key
  - This API key is used in the header of API calls.
  - The key-value pair in the header is as follows: “apikey”:API_Key
- Session Token
  - The session token is used to validate the session.
Authentication Using QR Code Step 1 - Validate Session Token
The first API call necessary to authenticate a user is /tokenVerify/validateSession.
Validate Session
POST /tokenVerify/validateSession
"apikey": apiKey

{
  "sessionToken": sessionToken
}
Urlencoded
This is the only call whose content type is “application/x-www-form-urlencoded”.
Response - Validate Session Token
{
  "accessToken": "accessToken",
  "methodType": "authenticate",
  "customerName": "customerName",
  "customerLogo": "",
  "userName": "customerUsername",
  "email": "customerEmail"
}
Authentication Using QR Code Step 2 - Initiate Authentication
Initiate Authentication
POST /authenticate
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "registrationCode": "registrationToken",
  "deviceId": "deviceID"
}
Response - Initiate Authentication
STATUS CODE 200

{
  "authToken": "b8bf6f22-6f93-4bcb-a5b6-871b689c6750",
  "requiredChecks": [
    "verifyDevice",
    "verifyVoice"
  ]
}
Authentication Using QR Code Step 3 - Verify Device
Verify Device
POST /onboarding/authentication/verifyDevice
Authorization: 'Bearer AccessToken'
apikey: 'apikey'

{
  "authToken": "authToken",
  "signature": "signature",
  "deviceId": "deviceID"
}
Response - Verify Device
STATUS CODE 200

{
  "message": "Device verified.",
  "authStatus": 1
}
Authentication Using QR Code Step 4 - Verify Voice
Verify Voice
POST /onboarding/authentication/verifyVoice

{
  "authToken": "auth_token",
  "livenessData": {
    "voice": {
      "voiceSamples": [
        {
          "data": "base64 voice sample package from SDK",
          "Phrase": "\"Hello, Testing BIMaaS Application.\""
        }
      ],
      "workflow": "alfa2",
      "meta_data": {
        "client_device_brand": "Apple",
        "client_device_model": "iPhone 8",
        "client_os_version": "11.0.3",
        "client_version": "KnomiSLive_v:2.4.1_b:0.0.0_sdk_v:2.4.1_b:0.0.0",
        "localization": "en-US",
        "programming_language_version": "Swift 4.1",
        "username": "test"
      }
    }
  }
}
Response - Verify Voice
{
  "livenessResult": true,
  "matchResult": true,
  "matchScore": 100,
  "authStatus": "2 (Complete)",
  "voiceLivenessResults": ""
}
Software Acknowledgments
Aware VoiceCaptureAwareId libraries incorporate third-party software. See the LICENSE file installed with the VoiceCaptureAwareId software for the full license agreement.