iLive Docs

iOS quickstart

Get face liveness detection running in your iOS app in under five minutes.

What you'll build

A single button that launches the iLive liveness verification flow — camera preview, face detection, on-device analysis — and returns a verdict (.pass / .fail / .retry) with confidence scores. All inference runs on-device.

Requirements

Requirement             Minimum
Xcode                   15.0+
Swift                   5.9+
iOS deployment target   14.0
CocoaPods               1.14+
Device                  iPhone with front camera (arm64)

Step 1: Install dependencies

Podfile:

platform :ios, '14.0'
 
target 'YourApp' do
  use_frameworks! :linkage => :static
 
  pod 'ILive', '~> 1.0'
end

Then run:

pod install

CocoaPods pulls in all transitive native dependencies automatically.

Step 2: Add camera permission

Info.plist:

<key>NSCameraUsageDescription</key>
<string>Camera access is required for face liveness verification.</string>
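It also helps to confirm camera authorization before presenting the flow. The sketch below uses standard AVFoundation APIs, not the iLive SDK; `ensureCameraAccess` is a hypothetical helper name.

```swift
import AVFoundation

// Standard AVFoundation permission check; not an iLive SDK API.
func ensureCameraAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    default:
        // .denied / .restricted: direct the user to Settings instead
        completion(false)
    }
}
```

Call this before launching the liveness flow and only present the view controller when it reports `true`.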

Step 3: Add the SDK model assets

Model files are distributed separately from the SDK source. Your SDK distribution bundle includes a models/ directory with the full list of files and their expected sizes. Add every file inside it to your Xcode app target's Copy Bundle Resources build phase before building.
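A launch-time sanity check can catch a missing Copy Bundle Resources entry before the SDK fails at runtime. This sketch uses plain Foundation APIs; populate the name list from your SDK distribution's models/ directory ("face_detector.bin" below is a made-up example, not a real file name).

```swift
import Foundation

// Verify that every expected model file made it into the app bundle.
// Fill `names` from your SDK distribution's models/ directory;
// "face_detector.bin" is a hypothetical example.
func modelsArePresent(_ names: [String]) -> Bool {
    names.allSatisfy { name in
        Bundle.main.url(forResource: name, withExtension: nil) != nil
    }
}

// let ok = modelsArePresent(["face_detector.bin"])
```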

Step 4: Launch the liveness flow

Present the pre-built LivenessViewController:

import UIKit
import ILiveCore
import ILiveUI
 
class ViewController: UIViewController {
 
    @IBAction func startVerification() {
        let config = ILiveConfig(
            passThreshold: 0.70,
            voicePromptsEnabled: false,
            photoExtractionEnabled: true
        )
 
        let vc = LivenessViewController(config: config, passiveMode: true)
        vc.modalPresentationStyle = .fullScreen
 
        vc.onResult = { [weak self] result in
            self?.handleResult(result)
        }
 
        vc.onDismiss = { [weak self] in
            self?.dismiss(animated: true)
        }
 
        present(vc, animated: true)
    }
 
    private func handleResult(_ result: LivenessResult) {
        switch result.verdict {
        case .pass:
            print("Verified! Confidence: \(Int(result.confidence * 100))%")
            if let photoData = result.icaoPhoto {
                let photo = UIImage(data: photoData)
                // Display or upload the extracted passport photo here.
            }
        case .retry:
            print("Please try again: \(result.retryHint ?? "")")
        case .fail:
            print("Failed: \(result.failureReason ?? "")")
        }
 
        for score in result.layerScores {
            print("  \(score.layer): \(Int(score.score * 100))%")
        }
    }
}

Configuration

let config = ILiveConfig(
    // Challenge settings (active mode)
    challengeCount: 4,                    // 3-6 challenges per session
    challengeTimeoutSeconds: 10,          // seconds per challenge
    challengeTypes: [.blink, .smile, .turnLeft, .nod],
 
    // Verdict thresholds
    passThreshold: 0.75,                  // minimum confidence for PASS
    retryThreshold: 0.45,                 // below this = FAIL
    antispoofFloor: 0.30,                 // anti-spoof veto threshold
    deepfakeFloor: 0.20,                  // deepfake veto threshold
 
    // Voice prompts
    voicePromptsEnabled: true,
    voiceLanguage: "en-US",
 
    // Photo extraction
    photoExtractionEnabled: true,
    photoWidth: 480,
    photoHeight: 600,
    photoJpegQuality: 95,
 
    // Security
    attestationKey: myHmacKeyData,        // 32+ bytes for HMAC-SHA256
    frameBundleEncryptionKey: myAesKey,   // exactly 32 bytes for AES-256-GCM
 
    // Timeouts
    totalSessionTimeoutSeconds: 120
)

Passive mode vs challenge mode

Passive (passiveMode: true): Camera captures frames silently for about 3 seconds, then runs anti-spoof, deepfake, face-consistency, and motion analysis. No user interaction. Use it for low-friction onboarding and background verification.

Challenge (passiveMode: false): The user completes 3–6 randomized challenges (blink, turn head, smile, etc.), then analysis runs; the challenge score is an additional verification layer. Use it for high-security scenarios and regulatory compliance.

Both modes use the same underlying analysis pipeline. Passive mode redistributes the challenge weight across the other four layers.
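To build intuition for that redistribution, here is an illustrative sketch. The SDK's actual layer weights are internal; the values below are hypothetical and chosen only so the arithmetic is easy to follow.

```swift
import Foundation

// Hypothetical challenge-mode layer weights (the SDK's real weights
// are internal); they sum to 1.0.
let challengeModeWeights: [String: Double] = [
    "antiSpoof": 0.30, "deepfake": 0.25, "consistency": 0.15,
    "motion": 0.10, "challenge": 0.20
]

// Passive mode drops the challenge layer and spreads its weight
// proportionally across the remaining four layers.
func passiveWeights(from weights: [String: Double]) -> [String: Double] {
    var w = weights
    guard let challenge = w.removeValue(forKey: "challenge") else { return w }
    let remaining = w.values.reduce(0, +)
    return w.mapValues { $0 + challenge * ($0 / remaining) }
}

let passive = passiveWeights(from: challengeModeWeights)
// passive still sums to 1.0; with these numbers antiSpoof becomes 0.375
```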

LivenessResult fields

Field                   Type              Description
sessionId               String            Unique session identifier (UUID)
verdict                 Verdict           .pass, .fail, or .retry
confidence              Float             Weighted aggregate score (0.0–1.0)
layerScores             [LayerScore]      Per-layer breakdown (anti-spoof, deepfake, consistency, motion)
retryHint               String?           User-facing suggestion on retry
failureReason           String?           Internal reason on fail
icaoPhoto               Data?             JPEG passport photo (480×600) on pass
photoQualityScore       Float?            Photo quality (0.0–1.0)
attestation             Attestation?      HMAC-SHA256 signed payload (if key configured)
encryptedFrameBundle    FrameBundle?      AES-256-GCM encrypted frames (if key configured)
metadata                SessionMetadata   Duration, device, SDK version

Security

Attestation (HMAC-SHA256 via CryptoKit):

let config = ILiveConfig(
    attestationKey: myKey  // Data, 32+ bytes
)
 
// After verification
if let att = result.attestation {
    verifyOnServer(payload: att.payload, signature: att.signature)
}
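On the server, verification recomputes the HMAC over the payload and compares it to the signature. A minimal CryptoKit sketch, assuming the signature is HMAC-SHA256 over the raw payload bytes (confirm the exact signing input with the SDK documentation):

```swift
import CryptoKit
import Foundation

// Recompute HMAC-SHA256 over the payload and compare with the
// signature; assumes the signature covers the raw payload bytes.
func isValidAttestation(payload: Data, signature: Data, key: Data) -> Bool {
    let expected = HMAC<SHA256>.authenticationCode(
        for: payload,
        using: SymmetricKey(data: key)
    )
    return Data(expected) == signature
}
```

In production, prefer a constant-time comparison on the server side.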

Frame bundle encryption (AES-256-GCM via CryptoKit):

let config = ILiveConfig(
    frameBundleEncryptionKey: myAesKey  // Data, exactly 32 bytes
)
 
// Encrypted frame bundle for server-side audit
if let bundle = result.encryptedFrameBundle {
    uploadToServer(ciphertext: bundle.ciphertext, iv: bundle.iv)
}
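On the receiving end, the bundle decrypts with the same 32-byte key. A CryptoKit sketch, assuming the 16-byte GCM tag is appended to the ciphertext (verify the actual layout against the SDK's frame bundle documentation):

```swift
import CryptoKit
import Foundation

// Decrypt an AES-256-GCM frame bundle; assumes a 16-byte tag is
// appended to the ciphertext. Confirm the layout with the SDK docs.
func decryptFrameBundle(ciphertext: Data, iv: Data, key: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(
        nonce: AES.GCM.Nonce(data: iv),
        ciphertext: ciphertext.dropLast(16),
        tag: ciphertext.suffix(16)
    )
    return try AES.GCM.open(box, using: SymmetricKey(data: key))
}
```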

Device support check

let support = ILive.isDeviceSupported()
if support.isSupported {
    // Safe to launch the liveness flow
} else {
    print("Issues: \(support.issues.joined(separator: ", "))")
}

SwiftUI integration

import SwiftUI
import ILiveUI
 
struct VerificationView: View {
    @State private var showLiveness = false
    @State private var result: LivenessResult?
 
    var body: some View {
        VStack(spacing: 20) {
            if let result = result {
                Text(result.verdict.rawValue.uppercased())
                    .font(.title)
                    .foregroundColor(result.verdict == .pass ? .green : .red)
                Text("Confidence: \(Int(result.confidence * 100))%")
            }
 
            Button("Verify Identity") {
                showLiveness = true
            }
        }
        .sheet(isPresented: $showLiveness) {
            LivenessViewControllerRepresentable(
                passiveMode: true,
                onResult: { livenessResult in
                    result = livenessResult
                    showLiveness = false
                }
            )
        }
    }
}
 
// UIKit wrapper for SwiftUI
struct LivenessViewControllerRepresentable: UIViewControllerRepresentable {
    let passiveMode: Bool
    let onResult: (LivenessResult) -> Void
 
    func makeUIViewController(context: Context) -> LivenessViewController {
        let vc = LivenessViewController(passiveMode: passiveMode)
        vc.onResult = onResult
        vc.onDismiss = { }
        return vc
    }
 
    func updateUIViewController(_ vc: LivenessViewController, context: Context) {}
}

Face comparison

Compare the verified face against a reference photo (for example, an ID document):

let match = ILive.compareFaces(
    reference: idDocumentJpeg,
    probe: livenessResult.icaoPhoto!
)
 
if let match = match, match.isMatch {
    print("Same person: \(Int(match.similarity * 100))% match")
}

For faster repeat comparisons, store the face embedding from the liveness result:

// Store the embedding after the first verification
let embedding = livenessResult.faceEmbedding  // 512-dimensional
 
// Later, compare against a new photo without reloading the model
let match2 = ILive.compareFaceWithEmbedding(
    referenceEmbedding: embedding!,
    probe: newPhotoData
)

Parameter    Default   Description
threshold    0.45      Minimum similarity for a match (0.0–1.0)
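The SDK computes similarity internally; cosine similarity is a common metric for comparing face embeddings and is sketched below purely to build intuition for the threshold. This is illustrative only, not the SDK's actual scoring code.

```swift
// Illustrative cosine similarity between two embeddings of equal
// dimension (e.g. the 512-dim face embeddings above). Not SDK code.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    precondition(a.count == b.count, "embeddings must match in length")
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in a.indices {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (normA.squareRoot() * normB.squareRoot())
}

let identical = cosineSimilarity([1, 0, 0], [1, 0, 0])   // 1.0
let unrelated = cosineSimilarity([1, 0, 0], [0, 1, 0])   // 0.0
```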

Troubleshooting

Missing module errors when building a framework target: The SDK's native dependencies must be linked via an app host target — use use_frameworks! :linkage => :static in your Podfile.

Pod install "statically linked binaries" warning: Safe to ignore. Some dependencies distribute static xcframeworks which trigger this warning but work correctly with static linkage.

Models not found at runtime: Ensure model files are in the Copy Bundle Resources build phase of your app target, not just added to the project navigator.

Face not detected: Ensure camera frames are in BGRA format (kCVPixelFormatType_32BGRA) and portrait orientation. The face tracker expects this format.
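If you run your own capture session and feed frames to the SDK, the format requirement can be enforced at configuration time. This is standard AVFoundation setup, not an iLive API:

```swift
import AVFoundation

// Force BGRA pixel output and portrait orientation so frames match
// what the face tracker expects. Standard AVFoundation, not SDK API.
func configureCaptureOutput(_ output: AVCaptureVideoDataOutput) {
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    if let connection = output.connection(with: .video) {
        connection.videoOrientation = .portrait
    }
}
```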

Low confidence scores: Verification performs best with even, front-facing illumination. Test in a well-lit environment.

Simulator not available: Build for device only with -sdk iphoneos if your Xcode installation doesn't have simulator runtimes installed.

Platform comparison

Feature           iOS                         Android
Face tracking     On-device                   On-device
Camera            AVFoundation                On-device
UI framework      SwiftUI + UIKit             Jetpack Compose
Crypto            CryptoKit (HMAC, AES-GCM)   javax.crypto
Package manager   CocoaPods                   Gradle
Offline           Fully offline               Fully offline