How to Auto-Cut POV Footage into B-Roll Clips for TikTok/Reels with Creator Mode on iOS & Android

Authors
  • Almaz Khalilov

TL;DR

  • You’ll build: a simple mobile app feature that connects to smart glasses and automatically produces short B-roll video clips from first-person (POV) footage, ready to share on TikTok or Reels.
  • You’ll do: Get access to the Wearables SDK → Install the SDK in Xcode/Android Studio → Run the sample app on iOS and Android → Integrate the Creator Mode SDK into your own app → Test using a wearable device or mock simulator.
  • You’ll need: a Meta developer account (Wearables Toolkit preview access), a pair of Ray-Ban Meta smart glasses (or SDK mock device), and a development environment (Xcode 15+ for iOS, Android Studio for Android).

1) What is Creator Mode?

What it enables

  • Hands-free POV content capture: Leverage the smart glasses' 12MP camera and mics to film from a first-person perspective without holding a camera. This provides a natural "eyesight" view for your videos.
  • Automatic highlight detection: The SDK can identify interesting moments in long, unstructured first-person videos and extract the best segments as highlights – saving you from manual editing.
  • Instant B-roll generation: It quickly cuts raw POV footage into short, shareable B-roll clips (vertical format, with smooth transitions) that are ideal for TikTok and Instagram Reels, speeding up your content creation workflow.

When to use it

  • Vlogging & daily life capture: Record your day hands-free using glasses and automatically get the most engaging snippets to post on social media.
  • Action sports and travel: For activities like hiking, biking, or sightseeing, capture hours of POV video and let Creator Mode auto-pick the scenic or exciting moments as B-roll.
  • IRL streaming & UGC creation: Streamers and UGC creators can use it to generate extra cutaway shots or behind-the-scenes clips from a first-person view, enriching their videos with minimal effort.

Current limitations

  • Device availability: Currently supports Meta's Ray-Ban AI glasses (developer preview); no broad device support yet. If you don't have the glasses, you'll rely on the provided mock device simulator in the SDK.
  • Developer preview only: The SDK is in early access – apps can be tested internally, but public app store releases are restricted until general availability in 2026. This is a prototype/MVP feature.
  • Processing constraints: Automatic highlight detection is still basic in this MVP. It may miss some key moments or include shaky footage. It doesn't (yet) add music or captions – you'll need to use editing tools (like CapCut's AutoCut) for that final polish.
  • Permissions & platform: Continuous video capture can be battery-intensive. iOS apps cannot access the glasses while fully in the background, so keep the app active during capture. Voice-command control of the glasses isn't available in this version (there's no voice/AI integration yet).

2) Prerequisites

Access requirements

  • Meta Developer Account: Create or log in to the Meta Wearables Developer Portal (the Ray-Ban Meta glasses dev center).
  • Join the Preview Program: Request access to the Wearables Device Access Toolkit preview. You may need to agree to preview terms since this feature is in beta.
  • Organization setup: If working with a team, set up or join a Developer Organization/Team on the portal. Ensure all members have the necessary permissions.
  • Enable Wearables project: Create a new Wearables Project/App ID in the developer console. Select the Wearables Device Access (glasses) integration and note the generated App ID or credentials. (This ties your mobile app to the glasses SDK.)

Platform setup

iOS

  • Xcode 15+ with iOS 16.0 or later SDK (iOS 17 recommended).
  • Swift Package Manager (built into Xcode) or CocoaPods (if provided) to add the SDK.
  • A physical iPhone running iOS 16+ (required for Bluetooth and camera access). Simulator is not supported for Bluetooth-based wearable features.

Android

  • Android Studio Flamingo/Giraffe or newer with Android SDK 33+ (Android 13.0+).
  • Gradle (7.0+ recommended) and Kotlin 1.8+ for the project.
  • A physical Android phone (Android 12 or above). Emulators typically cannot pair with Bluetooth wearables, so use real hardware if possible.

Hardware or mock device

  • Ray-Ban Meta Smart Glasses – the supported wearable device providing the POV camera and microphones (developer preview edition).
  • OR a Mock Device: If you don't have the glasses, use the SDK's built-in mock device simulator (for development only). This can emulate camera input and sensor data.
  • Bluetooth enabled: Ensure your phone's Bluetooth is on. Understand the platform's Bluetooth permission prompts (on iOS, the app needs Bluetooth permission; on Android, the app needs BLUETOOTH_CONNECT permission to communicate with the glasses).
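
On Android, a quick programmatic readiness check before attempting to pair can save you a confusing failure later. The helper below is a sketch using only standard platform APIs (the function name is ours, not part of the Meta SDK):

import android.Manifest
import android.bluetooth.BluetoothManager
import android.content.Context
import android.content.pm.PackageManager
import android.os.Build
import androidx.core.content.ContextCompat

// True only if Bluetooth is switched on and, on Android 12+, the runtime
// BLUETOOTH_CONNECT permission has already been granted.
fun isReadyToPairGlasses(context: Context): Boolean {
    val adapter = (context.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager).adapter
    val bluetoothOn = adapter?.isEnabled == true
    val connectGranted = Build.VERSION.SDK_INT < Build.VERSION_CODES.S ||
        ContextCompat.checkSelfPermission(context, Manifest.permission.BLUETOOTH_CONNECT) ==
            PackageManager.PERMISSION_GRANTED
    return bluetoothOn && connectGranted
}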

3) Get Access to Creator Mode

  1. Go to the Developer Portal: Visit the Meta Wearables Developer Center at wearables.developer.meta.com and sign in with your Meta developer credentials.
  2. Request Preview Access: Find the Device Access Toolkit section and opt into the developer preview. This may involve filling out an application or simply enabling the feature in your account. (Only approved developers or partners can use the glasses SDK at this stage.)
  3. Accept Terms: Review any beta agreements or Wearables Developer Terms and accept them. You must agree to usage policies (e.g. data collection rules) before proceeding.
  4. Create a Project: In the portal, create a new Wearables project for your app:
    • Give it a name (e.g. "CreatorModeDemo").
    • Register your app's Bundle ID (iOS) or Application ID (Android) if prompted – this should match your actual app's identifier.
    • Add any team members to the project if needed.
  5. Obtain Credentials: After project setup, download or note any credentials:
    • iOS: You might get a config file (e.g. GoogleService-Info.plist equivalent) or an App ID/Key to add to your Xcode project. Save any provided .plist or keys.
    • Android: You may receive an API key or Client Token. Often, you'll add this to a gradle.properties or as a string resource. (Also prepare a GitHub PAT for fetching the SDK packages – details in the next section.)
    • If no explicit keys are given, the App ID and your developer account association are used behind the scenes once the SDK is integrated.

Done when: you have proof of access – e.g. an App ID or project ID visible in the portal – and the portal shows your new project. You should be able to see a dashboard or settings for your Wearables project, confirming you're enrolled in the preview.


4) Quickstart A — Run the Sample App (iOS)

Goal

Run the official iOS sample app to verify that you can connect to the glasses and use the camera. You'll ensure that the Creator Mode features (like capturing a photo or streaming video) work end-to-end with a real or simulated device.

Step 1 — Get the sample

  • Option 1: Clone the repo. Clone the Meta Wearables iOS SDK repository:
    git clone https://github.com/facebook/meta-wearables-dat-ios.git
    
    
    Open the Xcode project located in samples/CameraAccess within the repository.
  • Option 2: Download as ZIP. Download the repository ZIP from GitHub, extract it, and open samples/CameraAccess/CameraAccess.xcodeproj in Xcode.

The sample CameraAccess app comes with minimal code to discover the glasses, connect, and show camera output.

Step 2 — Install dependencies

The sample uses Swift Package Manager (SPM) to include the Wearables SDK:

  • In Xcode, open the Project Navigator and select the project. Check that the Package Dependencies section already references meta-wearables-dat-ios. If not, add it:
    1. Go to File > Add Packages...
    2. Enter the package URL: https://github.com/facebook/meta-wearables-dat-ios
    3. Choose the latest version (e.g. 0.x.y preview).
    4. Add the package to the sample app target.
  • If the package requires authentication (it shouldn't, as it's public), ensure you have a GitHub account set in Xcode preferences. (CocoaPods is not needed since SPM is used. There is no Podspec as of this preview.)

After adding, Xcode will fetch the SDK. You should see MetaWearablesDAT (or similarly named) frameworks in the project.

Step 3 — Configure app

Before running, adjust a few settings:

  • Bundle ID: Change the sample app's bundle identifier to the one you registered (in Section 3). In Xcode, select the project target, go to Signing & Capabilities, and set Bundle Identifier to the value you set in the dev portal (e.g. com.yourname.CreatorModeDemo). This ensures the glasses recognize your app as authorized.
  • Add config file: If the portal provided an iOS config .plist (for API keys or project info), add it to the Xcode project (usually by dragging into the project and ensuring it's included in the app target). For example, a WearablesConfig.plist might contain your App ID or OAuth credentials.
  • Capabilities/Entitlements: Enable required capabilities:
    • In Signing & Capabilities, click "+ Capability" and add Background Modes → enable "Uses Bluetooth LE Accessories" (if your app will maintain BLE connections even in background).
    • If provided, add any custom entitlement file from Meta (not common for this SDK).
  • Info.plist permissions: Add usage descriptions for any system permissions the app will request:
    • NSBluetoothAlwaysUsageDescription – "Allow connection to wearable camera devices via Bluetooth."
    • NSCameraUsageDescription – "Allow app to use camera." (Not strictly needed for external camera, but include if you plan to use phone camera too.)
    • NSMicrophoneUsageDescription – "Allow app to use microphone for audio from glasses."

Step 4 — Run

  1. Select the scheme: Choose the CameraAccess sample app scheme in Xcode. Connect your iPhone via USB or use wireless debugging.
  2. Run on device: Choose your iPhone as the run target (the sample cannot run on Simulator due to Bluetooth). Click Build & Run ▶️.
  3. Allow permissions: On first launch, iOS will ask for Bluetooth permission (and microphone, if used). Grant these so the app can find and communicate with the glasses.

The app should launch on your iPhone.

Step 5 — Connect to wearable/mock

  • Power on the glasses: Make sure your Ray-Ban Meta glasses are powered and in range. If they haven't been paired before, put them in pairing mode (usually by pressing the power or capture button for a few seconds until an LED indicates pairing).
  • Connect via app: The sample app will likely automatically scan for the device. If it shows a device list, select your glasses model. Otherwise, it may connect to the first detected glasses.
  • Grant any prompts: If iOS prompts for Bluetooth pairing or confirmation to connect to the device, accept it. Also ensure the glasses are not already connected to another app (only one app at a time can usually hold the connection).

Once connected, the sample app's UI should indicate a successful connection (e.g. a "Connected" status or the camera viewfinder might appear streaming the glasses' view).

Verify:

  • The app shows Connected (e.g. a status label or icon turns green). The glasses might also indicate connection via an LED or sound.
  • You can trigger a camera action from the app. For example, tap a "Capture Photo" button – the glasses should take a photo and the image appears in the app. Or if the sample has a live preview, you see real-time video from the glasses on your phone screen.

Common issues

  • Build error ("Module not found" or codesign issues): This can happen if the Swift package didn't resolve or if your signing is misconfigured. Ensure the Meta Wearables package is added and try Product > Clean Build Folder, then rebuild. For signing, use a valid Development Team and iPhone provisioning profile (Xcode should handle this if you have an Apple ID added).
  • No device found: If the app doesn't discover the glasses, confirm Bluetooth is on and the glasses are in pairing mode. You might need to pair the glasses via the phone's Bluetooth settings first. Also, keep the glasses close to the phone. If using the mock device, ensure you included the mock device module and that the sample app has an option to simulate a device (some samples include a toggle to use a virtual device).
  • Permission denied: If the app isn't connecting, check Settings > Privacy on the iPhone:
    • Bluetooth permission must be enabled for your app. If you accidentally denied it, re-enable it.
    • The sample might not explicitly request microphone if it doesn't use audio, but if it does, ensure microphone permission is granted.
    • Restart the app after changing permissions.

5) Quickstart B — Run the Sample App (Android)

Goal

Run the official Android sample app to verify the glasses' integration on Android. You will confirm that you can connect to the wearable and use its camera from an Android app.

Step 1 — Get the sample

  • Clone the repository:
    git clone https://github.com/facebook/meta-wearables-dat-android.git
    
    
    Open the project in Android Studio, and locate the sample app under samples/CameraAccess. (If Android Studio prompts to import Gradle project, proceed.)
  • (Alternatively, download the repo as a ZIP from GitHub and open it.)

The sample CameraAccess app (Android) contains minimal code to connect to the glasses and stream from the camera.

Step 2 — Configure dependencies

The Android SDK is distributed via GitHub Packages, so you'll need to add the Maven repository and authentication token:

  • Add Maven repository: In the project's settings.gradle or settings.gradle.kts, add the Meta GitHub Maven URL. For example, in Kotlin DSL:
    dependencyResolutionManagement {
        repositories {
            maven {
                url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
                credentials {
                    username = "" // not needed
                    password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
                }
            }
            // ... other repos ...
        }
    }
    
    
    This snippet (adapted from Meta's docs) configures Gradle to authenticate with your GitHub token; if you keep the localProperties lookup, make sure your settings script actually loads local.properties into a Properties object first.
  • Provide GitHub token: Generate a GitHub Personal Access Token (classic) with at least read:packages scope. In your ~/.gradle/gradle.properties or the project's local.properties, add:
    github_token = YOUR_GITHUB_PAT_HERE
    
    
    Gradle will use this token to authenticate when downloading the Wearables SDK artifacts.
  • Sync Gradle: After adding the repository, click "Sync Project with Gradle Files" in Android Studio. Gradle should fetch the Wearables Toolkit libraries. (Ensure internet access and that the token is correct – if not, you'll see errors in the Sync console.)

Step 3 — Configure app

  • Application ID: Open the app module's build.gradle. Set the applicationId to the one you registered in the portal (e.g. com.yourname.creatormodedemo). This should match the App ID from Section 3 to satisfy any backend checks.
  • Add SDK dependencies: The Meta SDK is modular. In the sample's Gradle config, you might see something like:
    dependencies {
        implementation("com.meta.wearable:mwdat-core:0.3.0")
        implementation("com.meta.wearable:mwdat-camera:0.3.0")
        implementation("com.meta.wearable:mwdat-mockdevice:0.3.0")
    }
    
    
    If not present, add the necessary modules:
    • mwdat-core (core connectivity and data)
    • mwdat-camera (camera control and streaming)
    • mwdat-mockdevice (for simulating a device if needed) Use the latest version (e.g., 0.3.0 as of Dec 2025).
  • AndroidManifest permissions: Ensure the app has required permissions in AndroidManifest.xml:
    • <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (for Bluetooth communication, Android 12+).
    • <uses-permission android:name="android.permission.BLUETOOTH_SCAN" /> (for finding devices, Android 12+; requires <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" /> as well due to scanning).
    • <uses-permission android:name="android.permission.RECORD_AUDIO" /> (if you will capture audio from the glasses' mics).
    • (Optional) <uses-permission android:name="android.permission.CAMERA" /> – not needed for the wearable's camera, but include if your app also uses the phone camera.
    • <uses-feature android:name="android.hardware.bluetooth_le" android:required="true"/> to declare BLE support.
  • File access: If the app saves videos/clips to storage, add the appropriate storage permission (WRITE_EXTERNAL_STORAGE) or use scoped storage APIs. For API 33+, you might not need this if using the MediaStore.

After these changes, rebuild the project to ensure everything compiles and the IDE recognizes the SDK imports.
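
For the runtime side of these permissions, the standard ActivityResult API works well. Here's a minimal Kotlin sketch (the activity and field names are illustrative, not part of the Meta SDK):

import android.Manifest
import android.os.Build
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class CameraAccessActivity : AppCompatActivity() {

    // Android 12+ splits Bluetooth into SCAN and CONNECT runtime permissions;
    // RECORD_AUDIO is only needed if you pull audio from the glasses' mics.
    private val wearablePermissions =
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
            arrayOf(
                Manifest.permission.BLUETOOTH_SCAN,
                Manifest.permission.BLUETOOTH_CONNECT,
                Manifest.permission.RECORD_AUDIO,
            )
        } else {
            arrayOf(
                Manifest.permission.ACCESS_FINE_LOCATION, // BLE scanning pre-Android 12
                Manifest.permission.RECORD_AUDIO,
            )
        }

    private val permissionLauncher =
        registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { results ->
            val allGranted = results.values.all { it }
            // Enable the Connect/Scan UI only once everything is granted.
        }

    fun requestWearablePermissions() = permissionLauncher.launch(wearablePermissions)
}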

Step 4 — Run

  1. Connect device: Plug in your Android phone (with USB debugging enabled) or ensure it's connected via ADB over WiFi. Select it as the deployment target in Android Studio.
  2. Launch the app: Click Run ▶️. The app will build and install on the phone. Watch for any runtime permission prompts on app start.
  3. Permissions: On first run, Android will likely ask for Bluetooth permissions. For Android 13+, it will specifically prompt "Allow this app to find, connect to, and determine position of nearby devices?" – choose Allow. If prompted for microphone (for audio) or storage (if writing files), allow those as well.

The sample app should open on your Android device now.

Step 5 — Connect to wearable/mock

  • Pair or discover the glasses: If your Ray-Ban glasses are already Bluetooth-paired with the phone (through the companion app or OS settings), the sample might detect them. Otherwise, ensure the glasses are in pairing/discoverable mode.
  • In-app connection: Use the sample's UI to connect:
    • Some samples auto-connect to the first available wearable. Others provide a Scan/Connect button. Tap it to start scanning for nearby devices.
    • When you see your glasses (e.g. "Ray-Ban AI Glasses") in the list, tap to connect. You might get a system pairing request – confirm pairing by comparing PIN if shown.
  • Grant permissions: If you hadn't already, the app might request BLUETOOTH_SCAN permission when scanning (on Android 12+ this triggers a location permission dialog). Grant it to allow finding the device.
  • Mock mode (if no device): If using the mock simulator, the sample app might have a developer setting or automatically use a fake device when no real one is found. In this case, it will simulate a connection and feed dummy camera data (like a test pattern or sample video).

Once connected, the app should indicate success (e.g., "Connected to device"). The glasses' camera may start streaming to the app if the sample is coded to do so, or there may be a button to start the camera.

Verify:

  • The app UI shows Connected status to the wearable. You might see device info or an LED on the glasses confirming connection.
  • The key feature works end-to-end. For example, if you press "Start Preview" in the app, you begin seeing the POV camera feed from the glasses on-screen. Or if you tap "Capture Photo", a photo is taken and displayed. The round-trip (command to glasses → action → result back to app) should function.

Common issues

  • Gradle authentication error: If the build failed to download the SDK (Could not resolve com.meta.wearable:...), it means the Maven repo access failed. Double-check your GitHub token configuration. Make sure the token is valid and set in local.properties (github_token) or as GITHUB_TOKEN env var, and that the maven { url "https://maven.pkg.github.com/..." } block is in the top-level settings.gradle(.kts). Re-sync after fixing.
  • Manifest merger conflict: If your app's manifest had a uses-sdk or other entries that conflict with the SDK's, you might see a merger error. Usually, the sample is configured correctly. Resolve by adjusting manifest or Gradle settings (e.g., if prompted to add tools:remove for a duplicated attribute).
  • Device connection timeout: If the app can't find or connect to the glasses:
    • Ensure the glasses are not connected to another app (e.g., the official Facebook View app or another phone).
    • Restart Bluetooth on the phone and try again.
    • If pairing for the first time, go to Bluetooth settings on the phone and pair the glasses there, then return to the app.
    • Use the mock device as a fallback to continue development until you can use the hardware.
  • Permissions (Android): If you mistakenly denied a permission, the feature will fail silently or throw an error (check Logcat). Go to Settings > Apps > [Your App] > Permissions and enable the required ones, then retry. (For BLE scanning, you may also need Location enabled – it's required on Android 11 and below, and on newer versions only when BLUETOOTH_SCAN isn't declared with neverForLocation.)

6) Integration Guide — Add Creator Mode to an Existing Mobile App

Goal

Now that you've tried the samples, let's integrate the Creator Mode SDK into your own app. We'll walk through adding the SDK, establishing a connection to the glasses, and triggering one end-to-end feature (auto-cutting a highlight clip). By the end, your app will be able to communicate with the wearable and handle a basic POV capture workflow.

Architecture

Think of the integration in layers:

  • UI Layer (Your App): Buttons and screens that let the user connect the glasses and request a capture.
  • Client Layer (SDK wrapper): A WearablesClient in your app manages the connection to the glasses (connect, disconnect, monitor status).
  • Feature Layer: Services or controllers to handle specific actions. For example, a CaptureService to start a video and process it, or a PhotoService for still images.
  • Device SDK (Meta Wearables SDK): The SDK provided by Meta that actually interfaces with the glasses hardware (handles Bluetooth communication, camera control, etc.), providing callbacks/events for your app.
  • Wearable Device: The Ray-Ban glasses themselves, which capture the video/audio and send data to the phone.

Flow of events: App UI → calls into WearablesClient (SDK wrapper) → glasses hardware performs action → results (video, photo, status) come back via SDK callbacks → Your App updates UI or saves data accordingly.

Step 1 — Install the SDK

Add the Creator Mode (Wearables Toolkit) SDK to your project if you haven't already:

iOS (Swift):

  • Using Swift Package Manager, add the dependency as in Quickstart A. In your existing Xcode project, do File > Add Packages... and enter facebook/meta-wearables-dat-ios repo URL. Choose the latest tag and add it to your app target.
  • If you prefer CocoaPods (and if Meta provides one in the future), add to your Podfile: pod 'MetaWearablesDAT', '~> 0.3.0' (hypothetical example; as of now, use SPM).
  • After integration, import the relevant modules in your code: e.g. import MetaWearablesDAT (and submodules like MetaWearablesCamera if applicable).

Android (Kotlin/Java):

  • In your app's Gradle scripts, add the Maven repo and dependencies as described in Quickstart B:
    • In settings.gradle: include the maven.pkg.github.com/facebook/meta-wearables-dat-android repository and GitHub token credentials.
    • In app/build.gradle: add dependencies:
      implementation "com.meta.wearable:mwdat-core:0.3.0"
      implementation "com.meta.wearable:mwdat-camera:0.3.0"
      implementation "com.meta.wearable:mwdat-mockdevice:0.3.0" // remove if only using real device
      
      
    • Sync the project to download the libraries.
  • Import classes in your code as needed, e.g. import com.meta.wearable.mwdat.Camera (actual package names may vary; check SDK docs).

Step 2 — Add necessary permissions

Ensure your app's permission setup covers the wearable use case:

iOS (Info.plist):

  • Add NSBluetoothAlwaysUsageDescription with a user-facing message, since the app needs Bluetooth to communicate with the glasses.
  • If your app will record audio from the glasses, add NSMicrophoneUsageDescription.
  • (Camera usage may not be needed for external camera, but adding NSCameraUsageDescription doesn't hurt for completeness.)
  • Optionally, add a background mode capability in the project if you want the connection to persist while the app is backgrounded (this is advanced and not always allowed; for initial integration, you can skip background support).

Android (AndroidManifest.xml):

  • Include the Bluetooth permissions:
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
    <uses-permission android:name="android.permission.BLUETOOTH_SCAN" />
    <uses-permission android:name="android.permission.BLUETOOTH" /> <!-- for older devices -->
    
    
    If using BLUETOOTH_SCAN, also include location permission as mentioned.
  • Include microphone permission if capturing audio:
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    
    
  • (No need for android.permission.CAMERA specifically for glasses, but include if your app also accesses phone camera.)
  • If you intend to keep the connection in background, you might need to request android.permission.ACCESS_BACKGROUND_LOCATION (for BLE scanning in background) and handle foreground service for long-running connections. This is beyond the basic integration (consider for advanced use).

Step 3 — Create a thin client wrapper

To keep things organized, implement small classes to encapsulate the SDK's functionality:

  • WearablesClient (Connection Manager): This class will:
    • Initialize the SDK (if required) when your app starts.
    • Scan and connect to the glasses, or attach to an already paired device.
    • Expose connection state (connected/disconnected) to the rest of your app.
    • Handle reconnection logic (e.g., try to reconnect if connection drops).
    • Example: methods connect(deviceId), disconnect(), property isConnected.
  • CaptureService / FeatureService: This could handle the high-level logic of capturing and processing content:
    • For example, a BRollCaptureService that, when triggered, commands the glasses to start recording video, then retrieves the video data and runs the auto-cut algorithm to produce a highlight clip.
    • It might use the SDK's camera API (e.g. start video stream or send a "record" command) and listen for completion or frames.
    • It should also handle timeouts or errors during capture.
  • PermissionsService: Especially on Android, create a small helper to check and request runtime permissions (Bluetooth, audio, etc.) before actions. On iOS you usually don't need a separate class – instantiating CBCentralManager triggers the Bluetooth permission prompt, and you can check CBCentralManager.authorization before connecting.

Implementing these as separate components makes your code cleaner and easier to maintain.
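
Here is a minimal Kotlin sketch of the client and feature layers. The interface is our own thin wrapper – the actual SDK classes and method names will differ, so treat connect()/captureVideo() as placeholders to be backed by the real toolkit calls:

import kotlinx.coroutines.flow.StateFlow
import java.io.File

// Connection states surfaced to the UI layer.
enum class ConnectionState { DISCONNECTED, CONNECTING, CONNECTED }

// Thin wrapper around the vendor SDK. Each method delegates to the real
// toolkit; keeping it behind an interface also makes mocking trivial.
interface WearablesClient {
    val state: StateFlow<ConnectionState>
    suspend fun connect(deviceId: String? = null)
    suspend fun disconnect()
    suspend fun captureVideo(durationSeconds: Int): File
}

// High-level feature logic: record a clip, then hand it to the highlight
// extraction step. Errors are returned as a Result, never thrown at the UI.
class BRollCaptureService(
    private val client: WearablesClient,
    private val extractHighlight: (File) -> File, // e.g. the VideoUtils sketch in Section 7
) {
    suspend fun captureHighlight(durationSeconds: Int = 10): Result<File> {
        if (client.state.value != ConnectionState.CONNECTED) {
            return Result.failure(IllegalStateException("Glasses not connected"))
        }
        return runCatching {
            val raw = client.captureVideo(durationSeconds)
            extractHighlight(raw)
        }
    }
}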

Definition of done:

  • The Wearables SDK is initialized (some SDKs require a context or config at startup – check whether Meta's SDK needs explicit initialization or exposes a singleton you must configure).
  • You can call a method like wearablesClient.connect() and it successfully connects to the glasses, updating a state (and UI) to "connected". You handle the full lifecycle: connect on demand, disconnect on app exit or when user toggles off, and attempt to reconnect if needed (e.g. device goes out of range and comes back).
  • Errors (e.g. "device not found" or "connection lost") are caught and either shown to the user or logged. The app should not crash on any exceptions coming from the SDK – handle them gracefully (use try/catch or SDK callbacks).

Step 4 — Add a minimal UI screen

Design a simple interface in your app for the Creator Mode feature. It could be as basic as a single screen with a few controls:

  • "Connect Glasses" button: Initiates scanning/connecting using your WearablesClient. It can change to "Disconnect" when already connected.
  • Connection status indicator: Show an icon or text ("Connected ✅ / Not Connected ❌") so the user knows if the wearable is linked.
  • "Capture Highlight" button: When pressed, this triggers the process to capture a B-roll highlight clip from the glasses. (We will implement this in the next section.)
  • Result preview: A small video player or image view to display the output. For example, after capturing, show a thumbnail or the actual video clip so the user can preview what was recorded/cut. Alternatively, a text log area could confirm actions (e.g. "Saved clip to gallery").

Keep the UI minimal for now – just enough to test the integration. You can always expand it later with more bells and whistles (like a gallery of multiple clips, or controls for different capture modes).
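
As a sketch, a minimal Jetpack Compose version of this screen could look like the following. It reuses the hypothetical ConnectionState enum from the Step 3 sketch; the callbacks would wire into your WearablesClient and capture service:

import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

// Minimal screen: status line, connect/disconnect toggle, capture button,
// and a text slot for messages like "Recording…" or "Saved clip to gallery".
@Composable
fun CreatorModeScreen(
    connectionState: ConnectionState,
    statusMessage: String,
    onConnectToggle: () -> Unit,
    onCaptureHighlight: () -> Unit,
) {
    val connected = connectionState == ConnectionState.CONNECTED
    Column(modifier = Modifier.padding(16.dp)) {
        Text(if (connected) "Connected ✅" else "Not Connected ❌")
        Button(onClick = onConnectToggle) {
            Text(if (connected) "Disconnect" else "Connect Glasses")
        }
        Button(onClick = onCaptureHighlight, enabled = connected) {
            Text("Capture Highlight")
        }
        Text(statusMessage)
    }
}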


7) Feature Recipe — Automatically Capture a B-Roll Highlight Clip from the Wearable

Goal

Implement the core user story: The user taps a "Capture Highlight" button in your app → the glasses record a short video from the user's POV → the app receives that video, automatically detects the most interesting segment (the highlight) and extracts it → the resulting short clip is displayed or saved, ready to share as B-roll.

This is the magic of Creator Mode: turning raw first-person footage into a ready-to-use snippet with one tap.

UX flow

  1. Pre-check: If the glasses aren't connected, the app should prompt the user to connect first (don't attempt capture when not connected).
  2. User taps "Capture Highlight." The UI should indicate that a capture is in progress (e.g. show a "Recording..." message or a loading spinner).
  3. Recording phase: The app instructs the glasses to record a video (for example, a 10-second clip). Provide visual feedback (maybe a progress bar if it's a fixed duration recording).
  4. Processing phase: Once the raw video is transferred to the app, process it to find the best moment. The UI can show "Processing..." during this short analysis.
  5. Result: When the highlight is ready, display it. Show a thumbnail or autoplay the clip in a small video view. Also confirm saving: e.g., "Clip saved to Photos" or a share button to send it to TikTok.
  6. Post-capture: Reset the UI state (hide spinners, re-enable buttons, etc.), allowing the user to capture another or disconnect as needed.

Implementation checklist

  • Connected state verified: In the capture handler, first check the connection state exposed by your WearablesClient. If it isn't connected, open the connection dialog or show "Please connect your wearable first" instead of starting a capture. This avoids false starts.
  • Permissions verified: Ensure the app has the needed permissions at the moment of capture. E.g., on Android check BluetoothAdapter.isEnabled() and for any needed runtime permissions (audio recording if capturing audio). If something is missing, request it and do not proceed with capture until granted.
  • Capture request issued: Use the SDK's camera API to start a capture. Depending on the SDK:
    • If there's a convenience method like GlassesCamera.captureVideo(durationSeconds: Int), call that.
    • Or you might have to start a stream and manually stop it after X seconds. In that case, start recording and use a timer to stop after the desired duration.
    • The SDK might also allow grabbing a continuous stream of frames; for simplicity, use whatever highest-level method is available for recording a clip.
  • Timeout & retry handled: Set a reasonable timeout for the operation. If no response from device in, say, 5–10 seconds, assume something went wrong. Stop any recording attempt and show an error ("Capture failed, please try again"). Also handle if the device disconnects mid-capture (the SDK should throw an error or notify; you then reconnect and prompt user to retry).
  • Highlight extraction: Once you have the video data (file or buffer), run the "auto-cut" logic:
    • In this MVP, this could be a simple heuristic (for example, choose the middle 3 seconds of the 10s clip assuming that's when the camera was steady).
    • Or integrate a highlight detection algorithm: e.g., use an AI model or an open-source library to score frames and pick the highest-scoring segment. For now, a simple approach is fine.
    • Trim the video to the highlight segment (using AVFoundation on iOS or Android's MediaCodec/FFmpeg).
  • Persist & update UI: Save the highlight clip:
    • On iOS, write it to the Photos library or app Documents directory.
    • On Android, save to MediaStore (so it appears in Gallery).
    • Then update the UI: show a thumbnail or play the video. Indicate success ("Saved ✅" message as in our pseudocode below).
    • If saving fails (out of storage, etc.), inform the user where it went or if they need to grant storage permission.

Pseudocode

Here's a simplified pseudocode (mixing concepts, for illustration):

func onCaptureHighlightTapped() {
    guard wearablesClient.isConnected else {
        showAlert("Connect your glasses first!")
        return
    }
    if !permissionsService.allPermissionsGranted() {
        permissionsService.requestNeededPermissions()  // asynchronous
        return
    }

    showStatus("Capturing…")
    do {
       // Start recording a 10-second video
       let videoData = try wearablesClient.camera.captureVideo(duration: 10)
       showStatus("Processing…")
       // Simple highlight: take middle portion
       let highlightClip = VideoUtils.extractBestSegment(from: videoData)
       VideoUtils.saveToGallery(highlightClip)
       thumbnailView.image = VideoUtils.generateThumbnail(highlightClip)
       showStatus("Saved ✅")
    } catch {
       showStatus("Capture failed 😢")
       log("Error during capture: \\(error)")
    }
}

And for Android (Kotlin-style):

fun onCaptureHighlightClicked() {
    if (!wearablesClient.isConnected) {
        Toast.makeText(this, "Please connect your glasses first.", Toast.LENGTH_SHORT).show()
        return
    }
    if (!permissionsService.hasAllPermissions()) {
        permissionsService.requestPermissions(this)
        return
    }

    statusText.text = "Capturing…"
    wearablesClient.camera.captureVideo(duration = 10,
        onSuccess = { videoFile ->
            runOnUiThread { statusText.text = "Processing…" }  // callback may arrive off the main thread
            val highlightFile = VideoUtils.extractBestSegment(videoFile)
            VideoUtils.saveToGallery(context, highlightFile)
            runOnUiThread {
                previewVideoView.setVideoURI(highlightFile.toUri())  // a VideoView for playback
                statusText.text = "Saved ✅"
            }
        },
        onFailure = { error ->
            runOnUiThread { statusText.text = "Capture failed 😢" }
            Log.e(TAG, "Capture failed", error)
        }
    )
}

(The above pseudocode assumes synchronous capture for simplicity. In reality, the SDK likely provides asynchronous callbacks or suspend functions for capture.)
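
The VideoUtils helper called in the pseudocode can be sketched on Android with MediaExtractor and MediaMuxer. This version assumes the SDK hands you an MP4 file and simply keeps a window from the middle of the clip (the "steady middle" heuristic from the checklist); saveToGallery would be a separate MediaStore insert and is omitted here:

import android.media.MediaCodec
import android.media.MediaExtractor
import android.media.MediaFormat
import android.media.MediaMuxer
import java.io.File
import java.nio.ByteBuffer

object VideoUtils {

    // Naive "auto-cut": copy compressed samples for the middle ~3 seconds
    // straight through (no re-encode). Real highlight detection would replace
    // the middle-window choice with motion/audio scoring.
    fun extractBestSegment(input: File, windowUs: Long = 3_000_000L): File {
        val extractor = MediaExtractor().apply { setDataSource(input.absolutePath) }
        val output = File(input.parentFile, "highlight_${input.nameWithoutExtension}.mp4")
        val muxer = MediaMuxer(output.absolutePath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)

        // Map extractor tracks to muxer tracks and find the clip duration.
        var durationUs = 0L
        val trackMap = HashMap<Int, Int>()
        for (i in 0 until extractor.trackCount) {
            val format = extractor.getTrackFormat(i)
            if (format.containsKey(MediaFormat.KEY_DURATION)) {
                durationUs = maxOf(durationUs, format.getLong(MediaFormat.KEY_DURATION))
            }
            extractor.selectTrack(i)
            trackMap[i] = muxer.addTrack(format)
        }
        val startUs = ((durationUs - windowUs) / 2).coerceAtLeast(0L)
        val endUs = startUs + windowUs
        muxer.start()

        val buffer = ByteBuffer.allocate(1 shl 20)
        val info = MediaCodec.BufferInfo()
        extractor.seekTo(startUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC)
        var baseUs = -1L
        while (true) {
            val size = extractor.readSampleData(buffer, 0)
            if (size < 0 || extractor.sampleTime > endUs) break
            if (baseUs < 0) baseUs = extractor.sampleTime
            info.set(0, size, extractor.sampleTime - baseUs, extractor.sampleFlags)
            muxer.writeSampleData(trackMap.getValue(extractor.sampleTrackIndex), buffer, info)
            extractor.advance()
        }
        muxer.stop(); muxer.release(); extractor.release()
        return output
    }
}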

Troubleshooting

  • Highlight comes out empty or uninteresting: If your extractBestSegment logic isn't sophisticated, it might pick a dull segment (e.g., sky or ground). To improve:
    • Log sensor data or use motion info – if the camera was very shaky, maybe skip those parts as they're less watchable.
    • Ensure the device was actually recording what you expected (if user didn't point their head at anything interesting, the highlight might reflect that).
    • You can always allow the user to manually select a segment as a fallback if auto selection isn't great.
  • Capture hangs or times out: Sometimes the glasses might not stop recording when expected:
    • Always implement a timeout (see the sketch after this list). If you command "record 10s" and you don't get a confirmation back in, say, 12s, send a stop command or throw an error to recover.
    • Check Bluetooth connection stability – moving too far from the phone or interference could cause delays.
    • If using the mock device, ensure your simulation sends an "end of video" event; otherwise your app might wait indefinitely.
  • "I want it instantly!": Users might expect the highlight to appear immediately when they tap:
    • Remind them that there's a short recording period. You can mitigate impatience by providing feedback (a countdown "Recording 10…9…seconds").
    • For faster results, you could record a shorter clip (e.g., 5 seconds) but risk cutting something important. It's a balance between speed and content quality.
    • As an advanced option, you might continuously buffer video on the glasses (if hardware allows) and on tap, just grab the last few seconds – this would feel instantaneous. That's more complex, but worth exploring in the future.
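
The timeout advice above can be as simple as a hard deadline around the capture call. A minimal sketch, assuming the suspend-style WearablesClient wrapper from Section 6 (names are illustrative):

import kotlinx.coroutines.TimeoutCancellationException
import kotlinx.coroutines.withTimeout
import java.io.File

// Wraps the capture in a deadline so a silent device can never hang the UI.
// Allows a couple of seconds of slack beyond the requested clip length.
suspend fun captureWithDeadline(client: WearablesClient, seconds: Int = 10): Result<File> =
    try {
        Result.success(withTimeout((seconds + 2) * 1_000L) { client.captureVideo(seconds) })
    } catch (e: TimeoutCancellationException) {
        // Surface a retryable failure; the caller can prompt the user to
        // retry or reconnect, and optionally send an explicit stop command.
        Result.failure(e)
    }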

8) Testing Matrix

Test your integration under various scenarios to ensure reliability:

  • Mock device (simulator) – Expected: the feature works with dummy data (e.g., a test pattern video). Notes: useful for CI or if hardware is unavailable; make sure your app can operate in this mode without crashing.
  • Real device, close range – Expected: low-latency connection and smooth capture. Notes: baseline happy path (e.g., glasses 1 meter from the phone); expect minimal delay in starting capture and transferring video.
  • Real device, far/obstructed – Expected: possible slight lag or a connection drop if out of range. Notes: test walking away with the glasses or putting a wall between them and the phone; the capture might fail mid-way, so ensure graceful handling (error message and ability to retry).
  • App in background/locked – Expected: defined behavior; capture likely stops or is not allowed. Notes: on iOS, if the app goes to the background during recording, the session may pause (there's no background mode by default), so document that the app must stay active. On Android, a background service would be needed to continue – not covered in this basic integration.
  • Permission denied mid-flow – Expected: a clear error and recovery path. Notes: e.g., the user denies microphone permission when prompted – the app should show "Cannot capture without audio permission" and let them enable it (don't just fail silently).
  • Disconnect mid-capture – Expected: graceful abort, possibly an automatic reconnect attempt. Notes: if the glasses power off or Bluetooth disconnects during a capture, your app should handle the exception, cancel the operation, and notify "Connection lost." Test by turning off the glasses during a recording; no crashes should occur.

Using a combination of real device tests and the mock environment will give confidence that your feature works in all conditions.
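
For the mock and disconnect rows in particular, plain unit tests against your wrapper interfaces cover the failure paths without any hardware. A sketch reusing the hypothetical WearablesClient and BRollCaptureService from Section 6 (requires the kotlinx-coroutines-test and JUnit dependencies):

import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.test.runTest
import org.junit.Assert.assertTrue
import org.junit.Test
import java.io.File

// A fake client that never connects, standing in for the SDK's mock device.
private class DisconnectedClient : WearablesClient {
    override val state = MutableStateFlow(ConnectionState.DISCONNECTED)
    override suspend fun connect(deviceId: String?) {}
    override suspend fun disconnect() {}
    override suspend fun captureVideo(durationSeconds: Int): File =
        error("captureVideo must not be called while disconnected")
}

class BRollCaptureServiceTest {
    @Test
    fun captureFailsGracefullyWhenDisconnected() = runTest {
        val service = BRollCaptureService(DisconnectedClient()) { it }
        val result = service.captureHighlight()
        // The service returns a failure the UI can show; it must not crash.
        assertTrue(result.isFailure)
    }
}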


9) Observability and Logging

When deploying or debugging the Creator Mode feature, it helps to have robust logging and analytics around it. Here are key events/metrics to log:

  • Connection events: Log connect_start (user initiated pairing), connect_success (device connected) and connect_fail (timeout or error). Include timestamps and device identifiers for diagnostics.
  • Permission status: Log permission_state on app launch or when trying to capture – e.g., "Bluetooth=granted, Audio=denied" – so you can see if missing permissions cause failures.
  • Capture lifecycle: For the highlight capture feature, log highlight_start when recording begins, highlight_success when a clip is successfully cut and saved, or highlight_fail if any step fails. If there are sub-steps, you could also log record_start, record_end, process_start, process_end.
  • Performance metrics: Measure how long each stage takes. Log highlight_ms (total milliseconds from user tap to result ready). This helps identify bottlenecks (e.g., if processing is too slow). Also track average video size, etc.
  • Reconnect attempts: If you implement auto-reconnect, log each attempt and outcome. A reconnect_count metric per session can tell you if the connection is unstable.
  • User actions: (If relevant) Log if user cancels or retries, to see if UX adjustments are needed.
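
A tiny Kotlin logger covering the events above might look like this (Logcat only; in production you'd forward the same events to your analytics pipeline). The object and event names are ours, not part of the SDK:

import android.os.SystemClock
import android.util.Log

object CreatorModeTelemetry {
    private const val TAG = "CreatorMode"
    private var captureStartedAt = 0L

    fun log(event: String, detail: String = "") {
        Log.i(TAG, if (detail.isEmpty()) event else "$event | $detail")
    }

    fun onHighlightStart() {
        captureStartedAt = SystemClock.elapsedRealtime()
        log("highlight_start")
    }

    fun onHighlightSuccess(clipSizeBytes: Long) {
        val highlightMs = SystemClock.elapsedRealtime() - captureStartedAt
        log("highlight_success", "highlight_ms=$highlightMs clip_bytes=$clipSizeBytes")
    }

    fun onHighlightFail(error: Throwable) {
        log("highlight_fail", error.message ?: error.javaClass.simpleName)
    }
}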

By monitoring these logs (and sending anonymized analytics events if appropriate), you can continuously improve the reliability and speed of Creator Mode.


10) FAQ

  • Q: Do I need the actual hardware to start development? A: Not initially. You can use the provided mock device simulator in the SDK to simulate a glasses connection and fake camera feed. This is great for early development and CI tests. However, to truly validate the experience (especially the quality of auto-cut highlights), testing with real glasses is highly recommended before release.
  • Q: Which wearable devices are supported by Creator Mode? A: Currently, it's built for the Meta Ray-Ban AI Glasses (2023 model). These have the camera and audio hardware that the SDK interfaces with. Future Meta wearable models should work as well. Other brands' glasses or cameras are not supported in this SDK. (If you have a GoPro or other camera, you'd need a different integration – Creator Mode is specific to Meta's device and SDK.)
  • Q: Can I ship this feature to production apps now? A: Not yet. The Wearables Device Access Toolkit is in developer preview, which means it's for testing and prototyping only. Only select partners can publish apps using it to the public at this time. General availability is expected in 2026, at which point you should be able to release your Creator Mode-enabled app to all users (subject to Meta's terms).
  • Q: Does the SDK let me push content or remote commands to the glasses (like an AR display)? A: The current SDK is primarily focused on accessing the glasses' sensors (camera, microphones, etc.), not on sending visuals to the user. Note that Ray-Ban Meta glasses have no built-in display – they're not AR glasses, just camera- and audio-enabled eyewear. You can send commands (like "start recording" or "take photo") and get data back, but you can't, for example, make them display an image (since there's no lens display). Audio playback (through the glasses' speakers) is possible via standard Bluetooth audio profiles, but that's outside the SDK's scope.
  • Q: How does the highlight detection work? Can I customize it? A: In this MVP, highlight detection is basic. It might use simple rules (or a lightweight AI model) to choose a segment. You can absolutely customize or improve it – the SDK gives you the raw footage, and it's up to your app how to process it. For example, you could integrate an AI service or your own algorithm to detect excitement (perhaps via audio volume, or detecting faces/landmarks in the video, etc.). The provided logic is just a starting point.
  • Q: What about aspect ratio and editing for TikTok/Reels? A: TikTok and Reels favor vertical 9:16 video. The glasses capture in a wide format (likely horizontal or square). As part of the post-processing, you should crop or rotate the video into 9:16 portrait orientation if intending to post directly. You might take the center of the frame or use a smart crop to focus on the subject. Also, adding music, captions, or effects is not handled by Creator Mode itself – you would do that either in-app (if you have editing features) or by exporting the clip and using an app like CapCut. The goal of Creator Mode is to give you a raw clip that's already cut to the best moment; you can then quickly polish it for publishing.

11) SEO Title Options

  • "How to Get Access to Creator Mode and Run the Smart Glasses Sample App (iOS & Android)" – covers the onboarding and sample usage, good for developers looking to start with the SDK.
  • "Integrate Meta's Wearable Camera SDK (Creator Mode) into an Existing App: Step-by-Step Guide" – highlights integration steps for those who want to add this to their app.
  • "How to Automatically Create TikTok/Reels B-Roll from POV Glasses Footage" – appeals to creators interested in the outcome, emphasizing the automatic aspect.
  • "Creator Mode Troubleshooting: Pairing Smart Glasses, Permissions, and Common Errors" – targets searchers who hit issues while developing or using the feature.

(These titles include relevant keywords like "smart glasses", "POV footage", "TikTok/Reels", which can help the article rank for those queries.)


12) Changelog

  • 2025-12-31 — Verified with Meta Wearables SDK v0.3.0 on iOS 17.1 (Xcode 15.2) and Android 13 (Pixel 6). Tested using Ray-Ban Meta Smart Glasses (Dev Preview) and the SDK mock device. Updated instructions for latest Gradle setup and iOS permission requirements.