How to Integrate Meta Wearables SDK with Flutter (Cross-Platform AI Glasses Guide)

Author: Almaz Khalilov

TL;DR

  • You'll build: a Flutter app that connects to Meta's AI glasses (Ray-Ban Meta) and captures photos/videos hands-free.
  • You'll do: Get access → Install the SDK (iOS Swift Package & Android Gradle) → Run the sample apps → Integrate into your Flutter app → Test on real glasses or a mock device.
  • You'll need: a Meta developer account (preview program), Ray-Ban Meta glasses (or mock device), a dev environment with Flutter (plus Xcode for iOS, Android Studio for Android).

1) What is Meta Wearables SDK?

Figure: Ray-Ban Meta AI glasses, which the Meta Wearables SDK targets for camera and audio integration.

The Meta Wearables Device Access Toolkit (DAT) – often called the Meta Wearables SDK – is Meta's new cross-platform SDK for AI glasses (e.g. the Ray-Ban Meta smart glasses). It allows mobile apps to reliably connect to Meta's AI glasses and use on-glasses capabilities like video streaming and photo capture. In short, it lets your app access the glasses' camera and audio features to create hands-free, "wearable" experiences.

What it enables

  • Hands-free camera access: Stream live video from the glasses' POV camera to your app and trigger high-resolution photo captures.
  • Seamless wearable integration: Connect your Flutter app to the glasses via Bluetooth, receiving sensor data (camera frames, etc.) while the user keeps their phone pocketed.
  • Audio integration: Use the glasses' built-in microphones and speakers as a Bluetooth audio source for your app (e.g. for voice notes or playback). (Handled as a standard headset by the OS.)

When to use it

  • Primary use case: Extend an existing mobile app with augmented vision or hands-free capture features. For example, a fitness or travel app can let users capture photos/videos from their glasses' perspective without using their hands.
  • Secondary use cases: Build innovative AR/AI assistant features – e.g. live transcription of what the wearer sees or hears, real-time POV sharing, or visual search – where the glasses act as an input sensor for your app. Essentially, use it whenever you want to blend mobile apps with real-world vision/audio in a natural, unobtrusive way.

Current limitations

  • Preview-only & limited features: The SDK is in developer preview – meaning apps can currently only be distributed to test users (not broadly released). It's also feature-limited: currently it supports camera streaming, taking pictures, and basic Bluetooth headset functions. More advanced capabilities (e.g. voice assistant APIs, gesture/touchpad events, or display output on glasses) are not yet available in this SDK.
  • Device support: At the time of writing, the SDK supports the latest Ray-Ban Meta AI glasses (Gen 2). Support for other models (like first-gen Ray-Ban Stories/Display) is expected in future updates.
  • Platform constraints: Some features require physical hardware (e.g. video streaming from glasses won't work on an emulator without a mock device). Also, Bluetooth permissions and a companion app (the Meta AI app for glasses) are required – the glasses must be paired and linked via Meta's app for the SDK to function. Voice command integrations with Meta AI are not open to third-party apps in this preview.

2) Prerequisites

Access requirements

  • Meta developer account: Create or sign in to your account on the Meta Wearables Developer Centre. You'll need to join the Wearables Device Access Toolkit preview program (if not open, request access via the provided form).
  • Organisation/Project setup: Inside the Wearables Dev Centre, set up your organisation/team and create a Project for your app. This will give you a unique Application ID for the Wearables SDK.
  • Preview entitlement: Ensure your account/app is enrolled in the glasses SDK preview/beta. You may need to accept additional terms and enable the preview for your project (check the Wearables section in the Meta developer portal).

Platform setup

iOS (Flutter module)

  • Xcode 15+ with iOS 17+ SDK; iOS device running iOS 16 or later (minimum iOS version depends on SDK docs, 16+ recommended).
  • Swift Package Manager (built into Xcode) – to add the Meta Wearables SDK package.
  • CocoaPods (if using a Pod for Flutter plugins) – ensure it's installed and up to date.
  • A physical iPhone is recommended (the glasses connect via Bluetooth to a real device). The Simulator can be used for UI development, but it cannot pair with Bluetooth wearables.

Android (Flutter module)

  • Android Studio Giraffe or later recommended (older releases such as Arctic Fox may work) with Android SDK API 33+ (Android 13).
  • Gradle (7.x) and Kotlin (1.8+) – the sample uses Kotlin and Gradle kts scripts.
  • A physical Android phone (Android 12+ recommended). Bluetooth integration may not work on a generic emulator (unless you use a special emulator with Bluetooth passthrough).

Hardware or mock

  • Supported wearable device: Ray-Ban Meta AI Glasses (Gen 2) or a supported Meta wearable. Having the actual glasses is ideal to test live camera streaming and capture.
  • OR Mock device kit: If you don't have the hardware, use the provided Mock Device in the SDK. The Android SDK includes a component that simulates a wearable for testing. This lets you simulate a connected glasses camera feed on the phone for development purposes.
  • Meta AI companion app: Install the "Meta AI" app (formerly Meta View) on your phone and pair it with the glasses. This official app manages device pairing and is required for third-party apps to register with the glasses.
  • Bluetooth enabled: Enable Bluetooth on your phone and understand that you'll need to grant Bluetooth, camera, and possibly microphone permissions to your app to communicate with the glasses.

3) Get Access to Meta Wearables SDK

  1. Sign up on the portal: Go to the Wearables Dev Centre and log in with your Meta developer credentials. Navigate to the Wearables section.
  2. Request preview access: Fill out the Wearables SDK early access form if prompted (it will ask about your experience, use cases, etc.). Submit the form and wait for approval email if required. (Some accounts may be granted instant access if the preview is open.)
  3. Accept terms: Once approved, accept any Developer Preview Terms for the wearables toolkit in the portal. You must agree to Meta's Wearables Developer Terms and Acceptable Use Policy.
  4. Create a project: In the Wearables Dev Centre, create a new Project (or App). This will generate an Application ID for the Wearables SDK. Take note of this ID (you'll use it in your app's config). Also set up a Release Channel if you plan to distribute the app to testers.
  5. Download credentials (if any): Depending on the SDK version, you may need to grab some config files or keys:
    • iOS: You might receive a config plist or an entitlement file to include in your Xcode project, or you simply use the Application ID and ensure your app's bundle ID is registered.
    • Android: You will need a GitHub Packages token to fetch the SDK (see next steps) and will use the Application ID in your Android manifest.
    • Note: There is no general "API key/secret" for this SDK; access is controlled via the preview program and the registered App ID.

Done when: you have a Wearables Application ID for your app and access to download the SDK packages. You should also see your project listed in the Wearables portal (and can manage devices, release channels, etc., there).


4) Quickstart A — Run the Sample App (iOS)

Goal

Run Meta's official iOS sample app to verify the SDK works and that you can connect to the glasses (or mock device) on an iPhone.

Step 1 — Get the sample

  • Clone the repo: Clone the official iOS SDK repository: git clone https://github.com/facebook/meta-wearables-dat-ios.git. This repository contains the SDK and a sample app.
  • Open the sample: In Xcode, open the sample app project. For example, navigate to meta-wearables-dat-ios/Samples/ and look for an Xcode project or workspace (the sample might be in a folder like "CameraAccess"). Open the .xcodeproj or .xcworkspace.

(If Meta provided a downloadable sample project zip, you could alternatively download and unpack it, then open in Xcode.)

Step 2 — Install dependencies

The Meta Wearables SDK is distributed via Swift Package Manager (SPM):

  • In Xcode, if the sample project isn't already configured, add the Swift Package dependency (File > Add Package Dependencies…, pointing it at the cloned meta-wearables-dat-ios package or its Git URL).
  • If the sample uses CocoaPods (less likely since SPM is standard here), run pod install to install pods.

Step 3 — Configure the app

Before running, configure the sample with your credentials:

  • App ID: Locate where to set the Application ID. On iOS, this might be done by the sample code (e.g. a constant or Info.plist entry). Ensure it matches the App ID from your Wearables project. (If the SDK uses your bundle identifier as the key, make sure the sample's bundle ID is added to your project in the portal.)
  • Info.plist settings: Add any required keys:
    • Add the MWDAT dictionary if not present:

      • For example, to opt out of analytics, the sample's Info.plist should include the snippet shown below (optional):

        ```xml
        <key>MWDAT</key>
        <dict>
          <key>Analytics</key>
          <dict>
            <key>OptOut</key>
            <true/>
          </dict>
        </dict>
        ```

        Setting OptOut=true disables Meta's analytics collection.

    • Ensure a URL scheme is set up for the sample (if required for the registration callback). The SDK uses a custom URL callback (metaWearablesAction) to return from the Meta AI app. The sample's Info.plist might already contain a URL type for this; if not, follow the docs to add the appropriate URL scheme so that the Meta AI companion app can redirect back to the sample after user authorisation.

  • Capabilities: In Xcode, enable any necessary capabilities for the project:
    • e.g. Bluetooth usage descriptions in Info.plist (NSBluetoothAlwaysUsageDescription) explaining why the app needs Bluetooth (to connect to glasses).
    • Possibly enable Background Modes if the sample supports running the camera in the background (not typically allowed, but worth checking just in case).

Step 4 — Run

  1. Build & run on device: Connect your iPhone (or use wireless debugging). In Xcode, select the sample app target and choose your iPhone as the run destination. Build and run the app.
  2. Launch the sample: The sample app should launch on the iPhone. It likely presents a UI to connect to glasses or start streaming.

Step 5 — Connect to wearable/mock

  • Pair the glasses: Ensure your Ray-Ban Meta glasses are on, paired via the Meta AI app, and nearby.
  • Registration flow: In the sample app, trigger the registration or connect action (often a "Connect Glasses" button). This will likely open the Meta AI companion app to authorise linking your sample app with the glasses.
  • Approve permission: The Meta AI app will ask the user to approve granting camera access to the sample app. Follow the prompts (you might see a confirmation in the companion app, or need to press the capture button on the glasses to confirm).
  • Return to sample: After approval, the sample app receives a callback (deep link) and should now show that the glasses are Available/Connected (the registration status changes to registered).
  • Grant iOS permissions: The first time, iOS will ask for Bluetooth permission ("App wants to use Bluetooth") and Camera permission (for accessing an external camera feed) – grant these.

Verify

  • Connected status: The sample app should indicate it's connected to the glasses (for example, a status label or icon turns green or says "Connected ✅").
  • Video stream works: Initiate a camera stream in the app. You might tap a "Start Stream" button – the viewfinder in the app should show live video from the glasses' camera.
  • Capture photo: Test the capture feature – e.g. press a "Capture Photo" button. The glasses should snap a picture (often indicated by an LED flash on the frame) and the image should appear in the app (thumbnail or new screen).

Common issues

  • Build error (package not found): Symptom: Xcode fails to build because it can't find the MetaWearables package or module. Fix: Make sure the Swift Package is added to the correct target and that you opened the .xcworkspace if CocoaPods are used. Also ensure you have an internet connection and the GitHub permissions needed to fetch the package.
  • "Device Unavailable" in app: Symptom: The sample app shows "No device found" or cannot connect. Fix: Confirm that your glasses are powered on, paired via the Meta AI app, and that the Meta AI app is running. The SDK requires the companion app to mediate the connection. Also ensure you completed the registration flow (if not, the glasses won't be authorised for your app).
  • Permissions denied: Symptom: The app fails to start streaming, possibly with an error about permission. Fix: Check iOS settings: the app needs Bluetooth permission (in iOS Settings > Privacy > Bluetooth) and Camera permission. If previously denied, enable them. Also, the user must have granted the glasses' camera permission via the Meta AI app flow – you might need to re-initiate registration if that was not done.

5) Quickstart B — Run the Sample App (Android)

Goal

Run the official Android sample app to verify camera streaming works with your glasses (or the mock device) on an Android phone.

Step 1 — Get the sample

  • Clone the repo: Clone Meta's Android SDK repository: git clone https://github.com/facebook/meta-wearables-dat-android.git.
  • Open in Android Studio: In Android Studio, choose File > Open... and select the meta-wearables-dat-android project. The sample app is under samples/CameraAccess. Open that module if it's not automatically opened.

Step 2 — Configure dependencies

The Android SDK is distributed via GitHub Packages (Maven). You need to authenticate to download it:

  • Add Maven repo: In the project's settings.gradle.kts, add the GitHub Maven repository:

    ```kotlin
    dependencyResolutionManagement {
        repositories {
            maven {
                url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
                credentials {
                    username = "" // not needed
                    password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
                }
            }
            // ... other repos
        }
    }
    ```

    This configuration reads a GitHub token from an environment variable or local.properties.

  • Get a token: Create a personal GitHub access token (classic) with at least read:packages scope. In Android Studio, edit local.properties and add:

    ```ini
    github_token = ghp_yourPersonalAccessToken
    ```

    (Or set the GITHUB_TOKEN env var.) This will allow Gradle to fetch the mwdat artifacts.

  • Sync Gradle: After adding the repo and token, click "Sync Project". Gradle should resolve:

    • com.meta.wearable:mwdat-core:<version>
    • com.meta.wearable:mwdat-camera:<version>
    • com.meta.wearable:mwdat-mockdevice:<version> (for the mock device support). Ensure the version (e.g. 0.3.0) matches the latest tag.

Step 3 — Configure app

Before running, make a few app-level changes:

  • Application ID: Open app/src/main/AndroidManifest.xml of the sample. Insert your Application ID from the Wearables portal:

    ```xml
    <application ...>
        <!-- Required: Application ID for Meta Wearables -->
        <meta-data
            android:name="com.meta.wearable.mwdat.APPLICATION_ID"
            android:value="YOUR_APP_ID_HERE"/>
        ...
    </application>
    ```

    This meta-data is necessary so the SDK knows which project/app it's linking to.

  • Permissions: Add required permissions to the manifest (if not already present):

    ```xml
    <uses-permission android:name="android.permission.BLUETOOTH" />
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
    <uses-permission android:name="android.permission.BLUETOOTH_SCAN" />
    <uses-permission android:name="android.permission.INTERNET" />
    <!-- Camera/mic permissions, if your app will use them for other purposes: -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    ```

    These allow the app to communicate with the glasses via Bluetooth and handle media. (Note: On Android 12+, BLUETOOTH_CONNECT (and SCAN for discovery) are required instead of the old BLUETOOTH permission.)

  • Gradle settings: If the sample uses a Gradle version catalog (libs.versions.toml), ensure it includes the mwdat libs as shown in the README. In our case, we added them in Step 2.

  • Package name: If you created your own project for testing, set your app's applicationId (in build.gradle) to one that you registered in the portal.

Step 4 — Run

  1. Build & install: In Android Studio, select the CameraAccess sample app configuration. Connect your Android phone via USB (enable USB debugging). Click Run ▶️. The app will build and install on your phone.
  2. Launch the app: It should open automatically. If not, find the CameraAccess app on your device and launch it.
  3. Grant permissions: Android will prompt for Bluetooth permissions (and possibly Location, depending on API level for BLE) when the app starts. Approve these, as well as Camera permission if prompted.

Step 5 — Connect to wearable/mock

  • Pairing: Ensure the glasses are paired to your phone via the Meta AI app (the glasses should appear as connected in your phone's Bluetooth settings or within the Meta app).
  • Registration flow: In the sample app, tap the Register/Connect button. This will likely pop up an intent that redirects you to the Meta AI companion app to approve the connection (similar to iOS flow).
  • Approve on companion: The Meta AI app will ask you to authorise your sample app. Approve the request. Then it should redirect back to the sample (check for a toast or UI change).
  • Connect and stream: Once registered, use the sample app UI to start a stream. If you don't have real glasses, you can activate the mock device mode (some samples might allow toggling a virtual device which simulates a camera).
  • Permissions on device: The first time, the glasses themselves might need to grant camera access – this could be done by a physical action (some glasses require pressing the capture button to confirm). Also, ensure the app has Bluetooth permission (Android 12+ will show a prompt "Allow to find, connect to devices").

Verify

  • App shows connected: The sample app should indicate a successful connection/registration (e.g. a status text or a device list showing your glasses as connected).
  • Video feed appears: Tap "Start Stream" – you should see a preview or some indication that frames are coming from the glasses' camera (for a mock device, it might show a dummy feed or test pattern).
  • Photo capture works: Trigger a photo capture via the app. You might see a flash or hear a shutter sound from the glasses. The app should receive the photo (perhaps displayed on-screen or saved to gallery).

Common issues

  • Gradle authentication error: Symptom: Build fails with errors like Could not resolve com.meta.wearable:mwdat-core:… (HTTP 401). Fix: This means the GitHub Packages authentication failed. Double-check your personal access token (correct scopes, not expired) and that it's referenced in local.properties or env var. Make sure the repository URL matches exactly (including casing of "facebook/meta-wearables-dat-android").
  • Manifest merge conflict: Symptom: Errors when merging manifests (e.g. if you integrated into an existing app). Fix: Ensure that the <meta-data> keys for APPLICATION_ID and ANALYTICS_OPT_OUT are only defined once. If your app already had a meta-data with the same name, remove duplicates. Also check that the library's manifest (if any) isn't conflicting – usually it shouldn't.
  • Device connection timeout: Symptom: The app fails to find the glasses or times out trying to connect. Fix: Make sure the glasses are already paired via Bluetooth to the phone and the companion app is running in the background. The Wearables SDK relies on the companion service. If using a mock device, ensure you included the mwdat-mockdevice dependency and possibly enabled the mock mode in code (consult SDK docs for how to activate the mock device; it might auto-activate if no real device found). Also verify you added all needed permissions (Android may require ACCESS_FINE_LOCATION for Bluetooth scanning on some devices).

6) Integration Guide — Add Meta Wearables SDK to an Existing Flutter App

Goal

Integrate the Meta Wearables SDK into your Flutter app and implement one end-to-end feature (photo capture from the glasses) in your own app.

Because Flutter needs to interface with native iOS/Android for this SDK, we'll use a platform plugin approach. We can either use an existing Flutter plugin or write platform channels ourselves. In this guide, we'll outline using the open-source meta_wearables Flutter plugin which wraps the Meta SDK for both platforms.

Architecture

Your Flutter app will interact with the glasses through a layering:

  • Flutter UI (Dart) → Meta Wearables Flutter plugin → Native SDK (iOS Swift & Android Kotlin) → Glasses (via Bluetooth).
  • Data flows back from the glasses (frames, events) through the native SDK to the Flutter plugin, which exposes Dart streams and callbacks.

In practice, you will have a singleton MetaWearables object in Dart that handles initialization and streaming, backed by iOS/Android code provided by the SDK.

Step 1 — Install the SDK (Flutter plugin)

Flutter pub package: Add the meta_wearables plugin to your pubspec.yaml:

```yaml
dependencies:
  meta_wearables: ^0.1.0 # (hypothetical version)
```

Then run flutter pub get. This plugin will internally pull in the iOS and Android SDKs:

  • iOS: It uses Swift Package Manager to fetch the SDK (you'll also have to perform the Xcode SPM integration as in Quickstart A).
  • Android: It includes the Maven setup for you, but you'll still need to provide the GitHub token.

After adding, open the iOS Runner project in Xcode to add the Swift package (as done in Quickstart A, Step 2). For Android, ensure the Gradle setup (Step 2 from Quickstart B) is done in the Android module. Essentially, integrate the native SDK on each platform so that the Flutter plugin can use it.

(If not using the plugin, you'd create Method Channels and call the native SDK functions directly. The plugin saves you this work by using Pigeon to generate channel code.)
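For reference, if you skip the plugin, a hand-rolled platform-channel wrapper might look like the sketch below. The channel names, method names, and WearablesChannel class are all hypothetical – you would define them yourself and write the matching Swift/Kotlin handlers that call into the native SDK.

```dart
import 'dart:typed_data';

import 'package:flutter/services.dart';

/// Hypothetical channel and method names – you define these yourself and
/// implement matching handlers in Swift (iOS) and Kotlin (Android) that
/// call into the native Wearables SDK.
class WearablesChannel {
  static const _methods = MethodChannel('app.example/wearables');
  static const _photoEvents = EventChannel('app.example/wearables/photos');

  /// Ask the native side to initialize the SDK and start registration.
  Future<void> connect() => _methods.invokeMethod<void>('startRegistration');

  /// Ask the native side to trigger a photo capture on the glasses.
  Future<void> capturePhoto() => _methods.invokeMethod<void>('capturePhoto');

  /// Photo bytes pushed from native code as they arrive from the glasses.
  Stream<Uint8List> get photos => _photoEvents
      .receiveBroadcastStream()
      .map((event) => event as Uint8List);
}
```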

Step 2 — Add permissions

Add the necessary permissions and descriptions in your app:

iOS (Info.plist):

  • NSBluetoothAlwaysUsageDescription = "This app uses Bluetooth to connect to your AI glasses."
  • NSCameraUsageDescription = "This app streams video from your connected glasses' camera."
  • (Optionally NSMicrophoneUsageDescription if you plan to record audio via the glasses.)
  • Ensure the URL Types include a callback URL scheme as noted in Quickstart A (so the Meta AI app can return to your app after registration).

Android (AndroidManifest.xml):

  • Include the <uses-permission> entries for BLUETOOTH, BLUETOOTH_CONNECT, BLUETOOTH_SCAN, and INTERNET. (If your Flutter app already has these from another plugin, just ensure they are present.)

  • Include the meta-data tags inside <application> for Application ID and (optional) analytics opt-out:

    ```xml
    <meta-data
        android:name="com.meta.wearable.mwdat.APPLICATION_ID"
        android:value="YOUR_APP_ID"/>
    <meta-data
        android:name="com.meta.wearable.mwdat.ANALYTICS_OPT_OUT"
        android:value="true"/>
    ```

Make sure to request permissions at runtime in your Flutter code as well (Bluetooth permissions on Android 12+, etc., using something like the permission_handler plugin or similar).
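For the runtime side, here is a minimal sketch using the permission_handler package (the exact permission set you need depends on your minimum Android/iOS versions):

```dart
import 'package:permission_handler/permission_handler.dart';

/// Request the Bluetooth permissions needed before talking to the glasses.
/// Returns true only if everything required was granted.
Future<bool> ensureWearablePermissions() async {
  final statuses = await [
    Permission.bluetoothConnect, // Android 12+ "Nearby devices" connect
    Permission.bluetoothScan,    // Android 12+ "Nearby devices" scan
    Permission.bluetooth,        // iOS Bluetooth permission
  ].request();
  return statuses.values.every((status) => status.isGranted);
}
```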

Step 3 — Create a thin client wrapper

It's helpful to wrap the SDK calls in a Dart service class for cleaner architecture. For example:

  • WearablesClient (singleton): Handles initializing the SDK, registering/unregistering with the glasses, and exposing streams for connection state and video frames.
  • CaptureService: Implements higher-level actions like "take photo" by orchestrating calls to the WearablesClient and managing permissions.
  • PermissionsService: (Optional) A utility to check/request permissions (Bluetooth, camera).

Using the meta_wearables plugin, much of this is already exposed:

```dart
final wearables = MetaWearables.instance;
await wearables.initialize();
await wearables.startRegistration();
```

This will trigger the registration flow (opening the Meta AI app). The plugin provides streams like wearables.registrationUpdates and wearables.videoFrames. You'll listen to these in your Dart code to update UI.

Definition of done:

  • SDK properly initialized when your app launches or when user opts in.
  • Registration flow handled: your app can call a method to connect, and you handle the callback (the plugin should auto-handle the URL callback as noted).
  • State management: Your UI reflects whether glasses are connected/available. Reconnection logic is handled (e.g., if the glasses go out of range, the SDK should emit disconnected state and attempt reconnection when back in range).
  • Error handling: Any errors from the SDK (e.g., registration failures, stream errors) are caught and displayed to the user (and/or logged). The plugin offers an errors stream for this.
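To tie those points together, here is a sketch of a WearablesClient service covering state, reconnection, and error handling. It assumes the hypothetical meta_wearables plugin API (MetaWearables, RegistrationStatus, registrationUpdates, errors) used throughout this guide:

```dart
import 'dart:async';

import 'package:flutter/foundation.dart';
// Hypothetical plugin import – adapt to whatever wrapper you actually use.
import 'package:meta_wearables/meta_wearables.dart';

/// Thin service wrapper that the rest of the app depends on.
class WearablesClient {
  WearablesClient(this._wearables) {
    _wearables.registrationUpdates.listen((status) {
      connected.value = status == RegistrationStatus.registered;
      if (!connected.value) _scheduleReconnect();
    });
    _wearables.errors.listen((err) => lastError.value = err.toString());
  }

  final MetaWearables _wearables;
  final connected = ValueNotifier<bool>(false);
  final lastError = ValueNotifier<String?>(null);

  Future<void> connect() async {
    await _wearables.initialize();
    await _wearables.startRegistration();
  }

  void _scheduleReconnect() {
    // Naive backoff: try to re-register a few seconds after a drop.
    Timer(const Duration(seconds: 5), () {
      if (!connected.value) _wearables.startRegistration();
    });
  }
}
```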

Step 4 — Add a minimal UI screen

Design a simple Flutter UI to interact with the glasses:

  • Connect button: A button labelled "Connect Glasses" that calls wearables.startRegistration() (or a custom function in your service). While connecting, you can show a progress indicator. Once connected, maybe this button changes to "Disconnect" or gets disabled.
  • Status indicator: A text or icon showing the connection status (e.g., "Connected to Glasses ✅" or "No glasses"). Bind this to the registrationUpdates stream – when status is registered or available, show connected; if unavailable or error, show not connected.
  • Capture button: A button "📸 Capture Photo" that, when pressed, calls your photo capture logic.
  • Image preview: An Image widget or placeholder in the UI to display the last photo taken from the glasses. Initially empty, then updated with the image received.

With this setup, you can now implement the capture flow end-to-end in Flutter, which we detail next.
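Putting those pieces together, a minimal version of the screen might look like the sketch below (again assuming the hypothetical meta_wearables API; adapt the stream and method names to whatever wrapper you actually use):

```dart
import 'dart:typed_data';

import 'package:flutter/material.dart';
// Hypothetical plugin import, as in the previous steps.
import 'package:meta_wearables/meta_wearables.dart';

class GlassesScreen extends StatefulWidget {
  const GlassesScreen({super.key});

  @override
  State<GlassesScreen> createState() => _GlassesScreenState();
}

class _GlassesScreenState extends State<GlassesScreen> {
  final wearables = MetaWearables.instance;
  bool connected = false;
  Uint8List? lastPhotoBytes;

  @override
  void initState() {
    super.initState();
    wearables.registrationUpdates.listen((status) {
      setState(() => connected = status == RegistrationStatus.registered);
    });
    wearables.photos.listen((photo) {
      setState(() => lastPhotoBytes = photo.bytes);
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('AI Glasses')),
      body: Column(
        children: [
          Text(connected ? 'Connected to Glasses ✅' : 'No glasses'),
          ElevatedButton(
            onPressed: connected ? null : () => wearables.startRegistration(),
            child: const Text('Connect Glasses'),
          ),
          ElevatedButton(
            onPressed: connected ? () => wearables.capturePhoto() : null,
            child: const Text('📸 Capture Photo'),
          ),
          if (lastPhotoBytes != null)
            Expanded(child: Image.memory(lastPhotoBytes!)),
        ],
      ),
    );
  }
}
```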


7) Feature Recipe — Trigger Photo Capture from Wearable into Your App

Goal

When the user taps a "Capture" button in your app, the glasses should take a photo and send it to your app, which then displays and saves the image. This is a prime example of using the Meta Wearables SDK in a real feature.

UX flow

  1. Pre-condition: The glasses are connected (your app shows a "Connected" status).
  2. User taps Capture: They tap the Capture Photo button in the app.
  3. Feedback: The app might immediately show a "Capturing…" message or spinner.
  4. Glasses action: The glasses' camera takes a photo (typically there's a white LED flash on the glasses).
  5. Data transfer: The photo data is sent to the phone via the SDK.
  6. App receives image: The SDK triggers a callback with the photo bytes.
  7. Display result: The app hides the spinner, shows a thumbnail of the photo, and maybe a success message.

Implementation checklist

  • Connected state verified: In the button's onPressed, first check if (!wearables.isRegistered) { showAlert("Please connect your glasses first"); return; }.
  • Permissions verified: Ensure the camera permission for the glasses is granted. The Meta SDK may require the user's one-time approval each session. Also check your app's permissions (Flutter side). If not, request them.
  • Capture request issued: Call the SDK to capture. In the plugin, this might be await wearables.capturePhoto() (for example). Under the hood, this likely requires an active video stream, so you might need to call startStream() first if not already streaming video.
  • Timeout & retry handled: If no response comes within a few seconds, consider timing out. You could allow a retry – perhaps prompt the user to try again or re-connect.
  • Result handling: When the photo comes in (the plugin might emit it on a photos stream), save it to app storage (e.g., temporarily to cache or gallery) and update the UI with the new image.
  • UI update: Show the thumbnail of the captured photo in your app's image view. Also provide feedback like a toast or label "Photo saved to gallery!" if you auto-save it.

Pseudocode

```dart
Future<void> onCaptureButtonPressed() async {
  if (!wearables.isRegistered) {
    showMessage("Connect your glasses first!");
    return;
  }
  // Check permissions:
  if (!await wearables.requestPermission(Permission.camera)) {
    showMessage("Glasses camera permission needed");
    return;
  }
  showMessage("Capturing…");
  try {
    // Ensure stream is started to get photo (if not already running):
    await wearables.startStream(quality: VideoQuality.medium);
    await wearables.capturePhoto();
    // Photo will be delivered via the stream listener.
  } catch (err) {
    showMessage("Capture failed 😢");
    log("Capture error: $err");
  }
}
```

And set up the listener on startup:

```dart
wearables.photos.listen((photoData) {
  // This is called when a new photo is captured.
  final bytes = photoData.bytes;
  final image = Image.memory(bytes);
  setState(() {
    lastPhotoWidget = image;
  });
  saveToGallery(bytes);
  showMessage("Photo received ✅");
});
```

(The exact API may differ, but conceptually this is how to handle it.)
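To cover the timeout and "save the result" items from the checklist, one possible approach is sketched below. It assumes the same hypothetical plugin API and uses path_provider to write the bytes into the app's documents directory:

```dart
import 'dart:async';
import 'dart:io';
import 'dart:typed_data';

import 'package:meta_wearables/meta_wearables.dart'; // hypothetical plugin
import 'package:path_provider/path_provider.dart';

/// Wait for the next photo after a capture request, giving up after 10 s.
Future<Uint8List?> captureWithTimeout(MetaWearables wearables) async {
  final nextPhoto = wearables.photos.first; // subscribe before triggering
  await wearables.capturePhoto();
  try {
    final photo = await nextPhoto.timeout(const Duration(seconds: 10));
    return photo.bytes;
  } on TimeoutException {
    return null; // caller can show "Lost connection, please try again"
  }
}

/// Persist the photo bytes so the capture survives app restarts.
Future<File> savePhoto(Uint8List bytes) async {
  final dir = await getApplicationDocumentsDirectory();
  final file =
      File('${dir.path}/glasses_${DateTime.now().millisecondsSinceEpoch}.jpg');
  return file.writeAsBytes(bytes);
}
```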

Troubleshooting

  • Capture returns empty: If you get a callback but the image data is empty or null, check logs. Possibly the glasses weren't ready or permission wasn't actually granted. Ensure the glasses' camera is not busy (you can only have one app streaming at a time – if the Meta AI app is actively using the camera, your app might be blocked).
  • Capture hangs (no response): This can happen if the connection dropped or if the user didn't approve the camera use. Implement a timeout (e.g., if no photo after ~5-10 seconds, stop the stream and notify the user). Often re-running the registration (connect) flow can help if the session went stale.
  • "Instant display" expectation: Users might expect an immediate photo. Remember there will be some latency (the photo has to be transferred). To make the UI feel responsive, use a placeholder thumbnail (perhaps a low-res snapshot from the video stream) while the full photo is in transit. Then replace it when the real photo arrives.

8) Testing Matrix

Make sure to test your integrated app under various scenarios:

| Scenario | Expected Outcome | Notes |
| --- | --- | --- |
| Mock device (no hardware) | Feature works with simulated data | Use the mwdat-mockdevice on Android or any official simulation mode to ensure your code handles the flows without real glasses. |
| Real device (close range) | Low latency streaming, stable connection | This is the baseline: glasses near the phone, strong Bluetooth connection – video should stream smoothly with minimal lag. |
| Real device (far/obstructed) | Possible lag or disconnects | Test moving away or blocking the BT signal – ensure your app handles temporary disconnections gracefully (the SDK should emit disconnect events). |
| App in background / screen locked | Stream pauses or stops as defined | iOS might pause the camera when the app goes to the background (for privacy). Ensure no crashes if the user locks the phone during a session. The connection might drop – your app should recover when foregrounded. |
| Permission denied (user rejects) | Clear error message shown | E.g., if the user denies Bluetooth or camera permission, your UI should explain the feature won't work without it and possibly guide them to enable it. |
| Disconnect during capture | Graceful handling, auto-retry option | If the glasses power off or disconnect mid-stream or mid-capture, your app should not crash. It should time out the operation, inform the user ("Lost connection, please try again"), and perhaps automatically attempt reconnection. |

Use the above matrix to ensure your implementation is robust across real-world conditions.
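For the mock-device and disconnect rows in particular, it helps to hide the plugin behind a small interface so tests can substitute a fake client. A sketch follows – the interface and fake are illustrative, not part of the SDK:

```dart
import 'dart:async';
import 'dart:typed_data';

/// The surface your app depends on – real and fake clients implement it.
abstract class WearablesClient {
  Stream<bool> get connectionChanges;
  Future<void> connect();
  Future<Uint8List> capturePhoto();
}

/// Fake used in tests: simulates connect, capture, and a mid-session drop.
class FakeWearablesClient implements WearablesClient {
  final _connection = StreamController<bool>.broadcast();

  @override
  Stream<bool> get connectionChanges => _connection.stream;

  @override
  Future<void> connect() async => _connection.add(true);

  @override
  Future<Uint8List> capturePhoto() async =>
      Uint8List.fromList(List.filled(16, 0)); // stand-in JPEG bytes

  /// Call from a test to simulate the glasses going out of range.
  void simulateDisconnect() => _connection.add(false);
}
```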


9) Observability and Logging

To maintain quality, instrument your app with logs/analytics for key events:

  • Connection events: Log when you start connecting (connect_start), when connection succeeds (connect_success along with device info), and when it fails or drops (connect_fail with error code). This helps identify stability issues.
  • Permission state: Log whether the glasses' camera permission was already granted or had to be requested, and if the user granted or denied it (permission_granted / permission_denied events). This can highlight onboarding issues.
  • Feature usage: For each major action like photo or video:
    • Log an event when user initiates it (photo_capture_start), and when it completes (photo_capture_success) or fails (photo_capture_fail with reason).
    • Similarly for video streaming: video_stream_start, video_stream_end (include duration).
  • Performance metrics: It's useful to capture timing: e.g., measure the latency from capture request to photo received (photo_latency_ms). Also track how many frames per second the video is getting on average (video_fps). These can be logged to console during dev, or sent to an analytics backend if you have one.
  • Reconnection attempts: If you implement auto-reconnect, log each attempt (reconnect_attempt) and whether it succeeded (reconnect_success) or not. If multiple reconnects happen often, that's a signal of Bluetooth instability.

By monitoring these logs during testing (and even in beta releases), you can pinpoint problem areas (like if capture often fails at a certain step or if Bluetooth drops frequently at a certain range).
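A small logging helper keeps these event names consistent across the app. Here is a sketch using dart:developer – swap in your analytics backend of choice:

```dart
import 'dart:developer' as developer;

/// Central place for wearable telemetry so event names stay consistent.
class WearableLog {
  static void event(String name, [Map<String, Object?> data = const {}]) {
    developer.log('$name $data', name: 'wearables');
    // TODO: forward to your analytics backend in release builds.
  }
}

// Usage examples matching the events listed above:
// WearableLog.event('connect_start');
// WearableLog.event('photo_capture_success', {'latency_ms': 840});
// WearableLog.event('reconnect_attempt', {'attempt': 2});
```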


10) FAQ

  • Q: Do I need hardware to start developing? A: Not strictly – Meta provides a Mock Device option in the SDK. You can simulate a camera feed in the Android emulator or via a debug mode on iOS. This allows you to build and test the integration up to a point. However, to experience and fine-tune real-world behavior (latency, connectivity), testing on actual Ray-Ban Meta glasses is highly recommended.
  • Q: Which wearable devices are supported by this SDK? A: Currently, the SDK is aimed at Meta's AI glasses – notably the Ray-Ban Meta Smart Glasses (2nd Gen, 2023). Support for first-gen Ray-Ban Stories/Display is expected but as of now those devices may not fully work with the SDK. Future Meta wearables will likely be supported as the SDK evolves.
  • Q: Can I ship an app using this to production (App Store/Play Store)? A: Not yet. The Wearables SDK is in developer preview. Apps you build are meant for internal testing and can only be distributed to trusted users via Meta's release channel system. Only select partners have publishing rights during the preview. General availability (GA) is planned for 2026, after which broader app distribution will be possible (subject to Meta's approval and store guidelines).
  • Q: Does my app need the Meta AI companion app installed? A: Yes. The Meta AI (formerly Meta View) companion app must be installed and set up with the glasses. The SDK works in tandem with it – the companion app handles the low-level Bluetooth connection and user permissions. Your app communicates via the SDK which relies on services provided by Meta AI app in the background. If the user doesn't have the companion app, you should direct them to install it.
  • Q: Can I push content or notifications to the glasses via this SDK? A: Currently, no – the SDK is focused on accessing the glasses' sensors (camera, mic) rather than sending content to the glasses. Features like displaying images or text on the glasses' HUD, or controlling the glasses' LED indicators beyond what the system does, are not exposed in this preview. Future updates might expand capabilities, but for now think of it as a one-way data access (glasses → app).
  • Q: How is video streaming handled – is it continuous and can I do long sessions? A: The SDK provides a continuous video frame stream (you can choose resolution and frame rate). However, keep in mind hardware limits: the glasses' battery and thermals may limit long-running streams. In testing, 720p at 24fps works, but doing a live stream for many minutes could drain battery quickly. It's best to use streaming in short bursts or for specific tasks (not as a 24/7 camera).
  • Q: What about voice commands or AI integration on the glasses? A: The initial SDK does not include access to the glasses' voice assistant or touchpad gestures. You cannot intercept "Hey Meta" voice commands or listen for tap gestures via the SDK in this preview. Those remain in Meta's domain for now. The focus is on camera and audio capture. Meta has hinted at exploring voice/AI features in future updates.

11) SEO Title Options

  • "How to Get Access to Meta Wearables SDK and Run the Sample App (Flutter Cross-Platform Guide)" – emphasizes the access steps and sample run-through.
  • "Integrate Meta Wearables SDK into a Flutter App: Step-by-Step Cross-Platform Guide" – highlights integration into existing Flutter projects.
  • "How to Capture Photos from Ray-Ban Meta Glasses in Your Flutter App" – niche use-case focus, great for developers looking for a specific feature tutorial.
  • "Meta Wearables SDK Troubleshooting: Pairing, Permissions, and Build Errors Solved" – addresses common pain points, likely to attract search traffic from developers encountering issues.

Best option for Cybergarden SEO: The second option, "Integrate Meta Wearables SDK into a Flutter App: Step-by-Step Cross-Platform Guide," is ideal. It contains keywords like "Flutter," "Meta Wearables SDK," and "Guide," which should rank well for developers searching how to do cross-platform integration.


12) Changelog

  • 2026-01-30 – Verified with Meta Wearables SDK v0.3.0 (iOS 0.3.0, Android 0.3.0) on iOS 17.2 (iPhone 14) and Android 13 (Pixel 6) using Ray-Ban Meta (Gen 2) glasses. Updated instructions for latest SDK changes and included Flutter plugin details.
  • 2025-09-18 – Initial draft based on early developer preview (SDK v0.2). Basic connection and capture features documented.