How to Stream Live Video from Ray-Ban Meta Glasses with the Meta Wearables Device Access Toolkit on iOS and Android

Author: Almaz Khalilov
TL;DR
- You'll build: a mobile app that connects to Ray-Ban Meta smart glasses and displays a live video feed from the glasses' camera.
- You'll do: Get access → Install the Wearables SDK → Run the sample app → Integrate the SDK into your app → Test streaming on a device (or mock).
- You'll need: a Meta developer account (with wearables access), Ray-Ban Meta glasses (or a mock device), the Meta AI companion app on your phone, and a development environment (Xcode/Android Studio).
1) What is the Meta Wearables Device Access Toolkit?
What it enables
- Livestream from smart glasses: Stream real-time video from the glasses' POV camera into your mobile app, unlocking hands-free live broadcasting.
- Capture photos/videos via SDK: Programmatically snap photos or short videos using the glasses' cameras and retrieve them in-app.
- Voice and audio integration: Access the glasses' microphones and speakers through standard Bluetooth, enabling voice commands or audio feedback in your app.
When to use it
- POV live streaming apps: Use it when building experiences that share the wearer's perspective (e.g. live sports, travel vlogs, remote assistance) with minimal setup.
- Hands-free workflows: Ideal for apps in training, field service, navigation, or coaching scenarios where users benefit from eyes-up, hands-free interaction while the app gathers visuals/audio.
- AI vision and assistive apps: Leverage the glasses as an AI sensor – stream video to your app for real-time analysis (object recognition, captions) or accessibility (seeing-eye applications).
Current limitations
- No display output (yet): The current toolkit does not allow apps to render content on the glasses' display (AR overlays are not exposed in this preview).
- Camera only, no direct audio API: The SDK supports camera streaming and image capture; audio is accessed only via the phone's Bluetooth link (no direct SDK control of mic input).
- Preview restrictions: This is a developer preview – distribution is limited. Only invited partners can publicly publish integrations for now, and features (like custom voice commands or gesture controls) are limited in this early release.
2) Prerequisites
Access requirements
- Meta developer account: Sign up or log in to the Meta Wearables Developer Center with a Meta-managed developer account.
- Preview access: Ensure you have access to the Wearables Device Access Toolkit preview (request access and agree to any preview terms). The SDK preview is available to developers in supported countries now.
- Project setup: Create an organization (or join your team's) in the Wearables Dev Center and create a new Wearables project/app for your mobile app. This will generate a unique Meta App ID for your integration.
- Meta AI companion app: Install the Meta AI app on your iOS/Android phone (v247 or above) and log in. This app is required for pairing with the glasses.
Platform setup
iOS
- Xcode 15+ with iOS 15.2+ SDK (the toolkit supports iOS 15.2 and above).
- Swift Package Manager (built into Xcode) or CocoaPods for adding the SDK.
- A physical iPhone (recommended for Bluetooth and camera testing) – simulator can be used with the Mock Device Kit (no real Bluetooth).
Android
- Android Studio Flamingo (2022.2.1) or newer, with the Android 10.0 (API 29) or higher SDK installed.
- Gradle 8+ and Kotlin 1.8+ (for compatibility with the latest toolkit libraries).
- A physical Android phone (recommended for Bluetooth testing) – emulator can be used only with the Mock Device Kit (since real Bluetooth hardware isn't accessible).
Hardware or mock
- Ray-Ban Meta smart glasses (Gen 1 "Stories" or Gen 2) or an official Mock Device Kit (simulated glasses) for development.
- Meta AI app configured: Pair your glasses with the Meta AI app on your phone. Enable Developer Mode in the Meta AI app (one-time setup) to allow testing unverified apps (see Get Access below).
- Bluetooth ready: Make sure your phone's Bluetooth is on, and grant any system Bluetooth permissions. You'll also want camera/microphone permissions on the phone if your app will use them (for example, microphone if streaming audio).
3) Get Access to the Meta Wearables Toolkit
- Go to the portal: Visit the Meta Wearables Developer Center and sign in with your developer account. Navigate to the Wearables section to access the toolkit resources.
- Request preview access: If prompted, apply for the Wearables Device Access Toolkit preview and accept the developer agreement. (As of late 2025, the SDK preview is open to all eligible developers in supported countries.)
- Join or create an org: Set up your organization and team if you haven't already. The wearables platform uses orgs to manage projects and members.
- Create a project: In the Wearables Dev Center, create a new project (or "App"). Give it a name (e.g. MyGlassesLiveApp). This will register your app and generate a unique Meta App ID (and associated credentials for the glasses).
- Configure permissions: After creating the project, enable the Camera permission for it (and any other available sensor permissions) in the project's Permissions settings. This ensures your app can request camera access on the glasses.
- Get your credentials: Go to the App configuration section of the project. Copy the provided configuration values:
- MetaAppID: a numeric or string ID for your app. You'll add this to your app's config (Info.plist for iOS, Android Manifest for Android) so the glasses know your app's identity.
- App Link URL Scheme: a custom URL scheme that the Meta AI app will use to return to your app after authorization (e.g. "myglasseslive://"). You'll register this in your app. (The portal will show the exact snippet to add for each platform.)
- Keys or config file (if provided): Download any config files if instructed (for example, a JSON or plist). In this preview, however, the config is mainly the App ID and scheme rather than a large config file.
- Enable Developer Mode on device: On your phone, open the Meta AI companion app. In Settings > App Info, tap the app version number 5 times until you see the Developer Mode toggle, then enable Developer Mode. This allows your test app to connect to the glasses without a full production attestation flow.
Figure: Enabling Developer Mode in the Meta AI companion app by tapping the version number five times to reveal the toggle (Meta AI app v247+).
Done when: you have your Meta App ID (configured in your app's settings) and your project appears in the Wearables Dev Center. Your glasses should be paired to the Meta AI app (with Developer Mode enabled for testing). You're now ready to run the sample app.
4) Quickstart A — Run the Sample App (iOS)
Goal
Run Meta's official Camera Access sample app on iOS and verify that you can start a live video stream from your Ray-Ban Meta glasses (or mock device) and see the feed in the app.
Step 1 — Get the sample
- Option 1: Clone the iOS SDK repository. In a terminal, run `git clone https://github.com/facebook/meta-wearables-dat-ios.git`, then open the Xcode project at `samples/CameraAccess/CameraAccess.xcodeproj`.
- Option 2: Download the repository as a ZIP from GitHub and extract it. Open the `samples/CameraAccess` project in Xcode.
Step 2 — Install dependencies
The sample app uses Swift Package Manager to include the Meta Wearables SDK:
- If needed, add the Meta Wearables package to the project. In Xcode, go to File > Add Packages... and enter the package URL: https://github.com/facebook/meta-wearables-dat-ios. Select the latest version (e.g. 0.3.0) of the library.
- If prompted, also add any dependent packages. (The toolkit may include subspecs for camera, etc., but the SPM package should fetch all required components automatically.)
- After adding, build the project to fetch packages. The sample should compile with the SDK integrated.
Note: The sample app is configured for Swift and uses the MWDAT (Meta Wearables Device Access Toolkit) package distributed via SPM. If the project doesn't build immediately, ensure SPM resolved correctly.
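If part of your code lives in a local Swift package (rather than only the Xcode project), the same dependency can be declared in a Package.swift. The sketch below makes that assumption; the product name "MWDAT" is a guess, so check the resolved package for the actual product name:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "GlassesLiveFeature",
    platforms: [
        .iOS("15.2") // matches the toolkit's minimum iOS version noted above
    ],
    dependencies: [
        // Same package the Xcode "Add Packages..." step resolves.
        .package(url: "https://github.com/facebook/meta-wearables-dat-ios", from: "0.3.0")
    ],
    targets: [
        .target(
            name: "GlassesLiveFeature",
            dependencies: [
                // "MWDAT" is an assumption -- verify against the resolved package.
                .product(name: "MWDAT", package: "meta-wearables-dat-ios")
            ]
        )
    ]
)
```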
Step 3 — Configure the app
Before running, plug in your integration credentials:
- Info.plist setup: Open Info.plist in the sample's target. Under the MWDAT configuration dictionary, set the MetaAppID to your app's ID from the portal. Also set the AppLinkURLScheme to the custom URL scheme you got (e.g. "myglasseslive" – without "://" here). Ensure this same scheme is also added to the CFBundleURLSchemes array of the Info.plist.
- Bundle ID: (Optional) You can update the app's bundle identifier to match what you registered (e.g. com.yourname.MyGlassesLiveApp). This isn't strictly required in Developer Mode, but aligning it with the portal registration is good practice for later deployment.
- Usage descriptions: Add usage description strings in Info.plist if not present:
  - NSBluetoothAlwaysUsageDescription – explain why the app uses Bluetooth (e.g. "This app uses Bluetooth to connect to your Ray-Ban Meta glasses.").
  - NSCameraUsageDescription – explain the camera usage (the glasses camera, e.g. "... to stream video from your glasses' camera.").
  - NSMicrophoneUsageDescription – if you plan on capturing audio via the glasses (optional in this sample).
- Capabilities: Ensure Bluetooth capability is enabled. In Xcode target settings -> Signing & Capabilities, you may need to add "Background Modes" with "Uses Bluetooth LE accessories" if you want the connection to persist in background (for now, you can skip background use in this test).
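For reference, the resulting Info.plist entries might look roughly like the sketch below. The key names follow the MWDAT dictionary, MetaAppID, AppLinkURLScheme, and CFBundleURLSchemes entries described above – copy the exact snippet the portal shows rather than treating this as authoritative:

```xml
<key>MWDAT</key>
<dict>
    <key>MetaAppID</key>
    <string>YOUR_META_APP_ID</string>
    <key>AppLinkURLScheme</key>
    <string>myglasseslive</string>
</dict>
<key>CFBundleURLTypes</key>
<array>
    <dict>
        <key>CFBundleURLSchemes</key>
        <array>
            <string>myglasseslive</string>
        </array>
    </dict>
</array>
<key>NSBluetoothAlwaysUsageDescription</key>
<string>This app uses Bluetooth to connect to your Ray-Ban Meta glasses.</string>
<key>NSCameraUsageDescription</key>
<string>Used to stream video from your glasses' camera.</string>
```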
Step 4 — Run
- Connect your iPhone to your Mac (or use a wireless run if configured). In Xcode, select the CameraAccess target and choose your iPhone as the run destination.
- Build and Run the app on the device. The first time, Xcode will install the app on your phone.
- Watch for any iOS permission prompts on launch (it might ask for Bluetooth permission; allow it).
Step 5 — Connect to wearable/mock
On the app's launch screen, you should see a Connect button (provided by the SDK's UI):
- Pair with glasses: Put on or turn on your Ray-Ban Meta glasses. Tap Connect in the sample app. This will open the Meta AI companion app via a deep link to initiate pairing. In the Meta app, you should see a prompt to Connect the sample app with your glasses (it will list your app name and that it's unverified). Approve the connection.
- Grant permissions: After pairing, the Meta app will ask you to allow camera access for the sample app (since the app requests the glasses' camera). Choose "Always allow" or "Allow once" when prompted, so the sample can stream video. The Meta app then returns you to the sample.
- Using Mock instead: If you don't have hardware, you can simulate this process. In the sample app, there may be an option or setting to use a Mock Device (alternatively, the SDK might detect no device and offer a mock pairing). Ensure Developer Mode is on in the Meta app to bypass real device checks.
Verify
- Connected status: The sample app should now indicate that it's connected to the glasses (e.g. a status label or icon shows Connected). If using real glasses, the LED on the glasses may light up to indicate an active connection.
- Live video stream: In the sample app, tap the "Start Stream" button. You should see the camera view from your glasses appear on-screen as a live video feed. The glasses' capture LED will turn on, and you'll see real-time imagery from the glasses in the app.
- Stop stream: Tap "Stop Streaming" (or wait for the preset duration, after which the stream stops automatically). The video feed should cease, and the LED on the glasses will turn off.
Common issues
- Build errors (package not found): If the app fails to build due to missing packages, ensure the Swift Package dependency was added. Try File > Packages > Reset Package Cache and rebuild.
- App not connecting (no response on tapping Connect): Double-check that you enabled Developer Mode in the Meta AI app and that your Info.plist has the correct MetaAppID and URL scheme configured. A misconfigured scheme or ID will prevent the handoff between your app and the Meta app.
- Permissions denied: If you accidentally chose "Don't allow" for Bluetooth or camera, the stream won't start. Fix: go to iPhone Settings > Privacy and enable Bluetooth for your app; in the Meta AI app, remove the integration (if it's listed) and re-initiate the Connect flow to grant camera access.
- No device found: If the sample can't find your glasses, ensure the glasses are powered on and paired to the Meta AI app (you should see the device in the Meta app). Keep the glasses close to the phone. If using mock mode, ensure the Mock Device Kit is properly configured in the app.
- "Unverified App" or connection refused: This means Developer Mode might be off. Ensure Developer Mode is toggled on in the Meta AI app settings on your phone so that it will allow your test app to connect without a proper production signature.
5) Quickstart B — Run the Sample App (Android)
Goal
Run the official Camera Access sample on Android and verify that you can connect and stream video from the Ray-Ban Meta glasses (or a mock device) in the sample app.
Step 1 — Get the sample
- Clone the Android SDK repo: run `git clone https://github.com/facebook/meta-wearables-dat-android.git`. In Android Studio, choose File > Open... and select the `samples/CameraAccess` project directory.
- (If you downloaded the repo as a ZIP, extract it and open the `samples/CameraAccess` folder as an Android Studio project.)
Step 2 — Configure dependencies
The Android sample uses the Wearables Toolkit libraries. You need to add Meta's GitHub Maven repository and your credentials:
- Add GitHub Packages repo: In the project's settings.gradle or root build.gradle, add Meta's GitHub Maven endpoint. For example:

```gradle
repositories {
    maven { url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android" }
}
```

(You will supply a GitHub token for access – see the next step.)

- Authenticate: Generate a GitHub personal access token (with at least the read:packages scope). In your global ~/.gradle/gradle.properties (or the project's gradle.properties), add:

```properties
gpr.user=YOUR_GITHUB_USERNAME
gpr.key=YOUR_GITHUB_TOKEN
```

The repository declaration can then include credentials using these properties.

- Add SDK dependencies: Open the app module's build.gradle (or build.gradle.kts) and add the Wearables SDK dependencies. For example:

```kotlin
dependencies {
    implementation("com.meta.wearable:mwdat-core:0.3.0")
    implementation("com.meta.wearable:mwdat-camera:0.3.0")
    // (Add mwdat-mockdevice as well if you plan to use the Mock Device in tests)
}
```

These correspond to the core toolkit and camera functionality. Sync your Gradle project to fetch them.
Step 3 — Configure app
Now prepare the sample app with your integration info:
- Application ID: Change the applicationId (package name) of the sample app to a unique value (e.g. com.yourcompany.myglasseslive). You can do this from app/build.gradle or the manifest. Using a distinct ID helps avoid conflicts and should match what you intend to register.
- AndroidManifest setup: In AndroidManifest.xml, insert the metadata for the Meta App ID and app link scheme. From the portal's App Configuration, you'll have a snippet. For example:

```xml
<application ...>
    <!-- Meta Wearables config -->
    <meta-data android:name="MWDAT.MetaAppID" android:value="YOUR_META_APP_ID"/>
    <meta-data android:name="MWDAT.AppLinkURLScheme" android:value="myglasseslive"/>
    ...
    <!-- (Existing activities, etc.) -->
</application>
```

This provides the SDK with your app's ID and callback scheme. If required, also add an <intent-filter> to your launch activity to handle the custom scheme redirect from the Meta app (e.g., listen for myglasseslive:// URIs).
- Required permissions: Make sure the manifest declares the needed permissions:
  - <uses-permission android:name="android.permission.BLUETOOTH" /> and <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" /> for older Android, and on Android 12+: <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (and BLUETOOTH_SCAN if the SDK needs to scan for the device).
  - <uses-permission android:name="android.permission.RECORD_AUDIO" /> if you plan on using the glasses' microphone (for voice commands or streaming audio).
  - (Optional) <uses-permission android:name="android.permission.CAMERA" /> – not strictly needed for the glasses camera, but useful if your app also uses the phone camera or if Google Play requires a camera permission because you advertise camera features.
- Gradle sync: After these changes, sync the project. Ensure there are no errors – if the SDK isn't found, re-check the repository URL and authentication token setup.
Step 4 — Run
- In Android Studio, select the CameraAccess run configuration (or simply the app module). Choose your Android device from the device dropdown (USB debugging enabled, or use ADB over network).
- Click Run (the ▶️ play button). The app will build and install on your phone. Grant any prompts (e.g. it may ask for Bluetooth permission on launch – approve it).
- The sample app should launch on your device.
Step 5 — Connect to wearable/mock
- Initiate connection: In the Android sample app, tap the Connect button. This should open the Meta AI companion app similar to iOS, prompting you to connect the app to your glasses. Approve the pairing in Meta AI app.
- Authorize camera: The Meta app will ask to allow camera access for your app. Choose allow so that the Android sample can use the glasses camera.
- Return to app: After approval, it should deep link back to the sample app. You'll now be connected. If using a physical device, ensure glasses are on and near the phone. If using the Mock device, the SDK's mock framework will simulate this flow (you might enable a developer setting in the sample to use the Mock Device Kit).
- Permissions on Android: If any system permission dialogs pop up (Bluetooth, audio recording), grant them. On Android 13+, you might see a notification permission request as well (not crucial for streaming).
Verify
- Connected status: The sample app should display that it's connected to the glasses (e.g., "Connected" state or similar UI indicator). The glasses should also show a connection (some glasses might play a sound or show an LED when an app connects).
- Live stream works: Tap on "Start Stream" or the camera icon in the sample app. You should see the live video from the glasses on your phone screen. The imagery updates in real-time (with a slight latency). The Ray-Ban's front LED will illuminate while streaming is active.
- End stream: Use the "Stop" button to end the session. The video stops and the glasses camera turns off (LED off). The app remains connected for future actions until you disconnect or remove the glasses.
Common issues
- Gradle authentication error: If Gradle fails to download the SDK (401 Unauthorized), your GitHub Packages credentials might be misconfigured. Double-check the gpr.user and gpr.key values in gradle.properties, and ensure you spelled the repository URL correctly (GitHub package names are case-sensitive).
- Manifest merger conflict: If you see errors about manifest entries, ensure you placed the <meta-data> tags inside the <application> element, not elsewhere. If an intent-filter for the scheme already exists, merge or update it rather than duplicating it.
- Permissions issues: If the camera stream doesn't start, check that "Camera access" was allowed in the Meta app for your integration. You may need to re-run the connect flow if it was denied. Also verify Bluetooth permission was granted on the phone (go to Settings > Apps > YourApp > Permissions).
6) Integration Guide — Add Meta Wearables Toolkit to an Existing Mobile App
Goal
Integrate the Wearables Device Access Toolkit into your own app and implement one end-to-end feature (e.g. live video streaming from the glasses) in your app's workflow.
Architecture
Your mobile app will communicate with the glasses via the SDK, which manages the low-level connection:
- App UI (your view controllers/fragments) → Wearables SDK client → (Bluetooth) → Ray-Ban Meta glasses.
- Glasses events (camera frames, status changes) come back through SDK callbacks, which your app uses to update UI or store data.
- The Meta AI companion app acts as a broker for pairing and permissions, but after setup, your app talks to the glasses directly through the SDK's session.
Step 1 — Install the SDK
iOS
- Use Swift Package Manager to add the Meta Wearables SDK to your project. In Xcode, add the package https://github.com/facebook/meta-wearables-dat-ios and integrate the latest version.
Android
- Add the Maven repository and dependency as in the quickstart:
- In settings.gradle, include the GitHub Packages Maven repository for facebook/meta-wearables-dat-android.
- In the app build.gradle, add the core SDK and any feature modules:

```gradle
implementation("com.meta.wearable:mwdat-core:0.3.0")
implementation("com.meta.wearable:mwdat-camera:0.3.0")
```

(Include mwdat-mockdevice for testing if needed.)
Verify the SDK is imported by building your app. No runtime functionality yet – next, you'll configure permissions and initialization.
Step 2 — Add permissions
iOS (Info.plist)
Add the necessary usage descriptions to Info.plist:
- NSCameraUsageDescription – e.g. "Needed to display video from your connected smart glasses." (Even though it's not the iPhone camera, it's good practice to clarify camera usage for the user.)
- NSBluetoothAlwaysUsageDescription – e.g. "Needed to connect to your Ray-Ban Meta smart glasses via Bluetooth."
- NSMicrophoneUsageDescription – e.g. "Needed to capture audio from your smart glasses' microphone for live streaming." (If your feature will use audio.)
Also ensure your app's URL scheme (from the Wearables Dev Center config) is set up in CFBundleURLTypes, as mentioned earlier, so the Meta app can call back.
Android (AndroidManifest.xml)
Declare the required permissions:
- <uses-permission android:name="android.permission.BLUETOOTH" /> and android.permission.BLUETOOTH_ADMIN for broad compatibility, plus <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (for API 31+).
- <uses-permission android:name="android.permission.RECORD_AUDIO" /> if using audio.
- Optionally <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" /> if the glasses pairing process requires location (on some Android versions, Bluetooth scanning requires location permission).
- (No need for the camera permission unless your app also uses the phone camera.)
Also add the intent filter for your custom URL scheme in the manifest (inside your launch activity) if not already done, so that the Meta AI app can redirect to your app after user approvals.
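For illustration, an intent filter for the custom scheme might look like the snippet below on your launch activity. The activity name and scheme value are placeholders – use the scheme from your portal configuration:

```xml
<activity android:name=".MainActivity" android:exported="true">
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <!-- Matches myglasseslive:// links sent back by the Meta AI app -->
        <data android:scheme="myglasseslive" />
    </intent-filter>
</activity>
```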
Step 3 — Create a thin client wrapper
To keep your code organized, abstract the glasses integration into a few helper classes:
- WearablesClient (singleton or manager): Handles connecting/disconnecting sessions with the glasses. For example, it could expose methods like connect() (which internally uses the SDK's provided Connect UI or API to initiate pairing) and manage the session lifecycle. It listens for connection state changes (connected, disconnected, errors) and notifies the app.
- CameraService (FeatureService): Encapsulates camera-related actions. This might have methods like startLiveStream() and stopLiveStream(), which under the hood call the SDK's camera API.
- PermissionsService: Utility to check and request permissions (Bluetooth, microphone, etc.) at runtime. This can wrap the platform-specific permission prompts so your main code stays clean. (A Swift sketch of these wrappers follows below.)
Implementing these with the SDK:
- Initialize the SDK (if needed) when your app launches or when the user navigates to the feature. (Some SDKs require an init with the App ID; check Meta's documentation for any initialization call.)
- Use the Connect button or API from the SDK in your WearablesClient. The iOS/Android SDK provides a ready-made UI component for connecting to glasses – you can use it, or implement your own flow that triggers the deep link to the Meta app.
- Manage the session: the SDK uses a session concept for the connection. Your WearablesClient should create a session when needed and close it when done (for example, if the user disconnects or the app closes).
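Here is a minimal Swift sketch of those wrappers. The class and method names (WearablesClient, CameraService, PermissionsService, connect(), startLiveStream()) follow the outline above; the actual SDK calls are left as TODO placeholders because this guide doesn't assume the preview toolkit's exact API surface:

```swift
import Foundation
import UIKit

// Thin wrappers around the toolkit, as outlined above. Fill in the TODOs with
// the real MWDAT session / Connect / camera calls from Meta's documentation.

enum GlassesConnectionState {
    case disconnected
    case connecting
    case connected
    case failed(Error)
}

protocol WearablesClientDelegate: AnyObject {
    func wearablesClient(_ client: WearablesClient, didChangeState state: GlassesConnectionState)
}

final class WearablesClient {
    static let shared = WearablesClient()
    weak var delegate: WearablesClientDelegate?
    private(set) var state: GlassesConnectionState = .disconnected

    func connect() {
        state = .connecting
        delegate?.wearablesClient(self, didChangeState: state)
        // TODO: trigger the SDK's Connect flow (deep link to the Meta AI app), create
        // a session, and update `state` from the SDK's connection callbacks.
    }

    func disconnect() {
        // TODO: close the SDK session.
        state = .disconnected
        delegate?.wearablesClient(self, didChangeState: state)
    }
}

final class CameraService {
    private var onFrame: ((UIImage) -> Void)?

    func startLiveStream(onFrame: @escaping (UIImage) -> Void) throws {
        self.onFrame = onFrame
        // TODO: call the SDK's camera-stream API and forward each decoded frame to `onFrame`.
    }

    func stopLiveStream() {
        // TODO: call the SDK's stop API and tear down the stream.
        onFrame = nil
    }
}

struct PermissionsService {
    // TODO: check/request Bluetooth and microphone permissions as needed.
    func allPermissionsGranted() -> Bool { true }
}
```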
Definition of done:
- Your app successfully initializes the Wearables SDK (using your Meta App ID).
- It can trigger the pairing flow and establish a connection with the glasses (handled via your WearablesClient).
- The connection state is managed (with reconnection or appropriate user prompts if glasses go out of range).
- Errors (e.g. failed to connect, user denied permission) are caught and displayed to the user (and/or logged). The app should fail gracefully (no crashes) if the glasses aren't available.
Step 4 — Add a minimal UI screen
Design a simple interface for the feature:
- Connect button: to initiate pairing. This could be a button that says "Connect Glasses" (and perhaps shows the current connection status as text). Tapping it invokes your WearablesClient.connect(), which in turn uses the SDK's Connect mechanism.
- Status indicator: A small label or icon showing Connected or Disconnected. Optionally, show the device name once connected (e.g. "Connected to Ray-Ban Meta - John's Glasses").
- Action button: e.g. "Start Live Stream". When tapped, it calls your CameraService.startLiveStream(). You might toggle the label to "Stop Stream" when streaming.
- Video preview view: an area in your UI (UIImageView, Canvas, TextureView etc.) to display the incoming video frames from the glasses.
- Result handling: Since this feature is streaming, the "result" is continuous. But for example, if you were doing photo capture, you'd show the captured image thumbnail and maybe a save/share button. For streaming, you might show a timer or an indicator that it's live (and possibly send the stream to a server or save locally if that's a goal).
With this minimal UI and the client wrappers, you can now initiate and stop a live video stream from the glasses within your own app.
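As an illustration, a minimal SwiftUI version of this screen might look like the sketch below. GlassesViewModel is a hypothetical glue object that wraps the WearablesClient/CameraService wrappers from Step 3 and publishes state for the UI; adapt the idea to your own architecture (UIKit, Jetpack Compose, etc.):

```swift
import SwiftUI
import UIKit

// Hypothetical glue object: wraps the wrappers from Step 3 and publishes state.
// Replace the TODO bodies with real calls into WearablesClient / CameraService.
final class GlassesViewModel: ObservableObject {
    @Published var isConnected = false
    @Published var isStreaming = false
    @Published var latestFrame: UIImage?

    func connect() {
        // TODO: WearablesClient.shared.connect(); flip isConnected from its delegate callback.
    }

    func startStream() {
        // TODO: cameraService.startLiveStream { frame in self.latestFrame = frame }
        isStreaming = true
    }

    func stopStream() {
        // TODO: cameraService.stopLiveStream()
        isStreaming = false
    }
}

struct GlassesLiveView: View {
    @ObservedObject var model: GlassesViewModel

    var body: some View {
        VStack(spacing: 16) {
            // Status indicator
            Text(model.isConnected ? "Connected" : "Disconnected")
                .foregroundColor(model.isConnected ? .green : .secondary)

            // Video preview area: shows the latest frame from the glasses, if any
            if let frame = model.latestFrame {
                Image(uiImage: frame)
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } else {
                Rectangle()
                    .fill(Color.black)
                    .aspectRatio(16.0 / 9.0, contentMode: .fit)
            }

            // Connect button
            Button("Connect Glasses") { model.connect() }
                .disabled(model.isConnected)

            // Action button toggles between start and stop
            Button(model.isStreaming ? "Stop Stream" : "Start Live Stream") {
                if model.isStreaming { model.stopStream() } else { model.startStream() }
            }
            .disabled(!model.isConnected)
        }
        .padding()
    }
}
```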
7) Feature Recipe — Stream Live Video from Your Glasses to the App
Goal
When the user taps "Go Live" in your app, the Ray-Ban Meta glasses start capturing live video. The video is streamed to your app in real time, and your app displays the feed (and could further broadcast it or record it). When done, the user stops the stream and the video feed ends.
UX flow
- Ensure connected: The user must have their glasses connected (your app should show a "Connect" step if not). If not connected, tapping Go Live should prompt them to connect first.
- Tap Go Live: User presses the live stream button in the app.
- Show live indicator: The app UI changes to indicate it's streaming – e.g. show a "Live" badge or a blinking red dot and maybe a timer.
- Glasses capturing: The glasses' camera turns on (the wearer sees the capture LED come on). They can keep doing their activity hands-free.
- Video in app: The app displays the live video from the glasses on screen. (If the app also forwards this stream to an online service, that happens in the background.)
- Stop streaming: The user taps "Stop" in the app. The app sends the stop command to the glasses, the video feed stops and LED turns off. The UI exits the live view.
Implementation checklist
- Connection check: Before starting, verify the app is connected to the glasses. If not, guide the user to connect (e.g., call your connect flow).
- Permissions check: Ensure the required permissions have been granted:
- Bluetooth (for maintaining the connection).
- Camera access on the glasses (the Meta companion app handles this, but your app should handle the case where it was denied).
- Microphone (if you plan to capture audio along with video, request RECORD_AUDIO on the phone).
- Start stream: Call the SDK's function to begin streaming. For example, on iOS: Wearables.shared.camera.startVideoStream() (hypothetical) and on Android: wearablesClient.startCameraStream(). This will initiate a session if not already running.
- Handle streaming data: As frames arrive, render them in your UI. The SDK might provide a delegate/callback with an image or pixel buffer. Update the UIImage/Bitmap on the main thread to show the video. Keep track of the stream duration if you have a limit.
- Timeout or limits: Optionally implement a safety timer. For example, auto-stop the stream after, say, 5 minutes to conserve battery (the sample app includes a timer feature to terminate streams after a set duration).
- Stop stream: On user action or timeout, call the SDK's stop function (e.g. stopVideoStream()). Update your UI (remove the "Live" indicator, etc.).
- Persist or clean up: If needed, save some evidence of the stream (maybe a short video clip or individual frames). Ensure you handle any SDK callbacks for stream-ended events or errors.
Pseudocode
```swift
func onGoLiveButtonTapped() {
guard glassesClient.isConnected else {
showAlert("Please connect your glasses first.")
return
}
guard permissionsService.allPermissionsGranted() else {
permissionsService.requestMissingPermissions()
return
}
if !isStreaming {
// Start streaming
ui.showStatus("Starting live stream...")
do {
try cameraService.startLiveStream(onFrame: { frameImage in
ui.videoView.display(frameImage) // update video view with new frame
})
ui.showLiveIndicator(true)
isStreaming = true
} catch {
ui.showStatus("Failed to start stream")
log("Stream error:\\(error)")
}
} else {
// Stop streaming
cameraService.stopLiveStream()
ui.showLiveIndicator(false)
ui.showStatus("Stream stopped")
isStreaming = false
}
}
```
(Pseudo-code above in a Swift-like style – your actual API calls will depend on the SDK's provided methods.)
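For the frame-handling and auto-stop items on the checklist, here is a small Swift sketch built on the hypothetical CameraService wrapper from the integration guide. The main-thread dispatch and the five-minute safety timer mirror the guidance above:

```swift
import UIKit

// Frame handling + auto-stop, per the checklist above. `CameraService` is the
// hypothetical wrapper from the integration guide; frames may arrive on a
// background queue, so UI updates hop to the main thread.
final class LiveStreamController {
    private let cameraService = CameraService()
    private weak var videoView: UIImageView?
    private var stopTimer: Timer?

    init(videoView: UIImageView) {
        self.videoView = videoView
    }

    func startStream(maxDuration: TimeInterval = 5 * 60) throws {
        try cameraService.startLiveStream { [weak self] frame in
            // Never touch UIKit off the main thread.
            DispatchQueue.main.async {
                self?.videoView?.image = frame
            }
        }
        // Safety timer: auto-stop after maxDuration to conserve battery.
        stopTimer = Timer.scheduledTimer(withTimeInterval: maxDuration, repeats: false) { [weak self] _ in
            self?.stopStream()
        }
    }

    func stopStream() {
        stopTimer?.invalidate()
        stopTimer = nil
        cameraService.stopLiveStream()
    }
}
```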
Troubleshooting
- No video frames (black screen): If the stream started but you only see a blank view, check that the glasses' camera permission was granted. The glasses signal capturing by lighting a LED; if you don't see the LED, the camera feed might not be active. You may need to re-initiate the session with proper permissions. Also verify your frame rendering logic (are you dispatching to the UI thread correctly?).
- Stream unexpectedly stops: If the stream cuts off, it could be due to a connection drop. Implement the SDK's callbacks for interruptions – e.g., if the user takes off the glasses or a phone call comes in, the session might pause. Your app should handle an onSessionInterrupted event by informing the user and offering to reconnect or restart.
- Video quality issues: Live video quality will adjust based on Bluetooth bandwidth. The toolkit will automatically reduce resolution or frame rate if the connection is weak. If the video is choppy or low-res, try moving the phone closer to the glasses to improve the Bluetooth signal. You can also programmatically request a lower-quality stream to ensure smooth delivery.
- Latency concerns: There will be a slight delay (maybe a few hundred milliseconds) in the video. This is normal due to Bluetooth transmission and processing. If your use case needs ultra-low latency, keep the phone and glasses in close proximity and minimize other Bluetooth interference. Currently, the stream is point-to-point to the phone; any additional cloud broadcast will add more latency, so plan accordingly.
8) Testing Matrix
| Scenario | Expected Outcome | Notes |
|---|---|---|
| Mock device (no hardware) | Stream starts and frames are received (simulated) | Use the Mock Device Kit to simulate glasses in CI or automated tests. |
| Real device – close range | Smooth video at high quality, minimal latency | Test with phone next to glasses (optimal Bluetooth). 720p/30fps should be achievable. |
| Real device – far/obstructed | Video may auto-reduce quality or stutter | Expect resolution/frame rate drop if Bluetooth bandwidth is low. No stream should be under 15fps. |
| App in background | Stream likely pauses or stops safely | By design, the session will not continue indefinitely in background. Ensure no crashes if user backgrounds the app mid-stream (the SDK handles interruption). |
| Permission denied (camera) | User is prompted or sees error message | E.g., if camera access on glasses was denied, the app should detect and inform user to enable it. No silent failure. |
| Disconnect mid-stream | Graceful recovery or user notification | If glasses power off or disconnect, the app should stop the stream, alert the user, and attempt reconnection if possible (don't crash on null frames). |
Make sure to test both on physical devices and using the Mock Device Kit for automated scenarios. Test various phone models and OS versions (iOS and Android) since Bluetooth behavior can vary.
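For the mock-device row, one way to make the streaming path testable in CI is to put your streaming calls behind your own protocol and substitute a fake in unit tests. The Swift sketch below is illustrative only – it does not use the real Mock Device Kit API, which you would wire into a production-ready fake:

```swift
import XCTest

// App-owned abstraction over the streaming calls (see the integration guide).
protocol StreamingService {
    func startLiveStream(onFrame: @escaping (Data) -> Void) throws
    func stopLiveStream()
}

// Fake used in tests: simulates a single "frame" arriving shortly after start.
final class FakeStreamingService: StreamingService {
    func startLiveStream(onFrame: @escaping (Data) -> Void) throws {
        DispatchQueue.global().asyncAfter(deadline: .now() + 0.1) {
            onFrame(Data([0x00]))
        }
    }
    func stopLiveStream() {}
}

final class StreamingTests: XCTestCase {
    func testFramesArriveAfterStart() throws {
        let service: StreamingService = FakeStreamingService()
        let gotFrame = expectation(description: "received a frame")
        try service.startLiveStream { _ in gotFrame.fulfill() }
        wait(for: [gotFrame], timeout: 1.0)
    }
}
```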
9) Observability and Logging
Implement logging to track key events and metrics for your wearables integration:
- Connection events: Log when you start connecting (connect_start), when the glasses successfully connect (connect_success), and if a connection attempt fails (connect_fail with error details).
- Permission state: Log the status of permissions – e.g. permissions_ok when all are granted, or specific logs if a user denies access (to troubleshoot onboarding issues).
- Streaming events: Log stream_start (with any parameters like requested resolution), stream_frame_received (could include frame timestamp or size), stream_stop (normal completion), and stream_error (with error codes if the stream fails unexpectedly).
- Performance metrics: Measure and log the stream latency and throughput. For example:
  - Time from stream_start to first frame (first_frame_ms).
  - Average frame interval or FPS observed (stream_fps_avg).
  - If applicable, the auto-adjusted quality level (e.g. "720p@24fps" vs "360p@15fps") so you have data on adaptation.
- Reconnection metrics: If you implement auto-reconnect, log an event each time a reconnection is attempted and whether it succeeds (reconnect_attempt, reconnect_success).
- User actions: Log when the user manually stops the stream or disconnects (user_stopped_stream, user_disconnected), as distinct from system interruptions.
Use these logs for debugging and also consider sending them to an analytics service if allowed, to gather usage data during the preview. This will help identify issues like frequent disconnects or permission pain points.
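As a starting point, a small Swift telemetry helper emitting the event names above might look like the sketch below. It logs via os.Logger with a placeholder subsystem; route the same events to your analytics service if allowed:

```swift
import Foundation
import os

// Minimal event logger mirroring the event names listed above.
// The subsystem string is a placeholder -- use your app's bundle identifier.
struct WearablesTelemetry {
    private let logger = Logger(subsystem: "com.example.glasseslive", category: "wearables")
    private var streamStartedAt: Date?
    private var firstFrameLogged = false

    func connectAttempt() {
        logger.info("connect_start")
    }

    func connectResult(success: Bool, error: String? = nil) {
        if success {
            logger.info("connect_success")
        } else {
            logger.error("connect_fail error=\(error ?? "unknown", privacy: .public)")
        }
    }

    mutating func streamStarted(resolution: String) {
        streamStartedAt = Date()
        firstFrameLogged = false
        logger.info("stream_start resolution=\(resolution, privacy: .public)")
    }

    mutating func frameReceived() {
        // Log only the first frame's latency (first_frame_ms) after stream_start.
        guard !firstFrameLogged, let start = streamStartedAt else { return }
        firstFrameLogged = true
        let ms = Int(Date().timeIntervalSince(start) * 1000)
        logger.info("first_frame_ms=\(ms)")
    }

    func streamStopped() {
        logger.info("stream_stop")
    }
}
```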
10) FAQ
- Q: Do I need the actual Ray-Ban Meta hardware to start developing? A: Not initially. Meta provides a Mock Device Kit that simulates the glasses for development. You can develop and test the basic integration on an emulator or device without hardware. However, to experience real-world performance (Bluetooth bandwidth, real camera feed), testing on an actual device (Ray-Ban Meta Gen1 or Gen2 glasses) is highly recommended before releasing.
- Q: Which smart glasses are supported by this toolkit? A: The toolkit currently supports Meta's AI glasses portfolio: Ray-Ban Meta smart glasses (both first-gen and second-gen) and the Oakley Meta HSTN glasses. Support is expected to expand, but as of now, the new Ray-Ban model with AR displays is not yet supported for third-party development (display features are off-limits in this preview). Essentially, camera and audio features on Ray-Ban Meta glasses are the focus.
- Q: Can I ship apps using this in production (app stores)? A: Not to the general public at this time. The Wearables Device Access Toolkit is in developer preview. Apps you build can be distributed to testers (e.g., via TestFlight or closed tracks) within your organization, but only select partners can publish integrations broadly during the preview phase. Meta plans to open up general availability in 2026 after gathering feedback. So for now, treat this as an experimental integration or pilot program.
- Q: Can I push content or notifications to the glasses? A: Currently, no custom on-glasses displays or AR overlays are available to third-party apps. You also cannot arbitrarily send notifications to show up on the glasses. The toolkit is device access (sensors) only: you can pull camera and mic data. The glasses' speakers you can use indirectly (they act like Bluetooth headphones, so your app can play audio and it will output to the glasses). But you can't yet draw to a glasses HUD or control the LED indicators. Those kinds of "output" capabilities may come in future updates.
- Q: Can I integrate my own AI or computer vision with the stream? A: Absolutely. The SDK simply gives you the live video feed (and sensor data) from the glasses. What you do with that stream is up to you – for example, you can run it through an on-device ML model or send frames to a cloud vision API for analysis. Meta even encourages using their Llama APIs in tandem if it fits. The only caveat is that Meta's on-glasses AI (like the built-in Meta AI assistant) isn't open to developers yet, so you'd be using your own AI services.
- Q: Are there limits on streaming length or quality? A: The glasses have limited battery and Bluetooth bandwidth. While the SDK doesn't enforce a strict time limit, the sample app uses a 5-minute timer by default as a guideline. Video is capped at 720p 30fps max, and will downgrade if needed to maintain a stable connection. It's wise to assume streams are for short sessions (a few minutes) rather than hour-long broadcasts, given battery and possible thermal limits.
11) SEO Title Options
- "How to Get Access to Meta's Wearables SDK and Stream Video from Ray-Ban Glasses (iOS/Android)"
- "Step-by-Step: Integrate Ray-Ban Meta Smart Glasses Live Camera into Your Mobile App"
- "Live Video from Ray-Ban Meta Glasses in Your App – Quickstart Guide for Developers"
- "Meta Wearables Toolkit Troubleshooting: Pairing, Permissions, and Streaming Issues Explained"
12) Changelog
- 2025-12-21 — Verified with Meta Wearables Device Access Toolkit v0.3.0 (Developer Preview). Tested sample apps on iOS 17 (Xcode 15) and Android 13, using Ray-Ban Meta Smart Glasses (Gen2, firmware v20) and the Meta AI app v247. Steps and screenshots updated for latest SDK preview.