How to Build a Glasses-Powered MVP in 7 Days with Meta's Wearables Device Access Toolkit on Mobile
Author: Almaz Khalilov
TL;DR
- You'll build: a smart glasses MVP that captures photos/videos from Meta AI glasses and streams them into a mobile app (think first-person POV and hands-free features).
- You'll do: Get access → Install the Wearables Toolkit SDK → Run the official sample app → Integrate the SDK into your own iOS/Android app → Test with real glasses or a mock device.
- You'll need: a Meta developer account (with Wearables Toolkit preview access), Ray-Ban Meta AI glasses (2nd gen) or supported device or the SDK's mock glasses, a development device (iPhone with iOS 16+/Android 12+) with Bluetooth enabled, and Xcode/Android Studio.
1) What is Meta's Wearables Device Access Toolkit?
What it enables
- Camera access (glasses POV): Capture photos or stream 720p video (up to 30 FPS) directly from the glasses' 12 MP wide-angle camera into your app. This allows first-person livestreaming and content capture, lifelogging, or computer vision features (e.g. real-time object recognition).
- Audio in/out: Leverage the glasses' 5-microphone array for voice input and open-ear speakers for audio output. Your app can record ambient audio or provide audible feedback to the wearer hands-free.
- Hands-free interactions: By tapping into the glasses' sensors and controls, apps can enable voice- and touch-free user experiences. For example, the toolkit supports POV use cases like a "What am I looking at?" AI assistant that describes the scene, or a remote expert guiding a user via what the glasses see.
When to use it
- Immersive productivity & assistive apps: Use it for "see-what-I-see" remote support, live how-to guides, or expert assistance tools where a wearer shares their view with an assistant on the other end. It's also great for assistive tech (e.g. apps helping visually impaired users via the glasses' camera).
- IRL streaming & content creation: Ideal for building live streaming or recording features that capture the world from the user's perspective. Early partners like Twitch and Streamlabs are already using the SDK to let users stream their first-person view hands-free.
- Field work and sports: Perfect for scenarios like on-course sports assistants (e.g. a golf app giving club recommendations based on what the golfer sees) or enterprise field service tools where workers can document and receive guidance without holding a phone.
Current limitations
- Device support (preview): Currently only Ray-Ban Meta smart glasses (2nd gen) and Oakley Meta HSTN glasses are supported in the developer preview. Support for new models (e.g. Ray-Ban Meta Display glasses with a HUD) is coming, but note that even when connected, those display glasses can only send camera imagery, not receive HUD content.
- Preview restrictions: The SDK is in developer preview status – you can build and test within your organization, but you cannot yet publish apps to the public with it; the preview is for internal testing only. General availability for production use is targeted for 2026, when Meta plans to open integrations to the general public.
- Features not (yet) included: Voice commands and on-glasses AI assistant integrations are not part of this initial preview; general voice controls are being explored for future updates. The toolkit focuses on camera, mic, and basic sensor access. Also, there's currently no API to push visuals to the glasses' display in this toolkit, so your app's UI remains on the phone.
- Usage requirements: The glasses must be paired with the Meta AI app on the phone, which acts as a bridge for all connections and permissions. Your users will need to have the Meta AI mobile app running and logged in, with the glasses connected, for your app to communicate with the wearables.
2) Prerequisites
Access requirements
- Meta developer account: Create or log in to the Meta for Developers portal. Ensure you have access to the Wearables Device Access Toolkit preview (you may need to apply for the preview program as it's gated – Meta provided a sign-up form for interested developers).
- Join/create an organization: In the Wearables Developer Center, register an organization for your team or project (this is required to manage projects and testing access).
- Enable preview access: Once accepted into the preview, agree to any terms and enable the Device Access Toolkit for your account. This gives you access to the SDK downloads and documentation.
- Create a project and release channel: Set up a Project/App ID in the wearables dev console. This will be used to identify your app integration. Also create a release channel if you plan to distribute test builds to others in your org (for internal testing only, since public publishing isn't open yet).
- (Optional) Meta AI app linked: Make sure you've installed the Meta AI companion app on your phone and linked it to your Meta account and glasses. While not a "developer portal" step, having the glasses connected via the Meta AI app is essential for testing later.
Platform setup
iOS
- Xcode 15+ with iOS 16.0 or later SDK (iOS 17 recommended). The toolkit libraries support modern iOS and require recent Xcode for Swift Package Manager integration.
- Swift Package Manager (SPM) enabled in Xcode (comes built-in). No CocoaPods needed – the SDK is distributed as a Swift Package.
- Device for testing: A physical iPhone with iOS 16+ is highly recommended (Bluetooth connectivity to the glasses won't work on an iOS simulator). Use an iPhone 12 or newer for best performance. (Simulator can be used with the mock device mode, but it won't connect to real Bluetooth hardware.)
Android
- Android Studio Giraffe (2022.3.1) or newer with Android SDK 33+ (Android 13); older versions may work but aren't guaranteed. Ensure you can compile against at least API Level 31.
- Android Gradle Plugin 8.1+ and Kotlin 1.8+ (the sample uses Kotlin DSL build scripts). The SDK is delivered via Maven packages on GitHub, which requires some Gradle configuration.
- Device for testing: A physical Android phone running Android 12+ with Bluetooth. (An emulator is generally not suitable for Bluetooth device testing. However, like on iOS, you could use the SDK's mock mode on an emulator for basic development if needed.)
Hardware or mock
- Supported smart glasses: At least one pair of Ray-Ban Meta AI Glasses (2nd Gen) or Oakley Meta HSTN glasses, updated with the latest firmware. These are the devices currently supported in the preview. If you have the upcoming Oakley Vanguard or Ray-Ban Meta Display glasses, note they aren't fully supported yet.
- OR use the Mock device kit: The SDK provides a built-in mock wearable device simulator (the mwdat-mockdevice library) that you can use if you don't have hardware. This lets you simulate a camera feed and test your app's logic without real glasses.
- Bluetooth and permissions: If using real glasses, make sure your phone's Bluetooth is on, and understand that users will need to grant Bluetooth, Camera, and Microphone permissions to your app. The glasses stream data via Bluetooth Low Energy, so those permissions (and possibly Location on Android for BLE scanning) are required.
3) Get Access to Meta's Wearables Device Access Toolkit
- Go to the Wearables Developer Center: Navigate to the Meta Wearables Dev Center and log in. From there, access the Device Access Toolkit section. (If you haven't been approved for the preview yet, you'll see info on how to request access.)
- Request preview access: Fill out the application form for the Device Access Toolkit preview (Meta might ask for info on your developer experience and use case). Submit the form and wait for approval.
- Accept terms: Once approved, you'll need to agree to the Meta Wearables Developer Terms and Acceptable Use Policy before getting the SDK. These outline data handling and privacy requirements for using the glasses SDK.
- Create a project: In the wearables dev console, create a new Project (this will generate an App ID/Project ID). For example, you might create "GlassesMVP" as a project. Under this project, register your iOS bundle ID and/or Android package name if required.
- Set up a release channel: Within your project, set up a Release Channel for testing. For instance, create an "InternalTest" channel and add yourself (and any team testers) as authorized users. This mechanism allows you to share test builds that can connect to the glasses.
- Download credentials (if any): Currently, the Device Access Toolkit doesn't require downloading API keys or config files – access is handled via the SDK and the Meta AI companion app. However, ensure you have:
- iOS: Your Apple Developer Team and Bundle ID configured in Xcode, and necessary entitlements (if any) enabled.
- Android: A GitHub personal access token (for fetching the Maven package) – we'll set this up in the next section. No special API token for Meta is needed at this stage.
Done when: You have access to the "Wearables Device Access Toolkit" in the Meta developer portal, have set up your project (with appropriate App IDs), and have the SDK download links or GitHub repos ready. At this point, you should also have your glasses paired with the Meta AI app and visible as "online/connected" in that app for testing.
4) Quickstart A — Run the Sample App (iOS)
Goal
Run Meta's official Camera Access sample app on iOS and verify that it can connect to your glasses (or the mock device) and perform a basic feature (like taking a photo via the glasses camera).
Step 1 — Get the sample
- Option 1: Clone the repo. Clone the iOS SDK repo with git clone https://github.com/facebook/meta-wearables-dat-ios.git, then open the sample Xcode project at meta-wearables-dat-ios/samples/CameraAccess in Xcode.
- Option 2: Download ZIP. On the GitHub page for the iOS toolkit, download the repository as a ZIP. Unzip it, then open samples/CameraAccess/CameraAccess.xcodeproj in Xcode.
- The sample "CameraAccess" app comes with minimal code to discover the glasses and capture images. It's a great starting point to ensure everything is set up correctly.
Step 2 — Install dependencies
- Add the SDK via Swift Package Manager: The sample might have the package reference already, but if not, add the package manually:
- In Xcode, go to File > Add Packages…
- Enter the repository URL: https://github.com/facebook/meta-wearables-dat-ios.
- Select the meta-wearables-dat-ios package that appears.
- Set the Version to the latest tag (e.g. 0.3.0).
- Add the package to the CameraAccess sample target.
- Resolve SPM packages: Xcode will fetch the toolkit's packages (it includes modules for core functionality and camera). You should see it added to your project's Package Dependencies.
- (No CocoaPods needed – Meta distributes this SDK via SPM. If your Xcode is older or you prefer manual integration, you could add the frameworks directly, but SPM is recommended.)
Step 3 — Configure app
- Bundle ID: Ensure the sample app's bundle identifier is unique within your Apple account (you may use the default or set one of your own). It doesn't necessarily need to match the Meta dev portal project for local testing.
- Permissions (Info.plist): Add usage description strings for the required permissions:
- NSCameraUsageDescription – e.g. "Allow using camera (glasses) to capture photos." (Not strictly the phone's camera, but good to include in case of any camera API usage.)
- NSMicrophoneUsageDescription – e.g. "Allow recording audio through smart glasses' microphones."
- NSBluetoothAlwaysUsageDescription – "Allow connecting to smart glasses via Bluetooth."
- Capabilities: In Xcode > Signing & Capabilities, enable Background Modes > Uses Bluetooth LE accessories if you intend the app to maintain a connection while backgrounded (for basic functionality this may not be needed, but it's good to prepare for real use cases).
- The sample app likely already requests needed permissions at runtime. When you run it, be ready to grant Bluetooth and microphone access prompts on the device.
Step 4 — Run
- Select the target: In Xcode's scheme selector, choose the CameraAccess sample app target.
- Choose a device: Select your iPhone as the run destination (connect it via USB or use Wi-Fi debugging). Do not use an iOS simulator, because it cannot interface with Bluetooth devices (the glasses).
- Build & Run: Hit the Run ▶️ button. The app will install on your iPhone and launch.
- Initial launch: Grant the requested permissions on the phone (Bluetooth permission prompt, microphone permission if prompted).
Step 5 — Connect to wearable/mock
- Pair the glasses: Make sure your Ray-Ban/Oakley glasses are on and connected to the Meta AI app on the iPhone. The Meta AI app must be running (foreground or background) as it brokers the connection. If not already paired, pair the glasses via the Meta app first.
- SDK connection: The sample app should automatically detect the presence of a supported wearable. There might be a "Connect" button in the UI – tap it, and the SDK will hand off to the Meta AI app to authorize the connection.
- Using the mock device: If you don't have hardware and included the mwdat-mockdevice module, the sample might allow switching to a "Mock Mode", which simulates a glasses device. (Ensure you've included mwdat-mockdevice in the dependencies if trying this.)
- Grant any glasses permissions: The first time the app tries to use the glasses' camera, the Meta AI companion app might ask for user consent (e.g., "Allow Camera Access from GlassesMVP?"). Approve this in the Meta AI app. The LED on the glasses will light up when streaming, as a privacy indicator.
Verify
- Connected status: The sample app should indicate when it's connected to the glasses (e.g., a status label or icon turning green). Verify that the app recognizes the glasses.
- Photo capture works: Use the sample's UI to trigger a photo capture from the glasses. Typically, this might be a button labeled "Take Photo" in the app. When tapped, the glasses should snap a picture (you might hear a shutter sound from the glasses). The image should then appear in the app's UI (or log output). Confirm that you see the photo or that no errors occurred.
- Live stream (if available): If the sample supports video streaming, start the stream and verify that you get live video frames from the glasses camera in the app. The preview might be low-res (720p) due to Bluetooth limits, but you should see it updating.
- Note: If you're using the mock device, verifying means you should see the sample app "connect" to a fake device and perhaps display a placeholder image or a test pattern from the simulated camera.
Common issues
- Build errors (iOS): If the project fails to build, check that you added the Swift Package correctly. An error about missing packages or No such module 'MWDAT' means the SPM integration failed – re-add the package and ensure Xcode resolved version 0.3.0 (or latest). If you see a code-signing error, update the sample app's bundle ID to one of your own and refresh provisioning profiles.
- Glasses not found: If the app says "No device" or doesn't connect, ensure your glasses are powered on and connected in the Meta AI app. The phone might need to be on the same Wi-Fi as the glasses (for Ray-Ban, Bluetooth alone should suffice). Also confirm that the Meta AI app is running; without it, the toolkit cannot find the glasses.
- Permission denied: If you trigger capture and nothing happens, or the app immediately shows an error, it could be a permission issue. Make sure you allowed Bluetooth (for connecting to glasses) and Microphone (for audio, if used). On iOS, check in Settings > Privacy that the app has Bluetooth access. On first use, iOS's Bluetooth permission dialog is easy to miss – if it was denied, you'll need to enable it in Settings for the app.
5) Quickstart B — Run the Sample App (Android)
Goal
Run the official Camera Access sample app on Android and verify it works with the glasses or mock device, similarly to the iOS quickstart. We'll build and run the sample to capture an image via the glasses.
Step 1 — Get the sample
- Clone the repo: git clone https://github.com/facebook/meta-wearables-dat-android.git and open the project in Android Studio. The sample is located under samples/CameraAccess.
- (If Android Studio prompts to import the Gradle project, do so. You might open the top-level build.gradle.kts or use the Import Project function pointing to the repository root.)
- The sample app is likely a minimal UI that connects to the glasses and has a button to take a photo. Opening it in Android Studio will allow us to set up dependencies next.
Step 2 — Configure dependencies
- Add GitHub Maven repository: The toolkit is distributed via GitHub Packages, so we need to authenticate to download it. In the project's settings.gradle.kts, add the Meta GitHub Maven repo:
// At the top of settings.gradle.kts:
import java.util.Properties

// Load local.properties so the GitHub token can stay out of version control.
val localProperties = Properties().apply {
    val file = rootDir.resolve("local.properties")
    if (file.exists()) file.inputStream().use { load(it) }
}

dependencyResolutionManagement {
    repositories {
        maven {
            url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
            credentials {
                username = "" // not needed when a token is supplied
                password = System.getenv("GITHUB_TOKEN")
                    ?: localProperties.getProperty("github_token")
            }
        }
        // ... other repos ...
    }
}
Ensure you have a GitHub personal access token (classic) with at least the read:packages scope, then make it available to Gradle either as a GITHUB_TOKEN environment variable or in your local.properties as github_token=YOUR_TOKEN.
- Add SDK dependencies: In the app module's build.gradle.kts, add the Wearables Toolkit libraries. The sample's Gradle version catalog (TOML) uses version 0.3.0:
dependencies {
implementation(libs.mwdat.core) // Core functionality
implementation(libs.mwdat.camera) // Camera access feature
implementation(libs.mwdat.mockdevice) // (Optional) Mock device for testing
}
If you aren't using the version catalog, you can directly add:
implementation("com.meta.wearable:mwdat-core:0.3.0")
implementation("com.meta.wearable:mwdat-camera:0.3.0")
implementation("com.meta.wearable:mwdat-mockdevice:0.3.0")
This brings in the SDK's core, camera, and mock device components. You can check the available versions in GitHub Packages.
- Sync Gradle: Let Gradle sync and download the artifacts. If prompted for credentials for the GitHub repo, ensure the token is configured properly (you shouldn't need a username for GitHub Packages if a token is provided).
- Note on Gradle config: The sample project may already have these steps done – just make sure the sync succeeds. If you encounter 401 Unauthorized errors, double-check the GitHub Packages token setup.
Step 3 — Configure app
- Application ID: Change the app's applicationId (in app/build.gradle) if needed to a unique package (especially if you plan to install alongside other samples). The default might be something like com.meta.wearables.sample.
- Permissions in AndroidManifest: Ensure the app manifest requests the required permissions:
- <uses-permission android:name="android.permission.BLUETOOTH" /> (for older Android versions).
- <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> and <uses-permission android:name="android.permission.BLUETOOTH_SCAN" /> (for Android 12+ to allow connecting to the glasses).
- <uses-permission android:name="android.permission.RECORD_AUDIO" /> (if audio is used or to future-proof for microphone input).
- (The camera on the glasses doesn't require android.permission.CAMERA on the phone, since it's not the phone's camera. But adding it won't hurt if your app also uses the phone camera.)
- Meta AI app requirement: Ensure the user has the Meta AI companion app installed on the Android device. The sample might check for it. Without the Meta app, the glasses likely cannot relay data to your app.
Step 4 — Run
- Select run configuration: In Android Studio, select the app run configuration (it should be the default one for the CameraAccess sample app).
- Choose a device: Pick a physical Android phone (via ADB). Connect your device via USB (enable USB debugging) or ensure it's visible to Android Studio. As noted, an emulator likely won't handle the Bluetooth interactions with real glasses.
- Run the app: Click Run. The app will build, install on the phone, and launch. Watch for any run-time permission prompts on first launch.
- Permissions on launch: Grant Bluetooth permission when requested (for Android 12+, you'll see prompts for Nearby Devices which covers Bluetooth scanning/connecting). Also grant Audio/Microphone if the app asks.
Step 5 — Connect to wearable/mock
- Pairing on Android: As with iOS, ensure your Meta glasses are paired to the phone via the Meta AI app. On Android, you should have the Meta AI app logged in and the glasses connected (check that the glasses show as connected in that app).
- Connecting in app: In the sample app, tap the "Connect" button (if it doesn't auto-connect). The Wearables SDK will likely pop a system dialog or utilize the Meta app to confirm connection. Approve any prompts. The glasses' LED should light when the camera is accessed.
- Using the mock: If no hardware is available, ensure you included the mwdat-mockdevice dependency and that the sample has an option for a mock device. This might be automatic if no real device is found. The mock simulates a glasses connection for testing.
- Grant glasses permissions: The first time you stream from the glasses, Android may show a system notification via the Meta app about casting or streaming, and the Meta AI app might need to grant permission for your app. Confirm all prompts to establish the link.
Verify
- App shows "Connected": The sample app should update its UI to indicate a successful connection to the glasses (e.g., "Device: Ray-Ban Meta – Connected"). Verify this appears.
- Photo capture works end-to-end: Press the sample app's capture button. The glasses should take a photo (you'll see the capture LED flash). The image should be sent to the app – typically the app will display it on screen or notify you of success (maybe logging the file path). Confirm that an image was received.
- (If streaming) Start a video stream if the sample supports it, and ensure you see live video on the phone from the glasses camera. The quality may auto-reduce if Bluetooth bandwidth is limited, but you should see motion.
- For a mock device test, verify that the app behaves as if a device is present and can "capture" (it might return a dummy image or a test pattern to the app).
Common issues
- Gradle authentication error: If you get errors like "Could not download mwdat-core… 401 Unauthorized" during Gradle sync or build, the GitHub Packages auth failed. Double-check that your personal access token is correct and not expired, and that you placed it in ~/.gradle/gradle.properties, local.properties, or an environment variable properly. Ensure the repository URL in Gradle matches exactly (including casing) what Meta's instructions say.
- Manifest merger conflict: If your app's AndroidManifest.xml has conflicting attributes (for example, something brought in by the SDK library), resolve them by merging or overriding in your app manifest. Usually, adding the needed permissions as described prevents most issues.
- Device connection timeout: If the sample app cannot find the glasses, ensure the glasses are not connected to another device (they can only pair with one phone at a time). Also keep the Meta AI app open in the background. If it still doesn't connect, try restarting Bluetooth on the phone and power-cycling the glasses, then reconnect via the Meta AI app before retrying your app.
6) Integration Guide — Add Meta's Wearables Toolkit to an Existing Mobile App
Goal
Now that you've seen the sample in action, it's time to integrate the Smart Glasses SDK into your own app. We'll set up the SDK in an existing project and implement one end-to-end feature (photo capture from glasses) as a proof of concept.
Architecture
Your app will incorporate a new layer for the wearable integration:
- Mobile App UI – existing interface with which the user interacts (e.g. a "Capture" button).
- Wearables SDK Client – a new component in your app that manages the connection to the glasses and handles data (frames, audio) via the Meta toolkit.
- Wearable device (glasses) – Ray-Ban/Oakley glasses capturing camera frames and audio, communicating with the phone (via the Meta AI companion app bridge).
- Callbacks/Events to App – the SDK will provide callbacks (e.g. "photo captured" event with image data) which your app uses to update UI or store data.
- In practice, your app will call into the Wearables SDK to request actions (like take a photo), and the SDK will asynchronously deliver results (image or error) back to your app.
Step 1 — Install SDK
iOS
- Open your Xcode project. Use Swift Package Manager to add the dependency:
- File > Add Packages… > enter https://github.com/facebook/meta-wearables-dat-ios.
- Select the latest version (ensure it matches the preview version you have access to, e.g. 0.3.0).
- Add the package to your app target.
- Alternatively, add the package reference in your Package.swift, or integrate manually by adding the binaries from the GitHub release (CocoaPods is not officially provided).
- After adding, import the relevant modules in code (e.g. import MWDATCamera or similar, depending on how the SDK is modularized).
Android
- In your app's settings.gradle or build.gradle, add the GitHub Maven repo and authentication as described in Quickstart B.
- Add the dependencies in your module's build.gradle (or build.gradle.kts):
implementation("com.meta.wearable:mwdat-core:0.3.0")
implementation("com.meta.wearable:mwdat-camera:0.3.0")
implementation("com.meta.wearable:mwdat-mockdevice:0.3.0")
(Check GitHub Packages for the latest version.)
- Sync your project to fetch the SDK. You may need to add your GitHub token to the global Gradle properties for continuous integration or other dev machines.
Step 2 — Add permissions
Make sure your app's manifest (Android) and plist (iOS) declare necessary permissions, as earlier:
iOS (Info.plist):
- NSBluetoothAlwaysUsageDescription – explain why the app needs Bluetooth (e.g. "Bluetooth is used to connect to your AR glasses for camera streaming").
- NSMicrophoneUsageDescription – if you plan to use audio.
- NSCameraUsageDescription – mention using the glasses camera.
- (No specific entitlement is known to be needed, since the Meta AI app handles the Bluetooth session. Standard Bluetooth usage covers it.)
Android (AndroidManifest.xml):
- <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (for controlling the connection).
- <uses-permission android:name="android.permission.BLUETOOTH_SCAN" android:usesPermissionFlags="neverForLocation" /> (for discovering the device without implying location use).
- <uses-permission android:name="android.permission.RECORD_AUDIO" /> (if audio from the glasses will be used).
- If targeting Android 13, also consider <uses-permission android:name="android.permission.NEARBY_WIFI_DEVICES" /> if needed (though BLE primarily uses the permissions above).
- You do not need to add special Meta-specific permissions. The SDK works internally with the Meta AI companion app, which has its own permissions.
Both Platforms:
- At runtime, be prepared to request Bluetooth and Microphone permissions from the user. On Android, call requestPermissions for BLUETOOTH_CONNECT (and BLUETOOTH_SCAN if needed for initial discovery) and RECORD_AUDIO if using audio. On iOS, the system will prompt for Bluetooth when the SDK first tries to access the glasses. A minimal iOS-side sketch follows.
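To make this concrete on iOS, here is a minimal sketch of an app-side permissions helper. It only covers Bluetooth and microphone; the PermissionsService name is the illustrative one used later in this guide and is not part of Meta's SDK:

import AVFoundation
import CoreBluetooth

// Illustrative app-side helper (not an SDK type). On iOS the Bluetooth permission
// prompt appears the first time a CBCentralManager is created; microphone access
// must be requested explicitly.
final class PermissionsService: NSObject, CBCentralManagerDelegate {
    static let shared = PermissionsService()
    private var central: CBCentralManager?

    // Triggers the system Bluetooth permission prompt if it hasn't been shown yet.
    func primeBluetoothPermission() {
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // .unauthorized means the user denied Bluetooth for this app – point them to Settings.
        print("Bluetooth state: \(central.state.rawValue)")
    }

    // Requests microphone access (needed if you record audio from the glasses).
    func requestMicrophone(_ completion: @escaping (Bool) -> Void) {
        AVAudioSession.sharedInstance().requestRecordPermission { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    }
}

On Android, the equivalent is a plain ActivityCompat.requestPermissions call for the Bluetooth and audio permissions listed above.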
Step 3 — Create a thin client wrapper
To keep your code organized, create a small set of classes/services to abstract the wearable integration:
- WearablesClient (e.g. GlassesClient): handles connecting to and disconnecting from the glasses. It uses the SDK to scan for devices, initiate the connection, and maintain state (connected/disconnected). It should expose state to your UI (e.g. an isConnected property and an event for connection changes).
- FeatureService (e.g. GlassesCameraManager): handles specific actions like taking a photo or starting a video stream. It wraps the SDK calls (for example, a takePhoto() function that invokes the toolkit's capture method and returns the image or error via a callback/promise).
- PermissionsService: handles checking and requesting permissions (Bluetooth, mic, etc.) before performing actions. This ensures that when the user taps a feature, your app has the necessary OS permissions; if not, it prompts for them.
Implement minimal versions of these:
- In WearablesClient: initialize the SDK (if needed) on app launch. For example, the toolkit might require a context or initialization call – set that up in your AppDelegate (iOS) or Application class (Android). Have functions like connect() and disconnect() which internally call the SDK's methods to start connecting to the first available glasses.
- In FeatureService: have a method like capturePhoto() that first ensures the glasses are connected (perhaps via WearablesClient), then calls the SDK's camera API. The SDK likely provides an async callback when the photo is ready. Tie that into your app (maybe via a delegate or LiveData/Flow to update the UI). A skeleton of both classes is sketched below.
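A skeleton of these wrappers in Swift might look like this – the class names follow this guide's examples, and the actual toolkit calls are left as placeholders since the preview API surface may change between versions:

import UIKit
import Combine

// App-side wrapper sketch; the TODO lines mark where the real toolkit calls go.
final class GlassesClient: ObservableObject {
    static let shared = GlassesClient()
    @Published private(set) var isConnected = false

    func connect() {
        // TODO: call the toolkit's discovery/connect API here.
        // When the SDK reports success, update state on the main thread:
        // DispatchQueue.main.async { self.isConnected = true }
    }

    func disconnect() {
        // TODO: call the toolkit's disconnect API here.
        isConnected = false
    }
}

enum GlassesError: Error { case notConnected }

final class GlassesCameraManager {
    static let shared = GlassesCameraManager()

    // Captures a photo from the glasses and returns it (or an error) asynchronously.
    func capturePhoto(completion: @escaping (Result<UIImage, Error>) -> Void) {
        guard GlassesClient.shared.isConnected else {
            completion(.failure(GlassesError.notConnected))
            return
        }
        // TODO: invoke the toolkit's photo-capture API and map its result into `completion`.
    }
}

The same shape works in Kotlin on Android (a StateFlow for the connection state and a suspend function or callback for capture).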
Definition of done:
- The Wearables SDK is initialized when your app launches or when the user opts in (depending on your design). No crashes or errors on startup related to the SDK.
- Your app can connect/reconnect to the glasses gracefully. E.g., if the connection drops (glasses turned off), your WearablesClient notices and updates the UI/status, and can attempt reconnection if the user chooses (a simple retry sketch follows this list).
- Errors are handled: If a user tries to take a photo while not connected, your app should show a friendly error ("Glasses not connected. Please connect your device."). Any SDK errors or exceptions should be caught and either logged or surfaced to the user in understandable terms.
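As an illustration of that reconnect behavior, here is a small app-side retry sketch. It is purely illustrative, builds on the GlassesClient wrapper above, and never touches the Meta SDK directly:

import Foundation

// Retries the connection a few times with growing delays before giving up.
final class ReconnectionManager {
    private var attempt = 0
    private let maxAttempts = 3

    func handleDisconnect() {
        attempt = 0
        scheduleRetry()
    }

    private func scheduleRetry() {
        guard attempt < maxAttempts else {
            print("Giving up after \(maxAttempts) attempts; ask the user to reconnect manually.")
            return
        }
        attempt += 1
        let delay = pow(2.0, Double(attempt)) // 2 s, 4 s, 8 s backoff
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
            guard let self = self else { return }
            GlassesClient.shared.connect()
            // Check again after a grace period; if still disconnected, retry.
            DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
                if !GlassesClient.shared.isConnected { self.scheduleRetry() }
            }
        }
    }
}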
Step 4 — Add a minimal UI screen
Design a simple interface in your app for the wearable feature. For example, create a "Glasses" screen or section:
UI elements to include:
- "Connect Glasses" button: when tapped, triggers the WearablesClient to scan and connect to the glasses. It could also toggle to "Disconnect" when already connected.
- Connection status indicator: a small icon or text that shows "Connected" (maybe in green) or "Not connected" (red/gray). This should update live based on WearablesClient's status.
- "Capture Photo" button: when tapped, invokes the FeatureService to capture a photo via the glasses camera.
- Image thumbnail display: a UIImageView (iOS) or ImageView (Android) to show the last captured photo from the glasses. You might also include a label or toast that says "Photo saved!" upon success.
- (Optional) A log or text area for debug info – during development it helps to show messages like "Connecting to glasses…", "Capture in progress…", etc., especially since glasses interactions have some latency (a few seconds).
With this simple UI, you can now integrate the logic: enable the Capture button only when connected, etc. This sets the stage for actually implementing the feature in the next section.
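If you're building the screen in SwiftUI, a minimal version could look like the sketch below. It reuses the illustrative GlassesClient and GlassesCameraManager wrappers from Step 3; none of these are Meta SDK types:

import SwiftUI
import UIKit

struct GlassesScreen: View {
    @ObservedObject var client = GlassesClient.shared
    @State private var lastPhoto: UIImage?
    @State private var status = "Not connected"

    var body: some View {
        VStack(spacing: 16) {
            // Connection status indicator
            Text(client.isConnected ? "Connected" : "Not connected")
                .foregroundColor(client.isConnected ? .green : .secondary)

            // Connect / Disconnect toggle
            Button(client.isConnected ? "Disconnect" : "Connect Glasses") {
                if client.isConnected { client.disconnect() } else { client.connect() }
            }

            // Capture button, enabled only while connected
            Button("Capture Photo") {
                status = "Capturing…"
                GlassesCameraManager.shared.capturePhoto { result in
                    DispatchQueue.main.async {
                        switch result {
                        case .success(let image):
                            lastPhoto = image
                            status = "Photo saved!"
                        case .failure(let error):
                            status = "Capture failed: \(error.localizedDescription)"
                        }
                    }
                }
            }
            .disabled(!client.isConnected)

            // Thumbnail of the last captured photo
            if let photo = lastPhoto {
                Image(uiImage: photo)
                    .resizable()
                    .scaledToFit()
                    .frame(height: 200)
            }

            // Debug / status message area
            Text(status).font(.footnote)
        }
        .padding()
    }
}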
7) Feature Recipe — Trigger Photo Capture from Wearable into Your App
Goal
When the user taps "Capture" in your app, the glasses should snap a photo and send it to your app, which then displays the image (and possibly saves it). This is a full round-trip of a key MVP feature for many glasses apps.
UX flow
- Ensure connected: Your UI should indicate if glasses are connected. If not, the first step is to connect (or prompt the user to do so) before allowing capture.
- User taps Capture: The user initiates the action via the app UI.
- Show progress state: Immediately give feedback (e.g. disable the capture button and show a "Capturing…" message or spinner). The wearer will also see the glasses' capture LED come on during this process.
- Receive result: The SDK returns the photo (image bytes or file URL) once the glasses have taken it and sent it over Bluetooth.
- Display & save: The app shows a thumbnail of the image in the UI and saves it to the camera roll or app storage, then notifies the user (e.g. a brief "Saved ✅" message).
Implementation checklist
- Connected state verified: In your onCaptureTapped() handler, first check if (!wearablesClient.isConnected) { alert("Please connect your glasses first."); return; }. This prevents a bad call.
- Permissions verified: Also check that necessary permissions are granted (your PermissionsService can help here). If not, request them and only proceed if granted.
- Capture request issued: Call the Device Access Toolkit's capture function. For example, it might be something like wearablesClient.camera?.takePhoto(completion). This will likely be asynchronous.
- Timeout & retry handled: If the photo doesn't come back within a reasonable time (say a few seconds), implement a timeout. The Bluetooth transfer of image data could fail or be slow if the connection is weak. After, say, 10 seconds with no response, you might cancel the request and show an error with an option to retry (a timeout helper sketch appears after the pseudocode below).
- Result persisted & UI updated: When a photo is successfully received, save it (e.g., to the Photos library or a designated folder). Then update the UI: display the new image in your thumbnail view and perhaps flash a "Saved" confirmation. Clear the "Capturing…" state and re-enable the Capture button.
Pseudocode
func onCaptureButtonPressed() {
guard GlassesClient.shared.isConnected else {
showAlert("Please connect your glasses first.")
return
}
guard PermissionsService.shared.allPermissionsGranted() else {
PermissionsService.shared.requestPermissions { granted in
if granted {
self.onCaptureButtonPressed()
}
}
return
}
self.updateUI(state: .capturing)
GlassesCameraManager.shared.capturePhoto { result in
self.updateUI(state: .idle)
switch result {
case .success(let photo):
self.photoImageView.image = photo
PhotoLibrary.save(photo)
self.showToast("Photo captured and saved ✅")
case .failure(let error):
self.showAlert("Capture failed: \(error.localizedDescription)")
log("Glasses capture error: \(error)")
}
}
}
This pseudocode illustrates the flow:
- Ensure connection and permissions.
- Update UI to a "capturing" state.
- Invoke the capture.
- Handle the callback with success (display image, save file) or failure (show an error).
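One thing the pseudocode glosses over is the timeout from the checklist. A generic helper like the one below – plain Swift concurrency, no SDK types – can wrap any async capture call and fail it after a deadline; the capturePhotoAsync call in the usage comment is hypothetical:

import Foundation

enum CaptureTimeoutError: Error { case timedOut }

// Races an async operation against a deadline and throws if the deadline wins.
func withTimeout<T: Sendable>(seconds: TimeInterval,
                              operation: @escaping @Sendable () async throws -> T) async throws -> T {
    try await withThrowingTaskGroup(of: T.self) { group in
        group.addTask { try await operation() }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(seconds * 1_000_000_000))
            throw CaptureTimeoutError.timedOut
        }
        // Whichever child finishes first wins; cancel the other.
        let result = try await group.next()!
        group.cancelAll()
        return result
    }
}

// Usage sketch: fail the capture if nothing comes back within 10 seconds.
// let photo = try await withTimeout(seconds: 10) {
//     try await GlassesCameraManager.shared.capturePhotoAsync() // hypothetical async variant
// }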
Troubleshooting
- Capture returns empty: If you get a callback but the image is null/empty, check the logs. A common cause is lack of permission – e.g. the glasses' camera usage wasn't authorized by the user (the Meta app might block it). Ensure the glasses are actively connected (after a long idle period you may need to reconnect). Also verify your glasses actually expose a camera to the SDK (the currently supported models do, but a future device might not).
- Capture hangs indefinitely: This could be due to a lost connection mid-capture. Implement a timeout on your end. If a capture is in progress for >~10 seconds with no response, cancel and inform the user. Advise them to check the glasses (battery, connection) and try again. In some cases, toggling Bluetooth off/on on the phone and reconnecting can fix a stuck state.
- "Instant display" expectation: Users might expect a nearly instant photo like a smartphone. However, with Bluetooth transfer, a full-resolution image can take a couple of seconds to come through. Manage this by using a placeholder UI – e.g., show a loading thumbnail or animation while waiting for the image, so the app feels responsive. You could even show a low-res preview if the SDK provides one, then swap in the high-res image when done.
8) Testing Matrix
Test your integrated feature in various scenarios to ensure robustness:
| Scenario | Expected Result | Notes |
|---|---|---|
| Mock device (no hardware) | Feature works with simulated data (e.g., returns a sample image) | Use this in CI or when no glasses are on hand. Make sure your app can gracefully handle using the mock device by perhaps a debug setting. |
| Real device – close range | Low latency capture & stream | Test with phone and glasses in the same room with good Bluetooth signal. This is the baseline happy path – captures should succeed consistently within ~1-3 seconds. |
| Background / lock screen | Defined behavior (likely capture is blocked or delayed) | If the user tries to capture while the app is backgrounded or phone is locked, it may fail or queue until app is foreground. Document that the app must be active. Also note that if the phone goes to lock during a stream, the stream might cut off. |
| Permission denied | Clear error message to user, prompt to enable | E.g., if Bluetooth permission was denied and the user taps Connect or Capture, the app should detect this and show "Bluetooth permission is required. Please enable it in settings." No silent failure. |
| Disconnect mid-action | Graceful handling, no crashes | If the glasses turn off or disconnect during a capture or stream, the app should handle the error (SDK likely throws an exception or returns failure). The UI should reset to a safe state (stop any spinners) and perhaps auto-attempt reconnection or prompt the user. No crashes or stuck UI. |
Make sure to run through these and log any unexpected behavior. It's easier to fix edge-case issues now than when a user finds them later.
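To make the mock-device row easy to exercise, you can gate the device source behind a debug setting so testers and CI can switch without code changes. A sketch (app-side only; how the mock is actually instantiated depends on the mwdat-mockdevice module):

import Foundation

// Which device source the app should use. App-side concept only, not an SDK type.
enum DeviceSource: String {
    case real
    case mock
}

struct DeviceSourceSetting {
    // Debug builds read the source from UserDefaults (settable from a debug screen
    // or a UI-test launch argument); release builds always use the real glasses.
    static var current: DeviceSource {
        #if DEBUG
        if let raw = UserDefaults.standard.string(forKey: "glasses_device_source"),
           let source = DeviceSource(rawValue: raw) {
            return source
        }
        #endif
        return .real
    }
}

// Usage sketch inside GlassesClient.connect():
// switch DeviceSourceSetting.current {
// case .real: break // connect to the paired glasses via the toolkit
// case .mock: break // initialize the simulated device from mwdat-mockdevice
// }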
9) Observability and Logging
Add logging and metrics to make debugging easier and to collect usage data (if permitted):
- Connection events: Log when you start connecting (connect_start), when the glasses successfully connect (connect_success), and if a connection attempt fails or drops (connect_fail with error info). This helps track reliability.
- Permission state: Log whether required permissions are granted or not at key points (permission_ok or permission_missing) to catch if a feature was invoked without proper permissions.
- Feature usage: For each major action like photo or stream, log photo_start when the user taps Capture, and photo_success or photo_fail when done (with error details). Similarly stream_start / stream_stop.
- Latency metrics: Measure the time between request and response for captures or stream start. For example, log photo_ms=2345 for a 2.345 s round trip. This can reveal if performance is degrading.
- Reconnection count: If you implement auto-reconnect, count how many reconnections happen in a session and log if the count exceeds a threshold (it might indicate underlying issues with Bluetooth stability).
- Use the above logs for debugging during development (a minimal logging helper sketch follows). In production (when allowed in future), these can feed into analytics to understand how the feature performs in the wild (just ensure you follow any data collection rules in Meta's policy and user privacy best practices).
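A tiny helper keeps these event names consistent across the app. The sketch below uses Apple's os.Logger with the event names from this list; the GlassesLog type is illustrative, not part of the SDK:

import Foundation
import os

struct GlassesLog {
    private static let logger = Logger(subsystem: "com.example.glassesmvp", category: "wearables")

    // Logs a named event with optional key=value fields, e.g. photo_ms=2345.
    static func event(_ name: String, _ fields: [String: String] = [:]) {
        let suffix = fields.map { "\($0.key)=\($0.value)" }.joined(separator: " ")
        logger.info("\(name, privacy: .public) \(suffix, privacy: .public)")
    }
}

// Usage:
// GlassesLog.event("connect_start")
// GlassesLog.event("photo_success", ["photo_ms": "2345"])
// GlassesLog.event("connect_fail", ["error": "bluetooth_off"])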
10) FAQ
Q: Do I need the actual glasses hardware to start developing?
A: Not initially – Meta's toolkit includes a mock device mode that lets you simulate an AI glasses device on your development machine. You can build out most of your integration and even run the sample app using this. However, to fully test real-world behavior (especially Bluetooth performance and user experience), having at least one pair of supported glasses is recommended. The mock kit is great for CI and early development, but nothing beats testing on real hardware before launch.
Q: Which smart glasses are supported by this toolkit?
A: As of the preview, the SDK supports Ray-Ban Meta AI Glasses (2nd Gen) and Oakley Meta HSTN smart glasses – the current camera- and audio-equipped models. Support for Oakley Meta Vanguard and Ray-Ban Meta Display (the one with the heads-up display) is in the works, but note that display features won't be accessible yet (you'll only get camera access from them). Always check the latest Meta documentation for newly added devices as the ecosystem grows.
Q: Can I ship an app using this to the App Store/Play Store now?
A: Not to general users. The Wearables Device Access Toolkit is in developer preview – it's intended for prototyping and testing only, so you can build and test but not release to the general public. You can distribute test builds (TestFlight, internal testing on Play Console) within your organization or to approved testers. Meta plans to allow public publishing once the toolkit is out of preview (targeted sometime in 2026). Keep an eye on Meta's announcements for when the SDK exits preview before planning a public launch.
Q: Do users need to have the Meta AI app or a Facebook account for this to work?
A: Yes, users will need the Meta AI companion app (and a Meta account) set up with their glasses. The Meta AI app acts as the bridge to the glasses. Your app communicates through it – there isn't a way to talk to the glasses directly without Meta's app in the middle (this ensures things like permissions and device management are handled by Meta's ecosystem). So in your onboarding, you should mention that the user must have their glasses configured with the Meta AI app beforehand.
Q: Can I push content or notifications to the glasses?
A: Not with this toolkit in its current form. The Device Access Toolkit is primarily one-way (glasses -> app) for camera and microphone data. You cannot overlay graphics on the glasses' display (for those that have one) via this SDK, and you cannot directly send audio to the glasses except as part of normal phone audio (e.g. if your app plays a sound, it will come through the glasses' speakers as they act like Bluetooth headphones). Features like AR visuals on the glasses or sending text to a heads-up display are likely for future SDK updates, but not available now.
Q: Is voice control (Hey Meta…) accessible through this SDK?
A: Not yet. The built-in Meta voice assistant and its AI capabilities (like live transcription or Q&A) are outside the scope of the current preview, as general voice controls are being explored for future updates. The SDK doesn't provide hooks into the glasses' voice commands or allow you to intercept those. You can, of course, capture audio from the mics and run it through your own speech recognition if you want, but integration with the on-glasses "Hey Meta" assistant isn't part of the toolkit at this time.
11) Changelog
- 2026-01-05 – Verified with Wearables Device Access Toolkit v0.3.0 (Developer Preview) on iOS 17 (Xcode 15) and Android 14 (API 34). Tested with Ray-Ban Meta AI Glasses (2nd Gen) and SDK mock device. Updated steps for latest Gradle and Xcode integration.