Meta Wearables Device Access Toolkit: Quickstart Guide for AI Glasses Integration (2025)

By Almaz Khalilov

Want to extend your mobile app into a glasses form factor without guessing the connection steps, permission prompts, or event handling lifecycle? This guide gives you the fastest path from obtaining the SDK to a working demo—plus a production-ready checklist for building with Meta’s AI glasses.


What This Guide Covers

  • How to get access to the toolkit (Preview program and requirements)
  • How to install the SDK and run the sample apps (iOS and Android)
  • The core building blocks (connect, permissions, events, actions)
  • A reusable "first feature" pattern you can adapt to any use case
  • Testing, privacy, and rollout basics (with Australian context)

Before You Start

Access & Accounts

Preview Access: The toolkit is in developer preview, meaning you can build and test within your team, but only select partners can publish integrations to the public until general availability in 2026. The Meta Wearables Developer Center lets you manage organizations, tester groups, and releases during this phase.

Devices & Environment

| Requirement | iOS | Android |
| --- | --- | --- |
| Phone | iPhone (iOS 15.2 or later), e.g. iPhone 12+ with iOS 17 | Modern Android (Android 10 or later), e.g. Pixel 5+ with Android 13 |
| Dev tools | Xcode 15+ (Swift Package Manager ready) | Android Studio Giraffe+ (Gradle 8+) |
| Toolkit SDK | Wearables DAT Preview SDK (Swift) | Wearables DAT Preview SDK (Gradle/Maven) |
| Target device | Ray-Ban Meta Smart Glasses (2nd Gen) or Oakley Meta HSTN (supported AI glasses with camera) | Ray-Ban Meta Smart Glasses (2nd Gen) or Oakley Meta HSTN (supported AI glasses with camera) |

Note: You'll need a pair of supported AI glasses (e.g. Ray-Ban Meta) to fully test camera features. If you don't have hardware, use the Mock Device Kit to simulate a glasses device in software. Also, make sure the glasses are set up and paired via the Meta AI mobile app before attempting to connect from your own app.


Quickstart: Install + Run the Samples

Step 1) Download SDK + Samples

  • Developer Center: Grab the Meta Wearables Device Access Toolkit from the Wearables Developer Center (available to preview participants). This will provide the SDK and sample projects.
  • Documentation: Refer to the official Meta Wearables documentation for setup instructions and API references.
  • Sample apps: Use the provided sample apps (for iOS and Android) as your starting point. They demonstrate connecting to glasses, streaming camera output, and basic interactions. You can find the sample app code in the GitHub repositories (in the samples/CameraAccess folder for each platform).

Step 2) iOS: Build & Run

First, add the Wearables SDK to your iOS app. The easiest method is via Swift Package Manager (SPM):

  1. In Xcode, go to File ▸ Add Packages... and search for the repository URL https://github.com/facebook/meta-wearables-dat-ios.
  2. Select Meta Wearables DAT iOS and choose the latest available version tag.
  3. Add the package to your app target. Xcode will fetch the SDK package and integrate it into your project.

Now open the sample app or your project with the SDK:

# Example: open the Xcode workspace of the sample app
open MetaWearablesSample.xcworkspace
# In Xcode, select your development team for code signing if needed,
# then build and run on a physical iPhone connected via USB.

When you run the app on your device, the first launch should prompt for the necessary permissions. Make sure you've added usage descriptions to your app's Info.plist for Camera, Microphone, and Bluetooth. iOS requires you to declare why your app needs these capabilities; if a usage description is missing, the system will block access (and can terminate the app) the moment the feature is used. For example:

  • NSCameraUsageDescription – e.g. "Allow camera access to enable glasses video streaming."
  • NSMicrophoneUsageDescription – e.g. "Allow microphone access for glasses audio input."
  • NSBluetoothAlwaysUsageDescription – e.g. "Allow Bluetooth to connect to AI glasses."

On first run, grant the camera and mic permissions when prompted by iOS. Also, pair the glasses via the Meta AI app beforehand, since users must grant each app permission to access their glasses. You should then see the sample app's connect screen.

Checklist

  • Project builds successfully in Xcode (no missing packages or code errors).
  • App launches on the iPhone without crashing.
  • All required permissions are present in Info.plist and are requested at runtime (you should see iOS permission pop-ups for camera, microphone, etc.).
  • You can navigate to the sample's "Connect to Glasses" screen in the app.

Step 3) Android: Build & Run

On Android, integrate the Wearables SDK via Gradle/Maven. Meta distributes the Android SDK as a Maven package on GitHub. You'll need a GitHub personal access token (with read:packages scope) to fetch it.

  1. Add Meta's GitHub Maven repository in your settings.gradle or build.gradle:
// settings.gradle.kts (Groovy similar)
dependencyResolutionManagement {
    repositories {
        // ... other repos ...
        maven {
            url = uri("<https://maven.pkg.github.com/facebook/meta-wearables-dat-android>")
            credentials {
                username = "" // not used
                password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
            }
        }
    }
}

  2. Add the Wearables Toolkit dependencies in your app's build.gradle. For example, in Gradle Kotlin DSL with a version catalog:
// In libs.versions.toml (version catalog)
[versions]
mwdat = "0.3.0"                    # Use latest preview version available

[libraries]
mwdat-core = { group = "com.meta.wearable", name = "mwdat-core", version.ref = "mwdat" }
mwdat-camera = { group = "com.meta.wearable", name = "mwdat-camera", version.ref = "mwdat" }
mwdat-mockdevice = { group = "com.meta.wearable", name = "mwdat-mockdevice", version.ref = "mwdat" }

Then in your module build.gradle.kts:

dependencies {
    implementation(libs.mwdat.core)
    implementation(libs.mwdat.camera)
    implementation(libs.mwdat.mockdevice)
}

(If you're not using version catalogs, simply add the coordinates com.meta.wearable:mwdat-core:0.3.0 (and the camera and mockdevice libraries) in your dependencies.)
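Without a version catalog, the equivalent direct declarations look like this (0.3.0 mirrors the example version above; use the latest preview release):

dependencies {
    // Same artifacts as the catalog entries above, declared directly
    implementation("com.meta.wearable:mwdat-core:0.3.0")
    implementation("com.meta.wearable:mwdat-camera:0.3.0")
    implementation("com.meta.wearable:mwdat-mockdevice:0.3.0")
}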

  3. Sync your Gradle project to download the SDK. Open the sample app project (if provided) or your project with the new dependencies. Ensure you have a physical Android device connected (with developer mode and Bluetooth enabled), then run the app from Android Studio.

Before running, double-check Android permissions: declare the permissions you'll use in AndroidManifest.xml. This typically includes:

  • <uses-permission android:name="android.permission.CAMERA" />
  • <uses-permission android:name="android.permission.RECORD_AUDIO" />
  • Bluetooth permissions (for Android 12+, e.g. BLUETOOTH_CONNECT, BLUETOOTH_SCAN).
  • (If targeting older Android, you may also need ACCESS_FINE_LOCATION for Bluetooth device discovery.)

At runtime, request any dangerous permissions (camera, audio, location if needed for BLE) via ActivityCompat.requestPermissions. The sample app likely handles this for you. Grant the permissions and proceed to the glasses connection screen.
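A minimal sketch of that runtime check and request (the permission set here assumes a camera + microphone feature on Android 12+; trim it to what your feature actually needs):

import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val REQUEST_GLASSES_PERMISSIONS = 100

// Permissions assumed for a camera + microphone glasses feature on Android 12+.
private val glassesPermissions = arrayOf(
    Manifest.permission.CAMERA,
    Manifest.permission.RECORD_AUDIO,
    Manifest.permission.BLUETOOTH_CONNECT,
    Manifest.permission.BLUETOOTH_SCAN
)

// Request anything not yet granted; handle the result in onRequestPermissionsResult
// (or the ActivityResult API) before moving on to the glasses connection screen.
fun requestGlassesPermissionsIfNeeded(activity: Activity) {
    val missing = glassesPermissions.filter {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isNotEmpty()) {
        ActivityCompat.requestPermissions(activity, missing.toTypedArray(), REQUEST_GLASSES_PERMISSIONS)
    }
}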

Checklist

  • Project builds and Gradle dependencies resolve (you may need to add a github_token in local.properties as described in the README).
  • App launches on the Android device without immediate issues.
  • All required permissions are declared in the manifest and requested on first use (you should see runtime permission dialogs for camera, microphone, etc.).
  • You can reach the "Connect to Glasses" screen in the app.

Step 4) Connectivity Smoke Test

Now for the moment of truth: connecting to the glasses. Using either the iOS or Android sample app (running on a device), go through the flow to connect to your Meta glasses. This usually involves selecting the glasses from a device list or simply tapping a "Connect" button if the device is already paired.

Goal: Verify that your app can successfully connect to the glasses and receive at least one sensor event (e.g. a camera frame or a status update).

  • Expected result: The app should indicate a Connected status (e.g. a UI label or toast). For a camera test, you might see a live preview from the glasses' camera in the app. If testing microphone, you might see an audio level or hear audio through the glasses. In any case, some data should flow from the glasses to the app (even if just a log message confirming connection).
  • If it fails: Don't panic. Check the following:
    • Pairing status: Ensure the glasses are powered on and paired via the Meta AI app on your phone. The toolkit leverages the system pairing – if the device isn't paired to the phone, the SDK can't connect.
    • Permissions: Confirm you granted all prompts (camera, mic, Bluetooth). If you denied any, you may need to enable them in settings or reinstall the app.
    • OS settings: On Android, verify Bluetooth and Location (for BLE) are enabled. On iOS, ensure Bluetooth is on and the app has permission to use it. Also, the glasses might need to be connected as an audio device (for microphone streaming via Bluetooth HFP) – check your phone's Bluetooth settings to see if the glasses are connected for calls/media.
    • SDK version: Use the latest preview SDK. If you pulled an older version, consider updating to pick up any bug fixes (check the changelog in the repository).
    • Logs: Look at Xcode/Logcat logs for hints (e.g. authentication errors or missing permission messages).

If all else fails, consult the official docs and community forums with your log details. The preview program is evolving, so you might encounter some hiccups that others have solutions for.


Core Concepts You'll Use in Every Feature

Once the basic connection is working, it's important to understand the core patterns of this SDK. Virtually every glasses-enabled feature you build will involve the following:

1) Connection Lifecycle

Managing the connection to the glasses is fundamental. Your app should handle connect and disconnect events gracefully. For example, when a user taps "Connect", you'll call the SDK to initiate pairing or connection to the glasses. You need to manage what happens if the connection drops (e.g., if the user takes off or turns off the glasses, or goes out of range).

Key considerations:

  • Connect/Disconnect Handling: Use the SDK's connect method to initiate the session, and provide a way for users to disconnect. Monitor callbacks or notifications for connection state changes (connected, disconnected, error).
  • App Background/Foreground: Decide how the connection should behave when your app moves to the background. During early development, it might be simplest to disconnect when backgrounded and reconnect on foreground. In production, you might implement background modes (iOS) or a foreground service (Android) to keep critical sessions alive if needed.
  • Reconnect strategy: Implement a retry or auto-reconnect strategy. For instance, if the connection is lost unexpectedly, attempt to reconnect a few times with delays, and alert the user if it fails. This prevents a "works once then stops" scenario. Having timeouts and giving feedback (like "Reconnecting...") leads to a better user experience.
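A sketch of that retry loop using coroutines (GlassesSession is a placeholder for whatever connection API the toolkit exposes, not its real type name):

import kotlinx.coroutines.delay

// Placeholder for the SDK's connection entry point; swap in the real API from the toolkit docs.
interface GlassesSession {
    suspend fun connect(): Boolean
}

// Try to (re)connect a few times with a growing delay, reporting progress to the UI.
suspend fun connectWithRetry(
    session: GlassesSession,
    maxAttempts: Int = 3,
    onStatus: (String) -> Unit
): Boolean {
    repeat(maxAttempts) { attempt ->
        onStatus(if (attempt == 0) "Connecting…" else "Reconnecting… (attempt ${attempt + 1})")
        if (session.connect()) {
            onStatus("Connected")
            return true
        }
        delay(1_000L * (attempt + 1)) // back off: 1s, 2s, 3s…
    }
    onStatus("Couldn't reach your glasses. Check that they're on and paired, then retry.")
    return false
}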

2) Permissions & Capabilities

Each glasses feature will touch certain device capabilities, so you must handle permissions proactively:

  • Know your required permissions: Determine which sensors you need. Camera and microphone are obvious for streaming features; voice commands need microphone; if you use any location-dependent features (less likely for glasses themselves, but maybe pairing requires BLE scan which needs location on Android), include those too. Always update your app's privacy usage strings to reflect why these permissions are needed, in a user-friendly way.
  • Request at the right time: Don't just dump all permission requests on first launch. Ideally, request when a feature is about to be used (e.g., ask for camera access when the user starts a livestream feature). This context helps users accept them. However, for glasses functionality, you might bundle camera+mic together if both are essential for the very first use.
  • Graceful fallbacks: If a user denies a permission, your app should handle it without crashing. Show an explanation and offer a fallback if possible. For example, if camera access is denied, perhaps the feature is disabled with a message like "Camera permission is required to use glasses streaming. Please enable it in Settings." Similarly, if Bluetooth is off or not allowed, prompt the user to enable it.
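One concrete building block for those fallbacks on Android is the "enable it in Settings" path: if a permission has been permanently denied, deep-link the user to your app's settings screen instead of leaving them stuck:

import android.content.Context
import android.content.Intent
import android.net.Uri
import android.provider.Settings

// Open this app's system settings screen so the user can re-enable a denied permission.
fun openAppSettings(context: Context) {
    val intent = Intent(
        Settings.ACTION_APPLICATION_DETAILS_SETTINGS,
        Uri.fromParts("package", context.packageName, null)
    )
    context.startActivity(intent)
}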

Remember that when it comes to the glasses themselves, the user also has to grant permission to link the glasses to your app. The first time your app tries to use the glasses, the Meta system may prompt the user to confirm they trust your app with the glasses' camera/audio. This is similar to how apps request permission for phone sensors, but it's for the wearable device. Ensure your users understand why they should approve this.

3) Events → Actions Pattern

The core loop of glasses integration is reacting to events from the device and triggering actions in your app. Think of it as an extension of the observer pattern:

  • Events: These are signals from the glasses — e.g. "user pressed the capture button", "glasses started or stopped recording", "a voice command was detected", or sensor data like "new camera frame ready". The Wearables SDK will provide event callbacks or delegates for such happenings. For example, you might have an onCameraFrame(frame) callback or an onGlassesButtonPressed() event. Standard events may include pause/resume if the user taps the glasses to control media, or status updates when the glasses state changes.
  • Actions: These are your app's reactions to those events — "respond by doing X in the app." For instance, on a camera frame event, your action might be to display the frame in the UI or start analyzing it with an AI model. On a glasses button press, your action could be to toggle a feature (e.g., start a specific workflow or take a note). Essentially, you map glasses inputs to app behaviors.

By structuring your code around events and actions, you keep the integration modular. You could have an EventHandler that receives glasses events and then dispatches to various functions in your app (some might update UI, some might call backend APIs, etc.). This pattern makes it easy to add new capabilities: just subscribe to a new event and define an action for it.
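A sketch of such a dispatcher (the event names below are illustrative placeholders, not the SDK's actual callbacks; map them onto whatever events the toolkit exposes):

// Illustrative event model; replace with the real event/callback types from the SDK.
sealed class GlassesEvent {
    object ButtonPressed : GlassesEvent()
    data class CameraFrame(val bytes: ByteArray) : GlassesEvent()
    data class ConnectionChanged(val connected: Boolean) : GlassesEvent()
}

// Central place that maps glasses events to app actions; add new branches as you add capabilities.
class GlassesEventHandler(
    private val onOpenScreen: () -> Unit,
    private val onFrame: (ByteArray) -> Unit,
    private val onStatus: (String) -> Unit
) {
    fun handle(event: GlassesEvent) = when (event) {
        is GlassesEvent.ButtonPressed -> onOpenScreen()
        is GlassesEvent.CameraFrame -> onFrame(event.bytes)
        is GlassesEvent.ConnectionChanged ->
            onStatus(if (event.connected) "Glasses Connected" else "Glasses Disconnected")
    }
}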


Build Your First Glasses-Enabled Workflow

Now that you have connectivity and core concepts down, it's time to actually build something. This section will guide you through creating a simple feature step-by-step. You can generalize this approach to any glasses-powered idea.

Pick a "First Feature" That's Easy to Validate

Not sure what to build first? Aim for a feature with these traits:

  • Instantly observable: You should see it working immediately, so you know your setup is correct. (Visual or audible feedback is great.)
  • Low risk: It shouldn't require a ton of new infrastructure. We want to leverage the glasses with minimal other dependencies to keep it simple.
  • Useful or at least novel: Ideally, it ties to a real user need or a cool demo that you can show off. This helps keep momentum and justify the integration.

Examples of good first features:

  • "Tap on the glasses → open a specific screen in the phone app." Demo: User taps the frame or a button on the glasses, and the phone app responds by opening, say, a navigation screen or a checklist. This tests event capture (tap event) and an app action (UI navigation).
  • "Voice command → start a predefined workflow." Demo: The user says a trigger phrase (the glasses' microphones pick it up), and your app launches an action (maybe starting audio recording or sending a notification). This tests microphone pipeline and voice recognition (you might use a simple keyword spotter or just detect that audio is streaming).
  • "Camera capture → send a placeholder payload to the phone app." Demo: User presses the capture button on glasses. Instead of saving a real photo, your app simply logs "Photo captured!" or sends an alert in the UI. This tests the camera event and basic data flow without needing to process images yet.

These are simplistic, but that's the point. You want to ensure the end-to-end connection (glasses -> SDK -> your code -> app response) works, before you dive into heavy logic or cloud integrations.

Implementation Template (Pseudo-Code)

Here's a high-level pseudocode outline that most features will follow. You can use this as a blueprint in your actual app code (fill in the platform-specific details):

1) Initialize the Wearables SDK in your app (e.g., set up the SDK instance or service).
2) Check and request any needed permissions (camera, microphone, bluetooth, etc.).
3) Connect to the glasses (trigger the pairing/connection flow via the SDK).
4) Subscribe to relevant glasses events (e.g., camera frame events, button press events).
5) When an event occurs: call an appropriate app action (your custom function).
6) Provide user feedback for that action (update UI or play a sound to confirm it happened).
7) Log key lifecycle events and errors (for debugging and support).

Let's map this to a concrete example for clarity:

  • Initialize SDK: e.g., call Wearables.initialize(context) in Android or set up the SDK singleton in iOS. Possibly register your app with the toolkit if required (some SDKs require an app ID or similar – check docs).
  • Permissions: Before connecting, ensure CAMERA and RECORD_AUDIO (and Bluetooth) permissions are granted. Use the OS frameworks to prompt the user if not.
  • Connect: Use the SDK's connect method. This might return immediately and then asynchronously update you when the connection succeeds or fails (perhaps via a delegate or callback). Handle both outcomes (success = proceed, failure = show error/retry).
  • Subscribe to events: After connecting, set up listeners. For example, Glasses.onCameraFrameReceived = { frame in ... } or Glasses.onButtonPressed = { ... }. The SDK documentation will list available events. Subscribe only to what you need for now.
  • On event, perform action: Write the function that gets called. For a button press, maybe call showSpecialScreen() in your app. For a camera frame, maybe just increment a counter or display a dot to show you got something. Keep it simple initially.
  • User feedback: Always let the user know what's happening. If they pressed a button on their glasses, and it triggered an app action, show a toast or popup like "Glasses input received – opening X". If the connection status changes, update a status indicator in the app UI (e.g., "Connected ✅" or "Disconnected ❌"). This is crucial for trust in the integration.
  • Logging: Use print()/Log.d or a logging framework to record major steps (connect success/fail, events received, actions taken). In early development, this is your best insight into what's going on with the glasses link. In the future, you might route these logs to a file or remote analytics for debugging user issues.

By following this pattern, you can iteratively add more events and actions. Get one working, then you can tackle the next.
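Pulled together, a first feature on Android might be wired up roughly like the sketch below (everything prefixed "Fake" is a stand-in for the real toolkit API; the point is the shape of steps 1–7, not the exact calls):

import android.util.Log

// "Fake" types stand in for the real toolkit API; consult the SDK docs for the actual names.
interface FakeGlassesSdk {
    fun connect(onResult: (Boolean) -> Unit)
    fun onButtonPressed(listener: () -> Unit)
}

class FirstFeature(
    private val sdk: FakeGlassesSdk,
    private val showStatus: (String) -> Unit,    // step 6: user feedback, e.g. a status label
    private val showSpecialScreen: () -> Unit    // the app action for this feature
) {
    // Assumes the SDK is already initialized (step 1) and permissions are granted (step 2).
    fun start() {
        showStatus("Connecting…")
        sdk.connect { connected ->                               // step 3: connect
            Log.d("Glasses", "connect result=$connected")        // step 7: log lifecycle events
            showStatus(if (connected) "Connected ✅" else "Disconnected ❌")
            if (connected) {
                sdk.onButtonPressed {                            // step 4: subscribe to one event
                    Log.d("Glasses", "button press received")
                    showSpecialScreen()                          // step 5: event → action
                }
            }
        }
    }
}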

Minimal UX Requirements (Don't Skip These)

Even for a simple prototype, your app should include some basic user experience elements whenever glasses are involved:

  • Clear connection status: Always display whether the app is connected to the glasses or not. This could be a simple green/red dot, or text like "Glasses Connected" / "Glasses Disconnected". If the connection is in progress or trying to reconnect, show a status like "Connecting…" so the user isn't left guessing.
  • Friendly error messages: If something goes wrong (and in early stages, something always goes wrong), show errors in plain language. For example, if Bluetooth is off, a message like "Please enable Bluetooth to connect to your glasses." Or if a permission is missing: "Camera access is required for this feature. Please enable it in Settings." This beats a silent failure or a cryptic error code.
  • Fallback path: Design your feature so that if the glasses are unavailable, the user can still do a core task on the phone. For instance, if your glasses feature was taking a photo, allow a fallback to use the phone camera. If it was a voice command, allow a button tap on the phone as an alternative. This way, your app isn't completely blocked if the wearable isn't there or ready.

By covering these UX bases, you make your demo feel much more polished and user-centric, even if the feature is basic. It's also good practice for real-world deployment.


Testing & Troubleshooting

When building for new hardware like AI glasses, testing is your best friend. You'll want to verify not just the perfect case, but also edge cases and failure modes. Here's a sample test matrix and some common pitfalls to be aware of:

Test Matrix

| Scenario | Expected Behaviour | Notes |
| --- | --- | --- |
| First-time setup | App guides the user through pairing and permissions on first launch. The glasses connect successfully. | e.g. the app shows an onboarding sequence: "We need permission X for Y feature", then "Connect to your glasses" instructions. Use the Mock Device if you want to simulate first run without real hardware. |
| App backgrounded | If the app goes to background during an active session, it either stays connected (if supported) or gracefully disconnects and recovers on return. | iOS might suspend the camera stream when backgrounded unless you enable specific background modes. Android might require a Foreground Service to keep streaming. Document in your user guide what to expect (e.g., "If you leave the app, the glasses will pause streaming"). |
| Permission denied | If any critical permission is denied, the app handles it. | E.g., the user denies camera: the app should show a message and disable the camera feature rather than crashing. Test each permission toggle via Settings to see your app's behaviour. |
| Disconnect mid-flow | If glasses disconnect or turn off while a workflow is in progress, the app stops the workflow safely and notifies the user. | For example, if recording video from the glasses and they power off, the app should stop recording and perhaps auto-save what was captured with a message "Glasses disconnected, recording stopped." The user can then try to reconnect. |

Run through each scenario above and verify your app's response. It's often useful to simulate these events (many SDKs provide callbacks you can trigger in debug, or you can simply turn off the glasses to test disconnects).

Common Gotchas

Be on the lookout for these common issues that have tripped up developers:

  • Forgetting runtime permission requests: It's not enough to declare permissions in the manifest/Info.plist – you must call the APIs to request them at runtime on modern Android and iOS. A frequent mistake is adding the keys and assuming it's done. Use the sample app as a reference for how to prompt users.
  • SPM or Gradle quirks: On iOS, if the Swift Package Manager doesn't resolve or the package doesn't fetch, make sure Xcode is updated and the Git URL is correct. On Android, authentication issues for GitHub packages are common – ensure your GITHUB_TOKEN is set in the environment or Gradle properties before sync.
  • Code signing and provisioning (iOS): Since the glasses SDK is from Meta, there's no special entitlement needed (it's not using MFi ExternalAccessory, just Bluetooth), so your usual App ID should work. However, always check that your bundle ID, team, and provisioning profiles are correctly set in Xcode; otherwise, the app won't run on device.
  • Background battery optimization (Android): If you test long-running sessions, some Android phones may kill the app in the background (especially if not using a foreground service). During development, disable battery optimization for your app or keep it in the foreground to avoid this. For production, if you need background operation (say continuous voice listening), use a Foreground Service notification to stay alive (see the sketch after this list).
  • "One-time demo effect": Many developers get a successful connection once and then it stops. This often traces back to not handling reconnects or not resetting state. For example, if you don't properly close a session on disconnect, the SDK might think it's still active and refuse a new connection. Make sure to follow the SDK's recommended init->connect->disconnect patterns each time. Logging and using the mock device can help isolate if the issue is in your app logic or the SDK.

Remember, this is a preview toolkit – updates will come, and so will improvements. Keep an eye on the SDK's changelog and update your integration accordingly. If something seems off, it might not be you; checking the forums or GitHub issues can reveal if others hit the same snag and found a workaround.


Privacy, Security, and AU Notes

Building applications that tap into cameras and microphones – especially on wearable devices – raises privacy and security considerations. Here are some practical defaults and an Australian context to keep in mind:

Practical Defaults

| Area | Recommended Default | Why |
| --- | --- | --- |
| Data minimization | Only collect what you need. | Reduce the amount of personal data (videos, audio) you store. This lowers the compliance burden and risk if data leaks. If your feature doesn't need to save a photo, just process it in memory and discard it. |
| Storage | Avoid saving raw sensor media by default. | Raw video or audio from glasses can be highly sensitive (it's literally what the user sees/hears). Unless necessary, don't auto-save it. If you do, inform the user and encrypt it if possible. |
| Logging | Redact sensitive info in logs. | Your debug logs might accidentally capture personal data (e.g., speech transcripts, image analysis results). Make it a habit to strip or mask user identifiers or content in logs, especially if logs are uploaded for analytics. |

AU context: If your app will be used in Australia or by Australian users, be aware of local privacy laws and security guidelines:

  • Under the Privacy Act 1988 (Australia), any personal information (which could include audio/video of individuals) you collect requires you to handle it in accordance with the Australian Privacy Principles. Make sure your privacy policy clearly discloses any collection of camera or audio data from the glasses. Users should know what's being captured, how it's used, and have consented to it.
  • For apps in regulated environments or enterprises, consider aligning with Australia's Essential Eight security strategies (from the Australian Cyber Security Centre). For example, application hardening and logging practices from the Essential Eight can be mapped to how you secure the data coming from the glasses. While your glasses integration might not touch all eight areas, it's good practice to align on things like access control, patch management (keep that SDK updated!), and incident response (know what you'd do if a user reported a sensitive data leak).

Finally, remember that trust is key for adoption of wearable tech. Be transparent with users about when the glasses are active (LEDs on the Ray-Ban Meta glasses help indicate camera use, but your app can reinforce that with on-screen indicators). Also provide easy ways to pause or stop the glasses features. Privacy by design will not only keep you compliant but also delight your users.


Production Readiness Checklist

So your prototype is working and you're ready to consider a wider release (maybe an internal beta or even on an app store once allowed). Go through this checklist to ensure you haven't missed anything important:

  • Robust connection handling: The app reliably connects to the glasses and stays connected as needed. It handles reconnections without user intervention (where possible) and times out/retries gracefully when glasses are unavailable. No infinite spinners!
  • Permission recovery: Every permission denial has a user-facing recovery. (E.g., an "Enable permission" button that opens settings via URL scheme/intents.) New users are walked through granting permissions with clear explanations.
  • Feature flags for preview features: Since the toolkit is in preview, consider gating glasses-specific features behind a flag or enabling them only for certain users. This way, if something changes in the SDK or not all users have glasses, it won't break the whole app experience.
  • Crash and event logging in place: Integrate a crash reporting SDK (like Sentry, Firebase Crashlytics) to catch any exceptions from the glasses integration. Also, log important events (connect, disconnect, data received) in a way that can be shared in support tickets. Because this is new tech, you'll want insight into what's happening in the wild.
  • Device and version compatibility notes: Document which glasses models and OS versions your app was tested with (e.g., "Tested on Ray-Ban Meta (firmware X), iPhone 14 iOS 17, Pixel 7 Android 14"). This helps QA and also sets user expectations. If, for example, a new glasses firmware update is needed, mention that.
  • Onboarding for non-developers: Your integration should come with an in-app onboarding flow that a normal user (or tester) can follow. Assume they know nothing about the toolkit. Provide a short tutorial or guided setup when they first use the glasses feature. This might include a link to install the Meta AI app if it isn't already installed, instructions to pair the glasses, etc. The goal is to make the experience plug-and-play for beta testers or stakeholders without you having to manually intervene.

By covering these, you'll increase the chances that your glasses-enhanced features will delight users and not result in support headaches.


Next Steps

Congratulations on getting something running with the Meta Wearables Device Access Toolkit! From here, you can start dreaming up more ways to augment your app with hands-free glasses interactions:

  • Choose your next capability (e.g., try adding the photo capture stream if you started with a simple button event, or vice versa). Apply the same events → actions pattern to implement it.
  • Replace placeholders with real logic. If your first feature used a dummy action (like showing an alert), connect it to a real workflow. For example, if a voice command was detected, integrate an API like Meta's Llama or your own backend to do speech-to-intent and fulfill that command. If a camera frame is coming in, perhaps run it through an on-device ML model for object recognition. Little by little, make the feature more functional.
  • Ship an internal beta: Once you have a couple of features working, release the app to a small group (via TestFlight, internal app sharing, etc.). Collect feedback on the glasses experience – was it smooth? What confused users? This feedback is gold for refining both your app and providing input to Meta's toolkit team.

And remember – this is new territory for everyone. If you need help accelerating the integration or ensuring it's production-grade, Cybergarden is here to help you go from prototype to production. We can assist with native module wrappers, event architecture design, testing automation, and privacy-by-design reviews for your wearable integrations.

Keep innovating, and enjoy building the future of hands-free computing!


FAQs

Do I need preview access to use the toolkit?

Yes. You must be accepted into the developer preview to fully use the Wearables Device Access Toolkit. Start by signing up through Meta's official form and developer portal. Once you have access, you'll be able to download the SDK and use the Wearables Developer Center tools. Without preview access, you won't be able to register apps or get the necessary credentials to connect to the glasses (although the SDK packages are publicly visible, the programmatic access and documentation require login).

Can I build with React Native or Flutter?

Absolutely. The SDKs are native (Swift for iOS, Kotlin/Java for Android), but you can create a thin native module to bridge into React Native, Flutter, or other cross-platform frameworks. The key is to keep your business logic in the cross-platform layer and handle the device connection in the native layer. For example, in React Native you might write a Native Module that wraps the essential SDK calls (connect, subscribe to events, etc.), then emit events to JavaScript. Several community developers are exploring this route, so keep an eye on forums for any open-source bridging libraries. In the meantime, plan a bit of extra time for writing and testing the bridge code.
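On the Android side, the skeleton of such a React Native bridge looks like this (GlassesBridge and the connect body are placeholders; the real work is calling the Wearables SDK where the comments indicate):

import com.facebook.react.bridge.Promise
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
import com.facebook.react.bridge.WritableMap
import com.facebook.react.modules.core.DeviceEventManagerModule

class GlassesBridgeModule(reactContext: ReactApplicationContext) :
    ReactContextBaseJavaModule(reactContext) {

    override fun getName() = "GlassesBridge"

    @ReactMethod
    fun connect(promise: Promise) {
        // Call the Wearables SDK's connect API here (placeholder), then resolve or
        // reject the promise based on the connection result.
        promise.resolve(true)
    }

    // Forward a native glasses event (button press, status change, etc.) to JavaScript listeners.
    private fun emitEvent(name: String, payload: WritableMap?) {
        reactApplicationContext
            .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter::class.java)
            .emit(name, payload)
    }
}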

What's the fastest path to a working demo?

Use the official sample app from Meta's GitHub repository as your baseline. Get it running and confirm you can connect to the glasses and see one piece of data (like a camera stream or a log message from the glasses). This proves your environment and hardware are set up correctly. Then, in that same project (or your own app), implement one simple event→action loop: for instance, if the glasses have a touchpad tap event, hook that to trigger an alert in the app. Keep the loop minimal — one event, one action, and one piece of feedback UI. This narrow slice is often enough to wow stakeholders. From there, you can incrementally expand (one event at a time) with confidence that the basics work.