Meta Wearables Device Access Toolkit: Quickstart Guide for Troubleshooting 15 Common Errors (2026)

By Almaz Khalilov


Getting started with the Meta Wearables SDK but hitting errors on day one? This guide tackles the 15 most common iOS & Android issues developers face—and shows you how to fix them fast. By the end, you'll have an error-free demo running and a production-ready checklist for your first glasses-enabled app.

Watch the Quickstart VSL

This quick video walkthrough covers the entire setup—from installing the SDK and running the sample apps to connecting real Meta glasses. By the end of the video (and this guide), you'll have your mobile app talking to a pair of AI glasses and know how to avoid common setup pitfalls.

Meta's Ray-Ban AI glasses connect to your mobile app and provide camera and audio capabilities that enable hands-free experiences. The Device Access Toolkit (DAT) lets iOS and Android apps stream video, capture photos, and respond to events from supported glasses.


What This Guide Covers

  • How to get access to the toolkit (Preview)
  • How to install the SDK and run the sample apps (iOS + Android)
  • The core building blocks (connect, permissions, events, actions)
  • A reusable "first feature" pattern you can adapt to any use case
  • Testing, troubleshooting 15 common errors, privacy, and rollout basics

Before You Start

Access & Accounts

  • Meta Developer account: URL_PLACEHOLDER (Sign up with a Meta account for developer access.)
  • Wearables Device Access Toolkit preview access: URL_PLACEHOLDER (Enroll in the preview programme to get the SDK and tools.)
  • Documentation hub: URL_PLACEHOLDER (Detailed API docs and guides on Meta's developer site.)
  • Support / forums: URL_PLACEHOLDER (Community forums or GitHub discussions for help.)

Note: Ensure you have developer preview approval for the Wearables Toolkit. You'll need to set up an organisation and project in the Meta Wearables Developer Centre, which provides an App ID for your application. This App ID must be configured in your app (e.g. in AndroidManifest as shown later) so that the Meta AI companion app recognizes your app during pairing. Also, make sure the Meta AI companion app is installed on your phone and you're logged in – the SDK uses it to pair with the glasses.

Devices & Environment

| Requirement | iOS | Android |
| --- | --- | --- |
| Phone | iPhone (iOS 15.2 or later) | Modern Android device (Android 10+) |
| Dev tools | Xcode 15 (or newer) | Android Studio (2023+ version) |
| Toolkit | Preview SDK (iOS Swift Package) | Preview SDK (Gradle/Maven packages) |
| Target device | Ray-Ban Meta (Gen 1 or Gen 2), Oakley Meta HSTN AI glasses | Ray-Ban Meta (Gen 1 or Gen 2), Oakley Meta HSTN AI glasses |

Heads-up: Make sure you're using a physical device for testing (emulators won't connect to Bluetooth/wearables). On iOS, enable Developer Mode on the device (Settings > Privacy & Security > Developer Mode) so you can run the app from Xcode. On the glasses side, if the preview documentation requires it, enable any Developer Mode toggle for the glasses in the Meta AI companion app (this allows third-party apps to connect in preview). If you're developing without hardware, the toolkit provides a Mock Device Kit to simulate a pair of glasses – handy for initial testing.


Quickstart: Install + Run the Samples

Step 1) Download SDK + Samples

  • Meta Wearables Developer Centre: Get the latest SDK downloads and sample projects from the official portal (Preview access required). The Developer Centre will also guide you through creating an app/project and obtaining any needed credentials.
  • Official Docs: Review the Getting Started guide and API reference on Meta's site for platform-specific setup steps (iOS integration, Android integration).
  • GitHub Repos: The SDK is open-source. Clone the repositories for iOS and Android to examine sample code and include the libraries directly. The sample apps included will serve as our starting point.

Step 2) iOS: Build & Run

# If the sample uses CocoaPods (check README):
pod install  # install dependencies
open SampleApp.xcworkspace

# If using Swift Package Manager:
# - Open the Xcode project/workspace
# - Xcode will auto-resolve the Swift Package for the SDK

  1. Open the sample project in Xcode. If a .xcworkspace is provided (for CocoaPods), use that; otherwise open the Xcode project and add the SDK Swift Package. In Xcode, go to File > Add Packages... and enter the SDK repo URL (facebook/meta-wearables-dat-ios) to fetch the package.
  2. Update bundle ID & signing: Set a unique bundle identifier (if required) and select your Apple Team in project settings so you can run the app on your device. Enable Developer Mode on the iPhone (as mentioned above) if you haven't already.
  3. Build and run on a device: Connect your iPhone and run the app from Xcode. The app should install and launch on the phone. Grant any permissions requested on first launch (camera, microphone, Bluetooth, local network, etc.). You should reach the sample's connection screen without crashes.

Checklist

  • Project builds successfully in Xcode (no build errors).
  • App launches on the iPhone (Developer Mode enabled, device connected).
  • Required usage descriptions are in the app's Info.plist (e.g. Camera, Microphone, Bluetooth) and permissions prompts appear as needed.
  • You can navigate to the sample app's "Connect to Glasses" screen.

Step 3) Android: Build & Run

// In your project-level build.gradle or settings.gradle,
// add the GitHub Maven repository for the Meta SDK:
maven { url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android" }
// (Ensure GITHUB_TOKEN is set as an env var or in local.properties)

// In app/build.gradle, add the dependencies:
implementation "com.meta.wearable:mwdat-core:<version>"
implementation "com.meta.wearable:mwdat-camera:<version>"
implementation "com.meta.wearable:mwdat-mockdevice:<version>" // optional, for the simulated device

  1. Import the sample or dependency: If Meta provides a sample Android app (likely via GitHub), open it in Android Studio. Otherwise, add the SDK to your own app by including the Maven repo and SDK dependencies as shown. You'll need to use a GitHub personal access token to access the packages (set GITHUB_TOKEN in your environment or local.properties).

  2. Configure App ID: Update your AndroidManifest.xml with the Meta Wearables App ID you got from the Developer Centre. For example, add:

    <meta-data
        android:name="com.meta.wearable.mwdat.APPLICATION_ID"
        android:value="YOUR_APP_ID_HERE" />
    
    

    This ensures your app is recognised by the Meta AI companion app for pairing. (You can also opt out of analytics by adding the ANALYTICS_OPT_OUT meta-data as shown in docs.)

  3. Add permissions: Make sure all required permissions are declared in AndroidManifest.xml – e.g. BLUETOOTH / BLUETOOTH_CONNECT, CAMERA, RECORD_AUDIO, ACCESS_FINE_LOCATION (for Bluetooth/Wi-Fi scanning), and INTERNET. At runtime, request these permissions from the user (especially Bluetooth and camera/mic on Android 12+).

  4. Build and run on a device: Connect your Android phone (with developer options and USB debugging enabled). Run the sample app from Android Studio onto the device. Accept any permission prompts on first launch. The app should load to a screen where you can initiate connection to the glasses.

Checklist

  • Project syncs and builds successfully (Gradle configured with the SDK and token).
  • App launches on the Android device without immediate errors.
  • All required permissions are declared and requested at runtime (you should see system prompts for Bluetooth, camera, etc. on first run).
  • You can reach the "Connect" or pairing screen in the app.

Step 4) Connectivity Smoke Test

Goal: Verify your app can connect to the glasses and receive at least one event from them.

  1. Pair the glasses: Make sure your AI glasses are powered on and have been paired to your phone via the Meta AI app beforehand. (The Meta AI app handles the Bluetooth/Wi-Fi pairing process – third-party apps connect through the SDK, not via the OS's Bluetooth menu.) Ensure your glasses are in any required developer mode for the preview (as noted earlier).
  2. Initiate connect: In the sample app, tap the "Connect" button (or similar) to start the connection. The SDK will typically launch or communicate with the Meta AI companion app to authorize your app. You might see a prompt or the Meta app UI briefly for granting permission.
  3. Expect success state: Once connected, the app should indicate a "Connected" status and start streaming or listening for events (for example, you might see live video from the glasses camera in the sample app, or a log message of a first frame received).

If the connection fails or no data flows, check the following fast fixes before digging deeper:

  • Pairing & Device status: Is the phone connected to the glasses via the Meta AI app (and not connected to multiple glasses at once)? Only one device should be active – having multiple glasses connected can cause streaming to fail. Also verify the glasses have the latest firmware (unsupported devices like the Meta Ray-Ban Display model won't work until a future update).
  • Meta AI companion: Is the Meta AI app installed, logged in, and running in the background? The SDK handshakes through it. If the SDK reports an error like "metaAINotInstalled," it means it couldn't detect the companion app – ensure the app is present (on iOS, having it on an iPad might not count if the SDK expects the iPhone version).
  • Permissions: Make sure all needed permissions were granted. For example, camera or mic denial could prevent streaming. On Android, Bluetooth permission denial will stop the connection from establishing. On iOS, if you denied Bluetooth or Local Network access (when prompted for glasses communication), go to Settings and enable them for your app.
  • OS settings: Disable any battery saver or background restriction that might be cutting off the app. For instance, Android's battery optimisation can terminate the connection when the app is in background. On iOS, if you lock the screen, note that camera streaming might pause or frames may drop (there's a known issue where frames become unavailable if the screen is locked). Keep the app in foreground during this initial test.
  • SDK version: Use the latest SDK release. If you installed an older version, update to the newest (check the GitHub releases/changelog). Early versions of a preview might have bugs that are fixed in updates.

If everything is set up correctly, you should see a "Connected" state and live data coming from the glasses. Congrats – you've gone from zero to one on a glasses-enabled app!


Core Concepts You'll Use in Every Feature

With the basic connection working, it's important to understand a few core concepts of the Device Access Toolkit. Every feature you build will revolve around these:

1) Connection Lifecycle

Once your app registers and connects to the glasses, you'll need to manage that connection gracefully. This includes handling connect and disconnect events (e.g. user turns off glasses or leaves range) and managing app state changes. For example, if your app goes to the background, the SDK might pause the video stream or disconnect after a timeout – you should be aware of that and attempt reconnection or cleanup when coming back to foreground. Implement a reconnect strategy: if the connection drops, try to re-establish it (with backoff retries, to avoid rapid cycling). The SDK provides callbacks for connection status changes, so use those to update your UI (e.g. show "Reconnecting…"). In short, treat the glasses like a dynamic resource – always check if connected before sending actions, and be ready to restore the link if needed.
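
To make that concrete, here's a minimal reconnect-with-backoff sketch in Kotlin. GlassesSession and its connect() call are hypothetical stand-ins for whatever connection object the SDK actually exposes – only the backoff structure is the point.

import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

// Hypothetical stand-in for the SDK's connection/session object.
interface GlassesSession { fun connect() }

class ConnectionManager(
    private val scope: CoroutineScope,
    private val session: GlassesSession
) {
    private var attempt = 0

    // Call this from whatever connection-status callback the SDK provides.
    fun onStatusChanged(connected: Boolean) {
        if (connected) {
            attempt = 0  // link is stable again: reset the backoff
        } else {
            scope.launch {
                val backoffMs = (1_000L shl attempt).coerceAtMost(30_000L)
                delay(backoffMs)  // 1 s, 2 s, 4 s ... capped at 30 s
                attempt++
                session.connect()  // hypothetical reconnect call
            }
        }
    }
}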

2) Permissions & Capabilities

Your glasses-enabled features will often require extra permissions beyond a typical mobile app. Think camera, microphone, Bluetooth, local network (for Wi-Fi direct), etc. It's crucial to declare these in your app config and request them at runtime. For example, an app that streams the glasses' camera will need Camera and maybe Microphone permission (for audio) – even though the capture is from the glasses, the data flows through the phone. If any permission is missing, the feature will simply not work (or the app might crash on initialization). Always check for permissions on start, and if not granted, prompt the user with a clear explanation why it's needed (users might not immediately realize their phone's Bluetooth needs to be on for their smart glasses to work!). Also, handle the case where a permission is denied – provide a fallback or an in-app reminder that functionality is limited until they enable it. Lastly, consider capability checks: the toolkit might offer a way to query if certain features (like voice commands or gesture events) are available on the connected device, so your app can adjust accordingly and not assume unsupported features.
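
On Android, a small helper along these lines keeps the connect flow from starting before the user has granted what it needs. It uses only standard AndroidX APIs; the permission list shown is an assumption – adjust it to the capabilities your feature actually uses.

import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Example permission set for a camera + audio streaming feature on Android 12+.
private val REQUIRED_PERMISSIONS = arrayOf(
    Manifest.permission.BLUETOOTH_CONNECT,
    Manifest.permission.CAMERA,
    Manifest.permission.RECORD_AUDIO
)

// Returns true if everything is already granted; otherwise requests the missing
// permissions and returns false (handle the result in onRequestPermissionsResult).
fun ComponentActivity.ensureGlassesPermissions(requestCode: Int = 42): Boolean {
    val missing = REQUIRED_PERMISSIONS.filter {
        ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isEmpty()) return true
    ActivityCompat.requestPermissions(this, missing.toTypedArray(), requestCode)
    return false
}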

3) Events → Actions Pattern

Developing for wearables is event-driven. The glasses will send events to your app, and your app will perform actions in response. For example, an event could be "user pressed the capture button on the glasses" or "glasses have started a new video frame stream." Your app needs to listen (subscribe) to these events and then trigger the appropriate action in your app's context. Often this means mapping a glasses event to an app function. For instance: "On glasses camera button press -> capture a photo in app and save to gallery," or "On specific voice command -> open a certain screen in the app." The SDK will have event listener APIs or delegate callbacks for various glasses events (camera frame received, photo taken, button pressed, etc.). Use these to drive your app logic. Keep the separation clear: Events are inputs from the device (you don't control when they happen, you just respond), whereas Actions are outputs from your app (you initiate these, like starting a recording, sending a message, updating UI). Designing your feature as a reaction to events will help create a seamless hands-free experience.
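
A Kotlin sketch of that separation might look like the following. The event types and how they're delivered are placeholders (the real event names and listener APIs come from the SDK docs); the part to copy is the routing of device events into app-side actions you control.

// Hypothetical event types – the SDK defines the real ones.
sealed class GlassesEvent {
    object CaptureButtonPressed : GlassesEvent()
    data class VoiceCommand(val phrase: String) : GlassesEvent()
}

// Routes incoming device events to plain app-side functions (the "actions").
class EventRouter(
    private val capturePhoto: () -> Unit,
    private val startWorkout: () -> Unit
) {
    fun onEvent(event: GlassesEvent) {
        when (event) {
            GlassesEvent.CaptureButtonPressed -> capturePhoto()
            is GlassesEvent.VoiceCommand ->
                if (event.phrase.contains("start my run", ignoreCase = true)) startWorkout()
        }
    }
}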


Build Your First Glasses-Enabled Workflow

Now that you have the fundamentals, it's time to build a simple glasses-enabled workflow. This section is intentionally generic so you can adapt it to any use case – the idea is to pick one capability and get an end-to-end feature working.

Pick a "First Feature" That's Easy to Validate

Good first features are ones that you can trigger and see working immediately, without a complex setup. They should also be low-risk (not something that can break other parts of your app) yet meaningful to the user. Aim for something that showcases the glasses interaction clearly:

  • Observable: You should get instant feedback that it worked (visual, audio, or log output).
  • Low-risk: It shouldn't require an entire refactor or cloud backend – keep it self-contained.
  • Useful: Ideally it ties into a real user need, even if small, so stakeholders see value.

Examples:

  • "Tap on glasses → open a specific screen in the phone app." – Perhaps the user taps the temple of the glasses and your app navigates to a "notifications" screen. Immediate and clear.
  • "Voice intent → start a predefined workflow." – E.g. user says "Hey Assistant, start my run" and your fitness app begins tracking a workout hands-free.
  • "Capture trigger → send a placeholder payload to the phone app." – Use the glasses capture button to, say, send a dummy data packet or simply log that it was received, proving the pipeline works.

Each of these can be implemented quickly and lets you verify end-to-end connectivity: the glasses event is detected by your app and your app responds in some way.

Implementation Template (Pseudo-Code)

1) Initialize SDK
2) Request/verify permissions
3) Connect to device
4) Subscribe to events
5) On event: call an app action (local function)
6) Show user feedback (status + errors)
7) Log key lifecycle states (for debugging/support)

Let's break that down:

  1. Initialize SDK: Typically done at app launch. This might involve providing your App ID or any config to the SDK and preparing it to handle incoming events.
  2. Permissions: Before doing anything that involves camera or mic or Bluetooth, ensure those permissions are granted (request if not). If any permission is essential (e.g. Bluetooth), you might hold off connecting until it's granted.
  3. Connect to device: Using the SDK's connect or registration call, initiate the link to glasses (this might pop the Meta AI app flow). Wait for confirmation that the device is connected/registered.
  4. Subscribe to events: Attach listeners or callbacks for relevant events. For a first feature, maybe you only care about one event (e.g. a button press or a voice command). Subscribe to just that to start.
  5. On event, perform action: Write the handler such that when the event fires, you call your function that does something in the app. Keep it simple – e.g., toggle a piece of UI, start a phone-side process, or record that the event happened.
  6. User feedback: Always give some feedback that the glasses interaction was received. This could be a toast, an on-screen indicator, or even a subtle sound/vibration on the phone. If an error occurs (e.g. event received but an action failed), surface that ("Failed to load data, please check connection").
  7. Logging: Especially in early development, log important states: connection established/lost, event received, action started/completed. These logs will be invaluable for troubleshooting issues in the field and for support, without having to connect a debugger every time.

By following this template, you can build out any feature one piece at a time. Once the basic pattern is working for one event→action, you can expand to more events and more complex actions.
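
Put together, the template looks roughly like this in Kotlin. Every Wearables* name here is a made-up placeholder for the real SDK surface, and ensureGlassesPermissions() is the helper sketched in the Core Concepts section – what matters is the ordering of the seven steps, not the API names.

import android.os.Bundle
import android.util.Log
import android.widget.Toast
import androidx.activity.ComponentActivity

// Hypothetical wrapper around the SDK calls your app needs for the first feature.
interface WearablesClient {
    fun connect(onStatus: (connected: Boolean) -> Unit)
    fun onCaptureButton(handler: () -> Unit)
}

class FirstFeatureActivity : ComponentActivity() {

    private lateinit var client: WearablesClient

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        client = createClient()                          // 1) initialise the SDK at launch

        if (!ensureGlassesPermissions()) return          // 2) request/verify permissions first

        client.connect { connected ->                    // 3) connect to the device
            Log.i("Glasses", "connected=$connected")     // 7) log key lifecycle states
            showStatus(if (connected) "Connected" else "Disconnected")  // 6) user feedback
        }

        client.onCaptureButton {                         // 4) subscribe to one event
            Log.i("Glasses", "capture button pressed")
            openNotificationsScreen()                    // 5) on event, call an app action
        }
    }

    private fun createClient(): WearablesClient = TODO("wire up the real SDK here")
    private fun showStatus(text: String) = Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
    private fun openNotificationsScreen() { /* navigate within your app */ }
}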

Minimal UX Requirements (Don't Skip These)

To ensure a good user experience (and to aid in debugging), even your prototype should include a few critical UX elements:

  • Clear status: Always show whether the glasses are Connected or Disconnected (or Reconnecting). A simple status bar or icon is enough. Users (and testers) need to know if the app is actively linked to the wearable or not (a minimal status-holder sketch follows this section).
  • Clear error messages: Don't let errors fail silently. If a permission is missing ("Bluetooth permission not granted") or an action can't complete, display a brief, friendly message explaining what's wrong ("Please enable Bluetooth to use the glasses" or "Connection lost. Trying to reconnect…"). This will save you from guessing what went wrong when a tester says "it didn't work".
  • Fallback path: Ensure that whatever your glasses feature does, the user can still complete their task without the glasses if needed. For example, if the glasses are disconnected, maybe the "camera capture" button in your app UI is still available as a manual fallback. In other words, don't make your entire app functionality hinge on the glasses being present – offer a way to do the core action in the traditional way too. This is important for real-world rollout since not all users will have glasses at all times.

Implementing these UX points will make your first feature feel much more polished and user-friendly, even while you're experimenting.
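
For the status and error points above, a tiny observable holder is usually enough: the UI collects it and renders a Connected / Reconnecting / Disconnected badge, and a human-readable reason travels with the disconnected state. This is plain Kotlin coroutines, nothing SDK-specific.

import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow

sealed class ConnectionStatus {
    object Connected : ConnectionStatus()
    object Reconnecting : ConnectionStatus()
    data class Disconnected(val reason: String? = null) : ConnectionStatus()
}

class StatusHolder {
    private val _status = MutableStateFlow<ConnectionStatus>(ConnectionStatus.Disconnected())
    val status: StateFlow<ConnectionStatus> = _status   // collect this in your UI layer

    fun update(newStatus: ConnectionStatus) {
        _status.value = newStatus
    }
}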


Testing & Troubleshooting

Building for a new form factor means lots of testing in different scenarios. Use this checklist of scenarios and common issues to harden your integration:

Test Matrix

| Scenario | Expected Behaviour | Notes |
| --- | --- | --- |
| First-time setup | Walks user through pairing & permission grant. App registers with glasses successfully. | e.g. On fresh install, prompt for Bluetooth, camera, etc., then initiate pairing via the Meta AI app. Ensure graceful handling if the user skips any step. |
| App backgrounded | Connection stays stable (or automatically reconnects when app returns to foreground). | iOS may pause streams when backgrounded. Test that the app resumes the session or at least notifies the user to reopen the app if needed. Android might require a foreground service to persist the connection. |
| Permission denied | App detects denial and provides a workaround. | For instance, if Camera access was denied, show an in-app message with an "Enable camera in Settings" prompt. The feature using that permission should be disabled or hidden until resolved. |
| Disconnect mid-flow | App handles it gracefully: stops any ongoing action and notifies the user. | Test by turning off the glasses or Bluetooth. The app should not crash; it should update status to "Disconnected" and perhaps offer a "Reconnect" button. Any in-progress processes (e.g. recording) should safely abort or pause. |

Include any other scenarios relevant to your app: low battery on glasses, network loss (since the SDK might need internet), etc. The key is to ensure the app is robust against interruptions and edge cases.

Common Gotchas

Even with everything set up correctly, developers often run into similar issues. Here are 15 common errors and issues with the Wearables SDK – and how to fix them fast:

  1. SDK Error: metaAINotInstalled – This error occurs if the Meta AI companion app is not found on the device, yet the SDK is trying to use it. It can happen on iOS if you're testing on an iPad or secondary device without the app (or if the app's bundle ID is not what the SDK expects). Fix: Install the Meta AI app from the App Store/Play Store and log in. Use a supported device (the iPhone version of the app, as iPads might not be officially supported for the companion app).
  2. Connection fails with APP_NOT_ALLOWED_TO_USE_WIFI_DIRECT (Error 1002) – You see logs about Wi-Fi Direct lease failing and "3P app not allowed to use Wi-Fi Direct." This happens with certain glasses (e.g. Ray-Ban Meta Display model) that aren't yet supported in the preview, thus blocking third-party access. Fix: If you're using a device not officially supported yet, you may be out of luck until a firmware update. Stick to supported models (Ray-Ban Meta Gen1/Gen2, Oakley HSTN) for now, and ensure they have the latest OS update which may unlock third-party support in future.
  3. Streaming yields no data (empty frames) – Your app connects but you get no video frames or events (Devices.stream() returns empty on iOS). This could be a bug in the preview SDK or a misstep in setup. Fix: First, verify that the glasses are indeed capturing (try using the Meta AI app to see if the camera works). If yes, ensure your code subscribed to the correct stream/event and that you called the method to start streaming (e.g. startStream() for video) after connecting. If everything in code looks correct, check the SDK release notes for known bugs around streaming – this might be an issue to report or watch for a fix from Meta's team. As a temporary workaround, restarting the app and glasses can sometimes resolve transient stream issues.
  4. Multiple devices connected – If you have more than one pair of Meta glasses paired to your phone (say you tested a friend's glasses earlier), the SDK may get confused or default to one, and streaming can fail. Fix: Disconnect or power off any extra wearable devices. The toolkit currently supports one active device at a time, so ensure only your target glasses are on and paired. In future updates, multi-device management may improve, but for now simplicity wins.
  5. No internet, no connection – The first-time connection attempt might fail if your phone doesn't have an active internet connection. The SDK's registration flow likely calls Meta's services (for authentication or backend setup) when pairing your app. Fix: Make sure you have Wi-Fi or cellular data when initializing the connection. This is especially relevant in test labs or demo rooms – it's easy to have a device on airplane mode or a closed network and overlook that the glasses SDK needs internet for initial handshakes.
  6. Device not found in app – Your app's UI keeps "searching" and cannot find the glasses. Often this means the pairing was never initiated. Remember, the third-party app doesn't directly scan for the glasses like a Bluetooth device; you must go through the Meta AI app's linking process. Fix: Open the Meta AI app on the phone and ensure the glasses are paired there (if not, pair them through that app first). Once paired, the Wearables SDK should detect the glasses. If you're already paired but still not connecting, try restarting the Meta AI app (it might be hung or not listening – a fresh launch can reset the state).
  7. Forgot to add App ID (Android) – Everything compiles, but the connection isn't working on Android and you see no prompts. A common oversight is missing the com.meta.wearable.mwdat.APPLICATION_ID meta-data in the Android manifest. Without it, the SDK doesn't know which project/app it's authenticating. Fix: Add the meta-data tag with your correct App ID from the developer centre. Double-check for typos. Then reinstall the app and try again – it should prompt via the Meta AI app this time.
  8. iOS app not appearing in Meta AI devices list – Similar to the above, on iOS the Meta AI app keeps showing no linked apps. This could happen if your bundle identifier or team isn't correctly associated with the preview access. Fix: Ensure that the bundle ID of your app is the one you registered in the Wearables Developer Centre (or if the preview uses a provisioning profile from Meta, ensure you built with that profile). If you're unsure, try changing your app's bundle ID to a new one and update the project in the developer centre accordingly. Also verify that on the phone, in the Meta AI app settings, any developer mode toggle for your app is enabled.
  9. Provisioning or code signing errors (iOS) – You can't even run the sample on device because of signing issues or an error like "A valid provisioning profile for this executable was not found." Fix: This is standard Apple development rigmarole: make sure you've logged into Xcode with your Apple ID, and have a valid Development certificate. For the sample app, change the bundle ID to something unique (e.g. append your initials) so that it can use your team's provisioning profile. Then clean build and run. Apple also requires enabling Developer Mode on the device (mentioned before) or the app won't launch at all.
  10. Swift Package not added (iOS) – If you run the iOS app and it crashes immediately, check the logs for missing symbol errors. The sample or your project might not have actually included the Wearables SDK package. Fix: Follow the integration steps to add the Swift Package via Xcode or SwiftPM properly. In Xcode, you should see the MetaWearables package under Package Dependencies. If not, add it and rebuild. This issue often happens if you open the sample project but forget to resolve packages, or if you started a fresh project and didn't add the SDK yet.
  11. Gradle build failure (Android) – If the app won't build due to failing to find com.meta.wearable:mwdat-core, it means Gradle couldn't fetch the package. Fix: Ensure your GitHub credentials are set up. Generate a Personal Access Token (with at least read:packages scope) and either set it as the environment variable GITHUB_TOKEN or put it in your ~/.gradle/gradle.properties as github_token=YOUR_TOKEN_HERE. Also verify the Maven URL is correct in settings.gradle (it should point to the facebook repo as shown above). Once the token is in place, do a Gradle sync again.
  12. Permissions declared but not requested – A classic mistake: you added all the uses-permissions in AndroidManifest or keys in Info.plist, but at runtime you never ask the user for them. The feature just doesn't work and you see no prompts. Fix: Implement the runtime permission requests. On Android, use ActivityCompat.requestPermissions for things like CAMERA, RECORD_AUDIO, BLUETOOTH_CONNECT. On iOS, if using camera/mic, the system prompts should appear automatically on first use of AVCapture or mic APIs, but Bluetooth and local network have their own prompts (Bluetooth prompts on first CBCentralManager usage, etc.). Ensure you trigger these by initializing those subsystems. Also, double-check Info.plist entries – if you forgot NSLocalNetworkUsageDescription and the SDK uses local network (Wi-Fi Direct), iOS might block the connection silently. Add a rationale for local network access if needed (e.g. "Allow connections to accessories on local network").
  13. Glasses disconnect when app is backgrounded – You notice that if you press Home or switch apps, the connection drops quickly. This can happen due to aggressive power management or because the SDK isn't allowed to continue in background by default. Fix: For Android, you might need to run a foreground service to maintain the connection (see the foreground-service sketch after this list), or disable battery optimisations for the app (the user can do this in system settings for your app to prevent Doze mode interference). For iOS, long-running capture in the background requires enabling the relevant background modes (if at all possible) or at least handling the pause – in preview, it might simply be a limitation that heavy operations stop when backgrounded. In any case, design your app to handle reconnection when coming back to foreground. Inform the user (e.g. "Paused when app inactive") rather than just failing silently.
  14. Output is low-quality or throttled – For example, the video frames from the glasses are lower resolution than expected, or you get slower frame rates. Keep in mind this is a preview toolkit – some settings might be defaulted to conserve bandwidth. Fix: Check if the SDK allows configuring frame quality (some SDKs have parameters for resolution or JPEG vs RAW frames). In the Flutter plugin snippet, for instance, they used VideoQuality.medium and frameRate 24 by default. Experiment with those settings if available. Also ensure your phone has a solid Wi-Fi Direct connection (if there's interference or if the phone connected over Bluetooth fallback, you might get degraded quality). For now, accept that this is not final quality and focus on functionality; Meta will likely improve quality and performance over time.
  15. Privacy and data concerns – Not an "error" that stops your app, but a potential pitfall: accidentally streaming or storing more data than intended. For instance, forgetting to turn off the camera stream when not needed, or saving images without user consent. Fix: Build with a privacy-first mindset. Only capture or keep data when necessary. Use the toolkit's capabilities to start/stop streams on demand (don't run the camera endlessly in background). If you log data, avoid personal info. This isn't a bug you'd get a stack trace for – it's one you catch by reviewing your design and compliance requirements (see the Privacy section below for more on this). It's easier to bake this in now than to refactor for privacy later.

Keep this list handy during development – if something isn't working, chances are it's one of these common issues. A quick fix now can save hours of frustration.
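
For gotcha 13 specifically, a bare-bones Android foreground service looks like the sketch below. The notification-channel creation (required on Android 8+) and the foregroundServiceType manifest entry (required on recent Android versions) are omitted for brevity, and the "glasses_session" channel ID is just an example.

import android.app.Service
import android.content.Intent
import android.os.IBinder
import androidx.core.app.NotificationCompat

class GlassesSessionService : Service() {

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        val notification = NotificationCompat.Builder(this, "glasses_session")
            .setContentTitle("Glasses session active")
            .setSmallIcon(android.R.drawable.ic_menu_camera)
            .build()
        startForeground(1, notification)  // keeps the process alive while streaming
        return START_STICKY
    }

    override fun onBind(intent: Intent?): IBinder? = null
}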


Privacy, Security, and AU Notes

Developing on the cutting edge of AR and AI glasses is exciting, but it also comes with responsibilities. You're dealing with sensors that can capture personal imagery and audio, so you should prioritize privacy and security from day one. Here are some practical default settings and considerations:

| Area | Recommended Default | Why it matters |
| --- | --- | --- |
| Data minimisation | Collect only what you need for the feature. Discard extraneous data. | Less data means lower risk and simpler compliance. It aligns with privacy laws' expectation to limit collection of personal info. |
| Storage | Avoid saving raw sensor feeds (video/audio) by default. If you must, keep it short-term or on-device. | Reduces the impact of a breach. If you don't store it, it can't be leaked. Users will appreciate that you're not hoarding their data. |
| Logging | Redact or anonymise sensitive info in logs. E.g., don't log exact GPS or personal identifiers from glasses. | Logs are often overlooked but can become a leak. Clean logging means safer diagnostics, especially if sharing logs with support. |

AU context: If you're operating in Australia (or handling data of Australian users), be mindful of local regulations. The Privacy Act 1988 (Cth) governs how personal information must be handled – it requires transparency and safeguards when you collect personal data. For example, if your glasses app captures photos or audio that could identify someone, that's personal data. You need a clear privacy policy and possibly user consent for how that's used. Also, for apps in regulated sectors or government projects, consider aligning with security frameworks like the Australian Cyber Security Centre's Essential Eight strategies. The Essential 8 is a set of baseline cybersecurity strategies designed to protect against common threats. While not all will apply to a mobile app, relevant ones (like application control, patching, access management) should be in your deployment plan. Mapping your app's controls to something like the Essential Eight can strengthen your case if you're seeking enterprise or government adoption, by showing you follow industry best practices for security.

In short, bake in privacy and security from the start – it's much easier than retrofitting it later (and it's the right thing to do for your users).


Production Readiness Checklist

So your prototype is working and you're ironing out the last bugs – great! Before shipping a glasses-integrated feature to real users, run through this checklist to ensure you're production-ready:

  • Stable connection handling: The app can handle disconnects and reconnections seamlessly. No crashes or dead-ends if the glasses go offline unexpectedly (e.g., user takes them off or loses Bluetooth).
  • Permission recovery: Every permission denial has a user-friendly path to resolution. For instance, a dialog that says "We need camera access for this feature. Please enable it in Settings." and a shortcut to settings if possible.
  • Feature flags for preview features: Since the toolkit is in preview, consider gating glasses-specific functionality behind a feature flag or toggle (a minimal flag sketch follows this checklist). This way you can remotely disable it if something goes wrong, or limit it to beta users.
  • Crash and event logging: Integrate an analytics or logging solution to capture crashes, errors, and usage of the glasses features. If a user reports "it's not working," these logs will help pinpoint if an SDK error occurred (make sure not to log personal data, as noted).
  • Device and version compatibility notes: Document which glasses models and phone OS versions your app supports. For example, if you only tested on Ray-Ban Meta Gen-2 and iOS 17+, be transparent about that. Handle unsupported cases in code (gracefully tell the user if they try with an unsupported device).
  • Onboarding instructions: Provide an in-app guide or link to a guide for first-time users of the glasses feature. Assume not all your users are developers – explain how to pair their glasses, what commands or gestures they can use, and how to troubleshoot basic issues. This is especially important if you're distributing to external testers or customers who may not be familiar with the Meta preview programme.

Tick off all these items, and you'll be much more confident that your glasses integration won't break in the wild. A well-prepared app makes for happy users (and happy developers who can sleep at night).
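
For the feature-flag item, the gate can be as small as a single remote boolean. Firebase Remote Config is shown only as an example backend, and the glasses_feature_enabled key is made up – any flagging system you already run works the same way.

import com.google.firebase.remoteconfig.FirebaseRemoteConfig

// Check the remote flag before showing the connect UI or initialising the SDK.
fun isGlassesFeatureEnabled(): Boolean =
    FirebaseRemoteConfig.getInstance().getBoolean("glasses_feature_enabled")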


Next Steps

Your first glasses-enabled feature is in the bag – what next?

  • Expand to your next feature: Pick another capability (maybe something using the glasses' microphone, or integrating an AI assistant response) and apply the same events→actions pattern to build it out. Each new sensor or interaction (gesture, voice, etc.) will teach you more about the toolkit.
  • Replace placeholders with real workflows: If your initial feature used a hardcoded action (like opening a set screen), try hooking it up to a more dynamic flow. For example, if a voice command triggers a workflow, connect it to a real API call or in-app function now that the basics work.
  • Beta test with real users: Once you have a couple of features, release the app to a small group (internal team or friendly users) via TestFlight or Play Store internal testing. Gather feedback on what's useful, what's confusing, and observe how the glasses are actually used. You might discover new use cases or small tweaks to improve UX (maybe users want a louder notification on the phone when an event occurs, etc.).

Need help hardening the integration or planning more advanced features? Cybergarden can assist you in going from prototype to production – from optimising native wrappers and event architectures to rigorous testing and privacy-by-design reviews. Don't hesitate to reach out if you want expert support to accelerate your roadmap.


FAQs

Do I need preview access to use the toolkit?

Yes. The Meta Wearables Device Access Toolkit is currently in developer preview, which means you must sign up and be approved to get access. Use the official sign-up link (on Meta's developer site) to request access. Once approved, follow Meta's instructions to download the SDK and obtain the necessary credentials (such as an App ID for your project). Without preview access, the SDK libraries and the required companion app integration won't function – so make sure you go through the proper channels to get in the programme.

Can I build with React Native or Flutter?

Absolutely – but you'll need to write some native integration code (a bridge module or plugin). The core SDKs are for iOS (Swift/Objective-C) and Android (Kotlin/Java). In a React Native app, for example, you can create a Native Module that initializes the Wearables SDK and emits events to JavaScript. For Flutter, a community plugin already exists that wraps the Wearables SDK, which shows it's feasible to integrate. The key is to keep your business logic in the cross-platform layer, and limit the platform-specific code to just handling the SDK interaction. By doing so, you can reuse most of your code on both iOS and Android while the native parts manage the connection, permission requests, and event forwarding. In summary: Yes, you can use cross-platform frameworks, just be prepared to get your hands dirty on the native side initially.
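
To make the React Native route concrete, here is a minimal Kotlin bridge-module skeleton. The module name, event name, and TODOs are examples only – the actual Wearables SDK calls are left out because their exact signatures come from Meta's docs.

import com.facebook.react.bridge.Promise
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
import com.facebook.react.modules.core.DeviceEventManagerModule

class GlassesBridgeModule(private val reactContext: ReactApplicationContext) :
    ReactContextBaseJavaModule(reactContext) {

    override fun getName(): String = "GlassesBridge"

    @ReactMethod
    fun connect(promise: Promise) {
        // TODO: call the Wearables SDK's connect/registration API here and
        // resolve or reject the promise from its callback.
        promise.resolve(true)
    }

    // Forward any glasses event to JavaScript; listen with
    // NativeEventEmitter/DeviceEventEmitter on the JS side.
    private fun emitGlassesEvent(payload: String) {
        reactContext
            .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter::class.java)
            .emit("glassesEvent", payload)
    }
}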

What's the fastest path to a working demo?

The quickest way is to leverage the provided sample app. Get the sample running on your device and confirm you can connect to the glasses and receive a basic event (like a camera frame or button press). This proves your setup is correct. Then, identify one simple event→action pair as described earlier – for instance, when you press the glasses' capture button, toggle some UI element in the app. Implement that by adding a listener in the sample app's code and a simple handler (even just logging or displaying a toast). Now you have a minimal end-to-end demo: you interact with the glasses, and the app responds in real-time. Because you started with the working sample, you didn't have to build everything from scratch – you just added a small feature to it. This approach (verify base functionality, then add one small feature) is typically much faster than trying to code a full integration in a vacuum. Plus, seeing that first glasses-triggered action happen in your app is a huge morale boost and proof-of-concept for stakeholders. From there, you can incrementally build out more complex interactions, one step at a time.


Changelog

  • 2026-01-30 – Updated with 15 common errors, link audit, and Australian English corrections. Prepared for production-ready review.
  • 2026-01-20 – Initial draft covering basic SDK troubleshooting for iOS and Android.