Meta Wearables Device Access Toolkit: Quickstart Guide for iOS (2026)

By Almaz Khalilov

Want to extend your iOS app into a glasses form factor without guessing at the connection, permissions, and event lifecycle? This quickstart takes you from SDK access to a working Meta AI Glasses integration on iOS in minutes, then finishes with a production-ready checklist for polishing your app.

Meta's AI glasses (like the Ray-Ban Meta) are ushering in a new era of hands-free, wearable experiences. With the Meta Wearables Device Access Toolkit, developers can connect iOS apps to the glasses' camera, microphones, and speakers – enabling POV video streaming, photo capture, voice input and more. The toolkit is currently in developer preview, providing a reliable way to pair with the glasses hardware and start prototyping new features. In this guide, we'll walk through building your first glasses-integrated iOS app in ~20 minutes using SwiftUI and Combine for a modern, reactive architecture.

Ray-Ban Meta smart glasses enable hands-free, wearable experiences. The new toolkit lets iOS apps access the glasses' sensors (camera, audio) to create innovative features.

Watch the Quickstart VSL

The video above walks through installing the Meta Wearables toolkit and running the sample app on an iPhone. By the end, you'll have a basic glasses-connected demo app running on iOS.


What This Guide Covers

  • How to get access to the toolkit (Preview)
  • How to install the SDK and run the sample apps (iOS + Android)
  • The core building blocks (connect, permissions, events, actions)
  • A reusable "first feature" pattern you can adapt to any use case
  • Testing, privacy, and rollout basics

Before You Start

Access & Accounts

  • Meta Developer account: Make sure you have a Meta developer account and are logged in. (Sign up at the Meta Developer site if you haven't already.)
  • Wearables Device Access Toolkit preview access: Apply for the Wearables preview programme. You may need to request access via the Meta Wearables Developer Centre – only supported countries have full toolkit access.
  • Documentation hub: Review the official docs and FAQ on the Wearables Developer Centre. (This will have the SDK API details and guidelines.)
  • Support / forums: Join the developer community for help. (Meta has an official forum/Discord and GitHub discussions for the toolkit – a great place to ask questions or search issues.)

Devices & Environment

| Requirement | iOS | Android |
| --- | --- | --- |
| Phone | iPhone (iOS 15.2 or later) | Android device (Android 10+) |
| Dev tools | Xcode 15 (or latest) | Android Studio (latest) |
| Toolkit | Meta Wearables DAT Preview SDK | Meta Wearables DAT Preview SDK |
| Target device | Ray-Ban Meta / Oakley Meta HSTN glasses | Ray-Ban Meta / Oakley Meta HSTN glasses |

If you're using React Native or Flutter, plan for a thin native wrapper module around the SDK (iOS: use Swift/Objective-C, Android: use Kotlin/Java) to bridge the toolkit into your framework.

Note: Ensure your glasses are in Developer Mode for testing. On Ray-Ban Meta glasses, you enable this via the Meta AI app: in the app's settings, tap the version number five times until you see "Developer mode enabled". Also make sure the glasses are paired to your phone through the Meta AI companion app, which manages connection approval between your app and the glasses.


Quickstart: Install + Run the Samples

Step 1) Download SDK + Samples

  • Developer Centre: Visit the Meta Wearables Developer Centre to download the SDK (or get the GitHub repo link). You'll find both iOS (Swift) and Android (Kotlin) resources here.
  • Docs: Read the Getting Started guide in the documentation – it covers setup, API usage, and a tutorial. The FAQ and Developer Terms are worth a look before you begin.
  • Sample apps: Grab the sample app projects for iOS and Android. Meta provides a "Camera Access" sample for each platform (available via GitHub or the developer centre) to help you get up and running quickly.

Step 2) iOS: Build & Run

First, add the Meta Wearables SDK to your iOS project. The easiest way is via Swift Package Manager:

```bash
# In Xcode:
# 1. File > Add Packages...
# 2. Enter the repo URL: https://github.com/facebook/meta-wearables-dat-ios
# 3. Choose the latest version (e.g. 0.3.0) and add the package to your app target.
```

If you downloaded the sample Xcode project, open it in Xcode. Make sure to select your development team for code signing (under Signing & Capabilities) so you can run on a device. Build and run the app on a physical iPhone connected via USB.

Checklist (iOS)

  • Project builds locally without errors (the SDK should fetch via SPM).
  • App launches on the iPhone.
  • Required permissions are declared in Info.plist (e.g. Bluetooth usage description, Microphone, Camera if needed).
  • On first launch, you can reach the app's "Connect" screen, where it will scan/pair with the glasses.

Step 3) Android: Build & Run

For Android, you'll import the SDK and run the provided sample (or your own app). The toolkit is distributed via GitHub Packages (Maven). In your Gradle settings, add Meta's GitHub Maven repository and include the SDK dependencies:

```gradle
// settings.gradle
repositories {
    maven {
        url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android"
        credentials {
            // Use a GitHub token with read:packages
            username = ""
            password = System.getenv("GITHUB_TOKEN")
        }
    }
    // ... other repos (Google, etc.)
}

// app/build.gradle
dependencies {
    implementation "com.meta.wearable:mwdat-core:0.3.0"
    implementation "com.meta.wearable:mwdat-camera:0.3.0"
    implementation "com.meta.wearable:mwdat-mockdevice:0.3.0"
}
```

Sync your Gradle project to download the libraries. Open the sample project (if provided) in Android Studio, connect an Android phone (with USB debugging on), and run the app on the device.

Checklist (Android)

  • Project builds and Gradle dependencies resolve (you may need to supply a GitHub Packages token as shown above).
  • App launches on the Android device.
  • Required permissions are in the AndroidManifest.xml (Bluetooth, Microphone, etc.) and are requested at runtime on first use.
  • You can reach the "Connect" screen in the app (which should handle finding the glasses via the Meta AI app handshake).

Step 4) Connectivity Smoke Test

Goal: Verify you can connect to the glasses and receive at least one event from them (e.g. a camera frame or a status update).

  1. Pair & Connect: In the app's Connect screen, initiate connection to the glasses. Approve any prompts (the Meta AI companion app may ask to confirm the connection).
  2. Expected result: The app should indicate a Connected status, and you should see evidence of the glasses streaming data. For example, the sample app might start a camera preview or log a "frame received" event in Xcode/Logcat.
  3. If it fails: Check the basics:
    • Glasses pairing: Ensure the glasses are paired via the Meta AI app and in range. The Meta AI app must be running (in background) to broker the connection.
    • Permissions: Make sure you granted Bluetooth (for connection) and other permissions. On iOS, if you denied Bluetooth or Local Network access, the connection won't establish. On Android, check that location/Bluetooth are enabled if required for BLE scans.
    • OS Settings: Verify Developer Mode on glasses is enabled (see above) – without it, third-party apps may be blocked. Also ensure your phone's Bluetooth is on, and both phone and glasses have internet if required for Meta services.
    • SDK version: If you installed an older preview SDK, upgrade to the latest (e.g. v0.3.0) – early versions are updated frequently. Breaking changes or bug fixes might resolve your connection issues.

Once you have a successful connection and data flowing, congratulations – you've stood up the basics of a glasses-enabled mobile app!


Core Concepts You'll Use in Every Feature

1) Connection Lifecycle

Managing the connection is key. Your app must handle connecting and disconnecting from the glasses gracefully:

  • Connect/Disconnect: Use the SDK's connect call to initiate pairing. Handle the disconnect event (e.g. user takes glasses off or closes app) so your UI updates accordingly (e.g. show "Disconnected" status).
  • Background/Foreground: Be mindful of what happens if the app goes to background. The glasses might stop streaming when your app is backgrounded (due to OS limitations). Plan for it – for instance, you might automatically pause any active camera feed when the app goes inactive, and attempt to reconnect or resume when the app comes back to foreground.
  • Reconnect strategy: Implement a retry or reconnect loop if the connection drops. For example, if the Bluetooth or wireless link is lost temporarily, try to reconnect a few times with exponential backoff. Timeouts should be in place – don't hang your UI waiting forever. Give feedback like "Reconnecting... (tap to retry)" after a few seconds if needed.
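
To make that concrete, here's a minimal Swift sketch of a reconnect loop with exponential backoff. The connect call and notification name are placeholders – the preview toolkit's actual API may differ – so treat it as a pattern rather than a drop-in implementation:

```swift
import Foundation

// Minimal sketch of a reconnect loop with exponential backoff.
// `connectToGlasses()` is a hypothetical wrapper around the toolkit's
// connect call – substitute whatever the preview SDK actually exposes.
final class ReconnectController {
    private var attempt = 0
    private let maxAttempts = 5

    func scheduleReconnect() {
        guard attempt < maxAttempts else {
            // Give up and surface a "tap to retry" UI state instead of looping forever.
            NotificationCenter.default.post(name: .glassesReconnectFailed, object: nil)
            return
        }
        let delay = pow(2.0, Double(attempt))   // 1s, 2s, 4s, 8s, 16s
        attempt += 1
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
            self?.connectToGlasses()
        }
    }

    func connectionDidSucceed() { attempt = 0 }   // reset backoff once connected

    private func connectToGlasses() {
        // Call the SDK's connect API here; on failure, call scheduleReconnect() again.
    }
}

extension Notification.Name {
    static let glassesReconnectFailed = Notification.Name("glassesReconnectFailed")
}
```

Resetting the attempt counter on a successful connection keeps the backoff short for one-off drops while still protecting the UI from an endless retry spin.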

2) Permissions & Capabilities

Each glasses-enabled feature will require certain device capabilities and user permissions:

  • Glasses hardware vs Phone hardware: The toolkit currently gives access to the glasses camera (for photo/video) and routes audio through standard Bluetooth. That means your app may need camera/mic permissions even though the camera/mic are on the glasses – e.g. iOS still requires a mic permission to record from any microphone (including a paired glasses mic). Declare needed permissions in your app manifest (Info.plist or Android Manifest) and request them at runtime with clear prompts.
  • Runtime prompts: Explain why you need a permission ("We need Bluetooth access to connect to your AI glasses"). The user might decline, so handle that gracefully (show a message that the feature won't work without permission, and offer a way to enable it in Settings).
  • Graceful fallbacks: If a permission is denied or the glasses are not connected, your app should still function. For instance, if glasses camera isn't available, you might fall back to the phone's camera or disable the feature with an explanation. Don't leave the user stuck on a blank screen.
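
For example, here's a small Swift sketch of checking the microphone permission before enabling a glasses audio feature and falling back gracefully if it's denied. The AVFoundation calls are standard iOS APIs; the surrounding function names are just illustrative:

```swift
import AVFoundation

// Sketch: check the microphone permission before enabling a glasses audio feature,
// and fall back gracefully if it's denied. (iOS prompts for the mic even when the
// audio actually comes from the paired glasses.)
func prepareMicFeature(onReady: @escaping () -> Void, onUnavailable: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        onReady()
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            DispatchQueue.main.async { granted ? onReady() : onUnavailable() }
        }
    default:
        // Denied or restricted – disable the feature and point the user to Settings.
        onUnavailable()
    }
}
```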

3) Events → Actions Pattern

The core integration pattern is Events → Actions:

  • Events are things that happen on the glasses or toolkit side. For example: a hardware button press on the glasses, a new camera frame comes in, a status change (battery low, connection lost), or a voice command is detected.
  • Actions are how your app responds to those events. This could be starting a workflow in the app, navigating to a certain screen, capturing a photo, sending data to a server, etc.

Your app will subscribe to events from the glasses (using Combine publishers, callbacks, or delegates provided by the SDK). When an event occurs, your code triggers an appropriate action in the app. For instance, using Combine you might have something like:

```swift
glassesSDK.eventPublisher
    .filter { $0 == .tap }
    .sink { _ in startMyFeatureWorkflow() }
```

This decoupling makes it easy to add features. Listen for an event, then define the app's reaction. Over time you might handle many events (swipes, voice intents, etc.), but each can map to an isolated action or UI update. Tip: Use the main thread to update UI (SwiftUI @MainActor or DispatchQueue.main) since events may come in on background threads.
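
Putting that snippet and the threading tip together, a slightly fuller sketch might look like the following. The event enum and publisher are placeholders standing in for whatever the preview SDK actually publishes:

```swift
import Combine
import Foundation

// Placeholder event type and publisher standing in for the preview SDK's API.
enum GlassesEvent { case tap, capture, disconnected }
let glassesEvents = PassthroughSubject<GlassesEvent, Never>()

final class TapHandler {
    private var cancellables = Set<AnyCancellable>()

    func start() {
        glassesEvents
            .filter { $0 == .tap }
            .receive(on: DispatchQueue.main)      // UI updates belong on the main thread
            .sink { [weak self] _ in self?.startMyFeatureWorkflow() }
            .store(in: &cancellables)             // keep the subscription alive
    }

    private func startMyFeatureWorkflow() {
        // Navigate, capture, or kick off whatever your first feature does.
    }
}
```

Storing the returned AnyCancellable matters: without it, the subscription is cancelled as soon as it goes out of scope and your handler will never fire.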


Build Your First Glasses-Enabled Workflow

This section is intentionally generic so you can swap in any capability.

Pick a "First Feature" That's Easy to Validate

Good first features are:

  • Observable: You can see it working instantly (no complex setup).
  • Low-risk: Doesn't rely on lengthy processes or third-party services that might fail.
  • Useful: Tied to a real user need (even if small) – this makes it easier to demo and get buy-in.

Examples of simple first features:

  • "Tap on glasses → open a specific screen in the phone app." (Ex: user taps the frame button, your app opens a notification or a particular view.)
  • "Voice intent → start a predefined workflow." (Ex: user says a wake word, your app triggers a checklist or starts recording a note.)
  • "Capture trigger → send a placeholder payload to the phone app." (Ex: user presses the capture button on glasses, your app receives a photo or video stream and simply logs or displays a thumbnail.)

Each of these can be done with minimal UI and logic, but demonstrates end-to-end connectivity.

Implementation Template (Pseudo-Code)

```
// 1) Initialize SDK
Glasses.initialize(apiKey: "…")

// 2) Request/verify permissions
Permissions.check(.bluetooth, .microphone) { granted in ... }

// 3) Connect to device
Glasses.connect(deviceId: myGlassesID) { status in ... }

// 4) Subscribe to events
Glasses.onEvent { event in
    // 5) On event: call an app action
    handleGlassesEvent(event)
}

// 6) Show user feedback (status updates & errors)
ui.updateStatus(Glasses.connectionState)

// 7) Log key lifecycle states (for debugging/support)
logger.info("Glasses connected, streaming started")
```

This is a rough pattern. In practice, you might use Combine or delegates instead of callbacks, but the steps remain:

  1. Init the SDK early (e.g. app launch).
  2. Permissions – ensure the user has granted what you need.
  3. Connect – attempt to connect/pair with the glasses (perhaps in response to a UI button "Connect Glasses").
  4. Subscribe – start listening for glasses events (could be before or after connect, depending on SDK).
  5. On Event → Action – when an event comes in, execute some function in your app.
  6. User feedback – keep the user informed (a label showing "Connected ✅" or an alert "Glasses battery low!" etc.).
  7. Logging – internally log important steps, so if something goes wrong in the field, you have breadcrumbs (consider integrating a crash/error reporting SDK).
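
Here's one way those seven steps could hang together in Swift. Everything prefixed with "Placeholder" is a stand-in for the real toolkit API, so treat this as a shape to copy rather than working integration code:

```swift
import Combine
import Foundation
import os

// Placeholder SDK surface so the sketch compiles – replace with the real toolkit API.
enum GlassesEvent { case tap, captureRequested, disconnected }
enum PlaceholderGlassesSDK {
    static let events = PassthroughSubject<GlassesEvent, Never>()
    static func initialize() {}
    static func connect(_ onState: @escaping (String) -> Void) { onState("Connected") }
}

final class GlassesFeatureCoordinator: ObservableObject {
    @Published var statusText = "Disconnected"                  // 6) user feedback
    private let logger = Logger(subsystem: "com.example.app", category: "glasses")
    private var cancellables = Set<AnyCancellable>()

    func start() {
        PlaceholderGlassesSDK.initialize()                      // 1) init the SDK early
        // 2) request/verify permissions here (see the microphone example earlier)
        PlaceholderGlassesSDK.connect { [weak self] state in    // 3) connect to the glasses
            DispatchQueue.main.async { self?.statusText = state }
        }
        PlaceholderGlassesSDK.events                            // 4) subscribe to events
            .receive(on: DispatchQueue.main)
            .sink { [weak self] event in
                self?.logger.info("Glasses event: \(String(describing: event))")  // 7) logging
                self?.handle(event)                             // 5) on event → app action
            }
            .store(in: &cancellables)
    }

    private func handle(_ event: GlassesEvent) {
        // Map each event to one isolated app action (open a screen, capture, etc.).
    }
}
```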

Minimal UX Requirements (Don't Skip These)

Even for a demo, make sure to cover these UX basics:

  • Clear status indicators: Show whether the glasses are Connected or Disconnected (and maybe Reconnecting if you implement that). Users shouldn't have to guess if it's working – see the SwiftUI sketch after this list.
  • Obvious errors: If something's wrong (e.g. "Bluetooth Off", "Glasses not found", or "Permission denied"), display a clear error message. For example, "Please turn on Bluetooth to connect to your glasses" with an actionable icon. This beats silently failing or showing a spinner forever.
  • Fallback path: Assume at some point, the user doesn't have the glasses on them. Your app should still allow them to do the core task. For instance, if your feature was "glasses take a photo to do X", allow the phone's camera to do X as a fallback. This way your overall user experience is robust and not entirely dependent on the wearable.
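
As a starting point, a status banner in SwiftUI can be as small as this. ConnectionState here is an app-side enum of our own, not a toolkit type:

```swift
import SwiftUI

// Sketch: a simple status banner driven by an app-side connection state.
enum ConnectionState { case disconnected, connecting, connected, error(String) }

struct GlassesStatusBanner: View {
    let state: ConnectionState

    var body: some View {
        switch state {
        case .connected:
            Label("Connected", systemImage: "checkmark.circle.fill")
                .foregroundStyle(.green)
        case .connecting:
            Label("Reconnecting…", systemImage: "arrow.triangle.2.circlepath")
        case .disconnected:
            Label("Disconnected – tap to connect", systemImage: "bolt.slash")
        case .error(let message):
            Label(message, systemImage: "exclamationmark.triangle.fill")
                .foregroundStyle(.orange)
        }
    }
}
```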

Testing & Troubleshooting

Test Matrix

Use a simple test matrix to cover different scenarios. Here are a few to include:

| Scenario | Expected Behaviour | Notes |
| --- | --- | --- |
| First-time setup | App prompts for Bluetooth, etc., and guides the user through pairing. | e.g. after install, the user is walked through enabling permissions and connecting the glasses for the first time. |
| App backgrounded | Connection stays stable (or gracefully pauses) when the app is backgrounded, and resumes when foregrounded. | iOS may suspend the camera feed in the background – ensure no crashes, and perhaps show a "Paused" notification. Android might require a foreground service if persistent streaming is needed. |
| Permission denied | If the user denies a permission, the app shows a helpful message and a way to retry. | For instance, "Bluetooth is required to connect. Please enable it in Settings." Don't just break without explanation. |
| Disconnect mid-flow | If the glasses disconnect in the middle of a workflow, the app handles it. | e.g. stop any ongoing tasks, save state, and let the user know "Glasses disconnected – workflow paused. Please reconnect to continue or finish on phone." |

Run through these scenarios on both platforms. It's often useful to simulate bad conditions (turn Bluetooth off, kill the companion Meta AI app, etc.) to see how your app behaves.

Common Gotchas

  • Forgetting runtime permission requests: You added keys to Info.plist/Manifest, but did you actually request the permission in code? On iOS, the first attempt to use an API may prompt the system dialog (e.g. accessing microphone will prompt for mic permission). Ensure this happens at a logical time (not at app launch ideally, but when needed) and handle the case where user declines.
  • Build signing/provisioning (iOS): If you hit signing errors when building for device, double-check that your bundle ID is unique and you've selected your Apple Developer Team in Xcode. If you cloned the sample, update the bundle ID to something under your domain, and confirm the camera/microphone usage descriptions are still present in Info.plist after any project changes.
  • Background limits (Android): Some Android OEMs aggressively kill background services. If your app doesn't seem to receive events when screen is off, you might be hitting battery optimisation. Advise users (or handle via code) to disable battery optimisation for your app if continuous background operation is needed.
  • "One-time demo" effect: A common pitfall is handling only the happy path. Maybe you got it working once and stopped there. If you don't implement reconnection logic or proper state reset on disconnect, your demo might only work the first time. Make sure you can disconnect and reconnect repeatedly without issues. Test things like power-cycling the glasses and ensuring the app can reconnect without restarting.

Privacy, Security, and AU Notes

Practical Defaults

| Area | Recommended Default | Why it matters |
| --- | --- | --- |
| Data minimisation | Only collect what you need. For example, don't constantly record audio or video unless it's required for the feature. | Reduces privacy risk and compliance burden – less sensitive data stored means less to protect or potentially leak. |
| Storage | Avoid saving raw sensor media by default. Process data in memory or stream it to a backend if needed, rather than keeping it on the device long-term. | Lowers the risk if a device is lost or compromised, and users appreciate not having piles of potentially sensitive media saved without their knowledge. |
| Logging | Redact sensitive values in logs. Keep logs minimal in production. | Logs can inadvertently capture personal data (e.g. transcripts of audio or image analysis). Redacting or hashing such info ensures you don't expose it when logs are collected for support. |
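
On iOS, the system logger gives you much of that redaction for free – non-constant strings interpolated into os.Logger messages are shown as <private> in release logs unless you explicitly mark them public. A small sketch:

```swift
import os

let logger = Logger(subsystem: "com.example.app", category: "glasses")

// Dynamic strings are redacted by default; be explicit anyway so the intent
// survives refactoring, and only mark truly non-sensitive values as public.
func logCapture(userID: String, frameCount: Int) {
    logger.info("Capture finished for \(userID, privacy: .private), frames: \(frameCount, privacy: .public)")
}
```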

AU context: If your app will handle personal information (photos, audio, identities) and you're operating in Australia, ensure you comply with the Privacy Act 1988 (Australia's data protection law). That means having a clear privacy policy and actually following it in practice. Be transparent about what you collect via the glasses (e.g. are you sending video to cloud servers? storing any faces detected?). For more sensitive or enterprise use cases, it's wise to map your security controls to frameworks like the Essential Eight strategies from the Australian Cyber Security Centre. This ensures you've covered basics like access control, patching, and backups in the context of your glasses integration.

Remember that wearables can feel personal to users (they are literally wearing them!), so prioritising privacy and security will not only keep you compliant but also build user trust.


Production Readiness Checklist

Before you roll out your glasses-enabled feature to real users (especially beyond a closed beta), make sure you've checked off the following:

  • Stable connection handling: The app reliably connects on launch or on demand, and recovers from disconnects without crashes or stuck states.
  • Permission recovery paths: Every permission your feature needs has a user-friendly flow for when it's missing. (E.g. "Allow Bluetooth" prompt, then if denied, show instructions to enable in settings.)
  • Feature flags for preview: Since the Device Access Toolkit is in preview, consider gating glasses-specific features behind a remote kill-switch or feature flag. If Meta changes the API or if something goes wrong, you can disable the feature without breaking the whole app (see the flag sketch after this checklist).
  • Crash & analytics logging: Integrate crash reporting (so you catch any edge-case crashes from the new integration). Also consider event logs for key actions (connect, capture, etc.) that can be exported for debugging. This will help when users or testers report issues.
  • Device/version compatibility notes: Document which phone OS versions and glasses models you've tested with. For instance, note if it's only validated on Ray-Ban Meta (2025) glasses and iOS 17+. This helps your support team and sets the right expectation for users.
  • Onboarding UX: The first-time setup should be as seamless as possible – even for non-developers. Ideally, someone who has the glasses and your app can get set up without calling you for support. This means clear instructions in-app, perhaps an onboarding tutorial or setup wizard, and helpful tips (like "Make sure your glasses are charged and paired via the Meta app.").
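
As an illustration of that kill-switch idea, here's a tiny Swift sketch of gating the glasses entry point behind a remote flag. RemoteConfigClient is a stand-in for whatever remote-config or feature-flag service you already use:

```swift
import Foundation

// Stand-in for your real remote-config / feature-flag service.
final class RemoteConfigClient {
    static let shared = RemoteConfigClient()
    private var values: [String: Bool] = [:]   // refreshed from your backend at launch
    func bool(forKey key: String, default defaultValue: Bool) -> Bool {
        values[key] ?? defaultValue
    }
}

enum FeatureFlags {
    // Default to off so a config outage never accidentally exposes the preview feature.
    static var glassesIntegrationEnabled: Bool {
        RemoteConfigClient.shared.bool(forKey: "glasses_integration_enabled", default: false)
    }
}

func showGlassesEntryPointIfEnabled() {
    guard FeatureFlags.glassesIntegrationEnabled else { return }  // hide the glasses UI entirely
    // Present the "Connect your glasses" entry point.
}
```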

Tick off all of the above, and you'll be in a good position to move from prototype to a real product release.


Next Steps

  • Choose your next feature: Now that you have a basic connection and one feature running, identify the next most valuable glasses interaction for your app. Maybe it's leveraging the camera for an AI recognition task, or using the glasses' button as a remote control for a phone feature.
  • Iterate the Events → Actions pattern: For each new capability, follow the same approach – subscribe to the relevant event from the glasses (or the companion app), then trigger your app logic. Over time, you'll build up a library of event handlers that make your app feel truly integrated with the wearable.
  • Replace placeholders with real workflows: Our first demo might have been a "Hello World" (like logging a message on tap). In a production app, you'll tie that into real functionality – e.g. tap = drop a map pin, or voice command = create a calendar entry. Start wiring up those connections to your backend APIs, database, or phone features.
  • Test with real users (beta): Once you have a couple of solid features, consider an internal beta test. Get feedback on the user experience: is it clear how to connect the glasses? Do users understand what they can do with the glasses in your app? Use that feedback to refine usability and fix any lingering bugs.

Need help hardening the integration or adding more features? Cybergarden can help you go from prototype → production. Our team has experience with native modules, event-driven architecture, testing at scale, and privacy-by-design. If you want to accelerate your Meta wearables project, feel free to reach out – we're here to help bring your vision (literally) to life on AI glasses!


FAQs

Q: Do I need preview access to use the toolkit?

A: Yes. The Meta Wearables Device Access Toolkit is currently in developer preview, which means you must be accepted into the preview programme to get the SDK and enable your glasses for third-party apps. Use the official sign-up process on the Meta developer site to request access (and ensure you're in a supported country). Once you have access, you can download the SDK and run the samples as shown above.

Q: Can I build with React Native or Flutter instead of native iOS/Android?

A: Absolutely – but with some extra effort. The toolkit is provided as a native iOS (Swift) and Android (Kotlin) library. To use it in React Native or Flutter, you'll need to create a thin native module that bridges between the SDK and your framework. The idea is to keep your business logic in the cross-platform layer, but handle the glasses connectivity in Swift/Kotlin. Both React Native and Flutter allow integrating custom native plugins, and the community or Meta may provide wrappers over time. For now, plan to write a small amount of platform-specific code to expose glasses events and actions to your shared code.

Q: What's the fastest path to a working demo?

A: Use the official sample app. The quickest way to see results is to install and run the provided sample (as we did in this guide) – this ensures your environment is set up correctly. Confirm that you can connect to the glasses and receive a basic event (like camera feed or a button press). Then, in that project (or your own test app), implement one simple event→action flow: for example, print a log or display an alert on the phone when you press the glasses button. With connectivity and one event working, you have an end-to-end demo. From there, you can incrementally build out real functionality (e.g. capturing a photo and doing something with it). Keeping the first demo small and focused also makes it easier to troubleshoot issues. Remember, even a "Hello World" that blinks a message on the phone when you tap your glasses can wow stakeholders – it proves the tech works. So start small, get that win, and then iterate.