How to Build Production-Ready Meta Glasses Apps with Security, Privacy & Australian Compliance

Author: Almaz Khalilov

TL;DR

  • You'll build: a secure companion app that connects to Meta's Ray-Ban smart glasses, capturing photos/audio hands-free while respecting user privacy and legal requirements.
  • You'll do: Get early access → Set up SDKs → Run official sample apps → Integrate the glasses SDK into your app with privacy safeguards → Test compliance on a real device and in mock scenarios.
  • You'll need: a Meta developer account (Wearables preview), Ray-Ban Meta smart glasses (or simulator), test phone (iPhone/Android), development tools (Xcode/Android Studio).

1) What is the Meta Wearables Device Access Toolkit?

What it enables

  • Hands-free camera & audio: Access the glasses' dual cameras and microphones from your mobile app, enabling first-person photo/video capture and voice input without using the phone. For example, an app can live-stream the wearer's viewpoint or record audio notes via the glasses.
  • Seamless integration with apps: Extend your existing mobile app's features into the physical world. The toolkit lets your app leverage the wearer's perspective and open-ear speakers to create immersive experiences (e.g. real-time POV streaming, AI scene analysis) that blend with everyday life.
  • On-device AI assistant (future): Meta's glasses include an AI assistant ("Hey Meta"). While third-party apps can't yet hook into Meta's voice assistant in this preview, you can still build AI features by sending camera imagery or audio to your own AI models. Voice command integration is expected in future SDK updates.

When to use it

  • Primary use cases: Use the toolkit when building apps that benefit from hands-free operation or first-person data capture. This includes social livestreaming, fitness and sports coaching, assistive apps for accessibility, navigation guides, and any scenario where users want to engage with content without holding a phone. If your app could offer a unique experience by "seeing" and "hearing" from the user's perspective, Meta's glasses provide that capability.
  • Secondary use cases: Experiment with context-aware AI - for instance, an app that gives real-time tips based on what the wearer is looking at (early partners like Disney are exploring guided tour apps using the glasses). Also consider enterprise uses like field service support or training, where remote experts see what the wearer sees. Always ensure the hands-free feature truly adds value for the user.
  • Privacy-critical contexts: Avoid or carefully design uses in sensitive environments. The glasses will capture personal information (faces, voices, surroundings) which is regulated as "personal data" under laws like Australia's Privacy Act. Only deploy these features when you can justify the data collection and obtain proper consent. For example, using glasses for employee monitoring demands strict compliance with workplace surveillance laws (e.g. written notice under NSW's Workplace Surveillance Act). In public-facing apps, encourage responsible use (Meta's user guide asks wearers to turn off glasses in sensitive areas and respect others' privacy).

Current limitations

  • Developer preview only: As of early 2026, the SDK is in closed preview: you can build and test within your team, but publishing apps to the public is restricted to select partners until general availability later in 2026. This means "production-ready" is about preparing your app's code, security, and compliance now, even if public release must wait.
  • Device and feature support: The first toolkit release supports Ray-Ban Meta smart glasses (camera, mic, speaker access) and upcoming Meta "Display" glasses for camera/audio only. No display/AR access is available yet, and advanced inputs like custom gesture recognition are not provided in this version. You can listen for basic events (touch taps, etc.) but not define new gestures or use the Neural Band interface.
  • No built-in Meta AI integration (yet): The glasses' onboard AI (like real-time chat via "Hey Meta") is not open to third-party apps in the initial SDK. Developers can implement their own AI processing by streaming data to external services, but this can impact battery and requires strong user privacy protections. Official Meta AI hooks are a "key area" for future updates.
  • Platform constraints: Apps must use Meta's companion app for pairing and permissions - the user must have the Meta AI smartphone app to connect the glasses, and your app interfaces through the toolkit. Also, regional availability is limited: if Meta's glasses aren't sold in a country, developer access isn't offered there. Ensure your target users are in supported markets (e.g. US, EU, AU). Lastly, some phone OS limitations apply (e.g. iOS may not allow continuous background camera streaming without user-visible indicators).

2) Prerequisites

Access requirements

  • Meta developer account: Sign up or log in to the Meta Wearables Developer Center using a Meta (Facebook) developer account. Set up your organisation and team in the portal.
  • Join the Wearables preview: Fill out Meta's access request form to join the Wearables Device Access Toolkit developer preview. You may need to agree to preview terms (NDA and platform policies regarding data use). Once approved, you'll get access to documentation, SDK downloads, and sample apps.
  • Project setup: Create a project/app entry in the Wearables Developer Center. This will generate identifiers or keys for your app. For example, you might obtain an App ID or API token and be prompted to download a config file (for iOS, a plist; for Android, a JSON or token) to include in your app. Done when: you have the necessary credentials (App ID, keys) and can see your project listed on the portal, ready for integration.

Platform setup

iOS

  • Development environment: Xcode 15+ with iOS 16 or later SDK. Ensure a physical iPhone (iOS 16+ device) for testing - required to fully utilize camera/mic and see iOS privacy indicators (the simulator won't show the camera-in-use dot).
  • Permissions setup: In your app's Info.plist, add usage description strings for camera, microphone, and Bluetooth. For example, NSCameraUsageDescription ("This app captures photos from your connected glasses"), NSMicrophoneUsageDescription ("Capture audio via smart glasses for voice notes"), and NSBluetoothAlwaysUsageDescription ("Connect to wearable glasses via Bluetooth"). iOS requires these strings so the user knows why your app needs these capabilities. A runtime permission-check sketch follows this list.
  • Package manager: Use Swift Package Manager (built into Xcode) or install CocoaPods if needed. The Meta Wearables SDK for iOS is distributed via Swift Package (GitHub URL) or as a CocoaPod. Have your package manager ready to fetch the SDK framework.
  • Device pairing: The iPhone should have the Meta AI companion app installed (from App Store) to pair with the glasses. Ensure Bluetooth is on and the phone is logged into the Meta app with the glasses connected at least once.
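
At runtime, don't assume the Info.plist prompts succeeded - check authorization state before starting a capture. A minimal sketch using standard AVFoundation (plain iOS API, independent of Meta's SDK; the helper name is ours):

import AVFoundation

// Checks (and if needed requests) camera access; the same pattern with
// .audio covers the microphone permission described above.
func ensureCameraAccess(_ onResult: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        onResult(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async { onResult(granted) }
        }
    default:
        onResult(false)  // denied or restricted: direct the user to Settings
    }
}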

Android

  • Development environment: Android Studio Flamingo or newer with the Android 13 (API 33)+ SDK. Use a physical Android phone (Android 13 or above) for best results; while an emulator can mimic some behavior, Bluetooth and sensor streaming work more reliably on real hardware.
  • Permissions in Manifest: Update your AndroidManifest.xml to include the required permissions:
    • <uses-permission android:name="android.permission.CAMERA" /> (even though the camera is external, your app declares that it captures imagery)
    • <uses-permission android:name="android.permission.RECORD_AUDIO" /> (for audio via the glasses mic)
    • <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (for Bluetooth connectivity to the glasses on Android 12+)
    • (If targeting Android 11 or lower and performing BLE scans, include ACCESS_FINE_LOCATION due to BLE requirements.)
  • Runtime permission code: Plan to request camera and audio permissions at runtime on Android 6.0+ and Bluetooth connect permission on Android 12+. Also, if your app will run the camera in background (e.g. recording while screen off), you must show a persistent notification by Android policy to inform users of background recording.
  • Build tools: Gradle 8+ and Kotlin 1.8+ (or Java 11+) are recommended. If Meta provides an AAR or Maven dependency for the SDK, you'll need to add Meta's Maven repository (or GitHub Packages) in your Gradle settings and include the SDK dependency. Have your gradle.properties ready for any auth tokens if required to fetch the package.

Hardware or mock

  • Meta wearable device: Ideally, have at least one pair of Ray-Ban Meta smart glasses (Gen 2, 2023) or equivalent Meta AI glasses for testing. This ensures you can test real-world performance, camera quality, and the user experience with LED indicators, etc. The device should be updated to the latest firmware via the Meta app.
  • Mock device kit: If you don't have hardware, use Meta's Mock Device Kit provided in the toolkit. This simulator lets you emulate a pair of glasses in software - you can simulate connection, camera input (perhaps by feeding sample images), and audio streams. While not a substitute for real-world testing, the mock mode is useful for initial development and CI automation.
  • Testing environment: Enable Bluetooth on your test phones and grant any system permissions for the Meta companion app as well. It's wise to test in a controlled environment especially with real glasses - for example, do early trials in an office or lab space to avoid accidentally recording strangers. Ensure any bystanders are aware of test recordings (use the built-in LED on the glasses as a transparency measure).

3) Get Access to Meta's Wearables SDK Preview

  1. Log in to the Portal: Go to the Meta Wearables Developer Center and sign in with your Meta developer account. Navigate to the Wearables section (you should see options for managing organisations, projects, and devices).
  2. Request preview access: Click on the Wearables Device Access Toolkit preview link. Submit the early access request form (this may ask for your use case description and agreement to terms). Meta will grant access if you meet the criteria (being in a supported country, etc.). Once approved, the Wearables SDK documentation and downloads become visible in the portal.
  3. Accept terms: Review any developer agreement or NDA for the glasses SDK. Accept the terms to proceed. This likely includes agreeing to handle data in compliance with privacy policies (Meta's and applicable law) - important given the sensitivity of camera/audio data.
  4. Create a project: In the Developer Center, create a new Wearables project for your app. Give it a name (and bundle ID/application ID if prompted). This registers your app and may generate credentials. For example, you might obtain an OAuth client ID or App ID specifically for glasses integration.
  5. Obtain credentials: After project setup, download any required config files or keys:
    • iOS: You might get a plist or entitlement file (e.g. a file containing your Project ID or API key) to add to your Xcode project. Add this to your app bundle as instructed.
    • Android: Note any API keys or secrets - for instance, you might need to put a key in your app's strings.xml or as a meta-data entry in the manifest, or configure a Gradle property for Maven access.
    • The Meta portal may also instruct you to add a callback URL or redirect scheme in your mobile app (so that the Meta companion app can hand off control after the user grants permission).
  6. Verify setup: Confirm that your project now appears on the portal dashboard and that you have what you need (app ID, keys, SDK download links). Done when: you possess the necessary credentials (such as an app token or entitlement) and can access the SDK downloads/documentation. At this point, you're ready to run the sample app and start coding.

4) Quickstart A — Run the Sample App (iOS)

Goal

Run Meta's official iOS sample app to verify that your glasses can connect and that camera features work, all while observing how the app handles permissions and data. This ensures your development environment is correctly set up and gives you a working reference implementation.

Step 1 — Get the sample

  • Option 1: Clone the repo. Clone the Meta wearables iOS SDK repository from GitHub: git clone https://github.com/facebook/meta-wearables-dat-ios.git. This repo includes an Example Xcode project.
  • Option 2: Download release. On the GitHub page's Releases section, download the latest sample app ZIP. Unzip it and locate the Xcode project (e.g., MetaGlassesSample.xcodeproj).
  • Open the sample project in Xcode. (If prompted, trust the project and resolve Swift packages to fetch the SDK dependency.)

Step 2 — Install dependencies

  • The sample may use Swift Package Manager to include the Meta Wearables SDK. When you open the project, Xcode should automatically fetch the package. If not, check Package.swift or Project Settings > Package Dependencies and add the package:
    • Package URL: https://github.com/facebook/meta-wearables-dat-ios (use the commit or version tag provided by Meta, e.g., 0.1.0).
    • Alternatively, if Meta provided a CocoaPods spec, run pod install in the sample's directory to install the SDK.
  • Ensure the Meta Wearables framework is properly linked in the project (check that the build can see the MetaWearables module).
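
If you later consume the SDK from a Swift package of your own rather than the sample's Xcode project, the manifest entry might look like this (a sketch: the target name and the "MetaWearables" product name are assumptions until you confirm them in Meta's docs):

// swift-tools-version:5.9
import PackageDescription

// Hypothetical manifest wiring in the SDK; the repository URL matches the one
// cited above, but the product name is an assumption.
let package = Package(
    name: "GlassesDemo",
    platforms: [.iOS(.v16)],
    dependencies: [
        .package(url: "https://github.com/facebook/meta-wearables-dat-ios", from: "0.1.0")
    ],
    targets: [
        .target(
            name: "GlassesDemo",
            dependencies: [
                .product(name: "MetaWearables", package: "meta-wearables-dat-ios")
            ]
        )
    ]
)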

Step 3 — Configure app

  • Insert config: If you received an app config file (e.g. MetaWearablesConfig.plist) from the portal, add it to the Xcode project (usually by dragging it into the project's Resources). The sample might come pre-configured, but double-check any API keys or project IDs in the code or Info.plist.
  • Bundle ID: Set the bundle identifier of the sample app to match the one you registered on the portal (in Xcode, under Target > General > Identity). This ensures the Meta backend recognizes your app. You may also need to update the provisioning profile for your device with this bundle ID.
  • Capabilities: Enable any required capabilities in Signing & Capabilities:
    • For example, turn on Background Modes > Audio, AirPlay, and Picture in Picture if you plan to keep connections alive when the app goes to the background (this allows audio streaming to continue).
    • If the SDK documentation requires any specific entitlements (such as Keychain sharing or Associated Domains for callbacks), add them here.
  • Privacy keys: Verify the Info.plist contains the usage descriptions you added in prerequisites. The sample likely includes placeholders - update the strings to clearly explain usage to testers (this is not just good practice but needed for App Store compliance).

Step 4 — Run

  1. Select target: In Xcode, select the sample app target (e.g. MetaGlassesSample) and choose a run destination (pick your iPhone device).
  2. Build & Run: Press Run to build the app and deploy it to your iPhone. The first build will also build the SDK package; it might take a minute.
  3. Launch: The app should launch on your phone. You may see initial prompts (iOS will ask for Bluetooth permission on first launch if the app uses Bluetooth APIs, and it will ask for microphone/camera permission when those features are first invoked).
  4. Grant permissions: Approve the Bluetooth permission so the app can discover the glasses. The camera/microphone permission prompts might appear later when you try capturing.

Step 5 — Connect to wearable/mock

  • Pair the glasses: Make sure your Ray-Ban Meta glasses are already paired to the Meta companion app on this iPhone. The sample app will typically detect the glasses through the SDK once the Meta app is running in the background. Some sample apps might have a "Connect" button - tap it to initiate connection.
  • Developer Mode: If required, put the glasses in developer mode (the Meta portal documentation describes how, e.g., using the Meta app to enable Developer Mode for testing). This allows third-party apps to stream data.
  • Use Mock (if no device): If you're using the Mock Device Kit, ensure the sample app is configured to use it (there may be a toggle, or the app might default to mock if no real device is found). You might need to run a simulator tool on your computer that the SDK provides.
  • Grant glasses permissions: When connecting, the Meta companion app might prompt the user to allow your app to access the glasses. For example, a dialog like "Allow MyApp to access Ray-Ban Meta glasses?" will appear. This is where the user grants your app permission to use the glasses' camera/mic. Approve this request to establish the link.

Verify

  • Connected status: The sample app should indicate when the glasses are connected (e.g. an on-screen status or icon). Verify that the app recognizes the device - often it will show the device name or a "Connected ✅" message.
  • Capture test: Trigger a camera action in the sample app. For instance, tap a "Take Photo" button in the UI. The glasses' camera should activate - check that the LED on the glasses lights up to signal recording (important for privacy). After a moment, an image thumbnail or preview should appear in the app, confirming the round-trip of capture -> transmit -> receive.
  • Audio test: If the sample has an audio feature (like recording a short voice clip or using the glasses' microphone for voice commands), try that as well. You should see an indication that audio is streaming or a transcript if the sample processes it.
  • Data handling: Ensure that any captured photo or audio is either stored within the app sandbox or just displayed (the sample shouldn't be uploading data externally unless configured to do so). This is more of a sanity check for privacy - no unexpected network calls.

Common issues

  • Build errors: If you get a code signing error, make sure you updated the bundle ID and have a valid provisioning profile for your device. For missing libraries, verify the Swift Package or CocoaPods integration succeeded (run pod install if needed, or reset the package cache). Clean build if Xcode has trouble finding the Meta SDK module.
  • No device found: If the app says "No glasses found" or times out, ensure the Meta companion app is open and your glasses are on and connected via Bluetooth. Sometimes killing and reopening the Meta app, or toggling Bluetooth, helps. Also confirm your iPhone is on a supported iOS version and the glasses firmware is updated.
  • Permission denied: If the camera or mic features aren't working, check Settings -> Privacy on the iPhone. The app should have Camera and Microphone permission toggled on. If not, you may have denied it accidentally - enable them and try again. Similarly, ensure Bluetooth permission is on (Settings -> Privacy -> Bluetooth). The app may need a restart after changing these.
  • App to Meta app handoff issues: If tapping "Connect" switches to the Meta app or an OS prompt but nothing returns, the URL callback might be misconfigured. Verify that the Info.plist has any required URL scheme that Meta's app uses to return to yours. Without it, the handoff might fail silently.

5) Quickstart B — Run the Sample App (Android)

Goal

Run the official Android sample app to ensure your glasses work with Android as well. This helps verify the Android integration (BLE connection, permissions, etc.) in a controlled example, so you can confidently add it to your own app.

Step 1 — Get the sample

  • Clone the repo: git clone https://github.com/facebook/meta-wearables-dat-android.git (assuming similar naming; check Meta's documentation for the exact repo). Open the project in Android Studio.
  • Import project: If you downloaded a ZIP, use File > Open in Android Studio and select the sample app's build.gradle. Let Gradle sync the project.
  • The sample likely contains a module (app) with some example code to connect to glasses and perform a capture.

Step 2 — Configure dependencies

  • Add Meta Maven repo: Meta may supply the Android SDK as a Maven package (perhaps via GitHub Packages). In the project's root build.gradle or settings, add the repository. For example:

    repositories {
        maven { url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android" }
    }

    If the repository is private, generate a GitHub personal access token, add it to your ~/.gradle/gradle.properties as gpr.user and gpr.key, and reference it from a credentials block:

    repositories {
        maven {
            url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android"
            credentials {
                username = findProperty("gpr.user")
                password = findProperty("gpr.key")
            }
        }
    }
  • Add SDK dependency: In the app module's build.gradle, add the Wearables SDK dependency:

    implementation("com.meta.wearables:device-access-toolkit:0.1.0")

    (The actual group ID/artifact may differ; use the coordinates provided by Meta's docs.)
  • Sync Gradle: Sync the project. Gradle should fetch the Meta SDK. If it fails with authentication errors, double-check your token setup or whether you have access. If the SDK is distributed as an AAR in the repo, you might instead have to drop the AAR into the project libs.

Step 3 — Configure app

  • Application ID: Set the applicationId in app/build.gradle to the one you registered on the portal (e.g. "com.yourcompany.glassesdemo"). This should match the project you created in the Meta portal.
  • Update manifest: Insert the required permissions in AndroidManifest.xml if not already present:
    • <uses-permission android:name="android.permission.CAMERA" />
    • <uses-permission android:name="android.permission.RECORD_AUDIO" />
    • <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (and if needed <uses-permission android:name="android.permission.BLUETOOTH" /> for older Android versions)
    • If the sample app is well-configured, these may be pre-included. Just confirm they're there to avoid missing permission issues at runtime.
  • Meta config (if any): If Meta provided an API key or config JSON for Android, place it in app/src/main as instructed (for instance, res/values/meta_wearables.xml or similar). Some SDKs require a meta-data tag in the manifest, e.g.:
<application>
    <meta-data
        android:name="com.meta.wearables.ApiKey"
        android:value="@string/meta_api_key" />
</application>

Check the sample's manifest or documentation for such requirements.

  • Gradle settings: Ensure the minSdkVersion and targetSdkVersion meet Meta's requirements (likely minSdk 24 or higher given BLE and camera usage). The sample should already have correct settings, but align them with your needs if you copy these into your app.

Step 4 — Run

  1. Select configuration: In Android Studio, select the sample app run configuration. Choose a deployment target - connect your Android phone via USB (with USB debugging on) or use Wi-Fi debugging.
  2. Launch app: Click Run. The app will build and install on your device. Watch for any runtime permission prompts on first launch.
  3. Permissions: Approve any prompts:
    • Bluetooth: On Android 12+, you'll get "Allow AppName to find and connect to devices?". Grant "Allow" so it can communicate with the glasses.
    • Camera/Microphone: If the sample tries to access these (it might when you initiate a capture or audio stream), you'll see "Allow AppName to take pictures and record video?" and a similar prompt for audio. Approve these for full functionality.
  4. Initial setup: Some samples might show an onboarding screen. If it asks for a "Glasses ID" or to log into Meta, follow the instructions (most likely, the SDK handles pairing via the Meta app, so you shouldn't need to log in within the sample app itself).

Step 5 — Connect to wearable/mock

  • Ensure pairing: On your Android phone, make sure the Ray-Ban Meta glasses are paired via the Meta app (install it from the Play Store and set up the glasses if not done yet). The glasses should appear as a connected Bluetooth device (for audio) and be recognized by the Meta app.
  • Connect via sample: In the sample app, tap the "Connect Glasses" button (or similar UI element). The app will likely initiate a handshake with the Meta companion service. You might briefly see a Meta consent screen pop up to authorize your app.
  • Grant access: The Meta system will prompt "Allow this app to access your glasses?" - tap Allow. This grants the app the necessary session token to use the glasses. Once allowed, the sample app should show a connected status.
  • Use Mock mode: If no physical device is available, ensure the sample is configured to simulate one. This might involve toggling a developer setting or running the mock service on your development machine that the app connects to. Follow Meta's guide for using the Mock Device on Android (it could involve the phone connecting to a dev server acting as fake glasses).

Verify

  • App shows Connected: The sample app should clearly indicate when the glasses are connected (e.g. "Glasses Connected" message or icon). On Android, you might also see a tiny glasses icon in the status bar if the companion app is active.
  • Camera capture works: Press the sample app's camera trigger (if available). The glasses should snap a photo (check that the glasses' camera LED lights up). After a moment, the image should appear in the app UI. This confirms the camera feed traveled from glasses to phone. If the sample app saves to gallery, check your phone's photos for a new image.
  • Audio works: If applicable, test an audio feature. Speak a short phrase near the glasses after starting a recording via the app. You may hear the glasses' shutter sound for photo or a start tone for recording. The app might play the recorded audio through the glasses' speakers or transcribe it. Ensure your voice was indeed picked up (some sample apps log the text or audio waveform).
  • Error handling: Try an edge case: turn off the glasses while connected to see if the app detects the disconnection gracefully (it should update status to "Disconnected" or prompt you). This tests that the sample handles sudden device loss, which you'll want in your app too.

Common issues

  • Gradle authentication error: If Gradle failed to fetch the SDK (HTTP 401 or similar), double-check your repository credentials. Ensure your GitHub token (or Meta-provided credentials) is correct and has access. You might need to use a Gradle maven { url "..."; credentials {...} } block as shown in Meta's documentation to access the private package.
  • Manifest merger conflict: If you integrated the sample into another app or added permissions, you might see merge errors (e.g. <uses-permission android:name="..."> already added). Resolve duplicates by ensuring each permission is declared only once. Also, if the Meta SDK's manifest has application attributes (like provider authorities), you might need to set tools:replace or adjust to avoid conflicts.
  • Device connection timeout: If the sample app cannot find the glasses, confirm that your phone's Bluetooth is on and the glasses are powered. On Android, also enable Location if you're on an OS version that ties BLE scanning to location permission (Android 11 or lower). If using an emulator, know that emulators typically can't do Bluetooth - use a real device. For physical devices, sometimes toggling Bluetooth or rebooting the glasses can help. As a last resort, unpair and re-pair the glasses using the Meta app, then retry the sample.

6) Integration Guide — Add Meta Glasses Support to an Existing Mobile App

Goal

Integrate the Meta Wearables SDK into your own app and implement one end-to-end feature (e.g. take a photo with the glasses and display it in-app). We'll structure the integration to maintain clean separation of concerns, ensuring the feature is production-quality in terms of stability, security, and privacy compliance.

Architecture

Your app will interact with the glasses through an SDK client, while preserving user privacy:

  • Mobile App UI → Wearables SDK Client → Meta Glasses (device), with Callbacks/Events feeding back into App Logic/Storage

In practice, this means your app's UI triggers actions on a WearablesClient (your wrapper around Meta's SDK). The client sends commands to the glasses (via the Meta companion app behind the scenes). As data comes back (e.g. an image or status update), the SDK uses callbacks or events which your wrapper translates into app-state updates (e.g. saving the photo, updating the UI). This decoupling keeps your code organized and allows adding logging or permission checks at the client layer.

Step 1 — Install the SDK

iOS

  • In Xcode, add the Meta Wearables SDK to your project. If using Swift Package Manager, go to File > Add Packages… and enter the GitHub URL (e.g. facebook/meta-wearables-dat-ios). Select the latest version tag. Xcode will add the package as a dependency.
  • Alternatively, if Meta provided a CocoaPod, add pod 'MetaWearables', '~> 0.1' to your Podfile and run pod install. Ensure the pod is integrated (check that you can import MetaWearables in code).
  • After installation, locate any example code or documentation within the package (it might include a README or header docs) to verify the SDK classes are accessible.

Android

  • Open your app's build.gradle and add the SDK in the dependencies section:

    implementation("com.meta.wearables:device-access-toolkit:0.1.0")

    (Use the correct version from Meta - in preview it might be 0.1.x or a specific snapshot.)
  • Add Meta's Maven repo if not already in your project-level build.gradle:

    maven { url "https://maven.pkg.github.com/facebook/meta-wearables-dat-android" }

    plus credentials if required (as done in the sample). This lets Gradle download the SDK AAR.
  • Sync the project. Confirm you can reference the SDK in code (e.g., try writing MetaWearables.connect() or the equivalent class - use auto-import to see if the library is recognized).

Step 2 — Add required permissions

iOS (Info.plist)

Add or verify the following keys in your Info.plist with human-readable justification (these should already be present from earlier steps, but double-check):

  • NSCameraUsageDescription: e.g. "Allow this app to use the connected glasses' camera to take photos and videos."
  • NSMicrophoneUsageDescription: e.g. "Allow this app to capture audio through your smart glasses for voice notes or video audio."
  • NSBluetoothAlwaysUsageDescription: e.g. "Allow Bluetooth connection to smart glasses accessories." (Bluetooth is used to communicate with the glasses).
  • Privacy tip: Keep the descriptions concise and honest. During App Store review, Apple will check that your usage aligns with the description. Since the camera/mic are on a wearable, mention the glasses explicitly to avoid confusion.

Android (AndroidManifest.xml)

Make sure these lines appear in your manifest's <manifest> section (if you haven't already added them):

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />

For Android 12+, also include <uses-permission android:name="android.permission.BLUETOOTH_SCAN" /> if your app actively scans for the glasses (though if relying on the Meta app pairing, you might not need to scan). And if your integration might access media or storage (e.g. to save photos), don't forget file system permissions on older Android, or use the MediaStore APIs on newer versions.

Note: The Meta glasses themselves have LED indicators and hardware controls to protect privacy, but as a developer you should also respect privacy in software. Only activate the camera/mic when the user triggers it, and stop immediately after. We'll cover more privacy checks in the next steps.

Step 3 — Create a client wrapper

Rather than peppering your code with direct SDK calls, create a dedicated module/class to manage the wearable. This helps abstract the complexities (and makes it easier to update when the SDK changes). Suggested components:

  • WearablesClient (class): This will handle connecting to and disconnecting from the glasses, and track the connection state. It might expose methods like connect(), disconnect(), and properties like isConnected. Internally, it will use the Meta SDK's connect method and listen for status events (e.g. via delegate or listener). A minimal sketch follows this list.
  • FeatureController or Service: For each major feature (e.g. camera, audio), consider a controller. For instance, a PhotoCaptureService with a method capturePhoto() that uses the SDK to trigger the glasses camera, and a callback to handle the image when received. Similarly, an AudioNoteService could handle recording via the glasses mic.
  • PermissionsManager: Although the phone OS will handle user prompts, it's wise to centralize permission checks in your code. This manager can check AVCaptureDevice.authorizationStatus(for: .video) on iOS or ContextCompat.checkSelfPermission on Android for camera/mic, and prompt if not granted. It can also wrap the logic to ensure the Meta companion app's permission (glasses access) is obtained.
  • Secure Data Handling: Plan how to handle the data coming from the glasses. For example, if a photo is captured, will you store it on device, upload to a server, or just show and discard? Given privacy, prefer processing data on-device when possible. If you must upload (e.g. for an AI analysis), encrypt the data in transit (HTTPS) and avoid storing identifiable content longer than necessary. Implement any deletion or anonymization policies required by Australian law for personal data.
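
A minimal sketch of the wrapper, assuming a hypothetical Meta SDK surface - every SDK-facing line here is a placeholder to swap for the real API once you have the preview docs:

import Combine
import UIKit

// Hypothetical error type for the wrapper layer.
enum WearableError: Error {
    case notConnected
    case timeout
}

// Minimal wrapper around the (hypothetical) Meta SDK entry points. Publishing
// isConnected lets SwiftUI or UIKit observers react to connection changes.
final class WearablesClient: ObservableObject {
    @Published private(set) var isConnected = false

    func connect() {
        // Placeholder: hand off to the SDK / Meta companion app here and set
        // state from its status callback rather than unconditionally.
        isConnected = true
    }

    func disconnect() {
        // Placeholder for the SDK's disconnect call.
        isConnected = false
    }

    func capturePhoto(completion: @escaping (Result<UIImage, Error>) -> Void) {
        guard isConnected else {
            completion(.failure(WearableError.notConnected))
            return
        }
        // Placeholder: forward to the SDK's capture API and map its result
        // into Result<UIImage, Error> for the UI layer.
    }
}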

Definition of done:

  • Your app can initialize the SDK (e.g. set up any required listeners or sessions) on startup or on-demand without crashes.
  • The user can trigger a connection (like tapping "Connect Glasses") and the WearablesClient establishes the link (the Meta app might briefly open then return control). The UI updates to "Connected" state. The app handles disconnects (e.g. if glasses go offline, you update UI to "Disconnected" and possibly attempt reconnection or prompt the user).
  • All sensitive operations are gated behind user actions and permissions. For instance, the app doesn't automatically start recording; it waits for user input. If any error occurs (failure to connect, permission denied, device error), your app surfaces this to the user in a friendly way (toast or dialog) and logs it for your debugging.

Step 4 — Add a minimal UI screen

Design a simple interface to let the user interact with the glasses (a minimal SwiftUI sketch follows this list):

  • Connect button: A button, e.g. "Connect to Meta Glasses." When pressed, initiate the connection via your WearablesClient. While connecting, you might show a loading spinner. Once connected, this button could turn into "Disconnect" or be disabled.
  • Status indicator: A small label or icon that shows the current status (Disconnected / Connected / Connecting…). This helps the user know if they need to pair or if everything is ready. You can also show the device name or battery if the SDK provides it.
  • Feature trigger: For example, a "Capture Photo" button. This calls your PhotoCaptureService.capturePhoto(). While the capture is in progress, you can give feedback (e.g. change the button to "Capturing…").
  • Result display: A UIImageView (iOS) or ImageView (Android) to display the latest photo taken, or a gallery of thumbnails. For audio, this could be a playback button or transcript text area. Ensure you have the rights (permissions) to show or store this content; since it may include personal images, treat it carefully (if your app saves the photo, mention it in your privacy policy and perhaps allow the user to delete it).
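
Here is one way the screen could look in SwiftUI, built on the hypothetical WearablesClient from Step 3 (not a type from Meta's SDK):

import SwiftUI

// Minimal sketch of the screen described above: connect button, status
// indicator, feature trigger, and result display.
struct GlassesScreen: View {
    @StateObject private var client = WearablesClient()
    @State private var lastPhoto: UIImage?
    @State private var isCapturing = false

    var body: some View {
        VStack(spacing: 16) {
            // Status indicator
            Text(client.isConnected ? "Connected ✅" : "Disconnected")
            // Connect / Disconnect button
            Button(client.isConnected ? "Disconnect" : "Connect to Meta Glasses") {
                if client.isConnected { client.disconnect() } else { client.connect() }
            }
            // Feature trigger, disabled until connected
            Button(isCapturing ? "Capturing…" : "Capture Photo") {
                isCapturing = true
                client.capturePhoto { result in
                    DispatchQueue.main.async {
                        isCapturing = false
                        if case .success(let image) = result { lastPhoto = image }
                    }
                }
            }
            .disabled(!client.isConnected || isCapturing)
            // Result display
            if let photo = lastPhoto {
                Image(uiImage: photo).resizable().scaledToFit()
            }
        }
        .padding()
    }
}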

By the end of this integration, you should be able to build and run your app, connect to the glasses, and trigger at least one end-to-end feature (like taking a photo and seeing it in your app). Moreover, your app should adhere to security best practices (using encryption, not exposing data) and Australian privacy compliance (only collecting what's necessary and informing the user). In the next section, we'll detail a specific feature implementation with those considerations.


7) Feature Recipe — Trigger Photo Capture from Wearable into Your App

Goal

Implement a full cycle: user taps a "Capture" button in your app → the glasses take a photo → the photo is transmitted to your app → your app displays the image and optionally saves it. We'll include privacy checks (like ensuring consent) and error handling.

UX flow

  1. Precondition: Glasses are connected (if not, the UI should prompt the user to connect first).
  2. User taps "Capture Photo." This triggers the capture sequence.
  3. Show progress: Immediately provide feedback — e.g. overlay "Capturing…" or a spinner — as the glasses LED will turn on during capture. This keeps the user informed, especially since capture might take a second or two.
  4. Photo taken: The glasses capture the image and send it to your app via the SDK. You receive it in a callback.
  5. Display result: Your app receives the photo (likely as an image buffer or file path). Remove the "Capturing…" state and display a thumbnail or full-screen preview of the image. Also, save the image to app storage or gallery if that's a feature.
  6. Acknowledgement: Optionally, show a brief "Saved ✔️" message or similar confirmation to the user, so they know the action succeeded.

Implementation checklist

  • Connected state verified: In your capture button's handler, first check if (!wearablesClient.isConnected) { showAlert("Please connect your glasses first."); return; }. This prevents trying to capture without a device.
  • Permissions verified: Ensure camera/mic permissions are granted. If not, trigger your PermissionsManager to request them. Also, consider a check if the Meta companion app's permission for your app is active (if the SDK provides a call to check authorization status).
  • Issue capture request: Call the SDK's capture method. For example, wearablesClient.capturePhoto(), which internally might call something like MetaWearables.captureImage() (the exact API depends on Meta's SDK). Wrap this in a try-catch or success/failure callback.
  • Timeout & retry: Start a timer (e.g. 5-10 seconds) in case you don't get a response. If the timer elapses with no photo, cancel the request (if possible) and notify the user that it failed. Maybe offer a "Retry" option. (Failures could happen if Bluetooth drops or if the user's glasses storage is full, etc.) A timeout sketch follows the pseudocode below.
  • Receive result: Implement the callback/listener for photo data. When invoked, you get an image (as NSData/UIImage on iOS or as a ByteArray/Bitmap on Android, for instance). Immediately store it to a secure location:
    • For example, save to the app's cache directory or photo library (with user's consent).
    • If saving to gallery, ensure you have permission on Android (scoped storage or MediaStore write permission).
  • Update UI: On the main thread, set the ImageView to display the new image thumbnail. Clear the "Capturing…" overlay.
  • Post-capture cleanup: Stop any loading indicators. If you had disabled the capture button during progress to prevent double clicks, re-enable it now for the next capture.
  • Logging: Log an event "photo_capture_success" along with metadata (timestamp, maybe file size). If the result is instead a failure (exception or error callback), log "photo_capture_fail" with error info and show the user an error message ("Capture failed, please try again.").

Below is a pseudocode illustration for an iOS-like environment:

func onCaptureButtonTapped() {
    // 1. Never attempt capture without a live connection.
    guard wearablesClient.isConnected else {
        showMessage("Connect your glasses first.")
        return
    }
    // 2. Verify phone-side camera/mic permissions before touching the device.
    guard permissionsManager.allPermissionsGranted() else {
        permissionsManager.requestPermissions()
        return
    }
    // 3. Give immediate feedback - the glasses LED will light during capture.
    showLoading("Capturing...")
    wearablesClient.capturePhoto { result in
        // SDK callbacks may arrive off the main thread; hop back before touching UI.
        DispatchQueue.main.async {
            hideLoading()
            switch result {
            case .success(let photo):
                savePhotoToAppDirectory(photo)   // app sandbox, not a public gallery
                imageView.image = photo
                showMessage("Photo saved ✅")
            case .failure(let error):
                showMessage("Capture failed: \(error.localizedDescription)")
                logError("capturePhoto", error)
            }
        }
    }
}

(The exact API will differ; adjust for Android with coroutines or callbacks as needed.)
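
The timeout step from the checklist can be layered on top of the same hypothetical wrapper. A sketch, assuming the capturePhoto signature and WearableError type from the earlier snippets:

import UIKit

// Wraps capturePhoto with a deadline: whichever finishes first (result or
// timeout) wins, and the late arrival is ignored.
func capturePhotoWithTimeout(_ client: WearablesClient,
                             seconds: TimeInterval = 10,
                             completion: @escaping (Result<UIImage, Error>) -> Void) {
    var finished = false
    let timeoutWork = DispatchWorkItem {
        guard !finished else { return }
        finished = true
        completion(.failure(WearableError.timeout))
    }
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds, execute: timeoutWork)

    client.capturePhoto { result in
        DispatchQueue.main.async {
            guard !finished else { return }  // timeout already fired; drop the late result
            finished = true
            timeoutWork.cancel()
            completion(result)
        }
    }
}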

Troubleshooting

  • Capture returns empty: If the callback returns success but with an empty image or null data, check the logs. It could be that the glasses were not actually in capture mode. This might happen if the glasses weren't ready or if the user's permission wasn't properly granted. Solution: ensure the glasses are awake (some have auto-off after idle - ask the user to touch the frame or try again) and double-check the Meta permission flow. Also verify your app is using the correct API call for photo (e.g. not a thumbnail API by mistake).
  • Capture hangs or times out: If your request never returns, the issue could be a lost Bluetooth connection or a bug in the preview SDK. Implement your own timeout logic - e.g. after 10 seconds, assume failure and allow the user to cancel. When timing out, try a fresh reconnect to the glasses before retrying, as the connection may have dropped silently.
  • "Instant display" expectations: Users might expect the photo to appear immediately. However, there's inherent latency sending data from the glasses to phone. Manage this by using a placeholder UI - e.g., a blurred preview or an animation - to indicate processing. You could even play a camera shutter sound (if appropriate) when the glasses LED flashes, to give immediate feedback. These small UX touches reassure the user that something is happening while the image is on its way.

By following this recipe, you implement a core feature in a way that is user-friendly, handles errors, and respects privacy (notice we always informed the user of the capture and only captured on user action, aligning with privacy-by-design principles).


8) Testing Matrix

Test your app under various scenarios to ensure robustness and compliance:

  • Mock device (simulator) - expected: feature works with simulated data. Use the Mock Kit to run CI tests - e.g., simulate a photo capture and ensure your app handles the callback.
  • Real device (close range) - expected: low latency, reliable connection. In ideal conditions (phone and glasses near, strong Bluetooth), captures should succeed quickly (under 2s). This is your baseline for user experience.
  • Real device (next room) - expected: possibly reduced connection quality. Test range limits. The capture might take longer or fail if the wearer walks away. Your app should handle a disconnect gracefully (e.g., show "Disconnected" instead of hanging).
  • Background / lock screen - expected: defined behavior (no unexpected recording). On iOS, if the app goes to the background, the camera should stop (by OS design). Verify that if the user locks the phone during capture, either the capture completes or is aborted safely. On Android, if you allow background operation, ensure a notification is shown (as per Android policy) and the app still receives data.
  • Permission denied (user) - expected: clear error message, feature is blocked. If the user denies camera or mic permission on the phone, the app should not attempt capture. It should explain that permission is needed. Similarly, if the user didn't allow the glasses access for your app, detect this and prompt them to enable it (perhaps by re-triggering the Meta consent flow).
  • Data privacy check - expected: no unintended data leaks. After a capture, ensure the photo is stored only where intended. For instance, it should not go to a public gallery without user action. Also verify that if your app uploads images to a server, they are encrypted in transit (HTTPS) and you have user consent per your privacy policy.
  • Disconnect mid-action - expected: graceful cancellation, no crashes. If the user takes off or powers off the glasses during a capture, your app should handle the "device disconnected" event. The capture result will fail - make sure you stop any loaders and inform the user ("Glasses disconnected. Please reconnect and try again."). There should be no crashes or infinite waits.
  • Multiple captures in a row - expected: consistent results, no memory leaks. Have a test where a user captures, say, 5 photos in succession. The app should handle it (perhaps queue if the SDK requires sequential operation). Check that each photo displays correctly and memory is freed (no crashes due to large images).

By covering these scenarios, you not only ensure functionality but also build trust - your app behaves predictably and securely under all conditions, which is crucial for a product dealing with cameras and personal data.


9) Observability and Logging

Implement logging to monitor how your app interacts with the glasses. This helps in debugging and also in demonstrating compliance (you have an audit trail of sensitive operations):

  • Connection events: Log when you start connecting (connect_start), when the connection succeeds (connect_success with timestamp and device ID), and failures (connect_fail with error codes). This can help identify if connections drop frequently.
  • Permission state: Record if the user has granted or denied key permissions. E.g., permission_status_camera = granted/denied. If denied, you might log each time the user hits "Capture" and you had to show a permission rationale - useful for UX improvements.
  • Feature usage: For each major action (photo, audio, etc.), log start and end:
    • photo_capture_start (with an ID for the request),
    • photo_capture_success (with file size, maybe image resolution),
    • or photo_capture_fail (with error info like timeout or user cancellation). Do similarly for audio or any other feature (audio_record_start/stop, etc.).
  • Performance metrics: Log how long operations take. For instance, measure the time from capture request to photo received - log photo_capture_duration_ms. If certain operations consistently take too long, you might optimize or at least inform the user (e.g., "This may take up to 5 seconds").
  • Reconnection attempts: If your WearablesClient auto-reconnects when connection is lost, log an event auto_reconnect_attempt and whether it succeeded. This can surface issues with stability or the need for better reconnection logic.
  • Security audits: Consider logging any unusual events, like if the data from the device doesn't match expected formats (could indicate a bug or, in a far-fetched scenario, a man-in-the-middle intercept - the connection is encrypted by Meta's protocol, but adding checksums/hashes for received files can add assurance).
  • Storage and deletion: If your app automatically deletes old photos or transfers them off the device (for cloud backup), log those events too (photo_deleted_after_upload etc.), especially if required by a data retention policy.

Make sure your logging respects user privacy: do not log actual image content or personal data. Stick to metadata (timestamps, statuses). If using analytics platforms, be mindful not to inadvertently send sensitive info. Logging should aid development and compliance verification without exposing users.
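
As a sketch of what metadata-only logging can look like on iOS (os.Logger is standard Apple API; the event names follow the conventions above, and the wrapper type is ours):

import Foundation
import os

// Privacy-safe event logging: statuses, durations, and sizes only - never
// image or audio content, and never bystander-identifying data.
struct WearableEventLogger {
    private let logger = Logger(subsystem: "com.example.glassesapp", category: "wearables")

    func log(_ event: String, metadata: [String: String] = [:]) {
        let fields = metadata.map { "\($0.key)=\($0.value)" }.joined(separator: " ")
        logger.info("\(event, privacy: .public) \(fields, privacy: .public)")
    }
}

// Usage: timing a capture round-trip without logging the photo itself.
let events = WearableEventLogger()
let started = Date()
events.log("photo_capture_start", metadata: ["request_id": "42"])
// ... capture completes ...
let elapsedMs = Int(Date().timeIntervalSince(started) * 1000)
events.log("photo_capture_success", metadata: ["duration_ms": "\(elapsedMs)"])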


10) FAQ

  • Q: Do I need hardware to start developing? A: Not initially. Meta provides a Mock Device Kit that simulates the glasses so you can begin development without physical hardware. You can pair a virtual device, simulate camera output, and test your app's logic. However, before production you'll want to test on a real device like Ray-Ban Meta glasses to ensure the user experience (camera quality, connectivity) meets your expectations.
  • Q: Which smart glasses are supported by this SDK? A: The toolkit currently supports Meta's AI glasses lineup - primarily the Ray-Ban Meta Smart Glasses (both the first-gen Ray-Ban Stories and the new Ray-Ban Meta glasses with display) and the equivalent Oakley models in Meta's program. The core features (camera, mic, speakers) are accessible on these devices. As Meta releases new wearables (like those with displays or neural bands), the SDK will likely expand to cover them. Always check the latest docs for device compatibility.
  • Q: Can I ship a Meta glasses integrated app to production now? A: Not to general users yet. During the current developer preview, only select partners have rights to publish their integrations to the public app stores. For example, Meta has allowed a few early apps (like some from Microsoft, Streamlabs, etc.) to go live for testing. For most developers, you can use the SDK in internal tests or closed pilots, but broad release will have to wait until Meta opens up general availability in 2026. Keep an eye on Meta's announcements - once the preview phase ends, you should be able to release your app publicly, provided it complies with Meta's policies.
  • Q: What about privacy and legal compliance in Australia? A: Australian law has strict requirements when dealing with camera and audio data. Ensure your app follows the Privacy Act 1988 (Cth) if applicable - this means obtaining consent for collecting personal information, clearly disclosing what you collect and why (update your Privacy Policy), and securing the data. For example, if your glasses app records customers in a retail store, you may need signage to inform the public. Also, if deploying in workplaces, comply with state surveillance laws (e.g., give written notice to employees in NSW). Technically, Meta's hardware helps with privacy (LED recording light, etc.), but as a developer you are responsible for privacy by design in your software. Only capture what is necessary (data minimisation), and use encryption for data in transit and at rest. It's wise to conduct a Privacy Impact Assessment and consult legal advisors if your app deals with sensitive scenarios.
  • Q: Can my app push content or notifications to the glasses? A: Limited at this stage. The current SDK is mostly about accessing the glasses' sensors (camera, mic) from the phone. You cannot yet send images or video to display on the glasses' HUD because the display API isn't open in this preview. You also cannot override core behaviors (like disabling the LED or changing the voice assistant). The glasses will play audio from the phone by default for calls or media, so you can output sound (e.g., text-to-speech directions) through the normal Bluetooth audio channel. But richer AR content push will have to wait for future updates when Meta enables developers to draw to the glasses' display. In short, current integrations are one-way (glasses -> app) for sensor data, with the exception of audio which is two-way via standard protocols.
  • Q: How secure is the connection between the glasses and my app? A: Meta's wearable platform uses encryption for data transmitted between the glasses and the phone. The "Meta AI" companion app manages the low-level connection, ensuring that things like live video streams or audio are encrypted in transit (e.g., BLE encryption or higher-level encryption over Wi-Fi, if applicable). As a developer, you should still use secure practices: trust but verify. That means keep your SDK updated (Meta will patch vulnerabilities via firmware or SDK updates), do not send sensitive data unencrypted over your own network calls, and don't store data in plaintext on the device. By using the official SDK and companion app, you benefit from Meta's built-in security measures such as authentication and encryption handshakes.

11) SEO Title Options

  • How to Get Access to Meta's Smart Glasses SDK and Run the Sample App (iOS & Android)
  • Integrate Meta Glasses into an Existing Mobile App: Step-by-Step Guide
  • How to Trigger Photo Capture from Ray-Ban Meta Smart Glasses in Your App
  • Meta Glasses SDK Troubleshooting: Pairing, Permissions, and Build Errors

(Recommended SEO Title: 'How to Get Access to Meta's Smart Glasses SDK and Build a Secure, Privacy-Compliant App' — this combines the 'how to' with security/privacy keywords for the Cybergarden audience.)


12) Changelog

  • 2026-01-17 - Verified with Meta Wearables Device Access Toolkit preview (December 2025 release). Tested on iOS 17 (Xcode 15) and Android 14, using Ray-Ban Meta Smart Glasses (Gen 2) and Mock Device Kit. Updated privacy law references per Australian regulations as of 2025.