How to Reduce Battery Drain & Latency by 60% with Performance Optimisation on Ray-Ban Meta Smart Glasses

Author: Almaz Khalilov
TL;DR
- You'll build: A more efficient smart glasses experience that runs longer and responds faster (targeting up to ~60% improvement in battery life and latency).
- You'll do: Get SDK access → Run a baseline sample → Identify battery/lag hotspots → Apply optimisation best practices → Test improvements on device/mock.
- You'll need: Meta developer account, Ray-Ban Meta smart glasses (or emulator kit), smartphone (iOS or Android), and an IDE (Xcode/Android Studio) with the Meta Wearables SDK.
1) What is Performance Optimisation for Meta Glasses?
What it enables
- Extended Usage Time: Smart optimisations can nearly double the glasses' runtime. For instance, disabling always-on voice commands extended real-world use from ~3–4 hours to 5–8 hours. Combined with efficient hardware (Gen2 glasses have 2× the battery life of Gen1), developers can achieve ~60% less battery drain in active use.
- Lower Latency Interactions: Offloading AI processing to the phone (or edge) cuts cloud round-trip delays. By handling tasks locally, apps avoid network lag, resulting in snappier voice responses and video streams (Gen2 glasses also improved A/V sync latency for a smoother experience).
- Stable, Cooler Performance: Intelligent connection management prevents overheating and throttling. The Wearables SDK lets you manage sessions based on battery, thermals, and data needs, so you can maintain performance without draining power or making the glasses warm.
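The preview SDK doesn't expose a single low-power switch, so in practice you build this policy yourself on the phone side. Here's a minimal Swift sketch of the idea; the quality tiers and the `glassesBatteryPct` input are assumptions of this guide, while `ProcessInfo.thermalState` is a standard iOS API:

```swift
import Foundation

// Illustrative quality tiers you might map onto your stream settings.
enum StreamQuality { case high, medium, low }

// Pick a quality tier from the phone's thermal state plus the glasses'
// battery level (however your app tracks it). Hotter or emptier = lower tier.
func recommendedQuality(glassesBatteryPct: Int) -> StreamQuality {
    switch ProcessInfo.processInfo.thermalState {
    case .serious, .critical:
        return .low                                   // back off hard when hot
    case .fair:
        return glassesBatteryPct < 30 ? .low : .medium
    case .nominal:
        return glassesBatteryPct < 15 ? .medium : .high
    @unknown default:
        return .medium
    }
}
```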
When to use it
- High-Demand Workflows: Use these optimisations when your app does continuous video streaming, live translation, navigation, or other always-on features. Performance tuning is critical for such use cases to keep the glasses alive and responsive throughout the session.
- User Experience Focus: When low latency is crucial (e.g. real-time AR guidance or conversational AI), optimising data pipelines (by processing on-device or on-phone) ensures the user isn't waiting on the cloud.
- Battery-Constrained Scenarios: If users need the glasses for hours (e.g. an all-day event, guided tour, or cycling session), performance optimisation is a must. It lets them use rich features without carrying a charger or swapping devices mid-day.
Current limitations
Ray-Ban Meta Gen 2 Battery Life
Meta's Ray-Ban smart glasses have a tiny 154 mAh battery, which means limited continuous use without charging. The included case provides up to 48 extra hours on-the-go vs 36h in Gen1, but developers must still design with power in mind. Key constraints and considerations:
- Hardware Cap: Even with efficient chips, the glasses can only supply so much power before needing a recharge. High-rate activities (recording video, running AI) will drain the battery in a few hours. Optimisation can delay but not defy this reality.
- Dependent on Phone/Cloud: The glasses offload heavy AI to the phone and cloud by design to save power. This means some latency is inevitable for certain features, and offline capability is limited. You can't yet run your own heavy ML models fully on-device due to battery and SDK restrictions.
- SDK Preview Gaps: As of now, the Wearables Device Access SDK is in developer preview. Not all features are exposed (no direct display or Neural Band access yet), and APIs may evolve. Also, any performance enhancements you implement must align with platform permissions (e.g. you cannot disable system services beyond the user settings like "Hey Meta").
2) Prerequisites
Access requirements
- Meta developer account: Create or sign in to the Meta Developer Portal (ensure you've agreed to the developer terms).
- Join the Wearables preview: Apply for the Meta Wearables Device Access Toolkit preview (via the Wearables section on the portal). This grants SDK download and documentation access.
- Accept beta terms: If prompted, accept any Preview/Beta program terms for the wearables SDK.
- Create a project: In the Wearables Developer Center, create a new project/app ID for your glasses integration. This will generate credentials (integration bundle or keys) for your app.
Platform setup
iOS
- Xcode 15+ with iOS 17.0 or later SDK.
- Swift Package Manager (or CocoaPods) set up for dependency management.
- A physical iPhone running iOS 17+ (for Bluetooth/Wi-Fi connectivity) – Simulator can be used with the Mock Device Kit, but real device testing is recommended for accurate battery/latency behavior.
Android
- Android Studio Giraffe (2022.3.1) or newer, with the Android 13+ SDK.
- Gradle (8.x) and Kotlin (1.8+). Ensure you can add Maven package repos.
- A physical Android phone (Android 13+). If using an emulator, it must support BLE and Wi-Fi, but real hardware is advised for realistic power testing.
Hardware or mock
- Ray-Ban Meta smart glasses (Gen 2) – or the equivalent supported wearable (Oakley Meta, etc.). Ensure they are fully charged for testing.
- Meta Mock Device Kit (optional) – The SDK's simulator framework for glasses. This lets you simulate camera and sensor data on your development machine, useful for automated tests or if you don't have the device on hand.
- Bluetooth and Wi-Fi on your phone enabled, with any required permissions (location, nearby devices) accepted. This is crucial for establishing the phone-glasses connection.
3) Get Access to the Performance Optimisation Toolkit (Wearables SDK)
- Go to Wearables Hub: Navigate to the Meta Wearables Developer Center on the Meta portal (or use the invite link if you received one). This is where the Device Access Toolkit lives.
- Request preview access: Click "Apply for Access" and fill out any required information (you might need to specify your use case or agree to special terms since this is a beta).
- Enable your project: Once approved, enable the wearables features for your developer project. For example, ensure your app ID is whitelisted for the Wearables SDK preview (the portal might list an "AI Glasses" entitlement to toggle on).
- Create an integration: In your project, set up an Integration or Release Channel for the glasses app. This will generate a unique identifier or token used to link your mobile app to the glasses.
- Download credentials: Grab any config files or keys:
- iOS: You may get a configuration plist or entitlement file to include in your Xcode project for the glasses integration.
- Android: Note the Integration Bundle ID or token. You'll likely add this in AndroidManifest.xml or as a meta-data entry. Also prepare a GitHub personal access token (for pulling the SDK packages).
Done when: You have the Wearables SDK in hand (or accessible via SwiftPM/CocoaPods or Gradle), along with an App ID/integration token. You should see your project listed in the Wearables Developer Center, and you're ready to connect a device.
4) Quickstart A — Run the Sample App (iOS)
Goal
Run Meta's official Camera Access sample app on iOS and verify baseline performance (e.g. stream video from the glasses and observe battery usage and latency) with a real device or mock.
Step 1 — Get the sample
- Clone the repo: Download the Meta Wearables Device Access Toolkit for iOS from the official GitHub. For example, run `git clone https://github.com/facebook/meta-wearables-dat-ios.git` and open the Xcode project (there is a `CameraAccess` sample included).
- (Optional) Alternatively, check the Developer Center for a direct link to a sample Xcode project or a ZIP file. Meta's docs reference a Camera Access sample app available as part of the learning resources.
Step 2 — Install dependencies
- In Xcode, add the MetaWearables Swift Package:
  - Go to File > Add Packages, enter the GitHub repo URL, and select the latest version (e.g. 0.3.0).
  - Add the package products Core, Camera, and MockDevice to your target (these correspond to the core SDK, the camera streaming API, and the optional mock kit).
- If using CocoaPods: add `pod 'MetaWearables'` (or the specific subspecs if instructed by Meta) and run `pod install`. (SPM is preferred for the official SDK.)
Step 3 — Configure app
- Update Integration Info: Open the sample app's Info.plist and insert the provided Integration Bundle ID/Key from the portal (if applicable) so the app can authenticate with the glasses via the Meta system app.
- Bundle ID: Ensure the app's bundle identifier matches one you registered in the Wearables Center (case-sensitive).
- Capabilities: In Signing & Capabilities, add Bluetooth usage (Always and Peripheral usage descriptions in Info.plist) and any required Background Modes (if you plan to keep connections alive when the app is backgrounded). Also add the MetaAIGlasses entitlement file if provided by Meta.
Step 4 — Run
- In Xcode, select the `CameraAccess` sample target.
- Choose your iPhone as the run destination.
- Build and run the app on the device. It should launch and prompt you to connect to the glasses.
Step 5 — Connect to glasses (pairing)
- Put your Ray-Ban Meta glasses in pairing mode (usually, having them on and near the phone with Bluetooth on will suffice; the Meta View app handles initial pairing).
- In the sample app, tap Connect. The SDK will hand off to the Meta View app for authentication. Approve the connection when prompted (you might see a dialog or the View app UI).
- Once connected, grant any permissions requested on the phone for camera/mic access.
- On the glasses, ensure they are powered on and close by. The sample app's status should show Connected.
Verify
- Live video streaming works: In the sample app, start a live stream. You should see the glasses' camera feed on your iPhone.
- Latency check: Move your hand in front of the glasses camera; the video on the phone should show minimal delay (usually a few hundred ms).
- Battery baseline noted: Use the glasses for ~5 minutes of streaming. Observe the battery drop (e.g. from 100% to X%). This is your baseline to improve upon (see the runtime helper after this list).
- No errors: The sample app should remain connected and not crash during streaming or photo capture.
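To make that baseline concrete, convert the 5-minute reading into a drain rate and a projected runtime you can compare against after optimising. A tiny helper, plain arithmetic with no SDK involved:

```swift
// Estimate drain rate (%/min) and remaining runtime from two battery readings.
func projectedRuntimeHours(startPct: Double, endPct: Double, minutes: Double) -> Double {
    let drainPerMinute = (startPct - endPct) / minutes   // e.g. 6% over 5 min = 1.2 %/min
    guard drainPerMinute > 0 else { return .infinity }
    return (endPct / drainPerMinute) / 60                // hours left at this rate
}

// Example baseline: streaming dropped the glasses from 100% to 94% in 5 minutes,
// leaving roughly 1.3 hours of continuous streaming at this rate.
let hoursLeft = projectedRuntimeHours(startPct: 100, endPct: 94, minutes: 5)
```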
Common issues
- Build error ("No module MetaWearables"): Ensure the Swift Package is added to the correct target and the version matches the preview SDK. Clean build folder and resolve packages.
- Connection failure: If the app cannot find the glasses, make sure the glasses are paired via the Meta View app and that you used the correct Integration ID. Also verify Bluetooth is on, and try toggling the glasses off/on.
- Camera permission denied: The first time, iOS will ask for camera/microphone access (for the app to receive glasses stream). If denied, go to iPhone Settings and enable camera/mic for the sample app.
5) Quickstart B — Run the Sample App (Android)
Goal
Run the official sample on Android and verify glasses connectivity and baseline performance. We will measure initial latency and battery usage on Android as well.
Step 1 — Get the sample
- Clone the Android SDK repo: `git clone https://github.com/facebook/meta-wearables-dat-android.git` and open samples/CameraAccess in Android Studio.
- If Meta provided a direct sample APK or project via the Developer Center, you can use that. The GitHub repo's sample is the reference implementation.
Step 2 — Configure Gradle dependencies
- Add Maven repo: In the project's settings.gradle, add the GitHub Packages repository for Meta Wearables (as per the README) with your GitHub token.
- Add SDK dependencies: In app/build.gradle, add the Wearables Toolkit libraries (use the latest version from Meta's docs or Packages feed):

```groovy
implementation "com.meta.wearable:mwdat-core:0.3.0"
implementation "com.meta.wearable:mwdat-camera:0.3.0"
implementation "com.meta.wearable:mwdat-mockdevice:0.3.0"
```

- Sync Gradle: Ensure the project builds without errors and the SDK is fetched (you might need to supply your GITHUB_TOKEN in local.properties as `github_token=YOUR_TOKEN`).
Step 3 — Configure app
- App ID: Set your app's applicationId (in app/build.gradle) to the registered package name from the portal; it should match the Integration Bundle ID.
- Permissions: Add the required permissions in AndroidManifest.xml if not already present:

```xml
<!-- Bluetooth pairing -->
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<!-- The glasses stream camera and audio data to the app -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```

  (On modern Android, also request these at runtime.)
- Meta integration config: If Meta requires adding metadata (like an API key or config resource), add it under `<application>`. Check Meta's documentation for the exact key name. For example:

```xml
<meta-data android:name="com.meta.wearable.IntegrationID" android:value="YOUR_INTEGRATION_ID"/>
```
Step 4 — Run
- Select the `CameraAccess` run configuration in Android Studio.
- Choose your Android phone as the deployment target.
- Click Run. Install the app on the phone and launch it.
Step 5 — Connect to glasses
- The first time, the app will initiate the pairing flow. It should automatically open the Meta companion app to approve the connection (similar to iOS).
- Once approved, return to your app. Tap Connect if it didn't automatically connect.
- Grant any prompts for permissions (Bluetooth, camera, audio) on the phone.
- When connected, try the Start Stream button in the sample.
Verify
- Video stream appears: The glasses' POV should display on the Android device screen when streaming.
- Latency baseline: Check the delay; it should be low, but note it for later comparison.
- Battery usage log: Many Android phones let you view app battery usage. After a test session, check if the sample app is drawing significant power (this indicates heavy glasses communication).
- Stability: The app remains connected during a 5-minute continuous stream and captures a photo without errors.
Common issues
- Gradle authentication error: If Gradle fails to fetch the SDK (401 Unauthorized), double-check your GitHub token setup in local.properties or your environment. Ensure the token has read access to packages.
- Manifest merger conflict: If you added a permission that was already declared in a library, you might see a merge warning. Resolve it by removing duplicates or merging the attributes as suggested by the error.
- Connection timeout: If the app takes too long to connect to glasses, make sure location and Bluetooth are enabled on the phone (Android often requires location for BLE scanning). Also, keep the glasses close to the phone.
6) Integration Guide — Add Performance Optimisations to Your Own App
Goal
Integrate the Wearables SDK into an existing app and implement best practices to maximize battery life and minimize latency when using the glasses. We'll add the glasses features along with a "Performance Mode" toggle.
Architecture
Your app will communicate with the glasses via the SDK and offload processing to the phone as needed:
- Mobile App UI – initiates actions (e.g. start video, take photo) and displays results.
- Wearables SDK client – handles the connection session to glasses and streams data.
- Phone processing layer – performs heavy computations on video/audio (e.g. image recognition, voice transcription) using the phone's CPU/GPU, then returns results to UI.
- Glasses device – captures raw data (camera, mic) and plays output (speaker) as commanded, while running minimal onboard processing to save power.
This architecture ensures the glasses do the bare minimum (sensors and output), and the phone/cloud does the intensive work, balancing battery and latency.
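As a sketch of that separation in Swift (every type here is illustrative, not a real SDK name), the UI only sees results, the glasses layer only sees start/stop commands, and the heavy work is confined to the phone-side processor:

```swift
import Foundation

// Illustrative abstractions only; none of these are real SDK types.
protocol GlassesSession {
    func startVideoStream(onFrame: @escaping (Data) -> Void)
    func stopVideoStream()
}

protocol FrameProcessor {
    // Heavy lifting (ML, transcription) runs here, on the phone.
    func process(_ frame: Data, completion: @escaping (String) -> Void)
}

final class GlassesCoordinator {
    private let session: GlassesSession
    private let processor: FrameProcessor
    var onResult: ((String) -> Void)?   // the UI subscribes to results only

    init(session: GlassesSession, processor: FrameProcessor) {
        self.session = session
        self.processor = processor
    }

    func start() {
        // Glasses capture raw frames; the phone does the expensive work.
        session.startVideoStream { [weak self] frame in
            self?.processor.process(frame) { result in
                DispatchQueue.main.async { self?.onResult?(result) }
            }
        }
    }

    func stop() { session.stopVideoStream() }
}
```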
Step 1 — Install the SDK in your app
iOS: In your Xcode project, add the Meta Wearables SDK via Swift Package Manager using the same steps as the sample. Import at least the Core and Camera modules. Verify that your app's bundle ID is registered for the integration.
Android: Add the Wearables SDK dependencies to your app's Gradle config (same as in the sample). Include the mockdevice library as well if you plan to write unit tests for your implementation.
Step 2 — Add necessary permissions
Optimize by only requesting what's needed:
iOS (Info.plist) – Add usage descriptions:
- `NSCameraUsageDescription` = "Needs camera access to stream from smart glasses."
- `NSMicrophoneUsageDescription` = "Needs microphone access for glasses audio streaming."
- `NSBluetoothAlwaysUsageDescription` = "Requires Bluetooth to connect to AI glasses."
- (If your app streams over the internet, also add `NSLocalNetworkUsageDescription` if applicable for Wi-Fi Direct.)

Android (AndroidManifest.xml) – Include:
- `<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />`
- `<uses-permission android:name="android.permission.CAMERA" />`
- `<uses-permission android:name="android.permission.RECORD_AUDIO" />`
- `<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />` (needed for BLE on some Android versions)
- Any feature flags if needed (e.g. `<uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />`).
Request these permissions at runtime in your app's code before starting a glasses session, to avoid unexpected denials in the middle of an operation.
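On iOS, a pre-flight check using the standard AVFoundation APIs might look like the sketch below (Bluetooth authorization surfaces through CoreBluetooth's manager state instead, so only camera and microphone are requested explicitly here):

```swift
import AVFoundation

// Request camera and microphone access up front, so a denial never
// interrupts an in-flight glasses stream or capture.
func requestMediaPermissions(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { cameraGranted in
        guard cameraGranted else { return completion(false) }
        AVCaptureDevice.requestAccess(for: .audio) { micGranted in
            completion(micGranted)
        }
    }
}

// Usage: gate the session start on the result.
requestMediaPermissions { granted in
    guard granted else {
        // Direct the user to Settings rather than failing mid-session.
        return
    }
    // Safe to connect and start streaming here.
}
```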
Step 3 — Create a thin client wrapper for the glasses
Implement a small module in your app to abstract the SDK calls (a skeleton sketch follows this list):
- WearablesClient (singleton/service): Manages the connection session. It will:
- Initialize the SDK and handle the secure registration handshake with Meta's system (i.e., the Connect button flow).
- Keep track of connection state and battery level if available (the SDK might expose battery or you can infer from glasses status).
- Expose methods like `connect()` and `disconnect()`, plus callbacks for `onGlassesAttached` and `onGlassesDetached`.
- MediaService: Handles specific features like video or photo:
- E.g. `startVideoStream()`, which internally creates a session (if one isn't already active) and starts streaming via the SDK's camera API, and `stopVideoStream()` to stop the session or stream.
- Possibly uses a timer to auto-stop streams after a timeout (as seen in the sample) to save battery.
- ProcessingService (on phone): If you do on-the-fly processing (e.g., run an ML model on frames), this service/module will consume the video frames or audio from the glasses and perform the heavy lifting. This keeps the UI thread free and allows using optimised libraries (like Apple Vision or Android ML Kit) with minimal latency.
- PermissionsHelper: (Optional) A helper to check/request permissions cleanly and handle the user rationale UI if needed.
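Here's a skeleton of that wrapper in Swift. The SDK calls are deliberately left as comments, since the preview SDK's exact surface may change; only the shape of the module is the point:

```swift
import Foundation

final class WearablesClient {
    static let shared = WearablesClient()

    private(set) var isConnected = false
    var onGlassesAttached: (() -> Void)?
    var onGlassesDetached: (() -> Void)?

    func connect() {
        // Kick off the SDK's secure registration / Connect flow here.
        // On success: isConnected = true; onGlassesAttached?()
    }

    func disconnect() {
        // End the session so the glasses' radios can power down.
        // isConnected = false; onGlassesDetached?()
    }
}

final class MediaService {
    private var autoStopTimer: Timer?

    func startVideoStream() {
        // Create a session if needed, then start streaming via the camera API.
        // Auto-stop after 2 minutes so a forgotten stream doesn't drain the battery.
        autoStopTimer = Timer.scheduledTimer(withTimeInterval: 120, repeats: false) { [weak self] _ in
            self?.stopVideoStream()
        }
    }

    func stopVideoStream() {
        autoStopTimer?.invalidate()
        autoStopTimer = nil
        // Stop the stream / end the session here.
    }
}
```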
Definition of done:
- Your app can successfully connect to the glasses (via the Meta SDK) on demand, and disconnect when not in use.
- It streams or captures data as requested, and stops when done (no needless background streaming).
- The architecture allows easy swapping of real glasses with the mock kit for testing (e.g., by dependency-injecting an interface for the WearablesClient vs a MockClient).
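The mock-swapping in that last point falls out naturally if the wrapper sits behind a protocol. A minimal sketch (the names are ours, not the SDK's):

```swift
// One protocol, two implementations: tests inject the mock, production the real SDK.
protocol GlassesClient {
    func connect(completion: @escaping (Bool) -> Void)
    func disconnect()
}

final class RealGlassesClient: GlassesClient {
    func connect(completion: @escaping (Bool) -> Void) {
        // Wrap the Wearables SDK connect flow here.
        completion(true)
    }
    func disconnect() { /* end the SDK session */ }
}

final class MockGlassesClient: GlassesClient {
    func connect(completion: @escaping (Bool) -> Void) {
        completion(true)   // "connects" instantly for automated tests
    }
    func disconnect() {}
}

// The rest of the app never knows which one it got.
func makeClient(useMock: Bool) -> GlassesClient {
    useMock ? MockGlassesClient() : RealGlassesClient()
}
```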
Step 4 — Add a minimal UI for control
Create a simple interface in your app for the glasses features, ensuring the user can toggle high-power features:
- Connect button: Initiates the glasses connection (or shows "Connected" status if already). This uses the SDK's built-in Connect flow button as recommended for secure pairing.
- Status indicator: Show an icon or text for connection status and maybe battery level of glasses (if known). E.g., "Glasses Connected – 85% battery".
- "Performance Mode" toggle: This is a UI switch that, when enabled, will apply certain optimisations (like limiting stream resolution or turning off the glasses' voice wake word). It's essentially a user-facing option to conserve battery:
- Example: If off, stream video at full resolution; if on, stream at 720p to save power and bandwidth.
- You can also tie this to a preference that disables the "Hey Meta" hotword via the Meta app (currently, the user must do this in settings, but you can at least remind or deep-link them to it).
- Feature trigger: e.g. a "Capture Photo" or "Start Live Stream" button. This will invoke your MediaService to use the glasses camera.
- Result display: An ImageView or similar to show captured photos, or a VideoView for live streams. Also include a small log or toast notifications for events (like "Stream stopped to save battery"), so the user knows optimisations are in effect.
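Wiring the Performance Mode toggle to real settings can be as simple as mapping it to a stream configuration before each session. A sketch, where `StreamConfig` and its values are illustrative rather than actual SDK options:

```swift
import Foundation

// Illustrative stream configuration; the preview SDK's real options may differ.
struct StreamConfig {
    var width: Int
    var height: Int
    var frameRate: Int
}

// Map the user-facing toggle to concrete settings.
func streamConfig(performanceMode: Bool) -> StreamConfig {
    performanceMode
        ? StreamConfig(width: 1280, height: 720, frameRate: 24)   // saves power and bandwidth
        : StreamConfig(width: 1920, height: 1080, frameRate: 30)  // full quality
}

// Persist the choice so it survives app restarts.
UserDefaults.standard.set(true, forKey: "performanceModeEnabled")
let config = streamConfig(
    performanceMode: UserDefaults.standard.bool(forKey: "performanceModeEnabled")
)
```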
7) Feature Recipe — Optimise Photo Capture Workflow for Battery
Goal
When the user taps "Capture Photo", ensure the glasses take a picture and send it to your app with minimal delay and power usage. We'll implement a flow that avoids keeping the camera on longer than needed, and ensures the radios aren't doing extra work.
UX flow
- Pre-checks: On tapping Capture, the app verifies that glasses are connected and ready.
- Quick Capture: Activate the glasses camera only momentarily to snap the photo, then immediately turn it off.
- Efficient Transfer: The photo file is sent to the phone in a compressed format (the SDK handles this). We ensure the phone is ready to receive to avoid retries.
- Feedback: Show a thumbnail of the photo in the UI as soon as it arrives. Save the image to app storage for later viewing.
- Completion: Shut down the camera session to let the glasses go back to idle (reducing battery drain).
Implementation checklist
- Connected state verified: If `!WearablesClient.isConnected`, prompt the user to connect first (don't start the camera, which would fail and waste time).
- Permissions ok: Ensure camera/mic permissions on the phone are granted (the glasses will act as an external camera).
- Issue capture request: Call the SDK's photo capture API, which likely wakes the glasses camera for a single shot. This usually uses less power than streaming.
- Timeout handling: Implement a timeout (e.g. 5 seconds) – if no response (photo) comes back (maybe due to connectivity hiccup), inform the user and allow retry rather than hanging with the camera on.
- Receive and persist image: On success callback, immediately write the image to disk (or save to Photos) before displaying it to the user, in case the app goes to background. Then update the UI thumbnail.
- Tear down session: After capture, if no further action, disconnect or stop the camera session. This ensures the Wi-Fi module on the glasses powers down sooner. (The SDK might do this automatically for single photo requests.)
Pseudocode
```swift
func onCaptureTapped() {
    guard Glasses.isConnected else {
        showAlert("Please connect your glasses first.")
        return
    }
    guard Permissions.allGranted() else {
        Permissions.requestAll()
        return
    }
    showStatus("Capturing…")
    WearablesClient.capturePhoto { result in
        if let photo = result.success {
            saveToGallery(photo)
            thumbnailImageView.image = photo
            showStatus("Photo saved ✅")
        } else if let error = result.error {
            showStatus("Capture failed: \(error.localizedDescription)")
            log(error)
        }
        // Always stop the camera session after capture to save power
        WearablesClient.endSession()
    }
}
```
This pseudocode illustrates checking for connection and permission, providing user feedback, handling the result, and importantly ending the session to conserve energy.
Troubleshooting
- Empty photo result: If the capture callback returns an empty image or error, check logs. A common cause is the glasses timing out – maybe they weren't ready. Solution: ensure a session was established before capture, or increase the timeout. Also confirm the glasses have enough battery.
- Capture hangs indefinitely: This can happen when the session wasn't properly initialized. Implement the timeout as described (sketched after this list): if no response arrives in 5 seconds, call `WearablesClient.endSession()` and show an error to the user. It's better to fail fast than to let the glasses camera stay on.
- User expects instant display: If the image is high resolution, there might be a 1-2 second delay to transfer it. Manage expectations with a placeholder thumbnail (e.g., a grey box or "Processing…" state) so the UI stays responsive while the full image arrives. You could even show progressive updates if the SDK provides them (some SDKs send a low-res preview first).
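A small Swift sketch of that fail-fast timeout, with the SDK calls stubbed out (`capturePhoto` and `endSession` here stand in for whatever the real API provides):

```swift
import Foundation

// Stand-ins for the SDK's actual calls.
func capturePhoto(completion: @escaping (Data?) -> Void) { /* SDK capture here */ }
func endSession() { /* SDK teardown here */ }

// If the photo hasn't arrived within `seconds`, end the session rather than
// leaving the glasses camera (and radios) running.
func captureWithTimeout(seconds: TimeInterval = 5,
                        onResult: @escaping (Data?) -> Void) {
    var finished = false

    let timeout = DispatchWorkItem {
        guard !finished else { return }
        finished = true
        endSession()
        onResult(nil)   // surface the failure so the UI can offer a retry
    }
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds, execute: timeout)

    capturePhoto { photo in
        DispatchQueue.main.async {
            guard !finished else { return }   // already timed out; drop the late result
            finished = true
            timeout.cancel()
            onResult(photo)
        }
    }
}
```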
8) Testing Matrix
To ensure your optimisations truly improve performance, test under various scenarios:
| Scenario | Expected Outcome | Notes |
|---|---|---|
| Mock device (simulated) | Features work with no crashes | Use the Mock Kit to run automated tests for long durations. Battery metrics won't apply here, but functional behavior can be verified. |
| Real device, close range | Low latency, minimal stutter | Test with phone and glasses in same room (ideal conditions). Measure battery drain over 30 min of streaming – this is your best-case baseline. |
| Extended use (1+ hour) | Manageable thermals, no dropouts | Stream or run AI for an hour. The glasses may warm up. Ensure your app honors any thermal warnings (e.g., if SDK signals high temp, maybe reduce quality). |
| Background/Lock screen | Graceful degradation or pause | If the user locks the phone while streaming, the session might pause (due to app going background). The expected behavior might be that the stream stops to save battery. Document what happens (e.g., "app must stay foreground for continuous streaming"). |
| Permissions denied | Clear error and retry path | Test by denying a permission and attempting a feature. The app should not crash – it should explain and allow the user to grant permissions. |
| Disconnect mid-action | Auto-retry or clean abort | Turn off the glasses or Bluetooth during a stream. The app should handle the disconnect event: maybe auto-reconnect when the glasses come back, or at least not freeze. No crashes or memory leaks when connection drops. |
Record metrics for each test (e.g., how much battery % used, how many seconds of latency for a given task) to quantitatively see improvements.
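One lightweight way to keep those numbers comparable across runs is a fixed record per test, appended to a CSV. The values below are placeholders for illustration, not measured results:

```swift
import Foundation

// One row per test run, so before/after comparisons stay concrete.
struct PerfTestResult {
    let scenario: String        // e.g. "real device, close range"
    let durationMin: Double
    let batteryDropPct: Double
    let avgLatencyMs: Double

    var csvLine: String {
        "\(scenario),\(durationMin),\(batteryDropPct),\(avgLatencyMs)"
    }
}

// Placeholder numbers only.
let baseline  = PerfTestResult(scenario: "stream-30min-baseline",
                               durationMin: 30, batteryDropPct: 38, avgLatencyMs: 420)
let optimised = PerfTestResult(scenario: "stream-30min-perfmode",
                               durationMin: 30, batteryDropPct: 15, avgLatencyMs: 180)
// Append `csvLine` values to a file and diff runs over time.
```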
9) Observability and Logging
To fine-tune performance, add logging for key events and metrics in your app:
- Connection lifecycle: Log when you start connecting (`connect_start`), when it succeeds (`connect_success`) or fails (`connect_fail`), along with timestamps. This helps measure how quickly the session is established and whether failures correlate with certain conditions (e.g., low battery).
- Permission state: Log whether required permissions are granted when initiating a session (`permission_state: all_granted/denied_some`). This can explain why a feature might not start.
- Feature start/stop: For each major feature (streaming, capture, voice command), log events like `video_start`, `video_stop`, `photo_capture_start`, `photo_capture_success`, `photo_capture_fail`. Include timestamps and perhaps battery level at start vs end.
- Latency metrics: Measure and log round-trip latency for key actions, e.g. from when you send a command to the glasses until the data is received back. Log `photo_latency_ms` or `stream_init_ms`. This will show improvements when you optimise (e.g., local processing might cut `photo_latency_ms` significantly).
- Battery usage metrics: If the SDK provides battery info for the glasses, log it periodically (e.g., `glasses_battery_pct`). Otherwise, infer it from the phone's own battery drain and the duration of use.
- Reconnection count: If you implement auto-reconnect, log `reconnect_attempt` and count how many times the session had to restart. Frequent reconnections could indicate range or interference issues impacting latency and power (since reconnection is costly).
By analyzing these logs, you can identify patterns like "latency spikes when battery is below 20%" or "first connection of the day takes longer". This guides further optimisation (maybe the glasses behave differently at low battery, etc.).
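For the latency events above, a small Swift helper using the standard os.Logger API is enough (the subsystem string is a placeholder):

```swift
import Foundation
import os

// Shared logger; filter by this subsystem in Console.app.
let perfLog = Logger(subsystem: "com.example.glassesapp", category: "performance")

// Measure round-trip latency for a capture and emit structured log lines.
func logPhotoLatency(capture: (@escaping () -> Void) -> Void) {
    let start = Date()
    perfLog.info("photo_capture_start")
    capture {
        let ms = Int(Date().timeIntervalSince(start) * 1000)
        perfLog.info("photo_capture_success photo_latency_ms=\(ms)")
    }
}
```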
10) FAQ
- Q: Do I need the actual glasses hardware to start optimising? A: Not initially. You can use the provided Mock Device Kit to simulate a glasses connection on your computer. This allows you to develop and test the integration logic (streams, sessions) without hardware. However, for true battery and latency measurements, testing on real hardware is essential – the mock won't emulate wireless constraints or battery drain.
- Q: Which glasses models are supported by the SDK? A: The Meta Wearables Device Access Toolkit supports the latest AI glasses from Meta – currently Ray-Ban Meta (Gen 2) and Oakley Meta glasses. Gen 1 Ray-Ban Stories are not supported by this new SDK (they lack the AI capabilities). Always check the Meta docs for the up-to-date list, especially as new models (like the display-equipped version) are released.
- Q: How much improvement can I really expect in battery life? A: It depends on usage, but significant gains are possible. Meta's Gen 2 hardware roughly doubled battery life over Gen 1. On the software side, users have seen the battery last hours longer by turning off nonessential features. Our guide's 60% figure comes from combining these approaches. For example, if continuous video streaming normally drains the glasses in 2 hours, optimisations could extend this to around 3+ hours. Your results will vary, so profile and adjust to your app's needs.
- Q: Can I bypass the cloud to reduce latency further? A: Partially. You can route data to the phone and process it on-device (which avoids internet latency and is faster) – this is a recommended approach for speed and privacy. However, some Meta AI capabilities (like certain assistant queries or translation if not downloaded) still require cloud calls. The SDK currently doesn't allow you to run your own AI models on the glasses themselves, but you can run them on the phone. So yes, use on-phone processing whenever possible to cut latency, but you cannot avoid the cloud for Meta's built-in services that are cloud-dependent.
- Q: Can I put the glasses in a low-power mode via code? A: There's no explicit "low-power mode" toggle in the SDK for the glasses. Battery savings come from how you use the glasses. For instance, if you don't need the voice wake word, ask the user to turn off "Hey Meta" in settings to save power. Likewise, design your app to disconnect or pause the glasses when idle. The glasses themselves manage power (e.g., going into standby when no active session), but developers influence battery life by minimizing active use and data transfer.
- Q: The glasses battery is tiny – will future updates improve this? A: Meta is actively improving both hardware and software for better efficiency. Gen 2 already upgraded wireless standards to Wi-Fi 6 and Bluetooth 5.3 which improve battery usage and stability. Future firmware updates might further optimise things like compression or radio usage. As a developer, keep your SDK updated – improvements in the SDK (e.g., better handling of streaming sessions) will automatically benefit your app once you update.
11) SEO Title Options
- "How to Get 60% More Battery Life from Ray-Ban Meta Smart Glasses (Developer Guide)"
- "Integrate Meta Wearables SDK and Optimise Smart Glasses Performance – Step by Step"
- "Boost Smart Glasses Battery Life: Tips to Reduce Drain and Latency"
- "Troubleshooting Meta Glasses: Fixing Battery Drain and Lag in Your App"
(These titles target keywords like "battery life", "smart glasses", "Meta SDK", and "optimise performance" to help developers searching for solutions to these issues.)
12) Changelog
2026-01-20 — Verified with Meta Wearables Device Access Toolkit v0.3.0 (Developer Preview), iOS 17.2 (Xcode 15.2), Android 14 (API 34) on Ray-Ban Meta (Gen 2) glasses.