How to Add "Save to Photos", Share Sheet & AirDrop for Smart Glasses Media on iOS

Author: Almaz Khalilov

How to Add "Save to Photos / Share Sheet / AirDrop" for Captured Glasses Media (iOS)

TL;DR

  • You’ll build: an iOS companion app feature that lets users save photos/videos captured on smart glasses to the iPhone’s Camera Roll or share them via the iOS Share Sheet (including AirDrop).
  • You’ll do: Get necessary access (developer accounts or SDKs) → Install any glasses SDK (if required) → Run a sample app to test sharing → Integrate the share feature into your app → Test on an iPhone with the wearable.
  • You’ll need: Apple Developer provisioning (to run on device), a pair of smart camera glasses (or a way to simulate captured media), and Xcode 15+ on macOS.

1) What is the Smart Glasses Media Sharing Solution?

What it enables

Users capture photos or videos on their smart glasses, the media arrives in your iOS companion app, and from there they can save it to the iPhone's Photo Library or hand it off through the standard iOS share sheet, including AirDrop to nearby Apple devices.

When to use it

  • Companion apps for camera glasses: Use this solution if you’re building an app for smart glasses (e.g. Snap Spectacles, Meta Ray-Ban Stories) and want users to export and share the photos they take on the wearable.
  • Quick media sharing features: Ideal when your app handles media captured externally (like via a wearable or drone) and you want to provide a native option to save to the Camera Roll or invoke sharing (versus building a custom sharing flow).
  • User-generated content apps: If your app lets users capture unique POV content via glasses, this feature makes it easy for them to keep those memories (saved locally) or share with friends instantly, enhancing engagement.

Current limitations

  • Platform constraints: On iOS, saving to Photos requires user permission (iOS will prompt for access the first time). You must include the NSPhotoLibraryAddUsageDescription in your app's Info.plist for the "Save Image/Video" option to appear. Without it, the share sheet will not show the save-to-camera-roll button by default.
  • Device connection: This guide assumes the glasses are already connected and your app has received the photo/video. The actual capture and transfer from glasses to app may depend on vendor SDKs or the glasses' companion app (some consumer glasses like Ray-Ban Stories don't provide open APIs, so media must be imported via official apps).
  • No direct reverse transfer: Most smart glasses (as of 2025) do not allow pushing content to the glasses for viewing. The sharing solution here is one-way (glasses → phone). You won’t be able to send or AirDrop media to the glasses (these devices lack such APIs in consumer models).

2) Prerequisites

Access requirements

  • Apple Developer – Make sure you have an Apple Developer account (free for testing on device, paid for App Store). You’ll run the app on an iPhone to use Photos and AirDrop features.
  • Smart Glasses Developer Access – If your solution involves a specific SDK (e.g. Snap Spectacles Mobile SDK or another glasses API), sign up for that developer program. For example, Snap's Spectacles program requires joining their developer portal and possibly obtaining an API token or pairing permission via the official Spectacles app.
  • Project/App ID – Create an app ID in any vendor portal if needed. (If using Snap’s SDK, you may need to create a Snap Kit app and enable the Spectacles or Camera Kit features to get credentials.)

Platform setup

iOS

  • Xcode 15 or later with iOS 17+ SDK. (Ensure your deployment target meets the minimum iOS version supported by the glasses SDK, if any.)
  • Swift Package Manager or CocoaPods installed (if integrating a vendor SDK library). The iOS share sheet functionality itself uses UIKit (no external library required), but a glasses SDK (like Snap’s) might be installed via SPM or CocoaPods.
  • Physical iPhone running iOS 16+ (recommended). The share sheet’s Save to Photos and AirDrop cannot be fully tested on Simulator (AirDrop requires real hardware with WiFi/Bluetooth).

Android (if you plan to implement similar sharing on Android)

  • Android Studio Giraffe or later with Android 13+ SDK.
  • Gradle 8+ and Kotlin 1.8+ for any Android SDK integration.
  • Physical Android device (Android 12+ recommended) since Nearby Share (the Android equivalent of AirDrop) and saving to gallery should be tested on real hardware.

Hardware or mock

  • Supported smart glasses (e.g. Snap Spectacles 3 or Spectacles AR, Ray-Ban Stories, etc.) or if you don’t have hardware, use a sample image/video file that simulates a capture. (For development, you can work with a placeholder image from disk until you have actual device input.)
  • Bluetooth enabled on your phone (if your glasses transfer media via BLE/Wi-Fi). Understand any pairing permissions the glasses might need (e.g. Bluetooth, camera, microphone on the phone if the glasses stream live content).

3) Get Access to the Smart Glasses SDK (if applicable)

  1. Developer portal: Go to the smart glasses' developer portal or SDK page. For example, Snap Spectacles developers would visit the Snap for Developers site. Create an account or sign in.
  2. Request access: If the SDK is in beta or requires approval (common for new AR glasses), follow the steps to request access or join the beta program. (E.g., join a TestFlight or download a provisioning profile if needed for preview hardware.)
  3. Accept terms: Agree to any developer agreements or NDA if required (some wearables have confidential SDK docs if not publicly released).
  4. Create a project: In the portal, create a new app/project entry if needed. For instance, with Snap you’d create a Snap Kit project and perhaps enable Spectacles or Camera Kit features to get credentials.
  5. Download credentials: If the platform provides API keys or configuration files, get those:
    • iOS: You might receive an API token, certificate, or a config plist/json to include in your Xcode project.
    • Android: You might get a JSON config or need to add a token to local.properties or as a resource.
  6. Pair your device: If the glasses require pairing through an official app first (e.g. Snap's Spectacles must be bonded via the Spectacles app), ensure you go through that process so your glasses are connected and can communicate with your app.

Done when: you have the necessary app credentials (if any) and your phone is set up with the smart glasses (paired and ready). You should be able to see the glasses in the official app or SDK, and have any required keys ready to use in the sample or your own app.


4) Quickstart A — Run the Sample App (iOS)

Goal

Run an official or example iOS sample app to verify that capturing from the glasses and saving/sharing works. This will ensure your environment is correct before integrating into your own app.

Step 1 — Get the sample

  • Option 1: Clone the repo. If the glasses vendor provides a sample iOS app (for example, Snap's Spectacles Mobile Kit has an iOS sample project on GitHub), download or clone that repository.
  • Option 2: Download an example. If no official sample exists, use Apple's sample code for saving photos. Apple's Developer Documentation on AVFoundation has sample code for capturing images and saving to Photos. You can use that as a starting point, adding a share sheet to it.

Open the Xcode project or workspace for the sample.

Step 2 — Install dependencies

If the sample uses package managers or CocoaPods:

  • Swift Package Manager: Check if the sample includes a Package.swift or Swift Packages. If not already resolved, in Xcode go to File → Packages → Resolve Package Versions, or add the required package URL. For Snap's Camera Kit or Spectacles SDK, you'd add the GitHub URL of their SDK and select the latest version.
  • CocoaPods: If the sample uses a Podfile, run pod install in the project directory to install any pods (e.g., a glasses SDK framework or Snap's SCCameraKit pod from their CocoaPods setup).
  • Manual frameworks: Occasionally, you might need to drop in a .framework or .xcframework. Follow the sample’s README for any such steps.

After dependencies are set up, build the project once to ensure everything compiles.

Step 3 — Configure the app

Before running, some sample apps require configuration:

  • Info.plist updates: Make sure the sample's Info.plist contains NSPhotoLibraryAddUsageDescription with a string explaining why you need to save to Photos (e.g. "Allow saving captured images to your Photo Library"). If it's missing, add it now. This is critical – without it, iOS will not show the "Save Image" option in the share sheet and will prevent saving to Camera Roll.
  • Bundle ID & App ID: If the sample uses an App ID tied to a provisioning profile, you may need to change the bundle identifier to one of yours and update signing. (Especially if the sample comes with a placeholder bundle ID that you don’t have rights to.)
  • Glasses SDK config: If you obtained an API key or config file in section 3, add it to the sample:
    • For example, if there’s a GoogleService-Info.plist style file or custom SpectaclesConfig.plist, include it in the Xcode project and ensure it’s added to the app target.
    • If you have to paste an API token in code (some samples have a placeholder string), do that now.
  • Capabilities: Turn on any background modes or Bluetooth permissions if required. For instance, if the glasses communicate via BLE, enable the Bluetooth LE background mode and add NSBluetoothAlwaysUsageDescription in Info.plist with a justification.

Step 4 — Run the sample

  1. Build & run: Select your iPhone as the run destination. Build and launch the app on the device (plug it in, or use wireless debugging).
  2. Initial setup: The sample may have a UI to connect to the glasses or to trigger capture. Follow the on-screen instructions. For example, a Snap Spectacles sample might show a "Bind" or "Connect" button — tapping it could hand off to the Spectacles companion app for pairing consent. Complete any pairing/permission steps to connect the glasses.
  3. Capture test: Use the sample to take a photo or video via the glasses. This could be a button in the app that triggers the glasses’ camera. Alternatively, press the capture button on the glasses themselves if the sample listens for incoming captures.

Step 5 — Save/Share verification

Now verify the save and share functionality with the captured media:

  • The sample should display the captured image or video on screen once received. Look for a Share or Save button in the UI. Tap it.
  • An iOS UIActivityViewController (share sheet) should appear, offering options like AirDrop, Messages, etc. If your Info.plist is set up correctly, you should also see Save Image or Save Video in the list (you might need to scroll or tap "More" to see it, as it often appears in the second row of actions).
  • Choose Save Image (or “Save Video”). The first time, iOS will automatically prompt the user: “Allow this app to add to your photos?” (This prompt is driven by the usage description you added).
  • After allowing, the photo gets saved to the Camera Roll. Open the Photos app on the iPhone and confirm the image appears.
  • Try AirDrop: Open the share sheet again, and tap an AirDrop target (another device or Mac that's nearby). Ensure AirDrop is enabled (on the other device set it to receive from Contacts or Everyone). The photo should transfer and on the receiving device you'll see it open in Photos by default.

Verify:

  • The share sheet shows a thumbnail preview of the image/video (if not, ensure you passed a file URL, not just a UIImage, to the activity items — using a file URL yields a nice thumbnail and file info).
  • Save Image option appears and successfully saves to Photos after permission.
  • AirDrop transfers the image to another device which receives it (check that device’s Photos app for the new image).
  • Other share options (Mail, Messages, etc.) can be tested as needed – e.g., share via iMessage and ensure the image sends.

Common issues

  • Share sheet not showing "Save Image": This almost always means the NSPhotoLibraryAddUsageDescription key is missing or not spelled correctly in Info.plist. Double-check it. Also ensure you're sharing an actual image object or file; if you pass a remote URL without downloading, the Save option might not appear.
  • Build error (code signing): If the sample’s bundle ID isn’t changed to yours, you might get provisioning errors. Fix by setting a unique bundle ID and using your team profile in Xcode’s Signing settings.
  • No glasses found/connect fail: If the sample can’t detect the glasses, make sure they’re paired via Bluetooth to the phone (for some SDKs, the glasses might need to be connected in the vendor’s app first). Also verify the app has Bluetooth permission (see iOS Settings -> Privacy if the dialog was perhaps dismissed).
  • Permission denied for Photos: If you tapped “Don’t Allow” on the add-to-photos prompt, subsequent save attempts do nothing. iOS won’t show the prompt again. The sample should ideally handle this – but if not, you must go to iPhone Settings → Privacy → Photos and set the app’s permission to “Add Photos” or “All Photos”. Alternatively, delete and reinstall the app to get the prompt again. In your own integration, consider detecting denied status and guiding the user to Settings.

5) Quickstart B — Run the Sample App (Android)

If you only target iOS, you can skip this section. This mirrors the above steps on Android, using the equivalent Android sharing mechanism.

Goal

Run an Android sample (if available) to test capturing from the glasses and sharing the media on Android. On Android, “Save to Photos” is usually handled by writing to the Media Store or by sharing to gallery apps, and AirDrop is replaced by Nearby Share.

Step 1 — Get the sample

  • Clone or download the Android sample project from the smart glasses vendor (if provided). For example, Snap’s Spectacles Mobile Kit might have an Android sample on GitHub similar to the iOS one.
  • Open the project in Android Studio.

Step 2 — Configure dependencies

  • Add repositories: If the SDK is distributed via Maven or JitPack, ensure the build.gradle has the proper mavenCentral() or vendor repository URL.
  • API tokens: If you have an API key (from step 3), put it in the appropriate place. Often, you might add it to local.properties or as a string resource. (E.g., spectacles.sdk.token=YOUR_TOKEN in local.properties and reference it in the Gradle config.)
  • Gradle sync: After adding any required SDK dependencies (for instance, a line like implementation "com.vendor:glasses-sdk:1.0.0"), sync the project to download them.

Step 3 — Configure app

  • Application ID: Change the applicationId in app/build.gradle to something unique (e.g., "com.yourdomain.glassesshare"). This is important for installing on device and using your own signing keys.
  • Permissions: Open AndroidManifest.xml. Add any needed uses-permission lines:
    • If you plan to save to the device gallery directly, add WRITE_EXTERNAL_STORAGE (needed only on Android 9 and below). On Android 10+ you can insert your own images via the MediaStore API without any storage permission; READ_MEDIA_IMAGES (Android 13+) is only needed if your app also reads media back from the gallery.
    • Include Bluetooth permissions if the glasses require it (e.g. BLUETOOTH_CONNECT for Android 12+).
  • File provider (optional): If the sample intends to share via file URIs, ensure a <provider> is set up for sharing files (FileProvider) and the code uses content URIs for files. Many modern sharing implementations handle this, but just be aware.

Step 4 — Run

  1. Select Run config: In Android Studio, choose the sample app’s run configuration (if multiple modules exist, pick the main app module).
  2. Choose device: Connect an Android phone (with Developer Options enabled and USB debugging) or use an emulator (Note: BLE and Nearby Share might not work on emulators). A physical device is recommended for testing sharing.
  3. Run the app: Install it on the Android device.
  4. Pair/connect: The sample might prompt to connect to the glasses. Ensure Bluetooth is on. Follow similar steps as on iOS if needed (some might involve scanning for the device or opening the vendor’s companion app for approval).

Step 5 — Connect and test capture

  • Once the app is running and the glasses are connected, trigger a capture (via the app UI or the glasses button). The media should appear in the app’s interface.
  • Look for a Share or Save action in the app. On Android, a common approach is a Share button that launches an Intent.ACTION_SEND with the image/video URI.

Verify

  • The Android sharesheet appears when tapping share. This is the system UI similar to iOS’s, listing apps and actions.
  • You can share the image to “Photos” or gallery – for example, if Google Photos is installed, it might be a target. Or you might have an option “Save to device” depending on your Android version (often, sending to a file manager or a dedicated “Save to Drive” etc.).
  • Nearby Share: If you have another Android device or a Chromebook, test the Nearby Share. On the sharesheet, look for “Nearby Share” or an icon of two overlapping circles. Tap it to send the file wirelessly. Accept on the receiving device and ensure the file transfers and appears in their gallery.

Common issues

  • Gradle Maven auth error: Some SDKs (if private) might require a Maven username/token. Double-check the documentation if Gradle can’t resolve the dependency.
  • File URI exposure error: If you see an exception about “exposed beyond app through Intent”, it means you need to use a content URI with a FileProvider. Ensure the sample (or your integration) uses FileProvider.getUriForFile and adds intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION).
  • Device connection timeout: If the glasses don’t connect, try restarting Bluetooth and the app. For BLE devices, also ensure location services are on (Android sometimes requires location permission for BLE scanning).

6) Integration Guide — Add Sharing Features to Your Existing iOS App

Goal

Now that the sample is working, integrate the “Save to Photos / Share Sheet / AirDrop” feature into your own iOS app. We’ll use the iOS SDK (UIKit and possibly the glasses SDK) to capture an image from the glasses, then invoke the share sheet to let the user save or share it. By the end, your app will allow a complete end-to-end flow: capture on glasses → photo appears in app → user saves or shares it.

Architecture

At a high level, the flow involves these components:

  • Your App’s UI – e.g., a view controller that shows the captured image with a “Share” button.
  • Glasses SDK client – if applicable, a module that handles connecting to the glasses and receiving images (for instance, a GlassesManager that emits an event when a new photo arrives).
  • Sharing logic – this is essentially built into iOS (UIActivityViewController), but you might wrap it in a helper to prepare data (like converting images to files).
  • Callback handling – your app can get a callback when sharing is completed (to possibly show a confirmation or handle errors).

Example data flow: Glasses capture → SDK calls your delegate with UIImage → you store it (maybe display on screen) → user taps share → your code presents UIActivityViewController with that image file → iOS handles share (saving to Photos if chosen, etc.) → on completion, you log the result or update UI.
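
A minimal sketch of that data flow, assuming a hypothetical GlassesManager class and delegate protocol (the real class and callback names depend entirely on your vendor's SDK):

    import UIKit

    // Hypothetical delegate protocol; your glasses SDK will define its own callback shape.
    protocol GlassesManagerDelegate: AnyObject {
        func glassesManager(_ manager: GlassesManager, didCapture image: UIImage)
    }

    final class GlassesManager {
        weak var delegate: GlassesManagerDelegate?

        // Called by your transport layer (SDK, BLE, etc.) when photo bytes arrive.
        func handleIncomingPhoto(_ data: Data) {
            guard let image = UIImage(data: data) else { return }
            DispatchQueue.main.async {
                self.delegate?.glassesManager(self, didCapture: image)
            }
        }
    }

Your view controller adopts the delegate, stores the image, and presents the share sheet when the user taps Share, as covered in the following steps.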

Step 1 — Install any required SDK (if not already in app)

If your app doesn’t already include the glasses’ SDK:

iOS (Swift):

  • If using Swift Package Manager, add the package in Xcode (e.g., for Snap Spectacles Mobile Kit, add the GitHub repo for the SDK). If using CocoaPods, add the pod (e.g., pod 'SpectaclesMobileKit', if the vendor publishes one).
  • Import the necessary modules in your code (for example, import SpectaclesSDK).

(If the glasses don’t have a public SDK, you might be relying on the official app to get media into your app via the Photos library or a share import. In that case, you actually might not need any third-party SDK at all and can proceed to implementing the share sheet for images that the user has imported.)

Android:

  • Add the glasses SDK dependency in Gradle (implementation "com.vendor:glasses-sdk:version"). Sync project.
  • If no specific SDK (e.g., glasses sync via their app), you might skip directly to handling images via Intents.

Step 2 — Add required permissions

Now ensure your app has the correct permissions to use Photos and any connectivity:

iOS (Info.plist)

  • NSPhotoLibraryAddUsageDescription – Required. Add a human-readable reason, e.g. "Allow adding photos to your library so you can save pictures taken on your smart glasses." This enables the "Save Image" action in share sheets and triggers the permission prompt.
  • NSBluetoothAlwaysUsageDescription – If your app connects to the glasses via Bluetooth (common for wearables), include this with a message like “Allow Bluetooth access to connect to your smart glasses.”
  • Any other usage descriptions if needed: e.g., NSCameraUsageDescription (if your app also uses the phone camera), NSMicrophoneUsageDescription (if videos with audio), etc.

Android (AndroidManifest.xml)

  • <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" /> (if needed for BLE connection on Android 12+).
  • <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" android:maxSdkVersion="28"/> for saving images on older Android (API 28 and below).
  • <uses-permission android:name="android.permission.READ_MEDIA_IMAGES" /> for Android 13+ to add media to Photos. (If you use MediaStore with the right flags, you might not need an explicit permission, but it’s good to declare.)
  • No specific permission is needed to launch the share sheet on Android, but the actual saving if done by your app might require the above.

Step 3 — Create a thin client or helper for sharing

To keep code organized, create a small service or utility that handles preparing and invoking the share sheet:

  • WearablesClient (optional): If you haven’t already, abstract the glasses connection in its own class. Ensure it can notify when a photo is captured (e.g., via a delegate or Notification).
  • MediaShareService: Create a class or extension responsible for sharing media. For example, a Swift class MediaShareService with a method shareImage(from viewController: UIViewController, image: UIImage). A sketch follows this list.
    • This method can handle writing the UIImage to a temporary file (using image.jpegData(compressionQuality:) or image.pngData() to get Data, then .write(to:) a file in NSTemporaryDirectory()). You do this to get a file URL because passing a file URL yields better results in the share sheet (thumbnail + file metadata).
    • Then it creates a UIActivityViewController with the file URL and presents it.
    • Optionally set activityVC.completionWithItemsHandler to know which activity was chosen and if completed.
  • PermissionsService (optional): On iOS, you might not need a separate service if using the share sheet (it auto-prompts). But you could have a utility to check the current authorization status of .addOnly photo permission via PHPhotoLibrary.authorizationStatus(for: .addOnly) (returns .authorized/.denied etc.) – to possibly warn the user if they denied it and need to enable in Settings.
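
A minimal sketch of the MediaShareService helper described above (the temp-file naming and JPEG quality are arbitrary choices, not vendor requirements):

    import UIKit

    final class MediaShareService {

        /// Writes the image to a temporary JPEG file and presents the share sheet.
        /// Sharing a file URL (rather than a raw UIImage) gives the sheet a
        /// thumbnail preview and file metadata.
        func shareImage(from viewController: UIViewController, image: UIImage) {
            guard let data = image.jpegData(compressionQuality: 0.9) else { return }

            // A unique name avoids accidentally re-sharing an older capture.
            let fileURL = FileManager.default.temporaryDirectory
                .appendingPathComponent("glasses-\(UUID().uuidString).jpg")
            do {
                try data.write(to: fileURL)
            } catch {
                print("Could not write temp file: \(error)")
                return
            }

            let activityVC = UIActivityViewController(activityItems: [fileURL],
                                                      applicationActivities: nil)
            activityVC.completionWithItemsHandler = { activityType, completed, _, error in
                // activityType is nil when the user dismisses the sheet without choosing.
                print("share finished: \(String(describing: activityType)), completed: \(completed), error: \(String(describing: error))")
            }
            // On iPad the share sheet is a popover and needs an anchor view.
            activityVC.popoverPresentationController?.sourceView = viewController.view
            viewController.present(activityVC, animated: true)
        }
    }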

Definition of done:

  • When the glasses capture an image, it appears in your app’s UI (e.g., in an ImageView or a thumbnail list).
  • When the user taps the share/save button, the UIActivityViewController launches with that image, and they can successfully save to Photos or AirDrop it. The app does not crash or show any permission errors at this point.
  • Your code properly handles the connection lifecycle of the glasses (if the glasses disconnect or app goes to background, it handles it gracefully), but that’s beyond the sharing scope – just ensure the image you want to share is accessible when needed.
  • Error cases are considered: e.g., if the user hasn’t allowed photo access and selects “Save Image”, iOS will fail silently. Ideally, you intercept beforehand and show an alert like “Please enable Photos permission to save images” with a shortcut to Settings if the status is .denied (one approach is sketched below).
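
One way to intercept the denied case before presenting the sheet, using the real Photos and UIKit APIs (the alert copy is just an example):

    import UIKit
    import Photos

    /// True if saving can work: either already authorized, or not yet determined
    /// (in which case the system will prompt when the user taps "Save Image").
    func canAddToPhotos() -> Bool {
        let status = PHPhotoLibrary.authorizationStatus(for: .addOnly)
        return status != .denied && status != .restricted
    }

    func showPhotosPermissionAlert(from viewController: UIViewController) {
        let alert = UIAlertController(
            title: "Photos Access Needed",
            message: "Please enable Photos permission in Settings to save images.",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Open Settings", style: .default) { _ in
            // Deep link to this app's page in the Settings app.
            if let url = URL(string: UIApplication.openSettingsURLString) {
                UIApplication.shared.open(url)
            }
        })
        alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
        viewController.present(alert, animated: true)
    }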

Step 4 — Add a minimal UI screen

Design a simple UI within your app for this feature:

  • Capture display: An UIImageView or similar to show the last captured photo from the glasses. This gives users feedback of what they might save/share.
  • “Connect” button (if needed): If your app needs to connect to the glasses manually, include a button to handle that (tapping it would call into your WearablesClient to initiate pairing).
  • “Share” button: A prominent button (perhaps with an action icon) that triggers the MediaShareService.shareImage(...) function. If you want separate actions, you could also have:
    • “Save to Photos” button that directly saves without showing the share sheet (this would use UIImageWriteToSavedPhotosAlbum() and require Photo permission handling via PHPhotoLibrary).
    • But since the share sheet includes Save, many apps just use one Share button to cover all.
  • Status labels: It helps to show a connection status (e.g., “Glasses Connected ✅” or “Disconnected 🔴”) and maybe a note like “Captured image saved” after a successful save.

With the UI in place and wired to your sharing logic, you can build and run the app on your iPhone to test the integrated feature end-to-end.
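
A bare-bones sketch of how this screen might wire together, reusing the hypothetical GlassesManagerDelegate and MediaShareService sketches from earlier in this section:

    import UIKit

    final class CaptureViewController: UIViewController, GlassesManagerDelegate {
        private let imageView = UIImageView()    // Capture display
        private let statusLabel = UILabel()      // Connection status
        private let shareService = MediaShareService()
        private var lastCapturedImage: UIImage?

        func glassesManager(_ manager: GlassesManager, didCapture image: UIImage) {
            lastCapturedImage = image
            imageView.image = image
            statusLabel.text = "Glasses Connected ✅"
        }

        @objc private func shareTapped() {
            guard let image = lastCapturedImage else { return }
            shareService.shareImage(from: self, image: image)
        }
    }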


7) Feature Recipe — Let Users Save or Share a Captured Photo

Goal

Allow the user to tap a button in your app to save a photo (taken from their smart glasses) to the iPhone’s library or share it via the standard iOS sheet. We’ll focus on the photo use-case for clarity (video is similar, just ensure file format compatibility).

UX flow

  1. Capture: User has their glasses paired and takes a photo (via glasses hardware or an in-app command).
  2. Display: The new photo appears in the app’s UI (thumbnail or full-screen preview).
  3. User action: The user taps a Share button in the app, indicating they want to export this photo.
  4. System sheet: The iOS share sheet slides up, showing sharing options.
  5. User selects option: e.g., taps “Save Image” or chooses an AirDrop target or another app.
  6. Result: If Save was chosen, the photo is saved to Photos and the user sees a system confirmation (e.g., a small “Saved Image” HUD). If AirDrop, the photo is sent and the receiving device handles it.
  7. Completion: The sheet dismisses. Your app could show a brief confirmation message like “Photo saved to Photos 👍” if you want (optional, since system often handles it).

Implementation checklist

  • Ensure photo availability: Your code that receives the glasses photo should store it (in memory or disk) so it’s available by the time the user taps Share. (If the photo is large, you might already have it on disk – in which case, you can use that file directly.)
  • Permissions ok: Confirm Info.plist contains the usage descriptions (the presence of NSPhotoLibraryAddUsageDescription means iOS will prompt as needed – you typically don't manually prompt via code when using UIActivityViewController).
  • Share sheet integration: Implement the share action. For example, in your view controller:

        @IBAction func shareButtonTapped(_ sender: Any) {
            guard let image = lastCapturedImage else { return }
            let tempURL = saveImageToTempDirectory(image)
            let activityVC = UIActivityViewController(activityItems: [tempURL], applicationActivities: nil)
            present(activityVC, animated: true)
        }

    Here, saveImageToTempDirectory writes the UIImage as JPEG/PNG to a file and returns the file URL.
  • Progress/feedback: While not strictly necessary (the share sheet appears quickly), you might disable the Share button while writing the file or show a tiny loading spinner if the image is large.
  • Handle completion: Optionally, set activityVC.completionWithItemsHandler. In it, you can check activityType. If activityType == UIActivity.ActivityType.saveToCameraRoll (or its raw string value "com.apple.UIKit.activity.SaveToCameraRoll"), and completed == true, you know the user saved the photo. You could then do something like update UI or send a log event (see Observability section).
  • Error handling: If writing the image to a file fails (low disk space, etc.), handle that by showing an alert. If the share sheet itself errors out (rare), activityError in the completion handler would be non-nil, which you could log.

Pseudocode

    func onCaptureFromGlasses(image: UIImage) {
        self.lastCapturedImage = image
        imageView.image = image  // Show the image in the UI
        // Optionally auto-open the share sheet here, or wait for the user to tap Share
    }

    @IBAction func shareButtonTapped(_ sender: UIButton) {
        guard let image = lastCapturedImage else {
            showAlert("No image to share. Capture a photo first.")
            return
        }
        // Prepare image file
        let tempDir = FileManager.default.temporaryDirectory
        let fileURL = tempDir.appendingPathComponent("glasses_capture.jpg")
        do {
            try image.jpegData(compressionQuality: 0.9)?.write(to: fileURL)
        } catch {
            showAlert("Failed to prepare image for sharing: \(error)")
            return
        }
        // Create and present the share sheet
        let activityVC = UIActivityViewController(activityItems: [fileURL], applicationActivities: nil)
        activityVC.completionWithItemsHandler = { activity, success, items, error in
            if let error = error {
                print("Share failed: \(error)")
            } else if success {
                if activity == .saveToCameraRoll {
                    print("✅ Photo saved to library")
                } else if activity == .airDrop {
                    print("✅ Photo sent via AirDrop")
                }
            }
        }
        present(activityVC, animated: true)
    }

(This Swift pseudocode demonstrates saving the UIImage to a JPEG file, then using the share sheet. In a real app, add proper permission checks and user notifications as needed.)

Troubleshooting

  • Save option not appearing: Ensure you're using a file URL or UIImage in the activity items and have the usage description in Info.plist. Without the Info.plist key, iOS hides the save action.
  • No thumbnail in share sheet: If the share sheet just shows your app name or a generic icon instead of an image preview, it means you passed an in-memory object. Save to file and share the file URL to get a preview.
  • AirDrop target not showing: Make sure Wi-Fi and Bluetooth are ON on both devices. Also, AirDrop on the receiving device should be set to accept from the sending device (Contacts or Everyone). If not, the share sheet might not list any AirDrop devices. That’s an OS-level setting, not your app’s fault.
  • User expects instant save: Some users might prefer one-tap save without further interaction. If so, you can implement a separate “Save” button that calls UIImageWriteToSavedPhotosAlbum(image, self, nil, nil). This directly saves the image and prompts for permission on first use; note that passing nil for the completion selector means failures go unnoticed, so consider wiring up the callback (sketched after this list). Doing it via the share sheet as we did has the advantage of bundling multiple options without extra UI.
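
A minimal sketch of that one-tap save path. The completion selector below is the exact signature UIKit expects for UIImageWriteToSavedPhotosAlbum, so failures (commonly a denied Photos permission) are surfaced instead of silently dropped:

    import UIKit

    final class QuickSaveViewController: UIViewController {
        var lastCapturedImage: UIImage?

        @objc func saveButtonTapped() {
            guard let image = lastCapturedImage else { return }
            UIImageWriteToSavedPhotosAlbum(
                image, self,
                #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
        }

        // UIKit calls this back when the save finishes or fails.
        @objc func image(_ image: UIImage,
                         didFinishSavingWithError error: Error?,
                         contextInfo: UnsafeRawPointer) {
            if let error = error {
                print("Save failed: \(error.localizedDescription)")
            } else {
                print("✅ Saved to Photos")
            }
        }
    }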

8) Testing Matrix

Test your implementation across these scenarios to ensure robustness:

| Scenario | Expected Result | Notes |
| --- | --- | --- |
| Using a mock image (no glasses) | Share sheet appears and allows save/share of a placeholder image. | Good for CI or if you lack hardware – simulate as if an image was captured. |
| Real glasses capture (close range) | Image transfers quickly, share sheet works. AirDrop to a nearby device in under 5 seconds. | Baseline real-world test with device near phone. |
| App in background when capture arrives | (If supported) The image is queued or the user is notified when they return to the app to share it. | iOS might not allow background UI updates; ensure your app handles it gracefully (e.g., show the new capture when reopened). |
| Device locked when sharing | If the user tries to AirDrop while the phone locks, the transfer should resume after unlock. | iOS should handle this, but test that no crashes occur if the device locks mid-share. |
| Photos permission denied | If the user had denied add-photo permission, tapping “Save Image” yields no action or an error callback. | Your app should detect this and prompt the user with instructions to enable permission in Settings. |
| Glasses disconnect mid-capture | If the glasses go offline, your app should handle it (show “disconnected” and disable share if there is no new image). | Not directly related to sharing, but ensures the overall flow doesn’t hang. |
| Multiple images in a row | Capture two photos sequentially and share the second one. | The latest image should be shared. Check that temp file logic doesn’t share the old image due to filename reuse (use unique names or overwrite carefully). |

Make sure to also test on different iPhone models or iOS versions if available (e.g., iOS 16 vs iOS 17) to catch any differences in share sheet behavior or permission dialogs.


9) Observability and Logging

To maintain and support this feature, add logging for key events and outcomes:

  • Connection events: Log when the glasses connect or disconnect. E.g., GlassesConnectionManager.log("connect_start"), then "connect_success" or "connect_fail" with error details. This helps separate issues of sharing vs device connectivity.
  • Permission state: Log the result of the Photos permission prompt. For instance, when the UIActivityViewController completion fires for SaveToCameraRoll the first time, log if it succeeded or if the user canceled (which could imply they denied permission).
  • Share events: Log each share initiation and result (a sketch follows this list):
    • When user taps share: Analytics.track("share_initiated", {"mediaType": "photo"}).
    • If user chose a particular target and completed: log the activityType. For example, "share_completed", {"destination": "AirDrop"} or "share_completed", {"destination": "save_to_photos"}.
    • If activityType is nil and completed is false, that means the user dismissed the sheet without choosing (you could log a cancel event if desired).
  • Latency metrics: If possible, capture timing – e.g., how long from tapping share to completion. Particularly for AirDrop, you might measure from the moment share sheet is presented to the completion block being called with success. This can inform if large files take too long, etc.
  • Error tracking: If the completion handler provides an error (or if your save function catches an error), surface that via your logging or crash reporting. For instance, if activityError is non-nil (perhaps an unavailable activity, etc.), log it with a message.
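
A sketch of this instrumentation hung off the share sheet's completion handler. Analytics.track is the hypothetical logging call used in the bullets above; substitute your real analytics or logging layer:

    import UIKit

    // Hypothetical analytics facade.
    enum Analytics {
        static func track(_ event: String, _ properties: [String: String] = [:]) {
            print("analytics: \(event) \(properties)")
        }
    }

    func presentInstrumentedShareSheet(for fileURL: URL,
                                       from viewController: UIViewController) {
        Analytics.track("share_initiated", ["mediaType": "photo"])
        let presentedAt = Date()

        let activityVC = UIActivityViewController(activityItems: [fileURL],
                                                  applicationActivities: nil)
        activityVC.completionWithItemsHandler = { activityType, completed, _, error in
            let elapsed = Date().timeIntervalSince(presentedAt)
            if let error = error {
                Analytics.track("share_failed", ["error": error.localizedDescription])
            } else if completed {
                // The raw value identifies the destination (AirDrop, Save to Photos, etc.).
                Analytics.track("share_completed",
                                ["destination": activityType?.rawValue ?? "unknown",
                                 "seconds": String(format: "%.1f", elapsed)])
            } else {
                // nil activityType with completed == false: user dismissed the sheet.
                Analytics.track("share_cancelled")
            }
        }
        viewController.present(activityVC, animated: true)
    }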

By instrumenting these, in your analytics or logs you can answer questions like: How often are users saving vs AirDropping? Are any users failing to save due to permissions? What is the average AirDrop transfer time? This data is valuable for future improvements (or for convincing stakeholders of feature usage).


10) FAQ

  • Q: Do I need the actual glasses hardware to implement this feature? A: Not strictly to implement the sharing feature. You can develop the save/share functionality with any image (even one from the iPhone camera) because it uses standard iOS APIs. However, to test the end-to-end flow of capturing from the glasses, you’ll eventually need the device (or a provided simulator) to ensure the image transfer part works. Many developers start by using a placeholder image to build the UI and sharing logic, then integrate the real device when available.
  • Q: Which smart glasses are supported by this kind of integration? A: In general, any smart glasses that allow you to get the photos/videos into an iOS app can be supported. For example, Snap Spectacles (particularly the Snap AR Spectacles) have an SDK that lets your app receive Lens content frames. Meta's Ray-Ban Stories do not have a public third-party SDK, but your app could access the photos after the user imports them via the Meta View app (e.g., from the Photos library). Enterprise devices like Vuzix or Epson Moverio might run Android onboard (different integration). The guide focuses on consumer devices where the phone app does the heavy lifting of sharing after capture.
  • Q: Can I ship this feature to App Store? A: Yes, if you follow the guidelines. iOS's share sheet and Photo saving are public APIs. Just ensure you provide the privacy usage description for adding to Photo Library, or App Review will reject the app. Also, test that if the glasses SDK is beta, you're allowed to distribute an app using it (some beta SDKs might forbid publishing until final release). Assuming no such restrictions, there's no problem shipping this. It's essentially using standard iOS functionality to save images.
  • Q: Is there an equivalent to AirDrop on Android that I should implement? A: Android’s closest equivalent is Nearby Share, which is built-in to modern Android devices. You don’t implement a separate API for it; by invoking the standard share Intent with your image, “Nearby Share” will appear as an option in the chooser automatically if the feature is enabled on the device. Just like AirDrop, it allows offline sharing to nearby devices. So, on Android you get that feature out-of-the-box via the share sheet. The user just needs to tap the Nearby Share icon.
  • Q: Can I automatically save every photo to the user’s gallery without prompting (background save)? A: Technically you can call UIImageWriteToSavedPhotosAlbum on each incoming image to save it immediately. However, the first time, iOS will still prompt the user for permission to add to Photos. After that, it will silently save. The downside is the user might not want every single capture saved (maybe they took some test shots). It might be better to give them control (save on demand) unless the product requirements say otherwise. Also, consider that if a large batch of photos save at once, it could slow the device or clutter the camera roll. Provide clear settings or choices if implementing auto-save.
  • Q: The share sheet has a lot of icons (Print, Assign to Contact, etc.) – can I limit them? A: Yes. You can set the excludedActivityTypes property on UIActivityViewController to filter out certain options. For example, you might exclude .assignToContact, .print, etc., if they don't make sense for your context (see the sketch after this FAQ). Be cautious not to remove too much; generally it's fine to leave the defaults. On Android, you have less control over the chooser list (it's managed by the system, though you can create a custom chooser if needed).
  • Q: Can I share multiple images at once (e.g., a burst of photos)? A: UIActivityViewController supports multiple items. You could provide an array of file URLs/images. The share sheet would then let the user, for instance, save a batch of images to the library in one go (it would ask “Save X Images”). This could be useful if your glasses can capture many photos quickly and the user wants to export all. Just be mindful of memory and performance if the images are large – you might want to downscale or handle them efficiently.
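
A combined sketch for the last two questions: filtering share-sheet options and sharing several images at once (the file URLs are assumed to already exist on disk):

    import UIKit

    func presentFilteredShareSheet(for imageURLs: [URL],
                                   from viewController: UIViewController) {
        // Passing multiple file URLs lets the sheet offer e.g. "Save 3 Images".
        let activityVC = UIActivityViewController(activityItems: imageURLs,
                                                  applicationActivities: nil)
        // Hide options that rarely make sense for glasses captures.
        activityVC.excludedActivityTypes = [.assignToContact, .print, .addToReadingList]
        viewController.present(activityVC, animated: true)
    }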

11) SEO Title Options

  • “How to Save Smart Glasses Photos to iPhone Camera Roll and Share via AirDrop”
  • “Integrating iOS Share Sheet: Save to Photos & AirDrop for Wearable-Captured Media”
  • “Step-by-Step: Add ‘Save to Photos’ Option for Images Captured with AR Glasses on iOS”
  • “Smart Glasses to iPhone: Sharing Captured Media with Photos App and AirDrop”

(The above are potential titles optimized for searchability. They include keywords like “smart glasses”, “iPhone”, “save to camera roll”, “AirDrop”, etc. Choose one that best fits the focus – for example, the first one directly addresses a likely query about saving glasses photos to iPhone.)


12) Changelog

  • 2025-12-25 — Verified with iOS 17.2 on iPhone 15 Pro, using Snap Spectacles 2023 (Snap OS 5.64) for testing. Ensured UIActivityViewController flow and NSPhotoLibraryAddUsageDescription compliance on latest iOS. Updated Android instructions for API 33+.