Apple is reportedly developing augmented reality glasses that will offer a more intuitive and fluid software experience than current competitors, particularly the Ray-Ban Meta smart glasses. One of the biggest limitations of Meta's offering is the inability to run multiple apps simultaneously or switch between them without friction. Apple's solution appears to center on a modular app framework that lets users interact with different services, such as messaging, navigation, and media, without closing or restarting apps.
This approach builds on Apple's existing ecosystem strengths, such as Handoff, Continuity, and background activity management, which already enable seamless transitions between devices and apps. By extending these capabilities to AR glasses, Apple could offer a more desktop-like experience in wearable form, where users layer apps and switch contexts effortlessly.
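For a sense of the plumbing already in place: Handoff is built on NSUserActivity, where an app advertises what the user is doing so a nearby device can pick the task up. A minimal sketch, using a hypothetical activity type and payload:

```swift
import Foundation

// Minimal Handoff sketch. The activity type and userInfo keys are
// hypothetical; a real app declares its types under NSUserActivityTypes
// in Info.plist.
let activity = NSUserActivity(activityType: "com.example.glasses.navigation")
activity.title = "Resume turn-by-turn navigation"
activity.userInfo = ["routeID": "route-42"]
activity.isEligibleForHandoff = true
activity.becomeCurrent() // Advertises this activity to nearby devices
```

On the receiving device, the system hands the same NSUserActivity to the counterpart app, which restores the user's context from its userInfo.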
The glasses are expected to run a lightweight version of visionOS, optimized for low-power hardware and real-time responsiveness. Unlike Meta’s glasses, which rely heavily on voice commands and have limited visual feedback, Apple’s device may include gesture controls, eye tracking, and context-aware UI elements that adapt to the user’s environment and activity.
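visionOS already exposes primitives along these lines: the system's eye tracking drives SwiftUI hover effects, and a pinch acts as the tap. A minimal sketch of a gaze-aware control (the view name and button label are illustrative):

```swift
import SwiftUI

// Gaze-aware control sketch. On visionOS the system highlights the
// button when the user looks at it (gaze data never reaches the app),
// and a pinch gesture activates it.
struct GlanceableButton: View {
    var body: some View {
        Button("Open Navigation") {
            // Bring the navigation layer forward here.
        }
        .hoverEffect(.highlight) // Driven by eye tracking on visionOS
    }
}
```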
Apple’s rumored integration of Siri with on-device AI could further enhance usability, allowing users to summon apps, respond to messages, or control smart home devices without lifting a finger. This would address one of the core frustrations with current AR wearables: the lack of a cohesive, multitasking-friendly interface.
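The closest public analogue today is the App Intents framework, which lets Siri invoke app actions hands-free. A hedged sketch of what a glasses-friendly intent might look like (the intent and its behavior are hypothetical, not a confirmed glasses API):

```swift
import AppIntents

// Hypothetical intent Siri could run hands-free; this follows the
// existing App Intents pattern rather than any confirmed glasses API.
struct SendQuickReplyIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Quick Reply"

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would hand `message` to its messaging service here.
        .result(dialog: "Reply sent.")
    }
}
```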
While Apple has not confirmed a release date, analysts expect the AR glasses to debut sometime in 2026, possibly alongside the second-generation Vision Pro headset. If successful, the device could redefine expectations for wearable computing, shifting the focus from novelty to productivity.