Apple plans to revolutionize wearable technology with camera-equipped watches that use artificial intelligence to interpret the world around you.
Intelligent Vision: Apple Watch’s Camera-Powered Future
Apple is quietly developing a groundbreaking evolution of its popular smartwatch line. According to multiple reports from Bloomberg’s Mark Gurman, the tech giant is actively working on integrating cameras into future Apple Watch models, transforming them into sophisticated AI wearables capable of understanding and responding to the wearer’s surroundings.
Unlike the incremental updates expected for the 2025 Apple Watch models (which will likely focus on blood pressure monitoring and minor design refinements), this camera integration represents a more radical shift in the device’s capabilities. The technology would enable Apple Watches to capture visual information from the environment and process it through AI systems to provide contextual insights directly on your wrist.
Different Camera Placements for Series and Ultra Models
Apple’s engineering teams are exploring distinct camera configurations for different watch models. The standard Series watches would feature front-facing cameras integrated into the display—similar to the iPhone’s selfie camera but optimized for the watch’s compact form factor. Meanwhile, the more robust Ultra models would house camera lenses along the right-hand side near the Digital Crown and Side Button, providing a different capture angle.
This dual approach acknowledges the different use cases and physical constraints of each watch type, while maintaining Apple’s characteristic attention to thoughtful design integration.
Visual Intelligence: From iPhone to Your Wrist
The camera-equipped Apple Watches would extend Apple’s Visual Intelligence feature beyond smartphones. Visual Intelligence, which debuted with the iPhone 16 and is expected to reach iPhone 15 Pro models with iOS 18.4, lets users analyze objects and text using ChatGPT and Google Search.
When implemented on Apple Watch, this capability would enable scenarios where users could:
- Point their watch at a restaurant to instantly view ratings, hours, and menu highlights
- Identify landmarks while traveling without pulling out their phone
- Scan objects for shopping information or compatibility with other Apple devices
- Receive visual confirmation during exercise to ensure proper form
- Capture moments with a quick gesture when a phone isn’t readily accessible
These interactions would be seamlessly integrated into the Apple Watch experience, with information presented on-screen or possibly read aloud through the watch speaker or connected AirPods.
Beyond the Failed AI Pin: Learning from Others’ Missteps
Apple’s approach appears to be learning from the stumbles of earlier AI wearables. Gurman specifically references the “dismal Humane Ai Pin” as a cautionary tale. Where the Ai Pin struggled with practical utility and user experience, Apple has the advantage of building upon an already successful and familiar wearable platform with established user behaviors and expectations.
Rather than creating an entirely new device category—always a risky proposition—Apple is enhancing a product that millions already wear daily, potentially delivering AI capabilities in a more intuitive and useful format.
Apple’s Broader AI Wearable Strategy
The camera-equipped watches are reportedly part of a larger ecosystem of AI-enhanced wearables. Apple is simultaneously developing camera-equipped AirPods with similar contextual awareness capabilities. Both products are expected to launch around 2027, suggesting a coordinated rollout of Apple’s wearable AI vision.
Mike Rockwell, who previously led development of Apple’s Vision Pro headset, is playing a key role in bringing these AI features to Apple’s wearable devices while continuing his work on visionOS. This leadership continuity hints at a unified strategy across Apple’s various extended reality and wearable initiatives.
An interesting aspect of Apple’s timeline is the company’s AI development roadmap. Current Visual Intelligence features rely on third-party AI models, but Bloomberg reports that Apple aims to transition to its own in-house technology by the time these new wearables launch.
This shift would give Apple greater control over the user experience, performance optimization, and privacy protections—all core values for the company. It would also reduce dependency on potential competitors in the AI space.
Beyond Just Visual Data
While the camera hardware enables visual analysis, the true power of these future watches would come from combining visual data with other contextual information. Location data, user history, time of day, and even health metrics could all influence how the watch interprets and responds to what it sees.
For instance, scanning a food item could yield different information depending on whether the user has dietary restrictions logged in their health data, or scanning a transit map might prioritize different route options based on the user’s typical commute patterns.
The Long Road to Release
Despite the exciting possibilities, consumers shouldn’t expect camera-equipped Apple Watches on store shelves anytime soon. Gurman describes the upgrade as “generations away,” with a potential launch around 2027. This extended timeline suggests Apple is taking a methodical approach to solving the considerable technical and design challenges involved.
The company has been exploring these concepts for some time, as evidenced by patents describing Apple Watches with folding screens and integrated cameras. The recent reports indicate these explorations have progressed to more concrete development efforts.
Potential for FaceTime and Similar Apps
While the primary focus appears to be on AI-powered environmental analysis, the addition of cameras could enable other features as well. FaceTime support would be a natural extension, finally bringing video calling capabilities to the wrist-worn device. The small screen size presents obvious limitations, but for quick video check-ins, the convenience might outweigh the display constraints.
Other possibilities include document scanning, augmented reality experiences, or even security features like visual authentication.
If you are interested in this topic, check out our related articles:
- The New Apple AI Notification Summaries Feature Causes Confusion: A New Convenience for Users or An Even Bigger Headache?
- Apple Intelligence: Updated iOS with AI-Powered Innovation
Sources: TechRadar, Bloomberg, The Times of India
Written by Alius Noreika