Apple's Camera-Equipped AirPods Reach Advanced Testing, Targeting Siri Visual AI
By Hector Herrera | May 16, 2026 | Vertical: Home
Apple's next-generation AirPods Pro — internally referred to as "AirPods Ultra" — have reached advanced hardware testing, with early mass production potentially beginning this summer, according to Bloomberg's Mark Gurman. The cameras will give Siri the ability to see what's around you — a capability Apple needs badly to compete with AI assistants that have left Siri behind on real-world context.
Why This Matters Now
Apple has been losing ground in the AI assistant race since OpenAI's GPT-4o and Google Gemini demonstrated real-time visual understanding in 2024. Siri's inability to process what users are looking at — a trivial task for competitors — has been a recurring source of frustration. Adding cameras to AirPods is Apple's bet that ambient, always-available wearable AI will become a more natural interface than lifting a phone.
The timing is deliberate. Meta's Ray-Ban smart glasses — camera-equipped, AI-connected — have been a quiet commercial success. Google is reportedly reviving smart glasses. Apple is choosing earbuds as its wearable camera platform, keeping the AI in your ear rather than on your face.
The Hardware
Each AirPod will contain one small camera — low-resolution, capturing visual context rather than full photographs or video. The sensors are designed to identify objects, read labels, and understand the environment around you without creating a persistent visual log. According to Gurman's reporting:
- One camera per earbud, positioned to capture the space in front of the wearer
- Low-resolution capture — context, not documentation
- No continuous recording; activation tied to Siri queries
- Priced above the current $249 AirPods Pro 3, likely in the $350–$400 range based on component costs
- Launch target: fall 2026, alongside iOS 27 and a redesigned Siri
What Siri Would Actually Do With This
The cameras don't help Siri unless Siri can use what they see. Apple's plan, as described in Gurman's reporting, is a redesigned Siri that can process visual context in real time. Ask "What kind of plant is this?" while walking past a garden and Siri answers. Hold a bottle in your hand, ask "Can I take this with my medication?" and Siri reads the label.
This is the use case Apple has been building toward through its AI acquisitions of the past three years. The camera hardware and the Siri software redesign are inseparable: one without the other would be useless.
What the Competition Looks Like
The field is moving fast:
- Meta Ray-Ban glasses — camera, voice AI, real-time visual queries. Commercial and growing.
- Google smart glasses — reportedly in development, leveraging Gemini's multimodal capabilities.
- Samsung — experimenting with AI-enhanced Galaxy Buds for audio-first context.
Apple's advantage is the installed base: hundreds of millions of AirPods users who already carry Apple's earbuds daily. If the cameras work and Siri delivers, the distribution is already there. The question is execution.
The Privacy Framing
Apple will lean heavily on its privacy positioning: no continuous recording, no cloud-stored visual data, on-device processing where possible. This framing will be essential. Cameras on your body are categorically different from cameras on your phone — the expectation of consent and control shifts.
Apple has navigated similar perception challenges before (Face ID, always-on Siri listening). Its record on not monetizing user data gives it more credibility here than most competitors. But the messaging will need to be exceptionally clear.
What to Watch
Whether Apple can deliver a redesigned Siri that actually leverages the cameras by launch day — not as a demo feature, but as a reliable daily-use tool. A camera in an AirPod that Siri can't meaningfully use would be a hardware investment with no software return. The fall 2026 timeline is ambitious given how far Siri has fallen behind.
Also watch: how regulators in the EU respond to always-available cameras in consumer audio devices. The GDPR (General Data Protection Regulation) implications of biometric-adjacent wearable data are still being worked out.
Hector Herrera covers AI in consumer technology, home systems, and the companies building them. He is the founder of Hex AI Systems.