Siri is Getting Eyes: Apple’s Camera-Equipped AirPods Enter Final Testing
Imagine walking down a street in a foreign city, looking at a menu, and having your earbuds whisper the translation directly into your ear—without you ever pulling out your phone. That futuristic vision is getting a lot closer to reality. Recent reports indicate that Apple’s long-rumored AirPods with built-in cameras have officially moved into the final stages of testing.
The Road to Production
According to Bloomberg’s Mark Gurman, these next-gen earbuds are currently in the Design Validation Testing (DVT) phase. For those not deep in the supply chain lingo, this is essentially the home stretch. It’s the stage where prototypes are rigorously tested internally to ensure the hardware is polished before moving into mass production.
If everything goes smoothly, the next stop is Production Validation Testing (PVT), where the first batches of retail-ready units hit the assembly line.
Why Put Cameras in Your Ears?
It sounds a bit sci-fi (and maybe a little strange), but the goal isn’t to snap photos. Instead, these low-resolution cameras are designed to act as “eyes” for a revamped, AI-powered Siri. By seeing what you see, the earbuds could provide contextual information about your surroundings, assist with navigation, or identify objects in real time. This is a major pillar of Apple’s broader push into wearable AI and augmented reality.
Rumors suggest these might carry the “AirPods Ultra” branding. While they’re expected to look similar to the current AirPods Pro 3, the stems might be slightly elongated to accommodate the new sensors and camera hardware.
When Can We Buy Them?
While the hardware is reportedly nearly finished, the software remains the wild card. Apple is famously perfectionist about user experience, and if the integration with the new “Apple Intelligence” version of Siri isn’t completely seamless, the launch could slip further. Originally slated for earlier this year, the timeline has already shifted as engineers fine-tune the complex AI interactions.
We’re likely looking at a launch window that depends entirely on software stability. One thing is certain: the way we interact with our environment is about to get a whole lot more interesting.
