Earbuds that recognize what is around you: Apple is moving closer to launching a new generation of AirPods with built-in cameras, transforming the earbuds from an audio accessory into a wearable device that understands its surroundings. According to a report by Bloomberg's Mark Gurman, the project has reached an advanced testing stage, with prototypes featuring a nearly finalized design and capabilities already being tested internally. If the product ultimately reaches the market, it could become one of Apple's first major steps toward a world in which artificial intelligence not only listens to the user, but also sees the environment around them.

According to the report, the new earbuds include tiny cameras in both the left and right earpieces. Unlike smartphone cameras, they are not intended for taking photos, videos, or personal recordings, but rather for collecting basic low-resolution visual information. This data is expected to be used by Siri, Apple’s voice assistant, to answer questions related to what the user is seeing.

The core idea is relatively simple to understand but extremely complex to implement: Users would be able to look at an object, product, food ingredient, or street landmark and ask Siri what they are seeing, what can be done with it, or how to act accordingly. For example, someone standing in front of ingredients in the kitchen could ask what can be prepared for dinner. A pedestrian in the city could receive more precise directions, as the system identifies buildings, signs, or landmarks and explains when and where to turn.

In practice, this represents an attempt to turn AirPods into a wearable AI device, one that remains on the body for extended periods and continuously gathers information from the world around the user. Apple sees earbuds as a relatively natural starting point for such a move: They are already an extremely widespread product, worn for hours at a time, and frequently purchased alongside the iPhone. Since their original launch in 2016, AirPods have become one of the products most closely associated with Apple, beyond the iPhone itself.

According to Bloomberg, the prototypes are currently in what is known as the design validation testing stage, one of the final phases before early manufacturing tests. This means the hardware is already relatively close to its final form, but the path to a commercial launch still depends not only on the physical product, but also on the quality of the AI capabilities powering it. If Apple is not satisfied with the accuracy, speed, or reliability of the visual recognition system, the launch could be delayed.

Apple Store. The race toward artificial intelligence. (credit: REUTERS)

Such a delay has reportedly already happened once. Apple had previously planned to launch the earbuds in the first half of the year, but delays involving the new version of Siri disrupted the timeline. The company has been working for some time on a major overhaul of its voice assistant after years in which Siri was perceived as lagging behind the new generation of AI tools.

The new AirPods feature is tied to a broader race across the technology industry. Meta is advancing smart glasses equipped with cameras and AI capabilities, OpenAI is recruiting senior hardware talent and developing new AI-centered devices, and other companies are also trying to discover the next device that could replace or complement the smartphone. Apple, long considered a company that prefers entering markets only once products are mature enough, is now under pressure to prove it is not falling behind.

One of the key questions surrounding the new product concerns privacy. Cameras embedded in earbuds naturally raise concerns among people around the user, especially if it is unclear when visual information is being collected and where it is being transmitted. According to the report, Apple has integrated a small LED light into the earbuds that is supposed to illuminate whenever visual data is being uploaded to the cloud. However, due to the earbuds’ tiny size, it remains unclear how noticeable that light would actually be in real-world environments.

Apple will also need to convince the public that cameras placed in earbuds do not turn them into covert recording tools. The company is expected to emphasize that the cameras are not intended for ordinary photography, but rather for temporary visual recognition designed to provide answers or guidance. Still, that distinction may prove difficult to explain to the broader public, especially in an era in which people are increasingly sensitive to issues involving surveillance, cameras, cloud services, and the use of personal information.

From a design perspective, the earbuds are expected to resemble the upcoming AirPods Pro 3, but with longer stems designed to house the cameras and related components. According to the report, Apple does not plan to turn them into a gesture-control device, as seen in some of its more sensor-heavy products. The goal is more focused: To add a layer of vision to an existing and widely used product without turning it into a bulky device.

Meanwhile, the report positions the earbuds as part of a broader wave of future Apple products. According to Bloomberg, the company is also working on smart glasses, a pendant-like device equipped with cameras, a touchscreen MacBook, a foldable iPhone, and AI-powered smart home devices. The big question is whether Apple will manage to combine polished hardware, privacy protections, an improved voice assistant, and genuinely useful artificial intelligence capabilities into a cohesive experience.