Although this technology is unlikely to appear in the AirPods Pro 3 (expected to launch this year), it remains under active research and development.
Apple's goal is to help AirPods better understand the environment around users.
Alongside the iPhone 16 series, Apple introduced Camera Control, a new button for taking photos and adjusting camera settings, as well as a new feature called Visual Intelligence.
Visual Intelligence is a powerful tool that helps users explore the world around them and perform actions based on real-life context.
For example, users can scan an event flyer to add it to the Calendar app, or draw on artificial intelligence services such as ChatGPT or Google to learn about a given topic.
According to Mark Gurman, Apple is looking to bring this feature to AirPods to strengthen its position in the artificial intelligence race: "Apple is developing a new version of the AirPods Pro that uses external cameras and artificial intelligence to identify the surrounding environment and provide information to users."
According to Bloomberg, this technology will not arrive before 2027, possibly alongside the launch of the AirPods Pro 4.
Although doubts remain about its feasibility, Apple appears determined to extend artificial intelligence across its familiar products, laying the groundwork for a smarter ecosystem.