Apple Maps has just received a series of new features in iOS 26, but the most important change is to the search engine, the core of the application. Apple says Maps now supports natural language search, powered by Apple Intelligence.
Like the Photos, Apple Music, and TV apps, which have already been noticeably improved by AI, Maps is now getting the same "upgrade". The feature is not a separate mode; it quietly enhances search across the app on Apple Intelligence-capable devices.
When opening Maps on iOS 26, users see a "Search the way you talk" prompt, inviting them to phrase queries the way they speak every day. The example Apple gives is intuitive: finding "a coffee shop with free Wi-Fi".
To do this, Apple applies the same kind of language model it already uses to summarize notifications, categorize emails, and suggest message replies. As a result, Maps can interpret longer, more descriptive queries instead of just matching keywords.
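For developers, the public MapKit framework has long offered a free-text search entry point that accepts this kind of conversational query. A minimal sketch of how an app might issue the article's example query is below; note this uses the standard `MKLocalSearch` API, not Apple's internal Apple Intelligence pipeline, and the coordinates are placeholder values for illustration.

```swift
import MapKit

// Build a free-text search request. `naturalLanguageQuery` accepts
// conversational phrases like the example from the article.
let request = MKLocalSearch.Request()
request.naturalLanguageQuery = "coffee shop with free Wi-Fi"

// Restrict results to a roughly 5 km area around a sample location
// (placeholder coordinates near Apple Park, for illustration only).
request.region = MKCoordinateRegion(
    center: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
    latitudinalMeters: 5_000,
    longitudinalMeters: 5_000
)

// Run the search asynchronously and print the names of any matches.
let search = MKLocalSearch(request: request)
search.start { response, error in
    guard let items = response?.mapItems else {
        print("Search failed: \(error?.localizedDescription ?? "unknown error")")
        return
    }
    for item in items {
        print(item.name ?? "Unnamed place")
    }
}
```

How well such a query resolves depends on the search backend; the iOS 26 change described here is about Maps' own search box understanding these phrasings, not a new developer API.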
The addition of natural language search to Maps shows Apple gradually making Apple Intelligence the core of the iOS experience.
Rather than showing off with eye-catching effects, these quiet changes deliver the most tangible value: helping users find exactly what they need, faster and more simply.