At this year’s WWDC keynote, Apple revealed that it is introducing Adaptive Audio to AirPods. The new feature blends transparency mode with Active Noise Cancellation to match the conditions of your surroundings.
Using machine learning, Adaptive Audio will create a more customised listening experience so that you don’t have to play around with audio settings on your iPhone while you’re on the move.
Now, when your AirPods detect that you’re on your way to or from work and there’s a lot of background noise from people talking, a new feature called Conversational Awareness will automatically activate transparency mode and lower background sound.
Conversational Awareness is aimed at making the ambient listening experience – when you’re walking down a street or at work and still want to be aware of your surroundings – a lot more adaptive (I can see where they got the name).
The feature can identify background noises, such as faraway conversations or a marching band, and reduce their volume while activating transparency mode at the same time.
It’s not adjustable, but it is adaptable
Active Noise Cancellation is currently only available on the AirPods Pro and AirPods Max, and even then there are only three listening modes to choose from – ANC, transparency mode and off.
By introducing Adaptive Audio, Apple has added a fourth mode that could offer the middle-ground noise cancellation we’ve been looking for. We previously wrote about the six features we hoped Apple would add in iOS 17 at WWDC, and an adjustable transparency mode was a big one for us.
It comes down to the fact that transparency mode has so far been all or nothing – either turned up high or switched off entirely. Adaptive Audio looks to solve this by automatically adjusting the level of transparency to match your environment. It looks like Apple really listened.