Anyone who's ever heard Apple's proprietary spatial audio with dynamic head tracking will tell you, in no uncertain terms, that movie content delivered from an iPhone, iPad, Mac (with Apple silicon), or Apple TV 4K to your AirPods Max, AirPods Pro, or AirPods 3 is profoundly transformed.
Yet to give it a spin? Try the opening scene from Gravity, the shootout from the latest James Bond epic, No Time To Die, or pretty much any scene from Venom, ideally from a 12.9-inch iPad Pro to a set of AirPods Max. But we digress…
At its recent WWDC 2022 event, Apple announced an update to Spatial Audio that arrives with iOS 16. The news? Now it's personal.
The announcement itself was pretty simple: Craig Federighi, Apple's senior vice president of software engineering, said that with the release of iOS 16, users will be able to use an iPhone's TrueDepth camera to create a profile for "Personalized Spatial Audio."
So nobody's asking you to go out and get ear impressions from an audiologist; your iPhone's camera has this covered. In truth, this isn't the first time we've seen this kind of approach from headphone manufacturers.
The Sony Headphones app, for example, has been guiding users through photo-style ear scans with their phone's camera for a while now, across two generations of its top-selling WH-1000XM4 and WH-1000XM5. And of course, Sony has its own spatial audio format to get the most out of them: Sony 360 Reality Audio…
How does spatial audio work now, and how will it improve?
The third-generation AirPods brought Spatial Audio with head tracking at a lower price point than the AirPods Pro. (Image credit: Apple)
Sony 360 Reality Audio and Apple's head-tracked spatial audio both use what are known as head-related transfer functions (HRTFs). An HRTF is essentially a formula describing the physical traits of any listener (the shape of the ears, the shape of the head, the distance between the ears, whether there's really anything between them… OK, that last one is a joke) that affect how sound arriving from a given point plotted in space is received; Sony is very clear that its immersive solution works within a sphere around the listener.
By processing data from thousands of people, an HRTF can be built that closely matches the average person's perception and response, i.e. immersive, head-tracked spatial sound that will impress just about anyone.
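As a rough illustration (this is not Apple's or Sony's actual pipeline), binaural rendering with an HRTF boils down to convolving a mono source with a measured impulse response for each ear; the toy "HRIRs" below are made-up stand-ins, where the right ear simply gets a slightly later, quieter copy of the sound, as it would for a source off to the listener's left:

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with per-ear head-related impulse
    responses (HRIRs) to place it where the HRTF says it should be."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    # Pad both channels to the same length before stacking.
    n = max(len(left), len(right))
    left = np.pad(left, (0, n - len(left)))
    right = np.pad(right, (0, n - len(right)))
    return np.stack([left, right], axis=0)

# Toy stand-ins for measured HRIRs: direct path to the left ear,
# a ~0.6 ms delayed and attenuated path to the right ear.
sr = 48_000
t = np.arange(sr // 10) / sr                     # 100 ms of samples
mono = np.sin(2 * np.pi * 440 * t)               # test tone
hrir_l = np.array([1.0])
hrir_r = np.concatenate([np.zeros(30), [0.6]])   # 30-sample delay, quieter

stereo = render_binaural(mono, hrir_l, hrir_r)
print(stereo.shape)  # two channels, length of the full convolution
```

Real HRTF sets (and whatever Apple's ear scan produces) contain one such impulse-response pair per direction, measured or modeled rather than hand-written like these.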
So why the need for customization? Well, Sony already offers it, and by mentioning the iPhone's TrueDepth camera, Apple is hinting that its snapper can also 3D-map your ears, though the company hasn't explicitly said so.
It's also unclear whether Apple's personalized spatial audio will involve a more robust fit test than the one currently featured in the AirPods Pro, which plays sounds and uses the in-ear microphone to check the effectiveness of the seal between your ear canal and the AirPods, all with the aim of achieving the best possible audio quality and noise cancellation. Will the new customization process also include a hearing test, as seen in products like the ultra-customizable NuraTrue earbuds?
But, and this is a compliment, since Apple's spatial audio is already great, will all of this really make it better? That remains to be seen. The iOS 16 developer beta is now with a select group of testers, the public beta is coming in July, and a full release is planned for late 2022, provided you have an iPhone 8 or later; if you're feeling brave you can install it on your iPhone now, though we're not sure we'd recommend that course of action just yet.
Opinion: Apple's Spatial Audio will accomplish great things, but not with this particular update
AirPods Max already offer impressive spatial audio. Can it really get better? (Image credit: TechRadar)
However Apple ends up "customizing" its wonderful head-tracked spatial audio, I doubt this is the update we all really want to see and, more importantly, hear.
You see, spatial audio takes Dolby Atmos surround sound signals and adds directional audio filters on top, adjusting the frequencies and volume levels each of your ears hears so that sounds can be placed virtually anywhere around you. And when you're using Apple's high-end AirPods with an Apple device that has head tracking implemented, the device is also positioned and recognized as the sound source: watch one of the movies recommended at the start of this piece and simply walk a few steps away from your device. Now turn around slowly. See?
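To make that "device as source" behavior concrete, here's a minimal sketch of the core head-tracking idea (my own illustration, not Apple's implementation): as your head turns, the renderer rotates the virtual source the opposite way, so the sound stays anchored to where the device actually is:

```python
def world_locked_azimuth(source_azimuth_deg, head_yaw_deg):
    """Azimuth at which to render a source relative to the head so it
    stays fixed in the room: subtract the head's yaw, then wrap the
    result into the range (-180, 180] degrees."""
    return (source_azimuth_deg - head_yaw_deg + 180) % 360 - 180

# Device straight ahead (0°); you turn 90° to the right, so the sound
# must now be rendered 90° to your left to stay put in the room.
print(world_locked_azimuth(0, 90))    # -90
# Turn exactly toward a source at 30° and it lands dead ahead.
print(world_locked_azimuth(30, 30))   # 0
```

The real system does this continuously in three dimensions using the gyroscope and accelerometer data from the earbuds, feeding the resulting direction into the HRTF filters described above, but the "rotate the scene against the head" principle is the same.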
What would be really amazing when it comes to spatial audio is the ability to physically walk through a symphony orchestra, stopping next to the bassoons or the second violins, perhaps. Currently, your device is always the source, so while the effect is immersive, you can't get that truly advanced level of customization; you can't home in on the timpani while walking up to it.
From an app perspective, the ability to switch head-tracked content from "device as source" to "in-place" (if you had the space to experience a virtual tour of the Sydney Opera House at your local community center, for example) would truly improve spatial audio. It may come, but it hasn't happened yet.
As with all such advancements, it's when the technology is truly mature and malleable, when the end user can push it to the limit, break it, and put it back together "incorrectly" but in a way they see as an improvement, that spatial audio will reach its full potential.
I'm not sure taking a picture of your ear to optimize spatial audio for AirPods will accomplish this, but I don't want to rain on Apple's parade, either. It's definitely a step in the right direction, and I'm very excited to see what this award-winning technology can accomplish in the future. After all, we're so convinced that we even picked 10 albums we'd like to see available in Spatial Audio.