
At CES 2019, Sony totally messed with my brain.
The company’s new 360 Reality Audio demo is the stuff of futuristic dreams at the moment, but if it pans out the way Sony hopes, it won’t just be a revolutionary way to listen to music. For folks like me, it could improve quality of life itself by making music come alive in a thrilling new way.
Before we get to the demo, however, it’s worth noting that I almost missed it entirely. Sony’s press conference at CES 2019 was, to put it mildly, bizarre. The company showed off a total of zero new electronics products at a show that is built almost entirely on new electronics products. Instead, Sony’s hosts vamped for 45 minutes straight about the company’s music and cinema branches, at least half of which was reserved for an all-out gush session about Spider-Man: Into the Spider-Verse (which, to be fair, is an amazing movie, no pun intended).
As such, one can forgive a lowly journalist like myself for skipping the interminable line in front of Sony’s 360 Reality Audio demo to catch another meeting for which I was already late (welcome to CES). As it turned out, however, it was the wrong demo to skip: It ended up being my most impactful audio experience at the show.
Bending reality
After hearing multiple people rave about the experience, I made my way back to Sony’s booth on my last day of the show to see what all the fuss was about. Following a short wait (the line was thinning out by now), six of us were ushered into a dark room and guided to carefully arranged stools. Circular racks loaded to the brim with powered studio monitors rested both above our heads and at eye level, and in front of each of us was a pair of Sony’s MDR-Z7M2 high-resolution headphones. I now had some idea of what we were in for.
On the wall in front of us was a projection of Sony’s new audio software, a soup-to-nuts system designed for mixing (or remixing) sound in 3D. The system uses object-based surround techniques similar to those deployed in cinema formats like Dolby Atmos and DTS:X, but is designed specifically for music.
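To get a feel for what “object-based” means here, the core idea is that each sound is stored with a position rather than pre-assigned to channels, and the renderer computes per-speaker gains at playback time. Here’s a minimal toy sketch of that idea in Python; the four-speaker ring, the cosine weighting, and the function names are all illustrative assumptions, not Sony’s actual renderer.

```python
import math

# Hypothetical speaker ring: azimuth angles (degrees) for a toy 4-speaker layout.
SPEAKERS = [0.0, 90.0, 180.0, 270.0]

def pan_gains(object_azimuth_deg):
    """Toy object-based panner: weight each speaker by how closely its
    direction matches the sound object's direction, then normalize so
    total power (sum of squared gains) equals 1."""
    weights = []
    for spk in SPEAKERS:
        # Angular distance between object and speaker, folded into [0, 180].
        diff = abs((object_azimuth_deg - spk + 180.0) % 360.0 - 180.0)
        # Only speakers within 90 degrees of the object contribute.
        weights.append(max(0.0, math.cos(math.radians(diff))))
    norm = math.sqrt(sum(w * w for w in weights)) or 1.0
    return [w / norm for w in weights]

# An object halfway between the front and right speakers is shared
# equally between them; move the azimuth and the gains shift smoothly.
gains = pan_gains(45.0)
```

The point of the position-plus-renderer split is that the same mix can be played back on any speaker layout (or on headphones) by swapping the renderer, which is exactly what made Sony’s speakers-to-headphones hand-off possible.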
After a frequency-spanning signal burst from each of the speakers to tune the sound for the room, our host blasted us from all sides with two brilliantly engineered tracks, an instrumental piece and a pop tune. As the audio showered all around me, I got the first telltale sign that my ears and I were having a good time – I got chills.
While I’d experienced similar demos before, the next part was a new twist. The engineers placed teensy microphones that looked like bent paper clips into our ears, then told us to put on the headphones sitting in front of us. Again, they shot frequency-spanning bursts of sound into our ears – this time only from the headphones – in order to measure our individual hearing profiles. Then, something magical happened.
While the host was still talking, he cranked up the music again, and we were immersed once more on all sides. Once again, my physiology responded, and I got chills. Only this time – as you may have already guessed – it wasn’t coming from the speakers, but from the headphones.
I knew what was coming. You knew what was coming. And yet, my brain insisted that what I knew was wrong: The sound was still coming out of the speakers. Predictable though it was, the whole setup was sort of brilliant, right down to the Sony rep continuing to speak during the second demo, which gave my ears a spatial home base by anchoring both him and the speakers to my spot in the room. In spite of my own logic, there was simply no audible way to discern a difference in sonic depth or direction between the two sources. I was fooled.
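Under the hood, that trick comes down to filtering each sound through per-listener ear responses: The in-ear microphones capture how sound from the room arrives at your eardrums, and the headphones then convolve the music with those measured responses so it arrives the same way. Here’s a minimal sketch of that final step; the numbers standing in for measured ear responses are made up for illustration, not real measurement data.

```python
def convolve(signal, impulse_response):
    """Direct-form FIR convolution -- the basic operation a binaural
    renderer performs per ear."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def binauralize(mono, ear_left, ear_right):
    """Render a mono source to two ear signals using per-listener
    impulse responses (toy values here, not real measurements)."""
    return convolve(mono, ear_left), convolve(mono, ear_right)

# Toy responses: the right ear hears the source one sample later and
# quieter, roughly what a sound arriving from the listener's left does.
left, right = binauralize([1.0, 0.5], [1.0], [0.0, 0.6])
```

Because the responses are measured from your own ears, the delays and colorations that normally tell your brain “this is coming from over there” survive the trip through the headphones – which is why the illusion held even when I knew better.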
Coming to headphones near you?
As alluded to above, Sony is far from the first to show off such a demo. I was given a similar wow-factor test featuring DTS Headphone:X, which is designed more for cinema sound, while Sennheiser’s years-in-the-making Ambeo system is built around similar techniques and sound properties.
Still, Sony’s demo may be the most striking I’ve ever heard when it comes to re-creating both the full immersion of the speaker set and the thrilling liveness of the sound in stereo headphones. You really do feel like the sound is up close and in person. It’s virtual reality for your ears, only they’re apparently a lot easier to fool these days than your eyes.
The real question is when we can all get hold of this technology, and the answer may be a long time coming. For one thing, it’s obviously impractical for Sony to hand out its ridiculously expensive microphones to each of its 360 Reality Audio users for analysis. Instead, the company envisions an app that will photograph your ear and deliver personalized content, but it remains to be seen whether that will be as effective. In our demo, the microphones failed to capture one listener’s ears correctly and the measurement had to be restarted, so the process appears to be quite sensitive at present.
Second, on the content side, it may also be some time before Sony has many remixes using its new object-based sound technology, let alone native recordings. Sony being Sony, we also imagine the system will be limited – at least for some time – to Sony artists and equipment.
Still, Sony’s new 360 Reality Audio system is a promising way to get closer to your music than ever before, and we’re just getting a taste of what this kind of object-based sonic immersion will become. One can imagine multiple applications, from jamming out to live concert mixes or studio sessions to virtual reality pairings that use spatial trickery to put you in a hallucinogenic fantasy world.
As far as this writer is concerned, that makes right now a very exciting time to be a fan of all things audio. I can’t wait to see where this technology goes next.