Game-changing audio technology brings professional sound to mobile and IoT applications

February 1, 2017 OpenSystems Media

 

Waves Audio began 25 years ago as an R&D company serving the professional audio industry. The company was the first to bring digital equalizer technology to recording studios and never looked back, accumulating over 350 award-winning tools for producing and broadcasting media.

About 10 years ago, Waves decided to branch into the consumer market and began developing audio, voice, and speech performance enhancement. The company now ships in over 100 million devices annually in the consumer space through big-name original equipment manufacturers (OEMs) such as Dell, LG, and Google, to name a few.

Waves Audio recently announced that its Waves Nx binaural audio engine has been successfully targeted to ARM Cortex-A-based devices, bringing advanced professional-grade sound and virtualized mixing and mastering to mobile and IoT applications (Figure 1). The announcement puts professional audio experience and capabilities into the hands of consumers.

Perhaps the most intriguing aspect of this announcement is what Waves Nx is capable of even with simple headsets. Tomer Elbaz, Executive Vice President and General Manager of the Consumer Electronics Division at Waves Audio, explained where the technology started. “Our binaural audio engine was designed to virtualize an arbitrary number of sound sources and place them around the user. We released this to the professional audio market as a virtual mixing room, providing a realistic simulation of a mixing room so engineers could make critical sound decisions that were previously impossible on headphones, where the tone and acoustics weren’t preserved. By tracking head movements, you get the sound out of the head and put it in front of you. The feedback was incredibly favorable, and users were impressed with how accurate the sound was.”
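Waves has not published its algorithms, but the core idea Elbaz describes, keeping virtual sources fixed in the room while the listener's head turns, can be sketched as subtracting the tracked head yaw from each source's world-space azimuth before binaural rendering. The function and angle conventions below are illustrative, not Waves code:

```python
def head_relative_azimuth(source_az_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of a world-fixed source relative to the listener's head.

    0 degrees = straight ahead; positive = to the listener's right.
    The result is wrapped to the range (-180, 180].
    """
    rel = (source_az_deg - head_yaw_deg) % 360.0
    return rel - 360.0 if rel > 180.0 else rel

# A source placed straight ahead appears 30 degrees to the LEFT
# after the listener turns 30 degrees to the right:
print(head_relative_azimuth(0.0, 30.0))  # -30.0
```

Feeding this head-relative angle into the binaural renderer each time the tracker updates is what makes the sound stay "out of the head" and anchored in front of the listener.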

Idan Egozy, Product Manager at Waves Audio, explained how the head-position tracking works. “There is a tracker, powered by the ARM processor, that maps head movements. Waves Nx emulates the way audio behaves in a real acoustic space and tracks the orientation of the listener’s head to accurately simulate the way the acoustic waves reach each of the ears, which is just the way we perceive where sounds originate from in the real world. Gyroscope and telemetry information track head movements, and the data is delivered to the processing algorithms over a Bluetooth Low Energy (BLE) transport.”
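The article doesn't detail the DSP involved, but one classic ingredient of simulating "the way the acoustic waves reach each of the ears" is the interaural time difference (ITD): sound arrives at the nearer ear slightly earlier. A rough sketch using the well-known Woodworth spherical-head approximation follows; the head radius and speed of sound are standard textbook values, not Waves parameters:

```python
import math

HEAD_RADIUS_M = 0.0875   # typical human head radius (textbook value)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def itd_seconds(azimuth_deg: float) -> float:
    """Woodworth approximation of the interaural time difference.

    azimuth_deg: head-relative source azimuth, 0 = straight ahead,
    positive = source to the listener's right (clamped to +/-90).
    A positive ITD means the sound reaches the right ear first.
    """
    theta = math.radians(max(-90.0, min(90.0, azimuth_deg)))
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source hard right (90 degrees) gives an ITD of about 0.66 ms:
print(f"{itd_seconds(90.0) * 1e3:.2f} ms")
```

As the tracker reports a new head orientation, recomputing the ITD (along with level and spectral cues) per source is what keeps the virtual acoustic scene stable while the head moves.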

The revolutionary thing about this technology is its ability to virtualize true surround sound with nothing more than a pair of headphones. You can experience it for yourself with a Mac or PC, a headset, the Chrome browser, and the www.waves.com/nx/player web site. The site walks you through the download, installation, and a demo.

The ability to augment headsets and HTC Vive virtual reality headsets with audio that behaves like a real acoustic space is impressive, but far from the application ceiling for this technology. Idan noted, “We threw together a demo that included a 360-degree virtual reality (VR) racquetball environment. The sounds as the ball moved around the court were so lifelike, the audience immediately started planning VR games to take advantage of the technology.”

Tomer believes that as these advanced capabilities proliferate, artists will start taking advantage of them in new and exciting ways to produce content. “If you look back at the last century, when the concept of two-speaker stereo mixing was introduced, people didn’t get it for quite a while, until the content made the value obvious,” he said. “With this technology, you can take a two-speaker system and virtualize as many speakers as you wish, interacting in real time to create a user-centric, personalized experience that changes simply by tracking head movements. That will bring new creative outlets that promise to fundamentally change the way artists create content and the way consumers listen to it.”

Tomer also mentioned that Waves will be producing a chip that incorporates all the head-tracking technology needed to embed it into headphones and other devices, bringing these advanced capabilities to the consumer headphone market. The chip will be announced at Mobile World Congress (MWC) in Barcelona next month.

Key VR audio technologies like this go beyond traditional audio applications. IoT applications are increasingly predicated on voice and audio for commands, responses, and execution, and consumer and home IoT products gain significant value from enhanced audio. Unlocking virtualized sound and surround-sound capabilities will be transformational for these audio-rich IoT applications as well.

More information on the technology can also be found at www.waves.com/3d-audio-on-headphones-how-does-it-work.

 

Curt Schwaderer, OpenSystems Media