The Utah Digital Entertainment Network (UDEN) has been working to get Utah creators deeply involved in the virtual and augmented reality race, which even a few minutes of research suggests really is the future of entertainment. Their last meeting was a hands-on session with visitors from 11 companies, schools, and projects (most from Utah) that already use these technologies.
AR/VR is an important and viable target: $2.3B was invested in the market's development in 2016, and $800M in Q2 of 2017 alone. Digi-Capital predicts that more than 670 companies will be commanding a $108B VR/AR market a mere four years from now. What started as Google Glass, Microsoft HoloLens, Oculus Rift, and rumors about Magic Leap has led to Pokémon Go, The Void, and a full year of market maturity. It's definitely only going to get bigger.
What is Ambisonics?
To summarize other primers: ambisonics is most easily accomplished by recording audio with a 4-channel mic consisting of four cardioid or sub-cardioid capsules. Computer algorithms then translate those channels into 360 degrees of sound. It's similar to light field photography in that you can record the audio once, then have a computer interpret the results in many ways (2-channel stereo, or any of the many surround formats: 5.1, 7.1, Atmos, etc.). It's a significant improvement over simple binaural recordings, which mimic the way we hear but don't respond to the way we move.
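As a concrete (and much simplified) illustration of that flexibility, here's how a first-order B-format recording can be folded down to plain stereo by pointing two "virtual cardioid" mics left and right. This is a minimal sketch using traditional FuMa W/X/Y conventions; real decoders add shelf filtering and other refinements, and the function name is my own.

```python
import numpy as np

def decode_stereo(w, x, y, spread_deg=90.0):
    """Fold horizontal first-order B-format (FuMa W/X/Y) down to stereo
    by sampling the sound field with two virtual cardioid mics."""
    half = np.radians(spread_deg / 2.0)
    # A virtual cardioid aimed at angle a: 0.5*sqrt(2)*W + 0.5*(X*cos a + Y*sin a)
    left  = 0.5 * np.sqrt(2.0) * w + 0.5 * (x * np.cos(half)  + y * np.sin(half))
    right = 0.5 * np.sqrt(2.0) * w + 0.5 * (x * np.cos(-half) + y * np.sin(-half))
    return left, right

# A source panned hard left (azimuth +90 deg) in FuMa B-format:
s = np.ones(4)
w, x, y = s / np.sqrt(2.0), np.zeros(4), s
l, r = decode_stereo(w, x, y)  # the left channel comes out much louder
```

The same W/X/Y channels could instead be fed to a 5.1 or binaural decoder, which is the whole point: the format commits to nothing about the playback system.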
There are tools that take regular mono or stereo signals and turn them into usable 4-channel ambisonic wave files (known as B-format), like Waves' B360 plug-in; so if you can't afford an ambisonic mic, you can still get in on the action. (Follow this link for 10% off of Waves' stuff; that helps a little.) The Ambix plug-in suite is also an option but takes some work to figure out. If you're into the science of it, here's a paper about the actual ambisonic transformations.
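For intuition, the core of what such an encoder does is just trigonometric panning. Below is a minimal sketch (my own illustration, not Waves' actual algorithm) that places a mono signal into traditional FuMa B-format; note that many modern tools use the AmbiX convention instead (different channel order and no −3 dB factor on W).

```python
import numpy as np

def encode_b_format(mono, azimuth_deg, elevation_deg=0.0):
    """Pan a mono signal into first-order FuMa B-format (W, X, Y, Z).
    Azimuth 0 = front, positive = counter-clockwise (toward the left)."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = mono / np.sqrt(2.0)              # omni component, -3 dB per FuMa
    x = mono * np.cos(az) * np.cos(el)   # front/back figure-8
    y = mono * np.sin(az) * np.cos(el)   # left/right figure-8
    z = mono * np.sin(el)                # up/down figure-8
    return np.stack([w, x, y, z])

# Place a short 1 kHz test tone 90 degrees to the listener's left:
t = np.linspace(0.0, 0.01, 480, endpoint=False)
tone = np.sin(2.0 * np.pi * 1000.0 * t)
b = encode_b_format(tone, azimuth_deg=90.0)
```

With the source hard left, the signal lands entirely in the W and Y channels, which is exactly what a decoder needs to steer it back to the left ear later.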
Like electric cars, ambisonic mics have been around for a while but are about to start popping up all over. As with any new technology, the first ones to market come at a premium. For example, the Schoeps ORTF-3D Outdoor mic has eight capsules and costs almost $19k at the time of this writing; the SoundField ST450 is more than $7k; and the Audeze Planar Magnetic Tetrahedral VR Mic (that's a mouthful) is about $4k. Recently, though, Sennheiser released their AMBEO 4-channel mic for under $2,000 (link to see current price). I won't be surprised if every major mic maker releases one for under $1,000 in the next year. After all, the 4-channel 3D mic Audio-Technica helped develop for Ricoh's Theta 360 camera is less than $300; admittedly, it looks like a camera accessory rather than pro audio equipment (first time I've ever seen a 5-conductor 2.5mm plug, though).
The TetraMic is one notable boutique outlier, at around $1k. Its product page contains a lot of information on how ambisonic audio can be reconfigured after the fact for many uses.
The cheapest solution, by far, is the Zoom H2n portable recorder. It doesn't use the same style of cardioid array as the more expensive mics, but it still exports 4-channel surround audio that can be encoded into 360-degree files. It also supports Google's spatial audio file format, which is compatible with Google's VR toolkit. The H2n sits at a price point that makes it practical for students to check out and use for class projects.
Google created a page on "spatial audio" that covers ambisonics, how the head and reflections/reverb affect what we hear, and how developers can implement all of this in their projects. The page notes that ambisonic files are still recorded from a single location, so they're best used when the listener only pans around without moving, or for sounds in the distance of the virtual landscape. Other sounds can be added anywhere in the landscape, including attached to moving objects like birds. These are all critical considerations for audio engineers tasked with asset creation.
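The head-tracking piece is what makes panning around work: as the listener turns, the renderer counter-rotates the B-format scene before decoding. At first order, a yaw (left/right) rotation is just a 2-D rotation of the X/Y channels. This is a hedged sketch in FuMa conventions; real renderers such as Google's also handle pitch and roll, and the function name is my own.

```python
import numpy as np

def rotate_yaw(w, x, y, z, head_yaw_deg):
    """Counter-rotate a FuMa B-format scene for a listener's head yaw.
    W (omni) and Z (vertical) are unchanged by rotation about the vertical axis."""
    a = np.radians(head_yaw_deg)
    x2 =  np.cos(a) * x + np.sin(a) * y
    y2 = -np.sin(a) * x + np.cos(a) * y
    return w, x2, y2, z

# A source dead ahead (the X channel carries the signal)...
s = np.ones(4)
w, x, y, z = s / np.sqrt(2.0), s, np.zeros(4), np.zeros(4)
# ...after the listener turns 90 degrees to the left,
# ends up at the listener's right (signal moves to -Y):
w2, x2, y2, z2 = rotate_yaw(w, x, y, z, 90.0)
```

Because the rotation is a cheap per-sample matrix multiply, it can track head movement in real time, which is exactly why the format suits VR.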
What Kinds of Things Can Be Done with Ambisonics?
Imagine you’re at home, attending a virtual concert (via something like NextVR). If programmed appropriately, you can turn your head and the sound will change so you feel like you’re doing it at the concert itself. The same can be done with boxing matches, sporting events, and game shows. In a video game, the sound of creatures or events around you moves as your character looks around.
For reference, here’s a cool studio session where you get to hear a bunch of unusual instruments and vocals in 360 degrees. And, unless I’m mistaken, it was recorded on a bunch of mono mics, then turned into 360-degree audio.
With the right software and sensors, you can interpolate audio in a way that allows it to change as you move around. This concept is perfect for places like The Void in Pleasant Grove, Utah (and New York City). The Void is an arena that’s been 3D mapped (with environments built in Unity), so you can actually run around, look around, aim, and shoot, with every audiovisual aspect responding appropriately. They have proprietary equipment that each player straps on that allows for an exceptionally realistic experience. Basically, you’re walking around with a gaming computer and an army of sensors (powered by Intel, Nvidia, Magic Leap, etc.). Since each person’s world is rendered on his or her person, you’re cable-free and ready to roam. Here’s one article on what The Void’s founder thinks of VR in general and what his goals are.
Other Signs the Industry is Moving in Tandem towards VR/AR
Sound effects libraries have begun releasing products in ambisonic formats. Besides being ready to import into 360-degree videos and other immersive media, ambisonic audio can be used to create virtual microphones. In other words, if the mic position of a particular recording wasn’t well placed for the application, the engineer can reprocess the audio to sound as if a different placement had been used. That opens up new interpretations beyond what the recording engineer thought best at the time.
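As a sketch of how the virtual-mic trick works at first order: a weighted mix of the omni (W) and figure-8 (X/Y) channels yields a mic of any first-order polar pattern, aimed in any horizontal direction, long after the session is over. This is the standard textbook formula in FuMa conventions; the function name is my own.

```python
import numpy as np

def virtual_mic(w, x, y, azimuth_deg, pattern=0.5):
    """Derive a virtual first-order mic from horizontal FuMa B-format.
    pattern: 1.0 = omni, 0.5 = cardioid, 0.0 = figure-8."""
    a = np.radians(azimuth_deg)
    return (pattern * np.sqrt(2.0) * w
            + (1.0 - pattern) * (x * np.cos(a) + y * np.sin(a)))

# A source recorded dead ahead, in FuMa B-format:
s = np.ones(4)
w, x, y = s / np.sqrt(2.0), s, np.zeros(4)
front = virtual_mic(w, x, y, azimuth_deg=0.0)    # cardioid aimed at the source
back  = virtual_mic(w, x, y, azimuth_deg=180.0)  # cardioid aimed away: near silence
```

Re-aiming the mic or tightening its pattern is just a change of `azimuth_deg` and `pattern`, with no re-recording, which is the flexibility the libraries are selling.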
The Windows Creators Update rolled out just a couple of months ago. The features Microsoft put atop their updates list are all about augmented reality (they call it “mixed reality”). They also rolled out new gear to accompany the experience, including sophisticated head-mounted displays (HMDs).
The latest Adobe Creative Cloud update was largely about virtual and augmented reality as well. Among the changes are new VR video effects, transitions and editing tools for After Effects and Premiere Pro. Premiere also got increased ambisonics support so it’s easier to change the orientation of audio.
Magic Leap just closed a $500M investment round. The company’s project has been a pretty well-kept secret for a long time, but they’ve begun revealing more about what they’re working on. Basically, it looks like a wearable computer interface meant to replace monitor, mouse, and keyboard and create an augmented reality superimposed on the real world. One thing that sets them apart (according to patents filed) is a camera that reads your eye’s focal length to adjust the focus of digital graphics.
Why Use VR/AR in Schools?
With cool tech comes engagement that’s “off the charts.” That’s not something to take lightly, since it represents a paradigm shift in education. Students can interact with solutions and data in completely new and immersive ways, like this example at the British Columbia Institute of Technology. It’s a new tool for teachers to lead students through the discovery of their subject matter, and teaching gets easier and more effective the more immersive the experience (chart).
The world seems to be moving toward interactivity in every field, and music is no exception. Just as interaction, immersion, and discovery help students learn and stay engaged, they also make paid experiences more marketable. If you have any questions about VR or AR in an educational or engineering context, feel free to comment below.
Featured image is licensed CC0 by Andrew Robles. If you appreciate his work or want to use any of it for free, check it out at the link below.