How We Used VR to Explore What Music Feels Like to a Deaf Person

James does not have a cochlear implant; he uses hearing aids. Like Rachel, he reads lips. Communicating on the phone with a deaf person is often not ideal, so we used video chat or email to correspond and give creative notes. I went to London to work with James and his colleagues for a few days, and interacting in person helped our process — but I could say that about any collaboration.

“When I work with voice-over, I’m hardly ever able to work out what’s being said from the voice track alone,” James explained. “I can hear when something is being said, but not so much what is being said. So I use a combination of the audio track, the transcript with timings, and timeline markers. The audio waveform, which I can see on the computer screen, helps me to sync things up. If I still can’t work it out, there’s usually a friendly producer nearby who can help me fill in the gaps.”

At one point, before we had a crucial sound cue built in, James added some placeholder cues himself, set loud enough that he could hear them. When we watched and listened to that version together, the hearing collaborators dove for the volume controls. We all laughed about it and James apologized, but it taught us a lesson: James’s sound cues were the only ones we felt as well as heard. That mattered for our sound design. For the moment when Rachel’s cochlear implants are turned on, we aimed for a sound cue that jolted us physically.

We also wanted an acoustically extraordinary setting. Since there is a detail in Rachel’s op-ed about her first experiences hearing live music after getting her implant, we asked her where those experiences happened and which ones were the most impactful. The Santa Fe Opera, near her home in New Mexico, was at the top of her list. It generously opened its doors, and the back wall of its open-air stage, so we could film Rachel there with a twilit desert backdrop.

[Photo: Rachel Kolb stands in front of Lytro’s camera while Maureen Towey looks on. Credit: Marcelle Hopkins/The New York Times]

While the animations were being developed, we landed a partnership with the technical wizards at Lytro to create our live-action scenes. Their light field technology can bring an extraordinary amount of depth and detail to a virtual reality image. The biggest VR camera we use regularly at The Times is about the size of a basketball. Lytro’s camera is the size of a sumo wrestler. It gathers about 475 times more visual information than we do for a standard VR piece. Lytro’s system also features Six Degrees of Freedom (6DoF), which enables the viewer in a headset to move around within the piece. If you are standing in front of a person, you can close the distance to them or step farther away. If you are watching an animation, you can crouch down or swivel to the side to see what the image looks like from a different angle. This freedom of movement increases the sense that you are right there with Rachel. It’s what we call “presence” — the magic of VR.

During this process, we were also working closely with Brendan Baker, a sound designer whom I knew from his work on the podcast “Love + Radio,” which is known for its innovative use of sound in storytelling. I thought of him as the mad scientist of the podcast world, which was exactly what we needed to create a sound design that could accurately represent the sounds that Rachel was hearing when she first turned on her cochlear implant. The production company Q Department came in on the back end to spatialize the sound design so the sound would adjust as you moved your head in the headset.

When Rachel got that playlist from her friends, she was trying to do something she hadn’t done before: hear music. In working on this piece, we were trying to do some things we hadn’t done before. We were trying to create a VR piece that was animated, that incorporated new 6DoF technologies and that told Rachel’s story with the depth and sensitivity it deserved. In the VR department, our jobs are the most exciting when we can use a new technology to shine a light on an important story.

There are several ways to experience “Sensations of Sound” and other NYT VR pieces. You can watch through the NYT VR apps in Oculus and Daydream headsets for an immersive experience. You can watch on your phone through the NYT VR app or by clicking on this link and waving your phone around to explore. Or you can watch on your computer and use your mouse to scroll around in the 360 video.