That button had activated the eye-tracking technology of Tobii, the Swedish company where Karlén is a director of product management for VR. Two cameras inside the headset had begun watching my eyes, illuminating them with near-infrared light and making sure that my avatar’s eyes did exactly what mine did.
Tobii isn’t the only eye-tracking company around, but with 900 employees, it may be the largest. And while the company has been around since 2006, Qualcomm’s prototype headset—and the latest version of its Snapdragon mobile-VR platform, which it unveiled at the Game Developers Conference in San Francisco this week—marks the first time that eye-tracking is being included in a mass-produced consumer VR device.
Eye-tracking unlocks “foveated rendering,” a technique in which graphical fidelity is prioritized only for the tiny portion of the display your pupils are focused on. For Tobii’s version, that’s anywhere from one-tenth to one-sixteenth of the display; everything outside that area can be dialed down as much as 40 or 50 percent without you noticing, which means less load on the graphics processor. VR creators can use that headroom to coax current-gen performance out of a last-gen GPU, or to hit a higher frame rate than they otherwise could.
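The core idea reduces to a per-tile decision: render at full resolution inside a small foveal radius around the gaze point, and at a reduced rate everywhere else. Here is a minimal sketch of that decision, assuming a tile-based renderer with normalized screen coordinates; the function name, radius, and 50-percent floor are illustrative choices matching the figures above, not Tobii’s actual API.

```python
import math

def shading_scale(tile_center, gaze, fovea_radius=0.1, min_scale=0.5):
    """Return the fraction of full resolution at which to render a tile.

    Tiles within `fovea_radius` of the gaze point (normalized 0-1
    screen coordinates) keep full fidelity; everything else is dialed
    down to `min_scale` (here 0.5, i.e. a 50 percent reduction).
    """
    dist = math.dist(tile_center, gaze)
    return 1.0 if dist <= fovea_radius else min_scale

# Gaze at screen center: a nearby tile renders at full resolution,
# a peripheral tile at half.
print(shading_scale((0.52, 0.5), (0.5, 0.5)))  # 1.0
print(shading_scale((0.9, 0.1), (0.5, 0.5)))   # 0.5
```

Real implementations typically use several concentric quality bands rather than a single hard cutoff, but the savings come from the same place: most of the frame is rendered below native resolution.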
That’s just the ones-and-zeros stuff. There are compelling interface benefits as well. Generally, input in VR is a three-step process: look at something, point at it to select it, then click to confirm the selection. When your eyes become the selection tool, those first two steps become one. It’s almost like a smartphone, where tapping collapses pointing and clicking into a single step. And because you’re using your eyes rather than your head, there’s less head motion, less fatigue, and less chance of discomfort.
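In code, the collapse is easy to see: the gaze point doubles as the pointer, so whatever you are looking at is already “pointed at,” and a single click completes the selection. This is an illustrative sketch, not any vendor’s input API; the target layout and hit radius are made up.

```python
import math

def gaze_select(targets, gaze, clicked, radius=0.05):
    """Two-step gaze input: looking is pointing, clicking confirms.

    `targets` maps names to normalized screen positions; return the
    name of the target under the gaze point when a click arrives,
    otherwise None.
    """
    if not clicked:
        return None
    for name, center in targets.items():
        if math.dist(center, gaze) <= radius:
            return name
    return None

targets = {"play": (0.3, 0.5), "quit": (0.7, 0.5)}
print(gaze_select(targets, (0.31, 0.5), clicked=True))   # 'play'
print(gaze_select(targets, (0.31, 0.5), clicked=False))  # None
```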
There’s also that whole cameras-watching-your-eyes thing. Watching not just what your eyes are doing, but where they look and for how long—in other words, tracking your attention. That’s the kind of information advertisers and marketers would do just about anything to get their hands on. One study has even shown that gaze-tracking can be (mis)used to influence people’s biases and decision-making.
“We take a very hard, open stance,” Werner says. “Pictures of your eyes never go to developers—only gaze direction. We do not allow applications to store or transfer eye-tracking data or aggregate over multiple users. It’s not storable, and it doesn’t leave the device.”
Tobii does allow for analytics collection, Werner concedes; the company has a business unit focused on working with research facilities and universities. He points to eye-tracking’s potential as a diagnostic tool for autism spectrum disorders, and to its applications in phobia research. But anyone using that analytical license, he says, must inform users and make eye-tracking data collection an opt-in process.
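The policy Werner describes amounts to a consent gate in the data path: gaze direction always flows to the running app for interaction, but it is only ever recorded for analytics behind an explicit opt-in. A hypothetical sketch of that gate (class and method names invented for illustration, not Tobii’s SDK):

```python
class GazeSession:
    """Illustrative opt-in gate: gaze samples drive interaction
    unconditionally, but are logged for analytics only with consent."""

    def __init__(self, user_consented: bool):
        self.user_consented = user_consented
        self.analytics_log = []

    def on_gaze_sample(self, direction):
        # Apps always receive gaze direction for rendering and input...
        self.render_with_gaze(direction)
        # ...but it is recorded only if the user explicitly opted in.
        if self.user_consented:
            self.analytics_log.append(direction)

    def render_with_gaze(self, direction):
        pass  # foveated rendering, UI selection, etc.

session = GazeSession(user_consented=False)
session.on_gaze_sample((0.1, -0.2))
print(len(session.analytics_log))  # 0
```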
There is no reason why eye tracking couldn’t do the same things (and pose the same risks) in AR devices.