When it comes to positional tracking, some VR devices do a great job, while some lack the capability entirely. It's one of the key differentiators between existing Mobile VR solutions and the Oculus Rift / HTC Vive. But what is positional tracking, anyway? Why is it important? And what systems exist right now to accurately track your motion in VR?

In the real world, you can rotate your head, look up and down, and tilt from side to side (yaw, pitch, and roll). Mobile VR tracks these movements and uses its knowledge of where your head is facing to create the illusion of being able to "look around" a virtual space. By fusing the gyroscope, accelerometer, and compass sensors in a smartphone, mobile VR can correct for most sources of error (like gyroscope drift) and accurately determine attitude at any point in time. The GearVR and Google Cardboard both run on a phone that snaps or straps into a headset, recalculating the attitude of the device to figure out what you are looking at in a virtual world and updating the display accordingly. Because they track rotation around three axes (x, y, and z) to determine the direction in which you're looking, we say that most mobile VR has 3 degrees of freedom.
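To give a flavor of how that sensor fusion works, here's a minimal complementary filter sketch. This is my own illustration with made-up numbers, not the actual fusion code any headset runs: the gyroscope is integrated for smooth short-term tracking, and the accelerometer's gravity reading slowly corrects the drift.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings to estimate pitch (radians).

    gyro_rate: angular velocity around the pitch axis (rad/s)
    accel: (ax, ay, az) in g's; gravity provides an absolute pitch reference
    alpha: how much to trust the gyro integration vs. the accelerometer
    """
    # Integrate the gyro: smooth and responsive, but drifts over time
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Derive pitch from the gravity vector: noisy, but drift-free
    ax, ay, az = accel
    pitch_accel = math.atan2(-ax, math.sqrt(ay**2 + az**2))
    # Blend the two: gyro dominates short-term, accelerometer corrects drift
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# Example: head held level while the gyro reports a small spurious drift.
# The accelerometer keeps pulling the estimate back toward level.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel=(0.0, 0.0, 1.0), dt=0.01)
```

The key design idea is that the two sensors fail in opposite ways, so a weighted blend gets the best of both: short-term smoothness from the gyro, long-term stability from gravity.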

In the real world, you can also change your position relative to your starting point. You can lean forward, stand up, or duck. All of these actions result in translational motion that cannot be captured by a fully self-contained system lacking a fixed external reference. The GearVR and Google Cardboard currently don't have a way to accurately track positional changes.

Tracking only rotational movement, 3 degrees of freedom, is a problem for immersive VR. Your ear-brain combination can tell when you lean forward, and if your eyes don't detect that motion too, there's a serious disconnect between your senses. You feel the results in your stomach (insta-nausea for those of us prone to sim-sickness, like me). Short bursts of mobile VR are great, but good positional tracking allows VR experiences to stay comfortable for longer sessions. It has to be so accurate that your subconscious can't pick up on a disconnect between your eyes and your vestibular system: accurate to less than 1 mm.
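To put that 1 mm figure in perspective, here's a back-of-the-envelope calculation (my own illustration, not a spec) of the angular precision an optical tracker needs in order to resolve 1 mm at typical room-scale distances:

```python
import math

def angular_resolution_deg(position_error_m, distance_m):
    """Angle (degrees) subtended by a positional error at a given distance."""
    return math.degrees(math.atan2(position_error_m, distance_m))

# Resolving 1 mm at 3 m away requires roughly 0.02 degrees of angular precision
res = angular_resolution_deg(0.001, 3.0)
```

That's a tiny fraction of a degree, which is part of why sweeping a beam and timing its arrival (as Lighthouse does) is such an elegant approach: time can be measured very precisely with cheap electronics.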

All of this is to say that positional tracking is one of the most fundamental components of immersive virtual reality. The Oculus Rift, HTC Vive, and other desktop-based VR sets typically have some sort of external tracking system that provides the fixed reference frame required to track your head (and VR goggles, and perhaps any hand controllers) when they're in motion.

In my work with IrisVR, I've had the opportunity to play with a few of these tracking systems over the past couple of years. In my opinion, the Vive Lighthouse is the best system of those that will be available in the first wave of desktop VR. Recently, however, one of my Lighthouse tracking units stopped working. The physics geek inside me took over and I seized the opportunity to take it apart instead of throwing it out.

The Lighthouse system is a set of two boxes, each of which contains two motors with casings that emit beams in the infrared. Once per revolution of the motors, the boxes flash IR light momentarily. (IR because it's a wavelength your overhead lights usually don't emit, so it's easier to detect. Also, you can't see infrared, so the flashes and sweeping beams won't bother you.) The beams from the motors are spread into a broad fan and sweep across your tracking area some time after the flash. Photodiodes (detectors that convert light into a change in electrical signal) on the face of the Vive HMD (all those little indentations on the front of the thing) register when they're hit by a flash or a beam, and from the time difference between the two, the computer can calculate the device's position in space.
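As a rough sketch of the core idea (the rotor speed here is an assumption for illustration; the real Lighthouse protocol has more going on), the angle of a photodiode relative to the base station falls straight out of the time between the sync flash and the moment the sweeping beam hits it:

```python
SWEEP_HZ = 60.0  # assumed rotor speed: one full revolution every 1/60 s

def sweep_angle_deg(t_flash, t_hit):
    """Angle (degrees) the beam has swept between the sync flash and the hit."""
    dt = t_hit - t_flash
    return 360.0 * SWEEP_HZ * dt

# A photodiode hit 1/240 s after the flash means the beam swept a quarter turn
angle = sweep_angle_deg(0.0, 1.0 / 240.0)  # 90 degrees
```

Each base station does this twice per cycle, once with a horizontal sweep and once with a vertical one, so every photodiode gets two angles per station, and with many photodiodes at known positions on the headset the full pose can be solved.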

It's super accurate, and did I mention insanely cool? If you want to learn more about the Lighthouse, go check out Gizmodo's post.

Below, I am going to go on a bit of a tangent. If you aren't interested in checking out the Vive hardware, definitely skip to the last couple of paragraphs. I won't judge you - some people don't like really cool shit - that's fine.

When I took the Lighthouse box apart, I blatantly ignored Valve's warning. Don't try this at home (I have no doubt it would invalidate your warranty) - just read my blog post instead. Sorry, Valve; I take full responsibility for any personal injury or mental anguish, and promise I did all of this in the privacy of my own home without the possibility of carelessly injuring a third party.

Laser Warning

First, here's what the Lighthouse components look like when they are removed from their housing. Notice the two black motors and the circuit board covered in IR "lights" (those ones I mentioned before that flash once per motor revolution).

Lighthouse w/o Cover

The beam that emanates from the motor casing is actually generated by an axis-aligned diode laser that hits a beam-splitter in the center of the housing. You can see the laser right at the center of the motor when I remove the housing.

Axis-aligned Laser

From there, it is directed through a lens in the casing (the clear thing embedded in the black cylindrical motor housing), and out through the top of the casing. Presumably, it is detected as "on" by a photodiode that sits on a board across from the motor in the Lighthouse enclosure.

Sketchy Diagram

I mentioned before that the beam is "broad" and sweeps across the entire tracking region. Yet it is emitted from a laser not dissimilar to the one you've probably used to point at a PowerPoint slide on a projection screen. How is the beam spread? The lens in the motor casing (a Powell lens, which fans a laser point out into a line) does the job. To illustrate, I sent a red (visible) beam along the path normally traversed by the IR laser. Here is the result, projected through the Powell lens and onto the wall of my apartment:

Beam Spread

Working in harmony, this hardware provides an accurate system for tracking just about anything to which you can affix an array of photodiodes. YouTuber "rvdm88" made an awesome animation of the Lighthouse in super slow motion to demonstrate the tech in action.
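For intuition on that last step, turning measured angles into a position, here's a toy 2D triangulation. The real system solves a full 3D pose from many photodiodes at once; this is just the planar idea, with base-station positions I made up:

```python
import math

def triangulate_2d(p1, theta1, p2, theta2):
    """Intersect two bearing rays from known base-station positions.

    p1, p2: (x, y) positions of the two stations
    theta1, theta2: bearing angle (radians) each station measures to the target
    Returns the (x, y) point where the two rays cross.
    """
    # Ray i: p_i + t_i * (cos(theta_i), sin(theta_i))
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using a 2x2 cross-product formula
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Stations at (0,0) and (4,0), target at (2,2): bearings of 45 and 135 degrees
x, y = triangulate_2d((0, 0), math.radians(45), (4, 0), math.radians(135))
```

Two angles pin down a point in a plane; in 3D you need the second (vertical) sweep from each station, which is exactly why each Lighthouse box carries two rotors.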

Positional tracking has certainly come a long way in the past few years. Bulky and expensive systems are about to be replaced (at least in part) by the inexpensive, portable, and accessible Lighthouse. Valve's ingenuity is bringing the promise of an additional three degrees of freedom (translation in the X, Y, and Z directions) to consumer VR. The result is the finest, most comfortable HMD (that will be) on the market.

Regarding the future of tracking for VR, there are a couple of issues with Lighthouse that I expect will be solved by a combination of software and hardware innovation within the next year. It is already excellent, and will only get better. I'll reserve the details for a future post, however. Furthermore, I anticipate that positional tracking will ultimately come to mobile VR, but it might not be in a form that looks much like Lighthouse. Mobile VR has the benefit of being, well, mobile and completely cord-free. Ultimately, this mobility will converge with tracking tech to produce a VR experience that does not sacrifice mobility for tracking, comfort, or performance. This is another topic worth investigating in much more depth, and I look forward to following the advances in "inside-out" tracking over the coming months and years. Look for a blog post soon on why comparing mobile and desktop VR today is apples-to-oranges and what it will take to bring them closer together.