We're not done yet. Once someone is in a VR world, they start believing in it. They look up, right, left, down, and their brain starts buying into the idea that they're really in this strange place. Then the user leans forward to examine some object more closely. The gyros can tell their head is tilted down, but not that it's moving forward, so the scene doesn't reflect the motion. They leaned forward but didn't get any closer to the thing they're looking at. To the user it feels like the entire world has tipped forward and away from them. Once again: VR Sickness.
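To put a rough number on it (these are my own made-up but plausible figures, not anything measured): say your eyes sit about 10 cm above the pivot point of your neck. A tiny bit of trigonometry shows how much real motion an orientation-only tracker throws away when you tilt forward:

```python
import math

# Hypothetical geometry: eyes ~10 cm above the neck pivot. Tilting the
# head forward 30 degrees swings the eyes forward through an arc, but
# orientation-only tracking renders as if they hadn't moved at all.
neck_to_eyes_m = 0.10
tilt = math.radians(30)

# How far forward the eyes actually travel (the untracked motion)
forward_shift = neck_to_eyes_m * math.sin(tilt)
print(round(forward_shift * 100, 1))  # prints 5.0 (centimeters)
```

Five centimeters of motion your eyes see but the headset doesn't report: more than enough for your brain to notice something is wrong.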
So we need the simulation to track the user's head movement. In the case of the Oculus, this is done with some infrared LEDs and an infrared camera. This is basically the same technology the Wii uses with the Wii remote. On the Wii, that sensor bar on top of the TV has an LED on either side. That black panel on the front of the Wii remote has a sensor behind it. It sees the two lights and uses them to figure out where it is relative to the television. The Oculus does the same thing to keep track of your head, except the parts are reversed: The lights are on the headset and the sensor is a camera on top of your monitor.
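As a rough illustration of the idea (my own toy numbers, not anything from the Oculus SDK): once the camera spots two lights a known distance apart, the standard pinhole-camera relationship tells you how far away they are based on how far apart they appear in the image:

```python
# Toy sketch of optical range-finding from two known lights.
# The pinhole model says: distance = focal_length * real_size / apparent_size.

def distance_from_leds(led_spacing_m, pixel_separation, focal_length_px):
    """Estimate distance to two LEDs a known real-world distance apart."""
    return focal_length_px * led_spacing_m / pixel_separation

# Assumed numbers: LEDs 15 cm apart, camera focal length 700 pixels.
# If the lights appear 100 pixels apart, the headset is about a meter away:
d = distance_from_leds(0.15, 100, 700)
print(round(d, 2))  # prints 1.05 (meters)
```

The real system tracks many LEDs and solves for full position and orientation, but this is the core geometric trick: closer lights spread apart in the image, distant lights bunch together.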
Once again, we have speed and latency problems to worry about. We've got to pull in these images from the sensor, scan them for the lights (just to be clear, these lights are infrared and thus invisible to the eye), do a bunch of math to figure out where the headset is relative to the camera based on the lights, use this information blended with previous updates to figure out how the head is moving, and send that data off to the application. Most webcams have a bit of latency and nobody cares, but now every millisecond counts, because the bigger the gap between the time your head moves and the time when your eyeballs see the movement, the more likely you are to find it confusing or disorienting.
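That "blended with previous updates" step is worth a closer look. One common approach (this is a generic complementary filter sketch, not necessarily what Oculus actually ships) is to trust the fast-but-drifty inertial estimate most of the time, and use each slow-but-absolute camera fix to gently pull the estimate back toward reality:

```python
# Hedged sketch of sensor fusion: the gyro updates fast but drifts; the
# camera updates slowly but doesn't drift. Blend them with a weighted average.

def fuse(predicted, measured, alpha=0.98):
    """Mostly trust the fast prediction; nudge it toward the absolute fix."""
    return alpha * predicted + (1 - alpha) * measured

# Each time a camera frame arrives, fold its position fix into the
# running estimate (hypothetical 1-D positions in meters):
estimate = 0.0
for camera_fix in [0.0, 0.1, 0.2, 0.3]:
    estimate = fuse(estimate, camera_fix)
```

The weighting means a single noisy camera frame can't yank your view around, but persistent drift in the gyros gets corrected over time instead of accumulating forever.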
There's still work to do. In a typical LCD, it takes the pixels a tiny amount of time to change state. Even if the display is refreshing at 75 Hz, when a pixel on the screen goes from light to dark it actually fades out gradually instead of snapping off instantly. Nobody cared before because this effect was too subtle to notice. But when you're wearing a VR headset, it means bright objects leave glowing "trails" behind them as your head moves. So now we need better displays with more responsive pixels.
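Some back-of-envelope arithmetic shows why this suddenly matters at 75 frames per second (the fade time here is a made-up illustrative number, not a spec for any particular panel):

```python
# At 75 fps, each frame is only on screen for about 13 milliseconds.
frame_time_ms = 1000 / 75
print(round(frame_time_ms, 2))  # prints 13.33

# If a slow pixel takes, say, 20 ms to fade to black, it's still partly
# lit through the ENTIRE next frame -- that lingering glow is the trail.
pixel_fade_ms = 20
frames_smeared = pixel_fade_ms / frame_time_ms
print(round(frames_smeared, 2))  # prints 1.5
```

On a desktop monitor a frame and a half of smear is invisible, because the image barely moves between frames. Strapped to your face, where a head turn sweeps the whole scene across the display, it turns every bright light into a comet.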
This is where we are today. As far as I can tell, we're done inventing new stuff and now it all comes down to refinement: More pixels. Faster displays. Smaller electronics. Oh, and we're powering all of this stuff off of USB ports, which vary in how much power they can supply, are in limited supply themselves, and aren't guaranteed to be anywhere near each other. Cordless headsets would help with this I suppose, which is probably why they're working on them.
We've been trying to invent this for over a quarter century. It's been a long road with a lot of setbacks. But it's finally happening, and a lot of the pieces have fallen into place in just the last two years.