July 31, 2012

Oculus Rift: Motion Tracker development

Virtual Reality headsets have been consistently disappointing for several decades now. Three reasons come to mind to explain this state of affairs:

  • The hardware was not good enough: too much latency and too narrow a field of view.
  • The hardware was too expensive.
  • Almost no software supported the devices.

But things are about to change thanks to a device to be released this September via Kickstarter funding: the Oculus Rift.

Oculus RIFT

Palmer Luckey, the man behind the project, has managed to put together a set of outstanding components that provide very low latency and a significantly increased field of view; he also managed to keep the price under $250. This tackles the first two problems, the ones related to hardware.


As for the software, Luckey secured official support from id Software: Doom 3 BFG Edition will be compatible out of the box. Valve is also confirmed to be working on Portal compatibility:


EDIT: The Kickstarter project opened on August 1st and reached its $250,000 goal within an hour. The hype around this thing is beyond anything I have ever seen in the field. Looks amazing:


Motion Sensor: FSRK-USB-2

Unfortunately, developers will only get their hands on an Oculus RIFT starting in November 2012. But if you are eager to start developing now, you can order the motion tracker chipset that will be used: Hillcrest Labs' FSRK-USB-2. It is a very tiny printed circuit board that you connect via micro-USB:




Deviceless development

The software used to receive signals from the motion tracker is called libfreespace. It is built on top of libusb and comes with many examples that make it easy to integrate into any engine.
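
Based on the examples shipped with libfreespace, opening the tracker and polling it boils down to roughly the following; the message and field names come from the library headers at the time of writing and may differ between versions:

    #include <stdio.h>
    #include <string.h>
    #include <freespace/freespace.h>

    int main(void) {
        FreespaceDeviceId id;
        int numFound = 0;
        struct freespace_message msg;

        freespace_init();
        freespace_getDeviceList(&id, 1, &numFound);  /* grab the first tracker found */
        if (numFound == 0) return 1;
        freespace_openDevice(id);
        freespace_flush(id);                         /* drop any stale packets */

        /* Ask the tracker to stream "user frame" packets (orientation quaternion). */
        memset(&msg, 0, sizeof(msg));
        msg.messageType = FREESPACE_MESSAGE_DATAMODEREQUEST;
        msg.dataModeRequest.enableUserPosition = 1;
        msg.dataModeRequest.inhibitPowerManager = 1;
        freespace_sendMessage(id, &msg);

        for (;;) {
            /* Blocking read with a 1000 ms timeout. */
            if (freespace_readMessage(id, &msg, 1000) != FREESPACE_SUCCESS) continue;
            if (msg.messageType != FREESPACE_MESSAGE_USERFRAME) continue;
            printf("quat: %d %d %d %d\n",
                   msg.userFrame.angularPosA, msg.userFrame.angularPosB,
                   msg.userFrame.angularPosC, msg.userFrame.angularPosD);
        }
    }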

With a little bit of electrical tape and a pair of Windows 8 sunglasses, it is possible to build an unglamorous but effective prototype and experiment with head tracking:





The complete "deviceless" setup...

Integration into a 3D engine

Integrating libfreespace inputs into a 3D engine can be done in a few minutes with the following architecture:



The engine spawns a "pump" thread: an infinite loop that endlessly requests the motion sensor orientation via freespace_readMessage. The result is stored in a synchronized array. Upon rendering, the engine just grabs the latest known orientation from the shared variable without waiting for any communication via libfreespace. This brings latency down a lot, since the time spent waiting for system calls to return from the USB bus is spent in the "pump" thread and not in the engine thread.
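
Here is a minimal sketch of that architecture, assuming a pthread-based engine; the names g_quat and engine_get_orientation are illustrative, and the Q14 fixed-point scale is my reading of the libfreespace documentation:

    #include <string.h>
    #include <pthread.h>
    #include <freespace/freespace.h>

    static pthread_mutex_t g_lock = PTHREAD_MUTEX_INITIALIZER;
    static float g_quat[4];  /* latest known orientation, shared with the engine */

    /* The "pump": blocks on the USB bus so the engine never has to. */
    static void* pump_thread(void* arg) {
        FreespaceDeviceId id = *(FreespaceDeviceId*)arg;
        struct freespace_message msg;
        for (;;) {
            if (freespace_readMessage(id, &msg, 1000) != FREESPACE_SUCCESS) continue;
            if (msg.messageType != FREESPACE_MESSAGE_USERFRAME) continue;
            pthread_mutex_lock(&g_lock);
            g_quat[0] = msg.userFrame.angularPosA / 16384.0f;  /* Q14 -> float */
            g_quat[1] = msg.userFrame.angularPosB / 16384.0f;
            g_quat[2] = msg.userFrame.angularPosC / 16384.0f;
            g_quat[3] = msg.userFrame.angularPosD / 16384.0f;
            pthread_mutex_unlock(&g_lock);
        }
        return NULL;
    }

    /* Called by the engine once per frame: returns immediately. */
    void engine_get_orientation(float out[4]) {
        pthread_mutex_lock(&g_lock);
        memcpy(out, g_quat, 4 * sizeof(float));
        pthread_mutex_unlock(&g_lock);
    }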

You can also request a specific firmware from Hillcrest Labs that cranks the sampling frequency up from 125 Hz to 250 Hz, which reduces latency further. You also need to enable triple buffering. Finally, the source code can be found on GitHub.


Once the motion tracking events are integrated into the engine, the video output also has to be significantly altered:

  • The screen must show two renditions from two points of view, one for the left eye and one for the right eye, side by side.
  • Each eye's image must also be distorted by the inverse of the deformation applied by the physical lenses in the Oculus RIFT: this is called "pre-warping".
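
The pre-warp for this kind of lens is commonly modeled as a radial polynomial applied around each lens center; the coefficients k0..k3 below are placeholders, the real values depend on the RIFT's optics:

    r_warped = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6)

where r is the distance of a pixel from the lens center in the eye image.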

In the end the result has to look like this:



In practice this is done as follows:

  • Render each eye, one after the other, to an offscreen surface (with OpenGL this would be a Frame Buffer Object).
  • Use a special shader to perform the warping while copying from the FBO to the framebuffer, as sketched below.
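
Here is what such a warp shader could look like, written as a GLSL string ready to be handed to glShaderSource; the lensCenter and k uniforms are illustrative assumptions, not names from an official SDK:

    /* GLSL fragment shader performing the radial "pre-warp" on one eye. */
    static const char* warpFragmentShader =
        "uniform sampler2D eyeTexture;                                  \n"
        "uniform vec2 lensCenter;  /* lens center in texture coords */  \n"
        "uniform vec4 k;           /* distortion coefficients k0..k3 */ \n"
        "void main() {                                                  \n"
        "    vec2 theta = gl_TexCoord[0].xy - lensCenter;               \n"
        "    float rSq  = dot(theta, theta);                            \n"
        "    vec2 warped = theta * (k.x + k.y * rSq                     \n"
        "                         + k.z * rSq * rSq                     \n"
        "                         + k.w * rSq * rSq * rSq);             \n"
        "    vec2 tc = lensCenter + warped;                             \n"
        "    if (any(lessThan(tc, vec2(0.0))) ||                        \n"
        "        any(greaterThan(tc, vec2(1.0))))                       \n"
        "        gl_FragColor = vec4(0.0); /* outside: black border */  \n"
        "    else                                                       \n"
        "        gl_FragColor = texture2D(eyeTexture, tc);              \n"
        "}                                                              \n";

Drawing a fullscreen quad per eye with this shader samples the undistorted FBO rendition and produces the barrel-distorted image the lenses expect.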

 
