So I read about that new VR headset called the Oculus Rift. It has a 140° field of view and beats other VR headsets in nearly every other aspect.
My first thought was that it would be ideal for fully immersive FPV flying. Sadly, it's not possible to just plug the Rift into some standard video source and have a nice experience. That's mostly because the screen has to be divided into a left and a right picture to allow stereoscopic vision, and these two images have to be barrel-distorted so that they appear undistorted through the lenses.
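The distortion step boils down to a radial warp of each texture coordinate around the lens center. A minimal CPU-side sketch of that warp is below; the polynomial coefficients are illustrative placeholders, not the Rift's calibrated values, and the real thing runs per-pixel in a fragment shader:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Radial warp of a normalized screen coordinate around the lens center.
// Sampling the source image at the warped coordinate yields the
// barrel-distorted output the lenses expect.
// Coefficients are illustrative, not calibrated Rift values.
Vec2 hmdWarp(Vec2 uv, Vec2 lensCenter) {
    // Coordinates relative to the lens center
    float tx = uv.x - lensCenter.x;
    float ty = uv.y - lensCenter.y;
    float rSq = tx * tx + ty * ty;
    // Distortion polynomial k0 + k1*r^2 + k2*r^4 + k3*r^6
    const float k0 = 1.0f, k1 = 0.22f, k2 = 0.24f, k3 = 0.0f;
    float scale = k0 + rSq * (k1 + rSq * (k2 + rSq * k3));
    return { lensCenter.x + tx * scale, lensCenter.y + ty * scale };
}
```

The same function, translated to GLSL, runs once per output pixel; points at the lens center pass through unchanged, while points further out are pushed outward more strongly.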
So I decided to write a small cross-platform OpenGL application that processes the video stream from a capture device for the Rift. The result looks like this:
The application just grabs a frame from the capture device, maps it to a texture, and then a fragment shader duplicates and distorts it.
With my webcam the whole thing runs at interactive frame rates. Since the distortion happens in a fragment shader, all the heavy lifting is done on the GPU.
I developed this app on Linux and ported it to Windows; it compiles cross-platform. So later on we can run it on a small device like the Raspberry Pi and take it outside with the FPV equipment.
Warning, dragons ahead:
If you want, you can download the application here: http://lab.neosolve.de/RemoteEyes/RemoteEyes.zip
Please be aware that this is a really early version. For example, it just uses the first capture device it finds and assumes that it delivers 640×480 frames in BGR byte order. So if you see a strange or completely garbled picture with your capture device, that's because the application simply assumes this format and does not check what the device actually delivers. Expect some bugs!
f – Go fullscreen
esc – quit