Oculus Rift FPV

So I read about that new VR headset called Oculus Rift. It has a 140° FOV and is better than other VR headsets in nearly every other aspect.
My first thought was that it would be ideal for fully immersive FPV flying. Sadly, it is not possible to just plug the Rift into some standard video source and have a nice experience. That is mostly because the screen has to be divided into a left and a right picture to allow stereoscopic vision, and these two images have to be barrel-distorted so that they appear undistorted through the lenses.
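
Roughly, the shader remaps each half of the screen through a radial "barrel" function before looking up the camera image. Below is a simplified sketch of what such a fragment shader can look like, kept as a C++ string so it could be handed to glShaderSource; the names (kDistortFragSrc, videoFrame, texCoord) and the coefficient vector k are placeholders for illustration, not the exact shader the application ships with.

    // Sketch only – names and coefficient layout are placeholders.
    static const char *kDistortFragSrc = R"GLSL(
        #version 120
        uniform sampler2D videoFrame;  // the captured camera frame
        uniform vec4 k;                // radial distortion coefficients k0..k3
        varying vec2 texCoord;         // screen coordinates in [0,1], set by the vertex shader

        void main() {
            // Left half of the screen = left eye, right half = right eye.
            // Remap each half back to [0,1] so both eyes show the same frame.
            bool leftEye = texCoord.x < 0.5;
            vec2 uv = vec2(leftEye ? texCoord.x * 2.0 : (texCoord.x - 0.5) * 2.0,
                           texCoord.y);

            // Barrel distortion: scale the radius around the image center
            // by a polynomial in r^2.
            vec2 theta  = (uv - vec2(0.5)) * 2.0;
            float rSq   = dot(theta, theta);
            vec2 warped = theta * (k.x + k.y * rSq + k.z * rSq * rSq
                                   + k.w * rSq * rSq * rSq);
            vec2 lookup = vec2(0.5) + 0.5 * warped;

            // Outside the source frame: show black instead of wrapping around.
            if (any(lessThan(lookup, vec2(0.0))) || any(greaterThan(lookup, vec2(1.0))))
                gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
            else
                gl_FragColor = texture2D(videoFrame, lookup);
        }
    )GLSL";

With suitable k values the barrel of the shader cancels the pincushion of the lenses, so the picture looks straight again inside the Rift (the per-eye lens-center offset is left out of this sketch to keep it short).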

So I decided to write a small cross-platform OpenGL application that processes the video stream from a video capture device for the Rift. The result looks like this:

[Image: remote_eyes]

The application just grabs a frame from the capture device, maps it to a texture, and later on a fragment shader distorts and duplicates it.
With my webcam the whole thing runs at interactive, near-real-time rates. The distortion happens on the GPU, since it is done in a fragment shader.
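
The surrounding loop is just as simple in principle. Here is a rough sketch of the idea, written against OpenCV's VideoCapture for brevity; the real application does not have to use OpenCV, and runCaptureLoop(), drawFullscreenQuad() and the already compiled shader program are stand-ins for illustration.

    #include <GL/glew.h>              // GL entry points and GL_BGR on all platforms
    #include <opencv2/opencv.hpp>

    void drawFullscreenQuad();        // stand-in: draws one quad covering the screen

    void runCaptureLoop(GLuint texture, GLuint program)
    {
        cv::VideoCapture cap(0);      // first capture device found
        cv::Mat frame;                // OpenCV delivers BGR frames by default

        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

        while (cap.read(frame)) {
            // Upload the frame into the texture (allocated beforehand at the frame
            // size with glTexImage2D). GL_BGR matches the frame's byte order, so
            // nothing needs to be converted on the CPU.
            glBindTexture(GL_TEXTURE_2D, texture);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                            frame.cols, frame.rows,
                            GL_BGR, GL_UNSIGNED_BYTE, frame.data);

            // The fragment shader duplicates and barrel-distorts the image on the
            // GPU while a single full-screen quad is drawn.
            glUseProgram(program);
            drawFullscreenQuad();
            // (buffer swap and window event handling omitted)
        }
    }

(cv::VideoCapture can also report the frame size it actually delivers via its get() method instead of assuming a fixed format.)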

I developed this app on Linux and ported it to Windows; it can be compiled cross-platform. So later on we can run it on some small device like the Raspberry Pi and take it outside with the FPV equipment.

Warning, dragons ahead:
If you want, you can download the application here: http://lab.neosolve.de/RemoteEyes/RemoteEyes.zip
Please be aware that this is a really early version. For example, it just uses the first capture device it finds and assumes that it delivers 640×480 frames in BGR byte order. So if you see some strange pictures with your capture device, or the picture is completely distorted, that is because the application simply assumes that format and does not check what the capture device actually delivers. So expect some bugs!
Keys:
f – go fullscreen
Esc – quit

Posted in FPV, Linux, Oculus Rift
4 comments on “Oculus Rift FPV”
  1. Brett says:

    Hi, I tried out your remoteeyes program. It works well at grabbing my laptop’s webcam but does not see my USB video capture dongle. I tried disabling the webcam, but then nothing is found. Will you be releasing the source for this? I have a dream of a two-camera FPV setup with two capture devices outputting stereo to my Rift, maybe with a floating HUD.

    Thanks,
    Brett

  2. S4nshinez says:

    Hi Spacefish,

    that’s great work, your shaders are impressive! As Grix already mentioned, it would be very helpful to have the C/C++ source code for building on your work, say for porting it to Linux or implementing the toe-in axes (= the asymmetric frustum mentioned in Sven’s post). That would be awesome.

    Thx in advance,
    S4nshi

  3. Grix says:

    Hi, I saw your comment on oculusrift.com. I would very much appreciate it if you could publish the source of this program on GitHub. It would be very helpful!

  4. Sven says:

    Hello,
    I tried this out with my Rift, but unfortunately the images do not line up optically, because the Rift needs an “asymmetric” rendering. The left image has to be cropped on the right, the right image on the left, so both are pushed together a bit towards the center. Presumably only something in the shader needs to be adjusted for that?
    By the way, this works (optically) in the StereoscopicViewer (which can also display live images for the Rift), but there the latency is unfortunately far too high.
    It would be great if you kept the project going, especially on portable hardware like the Pi (which itself unfortunately has the USB problem).
    Sven
