VR (Cardboard / OSVR...) compatibility?

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By c4n4r
:warning: Old Version Published before Godot 3 was released.

Hey there!

Just asking: does the Godot team plan to integrate VR into the engine? It would be cool to create a little Slender-like game for VR in Godot.

Any ideas?

:bust_in_silhouette: Reply From: fluffrabbit1

There are two components to VR which Godot currently lacks: stereoscopic cameras and head tracking. But if the furthest your mind has gone on the matter is making a Slenderman survival horror game then you’re either a visionary or very childish. In any case, I’m working on doing some VR stuff using what Godot is currently capable of and it’s coming along rather well, so in my infinite generosity I’ll do my best to explain how this works to you and anyone else who is interested. Also this is a fun topic. Sorry if I sound like a dick. I’m a dick.

First: Stereoscopic cameras. Stereo just means you have 2 at once. In Godot the scene hierarchy is such that every viewport can only have one camera. So you make 2 Viewports, one that covers the left half of the screen and one that covers the right. In each Viewport you make a Spatial, which is an empty 3D object. This Spatial serves as the pivot point between the left camera and the right camera, so both Spatials should be in the exact same place. Remember, Viewports do not inherit 3D data from parent nodes, so each Spatial is effectively the root of its own scenegraph. (English translation: When you move a 3D object that contains a Viewport the Viewport won’t move, nor will any of its child nodes.) Within each Spatial you put a camera and position one camera to the left and one camera to the right. So it looks like this:

ViewportL

  • Spatial
    • Camera

ViewportR

  • Spatial
    • Camera
So now you need to be able to move it around as one, but like quarreling siblings the nodes won’t (or can’t) cooperate with each other with this hierarchy. So what do you do? You make a third Spatial outside of the Viewports in the exact same position as the others, then parent the Viewports to it and give it a script… Your hierarchy should look like below:

Spatial (with a script)

  • ViewportL
    • Spatial
      • Camera
  • ViewportR
    • Spatial
      • Camera

The script should basically position the child Spatials where it is every frame, so something like:

onready var SpatialL = get_node( "ViewportL/Spatial" )
onready var SpatialR = get_node( "ViewportR/Spatial" )

func _ready():
    set_process( true )

func _process( delta ):
    # Player/camera movement code goes here

    var trans = get_global_transform()
    SpatialL.set_global_transform( trans )
    SpatialR.set_global_transform( trans )

And now you have stereo cameras suitable for VR. You might want to put a black vignette around each eye to round it out and not have 2 ugly rectangles side by side, but I’ll let you figure that part out on your own. Basically it’s just TextureFrames on top of your viewports.
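The vignette overlay could be as simple as one TextureFrame per eye; a sketch assuming a `vignette.png` with a transparent middle and black edges (the filename and node names are made up for illustration):

```gdscript
func _ready():
    # Overlay the same vignette texture on top of each eye's viewport.
    var tex = load( "res://vignette.png" )
    get_node( "VignetteL" ).set_texture( tex )
    get_node( "VignetteR" ).set_texture( tex )
```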

That was the easy part. The hard part is getting head tracking working. I’m still working on that code for my own project (though I temporarily put it aside in favor of other things). The basic idea is that every phone or Rift or other doohickey has any combination of three sensors: Accelerometer, Magnetometer, and Gyroscope (in order of most to least common). The data from these sensors is combined using some magical code that you yourself may have to write, which is a technique called sensor fusion (which basically just means math). But before you get to fuse anything you actually need to be able to get data from these sensors. (PROTIP: This isn’t going to work on your laptop. Try testing on your phone.)

Godot currently only supports one of these sensors: accelerometer. That’s because those with commit access in the repository aren’t hip with fresh ideas. Don’t worry, I’m sure they’ll be impeached soon. Oh wait, it’s a dictatorship. You’re powerless to stop them. Gyroscopes add stability but only the magnetometer can tell you where you’re facing relative to Earth’s axis, because it’s a compass. So you need to combine data from Input.get_accelerometer() with data from Input.get_magnetometer(), in the process hopefully smoothing it out unless you’re trying to make an earthquake simulator. Accelerometer gives you roll and pitch while magnetometer gives you yaw. Here’s the current discussion on magnetometers, which I think reflects just how little VR is on everyone’s minds right now: https://github.com/godotengine/godot/pull/4154

Godot is not an easy program to compile, so I’d wait until this gets merged, but when it gets merged support will only be there for Android, because I’m the one who wrote the magnetometer code and I’ve only ever developed on Android. I’m very sorry if Android is not you. Someone is working on (or planning to work on) the same thing for iOS, but that still doesn’t cover the plethora of other mobile operating systems. Blackberry, PalmOS, Windows Phone, Symbian OS, NokiOS, and CheeriOS are out of luck. But if you just so happen to be an Android developer (the poorest, most unemployed kind of developer) then just say the word and I can upload some slightly outdated Android export template APKs containing magnetometer support, because somebody with commit access decided the only thing missing from his life was a powerful enemy.

Like I said, I’m still working out the bugs in my own sensor fusion implementation (mainly to do with the smoothing) but here’s a quick function which returns an angle from the accelerometer and magnetometer without any smoothing. I’m not sure how necessary the wrap function is but I drink a lot so IDGAF.

func xRotateVector( vct, angle ):
    var ss = sin( angle )
    var cc = cos( angle )
    return Vector3( vct.x, vct.y * cc - vct.z * ss, vct.y * ss + vct.z * cc )

func zRotateVector( vct, angle ):
    var ss = sin( angle )
    var cc = cos( angle )
    return Vector3( vct.x * cc - vct.y * ss, vct.x * ss + vct.y * cc, vct.z )

func wrap( val ):
    var circ = PI + PI
    return val - ( int( val / circ ) * circ )

func quick_orientation():
    var a = Input.get_accelerometer()
    var m = Input.get_magnetometer()
    var v = Vector3( 0.0, 0.0, wrap( PI + atan2( a.x, a.y ) ) )
    a = zRotateVector( a, v.z )
    v.x = wrap( PI / 2 + atan2( a.y, a.z ) )
    m = xRotateVector( zRotateVector( m, v.z ), v.x )
    v.y = wrap( atan2( m.x, m.z ) )
    return v
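On the smoothing the post mentions: one common approach is a complementary filter, blending each new reading into a running estimate. A minimal sketch on top of `quick_orientation()` above (the blend `weight` and the shortest-path angle blending are my own assumptions, not the author's actual implementation):

```gdscript
# Running orientation estimate, updated by smooth_orientation().
var smoothed = Vector3( 0, 0, 0 )

# Blend two angles along the shortest path around the circle,
# so e.g. 3.1 -> -3.1 moves through PI instead of through zero.
func angle_lerp( from, to, weight ):
    var circ = PI + PI
    var diff = fmod( to - from + PI, circ )
    if diff < 0:
        diff += circ
    return from + ( diff - PI ) * weight

# weight near 0 trusts the old estimate (smooth but laggy),
# weight near 1 trusts the new reading (responsive but jittery).
func smooth_orientation( weight ):
    var raw = quick_orientation()
    smoothed.x = angle_lerp( smoothed.x, raw.x, weight )
    smoothed.y = angle_lerp( smoothed.y, raw.y, weight )
    smoothed.z = angle_lerp( smoothed.z, raw.z, weight )
    return smoothed
```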

Now I’ve given you everything. Sorry I didn’t bring lube.

Having just two viewports side by side with vignette is not enough to do the trick. You also need to distort those viewports according to a mesh modeling the lenses of the VR headset:


See how the image is distorted? This little trick makes a big difference.
(In case of the Oculus, these meshes are available by calling a function from their C++ API.)

You also need to implement a shader to correct chromatic aberration (also caused by the lenses; you can see it naturally through glass or aquariums), and there is room for other optical tweaks to improve rendering.
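The barrel distortion itself can be sketched as a small fragment shader. This one uses Godot 3's shading language for concreteness (the single lens coefficient `k` is a made-up tuning parameter; real headsets publish multi-term coefficients):

```glsl
shader_type canvas_item;

// Sketch of single-coefficient barrel distortion.
// `k` is a hypothetical lens constant you would tune per headset.
uniform float k = 0.2;

void fragment() {
    vec2 uv = UV - vec2(0.5);           // center the coordinates
    float r2 = dot(uv, uv);             // squared distance from center
    uv *= 1.0 + k * r2;                 // push samples outward toward the edges
    COLOR = texture(TEXTURE, uv + vec2(0.5));
}
```

Applied to the texture each eye's viewport renders into, this pre-distorts the image so the headset's lenses straighten it back out.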

Zylann | 2016-04-08 03:38

My headset advertises aspherical lenses to minimize distortion, and indeed I never see any. Is there any reason the lenses of the Rift are shaped like they are? Does it give a wider FOV or something?

fluffrabbit1 | 2016-04-08 06:35

It might have to do with the FoV, yes.
By the way, the cameras of the Oculus don’t have a classic frustum; you actually have to build the two eyes with 4 different angles each, instead of the usual FovY + aspect ratio.

What is the model of your headset? It does seem that Google Cardboard has less distortion, according to images I could find on the web, but it still has some. I have only experimented with the Oculus so far, but it looks like any VR headset requires more or less distortion.

Zylann | 2016-04-08 14:49

The Rift is a $500 piece of hardware which I cannot afford. I’m currently using the Ritech Riem:

http://www.gearbest.com/home-gadgets/pp_136422.html

The trouble is, especially with Godot, that there are all kinds of different APIs, like the Rift API and the Cardboard API and probably others. There’s all this information and data on what’s what and where and how, rather than complete automation unified under a single standard, so I personally choose not to support anything I don’t already have a rock-solid grasp of.

fluffrabbit1 | 2016-04-08 23:53

Hi,

Thanks for your work!!
Having VR in Godot would be very cool, and I think it should be possible. Having two Viewports like in Godot is totally normal.
I did some “research” some years ago using Ogre3D:
Oculus DK1 test under Linux

All one needs is two viewports, two cameras, and a barrel-distortion fragment shader applied to the viewports, so one should render each camera image to a texture and use the barrel shader on it. On higher-resolution displays like the DK2 and Vive, one also needs chromatic aberration correction in the shader.
There is OpenHMD, which I’ve used, but Steam uses OpenVR, which supports, as far as I know, Oculus/Vive/Cardboard…
So I think all the device settings can be obtained using the OpenVR lib…
As soon as I have time I will give it a try, including the lib.
I will also see if I can post the fragment shaders I’ve used.

Godot rocks.

zuuka | 2016-06-29 07:54

:bust_in_silhouette: Reply From: Mux213

We have support for this now in Godot 3.0.

Create an ARVROrigin node and add an ARVRCamera child node.

For mobile VR add the following code to the _ready of your scene:

var arvr_interface = ARVRServer.find_interface("Native Mobile")
if arvr_interface and arvr_interface.initialize():
  get_viewport().arvr = true

That should turn on stereoscopic rendering for mobile devices with simple orientation tracking.

If you want desktop VR instead have a look at the new OpenVR GDNative module you can find here: GitHub - BastiaanOlij/godot_openvr: GDNative based Open VR module
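For desktop, initialization follows the same pattern as the mobile snippet; a sketch assuming the OpenVR module from that repository is installed (the interface name and the 90 fps target are conventions of that module, not guaranteed here):

```gdscript
func _ready():
    var arvr_interface = ARVRServer.find_interface( "OpenVR" )
    if arvr_interface and arvr_interface.initialize():
        get_viewport().arvr = true
        # The headset drives the refresh rate, so uncap the engine.
        OS.vsync_enabled = false
        Engine.target_fps = 90
```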