Oculus DK2 test render

I've been very excited about the new 3D rendering in the latest release. So I dug up an old model (a spaceship corridor that I made after a Blenderguru tutorial a long time ago), rendered it at 7680x4320 in spherical 3D, stitched the left and right images from Blender together in GIMP, and voila!
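If you'd rather script the over/under stack than do it by hand in GIMP, here's a minimal sketch using Pillow (the filenames are just placeholders, and it assumes both eyes were rendered at the same resolution):

```python
from PIL import Image

# Placeholder filenames for the two equirectangular renders.
left = Image.open("corridor_L.png")
right = Image.open("corridor_R.png")
assert left.size == right.size, "both eyes must be the same resolution"

w, h = left.size
ou = Image.new("RGB", (w, h * 2))   # over/under layout: left eye on top
ou.paste(left, (0, 0))
ou.paste(right, (0, h))
ou.save("corridor_OU.png")
```

(Which eye goes on top depends on the player - most seem to expect left on top, but check the settings.)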

It looks really nice on my Oculus DK2. Full 360-degree view in full 3D. Take a look (and please ignore the empty space behind you - that's where the old camera position was, and I never thought anything needed to be there when I made the original).

I recommend MaxVR (not free) for viewing it on the Oculus, but Whirlygig should also work if you play with the settings, though it doesn't handle very large images well and might crash.

The screen should be set to spherical panoramic, and 3D to OU (over/under).

https://www.dropbox.com/s/mgrjgbcd0d93n7h/VR-uHD_R.png?dl=0

Um, and I should perhaps mention that this is a 130MB png image :slight_smile:

I'd recommend making a somewhat smaller image - the download looked like it was going to take hours! When you take the DK2 screen resolution into account, you're not really getting any benefit from making an image larger than 4k wide anyway.
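Rough numbers, as a sanity check: the DK2 panel is 1920x1080, so about 960x1080 per eye, with something like a 100-degree horizontal FOV per eye. A full 360-degree panorama therefore only needs on the order of 360/100 × 960 ≈ 3500 pixels of width per eye before you're past what the screen can actually show, so ~4k wide really is plenty.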

I'm curious about your process, though. Did you just render two spherical panoramas from two cameras placed side by side? Technically that doesn't really work, because the left and right eyes fall out of alignment as you turn your head.

There is a patched version of Blender that properly creates a panorama that holds stereo as you look around, but I ran into some rendering quirks with that one. I've come up with a custom technique using a refractive sphere that has worked pretty well on some projects, but it has its own limitations (you lose render passes, because everything has to pass through a virtual "lens" object).

You're right, I should have scaled it down based on 360°/FOV of the Oculus, but I was curious whether I could supersample it on the Oculus (which I haven't looked into yet). Plus, I thought I'd make it high enough that anyone testing it on any VR rig would get more than enough resolution, not just me on my Oculus DK2.

It turns out the process to render this is now trivial with the new 2.75 that came out a few days ago; you have everything you need.

Under the scene settings, enable "Views" and choose "Stereo 3D" (the default, I think). Then change your camera to "Panoramic" and the panorama type to "Equirectangular". When you then render, it looks like nothing much has changed (except that it takes twice as long). But when you save the rendered image, Blender creates not one but two images (with _L and _R suffixes). And that's it - you've got all you need :slight_smile:
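If you'd rather flip the same switches from the Python console, here's a minimal sketch of those steps (assuming Cycles and Blender 2.75+, and that the scene already has an active camera):

```python
import bpy

scene = bpy.context.scene

# Enable stereoscopy ("Views") with the default Stereo 3D setup.
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'

# Make the active camera a Cycles equirectangular panorama camera.
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Save the two eyes as individual files (the _L / _R suffixed images).
scene.render.image_settings.views_format = 'INDIVIDUAL'
```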

Stereo equirectangular rendering isn't fully in yet… the 3D is fine straight ahead, but the further you rotate away from straight ahead, the more incorrect it gets… at 90 degrees the parallax is 0, and at 180 degrees the views are inverted.

The reason is that you can't use a single fixed camera position for stereo equirectangular rendering… you need the cameras to orbit around a central point, one vertical line (column) at a time…

http://paulbourke.net/papers/vsmm2006/vsmm2006.pdf is a decent paper on the topic

There is a patch for Blender, but it hasn't been committed yet.
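To make the geometry concrete: with two fixed side-by-side cameras, the usable stereo baseline falls off roughly like baseline × cos(yaw) as you turn your head, which is why it hits zero at 90 degrees and comes out inverted at 180. The slit-scan approach described in the paper instead uses, in effect, a separate camera pair for every column of the panorama, offset perpendicular to that column's view direction. A rough plain-Python sketch of that per-column placement, just to illustrate the idea (the names are mine, and this is not an actual renderer):

```python
import math

IPD = 0.065          # interpupillary distance in metres (6.5 cm)
WIDTH = 4096         # panorama width in pixels (one column = one camera pair)

def eye_positions(column, width=WIDTH, ipd=IPD):
    """Left/right camera positions for one vertical slit of the panorama.

    Each column looks out at azimuth theta; the two cameras sit on a circle
    of radius ipd/2 around the rig centre, offset perpendicular to the view
    direction, so the baseline stays correct no matter where you look.
    """
    theta = 2.0 * math.pi * column / width          # viewing azimuth
    view = (math.sin(theta), math.cos(theta))       # view direction in XY
    perp = (math.cos(theta), -math.sin(theta))      # perpendicular to it
    half = ipd / 2.0
    left = (-half * perp[0], -half * perp[1])
    right = (half * perp[0], half * perp[1])
    return left, right, view
```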

This is very interesting - I'll take a look when I'm home from work. Still (I'm at work now, so I can't check with the Oculus), my impression from the PNG I uploaded is that it looks perfect, without any sense of distortion.

I get that if you imagine a person with a few cm between their eyeballs, looking around in space, you do need a "rotating camera rig" with a fixed distance between the cameras to make it perfect. Perhaps, though, the distortion is so minor that your brain is tricked into accepting it. When testing this in my Oculus DK2, I can't tell that it is not perfect. But perhaps I've gotten so used to imperfection that I no longer notice :slight_smile:

It is not so much the distortion as the parallax offset… it becomes uncomfortable with closer objects more than anything.

So maybe that’s the reason it doesn’t seem “wrong” then, as the render is of a corridor where the nearest objects are not that close…

I have to say that the feeling of suddenly being "inside" one of your own renders is quite amazing, so I'm looking forward to playing with this some more! :slight_smile:

Dalai Felinto created a build of Blender for stereo 360 panoramas that addresses the parallax problems somewhat. It's not perfect, but it's a really nice step in the right direction. Remember, with the Oculus it's very uncomfortable for the viewer if the 3D camera has a convergence point that isn't (for all intents and purposes) at infinity. So set the convergence distance to something ridiculously high and you should be OK (there are downsides to this method too, but it's the easiest in my opinion). And keep the interpupillary distance at 6.5 cm!
Here's his blog post containing the Blender build: http://www.dalaifelinto.com/?p=1009
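For reference, those two settings can also be set from the Python console in 2.75+ builds that have the multiview camera options; a minimal sketch (the 10000 is just an arbitrarily large placeholder value):

```python
import bpy

cam = bpy.context.scene.camera.data

# Push the convergence point out so far that it is effectively at infinity.
cam.stereo.convergence_distance = 10000.0

# (Blender also has a 'PARALLEL' convergence mode, which amounts to the same thing.)
# cam.stereo.convergence_mode = 'PARALLEL'

# 6.5 cm interpupillary distance (interocular distance, in metres).
cam.stereo.interocular_distance = 0.065
```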

Good luck - pre-rendered VR is a nightmare, but the results can be really nice.