projecting 360 panoramic photos onto geometry

If I shoot a room with a 360 panoramic rig and then build a 3D model of the room - how can I project the 360 photo onto the geometry so that I can use the photo as the texture?

ALSO: if I shoot multiple 360 panoramics of the same room - how would I project all the photos such that the panoramics overlap and fill in missing areas, so that I can look around the room from any position or angle?

thanks!

Try to project them from the same viewpoints you took the pictures from.

For the already-stitched panorama I would try to build a cube map out of it, then project each face of the cube with a 90-degree FOV camera.
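A minimal bpy sketch of that cube-face setup (assuming Blender 2.8+; the camera names, shooting height, and exact orientations are placeholders of my own, not a tested recipe):

```python
import bpy
import math

# Six cameras at the shooting position, one per cube face, each with a
# 90-degree FOV. Rotations assume Blender's default camera, which looks
# down its local -Z axis; renders should use a square resolution.
FACE_ROTATIONS = {
    "pos_x": (math.radians(90), 0.0, math.radians(-90)),
    "neg_x": (math.radians(90), 0.0, math.radians(90)),
    "pos_y": (math.radians(90), 0.0, 0.0),
    "neg_y": (math.radians(90), 0.0, math.radians(180)),
    "pos_z": (math.radians(180), 0.0, 0.0),
    "neg_z": (0.0, 0.0, 0.0),
}

for name, rot in FACE_ROTATIONS.items():
    cam_data = bpy.data.cameras.new(f"CubeFace_{name}")
    cam_data.angle = math.radians(90)   # 90-degree FOV covers exactly one face
    cam = bpy.data.objects.new(f"CubeFace_{name}", cam_data)
    cam.location = (0.0, 0.0, 1.6)      # placeholder shooting height
    cam.rotation_euler = rot
    bpy.context.collection.objects.link(cam)
```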


Hi pingking, thanks for your response - my question is technical - how would I project a 360 panoramic image in Blender?

can you be a bit more specific (Cycles/BI, is it for a material, is it a lamp, etc)? Just because there are so many ways…

I'm going off this web page explaining how to shoot 360 panoramic photos.

It produces panoramic images like these:
https://www.ptgui.com/gallery/

these are equirectangular images.

I tried projecting textures from a lamp but couldn’t get that to work - probably I’m not doing it right.

I'm open to working in either Cycles or BI. I would like to develop this technique to produce photo-based game environments.

thanks!

In Cycles:
-with materials: TextureCoordinates::Object(any obj at the projection center) >> ImageTexture(projection::Sphere) >> Emission
(you'll need 2.75rc… for older Blenders a bit more work is required, as you need to input the projector coordinates by hand)

-with a lamp as a projector: Geometry::Normal >> EnvironmentTexture >> Emission
(but the light shouldn’t have any falloff)

In BI:
-with a material: in the Texture tab, Mapping panel, set Coordinates to Object, select your center object, and set the Projection to Sphere.

I don't know if it's possible to project with a lamp in BI; I tried it and failed. :(
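A minimal bpy sketch of the Cycles material approach above, assuming a Blender 2.8+ Python API; the empty's location, the image path, and the mesh name "Room" are placeholders:

```python
import bpy

# Empty that marks the projection center (where the panorama was shot).
empty = bpy.data.objects.new("ProjectionCenter", None)
empty.location = (0.0, 0.0, 1.6)  # placeholder shooting position
bpy.context.collection.objects.link(empty)

# Material: TextureCoordinate(Object) -> ImageTexture(Sphere) -> Emission.
mat = bpy.data.materials.new("PanoProjection")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

tex_coord = nodes.new("ShaderNodeTexCoord")
tex_coord.object = empty  # coordinates taken relative to the projection center

img = nodes.new("ShaderNodeTexImage")
img.image = bpy.data.images.load("//pano.jpg")  # placeholder equirectangular image
img.projection = 'SPHERE'

emission = nodes.new("ShaderNodeEmission")
output = nodes.new("ShaderNodeOutputMaterial")

links.new(tex_coord.outputs["Object"], img.inputs["Vector"])
links.new(img.outputs["Color"], emission.inputs["Color"])
links.new(emission.outputs["Emission"], output.inputs["Surface"])

# Assign to the room mesh (placeholder name).
bpy.data.objects["Room"].data.materials.append(mat)
```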


Hi Secrop,

Thanks for the tips. I tried the Cycles approach using the lamp as the projector - it seems to basically work, but the projection is not very clear, even when the projection image is 4K. Any suggestions?

To test, I built a simple interior and rendered a 360 panoramic equirectangular projection


Then I projected the render back using a lamp as the projector, as you suggested


As you can see, it basically works but isn't usable for my needs. Any ideas on how to get a better result?

thanks!

With the lamp as a projector, you need to set the size of the lamp to something very small. Also the background should be black.
Have you tried with materials?

Ah, thanks for the tip! I set the lamp size to 0 and it works - and yes, I tried with materials, which is much better than using the lamp as a projector. Thanks again for the help.

I don't think camera projection works with equirectangular sources; you would need to reinterpret it as the planar facets of a cube, as pingking23 said. What about that technique doesn't work for you?

I'm doing a lot of VR stuff these days, so I've had my head in these sorts of challenges for a bit now. Overall, the scenario you are describing is not trivial (fully reconstructing a space and texturing it with photos). You might have more luck using something like 123D Catch by Autodesk to get portions and then reconstruct the full space after the fact, but it will definitely have that 3D-scanned look to it.

Hi shteeve - I got it working - my problem was that I missed setting the lamp size (I'm a Blender novice).
I rendered out an equirectangular image and used that to reproject back onto the geometry - worked great.

I agree with you - this reconstruction approach is not easy - but Agisoft PhotoScan seems to be able to do it really well, and I'm planning to try PhotoScan in the coming weeks.

I was thinking about a process of taking multiple equirectangular photos of a space, using lidar to capture the geometry, then projecting the photos back onto the lidar model (skipping steps). I think this should work - except the tricky part is how to combine the projected shots on the model and blend them together.

Do you mean like in this screenshot?
Does it work with any texture? In my case it's a PNG image, and it does not work for me.
Thank you for your suggestion.

emitter_su_texture.pdf (89.8 KB)

For blending, you must bake the projected textures to UV space and then blend the different projections in whatever software you see fit. This process is not very easy in Blender, but there are other options if you need a fast and solid solution (for commercial use, for example). Some of them were mentioned; you can also use photogrammetry or matchmoving software for solving the scene and camera positions. For projections based on a 3D mesh, solved cameras, and textures, look into Nuke and Mari.
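A rough bpy sketch of that bake step, assuming the mesh is UV-unwrapped, already carries the projection material, and is the active object (the image size and file name are placeholders):

```python
import bpy

obj = bpy.context.active_object
nodes = obj.active_material.node_tree.nodes

# Image the projection gets baked into, via a new image-texture node.
bake_img = bpy.data.images.new("ProjectionBake", width=4096, height=4096)
bake_node = nodes.new("ShaderNodeTexImage")
bake_node.image = bake_img
nodes.active = bake_node  # Cycles bakes into the active image node

# The projection material is an emitter, so an EMIT bake captures it.
bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='EMIT')

bake_img.filepath_raw = "//projection_bake.png"
bake_img.file_format = 'PNG'
bake_img.save()
```

Repeating this once per camera position gives you one baked map per projection to blend.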

Hi, I was searching for the same thing and didn't find an answer in this thread; however, I found a solution elsewhere. If anyone is interested, here it is: Solution. For a quick answer, here is a photo demonstrating what the solution looks like :) thanks to gandalf3


Hi Reneno! It's pretty awesome, and you did it 5 years ago.
Now I want to do the same kind of equirectangular 360 image projection onto a room 3D mesh which I built.

Could you guide me on how to do it in a recent Blender version?
If I could get your contact it would be even more helpful - any social media or email, please.

[email protected] - my email

Hi @Secrop

I am trying to achieve the same thing. I want to project a 360 equirectangular image onto a cube mesh exactly as it was in the real world. Is there any way to do that?

Could you explain to me how to do it in Blender?

Use the negative of the Incoming vector on your EnvironmentTexture node and plug it directly into the Surface output.
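A minimal bpy sketch of that node setup, assuming a recent Blender Python API (the image path and material name are placeholders):

```python
import bpy

mat = bpy.data.materials.new("EquirectProjection")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

geom = nodes.new("ShaderNodeNewGeometry")    # provides the Incoming vector
negate = nodes.new("ShaderNodeVectorMath")
negate.operation = 'SCALE'
negate.inputs["Scale"].default_value = -1.0  # negate Incoming: viewer -> surface

env = nodes.new("ShaderNodeTexEnvironment")
env.image = bpy.data.images.load("//pano.jpg")  # placeholder equirectangular image

out = nodes.new("ShaderNodeOutputMaterial")

links.new(geom.outputs["Incoming"], negate.inputs[0])
links.new(negate.outputs["Vector"], env.inputs["Vector"])
links.new(env.outputs["Color"], out.inputs["Surface"])
```

Note that this assumes the render camera sits at the position where the panorama was shot, as suggested earlier in the thread.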

I know you posted this 8 years ago, but this was really useful to me today. Thanks 👍