window coordinates not helping, need another solution for earthquake simulation

I’m not a texture guy, so bear with me here. I’d like to project my texture onto my geometry, just the way I have it in the screen grabs posted below. Using ‘Window’ coordinates was the only way I was able to do this, but I want my texture (I used the first frame of the sequence for this) to stick to the geometry at all times. I will be simulating an earthquake, and any time the geometry moves, the texture is projected straight through it, just as expected, but that’s not what I want; see the second pic. Is there a way to keep the texture mapped to the geometry so that it’s independent of the window coordinates? I have a decent camera solve, so I’m not worried about misalignments.
Please let me know if my problem isn’t clear.

In each jpeg below, the top screen grab is the camera view and the second is the render.


Have you tried the UVProject modifier? Hit Unwrap for whatever you have and use the same camera as the Projector.
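If you end up scripting that setup, something like this does the same thing (an untested sketch; the object and camera names are placeholders, and it assumes the mesh is already unwrapped):

```python
import bpy

obj = bpy.data.objects["Ground"]   # placeholder: the mesh you unwrapped
cam = bpy.data.objects["Camera"]   # the solved camera

# Add a UV Project modifier and point its single projector at the camera
mod = obj.modifiers.new(name="CamProject", type='UV_PROJECT')
mod.uv_layer = obj.data.uv_layers.active.name   # assumes a UV map already exists
mod.projector_count = 1
mod.projectors[0].object = cam

# Match the projector aspect to the render aspect so the image lines up
render = bpy.context.scene.render
mod.aspect_x = render.resolution_x
mod.aspect_y = render.resolution_y
```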

Hi eppo, thanks for the help. I did try that but it may not have worked the first few times because at that point I had already scaled and translated my UVs. I will try it again with my UVs reset.

EDIT: Actually, this won’t work either, as it’s just going to project onto the geometry at all times. I’d like the texture to live on the geometry itself, not just be projected onto it as it is when you use window coordinates or the UVProject modifier. If I use either of those two methods, when I do a cell fracture it’s going to look just like my second screen grabs above, which is what I’m trying to avoid.

Make sure scales are applied to the objects you unwrap; scale the Projector if you need to adjust the sizing.

Any other ideas?

A possibility would be to bake the projection…
Here are the quick steps:
-Unwrap your object.
-Create a new, blank image.
-Create a material with an Emission shader, with your background image connected to it through the Window output of a Texture Coordinate node.
-Add another Image Texture node holding the new blank image (without connecting it to anything) and make it the active node.
-Bake the Emit pass.

This will project the texture from the camera into the new UV map, which is fixed to the object and not to the window vector.
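For anyone who wants to script those steps in Cycles, it would look roughly like this (a sketch only, untested; the object name, background image name and bake resolution are placeholders):

```python
import bpy

obj = bpy.data.objects["Ground"]                   # placeholder: the unwrapped mesh
bg_img = bpy.data.images["background_frame1.jpg"]  # placeholder: first frame of the footage

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# New blank image that will receive the bake
bake_img = bpy.data.images.new("BakedProjection", width=2048, height=2048)

# Material: Window coordinates -> background image -> Emission -> Output
mat = bpy.data.materials.new("ProjectionBake")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

coord = nodes.new('ShaderNodeTexCoord')
bg_tex = nodes.new('ShaderNodeTexImage')
bg_tex.image = bg_img
emit = nodes.new('ShaderNodeEmission')
out = nodes.new('ShaderNodeOutputMaterial')

links.new(coord.outputs['Window'], bg_tex.inputs['Vector'])
links.new(bg_tex.outputs['Color'], emit.inputs['Color'])
links.new(emit.outputs['Emission'], out.inputs['Surface'])

# Unconnected Image Texture node holding the blank image; it must be the
# active node so the bake writes into that image
target = nodes.new('ShaderNodeTexImage')
target.image = bake_img
nodes.active = target

# Assign the material to the object
if obj.data.materials:
    obj.data.materials[0] = mat
else:
    obj.data.materials.append(mat)

# Bake the emission into the UV map (2.7x selection API)
scene.objects.active = obj
obj.select = True
bpy.ops.object.bake(type='EMIT')

# Save the result so it isn't lost when the file is closed
bake_img.filepath_raw = "//baked_projection.png"
bake_img.file_format = 'PNG'
bake_img.save()
```

After the bake you can plug the baked image into a plain material using UV coordinates, and it will stay stuck to the pieces when they fracture and move.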

Awesome, thanks Secrop! I will give it a shot. I’m using Blender Internal right now, but it shouldn’t be a problem to switch over seeing that there are only one or two pieces of geometry.

In BI it should be more or less the same: bake the texture that uses Window coordinates into the empty texture that uses the UV map coordinates.
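In script form, the BI version might look something like this (an untested sketch, 2.7x API; names and resolution are placeholders): the background texture slot gets Window mapping, the blank image goes onto the UV faces, and the bake type is set to Textures.

```python
import bpy

obj = bpy.data.objects["Ground"]   # placeholder: the unwrapped mesh
scene = bpy.context.scene
scene.render.engine = 'BLENDER_RENDER'

# Blank image that will receive the bake
bake_img = bpy.data.images.new("BakedProjectionBI", width=2048, height=2048)

# Map the existing background texture with Window coordinates
mat = obj.active_material
slot = mat.texture_slots[0]        # assumes the background texture sits in slot 0
slot.texture_coords = 'WINDOW'

# BI bakes into the image assigned to the UV faces in the UV/Image editor
for poly in obj.data.uv_textures.active.data:
    poly.image = bake_img

# Bake textures only (no lighting) into the assigned image
scene.render.bake_type = 'TEXTURE'
scene.objects.active = obj
obj.select = True
bpy.ops.object.bake_image()

bake_img.filepath_raw = "//baked_projection_bi.png"
bake_img.file_format = 'PNG'
bake_img.save()
```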

Thanks. I ended up having to go a different route instead of doing an earthquake simulation because initially I couldn’t find a solution to my issue. Now it’s just a big gaping hole which will have to suffice.
I should bookmark this thread though for future reference.