Is it possible to subdivide ONLY the uv map?

I know it may be a weird question, so let me explain why I would like to do this.

I am using Blender to generate geometry for terrains in a game I am prototyping. I have a cube made of 16x16x6 individual planes (six faces, each a 16x16 grid of planes). I then subdivide these several times to get different levels of detail. Then I deform this into a sphere (as my terrain is spherical). In my game, I deform the vertices according to a noise algorithm.

Of course, the more detailed the cube, the larger the file gets, to the point of being unwieldy. I am already at the point where I had to script all of these operations in Python, as the UI cannot handle it without crashing.
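
For reference, the scripted subdivision looks roughly like this; a minimal sketch assuming a recent bpy/bmesh API, where the object name and the subdivision count are placeholders:

```python
import bpy
import bmesh

# Hypothetical object name; in practice this loops over all 16x16x6 planes.
obj = bpy.data.objects["TerrainPlane"]
me = obj.data

bm = bmesh.new()
bm.from_mesh(me)

# Each pass cuts every edge once, roughly quadrupling the face count.
for _ in range(4):  # subdivision level is a placeholder
    bmesh.ops.subdivide_edges(bm, edges=bm.edges[:], cuts=1, use_grid_fill=True)

bm.to_mesh(me)
bm.free()
me.update()
```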

Anyway, I realized that the only things the planes have in common are the following:

  • the object origin (always (0, 0, 0))
  • the UV map

Before subdividing (in fact, before separating the cube into multiple separate objects), I UV mapped the cube using sphere projection. After I separate the cube into multiple planes, each one has its own UV region.

What I would like to do is have a file with the lowest-resolution geometry, but with a VERY highly detailed UV map. Then, all I need is a separate file with six planes, each with a very high level of subdivision to match the UV map. In my game, I can swap in the vertex info as needed. This would reduce space on disk enormously.
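
To make the layout concrete, here is a minimal sketch of the idea with a hypothetical uv_grid helper; the 1/6-wide UV region and the 256 cuts are placeholder assumptions about how the six faces share the map:

```python
# One shared high-resolution UV grid per plane region, generated once,
# while only the low-resolution vertex data lives on disk.
def uv_grid(u0, v0, u1, v1, n):
    """Return (n+1) x (n+1) UV coordinates covering one plane's UV region."""
    return [
        (u0 + (u1 - u0) * i / n, v0 + (v1 - v0) * j / n)
        for j in range(n + 1)
        for i in range(n + 1)
    ]

# e.g. the first of the six cube faces, subdivided 256x256 in UV space only
uvs = uv_grid(0.0, 0.0, 1.0 / 6.0, 1.0, 256)
```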

So, all that being said: is it even possible to have a UV map with a higher level of detail than the mesh itself? If not in Blender, perhaps in another app?

You can bake a texture using a detailed mesh and UV map, then use that texture on a simple, unsubdivided plane, or on a plane with as many or as few subdivisions as you like. Once the texture image is created, it is independent of the actual UVs or mesh geometry used to create it. It is simply an image file. You then get to decide how and where you want to use that image file on your mesh.
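
The bake itself can be scripted too; here is a minimal sketch assuming Cycles and a recent Blender (2.8x+) API, with hypothetical object and image names, and a material that already uses nodes:

```python
import bpy

bpy.context.scene.render.engine = 'CYCLES'

obj = bpy.data.objects["DetailedPlane"]  # hypothetical high-res mesh
img = bpy.data.images.new("terrain_bake", 4096, 4096)

# Cycles bakes into the active Image Texture node of the material.
mat = obj.active_material
node = mat.node_tree.nodes.new('ShaderNodeTexImage')
node.image = img
mat.node_tree.nodes.active = node

bpy.context.view_layer.objects.active = obj
obj.select_set(True)
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'}, use_clear=True)

img.filepath_raw = "//terrain_bake.png"
img.file_format = 'PNG'
img.save()
```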

Your confusion seems to arise from the idea that, because you can use UVs to create the texture image, that image is then somehow linked to those UVs. It is not.

Now, your idea to use a single texture image and a single plane to simulate six texture images on six planes is probably possible, as well. You need to look into the UV Warp Modifier to see how that’s done. It works in animation. I don’t know if it works in the Game Engine, or if it is exportable to other engines.
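
If you want to experiment with it, the modifier can also be added from Python; a minimal sketch with hypothetical object names, where two empties drive the UV offset:

```python
import bpy

obj = bpy.data.objects["Plane"]          # hypothetical target mesh
mod = obj.modifiers.new(name="UVWarp", type='UV_WARP')
mod.uv_layer = "UVMap"                   # the UV map to shift

# The modifier offsets UVs by the transform difference between two objects,
# so moving "WarpTo" relative to "WarpFrom" slides the mapping around.
mod.object_from = bpy.data.objects["WarpFrom"]
mod.object_to = bpy.data.objects["WarpTo"]
```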

Welcome to BlenderArtists :smiley:

I think there is a little confusion. The UVs are not used to create a texture at all; the UV map is being used to determine where to sample noise from a noise function. For instance, if the UV coordinate for a vertex is (0.5, 0.5), I pass that to my noise function and get a value to use for the topography of the plane. The noise function knows to return data that is sphere-projected, and as such assumes the UVs were created with sphere projection.
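
In case it helps clarify, here is a minimal sketch of that kind of sampling using mathutils; the inversion formula is an assumption about how the sphere projection maps UVs to longitude/latitude:

```python
from math import cos, sin, pi
from mathutils import noise, Vector

def height_from_uv(u, v, scale=2.0):
    # Assumed inverse of the sphere projection: u wraps longitude, v spans latitude.
    lon = (u - 0.5) * 2.0 * pi
    lat = (v - 0.5) * pi
    # Point on the unit sphere corresponding to this UV coordinate.
    p = Vector((cos(lat) * cos(lon), cos(lat) * sin(lon), sin(lat)))
    # Sampling 3D noise at the sphere position keeps the terrain seam-free.
    return noise.noise(p * scale)

print(height_from_uv(0.5, 0.5))  # the example coordinate from above
```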

So basically, this has nothing to do with textures. The UVs exist solely to direct the noise algorithm to return noise for that position. I wonder if that UV Warp Modifier could be leveraged for this purpose as well?

Oh, and thank you for the welcome :smiley:

Thanks for the info, but this actually has nothing to do with textures. The UV map is only used to provide coordinates for a noise function within my code; no actual texture is ever produced or used.

Thanks for the welcome :slight_smile: