Python dungeon/level generator

Been thinking about it, we don’t have access to vertex groups in the engine,

so maybe paint the vertices that can be ‘corner formed’ a colour, and then after the maze is built, at the ‘inside corner’ sections where two walls meet to form a corner, the overlapping sections’ vertices could be ‘zipped’ together?
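Something along these lines maybe (just a rough sketch, assuming the maze walls have been joined into one mesh object, the corner verts are painted red, and going through BGE’s KX_MeshProxy/KX_VertexProxy access):

```python
# Rough sketch of the 'zip' idea: find verts painted with the assumed 'corner formed'
# colour (red here) and snap any that land almost on top of each other together.
from bge import logic

def zip_corner_verts(obj, tolerance=0.01):
    """Weld painted verts that end up within the tolerance of each other."""
    mesh = obj.meshes[0]
    painted = []
    for mat_id in range(mesh.numMaterials):
        for i in range(mesh.getVertexArrayLength(mat_id)):
            vert = mesh.getVertex(mat_id, i)
            r, g, b, a = vert.color
            if r > 0.9 and g < 0.1 and b < 0.1:  # assumed 'corner formed' paint colour
                painted.append(vert)

    # 'Zip': move any two painted verts that sit within the tolerance to one position.
    for i, v1 in enumerate(painted):
        for v2 in painted[i + 1:]:
            dx, dy, dz = v1.x - v2.x, v1.y - v2.y, v1.z - v2.z
            if dx * dx + dy * dy + dz * dz < tolerance * tolerance:
                v2.XYZ = v1.XYZ

zip_corner_verts(logic.getCurrentController().owner)
```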


I loaded the blend and have dungeonGenerator.py open in the Text Editor, but Alt-P does nothing.

@hike1: it only runs in the game engine, not as a module/extension in blender. So you need to be in game engine mode and then press ‘p’ in the 3D view.

@BPR: That’s a good idea. Have you made any progress implementing it? I wonder if moving the verts around would slow things down compared to just having a handful of different wall tiles and selecting the appropriate one?
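For the tile-selection route, a neighbour lookup could pick the right pre-made wall piece; the tile names and the 0/1 maze grid below are made up just to show the idea:

```python
# Map the pattern of wall neighbours (north, east, south, west) to a pre-made tile
# plus a rotation. Tile names and the grid layout are placeholders for this sketch.
WALL = 1

TILE_FOR_NEIGHBOURS = {
    (0, 0, 0, 0): ('pillar', 0),
    (1, 0, 1, 0): ('straightWall', 0),
    (0, 1, 0, 1): ('straightWall', 90),
    (1, 1, 0, 0): ('cornerWall', 0),
    (0, 1, 1, 0): ('cornerWall', 90),
    (0, 0, 1, 1): ('cornerWall', 180),
    (1, 0, 0, 1): ('cornerWall', 270),
    # ...plus the T-junction and cross cases
}

def pick_wall_tile(maze, x, y):
    """Return (tile name, rotation in degrees) for the wall cell at (x, y)."""
    def is_wall(cx, cy):
        inside = 0 <= cy < len(maze) and 0 <= cx < len(maze[0])
        return int(inside and maze[cy][cx] == WALL)

    key = (is_wall(x, y - 1), is_wall(x + 1, y), is_wall(x, y + 1), is_wall(x - 1, y))
    return TILE_FOR_NEIGHBOURS.get(key, ('straightWall', 0))
```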

Battery, I think I solved the claustrophobic issue: what about having each tile represent a road and each wall a building, and generating the buildings using the same system but with smaller pieces?

Behind the buildings, in the ‘gap’, we could somehow draw terrain like your forest demo?

Have you downloaded the new UPBGE build with geometry instancing?

Like mazes within mazes!? Then you could use each building section’s x,y position as a seed and just generate them as needed. Worth a shot.
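The x,y-as-seed part could look roughly like this in Python (the world seed and lot size are just placeholders):

```python
import random

# Each building lot gets a deterministic random stream derived from a world seed and
# its grid position, so it can be regenerated identically whenever it's needed.
WORLD_SEED = 1234  # placeholder global seed

def generate_building(x, y, size=8):
    """Return a small sub-maze layout for the building lot at grid position (x, y)."""
    seed = (WORLD_SEED * 73856093) ^ (x * 19349663) ^ (y * 83492791)  # combine into one int
    rng = random.Random(seed)
    return [[rng.choice((0, 1)) for _ in range(size)] for _ in range(size)]

# The lot at (3, 7) comes out the same every time it's generated:
assert generate_building(3, 7) == generate_building(3, 7)
```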

Here are a couple of simpler examples of using it to generate a city:
cityExample.blend (759 KB)
cityExample2.blend (816 KB)

The generator could be integrated with the terrain generator I’m working on. But I’m not in a position to think about how they sit together until I’ve got the terrain generator working fully. Presently I’ve nearly finished procedural grass using a geometry shader to generate individual blades:


Then I plan to add some vegetation shaders and a weather cycle, and finish off with a few optimisations. The whole thing already runs at a fairly stable 50 fps with grass, terrain, procedural clouds and a day-night cycle, but I can do better! Then this project should be ready to mix in with procedural buildings and caves.

Are you using upbge with instancing?

All that grass just begs for it…

can you store the maze array as a kd tree and dynamically add and remove the pieces?

I use two radii to make a close list and a far list.
Any item inside the big radius but not the small one goes on the far list.

(Items on the close list need to be added to the scene, if not there already.)
Items on the far list get deleted…

Like a loading ‘bubble’ around the player.
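Roughly like this, say (a sketch only; the BGE addObject calls and the ‘mazeTile’ name are placeholders):

```python
# Two-radius streaming sketch: tiles inside the near radius get spawned, tiles
# beyond the far radius get removed. Assumes a BGE scene with the tile object on
# an inactive layer.
from bge import logic

NEAR_RADIUS = 30.0   # tiles inside this must exist in the scene
FAR_RADIUS = 60.0    # tiles outside this get removed

spawned = {}  # (x, y) -> KX_GameObject

def update_tiles(player, tile_positions, tile_size=4.0):
    scene = logic.getCurrentScene()
    px, py = player.worldPosition.x, player.worldPosition.y

    for (x, y) in tile_positions:
        wx, wy = x * tile_size, y * tile_size
        dist = ((wx - px) ** 2 + (wy - py) ** 2) ** 0.5

        if dist < NEAR_RADIUS and (x, y) not in spawned:
            # Close list: add the piece if it isn't there already.
            tile = scene.addObject('mazeTile', player)  # placeholder object name
            tile.worldPosition = (wx, wy, 0.0)
            spawned[(x, y)] = tile
        elif dist > FAR_RADIUS and (x, y) in spawned:
            # Far list: delete pieces that drifted out of range.
            spawned.pop((x, y)).endObject()
```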

Nah, I tried upbge but it didn’t offer any benefits for me. The grass doesn’t really need instancing. All the grass geometry is created procedurally on the graphics card, generating each individual blade on the geometry shader. It’s already pretty fast and there’s room for improvement. I’ve managed to generate some insane numbers of grass blades with it. Also, instances would just introduce repetition, which I’m trying to hide, and instancing doesn’t work with custom shaders. The main bottlenecks are the Python and the alpha planes on the trees.

Generating terrain with Python is a hefty task. I’ve spread it out over several frames to lessen the impact, but the more granular each step becomes, the more often you’ll reach a segment that hasn’t finished being generated; if there’s too much in each step you’ll get a massive drop in fps as you approach a new area. It’s a fine balancing act, and upbge does nothing to improve Python’s speed. Really there are bits that need to be in C/C++ (stuff that’s simply missing from the vert, poly and mesh API, so you currently have to do it in Python).
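The general shape of the frame-spreading is something like this (simplified sketch, not the real code): keep each chunk’s work in a generator and pull steps until a small per-frame time budget is spent.

```python
import time

def build_terrain_chunk(chunk_steps):
    """Generator doing one small slice of a chunk's generation per step."""
    for step in range(chunk_steps):
        # ...heavy per-step work here...
        yield  # hand control back to the game loop

pending = []  # generators for chunks still being built

def tick(budget_ms=3.0):
    """Call once per frame (e.g. from an Always sensor); stops when the budget runs out."""
    deadline = time.perf_counter() + budget_ms / 1000.0
    while pending and time.perf_counter() < deadline:
        try:
            next(pending[0])
        except StopIteration:
            pending.pop(0)
```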

As for the alpha planes, they really slow down the rasterizer. So adding more trees causes the frame rate to drop. Upbge doesn’t lessen this either. Finally, upbge just seems to mess with the node materials. Otherwise I would use it. I’m keeping an eye on the project though.

You could do, but lookups on a 2D array are already pretty fast. I haven’t really used KD trees so it’d be one to test.
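If you do test it, Blender’s Python ships mathutils.kdtree, which handles the radius queries directly; a minimal sketch with a placeholder grid:

```python
from mathutils import kdtree

tile_positions = [(x, y) for x in range(50) for y in range(50)]  # placeholder maze grid

tree = kdtree.KDTree(len(tile_positions))
for i, (x, y) in enumerate(tile_positions):
    tree.insert((x, y, 0.0), i)
tree.balance()

# All tiles within 30 units of the player, without scanning the whole array:
nearby = tree.find_range((12.0, 30.0, 0.0), 30.0)  # list of (position, index, distance)
near_indices = {index for _, index, _ in nearby}
```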

Here are some test tiles for building a road (no ‘lots’ to generate random buildings on yet*).

Attachments

AtlasRhoads.blend (6.06 MB)

Wicked, very sci-fi looking.

Thank you =) I think they all need a second UV layer (a dirty overlay), plus trash instances, broken cars, etc.

Looked it over, and apparently I need to keep them under 300 vertices each if instancing is to add any performance boost.

I am not sure if I need to worry about instancing for the road though, as the draw distance will not be super far. But for the building pieces I should definitely make it a priority to keep each piece under 300 vertices.

I’d make a couple of versions of each road piece, with slightly different dirt/crack texture maps. It should help hide the repetition a bit. Then in Python you can do something like: random.choice(['straightRoad1', 'straightRoad2'])
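For example (the object names here are placeholders for whatever pieces sit on the hidden layer):

```python
import random
from bge import logic

scene = logic.getCurrentScene()
spawner = logic.getCurrentController().owner

# Pick one of the road variants each time the generator places a straight piece.
variant = random.choice(['straightRoad1', 'straightRoad2'])
road = scene.addObject(variant, spawner)
```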

Alternatively, you could try texture nodes to play with the mapping of the dirt/crack texture depending on the object’s position in world space.

I’ve recently come to learn that half of procedural environments is clever texturing to keep the environment looking interesting. Even if the level itself is varied, if the textures don’t change it still looks predictable.

I made a few foundations, a door and two window sections… trying to make them flexible, low poly, and able to build at least a few different buildings using the same pieces.

I could use some help re-writing the generator to account for the shapes I am using

edit: got it!

Thank you so much for this. Quick question though: I’m using the “attached walls” solution but cannot generate doors for each room properly.

geometry instancing is in!

static draw call batching is in soon :smiley:

(this should open this up quite a bit as far as polygon density)