[Dev] OpenVDB Based Particle Mesher Modifier

I actually uploaded the patch to developer.blender.org after the last dev meeting, as requested by Ton; you can check it out here: https://developer.blender.org/D1008. And yes, OpenVDB needs to be compiled and installed separately.

Hi KWD,

Great work on the mesher, looks really good!
I was wondering: when using this mesher with the particle system, does it produce the same artifacts (the jagged edges) as the normal Blender fluid simulator, as in the following picture?
http://i.stack.imgur.com/NfPsQ.png

I really HOPE not!

That’s quite an extreme example, you shouldn’t get such artifacts; at least I haven’t encountered them so far. :slight_smile: Again, the mesher used by the fluid system is quite archaic…

Are there any test videos yet?

@Kevin great news :slight_smile: Thanks a lot for your work and high-quality code. Now we can think about real VFX in Blender. Quick test: 200K particles.

http://s30.postimg.org/afofhac0f/VDB2.gif

Hi KWD,

I have a problem with your patch: at the end of the build, I got a linking error.

edit: @KWD, thanks for your help. I managed to compile it.

Just looked at what OpenVDB can do, and to me it seems like coding heaven :).
Really looking forward to having it inside Blender. I guess it’s then a small step towards efficient volumetric sculpting :slight_smile:

Yeah, Blender voxel grid, ho!

@KWD,

I made a simulation with 600,000 fluid particles.
After a certain number of frames, the mesh disappears and I get this error:

OpenVDB exception ArithmeticError : Detected Overflow in bbox volume computation, use uint64 for the PointIndexT type in the PointPartitioner.

I noticed that this error disappears when I increase the voxel size, but that degrades the mesh quality, and it only delays the problem until later in the animation.
I hoped to work around the problem by using a mask object,
but when I tried one, I ran into another error:

OpenVDB exception IoError : could not open /home/kevin/parts2.vdb for writing

I don’t really know what that means; I would probably need to investigate a bit to see at what point of the conversion it comes from.

OpenVDB exception IoError : could not open /home/kevin/parts2.vdb for writing

This is because I left some debug code in the mesh-to-volume function that writes out a VDB file, so I could check whether the conversion went fine. Normally it shouldn’t be an issue.
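
For context, that leftover debug write was probably something like the standard OpenVDB file I/O pattern below. The hardcoded path is taken from the error message; the rest is guessed for illustration and is not the actual patch code:

```cpp
#include <openvdb/openvdb.h>

// Sketch of a leftover debug write: the usual OpenVDB I/O pattern, with a
// path that only exists on the developer's machine, which is why it throws
// an IoError everywhere else. (Illustration only, not the patch's code.)
void debugWriteGrid(const openvdb::FloatGrid::Ptr& grid)
{
    openvdb::io::File file("/home/kevin/parts2.vdb"); // hardcoded debug path
    openvdb::GridPtrVec grids;
    grids.push_back(grid);
    file.write(grids); // throws openvdb::IoError if the path isn't writable
    file.close();
}
```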

@zeauro Almost forgot, could you share your file for testing? :slight_smile: (To reduce the file size, turn off visibility for both the particle system and the mesher.)

I’m not sharing the first file because it takes a while to bake 600,000 particles,
so I made another one with 200,000 particles.

In fact, it is quite easy to reproduce the problem: you just have to let the particle system’s bounding box grow.
It cannot grow indefinitely without requiring an increase in voxel size.

The error mentioning your home directory only appears when trying to use a mask object.
I built it with a boolean modifier; maybe that is related.

Zeauro - basically it means your particles fly too far from the source? It’s logical, since OpenVDB processes the meshing in a volume, so the memory demand grows as the particles get farther away…

I guess there should always be a limiting bounding box; this would save memory, and probably render time too, so there’s no meshing too far outside your shot…
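
For what it’s worth, OpenVDB itself has a clipping tool that could serve as such a limit. A minimal sketch, assuming openvdb/tools/Clip.h and its clip() function are available in the library version you build against (whether the modifier exposes anything like this is another matter):

```cpp
#include <openvdb/openvdb.h>
#include <openvdb/tools/Clip.h>

// Hypothetical helper: crop a grid to a world-space box so particles that
// stray far outside the shot no longer inflate the meshed volume.
openvdb::FloatGrid::Ptr clipToBox(const openvdb::FloatGrid& grid,
                                  const openvdb::BBoxd& worldBox)
{
    // clip() returns a new grid containing only the voxels inside the box.
    return openvdb::tools::clip(grid, worldBox);
}
```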

Now that I know that, I can easily create a limiting bounding box before baking particles.
I was just wondering whether OpenVDB is able to handle it without a bounding box.

What is this PointPartitioner? Does it mean there is a way to decrease memory use by modifying the code?

If it could, OpenVDB would be quite fantastic for artists’ freedom.
It would allow a really big scene with multiple camera points of view and the same mesh quality for each.

The PointPartitioner is a tool that was just added to the library (v3.0). I don’t really know its inner workings, but it essentially spatially sorts and partitions a point list (here, the particle system) into a series of buckets for easier processing.
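
The exception message itself points at the workaround: the partitioner is templated on the point index type, so a 64-bit instantiation should avoid the bbox volume overflow. A rough sketch, with the template parameters assumed from the v3.0 headers (check openvdb/tools/PointPartitioner.h for the exact signature):

```cpp
#include <openvdb/openvdb.h>
#include <openvdb/tools/PointPartitioner.h>

// Default partitioner: 32-bit point indices (assumed default in v3.0).
using Partitioner32 = openvdb::tools::PointPartitioner<openvdb::Index32, 3>;

// Widened variant, as the exception message suggests: 64-bit indices so a
// huge particle bounding box no longer overflows the volume computation.
using Partitioner64 = openvdb::tools::PointPartitioner<openvdb::Index64, 3>;
```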

I would not worry too much about OpenVDB’s memory usage at the moment. First, because the tool that I’m using to convert the particle system (openvdb::tools::ParticlesToLevelSet) was drastically improved and can now handle tens to hundreds of millions of particles; during the production of “How to Train Your Dragon 2”, they (DWA) converted about 300 million particles with this very tool. Second, in quite a few cases Blender’s particle system uses more memory than VDB during the conversion.
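
For reference, the documented usage of that tool is roughly the following: you wrap your point data in a small adapter class exposing size()/getPos()/getPosRad() and hand it to the rasterizer. This is a minimal sketch, not the patch’s actual code; the ParticleList struct and the constants are made up for illustration:

```cpp
#include <vector>
#include <openvdb/openvdb.h>
#include <openvdb/tools/ParticlesToLevelSet.h>

// Minimal adapter exposing the interface ParticlesToLevelSet expects.
struct ParticleList {
    typedef openvdb::Vec3R PosType; // required typedef

    std::vector<openvdb::Vec3R> positions;
    openvdb::Real radius = 0.1; // uniform radius, for simplicity

    size_t size() const { return positions.size(); }
    void getPos(size_t n, openvdb::Vec3R& p) const { p = positions[n]; }
    void getPosRad(size_t n, openvdb::Vec3R& p, openvdb::Real& r) const {
        p = positions[n];
        r = radius;
    }
};

int main()
{
    openvdb::initialize();

    // Narrow-band level set: 5 cm voxels, 3-voxel half-width.
    const float voxelSize = 0.05f;
    const float halfWidth = 3.0f;
    openvdb::FloatGrid::Ptr ls =
        openvdb::createLevelSet<openvdb::FloatGrid>(voxelSize, halfWidth);

    ParticleList particles;
    particles.positions.push_back(openvdb::Vec3R(0.0, 0.0, 0.0));

    // Rasterize the particles as spheres into the level set.
    openvdb::tools::ParticlesToLevelSet<openvdb::FloatGrid> raster(*ls);
    raster.rasterizeSpheres(particles);
    raster.finalize();

    return 0;
}
```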

When it comes to your file, the SPH solver becomes “unstable” when it collides with Suzanne due to imprecision caused by not having enough subframes. As a result, the particle system “explodes” and quite a few particles end up isolated and scattered around Blender’s world, which screws up the level set generation big time. In such a scenario, you’re basically asking for an error. Just go into the particle settings and crank up the subframes setting. :wink:

As for the masking, it’s really buggy at the moment; I managed to screw things up and it performs even worse than in my last video:


In a bug-free environment, only the inside of the selected cube should contain mesh. Here it only generates a mesh on the positive X axis side because the mask is on that side; if I moved the cube over the 3D grid, it would cut everything underneath the grid, and so on… quite a headache :confused:

I know. More subframes also means more computation and more baking time.
I had the same problem with my 600,000-particle blend file, which took 45 hours to bake 800 frames.

I should probably use the classical solver instead of double density, but the final result with that solver is less predictable. With the double density solver, it is easier to do some tests with a low particle count and then increase it.
I miss Mantaflow’s solvers.

The ideal would be viewport/properties/node tools for GPU-accelerated particle physics, meshed by OpenVDB.

zeauro - Bullet physics has SPH (and has had it for years), and it would probably perform well thanks to good multithreading, and GPU support in the latest version. I never got why Blender coders implement things without harnessing something that is already there.

Blender’s particle system has nothing to do with Bullet and mixing the two together isn’t at all a simple task.

The hard truth is that particles need to be COMPLETELY redone, which in turn requires a lot of Blender’s core to be completely redone. And as of right now, no one is willing to tackle this. Lukas is being a champ, tackling some much-needed hair updates, but talking to him on IRC, he is more than fed up with the particle system. To paraphrase him from the other day: the particle system needs a complete rewrite, but completely rewriting the particle system is next to impossible right now.

Are you a developer?
Well, Bullet has been around for many years, doing a good job at collision and simulation in various areas. A lot has been done on particles in Blender during that period; the cloth and hair systems were also worked on. If there is a better model for simulating something, it could have been committed to Bullet, so it gets more stable thanks to the Bullet developers too, like the current hair rework you mention, where Bullet collisions could have been used. But I guess this discussion is getting too off-topic…

The good thing is, hopefully we’ll have a great mesher soon :slight_smile: