Particle nodes..?

So that just didn’t happen or what? I mean, there were even test builds 2 years back but not a word since it seems…

Hair and particles

The Gooseberry open movie project requires sophisticated control of hair and fur on animated characters. We’re still working on fixing and updating the old system, replacing parts of it with newer code that offers more control.
New is much better control over the physics simulation of hair, including proper collisions using Bullet. There will be a better hair edit mode and hair keyframing as well.
Ideally this was all meant to become a node system. For that, the outcome is still uncertain at this moment.
Working on it: Lukas Toenne
Likely to happen in 2015: 100% for working sims, 50% for nodes

Everything Nodes

So while we’re on particle nodes anyway – we should check on having Modifier nodes, Constraint nodes, Transform nodes, generative modeling nodes, and so on.
And – why not then try to unify it into one system? Here a lot of experimenting and design work is needed still. It’s surprising and inspiring to see how well Python scripters manage to do this (partially) already.
Working on it: Lukas Toenne, Campbell Barton, …
Likely to happen in 2015: Unknown.

Aah, thanks Richard, I did some searches yesterday before posting this but couldn’t find any recent info on this, much appreciated.

With that said, sad-quotes-R-us… Past tense indicates it’s an unlikely outcome. :frowning:

Ideally this was all meant to become a node system. For that, the outcome is still uncertain at this moment.

And about “everything nodes” I’m guessing a system like the 3ds Max Creation Graph would be possible to implement, use the current modifiers as base for a node based GUI aimed at point/mesh manipulation. That would be very cool.

Short answer: the existing particle system needs to be torn out at the roots and completely rebuilt, along with the cloth and hair systems, and their underlying physics engine. All of these existing systems are hopelessly hacky and broken (according to Lukas) and were stacked on top of one another in a way that was never intended. It would be completely futile to attempt to drop nodes on top of the current system. The fix will not be easy.

Matt (sorry to just use your name, it seems weird to direct a comment to M9105826, and I don’t know if this board lets us target users),

How likely is that to happen? Is an overhaul like that, following the 2.4 series, in order? A possibility? Hearing people gripe about the particles (as well as using the built-in pFlow in 3ds Max, and investigating Thinking Particles, Bifrost in Maya, ICE in Softimage, the new 3ds Max ICE knockoff, Houdini, etc.)… it really seems like the “new standard” in 3D packages is being set, and is even passé…

While there are a number of little workflow gripes that people talk about (and it would be REALLY nice to have those changes and options)… the “big picture” items seem to be more and more pressing.

And, while I understand that the Open Movies are a particular tack decided upon by the Blender Institute (Foundation?), I’m sure I am not the only one who questions the wisdom of trying to develop tools in the middle of a production (vs. refining tools… every production exposes things that need tweaking). Not to turn this into a Gooseberry gripe… but…

Where is the concerted effort to get under the hood and update all of this? To give us a new engine with which to ride into the new era? To give us particles that are more than marginally better than what 3ds Max had in 1999?

You seem to have an ear to the ground and a finger on the pulse…what are the heads thinking about this? The changes needed are comprehensive…the kind that require putting a halt on productions and lesser concerns…you gotta get a car off the road to fix its engine…

What do they want to do? How do they want to do it? Are they putting it off? For a particular reason?

From what I remember…the gooseberry funding drives were a little less than spectacular (perhaps I remember incorrectly)? And I always attributed that to a little bit of fatigue and frustration with the movies as a development vehicle. But…I would imagine a drive to accomplish these overhauls would be an order more effective? I know I would donate…

Sorry about the rant… but what solutions do you think the powers-that-be are leaning toward? If it’s all just talk… that doesn’t seem like it’s being taken seriously? I suppose I could review the road maps… but I feel like there is a sense that the incremental changes need to halt for a moment in order to take care of foundational issues?

Or am I just crazy and verbose?

This overhaul of physics systems and nodification of everything will probably be on the scale of the 2.4 -> 2.5 GUI/UI re-code. Meaning scratching almost everything piece by piece and re-doing it. I distinctly remember that physics sims and nodification of everything were planned for the next large Blender release (2.8+), and probably for the next open movie. So it’s not gonna happen during Gooseberry.

From the code blog:

For 2.8x we can target projects for bigger rewrites, with a lot of compatibility loss. Examples could be:

New “unified physics” systems, using much more of Bullet, unification of point caches (Alembic).
Particle nodes (could co-exist for a while with old particles though)
Nodification of more parts of Blender (modifiers, constraints)
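To make the “nodification” idea above concrete: at its core, any everything-nodes system is a dependency graph of operations evaluated in topological order (which is also why the depsgraph work keeps coming up in this thread). This is purely an illustrative sketch, not Blender code – the node names and string “meshes” are made up for the example.

```python
# Illustrative sketch of node-based evaluation (NOT Blender code):
# each node is a function, edges declare which node outputs feed which
# inputs, and evaluation walks the graph in dependency order.
from graphlib import TopologicalSorter

# Hypothetical "modifier node" chain: base mesh -> subdivide -> displace.
nodes = {
    "base_mesh": lambda deps: "cube",
    "subdivide": lambda deps: deps["base_mesh"] + "+subdiv",
    "displace":  lambda deps: deps["subdivide"] + "+displace",
}
# edges[n] = set of nodes that n depends on
edges = {"subdivide": {"base_mesh"}, "displace": {"subdivide"}}

def evaluate(nodes, edges):
    """Evaluate every node once, after all of its dependencies."""
    results = {}
    for name in TopologicalSorter(edges).static_order():
        deps = {d: results[d] for d in edges.get(name, ())}
        results[name] = nodes[name](deps)
    return results

print(evaluate(nodes, edges)["displace"])  # cube+subdiv+displace
```

The appeal of unifying modifiers, constraints, and particles this way is that they all reduce to the same pattern: nodes plus a dependency graph, evaluated in one pass.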

Which sounds amazing; I just wish drivers could be added to that list. Doing drivers in the F-Curve editor still feels wonky to me – you input logic as expressions that can’t easily be re-used. A node-based approach would be awesome IMO.
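For what it’s worth, there is a partial workaround today for the re-use problem: Blender lets you register Python functions in `bpy.app.driver_namespace`, so several drivers can share one algorithm instead of each repeating an inline expression. A sketch (the `bounce` function and its parameters are made up for the example; the `bpy` registration only works inside Blender):

```python
import math

def bounce(t, amplitude=1.0, decay=2.0):
    """Damped bounce curve – hypothetical reusable driver logic."""
    return amplitude * abs(math.sin(t * math.pi)) * math.exp(-decay * t)

try:
    import bpy  # only available when running inside Blender
    # Make the function callable from any driver expression field.
    bpy.app.driver_namespace["bounce"] = bounce
    # A driver expression could then read: bounce(frame / 24, 0.5, 1.5)
except ImportError:
    pass  # running outside Blender; nothing to register
```

It’s still text rather than nodes, but it at least centralizes the logic in one place instead of scattering it across F-Curve expression fields.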

So if I understand it correctly, the problem is more about BI not wanting to focus on them until they overhaul other parts of the dynamics system?

That’s not unreasonable, but it feels like that (more complete) overhaul is something that is a year in the future, at least… :stuck_out_tongue:

Some things come first. Trying to create a nodified solution without a proper depsgraph is just a waste of time. Software is built in layers, and the physics/simulation system relies heavily on the animation engine.

Seems like their hands are really full right now, I’d love me a good particle system but the things they’re working on are top priority, too.

Lots of big changes are waiting on other big changes. Depsgraph being the most important, but luckily that’s close to completion.

I don’t wish to add fuel to the gooseberry fire, but I will anyway. I get fatigued hearing about everyone ranting about the open movies as a “waste”. Nobody seems to understand that all the code in the world means nothing if it’s not battle tested in production. Features are awesome and all. They fill a nice bullet point list. But if they are poorly implemented and buggy, no one will use them.

The open movies put the devs in the same room as the artists and subject the code to the stresses of ACTUAL PRODUCTION. I can’t stress enough how important and awesome that is. Not to mention the good publicity the open movies generate in the 3D community, OR the production experience the devs are getting – knowing what artists want and need, and how to implement it.

Blender’s tool set would be half as good without the open movies.

Did you know that while Alias was developing Maya 1.0 they commissioned a short film called Bingo that was specifically for the purpose of testing out Maya in production before it was released?

/rant

Movies are great for adding features. Movies ARE NOT great for doing overhauls where they’re needed. It’s simply not possible to completely rewrite chunks of Blender with their roots deep and intertwined in the code while also trying to hit production goals within a certain time frame. Particles/hair/cloth in particular are going to need a concentrated effort where that’s one of the only things that’s touched for a while if Lukas is to be believed about how bad it is.

Oh, I agree wholeheartedly on that front. Ripping out the guts of the program is best done outside of a production. But once the physics rewrite is complete and in master, then an open movie would be a great way to test and refine it.

^ Yeah, and that’s the point I was making, and that Matt is saying also. In my meaningless opinion, if the new dependency graph is almost done, as Matt mentioned… and now that I think about it, I don’t know how much that has to do with Gooseberry or not… but it seems to me that a “finished” dependency graph should then be battle tested and tweaked by the open film, not developed simultaneously. Like someone else said, they’ve had to do a lot of work on hair and particles for Gooseberry… but it’s all kind of pissing in the ocean, since it’s not “battle testing” the overhaul – it’s the “final hack” of the old version. And… spending an entire open movie on a “final hack” instead of a “trial by fire” for something new is… naive.

Again, I don’t claim to know the order of Gooseberry developments, etc…

But…

If the last two years had seen a concentrated effort to fix the dependency graph, tear into the particles and physics, etc etc all the internals… then we would be right now seeing a “Gooseberry” production START to TEST those new features, instead of seeing a Gooseberry production ending with no new dependency graph, an even more hacked particle system, etc etc etc.

Now, I wasn’t around for Orange or Peach or Durian or any of that…but retroactively considering things… Elephant’s Dream, Big Buck Bunny, Yo Frankie, Sintel (especially Sintel? “Battle Testing” 2.5?), Tears of Steel…to a fair degree, it’s pretty clear what each of these films was testing and proving. I acknowledge, I don’t necessarily know answers to questions like “Was 2.5 built and then tweaked with Sintel, or was 2.5 built during Sintel?”… But regardless, each of these open projects you can clearly pick at least one or two MAJOR components of Blender that was being unveiled or updated and needed the trial by fire.

Gooseberry seems…missing that. Like they put the cart before the horse…and like Gooseberry could have been something far more meaningful and amazing if instead of raising money and writing scripts for the last two years, they were tearing into Blender and getting ready for battle. And then right now, with the promise of like, a quasi-Blender 3.0 on the table, who WOULDN’T donate to an Open Movie to temper that codebase like hot steel?

Or am I wrong?

And this is what I’ve been saying on this forum for years, wrapped up in a little bow. The open movie projects do more harm than good imo – at least in the way the BF is going about it. Like you said, features should be implemented first and then tested, not implemented while in production. That just gives us hacky implementations of tools, or tools that are incomplete and then never fully finished or explored as the next open movie project focuses on something else. It’s not a good way to go about developing production-worthy tools imo. Production tools are usually the hackiest tools around, because while in production you are looking for solutions, not polish. That’s why people like Autodesk take these production-level tools (a lot of Maya’s features started out as production tools developed by VFX companies) and polish them for mass consumption. Blender needs to do the same.

All of this is true, and overall I agree it’s how things should be done. But the world isn’t perfect. Once the “machine” starts to roll it’s hard to stop. If you look at some of the original scheduling for each of the open movies, the plan was always to develop the features first while scripting and early preproduction were happening, so the features would be ready for testing in time for full production. That never happens (welcome to real life), but it was always the goal. By then people have been hired and schedules have been locked down. Sometimes you can’t just stop the train.

So while the open movies don’t always get it right, they still do a whole lot of good. Everyone is blasting Blender’s hacky hair system – except has anyone noticed that it’s working extremely well now? It’s been almost useless for years and now it works. It will take years for a new hair system to materialize, and in the meantime we at least have a usable tool. Thank you, Gooseberry.

This doesn’t have to be the case, though, as long as the devs polish what they have developed after the movie is done. A good example of this is the motion tracking/greenscreen stuff developed for Tears of Steel: the original functionality was kind of rough in spots, but it was refined in subsequent releases, and now you hear a lot of good things about it.

The hair dynamics in Gooseberry however may be a different story because it’s part of a system that needs a massive overhaul, so eventually the code they put in will be ripped out with the rest of the particles once the new node-based system is ready, but there’s no doubt that there’s been some useful stuff done as a result which is unlikely to be considered a waste (the work on reducing peak memory in Cycles and the sequencer improvements).

Refining things like memory usage is, imo, what Open Movie projects should focus on – not major features that end up hacky and plain broken, or at least implemented in a very short-sighted manner. Some features become broken a year down the line simply because the developer wasn’t thinking ahead to things that come up later, since their main focus was on the open movie project.

The camera tracker and cycles are both good examples of features that were developed prior to the movie production, and then refined in production. They were also brand new components, with fresh code. The hair system is looking pretty good in gooseberry right now, but it is polish on top of a questionable architecture, rather than a final polishing pass of a brand new piece of code.

Node-based particle system in the Frostbite engine (500 MB pptx):
https://media.contentapi.ea.com/content/dam/eacom/frostbite/files/gdc2018-frostbitediceemittergraph.pptx