Thoughts about procedural animation. ".blend" template included.

http://3.bp.blogspot.com/-d9Bj-FOTNYI/VDnyCtkxSqI/AAAAAAAAEa8/XmC2vMPTa_0/s1600/AutoLeg.gif

Hi folks!

This thread is meant to document a technique created by our fellow @BluePrintRandom

There is a .blend template below. What is inside this template:

  1. One leg with traditional animation, as a reference.
  2. One leg made with Merril’s procedural technique.

DOWNLOAD LINK (<—Google Drive, saved in Blender 2.71)

Please note that even though the procedural leg is made out of parts, a continuous mesh will work too.

The cloth attached to the leg can be used to store the movement of the bone by using the “Record Animation” tool inside the Game menu. Unfortunately it’s not possible to store rotation yet, so it’s pretty experimental.

Merril and I are working more on the Game Engine front, however.

This .blend template is still very preliminary. There is a lot of room for optimisation.

Merril also proposes this workflow, which could be added to the Blender UI by default:

"
Select item

Insert keyframe - visual space keyframe:
adds an “empty” that is red.

The empty has a value, which is time; if there are two, the target is interpolated between them over time.

So keyframe 1 is the hand idle, keyframe 2 is at time 20 at the magazine location on the belt, keyframe 3 is at time 40 at the bottom of the gun, and keyframe 4 is at time 60, back at idle.

So the keyframes can be parented to items, and no matter what the current animation state is, the hand will reach all 4 points.

(If there are two keyframes with the same time stamp, it chooses the average of the two as the target.)

"

TO DO:

  • Hypothesis / idea: combine this template with a plane “face oriented” to “shadow”.

References

I think it’s also important to reference Steenberg’s animation technique here:
LINK (<—Quelsolaar.com)

Possible application:

All the best!

Ortiz

This system looks pretty cool; I never really looked much into the constraint animation style. I think the confuse animation demo makes it look a little easier to achieve with the GUI than it actually is, but from the .blend you shared it looks fairly straightforward.
Thanks for sharing!

The trend of “physics-based” animation is moving more towards an armature with a ragdoll matching it; the “procedural” animation can then be done with things like full-body IK on the armature itself rather than on the ragdoll. That’s what Endorphin, which started all the physics hype, does… which is exactly what I’m working on.
I’ve created a script that automatically creates a ragdoll out of an armature, assigns limits, and then applies a servo to match the armature. (Still not released; don’t confuse this with my old, useless, hard-coded ragdoll, though it’s essentially the same concept.)
The full-body IK part is still a very early WIP: so far I’ve gotten it to create a chain and then set an active direction for it, but it still doesn’t take limits into consideration and doesn’t work well with disconnected bones. That Steenberg thing is full-body IK, not ragdoll, by the way.
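For context, a minimal sketch of the servo idea (this is not Jackii’s script; the object name, channel name and gain are placeholders):

```python
# Hedged sketch: drive one rigid-body ragdoll part toward the world orientation
# of its armature bone with a simple proportional angular-velocity "servo".
import bge

def servo_match(cont):
    part = cont.owner                                        # ragdoll rigid body
    armature = bge.logic.getCurrentScene().objects["Armature"]
    channel = armature.channels["upper_arm.L"]               # BL_ArmatureChannel
    # Bone orientation in world space = armature world * bone pose matrix
    target = armature.worldOrientation * channel.pose_matrix.to_3x3()
    # Rotation still needed to take the part onto the target orientation
    delta = target * part.worldOrientation.transposed()
    axis, angle = delta.to_quaternion().to_axis_angle()
    gain = 10.0                                              # assumed servo gain
    part.worldAngularVelocity = axis * angle * gain
```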

About iTaSC: it uses equations that are too heavy and unneeded for IK. I don’t find it useful over the standard solver in any case, even for robotics.

I think both are applicable.

(For example, I use an object to scale my walk cycle, and another object to change the stride angle of the “Action Armature”, and the ragdoll walks differently accordingly.)

With better base animations and less “hard-coded” tooling it could be amazing.

Punch start, punch hit location, punch return… -> so when the player punches an enemy in the face, it connects even if they are slightly offset.

Right now my foot IK targets are parented to a bone that copies the scale and rotation of an object used to manage stride length and angle in real time.
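A rough sketch of that controller setup (the empty’s role, key bindings and step size are assumptions; the Copy Scale / Copy Rotation constraints on the bone live in the rig itself):

```python
# Hedged sketch: scale/rotate one "stride controller" empty from the keyboard;
# a bone with Copy Scale and Copy Rotation constraints follows it, and the foot
# IK targets are parented to that bone.
import bge

def adjust_stride(cont):
    controller = cont.owner                     # the stride-controller empty
    keys = bge.logic.keyboard.events
    step = 0.02
    if keys[bge.events.UPARROWKEY] == bge.logic.KX_INPUT_ACTIVE:
        controller.localScale *= (1.0 + step)               # longer / higher stride
        controller.applyRotation((0.0, step, 0.0), True)    # steeper stride angle
    if keys[bge.events.DOWNARROWKEY] == bge.logic.KX_INPUT_ACTIVE:
        controller.localScale *= (1.0 - step)
        controller.applyRotation((0.0, -step, 0.0), True)
```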

Edit:
I will try to present a hand animation system that can use actions and switch to “space frames” and back… put together in a sensible manner.

PS: I can’t wait to see the new rig generator, Jackii!

Hello guys! Thank you so much for the warm reception! Working with computer graphics can be a very lonely job, but when we have a conversation at this level I feel like it’s worthwhile :slight_smile:
I spend a lot of time thinking about how much effort we, 3D people, dedicate to developing realistic solutions.

Don’t you think that we sacrifice too much time on that instead of on gameplay?

In that sense I tend to see the 3D communities (such as BlenderArtists) as more conservative in terms of aesthetics, myself included, don’t get me wrong :wink:

I don’t know… I feel that those guys from TIGSource, for example, are ahead of us on some stuff and we could benefit a lot from an exchange. Take the work of @SolarLune as an example… He is someone who is flirting with a vocabulary common to both forums.

Well, that’s quite off-topic, but what I was trying to say is that I like the idea of procedural methods more in the sense of achieving something that is faster and more expressive, freeing up more time to dedicate to innovative mechanics for games. (<—Well, considering the procedural leg, this sounds like a big fat lie XD).

Thinking about what counts as good practice for lone-wolf developers and small teams…

Recently I re-watched “Indie Game: The Movie”. I think I’m more in the mood of “making the games you wanna make”. Sorry for expanding the scope of the discussion. :slight_smile:

Thank you so much, again, for your rich feedback on that topic.

OK, so here is a procedural walk system. This is not using scale yet (to set stride length and height).

This is just the stride angle.

I think using this, plus a “scramble” animation and side steps mixed together correctly, you could scale each of them at the same time as you blend them, to make any walk / side step / back-pedal etc.

So this + linV could control all leg animations (that are cyclic, like walking).

If one were to use separate left and right leg actions, it could perform kicks etc. as well.

w = walk

up arrow / down arrow adjust stride angle
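One possible reading of the “this + linV” idea in BGE Python (the action names, frame ranges, speed scaling and the assumption that the armature is parented to a physics root are placeholders, not what Checkit.blend actually does):

```python
# Hedged sketch: blend a walk action and a side-step action on the armature,
# weighted and sped up by the local linear velocity of the physics root.
import bge

def drive_legs(cont):
    arm = cont.owner                                  # the action armature
    root = arm.parent if arm.parent else arm          # assumed physics root
    vel = root.getLinearVelocity(True)                # local-space velocity
    forward = max(-vel.y, 0.0)                        # forward is local -Y here
    side = abs(vel.x)
    total = forward + side
    if total < 0.01:
        arm.stopAction(0)
        arm.stopAction(1)
        return
    side_weight = side / total
    speed = min(total / 4.0, 2.0)                     # assumed top speed of 4 m/s
    arm.playAction("Walk", 1, 30, layer=0,
                   play_mode=bge.logic.KX_ACTION_MODE_LOOP, speed=speed)
    # layer_weight is how much of the lower layer shows through this one
    arm.playAction("SideStep", 1, 30, layer=1, layer_weight=1.0 - side_weight,
                   play_mode=bge.logic.KX_ACTION_MODE_LOOP, speed=speed)
```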

Attachments

Checkit.blend (508 KB)

Part 2 - scale

This is not perfected yet (the equations for manipulating scale based on stride angle).

w = walk

up arrow / down arrow adjust stride angle, length, and height

all that is left is animation speed…
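As an aside, one possible relation between stride angle and the scale values, assuming a straight leg of length L swung symmetrically about the hip (not necessarily the equations used in Checkit2.blend):

```python
# Hedged sketch: derive stride length and hip drop from the stride angle for a
# straight leg of length L; real legs bend, so treat this as an approximation.
import math

def stride_from_angle(theta, leg_length):
    """theta: total stride angle in radians; leg_length: hip-to-foot distance."""
    half = theta * 0.5
    stride_length = 2.0 * leg_length * math.sin(half)     # horizontal foot-to-foot reach
    hip_drop = leg_length * (1.0 - math.cos(half))         # how far the hip dips
    return stride_length, hip_drop

length, drop = stride_from_angle(math.radians(60.0), 0.9)  # e.g. a 0.9 m leg
```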

Attachments

Checkit2.blend (521 KB)

There’s a lot more to a walk animation than just stride length and height. Don’t forget hip movement and also foot angle.

Here’s a simple illustration of a walk cycle:


See how the foot changes angle; that’s just part of the way that the bones move in subtle ways during an animation. Think of the hip roll of a “feminine” walk, or the swaying of the shoulders in a “masculine” walk. You need to study traditional animation a lot before carrying that practice over into procedural animation.

I made that walk cycle from scratch; it was meant to illustrate a method for adjusting a walk cycle. The system can manipulate a more complex walk cycle.
It is by no means perfect.

Would using modules decrease the processing power usage? I think someone needs to find a way to optimise it. How could that be done?

Of course, I can understand that. I’m just brainstorming re: how could you procedurally script something like foot orientation or hip movement?

I think a walk cycle would have to be clocked, perhaps as a scaled float: 0.0 being the start of the cycle and 1.0 being the end. At any time, the different guide objects would need to know where the clock is, so they can adjust their own timing to match it. The float could then be used to scale the animation, so if your walk is currently running at 20 frames per second, a common walk cycle speed, and the self_clock was at 0.25, you’d know you’re just coming up on the first crossing pose (frame 5), as the contact foot (left) and the raised foot (right) pass each other. You’d know that the right foot should be pointing down and angled slightly to one side, while the left foot should be aligned to the normal of the ground. Meanwhile the hips, or the shoulders (whichever you’ve set as the body hook, where the IK of the body “hangs” from), should be moving up and twisting slightly to the right. The head should be following the look target, so you don’t have to worry about that.

I think animations need to be stored, or generated, with a float as their timer, not frames. Keyframes can be floats too, and would be recalculated if you added any more after the last (1.0), so that the overall timing never goes beyond 1.0. On the other hand, some animations, like falling down, would be longer than a walk cycle, so maybe you could build them on the range of 0.0 - 3.0 or whatever. Or you could keep the timing at 1.0 but include a relative timescale value, so a walk cycle is (1.0, 1.0), a shooting animation is (1.0, 0.3), and crawling slowly along the ground could be (1.0, 8.0); though then, if we consider that the first value is always going to be 1.0, we can discard it. Of course it’d be best if different animations had timescales which scaled together: walk = 1.0 and punch = 0.5 is better than walk = 1.1 and punch = 0.6666, because in the first case you could walk one step, punch twice, and then be ready to do something else, while in the second case the values never match up.
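A minimal sketch of that normalized-clock idea (the walk’s 20 frames and the 1.0 / 0.5 / 8.0 timescales are the figures above; the other frame counts, the dictionary layout and the modulo behaviour are assumptions):

```python
# Hedged sketch: every animation lives on a 0.0-1.0 timer plus a relative
# timescale, and is mapped back to frames only when it is actually played.

ANIMATIONS = {
    # name: (frame_count, relative_timescale)
    "walk":  (20, 1.0),    # 20-frame cycle at the walk's own pace
    "punch": (10, 0.5),    # half as long: plays twice per walk step
    "crawl": (160, 8.0),   # eight walk steps long
}

def clock_to_frame(name, clock):
    """clock: the shared 0.0-1.0 phase that all guide objects read."""
    frames, timescale = ANIMATIONS[name]
    local_clock = (clock / timescale) % 1.0
    return local_clock * frames

# At clock 0.25 the walk is just coming up on the first crossing pose (frame 5).
frame = clock_to_frame("walk", 0.25)   # -> 5.0
```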

Just ideas… :slight_smile:

Yeah. This is important, but as long as we use armatures and the Action actuator we are doomed in terms of performance, I guess XD

I don’t know about you guys, but from my perspective what takes the most time in game development is UV mapping and creating actions… In that sense, what kind of stylized solutions do you like most? Cut-out characters? Sprites? How much time do you think someone could save by picking some kind of more simplified style instead of our traditional methods? :slight_smile:

Of course… simplified actions alone will not transform “game making” into something less painful XD

Yours,

Ortiz

Well, as you develop more and more you start to build up a library of assets. I keep all the old stuff I made in the past, and I often reuse it. If you’re going to make a game completely from scratch, then yes it could take too long, but with a good library of animations, models, textures etc… as well as the knowledge of how to make things faster and easier (it’s always faster and easier the second time you do it) you can start to get to the point where it’s possible to make something in a reasonable time frame.

Using procedural generation can help a lot with that. I already use it for generating levels, populating them with monsters and items, and building pathfinding data (I don’t do any of that stuff by hand anymore), so why not extend that to generating agents, from textures to animations? I’ve already got a procedural texture generator; a procedural animator would be the next logical step.

My own walk cycle is 30 frames; however, “walking” plays 0.55 frames per second,

and sprinting adds another 0.25 per frame, and then local linV -Y adds an amount from 0 to 0.25.
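Read one way, those numbers might look something like this in a per-tick script (interpreting the base rate as frames advanced per logic tick; the property name, velocity scaling and physics-root assumption are mine):

```python
# Hedged sketch: advance a 30-frame walk action (already playing on layer 0)
# by a base rate, a sprint bonus, and a 0-0.25 contribution from local -Y speed.
import bge

def advance_walk(cont):
    arm = cont.owner
    root = arm.parent if arm.parent else arm         # assumed physics root
    rate = 0.55                                      # base "walking" rate per tick
    if arm.get("sprinting", False):                  # assumed boolean game property
        rate += 0.25
    forward_speed = max(-root.localLinearVelocity.y, 0.0)
    rate += min(forward_speed * 0.05, 0.25)          # assumed scaling, capped at 0.25
    frame = (arm.getActionFrame(0) + rate) % 30.0    # wrap the 30-frame cycle
    arm.setActionFrame(frame, 0)
```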

Sure!
That’s a fact. And that is a good thing :slight_smile: That’s why I like to allocate a lot of time to crafting templates and storing everything in an organized backup. Also, the majority of what I’ve learned, I learned from awesome templates made by the Blender community.

My problem is that I’m always restless about the aesthetics side. What makes me happy today usually leaves me unsatisfied tomorrow :I

Yeah, when I first started using Blender I made a load of GLSL-compatible assets, with normal maps and such. Later I switched to single texture with compatibility for older computers in mind. Then I made stuff with baked textures. Now I’m returning to GLSL because single texture has been dropped from the BGE.

Because of this I have some stuff which is good, but not useful, and other stuff which is not good (I made it years ago, with less skill) but it is usable in my current projects because it includes all the correct textures.

Here’s where procedural animation can be really useful. Once you start building a library of poses and animations, you can store it as a script, which can be used later with any project. Even if the way animation works in Blender changes in the future, you can still use the old scripts, since they don’t rely on Blender actions, keyframes or specific bones. That does mean, though, that such a system would have to be developed to avoid dependencies. It would be better, for example, to get the size of the character, stride length etc… from the placement of empties rather than from bge.types.BL_ArmatureBone, because in the future it’s conceivable that the way bones are handled in the BGE could change.
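A small sketch of that empty-based measuring (the marker names are placeholders a rig would need to provide):

```python
# Hedged sketch: measure the character from marker empties instead of querying
# bone classes, so the script keeps working if the BGE bone API ever changes.
import bge

def measure_character(objects):
    head = objects["marker_head"]
    hips = objects["marker_hips"]
    foot_l = objects["marker_foot_L"]
    foot_r = objects["marker_foot_R"]
    return {
        "height": (head.worldPosition - foot_l.worldPosition).length,
        "leg_length": (hips.worldPosition - foot_l.worldPosition).length,
        "stance_width": (foot_l.worldPosition - foot_r.worldPosition).length,
    }

measurements = measure_character(bge.logic.getCurrentScene().objects)
```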

I think procedurally generated enemies with weak and strong AI would make the game replayable many times. You would need procedural animation to do it. I would love to see this in the BGE, along with the ability to procedurally generate enemies. A logic brick version of this would be nice, or one in whatever visual programming language the BGE gets.

Well, one advantage of the walking ragdoll is that you can “morph” from one pose to another using physics, so you could in theory play multiple actions and see the average, etc.

Idle -> “cross pose 1” (from Smoking_mirror’s post), “blending” into playing another action, etc.

I can also adjust the “relax factor”:

angV = part.worldAngularVelocity.copy()
part.worldAngularVelocity = ((angV * multiplier) + targetAngV) * (1 / (multiplier + 1))

so

part.worldAngularVelocity = ((angV * 2) + targetAngV) * (1 / 3)

so the most efficient version with these values ==

part.worldAngularVelocity = ((angV * 2) + targetAngV) * 0.333333333
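Wrapped up as a runnable sketch (only the weighted-average line comes from the post; where targetAngV comes from, the property name and the relax value of 2 are assumptions):

```python
# Hedged sketch: pull one ragdoll part's angular velocity toward a pre-computed
# correction, keeping RELAX parts of the current motion to 1 part correction.
from mathutils import Vector

RELAX = 2.0   # the "relax factor": higher = looser, slower to follow the armature

def relax_blend(cont):
    part = cont.owner                                        # rigid-body ragdoll part
    # targetAngV: the angular velocity that would rotate this part onto its
    # armature bone this tick; assumed to be stored on the object by another script.
    targetAngV = part.get("targetAngV", Vector((0.0, 0.0, 0.0)))
    angV = part.worldAngularVelocity.copy()
    part.worldAngularVelocity = (angV * RELAX + targetAngV) * (1.0 / (RELAX + 1.0))
```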

it has begun.

Attachments

mECHAcHICKENrIG2.blend (511 KB)