Pitchipoy Facial Bone Animation

I have built a character and skinned him with the Pitchipoy rigify rig. Since it uses bones for facial deformation, I am assuming that all speech animation is done by moving the bones to match the common lip shapes, such as AAAA, OOOO, MMMM, FFFF, THHH and so on. However, after searching for two days now, all I am finding are tutorials about using shape keys. Has anyone seen a tutorial on facial animation using bones, or has bone-based facial animation died out? If so, then what is the use of Pitchipoy?

Use the pose library of the armature to set facial poses for your phonemes, then apply those poses at the correct positions on your animation's timeline.
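If you ever want to script that step, here is a minimal sketch, assuming a 2.7x-era pose library, an armature object named "rig" that is active and in Pose Mode, and a pose library that already stores one pose per phoneme (the pose names and frame numbers are placeholders):

```python
import bpy

# Minimal sketch: assumes the armature object is named "rig", is the active
# object in Pose Mode, and its pose library already stores one pose per
# phoneme (e.g. "AAAA", "MMMM"). Names and frames below are placeholders.
rig = bpy.data.objects["rig"]

def apply_phoneme(pose_name, frame):
    """Jump to a frame, apply a stored facial pose, and key the selected bones."""
    bpy.context.scene.frame_set(frame)
    pose_index = rig.pose_library.pose_markers.find(pose_name)
    bpy.ops.poselib.apply_pose(pose_index=pose_index)
    # Key whatever bones are currently selected so the pose lands on the timeline
    for pbone in rig.pose.bones:
        if pbone.bone.select:
            pbone.keyframe_insert(data_path="location")
            if pbone.rotation_mode == 'QUATERNION':
                pbone.keyframe_insert(data_path="rotation_quaternion")
            else:
                pbone.keyframe_insert(data_path="rotation_euler")
            pbone.keyframe_insert(data_path="scale")

apply_phoneme("MMMM", frame=12)
apply_phoneme("AAAA", frame=18)
```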

I would suggest making a bone group that contains only the facial bones, so you can quickly select just those bones when you apply a pose. That way you will not overwrite the positions of any body controls.
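For example, here is a sketch of building that group and reselecting it later, assuming pre-4.0 bone groups, an object named "rig", and face controls whose names share recognizable prefixes (the prefixes are guesses, so adjust them to your rig):

```python
import bpy

# Sketch only: assumes Blender's pre-4.0 bone groups and an armature object
# named "rig". The prefixes are guesses at typical face-control names --
# check your own rig and edit the tuple accordingly.
rig = bpy.data.objects["rig"]
face_prefixes = ("lip", "jaw", "brow", "cheek", "nose", "chin", "eye", "tongue", "teeth")

# Create (or reuse) a "Face" bone group and assign the face controls to it
face_group = rig.pose.bone_groups.get("Face")
if face_group is None:
    face_group = rig.pose.bone_groups.new(name="Face")

for pbone in rig.pose.bones:
    if pbone.name.lower().startswith(face_prefixes):
        pbone.bone_group = face_group

# When applying a facial pose, select only the bones in that group
for pbone in rig.pose.bones:
    pbone.bone.select = (pbone.bone_group == face_group)
```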

Good luck!

Thanks DanPro, but I cannot seem to find a way to control the influence of a pose the way you can with shape keys. Is this possible?

Yes and no. I know of a way to do it, but instead, I’ll point you to a better method. The method you are describing uses phoneme poses/shapes. It is a fast but inaccurate method for lip sync.

A far better method is the Relative Shapes method. This method uses simple open/close and mouth narrow/wide shapes as the basis for the lip sync. Then, in later polishing passes, additional shapes and keys are added for specific sounds like mmm’s, pee’s, bee’s, eff’s and vee’s.
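As a rough illustration of that first blocking pass, here is a sketch that keys only a jaw open/close control. The object name "rig", the bone name "jaw_master", the rotation axis, and the amounts are all assumptions, so swap in your rig's real control and sensible values:

```python
import bpy

# Sketch of the open/close first pass only: the bone name "jaw_master", the
# object name "rig", the rotation axis and the amounts are all assumptions --
# check your rig and swap in the real control and sensible values.
rig = bpy.data.objects["rig"]

def key_jaw(frame, open_amount):
    """Key the jaw control: 0.0 is closed, ~0.25 rad is a fairly open mouth."""
    jaw = rig.pose.bones["jaw_master"]
    jaw.rotation_mode = 'XYZ'          # force Euler so one axis reads as open/close
    jaw.rotation_euler = (open_amount, 0.0, 0.0)
    jaw.keyframe_insert(data_path="rotation_euler", frame=frame)

# Block in the mouth opening against the dialogue; the wide/narrow corners and
# the sound-specific shapes come in later polishing passes.
key_jaw(frame=10, open_amount=0.0)
key_jaw(frame=14, open_amount=0.25)
key_jaw(frame=20, open_amount=0.0)
```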

Since you are looking for a tutorial, there is none better than the Animation Toolkit by CGCookie. That is where I learned the method, along with a lot more. I highly recommend it.

Just like Humane Rigging should be a staple and required for rigging, The Animation Toolkit should be required for animation.

It will cost you about $15, last I checked, but is free to members now.

I hope that points you in a better direction.

Good luck!