Now that you’ve created your shape keys, how can you properly manage and animate with them? Adding drivers that connect them to your 3D model’s rig (armature) in Blender is a great way to speed up animation and create realistic facial expressions, including lip syncing for speech.
Since last time, when we looked at a way of making a bunch of different shape keys, I went ahead and made some more. They’ll appear in a short while, because this time we’re going to look at how these shape keys can be attached to a rig and controlled from there, speeding up the process of changing them, mixing them together and animating them for your content.
Now that you have your shape keys, let’s say you’ve made a bunch more for the lips, the nose, the eyes, whatever you need them for, and not necessarily on a person. There comes a time when you want to add these to your rig, because once you have a lot of them, it gets complicated trying to animate all those different values to get the result you want.
Let’s add our rig, an armature. Now, if you already have a rig for the body itself to control the arms, the legs and the chest, my advice would be to make these controls part of the rig you already have.
But since I deleted it to make it simple here… I’ll just add a new rig. So the advantage of a rig is that you can control multiple different shape keys at the same time. Now what can every single bone do? Every bone can move along this axis.
In this case it’s the y-axis, but depending on how you’ve set up your rig it might be the x-axis. The bone can move along other axes too, but my advice is to keep the movement to axes you can see from the front view, so you can still change everything from that one perspective. If you start changing it on an axis perpendicular to the face, it’s just going to complicate things. A bone like this can move left or right, up or down, rotate left, rotate right, and scale bigger or smaller.
Every single one of these motions can control its own shape key, so I made a few shape keys that are quite a mess: brow-out-up, brow-in-up and happy-puppy-eyes. As before, we can just create a new shape key from mix: ‘puppy eyes left’.
Then, as before, we can assign these a vertex group. If you have already rigged your model, you’re going to have quite a few more vertex groups here in your selection, since the vertex group names for the arms, the legs, or whatever your bones are called, are also going to be in this list.
So make sure you give everything a proper name. Naming conventions and a tidy structure in the Properties panel (the box thingy here) become really important once you’ve got a lot of stuff going on.
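If you prefer to set this up from Blender’s Python console instead of clicking through the panel, here’s a minimal sketch of those two steps. The object, shape key and vertex group names are placeholders for whatever you’ve called yours.

```python
import bpy

# Assumes the face mesh is the active object and a left-side vertex group
# already exists; all names here are placeholders.
obj = bpy.context.active_object

# Capture the current mix of shape keys as a new key ("new shape from mix").
bpy.ops.object.shape_key_add(from_mix=True)
new_key = obj.active_shape_key
new_key.name = "puppy_eyes_L"

# Mask the key to one side of the face with a vertex group.
new_key.vertex_group = "face_L"
```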
So now we have a little selection of shape keys, and we’re going to put all of these onto a bone; more specifically, two bones. We’re just going to go through all of them, hover over each value and press Ctrl+D.
What this does is give the value a driver, and the driver lets you use the properties of a bone to control that value. See, it’s now purple. I’m just going to quickly go through all of them, and if we open up the Drivers Editor we now have these values for the different shape keys we’ve created.
You can see all the names here. We go down into the driver settings, set the object to the armature, pick the bone, and choose the left one. So here we have ‘eye left’, and now it’s targeting this bone’s value. But instead of World Space, which is defined by the bone’s location in the world itself, you want to give it Local Space.
Now we just need to choose which axis affects this shape key. Arbitrarily, I’m going to say that moving the bone up should turn this key on, so we’re going to use the Z location.
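For anyone who’d rather script this than click through the Drivers Editor, here’s a rough sketch of the same setup in Python. The mesh, armature, shape key and bone names are assumptions, and the scaling factor in the expression is something you’d tune for your own rig.

```python
import bpy

# Placeholder names; swap in your own mesh, armature, shape key and bone.
mesh = bpy.data.objects["Face"]
arm = bpy.data.objects["Armature"]

key = mesh.data.shape_keys.key_blocks["eye_left"]
drv = key.driver_add("value").driver

var = drv.variables.new()
var.name = "bone_z"
var.type = 'TRANSFORMS'

tgt = var.targets[0]
tgt.id = arm
tgt.bone_target = "brow_L"              # the control bone
tgt.transform_type = 'LOC_Z'            # moving the bone up drives the key
tgt.transform_space = 'LOCAL_SPACE'     # local space, not world space

# The bone only travels a short distance, so scale it into the 0-1 key range;
# the factor here is an assumption you would tune for your rig.
drv.type = 'SCRIPTED'
drv.expression = "bone_z * 5.0"
```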
A quick tip I need to insert here: when you’re figuring out which axis of the bone should drive your shape key, look at the bone and check the orientation setting of the gizmo.
Make sure to set it to Normal, because that’s the orientation the driver uses. Here the global ‘up’ axis is of course the z-axis, but in the Normal orientation that direction is actually the bone’s y-axis. We can see that moving it up is affecting the base here.
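Another quick way to check which local channel a bone actually moves on is to nudge it in the viewport and read its local-space location from the Python console; this is the same value a Transform Channel driver in Local Space sees. The armature and bone names below are placeholders.

```python
import bpy

# Placeholder names; bone.location is the bone's translation along its own
# local axes, which is what a Local Space transform-channel driver reads.
arm = bpy.data.objects["Armature"]
bone = arm.pose.bones["brow_L"]

print("local location (X, Y, Z):", tuple(bone.location))
```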
Now that we’ve done it for this one, the rest becomes a lot simpler: we can just copy the driver and paste it. It’s going to say ‘ERROR’ at first, but click the value and it will update itself. Since this one is for the other side, we need to change it from ‘eye left’ to ‘eye right’.
Now, going back here, changing this bone affects the shape key. Next we have the value of the brow going out and up, the outer side of the brow going up. We’re going to put this on the same bone, but instead of the Y location we’re going to use the X location, and then copy the other ‘brow up’ to the right side.
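Copying and retargeting each driver by hand works fine, but if you end up with a lot of keys, the same thing can be done in a loop. This is only a sketch with made-up shape key and bone names, mapping each key to its own bone and transform channel.

```python
import bpy

# Placeholder names; each shape key gets its own control bone and channel.
mesh = bpy.data.objects["Face"]
arm = bpy.data.objects["Armature"]

mapping = [
    ("eye_left",      "brow_L", 'LOC_Z'),
    ("eye_right",     "brow_R", 'LOC_Z'),
    ("brow_out_up_L", "brow_L", 'LOC_X'),
    ("brow_out_up_R", "brow_R", 'LOC_X'),
]

for key_name, bone_name, channel in mapping:
    drv = mesh.data.shape_keys.key_blocks[key_name].driver_add("value").driver
    drv.type = 'AVERAGE'                 # driver value = the single variable
    var = drv.variables.new()
    var.type = 'TRANSFORMS'
    tgt = var.targets[0]
    tgt.id = arm
    tgt.bone_target = bone_name
    tgt.transform_type = channel
    tgt.transform_space = 'LOCAL_SPACE'
```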
Now, with that in place, we should be able to move the bone and the brow goes out. There’s no need for me to do the other one. But now we have rig control over these different ‘up’ parts of the model. If you want, you can go to Pose Mode and add a Limit Distance constraint to stop the bone from travelling further than the shape key can actually take.
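If you’d rather add that constraint with a script, here’s a rough sketch using a Limit Distance constraint. The anchor bone it measures from and the distance value are hypothetical and would depend on how your rig is built.

```python
import bpy

# Hypothetical names: "brow_anchor_L" is assumed to sit at the control bone's
# rest position, so the control bone stays within a small radius of it.
arm = bpy.data.objects["Armature"]
bone = arm.pose.bones["brow_L"]

con = bone.constraints.new('LIMIT_DISTANCE')
con.target = arm
con.subtarget = "brow_anchor_L"
con.distance = 0.1                      # maximum travel, in Blender units
con.limit_mode = 'LIMITDIST_INSIDE'     # keep the bone inside that radius
```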
It just keeps the bone in its place, you know. Now that’s really it; that’s the basis of how all of the shape keys work together. All you do is scale up the number of bones and the number of shape keys they control, and like anything else, you can animate the bones to change them.
See now, as the bone moves around, you’re modifying three shape keys together seamlessly just by moving one controller, saving you the trouble of going through all of the keys, finding the one you’re looking for and animating each value up and down. It saves a lot of hassle, it really does.
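And because the drivers read the bones, animating is just keyframing the control bones like any other pose. Here’s a small sketch with placeholder names and made-up values.

```python
import bpy

# Placeholder names and values: keyframe the control bone instead of
# keyframing each shape key value separately.
arm = bpy.data.objects["Armature"]
bone = arm.pose.bones["brow_L"]

bone.location = (0.0, 0.0, 0.0)
bone.keyframe_insert(data_path="location", frame=1)    # neutral face

bone.location = (0.05, 0.0, 0.1)                       # brow out and up
bone.keyframe_insert(data_path="location", frame=24)   # one second later at 24 fps
```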
Before we finish off, you might be wondering how exactly to know which shape keys your model needs. As you can see, there can be quite a lot of them, and this isn’t even all of them; I’ve optimized the set for my own requirements.
If you need a reference guide for which motions the shape keys of a human face might require (and of course this transfers to animal faces with varying levels of realism, depending on what you’re going for), then something you can look into is FACS: the Facial Action Coding System.
FACS was intended for medical and research purposes, like the study of the human face and its anatomy. What it basically does is define facial expressions by which parts of the face they affect.
Now, there are several online resources you can look at, but the one I personally thought was a pretty good reference is linked down in the description. So have a look at that, and we’ll come back later with another tutorial about implementing this system.
Particularly the lips, and the ability to synchronize speech with the lips moving in a realistic manner.