
Mouth Movement



Posted

I know not all player models support active speaking, but I'm thinking about how the well-designed ones can open their mouths in time with scripted dialogue in single player, and with taunts in multiplayer.

 

With the source code released, would there be any way to... set up some code and bind the speaking script to a key, so the mouth opens and closes whenever the key is held down?

3 weeks later...
Posted

a hint: making my mod, I found that not all models support lip / facial expression.

it all depends on the weighting. there is something in the code that regulates the movement of certain bones when a character plays the talking animation. in ModView you can see, on the JA skeleton and .cfg files (but NOT on JO's), the four individual frames of the head animations that together make up the lip movement.

that's another hint I can give you.

to make a character with facial expression you need:

- a humanoid character with eyes, eyeballs, mouth, face, and lips. if any part is missing, you need to create it. for some of my characters in the Legacy of Kain mod I had to merge a mouth_eyes mesh into the model; another big trouble was fixing the eyes, suturing the eye edge to the orbicularis of the face. XD terrific, terrific work: fixing xyz coordinates vertex by vertex, argh!

however, when you do the weighting you need to be careful: use the cervical bone to weight the neck, and the cranium for the whole head and the hair.

for the forehead you need the ceyebrow bones; if you weight the forehead to these, you can get frontal movement of the skin for emotive expression.

for eye animation, you basically need to weight the two eyes to the leye and reye bones respectively. be careful that the face bones align perfectly with the face, or the eye movement can create strange deformations of the face. I have some models that move the eyeballs in a slightly strange way. O.o well, they are not all humanoid; I FORCED the models to work and show emotional expressivity. O.o

as for the talking, it is very easy: the lower lip, the chin, and the front of the neck where it joins the jaw all need to be weighted to the jaw bone. the jaw is the key! the upper lip and the cheeks go to the cranium, the lower lip and the chin to the jaw. that way you can create the mouth movement animation.

as for how this works, I think it is strongly hardcoded somewhere in the code, and I have no idea where it is or what kind of instructions are set.

I suppose it is something like:

"if the model is producing a sound on CHAN_VOICE or CHAN_VOICE_GLOBAL, move the jaw and the lips with one of the BOTH_FACE_TALK0/1/2/3/4 animations chosen at random (or something like that) until the sound track is over; afterwards, break off the animation."

I think it works that way. :)

for the weighting I have direct experience and I am sure it works. also, CHAN_VOICE and CHAN_VOICE_GLOBAL are the important strings in the ICARUS code that activate the lip movement; the other CHAN channels do not seem to affect it. the facial expression for combat is set automatically: an NPC with the forehead weighted to the ceyebrow bone makes an angry face when fighting. in cinematics it is possible to script the facial expression with ICARUS; there is a set command for that, though I don't remember its name right now. I made some cinematics with dialogue in my levels, so I know how it works.

what I don't know is how it works in the code.

:( I don't know in which files, on which command lines, or in which section of the code the interaction between the voice audio track, the CHAN_VOICE script channel and the bone movement/deformation of the model is set up. but in any case, it is really wonderful work. I am always amazed by the JA lip animation :o
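That guessed behaviour could be sketched in C++ roughly as follows. To be clear, this is only an illustration of the guess above: PickTalkAnim and the enum layout are made-up names, not taken from the actual game source; only CHAN_VOICE / CHAN_VOICE_GLOBAL and the FACE_TALK animation names come from the thread.

```cpp
// Hypothetical sketch -- PickTalkAnim and these enum layouts are NOT from the
// real codebase; they only illustrate the guessed "random talk anim while a
// voice-channel sound plays" logic described above.
#include <cstdlib>

enum faceAnim_t { FACE_TALK0, FACE_TALK1, FACE_TALK2, FACE_TALK3, FACE_TALK4 };

enum soundChannel_t { CHAN_AUTO, CHAN_BODY, CHAN_VOICE, CHAN_VOICE_GLOBAL };

// While a sound is still playing on a voice channel, pick one of the five
// talk animations at random; otherwise report "no talk anim" (-1).
int PickTalkAnim(soundChannel_t chan, bool soundStillPlaying) {
    if (!soundStillPlaying)
        return -1; // track is over: break off the mouth animation
    if (chan != CHAN_VOICE && chan != CHAN_VOICE_GLOBAL)
        return -1; // other channels do not drive the lips
    return FACE_TALK0 + std::rand() % 5; // random talk frame
}
```

If the real code works this way, each call while the voice channel is active would return a fresh random talk animation, and the mouth would close as soon as the track ends.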

Posted


 

That certainly narrows in on the problem at hand... If only we could bind, as you said, "move the jaw and the lip with animation BOTH_FACE_TALK0/1/2/3/4 in a random mode" to a key rather than a sound file. That would be exactly what machinima filmers would want in order to sync to their own external audio.

Posted


mh, I have a strange idea for your problem. binding can be done with the menu or the game console, but setting up an event triggered by an animation is a thing only an animevent.cfg file can do. the solution to your problem could be this: with a large amount of C++ knowledge, you create a new command for the animevent.cfg files that lets you automatically associate an animation with a key for playback. the key can then be set in the animevent.cfg file with the new command you have programmed into the code. it is something like creating a new function alongside AEV_SOUND or AEV_EFFECT, and it is the only way to do it; with the model's animevent.cfg you can also set the frame for the binding.

so you need:

- a large knowledge of programming;

- to search the code for the part implementing the whole lip movement mechanism;

- to search the code for the functions that handle the animevent.cfg command lines;

- to create a new animevent.cfg command line that lets you bind a key to an animation so the animation can be played back, which is the most complicated part;

- to compile the edited code with this new implementation;

- to create your binding in your animevent files.

you can make a generic key binding for the five lip movement animations in the _humanoid animevent.cfg file, or bind specific animations with a new animevent file added to a specific character's folder, so the key is bound for that character only.

maybe I am crazy, maybe I had a nightmare in my sleep and have just woken up, but I have no other idea to solve your problem.

to align with the naming of the animevent parameters, you could call the new command AEV_BIND, like the AEV_SOUND command that plays a sound from an animation, or AEV_EFFECT, which can play an .efx effect from an animation. :)
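A minimal sketch of what the proposed keyword might look like in an animevent parser. AEV_SOUND and AEV_EFFECT are real animevent types from the game; AEV_BIND and the ParseEventKeyword function are hypothetical, invented here only to illustrate the suggestion:

```cpp
// Hypothetical sketch of the proposed AEV_BIND event type. AEV_SOUND and
// AEV_EFFECT exist in the real animevent code; AEV_BIND and ParseEventKeyword
// are inventions for illustration only.
#include <string>

enum animEventType_t {
    AEV_SOUND,   // plays a sound at a given animation frame (real)
    AEV_EFFECT,  // plays an .efx effect at a given frame (real)
    AEV_BIND,    // proposed: associate a key with an animation
    AEV_NONE     // keyword not recognized
};

// Map a keyword read from animevent.cfg to an event type.
animEventType_t ParseEventKeyword(const std::string& kw) {
    if (kw == "AEV_SOUND")  return AEV_SOUND;
    if (kw == "AEV_EFFECT") return AEV_EFFECT;
    if (kw == "AEV_BIND")   return AEV_BIND; // new keyword the mod would add
    return AEV_NONE;
}
```

The rest of the work (reading the key name, registering the bind, and triggering the animation on key press) would still have to be wired up in the engine code, which is why this is the complicated part of the plan above.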

Posted

I'd like to speak on this matter a bit as lip syncing was some of the first bits of code that I examined when the source code was released.

 

Essentially, lip syncing works on the principle of Amplitude Modulation (AM). The engine monitors the loudness/amplitude of the currently playing audio sample, and at some arbitrary interval (500ms? 200ms? 100ms? I don't know exactly), it samples the audio and picks an animation between FACE_TALK0-4 based on the loudness of the sample. It's primitive, but it works.
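The amplitude-to-animation mapping described above might look roughly like this in C++; the thresholds and the function name FaceTalkFromAmplitude are illustrative guesses, not the engine's actual values:

```cpp
// Sketch of the loudness-to-talk-animation mapping described above.
// The thresholds and function name are illustrative, not the engine's
// actual values: louder samples simply open the mouth wider.

// Map a sampled voice amplitude (0.0 = silence, 1.0 = maximum) to one of
// the five talk animations (0 = FACE_TALK0 .. 4 = FACE_TALK4).
int FaceTalkFromAmplitude(float amp) {
    if (amp <= 0.05f) return 0; // FACE_TALK0: mouth (nearly) closed
    if (amp <= 0.30f) return 1; // FACE_TALK1
    if (amp <= 0.55f) return 2; // FACE_TALK2
    if (amp <= 0.80f) return 3; // FACE_TALK3
    return 4;                   // FACE_TALK4: mouth wide open
}
```

Called once per sampling interval, this yields a mouth shape that tracks the envelope of the voice track, which matches the "primitive but it works" behaviour described here.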

 

If I remember right, the code for lipsyncing allows this sort of stuff to take place easily. You'd need to set the entity up to be playing a sound effect from it, and the engine apparently does the rest. See how taunts do it. 

Posted


That's actually genius, but it requires importing the audio into the game engine and getting it to play... and it also makes it more complicated to trigger on demand, rather than simply running randomly through 0-4 whenever a button is pressed. Still, very good insight to have.

Posted


Wow, good to know! For me it's a wonder to learn that. Thanks Eez :)
