
Setting up a Custom Character in Omniverse Audio2Face

By NVIDIA Omniverse


Topics Covered

  • Map Face Parts Precisely
  • Set Eye Transform Options
  • Add Correspondences Strategically
  • Proxy Dynamic Meshes Easily

Full Transcript

[Music] In this video we're going to see a quick overview of setting up a custom character in Audio2Face.

Audio2Face generates facial animation from any given input speech. "Good morning, Professor Austin. How are you doing?" Audio2Face generates not only the motion of the face but also the motion of the eyes, the jaw, and the tongue. "Good morning, Professor Austin. How are you doing?"

To set up a character that can be driven by Audio2Face, the character needs a pair of eyes, a tongue, a separated lower denture, and the skin of the face. The character can also have some optional elements: dynamic elements that move together with the skin, such as brows, eyelashes, or the tearline of the eyes, and static elements that do not move with the skin, such as the hair or the shirt in this case.

Finally, we of course need an Audio2Face mesh that will drive our character. The workflow for setting up a character starts on the Character Transfer tab. The first step is to map all the relevant parts of the face so that Audio2Face understands how your character is composed. After identifying all the parts, we are ready to set up the character; this will bring in the A2F meshes for the next steps. The gray mesh will be used as the animation driver: in the final step, Audio2Face will drive the gray character, and in turn the gray character will drive the animation of the custom character. To do that, we need to identify some corresponding points between the Audio2Face meshes and the custom character. This brings us to the next step, the skin mesh fitting. With the available tools, tags can be added on the character and the green mesh, edited, and removed as well. With the correspondences set, we can start the fitting process.

When the fitting process is done, you will be presented with the result. A successful fitting should make the character closely resemble the green mesh; the quality of the fitting is an important factor in the final quality of the character transfer. Once you are satisfied, the post-wrap will finalize the mesh fitting process and bring the character back to its original shape. The next steps prepare the jaw and the tongue; for these you only need to click each button once. With the skin, jaw, and tongue ready, we reach the final step of the character setup: attaching an Audio2Face pipeline to the gray mesh. With that, the custom character is ready. "Good morning, Professor Austin. How are you doing?" So let's see how we can do this directly inside Audio2Face. I have my file

loaded, and as you can see, we have all the necessary meshes to fill in the character setup: the skin, the eyes, the lower denture, the tongue, and all the other additional meshes. So let's start by selecting the skin mesh and adding it to the Skin field. We are then presented with a character orientation, in this case the correct one; if not, you can change the orientation here at the top. Once you are done, you can dismiss it, because it's just for visualization. One thing to notice is that some information is shown here in case your mesh has an issue. The next step is adding the lower

denture and the tongue, and then we reach the eyes. We choose the left eye meshes and the right eye meshes; it's important that they are separated into the two sides. We can have as many meshes as needed, but they have to be separate for the left and right side. Now we are presented with a choice: use the parent transform, use the selected mesh's world position, or compute from the selected mesh. In this first case, when you select the mesh, you can see that it has its own transform, so you could use the selected mesh's position. In these other cases that I prepared, the mesh has no transform of its own; the transform holding the center of the eye is actually on its parent Xform. If you added this type of eyes, you should use the parent Xform. The final option is for meshes that have no reference at all: the mesh has no transform, and the parent also doesn't provide the information. In that case you can use Compute from Selected Mesh, which will try to fit a sphere and find the center of the eye automatically. In this demo I added the eyes with the transform on the mesh itself, so we're going to choose the eye mesh as the reference and then just use the selected mesh's world position.
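The Compute from Selected Mesh option is described as fitting a sphere to the eyeball vertices to recover the eye's pivot. As an illustration of that idea only (not Audio2Face's actual implementation), here is a minimal linear least-squares sphere fit in plain Python:

```python
# Sketch: estimate an eye pivot by fitting a sphere to mesh vertices.
# Uses the linearization |p|^2 = 2*c.p + d, where d = r^2 - |c|^2, so the
# unknowns (cx, cy, cz, d) can be solved by linear least squares.

def fit_sphere(points):
    """Fit a sphere to 3D points; returns (center, radius)."""
    n = 4
    # Accumulate the normal equations A^T A x = A^T b,
    # with rows [2px, 2py, 2pz, 1] and b = |p|^2.
    ata = [[0.0] * n for _ in range(n)]
    atb = [0.0] * n
    for (px, py, pz) in points:
        row = (2 * px, 2 * py, 2 * pz, 1.0)
        b = px * px + py * py + pz * pz
        for i in range(n):
            for j in range(n):
                ata[i][j] += row[i] * row[j]
            atb[i] += row[i] * b
    # Solve the 4x4 system by Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atb[col], atb[piv] = atb[piv], atb[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atb[r] -= f * atb[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = atb[r] - sum(ata[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / ata[r][r]
    cx, cy, cz, d = x
    radius = (d + cx * cx + cy * cy + cz * cz) ** 0.5
    return (cx, cy, cz), radius
```

With points sampled from a real eyeball mesh, the recovered center is exactly the pivot the eye should rotate around.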

Now we can add our additional meshes: the hair here, the shirt, and also the upper denture. Finally, we add the dynamic meshes that should move together with the skin. We also have the eyelashes, which, as a new feature in the interface, can be set up as curves, like hair. With this the setup is complete, so let's go ahead and click Set Up Character.

This will bring in our mesh setup here. If a material gets lost, in many cases it's because the material is inherited rather than set directly; you can simply go ahead and find it again, in this case for the hair. All right, now we need to start adding some correspondences. I usually find it more convenient to

hide the elements that are not needed, for example the static and dynamic meshes. Then we can start adding correspondences by choosing the Add mode. The way it works is that you place one correspondence on one side and then the matching one on the other side; when you click again, you place a new correspondence. The order is not important, but once you place both points of a pair, you move on to the next one. You can place as many as you need; usually you want to cover the important facial features, and when you are done you can click the Done button. If you want to edit a correspondence because its position was not correct, you can switch to the Edit mode and refine your correspondences by clicking and dragging them. You can also delete correspondences, and save them into a JSON file to be reused later. I actually did this before and saved a good set of correspondences.
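Saving and reloading correspondences is just JSON round-tripping. The layout below is hypothetical, purely for illustration; the actual schema Audio2Face writes is not shown in the video:

```python
import json

def save_correspondences(path, pairs):
    """Write correspondence pairs to a JSON file."""
    with open(path, "w") as f:
        json.dump({"version": 1, "pairs": pairs}, f, indent=2)

def load_correspondences(path):
    """Read correspondence pairs back from a JSON file."""
    with open(path) as f:
        return json.load(f)["pairs"]

# Hypothetical layout: each pair maps a point on the A2F template mesh
# to the matching point on the custom character's skin.
example_pairs = [
    {"template": [0.0, 1.62, 0.11], "custom": [0.0, 158.7, 9.4]},    # nose tip
    {"template": [-0.03, 1.64, 0.09], "custom": [-3.1, 160.2, 8.1]}, # eye corner
]
```

Keeping such a file per character lets you redo the fitting later without placing every tag again, which is exactly how the pre-saved set is used in the demo.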

Here is my setup: you can see that I covered the eyes and the mouth in more detail, and then just provided general landmarks to guide the fitting. With that, we can start the mesh fitting process. The result is shown in a UI so you can check whether the two meshes are quite similar. In this case the result is pretty good, and because the nose doesn't move, any mismatch there is not a big problem, but we could add some more correspondences to improve it. If you wanted to do that, you could simply delete the fitted mesh, bring back the original mesh, start adding correspondences again, and then run the mesh fitting process once more. For this demo this will be enough, so since we are satisfied we can continue with the post-wrap.
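Why correspondence coverage matters for the fitting can be seen with a toy stand-in for a correspondence-driven warp. Audio2Face's actual solver is more sophisticated; this sketch just displaces every template vertex by an inverse-distance-weighted blend (Shepard interpolation) of the known correspondence displacements:

```python
# Toy correspondence-driven warp: vertices near a correspondence follow it
# closely, while sparsely covered regions only get a rough average motion --
# which is why dense tags around the eyes and mouth improve the fit there.

def warp_vertices(vertices, src_points, dst_points, power=2.0, eps=1e-12):
    """Move each vertex by interpolating the src->dst displacements."""
    deltas = [tuple(d - s for s, d in zip(sp, dp))
              for sp, dp in zip(src_points, dst_points)]
    out = []
    for v in vertices:
        wsum = 0.0
        acc = [0.0, 0.0, 0.0]
        snapped = None
        for sp, delta in zip(src_points, deltas):
            dist2 = sum((a - b) ** 2 for a, b in zip(v, sp))
            if dist2 < eps:          # vertex sits exactly on a correspondence
                snapped = delta
                break
            w = dist2 ** (-power / 2.0)  # inverse-distance weight
            wsum += w
            for i in range(3):
                acc[i] += w * delta[i]
        d = snapped if snapped is not None else [a / wsum for a in acc]
        out.append(tuple(a + b for a, b in zip(v, d)))
    return out
```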

The post-wrap will set the character back to its original position, ready for the next step. We then prepare the jaw motion, the tongue mesh fitting, and finally the tongue post-wrap. At this step we may want to deal with the dynamic meshes, the ones that should follow the movement of the skin; we can use the Proxy UI tool to drive them. The way this works: you open the UI

and it shows the explanation there: you first select the driver mesh, in this case our skin, followed by all the meshes that you want to deform with that driver. So let's select our skin, then select all the dynamic meshes, and simply press Apply.
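Conceptually, a proxy/wrap setup like this binds each dynamic-mesh vertex (brows, lashes, tearline) to the skin at rest, then reuses the skin's motion at playback. The sketch below is a hypothetical nearest-vertex version of that idea, not the tool's actual algorithm:

```python
# Sketch of a driver/driven (wrap) binding: at bind time, attach every
# dynamic vertex to its closest skin vertex; at run time, offset it by
# that skin vertex's displacement from the rest pose.

def bind_to_driver(dynamic_verts, driver_rest_verts):
    """For each dynamic vertex, find the index of the closest driver vertex."""
    binding = []
    for v in dynamic_verts:
        best = min(range(len(driver_rest_verts)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(v, driver_rest_verts[i])))
        binding.append(best)
    return binding

def apply_driver(dynamic_rest, driver_rest, driver_posed, binding):
    """Offset each dynamic vertex by its bound driver vertex's displacement."""
    out = []
    for v, idx in zip(dynamic_rest, binding):
        delta = tuple(p - r for r, p in zip(driver_rest[idx], driver_posed[idx]))
        out.append(tuple(a + d for a, d in zip(v, delta)))
    return out
```

The binding is computed once at setup (the Apply click); only `apply_driver` runs per frame, which is why nothing visibly changes until the pipeline starts animating the skin.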

When it's successful, we can simply close the UI. It will not show anything yet, but in the next step, setting up the Audio2Face pipeline, we will see that the meshes have been set up correctly. We can also hide this mesh now, as we don't need it anymore, and then set up the Audio2Face pipeline. If we did the setup correctly, all these meshes will show up here already configured. You have the choice between the regular and the streaming player, as well as between the available networks; let's choose the new model, Claire. Once this is done, the

mesh is ready to be played. Let's choose a proper voice and test it: "Tourism is also a significant contributor to Bangkok's economy; it generated 427.5 billion baht ($13.38 billion) in revenue in 2010."

We can also try the Chinese language. [Foreign speech] And finally, as a new addition, we can switch between the networks, so you can try which network and model is more suitable for your language or for the speech you are working with. Both models work for all languages, and for female or male voices as well, including French, of course. [Foreign speech] Please check our forum and Discord channel for any doubts or feedback you might have. Thank you.
