A model that automates the generation of facial animation for interactive applications is proposed. The model transfers facial expressions from one face mesh to another. The rig structure can be refined on the fly to accommodate different input geometric data as needed.