A NON-INVASIVE APPROACH FOR DRIVING VIRTUAL TALKING HEADS FROM REAL FACIAL MOVEMENTS
| Content Provider | CiteSeerX |
|---|---|
| Author | Fratarcangeli, Marco; Fanelli, Gabriele |
| Abstract | In this paper, we present a system to accurately control the facial animation of synthetic virtual heads from the movements of a real person. These movements are tracked using Active Appearance Models from videos acquired with an inexpensive webcam. The tracked motion is then encoded using the widely adopted MPEG-4 Facial and Body Animation standard, so each animation frame is expressed by a compact subset of the Facial Animation Parameters (FAPs) defined by the standard. For each FAP, we precompute the corresponding facial configuration of the virtual head to animate through an accurate anatomical simulation. By linearly interpolating, frame by frame, the facial configurations corresponding to the FAPs, we obtain the animation of the virtual head in a simple and straightforward way. |
| File Format | |
| Access Restriction | Open |
| Content Type | Text |
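The abstract's final step, blending precomputed per-FAP facial configurations by linear interpolation, can be sketched as follows. This is a minimal illustration under assumed data structures (the paper does not publish code): `neutral` is the virtual head's resting vertex array, `fap_targets` maps each FAP id to the precomputed configuration produced by the anatomical simulation at full intensity, and `fap_values` holds a frame's normalized FAP intensities. All names are hypothetical.

```python
import numpy as np

def blend_frame(neutral, fap_targets, fap_values):
    """Linearly blend precomputed FAP configurations into one frame.

    neutral     : (V, 3) vertex positions of the neutral (resting) head.
    fap_targets : dict mapping FAP id -> (V, 3) precomputed configuration
                  of the head at that FAP's full intensity (hypothetical
                  stand-in for the anatomical simulation's output).
    fap_values  : dict mapping FAP id -> normalized intensity in [0, 1]
                  for the current animation frame.
    Returns the (V, 3) blended vertex positions for the frame.
    """
    mesh = neutral.copy()
    for fap_id, weight in fap_values.items():
        # Add this FAP's displacement from neutral, scaled by its intensity.
        mesh += weight * (fap_targets[fap_id] - neutral)
    return mesh
```

Because each FAP contributes an independent linear displacement from the neutral pose, per-frame animation reduces to a weighted sum, which matches the "easy and straightforward" interpolation the abstract describes.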