Rendering an avatar from sign writing notation for sign language animation
Moemedi, Kgatlhego Aretha
This thesis presents an approach for automatically generating signing animations from a sign language notation. An avatar capable of expressive gestures, down to subtle changes in facial expression, is used to render the sign language animations. SWML, an XML format of SignWriting, is provided as input; it transcribes sign language gestures in a format compatible with virtual signing. Relevant features of the sign language gestures are extracted from the SWML and converted to body animation parameters, which are used to animate the avatar. Using key-frame animation techniques, intermediate key-frames approximate the expected sign language gestures, which the avatar then renders. The resulting gestures are realistic and aesthetically acceptable, and can be recognized and understood by Deaf people.
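The pipeline the abstract describes — parse SWML, extract per-symbol features, map them to animation parameters, and generate intermediate key-frames — can be sketched roughly as below. This is a minimal illustration, not the thesis implementation: the SWML element and attribute names (`sign`, `symbol`, `category`, `x`, `y`, `rotation`) are hypothetical placeholders, and real SWML and body animation parameter sets are considerably richer.

```python
import xml.etree.ElementTree as ET

# Hypothetical SWML-like fragment; actual SWML element/attribute
# names differ and carry full SignWriting symbol codes.
SWML = """
<swml>
  <sign gloss="HELLO">
    <symbol category="hand" x="10" y="20" rotation="90"/>
    <symbol category="movement" x="15" y="5" rotation="0"/>
  </sign>
</swml>
"""

def extract_features(swml_text):
    """Parse SWML and collect per-symbol features relevant to animation."""
    root = ET.fromstring(swml_text)
    features = []
    for sign in root.iter("sign"):
        for sym in sign.iter("symbol"):
            features.append({
                "category": sym.get("category"),
                "x": float(sym.get("x")),
                "y": float(sym.get("y")),
                "rotation": float(sym.get("rotation")),
            })
    return features

def interpolate(start, end, steps):
    """Generate intermediate key-frames by linear interpolation
    between two dicts of animation parameters (a stand-in for the
    key-frame approximation step)."""
    frames = []
    for i in range(1, steps + 1):
        t = i / steps
        frames.append({k: start[k] + t * (end[k] - start[k]) for k in start})
    return frames

if __name__ == "__main__":
    feats = extract_features(SWML)
    start = {k: feats[0][k] for k in ("x", "y", "rotation")}
    end = {k: feats[1][k] for k in ("x", "y", "rotation")}
    for frame in interpolate(start, end, 4):
        print(frame)
```

In the actual system, the interpolated values would be body animation parameters driving the avatar's skeleton and face rather than raw symbol coordinates, but the extract-convert-interpolate structure is the same.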