Developers just open-sourced a framework for AI avatars that move and gesture while they talk
A newly released open-source framework called SentiAvatar aims to make AI avatars behave more like real people during conversation. The project includes a motion dataset of 21,000 clips, a foundation model trained on hundreds of hours of motion data, and a system that synchronizes speech, facial expressions, and body gestures in real time. If the technology works as advertised, it could help developers build more believable NPCs, AI assistants, and virtual characters.
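The project's details aren't public in this write-up, but the core synchronization idea can be illustrated with a minimal sketch: snap gesture keyframes to nearby speech timing marks (e.g., stressed syllables) so body motion lands on the beat of the audio. Everything here, from the `Keyframe` type to the `align_gestures` helper, is a hypothetical illustration, not SentiAvatar's actual API.

```python
from bisect import bisect_left
from dataclasses import dataclass


@dataclass
class Keyframe:
    t: float     # timestamp in seconds
    pose: str    # label for the gesture pose at this keyframe


def align_gestures(speech_beats, gestures, max_drift=0.15):
    """Snap each gesture keyframe to the nearest speech beat, if it is
    within max_drift seconds; otherwise leave the keyframe untouched.

    speech_beats must be a sorted list of timestamps (seconds).
    """
    aligned = []
    for g in gestures:
        # Find the beats bracketing the gesture timestamp.
        i = bisect_left(speech_beats, g.t)
        candidates = speech_beats[max(0, i - 1):i + 1]
        nearest = min(candidates, key=lambda b: abs(b - g.t), default=g.t)
        if abs(nearest - g.t) <= max_drift:
            aligned.append(Keyframe(nearest, g.pose))  # snap onto the beat
        else:
            aligned.append(g)  # too far from any beat: keep original timing
    return aligned


beats = [0.0, 0.5, 1.0]
gestures = [Keyframe(0.48, "nod"), Keyframe(0.8, "point")]
result = align_gestures(beats, gestures)
# The nod (0.48s) snaps to the 0.5s beat; the point (0.8s) is more than
# 0.15s from any beat, so its timing is preserved.
```

Real systems would operate on continuous pose streams rather than discrete labels, but the beat-alignment idea is the same.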