What if you could wave your arm and your metaverse avatar could mimic your motion? Or, imagine an avatar reaching out and touching objects inworld or creating real-time animations? Linden Lab is exploring these possibilities with an experimental feature called “Puppetry.”
A beta is open to the Second Life community for further development and to find out what amazing things creators will do with this new technology. Linden Lab acknowledges the codebase is alpha-level and still has rough edges, but says it is functional.
An avatar is built on a skeleton of bones that move its limbs. A webcam can capture motion data; Puppetry then accepts target transforms for avatar skeleton bones and uses inverse kinematics (IK) to position the connecting bones so the specified bones reach their targets. For example, using a webcam to track your face and hands could let your avatar mimic your facial expressions and finger movements, and more natural positioning of the avatar's hands and feet against in-world objects might also be possible.
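To give a feel for the idea, here is a minimal sketch of two-bone planar IK, the textbook version of what an IK solver does: given two bone lengths and a target point, it computes the joint angles that put the end of the chain on the target. This is purely illustrative; the function name and conventions are our own, and the viewer's actual solver handles full 3D skeletons with many more constraints.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Solve a planar two-bone IK chain (think shoulder-elbow-wrist).

    Given bone lengths l1 and l2 and a target (tx, ty) relative to the
    chain root, return (shoulder_angle, elbow_angle) in radians, where
    the elbow angle is relative to the first bone's direction.
    """
    dist = math.hypot(tx, ty)
    # Clamp to the reachable range [|l1 - l2|, l1 + l2]: if the target
    # is out of reach, the chain simply stretches toward it.
    dist = max(abs(l1 - l2), min(l1 + l2, dist))
    # Law of cosines gives the elbow bend for this root-to-target distance.
    cos_elbow = (dist**2 - l1**2 - l2**2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Aim the shoulder at the target, then offset by the triangle angle
    # between the first bone and the root-to-target line.
    cos_offset = (dist**2 + l1**2 - l2**2) / (2 * l1 * dist)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_offset)))
    return shoulder, elbow
```

Running the forward kinematics with the returned angles lands the end of the chain back on the target, which is the whole point: the puppeteer supplies only the target transform, and the solver fills in the rest of the pose.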
The Puppetry feature requires a project viewer and can only be used on supporting Regions. Download the Project Viewer from the Alternate Viewers page. Regions with Puppetry support exist on the Second Life Preview Grid and are named Bunraku, Marionette, and Castelet.
When using the Puppetry Viewer in one of those regions, if someone there is sending Puppetry data you should see their avatar animated accordingly. Controlling your own avatar with Puppetry takes a bit more setup. You need: a working Python 3 installation, a plug-in script to run, and any Python modules it requires. If you are interested and adventurous, please give it a try. More detailed instructions can be found on the Puppetry Development page, and there is also more to read in the Second Life Puppetry Wiki.
There is also a Blender add-on by “FeklixWolf” that can create a Second Life skeleton usable for rigging and animation, and that supports Puppetry streaming. For those sufficiently tech savvy and skilled with Blender, it’s available on GitHub.
Using Python and Blender, though free, takes a fair amount of skill. For now, you can enjoy other folks’ Puppetry while the rough parts get worked out. It’s been suggested that this feature might be great for performing artists to connect in a virtual world.
There’s also the Avatar Dynamics Open Beta, which animates avatars and lets them interact with each other. It may be a more user-friendly option for avatar animation today: the interactions can be turned on or off and can be used in chat.
David Raiklen wrote, directed and scored his first film at age 9. He began studying keyboard and composing at age 5. He attended, then taught at UCLA, USC and CalArts. Among his teachers are John Williams and Mel Powell.
He has worked for Fox, Disney and Sprint. David has received numerous awards for his work, including the 2004 American Music Center Award. Dr. Raiklen has composed music and sound design for theater (Death and the Maiden), dance (Russian Ballet), television (Sing Me a Story), cell phone (Spacey Movie), museums (Museum of Tolerance), concert (Violin Sonata), and film (Appalachian Trail).
His compositions have been performed at the Hollywood Bowl and the first Disney Hall. David Raiklen is also host of a successful radio program, Classical Fan Club.