Linden Labs closing Second Life

Wouldn’t it be cool if you could animate your avatar in real time? What if you could wave your arm and your avatar could mimic your motions? Or imagine if your avatar could reach out and touch something in-world or perform animations? Linden Lab is exploring these possibilities with an experimental feature called “Puppetry.” We have been working on this feature for some time, and now we are ready to open it up to the Second Life community for further development and to find out what amazing things our creators will do with this new technology. The codebase is alpha level and contains its share of rough edges that need refinement; however, the project is functionally complete, and the scripters and creators of Second Life can start to try it out. See the section below, “How to participate,” to learn how to use Puppetry yourself.

We have some basic things working with a webcam and Second Life, but there's more to do before it's as animated as we want. Puppetry accepts target transforms for avatar skeleton bones and uses inverse kinematics (IK) to place the connecting bones so that the specified bones reach their targets. For example, the position and orientation “goal” of the hand could be specified, and IK would compute how the forearm, elbow, upper arm, and shoulder should be positioned to achieve it. The IK calculation can be tricky to get right and is a work in progress.

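To make the IK idea concrete, here is a minimal two-bone solver written as an illustration only: it works in 2D with plain trigonometry, and the function name, bone lengths, and example numbers are invented for this sketch rather than taken from the Puppetry code. Given the lengths of an upper arm and forearm and a target point for the wrist, it returns shoulder and elbow angles that reach the target.

```python
import math

def solve_two_bone_ik(upper_len, fore_len, target_x, target_y):
    """Return (shoulder, elbow) angles in radians for a planar two-bone chain
    anchored at the origin so that its tip reaches (target_x, target_y).
    Illustrative only; the real solver works on the full 3D avatar skeleton."""
    dist = math.hypot(target_x, target_y)
    # Clamp to the reachable range so acos() stays defined when the target is
    # farther than the fully extended arm (or closer than it can fold).
    dist = max(abs(upper_len - fore_len), min(upper_len + fore_len, dist), 1e-6)

    # Law of cosines gives the interior angle between the two bones at the
    # elbow; the bend relative to a straight arm is pi minus that angle.
    cos_interior = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))

    # Shoulder angle: aim at the target, then back off by the angle the bent
    # elbow introduces between the upper arm and the shoulder-to-target line.
    cos_offset = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_offset)))
    return shoulder, elbow

# Hypothetical example: a 0.30 m upper arm and 0.25 m forearm reaching a point
# 0.4 m forward and 0.2 m up from the shoulder.
print(solve_two_bone_ik(0.30, 0.25, 0.4, 0.2))
```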
The target data is supplied by a plug-in that runs as a separate process and communicates with the Viewer through the LLSD Event API Plug-in (LEAP) system. This is a lesser-known piece of Viewer functionality that has been around for a while but has, until now, only been used for automated test and update purposes.

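As a rough sketch of how a LEAP-style plug-in could hand target data to the Viewer, the snippet below frames a message the way LEAP generally expects: a length prefix followed by serialized LLSD on stdout, using the `llsd` Python package. The pump name and joint field names are placeholders, not the real Puppetry message schema, which is defined by Linden Lab's sample plug-ins rather than by this post.

```python
import sys
import llsd  # Linden Lab's LLSD serialization package: pip install llsd

def send_to_viewer(pump, data):
    """Write one LEAP-framed message to stdout.

    LEAP messages are framed as '<length>:<serialized llsd>', where <length>
    is the byte count of the payload and the payload names a target event
    pump plus the data to deliver on it.
    """
    payload = llsd.format_notation({"pump": pump, "data": data})
    sys.stdout.buffer.write(str(len(payload)).encode() + b":" + payload)
    sys.stdout.buffer.flush()

# A real plug-in would first read the Viewer's greeting from stdin to learn
# which pumps to talk to; that handshake is omitted here.  The pump name and
# joint field below are placeholders, not the actual Puppetry schema.
send_to_viewer("puppetry.controller", {
    "mWristLeft": {"position": [0.2, 0.1, 0.9]},
})
```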
The Viewer transmits the Puppetry data to the region server, which broadcasts it to other Puppetry-capable Viewers nearby. The receiving Viewers use the same IK calculations to animate avatars in view. For more details about the Puppetry technology, take a look at the Knowledge Base article Puppetry: How it Works. We are excited about Puppetry’s potential to change the way we interact inside Second Life. For example, using a webcam to track your face and hands could allow your avatar to mimic your facial expressions and finger movements, and more natural positioning of the avatar’s hands and feet against in-world objects might also be possible. Alternative hardware could also be used to feed information into Second Life to animate your avatar, such as a game controller or mocap equipment. There's a lot to explore and try, and we invite the Second Life community to be involved in exploring the direction of this feature. The Puppetry feature requires a project viewer and can only be used on supporting Regions. Download the project Viewer at the Alternate Viewers page.

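The broadcast step can be pictured as a simple fan-out. The toy model below is purely conceptual, with made-up class and function names, and is not the region server's actual protocol: it just forwards the sender's latest joint targets to nearby viewers that advertise Puppetry support, each of which would then run its own IK pass.

```python
from dataclasses import dataclass, field

@dataclass
class Viewer:
    name: str
    supports_puppetry: bool
    received: list = field(default_factory=list)

def relay_puppetry(sender: str, joint_targets: dict, nearby: list[Viewer]) -> None:
    """Fan the sender's latest joint targets out to nearby Puppetry-capable
    viewers; each recipient then animates that avatar with its own IK pass."""
    for viewer in nearby:
        if viewer.supports_puppetry and viewer.name != sender:
            viewer.received.append((sender, joint_targets))

# Toy usage: only viewer B both supports Puppetry and is not the sender.
viewers = [Viewer("A", True), Viewer("B", True), Viewer("C", False)]
relay_puppetry("A", {"mWristLeft": {"position": [0.2, 0.1, 0.9]}}, viewers)
print([v.name for v in viewers if v.received])  # ['B']
```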