Meta's Presence Platform Brings New Mixed Reality Tools for Developers
Image courtesy of: Meta (formerly Facebook)
- Staff Writer
- On October 29, 2021
At this year’s Connect 2021, Meta (formerly Facebook) shared its vision of the metaverse alongside several notable announcements, including plans to expand the mixed reality capabilities of Quest with a suite of new tools that will enable developers to build more immersive and connected digital experiences.
During the keynote event on Thursday, Meta announced the launch of the Presence Platform, which provides a broad range of machine perception and AI capabilities, including Passthrough, Spatial Anchors, and Scene Understanding. This new system will allow developers to build more realistic mixed reality, interaction, and voice experiences that seamlessly blend virtual objects and content in a user’s physical world. The Presence Platform includes three main SDK toolsets—Insight SDK, Interaction SDK, and Voice SDK.
Insight SDK, which builds on the experimental Passthrough API launched earlier this year, lets developers easily create mixed reality experiences that deliver a real sense of presence. With the next release, Passthrough will be available to all developers, allowing them to build, test, and ship experiences that use its capabilities.
Additionally, Insight SDK will give developers access to other new capabilities, including Spatial Anchors and Scene Understanding. With Spatial Anchors, developers can essentially “lock” virtual objects to a physical space so that they persist across sessions. Scene Understanding, meanwhile, will allow developers to quickly build complex, scene-aware experiences that interact with a user’s real-world environment through a feature called Scene Model, which provides a geometric and semantic representation of the user’s actual physical space.
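The core idea behind Spatial Anchors, persisting a pose in physical space so a virtual object reappears in the same spot next session, can be sketched in a few lines. This is a conceptual illustration only; the class and method names below are invented for the example and are not Meta's actual API.

```python
import json

class SpatialAnchor:
    """Toy stand-in for a spatial anchor: a pose 'locked' to a point in physical space."""
    def __init__(self, anchor_id, position, rotation):
        self.anchor_id = anchor_id
        self.position = position   # (x, y, z) in world coordinates
        self.rotation = rotation   # quaternion (x, y, z, w)

    def to_dict(self):
        return {"id": self.anchor_id, "pos": self.position, "rot": self.rotation}

    @staticmethod
    def from_dict(d):
        return SpatialAnchor(d["id"], tuple(d["pos"]), tuple(d["rot"]))

class AnchorStore:
    """Persists anchors between sessions so anchored objects keep their placement."""
    def __init__(self):
        self._anchors = {}

    def save(self, anchor):
        self._anchors[anchor.anchor_id] = anchor.to_dict()

    def serialize(self):
        # In a real system this state would live on-device or in the cloud.
        return json.dumps(self._anchors)

    @staticmethod
    def restore(blob):
        store = AnchorStore()
        store._anchors = json.loads(blob)
        return store

    def load(self, anchor_id):
        return SpatialAnchor.from_dict(self._anchors[anchor_id])
```

Serializing the store at the end of one session and restoring it at the start of the next is what lets a virtual lamp, say, stay on the same real-world table between uses.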
Hand tracking and controller-centric interactions are both featured in the Interaction SDK, which will make it significantly easier for developers to add high-quality, robust interactions to their games and applications by providing a library of commonly used gestures, like grab, poke, target, and select. These ready-made components can be used together, independently, or even integrated into other interactions. The Interaction SDK simplifies the development process by solving many of the tough interaction challenges associated with computer vision-based hand tracking. It offers standardized interaction patterns, prevents regressions as the technology evolves, and will also make it easier for developers to build their own custom interactions and gestures.
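The composable component model described above, reusable gesture handlers that can be combined or used on their own, can be sketched as follows. The names here are illustrative stand-ins, not the Interaction SDK's real types.

```python
# Conceptual sketch of composable gesture components: an interactable object
# registers handlers for gestures (grab, poke, ...) and dispatches them as
# the tracking layer reports recognized gestures.

class Interactable:
    def __init__(self, name):
        self.name = name
        self._handlers = {}   # gesture name -> callback
        self.log = []         # record of interactions, for demonstration

    def on(self, gesture, callback):
        self._handlers[gesture] = callback
        return self           # chaining lets components be freely combined

    def handle(self, gesture):
        handler = self._handlers.get(gesture)
        if handler is None:
            return False      # unrecognized gesture: ignore rather than fail
        handler(self)
        return True

def grab(obj):
    obj.log.append(f"grabbed {obj.name}")

def poke(obj):
    obj.log.append(f"poked {obj.name}")

# A cube that supports grab and poke, built from independent components.
cube = Interactable("cube").on("grab", grab).on("poke", poke)
cube.handle("grab")
cube.handle("poke")
```

Standardizing the dispatch layer like this is what insulates app code from changes in the underlying hand-tracking technology: the gesture vocabulary stays stable even as recognition improves.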
Lastly, as part of the Presence Platform, the new Voice SDK will open up a set of voice-control capabilities powered by Facebook’s Wit.ai natural language platform. With Voice SDK, developers can create hands-free navigation, like launching a specific game mode from an app by voice, as well as new voice-driven gameplay, like talking with an in-game character or avatar or even casting a voice-activated magic spell.
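The basic voice-command flow is: an utterance is mapped to an intent, and the app acts on that intent. In practice this mapping would be done by a natural language service such as Wit.ai; the keyword matcher below is only a toy stand-in, with intent names invented for the example.

```python
# Toy intent matcher illustrating utterance -> intent mapping for voice control.
# A real app would send the utterance to an NLU service (e.g. Wit.ai) instead.

INTENTS = {
    "start_arena_mode": ("start", "arena"),   # hands-free navigation
    "cast_fireball": ("cast", "fireball"),    # voice-driven gameplay
}

def parse_intent(utterance):
    """Return the first intent whose keywords all appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if all(k in words for k in keywords):
            return intent
    return None   # no match: the app simply ignores the utterance
```

For example, `parse_intent("Start arena mode")` resolves to the hypothetical `start_arena_mode` intent, which the app could then use to launch that game mode without any controller input.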
The company says that, like the Passthrough API earlier this year, the Interaction SDK and Voice SDK will both be released as experimental capabilities for the time being, enabling developers to start building and testing prototypes.
Early next year, Meta plans to release a sample experience called The World Beyond, showcasing what developers can build using the Presence Platform. It will ship as a sample project that developers can use as a template when building their own apps.