5/2/2023

Unreal SDK

The LIV SDK provides a spectator view of your application. It contextualizes what the user feels & experiences by capturing their body directly inside your world! Thanks to our software, creators can film inside your app and have full control over the camera. With the power of out-of-engine compositing, a creator can express themselves freely without limits as a real person or an avatar!

How It Works

The LIV SDK spawns a camera inside your app which is controlled by LIV. This camera then renders your app into a background and foreground, to allow the user's body to be composited in. The background & foreground are separated by clipping geometry, based on the user's location within the scene. These textures are then submitted for composition!

The compositor takes in multiple timestamped sources, performs latency compensation, and composites them together. This output can then be recorded or streamed using software like OBS or Discord. For this to work well, we have developed a minimum-latency, high-performance transport layer that also handles resource management.

Doing this work out-of-engine comes with some significant benefits:

- Optimised resource use - we only do the bare minimum work required in the SDK, allowing it to stay lightweight and easy to maintain.
- Accurate latency compensation works without any additional effort from you, the developer.
- Additional camera types, effects, and layers can be added in future updates to the LIV App, making your SDK integration last longer.

LIV's footprint is almost entirely dependent on how well-optimised your application is! When apps are running natively on a Quest device, casting content is trickier.
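The background/foreground split described above can be sketched as a clip-plane test. This is only an illustrative sketch, not the LIV SDK's actual API: every name below is hypothetical, and we simply assume a plane placed at the user's position, facing the spectator camera, so geometry between the camera and the user lands in the foreground layer.

```cpp
#include <cassert>

// Hypothetical sketch of separating background from foreground with a clip
// plane. A plane is anchored at the user's position and oriented toward the
// spectator camera; anything on the camera's side of that plane renders in
// the foreground pass (drawn over the user), everything else in the
// background pass (drawn behind the user).
struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Returns true if `point` lies between the spectator camera and the clip
// plane through the user, i.e. it belongs to the foreground layer.
bool IsForeground(Vec3 cameraPos, Vec3 userPos, Vec3 point) {
    Vec3 planeNormal = Sub(cameraPos, userPos);  // plane faces the camera
    return Dot(Sub(point, userPos), planeNormal) > 0.0f;
}
```

In practice the split would be done per-pixel or per-triangle by the renderer's clip planes rather than per-object, but the signed-distance test is the same idea.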
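The compositor's latency compensation can be pictured as timestamp alignment: each source buffers recently delivered frames, and for a given presentation time the compositor picks the buffered frame whose capture timestamp is closest, so sources captured at different moments line up before blending. The sketch below is a minimal illustration under that assumption; the types and names are invented for the example, not taken from LIV.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <deque>

// A frame as delivered by one timestamped source. The `id` stands in for the
// real texture payload.
struct Frame {
    double   timestampMs;  // capture time of this frame
    uint32_t id;
};

// Per-source ring of recent frames; the compositor queries it with the
// presentation time to get the best temporal match.
class FrameQueue {
public:
    void Push(Frame f) {
        buffer_.push_back(f);
        if (buffer_.size() > kMaxFrames) buffer_.pop_front();
    }

    // Select the buffered frame closest in time to `presentMs`.
    Frame Select(double presentMs) const {
        Frame best = buffer_.front();
        for (const Frame& f : buffer_) {
            if (std::abs(f.timestampMs - presentMs) <
                std::abs(best.timestampMs - presentMs)) {
                best = f;
            }
        }
        return best;
    }

private:
    static constexpr size_t kMaxFrames = 8;
    std::deque<Frame> buffer_;
};
```

Because every source carries its own timestamps, the compositor needs no knowledge of how much latency each pipeline added, which is why no extra effort is required from the developer.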