AU Class

Use Your Smartphone's Voice-to-Text Feature and Varjo Marker Tracking to Create Annotations in VRED


    Description

    Most extended reality (XR) applications rely on handheld controllers or hand-tracking technology to interact with virtual content. Interactions such as "pointing," "selecting," or "touching" objects in virtual reality (VR) are typical use cases. These input methods can be cumbersome for inexperienced VR users, leading to frustration, extended training and setup time, or missed opportunities. Learn how the VRED API, together with its integrated web server and Varjo Marker tracking, can be used to enable more intuitive input by turning your smartphone into the input device.

    Key Learnings

    • Learn how to send Python API commands from a web page to VRED (see the sketch after this list).
    • Learn how Varjo Marker Tracking works in VRED to locate objects.
    • Learn how to build a low-code web app that creates annotations in a VRED scene using your smartphone's voice-to-text feature.
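
    As a taste of the first learning objective, here is a minimal sketch of how a web page could send a Python command to VRED's integrated web server. It assumes the web interface is enabled in VRED's preferences and listening on its default port (8888), that the /python endpoint executes the URL-encoded "value" parameter, and that the vrAnnotationService calls shown exist in your VRED version; the host address and node name are placeholders, not values from the class.

```typescript
// Minimal sketch: push a Python command from a web page to VRED's
// integrated web server. Assumptions: the web interface is enabled in
// VRED's preferences, listens on its default port 8888, and executes
// the URL-encoded "value" parameter of the /python endpoint; the
// vrAnnotationService calls are one possible flavor of the VRED Python
// API and may differ in your VRED version.

const VRED_HOST = "http://192.168.0.42:8888"; // placeholder address of the VRED workstation

async function sendPythonToVred(pythonCode: string): Promise<void> {
  const url = `${VRED_HOST}/python?value=${encodeURIComponent(pythonCode)}`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`VRED web server returned ${response.status}`);
  }
}

// Example: create an annotation whose text comes straight from a phone's
// voice-to-text input bound to an ordinary <input> element on the page.
async function createAnnotationFromText(text: string): Promise<void> {
  // JSON.stringify produces a quoted, escaped literal that Python also accepts.
  const python = [
    'note = vrAnnotationService.createAnnotation("VoiceNote")',
    `note.setText(${JSON.stringify(text)})`,
  ].join("; ");
  await sendPythonToVred(python);
}
```

    Because the phone's keyboard dictation fills the input field like any typed text, the web app itself stays low-code: it only has to forward the resulting string to VRED.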