
Hand tracking support in Quest 2

Started by May 05, 2022 10:14 PM
3 comments, last by frob 2 years, 6 months ago

Hey,

I was looking to implement controller-free hand-tracking support for the Oculus Quest 2 in our in-house engine. Our primary platform at the moment is just PC.

My question is: is this feature exposed through the PC Oculus SDK, or only through the Quest 2's native Android/mobile API? I'm having a hard time finding anything definitive on this. The closest I've seen is the Avatars SDK, which lets you express limited hand gestures with a controller, but that's not what I'm looking for.

The other thing I've seen recommended is to use the OpenXR hand tracking extension? Roughly what I mean by that is sketched below.
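For reference, this is a minimal sketch of the extension route, assuming XR_EXT_hand_tracking was enabled at instance creation and that instance, session, appSpace, and predictedTime already exist in the engine; whether the PC runtime actually exposes the extension is exactly what I'm unsure about.

    #include <openxr/openxr.h>

    // Sketch: create a hand tracker and locate the 26 hand joints each frame
    // via XR_EXT_hand_tracking. Extension functions must be fetched through
    // xrGetInstanceProcAddr and are only available if the runtime supports
    // the extension and it was enabled when the XrInstance was created.
    bool CreateLeftHandTracker(XrInstance instance, XrSession session,
                               XrHandTrackerEXT* outTracker)
    {
        PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
        xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
            reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateHandTrackerEXT));
        if (!xrCreateHandTrackerEXT)
            return false; // runtime does not expose the extension

        XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
        createInfo.hand = XR_HAND_LEFT_EXT;
        createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
        return XR_SUCCEEDED(xrCreateHandTrackerEXT(session, &createInfo, outTracker));
    }

    void LocateHandJoints(XrInstance instance, XrHandTrackerEXT tracker,
                          XrSpace appSpace, XrTime predictedTime)
    {
        PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = nullptr;
        xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
            reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateHandJointsEXT));
        if (!xrLocateHandJointsEXT)
            return;

        XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
        locateInfo.baseSpace = appSpace;
        locateInfo.time = predictedTime;

        XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
        XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
        locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
        locations.jointLocations = joints;

        if (XR_SUCCEEDED(xrLocateHandJointsEXT(tracker, &locateInfo, &locations)) &&
            locations.isActive)
        {
            // e.g. joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose is the index fingertip
            // in appSpace, with a per-joint radius and validity flags.
        }
    }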

Tracking support through ovrTrackingState::HandPoses should give you the 6DoF position/orientation of both controllers, updated with the tracking state, and ovr_GetInputState gives you the buttons and such. I don't think it gives gestures like some other libraries do, only the tracking and buttons.
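With LibOVR that polling looks roughly like this. A rough sketch, assuming an already-initialized ovrSession and a frame index, and note this is controller tracking, not camera-based hand tracking:

    #include <OVR_CAPI.h>

    // Poll Touch controller poses and buttons from the Oculus PC SDK (LibOVR).
    // Assumes ovr_Initialize/ovr_Create have already succeeded for `session`
    // and `frameIndex` is the frame about to be rendered.
    void PollTouchControllers(ovrSession session, long long frameIndex)
    {
        // 6DoF poses for both controllers, predicted for the display time.
        double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);
        ovrTrackingState ts = ovr_GetTrackingState(session, displayTime, ovrTrue);

        ovrPosef leftPose  = ts.HandPoses[ovrHand_Left].ThePose;
        ovrPosef rightPose = ts.HandPoses[ovrHand_Right].ThePose;
        (void)leftPose; (void)rightPose; // feed these into the engine's transforms

        // Buttons, triggers, and thumbsticks.
        ovrInputState input;
        if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &input)))
        {
            bool aPressed      = (input.Buttons & ovrButton_A) != 0;
            float indexTrigger = input.IndexTrigger[ovrHand_Right];
            float gripTrigger  = input.HandTrigger[ovrHand_Right];
            (void)aPressed; (void)indexTrigger; (void)gripTrigger;
        }
    }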


Thanks frob. Yeah, I'm kind of at my wits' end with this. We have support for what you mentioned built out, but I've been at a loss as to where exactly controller-free hand tracking for the Quest is supported. It's definitely exposed through their native mobile API, but beyond that I've not been able to find whether it's exposed somehow through one of their PC SDKs.

Do you have access to the source of those APIs, or of the Unity Input System that also provides gesture support? I don't know if they have an Unreal version too, but I assume there is one somewhere. I've always just used libraries that provided support for all the various systems for me, rather than targeting a specific API.

I'd try to find the source, as any of those implementations might show you how they do it. They might be relying on some hidden values or bit flags set internally, or maybe just tracking it each update, in which case you can duplicate their gesture detection yourself (a rough pinch check is sketched below).
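If you do end up duplicating the gesture detection, a pinch check from joint positions is about as simple as it gets. A sketch, assuming per-frame joint data from something like XR_EXT_hand_tracking; the 2 cm threshold is a guess you'd have to tune:

    #include <openxr/openxr.h>
    #include <cmath>

    // Very simple pinch detector: thumb tip close to index tip.
    // `joints` is the XR_HAND_JOINT_COUNT_EXT array filled by xrLocateHandJointsEXT.
    // The 2 cm threshold is an assumption to tune, not a documented value.
    bool IsPinching(const XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT])
    {
        const XrHandJointLocationEXT& thumb = joints[XR_HAND_JOINT_THUMB_TIP_EXT];
        const XrHandJointLocationEXT& index = joints[XR_HAND_JOINT_INDEX_TIP_EXT];

        // Only trust joints the runtime says have valid positions this frame.
        const XrSpaceLocationFlags required = XR_SPACE_LOCATION_POSITION_VALID_BIT;
        if (!(thumb.locationFlags & required) || !(index.locationFlags & required))
            return false;

        const float dx = thumb.pose.position.x - index.pose.position.x;
        const float dy = thumb.pose.position.y - index.pose.position.y;
        const float dz = thumb.pose.position.z - index.pose.position.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz) < 0.02f; // ~2 cm apart
    }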

This topic is closed to new replies.
