Oculus Quest: SDK for hand tracking and improved graphics

Oculus is providing developers with new tools for the Oculus Quest that promise better graphics and broad integration of hand tracking.

 

In mid-December, Oculus released hand tracking for its standalone VR headset, the Oculus Quest. For now, however, hand tracking works only in the Quest's system menus and a few selected Oculus apps.

 

But that will soon change: Oculus is now giving developers the Oculus Mobile SDK 12.0, which they can use to integrate hand tracking into their own applications. Oculus expects a “new generation of virtual reality applications without a controller”.

 

The following example was implemented in the widely used Unity engine. Support for Unreal Engine is due to follow in the first half of 2020.

 

 

Developers have access to the full hand skeleton. Gesture recognition, preconfigured in Unity, should make integration into applications easier.
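To illustrate what gesture recognition on top of a tracked hand skeleton can look like, here is a minimal, engine-agnostic sketch in Python. The joint names (`thumb_tip`, `index_tip`) and the 2 cm pinch threshold are illustrative assumptions, not the Oculus SDK's actual API:

```python
import math

# Illustrative sketch only: detecting a pinch gesture from tracked
# fingertip positions. Joint names and the threshold are assumptions,
# not identifiers from the Oculus Mobile SDK.

PINCH_THRESHOLD_M = 0.02  # fingertips closer than 2 cm count as a pinch


def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))


def is_pinching(skeleton):
    """Return True when the thumb tip and index fingertip nearly touch."""
    return distance(skeleton["thumb_tip"], skeleton["index_tip"]) < PINCH_THRESHOLD_M


open_hand = {"thumb_tip": (0.0, 0.0, 0.0), "index_tip": (0.05, 0.02, 0.0)}
pinch = {"thumb_tip": (0.0, 0.0, 0.0), "index_tip": (0.01, 0.0, 0.0)}

print(is_pinching(open_hand))  # False
print(is_pinching(pinch))      # True
```

In Unity, the SDK's preconfigured gesture recognition would replace hand-rolled checks like this, but the underlying idea is the same: gestures are derived from the geometry of the tracked skeleton.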

 

Developers see the user's hands as a point cloud or as a skeleton model. Image: Oculus

 

The first third-party applications with hand tracking are expected in early 2020. What they will be is not yet known.

 

 

Dynamic Fixed Foveated Rendering

 

The Oculus Quest now supports Dynamic Fixed Foveated Rendering (FFR), which developers can use instead of setting an FFR level manually.

 

Fixed Foveated Rendering is a rendering feature available to developers on the Oculus Quest. It renders the periphery of the image at a lower resolution than the center, which lets software maintain a constant, comfortable frame rate by reducing detail where it is least noticeable. Developers can choose one of four levels: low, medium, high, and very high.
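To make the trade-off concrete, here is a small sketch of how an FFR level might translate into peripheral resolution. The level names come from the article; the scale factors and the per-region math are illustrative assumptions, not Oculus's actual values:

```python
# Illustrative sketch only: the resolution scales below are assumed
# for demonstration and are not the Quest's real FFR parameters.

FFR_SCALES = {
    "off": 1.0,         # periphery rendered at full resolution
    "low": 0.75,
    "medium": 0.5,
    "high": 0.375,
    "very high": 0.25,  # biggest GPU savings, most visible artifacts
}


def peripheral_pixels(base_width, base_height, level):
    """Approximate pixel count rendered in the periphery at a given FFR level."""
    scale = FFR_SCALES[level]
    return int(base_width * scale) * int(base_height * scale)


# The Quest's per-eye panel resolution is 1440 x 1600.
full = peripheral_pixels(1440, 1600, "off")      # 2,304,000 pixels
medium = peripheral_pixels(1440, 1600, "medium")  # 576,000 pixels

print(f"medium FFR shades {100 * medium / full:.0f}% of the peripheral pixels")
```

The higher the level, the fewer pixels the GPU has to shade at the edges of the image, which is where the frame-rate savings come from.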

 

 

FFR can make it easier for developers to port PC games to the Quest. However, the “high” and “very high” levels can be quite noticeable to the user.

 

 

Dynamic FFR lets developers allow Oculus to adapt the foveation level dynamically based on GPU load. This means that when the performance headroom is not needed, users will not see the pixelation and blurring visible in some Quest titles today.
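The idea behind dynamic FFR can be sketched as a simple feedback loop: raise the foveation level when the GPU is overloaded, and lower it again when there is headroom. The thresholds and level names below are illustrative assumptions, not the values Oculus actually uses:

```python
# Illustrative sketch only: a feedback controller in the spirit of
# dynamic FFR. Thresholds and levels are assumptions, not the Oculus
# runtime's real parameters.

LEVELS = ["off", "low", "medium", "high", "very high"]


def adjust_level(current_index, gpu_utilization):
    """Step the foveation level up or down based on GPU load (0.0-1.0)."""
    if gpu_utilization > 0.9 and current_index < len(LEVELS) - 1:
        return current_index + 1  # overloaded: trade peripheral detail for frame rate
    if gpu_utilization < 0.7 and current_index > 0:
        return current_index - 1  # headroom: restore peripheral detail
    return current_index          # within budget: keep the current level


level = 0
for load in [0.95, 0.95, 0.8, 0.5]:  # simulated per-frame GPU load
    level = adjust_level(level, load)
    print(f"load={load:.2f} -> FFR level: {LEVELS[level]}")
```

Because the runtime only raises foveation when the GPU actually needs the savings, most frames can be rendered without visible peripheral blur.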

 

However, this feature is disabled by default, so developers need to add it to their games via a software update to get the benefits.

 

 

The feature does not yet appear to be available in the major game engines developers use, such as Unity or Unreal Engine, but as with most built-in SDK functions, it will probably arrive in an upcoming update.

 
