Leap Motion, an American company that manufactures and markets computer hardware sensors capable of tracking finger motions as input, has introduced an early beta version of its Interaction Engine to make it easier for developers to build VR environments with hand-manipulation support. The company defines the Interaction Engine as “a layer that exists between the Unity game engine and real-world hand physics.”
The latest development of the engine aims to satisfy human expectations for object interactions. The Interaction Engine implements an alternate set of physics rules that takes over whenever the user’s hands come within close range of a virtual object.
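The idea of swapping rule sets based on hand proximity can be illustrated with a minimal sketch. This is not the Leap Motion API; the `HOVER_RADIUS` threshold and the `physics_mode` function are hypothetical names chosen for illustration, assuming a simple distance check decides which rules govern an object each frame.

```python
import math

HOVER_RADIUS = 0.1  # metres; hypothetical activation distance


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def physics_mode(hand_pos, object_pos, hover_radius=HOVER_RADIUS):
    """Return which rule set should govern the object this frame."""
    if distance(hand_pos, object_pos) <= hover_radius:
        return "interaction"  # softened, hand-aware contact rules
    return "standard"         # ordinary rigid-body physics otherwise


# A hand 5 cm from a cube triggers the interaction rules;
# at 50 cm the object stays under standard physics.
print(physics_mode((0.0, 0.0, 0.05), (0.0, 0.0, 0.0)))  # interaction
print(physics_mode((0.0, 0.0, 0.5), (0.0, 0.0, 0.0)))   # standard
```

In practice the engine does far more than a distance test, but the per-frame switch between two physics regimes is the core idea.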
The company previously introduced an early development version of the Interaction Engine several months ago, alongside the Blocks demo for its updated Orion tracking platform. This time, Leap Motion has made the early-access beta of the Interaction Engine available to all Unity developers as a module for the Unity Core Assets.
Being able to interact with objects in the VR world using your bare hands is what makes Orion, paired with the Interaction Engine, unique. The Interaction Engine makes it possible for users to grasp and pick up objects with natural-feeling physics. It supports higher-level interactions such as stacking objects and throwing an object with real-life physics, and lets users move objects to a desired position. Developers can also customize behaviors such as what happens when tracking is momentarily lost, how throwing velocity is calculated, and the layer transitions that control how collisions work.
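One of those behaviors, estimating a throw velocity when the hand releases an object, can be sketched in a few lines. Again, this is not Leap Motion's implementation: the `GraspableObject` class, its sample window, and the averaging scheme are illustrative assumptions, showing one plausible way to derive a release velocity from recent hand positions.

```python
from collections import deque


class GraspableObject:
    """Toy model of grasp/release with a throw velocity estimated
    from a short history of hand positions (hypothetical design)."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # (time, position) history
        self.held = False
        self.velocity = (0.0, 0.0, 0.0)

    def grasp(self):
        """Begin holding: clear any stale motion history."""
        self.held = True
        self.samples.clear()

    def track(self, t, pos):
        """Record the hand position each frame while held."""
        if self.held:
            self.samples.append((t, pos))

    def release(self):
        """On release, the average velocity over the sample window
        becomes the object's throw velocity."""
        self.held = False
        if len(self.samples) >= 2:
            (t0, p0), (t1, p1) = self.samples[0], self.samples[-1]
            dt = t1 - t0
            self.velocity = tuple((b - a) / dt for a, b in zip(p0, p1))
        return self.velocity


# Example: a hand moving 10 cm along x over 0.1 s yields a 1 m/s throw.
obj = GraspableObject()
obj.grasp()
obj.track(0.0, (0.0, 0.0, 0.0))
obj.track(0.1, (0.1, 0.0, 0.0))
print(obj.release())  # (1.0, 0.0, 0.0)
```

Averaging over a window rather than using the last frame alone smooths out single-frame tracking jitter, which matters when tracking can drop out momentarily.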
While the new engine is still in beta, the company has already started working with VR headset manufacturers to integrate its sensors directly into headsets. Meanwhile, users of the Oculus Rift and HTC Vive can get a glimpse of the hand-motion controls by using the developer kit and attaching the Leap Motion sensor bar to the front of the headset.