MOBILE AR IS HERE!

The ManoMotion hand-tracking SDK is built for developing AR experiences on both Android and iOS mobile platforms.

Power Up your App development!

Spatial interaction on mobile devices lets people experience AR in physical space, controlling and manipulating items with their hands. Hand tracking and gesture control technologies are revolutionizing the mobile AR space and streamlining the interaction between humans and computers.

Why ManoMotion SDK Pro?

- The most versatile hand-tracking features in one SDK.
- Support for dynamic movements and a variety of trigger gestures.
- Optimized for mobile, with minimal power consumption, small data size, and low memory usage.
- Realistic hand interaction in AR.
- No extra hardware required.
- Consulting and support during development.

Hand Tracking

Hand tracking provides in-depth information about the hand, such as x, y, z position and rotation values for individual points or for the whole skeleton. Joint orientation is complemented by hand-side and left/right-hand information. Relative depth, a value between 0 and 1, represents the hand's distance from the camera.

SKELETON TRACKING

Skeleton tracking provides confidence and joint information for the hand skeleton. Individual x, y, and z positions are returned for each of the 21 joints of the hand skeleton. The SDK can be configured to return joint information in 2D or 3D, and offers both one- and two-hand tracking.
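As a sketch of the kind of data a 21-joint skeleton stream yields per frame, consider the following. All names and structures here are illustrative assumptions, not ManoMotion's actual API:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical container for one frame of skeleton data; the real SDK's
# types and field names may differ.
@dataclass
class SkeletonFrame:
    confidence: float                          # 0..1 detection confidence
    joints: List[Tuple[float, float, float]]   # 21 (x, y, z) joint positions

    def __post_init__(self):
        # A hand skeleton in this model always carries exactly 21 joints
        if len(self.joints) != 21:
            raise ValueError("a hand skeleton has 21 joints")

# Example: a zeroed skeleton at full confidence
frame = SkeletonFrame(confidence=1.0, joints=[(0.0, 0.0, 0.0)] * 21)
```

In a 2D configuration, the z component could simply be ignored or set to the relative-depth value described above.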

HAND SEGMENTATION

Hand segmentation extracts the hand texture and places it in 3D. Combined with depth estimation, it enables developers to build experiences in which the hand occludes virtual objects and vice versa.
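The occlusion idea can be sketched as a per-pixel depth comparison. This is a minimal illustration under the assumption of normalized depth values, where smaller means closer to the camera, as in the SDK's relative hand depth:

```python
# Minimal occlusion test: whichever surface is closer to the camera
# (smaller normalized depth) is drawn in front of the other.
def hand_occludes(hand_depth: float, object_depth: float) -> bool:
    """Return True if the real hand should be drawn over the virtual object."""
    return hand_depth < object_depth

# Hand at depth 0.3 is closer than an object at 0.6, so it occludes it
print(hand_occludes(0.3, 0.6))  # True
print(hand_occludes(0.8, 0.6))  # False
```

A renderer would apply this test per pixel inside the segmented hand mask rather than once per hand.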

TRY-ON FEATURES

Enables AR developers to build ring and watch try-on solutions for the evolving e-commerce fashion industry quickly and easily.

Gesture Analysis

Gesture analysis can be used to understand the user's intent. By combining information from previous and current frames, ManoMotion's technology determines the type of gesture performed by the user. This information is classified into three categories that help developers design experiences, customize behaviors, and map gestures into the Unity world.

MANOCLASS

ManoClass represents the raw detection of the hand in every frame. The result is independent of previous information, as it is a per-frame analysis. For each frame, the SDK returns a ManoClass result, which is one of the following hand classes: Grab, Pinch, Point, or NoHand.

CONTINUOUS GESTURES

Continuous Gestures are a convenient part of Gesture Analysis aimed at understanding whether the user is continuously performing a given gesture. A Continuous Gesture is calculated from present and past information: it fires only once the user has held the same hand pose for a certain number of frames (i.e., a minimum duration).
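The hold-for-N-frames idea can be sketched as a simple debouncer. The class names, the frame threshold, and the ManoClass enum below are hypothetical illustrations, not ManoMotion's implementation:

```python
from enum import Enum, auto

# Illustrative stand-in for the SDK's per-frame hand classes
class ManoClass(Enum):
    GRAB = auto()
    PINCH = auto()
    POINT = auto()
    NO_HAND = auto()

class ContinuousGestureDetector:
    """Fires a continuous gesture only after the same hand class has been
    observed for `hold_frames` consecutive frames."""

    def __init__(self, hold_frames: int = 15):
        self.hold_frames = hold_frames
        self.current = None   # last hand class seen
        self.count = 0        # consecutive frames of that class

    def update(self, mano_class: ManoClass):
        if mano_class == self.current:
            self.count += 1
        else:
            self.current = mano_class
            self.count = 1
        # Report the gesture only once the pose has been held long enough
        return self.current if self.count >= self.hold_frames else None

detector = ContinuousGestureDetector(hold_frames=3)
# The gesture fires only on the 3rd consecutive GRAB frame
assert detector.update(ManoClass.GRAB) is None
assert detector.update(ManoClass.GRAB) is None
assert detector.update(ManoClass.GRAB) == ManoClass.GRAB
```

The per-frame ManoClass results feed the detector, so a flickering misdetection on a single frame resets the counter rather than firing a gesture.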

TRIGGER GESTURES

Trigger gestures are one-time gestures. They are specific sequences of ManoClasses and hand states that, when performed in order, are recognized as a trigger event, similar to a mouse click.
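Recognizing such a sequence can be sketched as a small state machine that advances through the expected hand classes and fires once the sequence completes. The "tap" sequence and the ManoClass enum here are illustrative assumptions, not the SDK's internals:

```python
from enum import Enum, auto

# Illustrative stand-in for the SDK's per-frame hand classes
class ManoClass(Enum):
    GRAB = auto()
    PINCH = auto()
    POINT = auto()
    NO_HAND = auto()

class TriggerDetector:
    """Fires once when a specific sequence of hand classes is performed
    in order, e.g. a click-like 'tap' gesture."""

    def __init__(self, sequence):
        self.sequence = sequence
        self.progress = 0  # index of the next expected class

    def update(self, mano_class: ManoClass) -> bool:
        if mano_class == self.sequence[self.progress]:
            self.progress += 1          # advance through the sequence
        elif mano_class == self.sequence[0]:
            self.progress = 1           # sequence restarted from the top
        else:
            self.progress = 0           # sequence broken, start over
        if self.progress == len(self.sequence):
            self.progress = 0
            return True                 # full sequence matched: fire
        return False

# Hypothetical 'tap': point, pinch, then point again
tap = TriggerDetector([ManoClass.POINT, ManoClass.PINCH, ManoClass.POINT])
assert tap.update(ManoClass.POINT) is False
assert tap.update(ManoClass.PINCH) is False
assert tap.update(ManoClass.POINT) is True   # trigger fires here
```

Unlike a continuous gesture, the trigger fires exactly once per completed sequence, which maps naturally to one-shot events like selecting or releasing an object.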

Platforms Supported and Additional Features

Unity Android

Unity iOS

Unity Windows

AR Foundation

Orientation

Front/Rear camera

Try it on Your Phone

