Spatial interaction on mobile devices lets users experience AR in physical space, controlling and manipulating virtual items with their hands. Hand tracking and gesture control technologies are revolutionizing the mobile AR space and facilitating the interaction between humans and computers.
- The most versatile hand tracking features in one SDK.
- Dynamic movements and various trigger gestures supported.
- Optimized for mobile with minimal power consumption, low data size, and memory usage.
- Realistic hand interaction in AR.
- No extra hardware required.
- Consulting and support in development.
Hand tracking provides in-depth information about the hand, such as x, y, z position and rotation values for individual points or for the whole skeleton. Joint orientation also takes into account which side of the hand is visible and whether it is the left or right hand. Relative depth is estimated from the hand's distance to the camera and reported as a value between 0 and 1.
Skeleton tracking contains the confidence and joint information of the skeleton. Individual x, y, and z positions are provided for each of the 21 joints of the hand skeleton. The SDK can be configured to return joint information in 2D or 3D, and offers both one- and two-hand tracking.
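The per-frame skeleton data described above (a confidence value plus 21 joint positions, with z available in 3D mode) can be pictured with a small container type. This is an illustrative sketch only; the class and field names below are assumptions, not the SDK's actual types.

```python
from dataclasses import dataclass
from typing import List, Tuple

NUM_JOINTS = 21  # hand-skeleton joint count described above

@dataclass
class SkeletonInfo:
    """Illustrative per-frame skeleton container (not the SDK's real API)."""
    confidence: float                         # tracking confidence
    joints: List[Tuple[float, float, float]]  # (x, y, z) per joint; z unused in 2D mode

    def is_reliable(self, threshold: float = 0.8) -> bool:
        # Hypothetical helper: treat the frame as usable above a confidence threshold
        return self.confidence >= threshold

# Example frame: all joints at the image center, high confidence
skel = SkeletonInfo(confidence=0.95,
                    joints=[(0.5, 0.5, 0.0)] * NUM_JOINTS)
```

In practice an application would read a structure like this every frame and map the joint positions into the Unity world.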
The SDK is capable of extracting the hand texture and placing it in 3D. This is done through depth estimation, which enables developers to build experiences in which the hand occludes virtual objects and vice versa.
Enables AR developers to build ring and watch try-on solutions for the evolving e-commerce fashion industry much faster and more easily.
Gesture analysis can be used to understand the user's intent. By combining information from previous and current frames, ManoMotion's technology can determine the type of gesture performed by the user. This information is classified into three categories that help developers design experiences that are customizable to different behaviors and mapped into the Unity world.
ManoClass represents the raw detection of the hand in every frame. The result is independent of previous information, as it is a per-frame analysis. For each frame given, the SDK will return a ManoClass result, which can be any of the following hand classes: Grab, Pinch, Point, or NoHand.
Continuous Gestures are a convenient layer of Gesture Analysis aimed at understanding and categorizing whether the user is continuously performing a given gesture. A Continuous Gesture is calculated from both present and past information. This means that a Continuous Gesture is fired only if the user has kept the same hand pose for a certain number of frames (i.e., a certain amount of time).
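The frame-counting idea behind Continuous Gestures can be sketched as follows: keep a counter of how many consecutive frames returned the same per-frame class, and fire only once the counter reaches a threshold. This is a minimal illustration of the concept, not ManoMotion's actual implementation; the class name and threshold are assumptions.

```python
class ContinuousGestureDetector:
    """Fires a continuous gesture once the same per-frame class
    has been observed for `required_frames` consecutive frames."""

    def __init__(self, required_frames: int = 15):
        self.required_frames = required_frames
        self._current = None
        self._count = 0

    def update(self, mano_class: str):
        # Reset the counter whenever the per-frame class changes
        if mano_class == self._current:
            self._count += 1
        else:
            self._current = mano_class
            self._count = 1
        # Fire only for a sustained, real hand class
        if self._count >= self.required_frames and self._current != "NoHand":
            return self._current
        return None

det = ContinuousGestureDetector(required_frames=3)
results = [det.update(c) for c in ["Grab", "Grab", "Grab", "Pinch"]]
# results → [None, None, "Grab", None]
```

The threshold trades responsiveness against stability: a higher frame count rejects more flicker but makes the gesture feel slower to trigger.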
Trigger gestures are one-time gestures. These gestures are specific sequences of ManoClasses and hand states that, when performed in order, are recognized as a trigger event, similar to a mouse click.
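A trigger of this kind can be pictured as a sequence matcher over the stream of hand states: record state changes and fire once the recent history matches a target pattern. The sketch below is illustrative only; the "Open" state and the "click" pattern are assumptions for the example, not states defined by the SDK.

```python
from collections import deque

class TriggerGestureDetector:
    """Recognizes a one-time trigger when a specific sequence
    of hand states is performed in order."""

    def __init__(self, pattern):
        self.pattern = list(pattern)
        self._history = deque(maxlen=len(self.pattern))

    def update(self, state: str) -> bool:
        # Record only state *changes*, so holding a pose doesn't flood the history
        if not self._history or self._history[-1] != state:
            self._history.append(state)
        return list(self._history) == self.pattern

# Hypothetical "click" trigger: open hand closes into a grab, then opens again
click = TriggerGestureDetector(["Open", "Grab", "Open"])
fired = [click.update(s) for s in ["Open", "Open", "Grab", "Grab", "Open"]]
# fired → [False, False, False, False, True]
```

Because repeated states are collapsed, the trigger fires regardless of how long each pose in the sequence is held.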
SDK solutions for iOS and Android developers building mobile hand tracking applications.
We support SDK PRO + ARFoundation (up to version 2.X)
Yes. The core libraries of SDK PRO need to run on the phone itself, so you must compile and deploy to an Android or iOS device.
SDK PRO is calibrated to detect a single hand in everyday scenarios. Extreme conditions (e.g. lighting, shadow, contrast, color, clothing) may introduce noise that degrades signal quality.
No. The standard ManoMotion SDK PRO product is calibrated to detect one hand at a time. We would be happy to help you with a custom solution.
Yes. A stable internet connection is required by SDK PRO for a brief period of time in order to validate the license.
Yes. The gesture/tracking information received is the raw information captured by SDK PRO. You can apply smoothing algorithms (median filter, linear interpolation, etc.) on top of this information to smooth it out.
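Of the smoothing options mentioned above, a sliding-window median filter is a simple one to apply per coordinate, since it suppresses single-frame tracking spikes without lagging the signal much. This is a generic sketch of the technique, not part of the SDK:

```python
from collections import deque
from statistics import median

class MedianSmoother:
    """Sliding-window median filter for one noisy coordinate stream
    (apply one instance per joint coordinate)."""

    def __init__(self, window: int = 5):
        self._buf = deque(maxlen=window)

    def update(self, value: float) -> float:
        self._buf.append(value)
        return median(self._buf)

smooth = MedianSmoother(window=3)
raw = [0.50, 0.52, 0.90, 0.51, 0.53]   # 0.90 is a single-frame spike
out = [round(smooth.update(v), 2) for v in raw]
# out → [0.5, 0.51, 0.52, 0.52, 0.53]
```

Note the spike at 0.90 never reaches the output; a larger window rejects longer bursts of noise at the cost of more latency.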