Stockholm, Sweden and Palo Alto, California
Computer-vision experts ManoMotion announced the successful integration of Apple’s ARKit into its SDK, allowing developers and corporations to add gestural interaction to augmented elements built with ARKit. The integration will ship in the latest SDK build, available for download in the coming weeks on ManoMotion’s website.
Watch the demo video:
ManoMotion’s real-time 3D gesture recognition technology lets people use their actual hands to interact with virtual objects in VR/AR/MR. With no extra hardware, using only a standard 2D camera (such as a cell phone camera), it recognizes and tracks many of the hand’s 27 degrees of freedom (DOF). The technology provides accurate, real-time hand tracking with depth information and handles dynamic gestures (such as swiping, clicking, tapping, and grab-and-release) with a minimal CPU, memory, and battery footprint.
“Introducing gesture control to the ARKit, and being the first in the market to show proof of this for that matter, is a tremendous milestone for us.”
– Daniel Carlman, CEO of ManoMotion
The new integration will allow developers to create ARKit applications and content with “hand presence”, in which:
- People can use their actual hands in 3D to manipulate objects at depth in AR/MR space, rather than through the limited 2D screen
- Augmented elements can be manipulated with either the right or the left hand
- A set of predefined gestures, such as point, push, pinch, swipe, and grab, can be accessed and used to manipulate augmented elements interactively
- The extent of manipulation can be precisely defined by users
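The “hand presence” model above can be illustrated with a minimal, self-contained sketch. Every name here is hypothetical — it mirrors the predefined gestures listed in this release, not ManoMotion’s actual SDK API:

```python
# Illustrative sketch only: the types and the gesture-to-action mapping are
# assumptions for explanation, not ManoMotion's real SDK interface.
from dataclasses import dataclass
from enum import Enum


class Gesture(Enum):
    POINT = "point"
    PUSH = "push"
    PINCH = "pinch"
    SWIPE = "swipe"
    GRAB = "grab"


class Hand(Enum):
    LEFT = "left"
    RIGHT = "right"


@dataclass
class GestureEvent:
    gesture: Gesture
    hand: Hand           # either hand can drive the manipulation
    depth: float         # normalized distance from the camera, 0.0 (near) to 1.0 (far)


def manipulation(event: GestureEvent) -> str:
    """Map a recognized gesture to an action on an augmented element."""
    actions = {
        Gesture.POINT: f"select element at depth {event.depth:.2f}",
        Gesture.PUSH: "translate element away from the camera",
        Gesture.PINCH: "scale element",
        Gesture.SWIPE: "rotate element",
        Gesture.GRAB: f"attach element to {event.hand.value} hand",
    }
    return actions[event.gesture]
```

For example, `manipulation(GestureEvent(Gesture.GRAB, Hand.LEFT, 0.4))` would attach the element to the left hand — the point being that each event carries both handedness and depth, which a flat 2D touchscreen gesture cannot express.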
“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”
The integration will initially be made available for Unity iOS, followed by Native iOS in subsequent updates. For guidance on using the integration, SDK users can consult ManoMotion’s tutorials and documentation, and will be supported by ManoMotion’s dedicated technical team of software engineers, developers, and computer-vision scientists via the company forum and email. Developers interested in using ManoMotion’s SDK with ARKit should visit: https://www.manomotion.com/get-started/.
An integration for ARCore is also in development and will be announced at a later date.
ManoMotion is a computer-vision software company founded in 2015. Based in Stockholm, Sweden, with a sales and marketing office in Palo Alto, California, ManoMotion has a long-term vision of bringing unparalleled intuition to human-machine interaction through gesture technology. The company has developed a core technology framework that achieves precise hand tracking and gesture recognition in 3D space using only a 2D camera, available on any smart device. The solution is offered across multiple platforms in Virtual Reality, Augmented/Mixed Reality, and any environment that requires natural and intuitive interaction.
© 2017. ManoMotion AB. All rights reserved. ManoMotion is a trademark of ManoMotion AB. All other trademarks are the property of their respective owner(s).