The Future of Interaction is here!

11/07/2017

Stockholm, Sweden


“We’re making the camera the first Augmented Reality platform”

– Mark Zuckerberg


Yes, Zuckerberg laid out his vision of the future of Augmented Reality on camera-enabled devices at the F8 keynote in April. And when the guru behind Facebook says something, we ought to listen closely.

Let’s begin with some context first: since the dawn of time, man has sought a way to convey that which he imagines in his head – from paint to photograph, from performance to film and, more recently, from language to code. He has always wanted to make real the creative worlds he dreams up in his mind.

Today, the aim of AR and VR is to superimpose these imagined worlds onto the real world and blur the lines between them. This opens up new dimensions in how man communicates his ideas and interacts. AR and VR define not only the next technological revolution but also the next cultural one. And with them comes an entirely new worldview of User Experience (UX), User Interface (UI) and Human-Computer Interaction (HCI).

This is what ManoMotion is concerned with.

In line with Zuckerberg’s address at F8, ManoMotion’s team of specialists stretches the boundaries of what is technically possible to decipher from a simple RGB camera. We build the next-generation UI, bringing you real-time 3D interaction through hand tracking and gesture analysis across VR, AR and the IoT. And what’s more, we only require a camera-enabled device, a.k.a. your smartphone.

User Experience


Two things define a VR or AR experience. The first is the detail and aesthetics of the virtual or augmented worlds we create. This is primarily the work of designers, artists and visionaries, who define the visual realism of VR and AR. The second is the interactive experience within these virtual or augmented worlds. Users need to be able to interact with the same level of intuition and realism as they would in real life in order to be completely immersed.

Let’s paint a scenario: you are a soldier on a battlefield under enemy fire, crouching behind cover. Out of ammunition, you reach out to grab a magazine from a fallen soldier. In a few seconds, you slot it into the weapon, pull the charging handle and squeeze the trigger until your ammunition runs out. Mission accomplished.


On a PC, this would all be over with a few keystrokes and taps of your right mouse button. With gesture control, however, you have the opportunity to immerse yourself fully in the context of the game.

Using real gestures and hand movements, you can mimic the actual handling of the weapon with your bare hands, bridging the gap between the visual experience of VR and AR and its physical realism.

Enabling the Hand


Hand tracking and gesture recognition fall under an area of computer science that aims to interpret hand motions and gestures through mathematical algorithms. The human hand has 27 degrees of freedom (DOF) of motion: 21 for the joint angles and 6 for orientation and location. Pretty much everything we do on a daily basis involves our hands in some significant way.
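
To make that 27-DOF figure concrete, here is a minimal sketch in Python of how a single frame of hand pose could be represented: 21 joint angles plus a 6-DOF global position and orientation. The field names are illustrative only, not ManoMotion’s actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HandPose:
    """One frame of hand pose: 27 degrees of freedom in total.

    Field names are illustrative, not ManoMotion's actual data model.
    """
    joint_angles: List[float] = field(default_factory=lambda: [0.0] * 21)       # 21 joint-angle DOF
    position: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])      # x, y, z location (3 DOF)
    orientation: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])   # roll, pitch, yaw (3 DOF)

    @property
    def degrees_of_freedom(self) -> int:
        # 21 + 3 + 3 = 27
        return len(self.joint_angles) + len(self.position) + len(self.orientation)
```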

“The human hand is the single most manipulatory instrument in the world. It’s evolved over a hundred million years to become the fine tool that it is. And when we use a mouse, we disable it.”

– John Underkoffler

The aim is to enable hands to become better controllers of our human-computer interactions. The 27 DOF of motion in the hand far surpass what a mouse, PS4 joystick or even an Oculus Touch controller can offer. Therefore, instead of creating new controllers, why not use the ones we already have? The human hand is a superior tool that offers unsurpassed maneuverability and flexibility. It’s free, and the battery never runs out!

We maximise the information already available from an RGB camera: through intelligent algorithms, built on machine learning and deep learning frameworks, we provide precise hand tracking and gesture analysis supporting all 27 DOF of motion of the human hand.
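
In broad strokes, a pipeline like this takes each RGB frame from the camera, estimates the hand pose with a trained model, and classifies gestures from the recent sequence of poses. The sketch below is only an illustration of that flow; the helper objects and method names are hypothetical stand-ins, not ManoMotion’s SDK.

```python
def track_hands(camera_frames, pose_model, gesture_classifier):
    """Illustrative RGB-only tracking loop; all components are hypothetical stand-ins.

    camera_frames:      iterable of RGB images from the device camera
    pose_model:         trained model mapping an RGB frame to a 27-DOF hand pose (or None)
    gesture_classifier: maps a short history of poses to a gesture label, e.g. "swipe_left"
    """
    pose_history = []
    for frame in camera_frames:
        pose = pose_model.estimate(frame)      # per-frame 27-DOF hand pose from RGB only
        if pose is None:                       # no hand visible in this frame
            pose_history.clear()
            continue
        pose_history.append(pose)
        gesture = gesture_classifier.classify(pose_history[-30:])  # ~1 second of history at 30 fps
        yield pose, gesture
```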


Status Quo


Let’s face it: as we move forward in Virtual and Augmented Reality, relying on big, bulky hardware, controllers and wearables wouldn’t make sense. These would only diminish the interactivity of the experience. You want portability. You wish to be untethered. You might also cringe at a price tag of $1,000 for a heavy chunk of headwear that constantly needs to be plugged into a computer.

ManoMotion’s technology focuses on allowing the human hand to directly and intuitively interact with machine interfaces. Think of navigating through menus with a swipe of a hand and a tap of a finger. Or imagine immersing yourself in a game that uses gestures as part of its gameplay. With smartphones today already providing robust computing power, you can enjoy inexpensive, mobile VR and AR experiences through GearVR and Daydream headsets. Add gesture control to the mix and you could have a new and convenient method of gameplay.
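
As a toy illustration of that kind of menu navigation, the snippet below maps a few gesture labels to UI actions. The gesture names and handler functions are made up for the example; a real application would wire recognized gestures to its own menu or game logic.

```python
# Hypothetical gesture labels mapped to UI callbacks; names are illustrative only.
def next_menu_item():     print("highlight next menu item")
def previous_menu_item(): print("highlight previous menu item")
def select_menu_item():   print("select highlighted item")

GESTURE_ACTIONS = {
    "swipe_left":  next_menu_item,      # a swipe of the hand moves through the menu
    "swipe_right": previous_menu_item,
    "tap":         select_menu_item,    # a tap of a finger confirms the choice
}

def on_gesture(label: str) -> None:
    """Dispatch a recognized gesture label to the matching UI action, if any."""
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

# Example: the tracking loop would call on_gesture(...) whenever a gesture is recognized.
on_gesture("swipe_left")
on_gesture("tap")
```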

The Time is Right


According to the research firm MarketsandMarkets (2015), the gesture recognition and touchless sensing industry is expected to grow from USD 5.15 billion in 2014 to USD 23.55 billion by 2020, alongside the VR and AR industries, which are expected to be valued at USD 33.9 billion and USD 117.4 billion respectively by 2022. Gesture and motion technologies are expected to be significantly integrated into smartphones and similar devices over the coming years. We therefore face an imminent future where gesture and hands-free control could define the IoT.

Perhaps the software and games of yesterday’s 2D interfaces never demanded the complexity and dynamism in interaction that we expect in games today. As VR and AR grow rapidly, intuitive control and maneuverability in 3D environments become cornerstones of the overall experience.


Let’s go Mainstream!


ManoMotion envisions bringing intuitive gesture-based technology mainstream – we want to be the standard for hand-motion tracking. Imagine interacting within a virtual world as you would in real life. How much simpler can you get than using just a camera-enabled device and your bare hands?

So if you’ve been waiting for the next big thing since the mouse or the touchscreen, you might already have the answer in … well … your hands! We’ll be bold here and say it’s only a matter of time before gesture technology reaches its tipping point!

Tags: AR, Future of Interaction, Gestural Interaction, Market, VR