Hand Tracking with MS Research and Kinect at Future Decoded 2015

A bit of a departure from RoomBuilder, which I haven't blogged about for a long time, but which is going VR... stay tuned.

This post is about graphics and relates to something pretty cool: hand tracking using a Kinect sensor. I was lucky enough to attend Future Decoded last week, and there were some great keynotes on the technical day. One surprise came during the keynote from Chris Bishop of Microsoft Research. His main focus was on AI, machine learning, and using deep learning to solve difficult problems, and one of the applications that cropped up was hand pose tracking using a Kinect sensor.

Now, this has come up before with Handpose. Even so, it was a surprise to see a live demo, which I managed to capture a few seconds of once I realised what was happening. Whether there's been much progress since the original Handpose reveal is hard to tell, but it's clear that solving hand tracking has real benefits for Microsoft: for AR, with hardware such as HoloLens, and also for VR, a real interest of mine. Your hands are the ultimate natural user interface for many tasks, assuming issues such as occlusion and latency can be solved.

