Adam from Occipital here (we make the Structure Sensor). As of now, you can use OpenNI with the Structure Sensor on all platforms currently supported by OpenNI. We’ve also forked OpenNI on GitHub (as many others have) to make sure it remains available after April 23rd. For those who don’t want to compile code, we’ll…
Important: PrimeSense has a limited number of Carmine 1.08 and Carmine 1.09 (short-range) sensors available for purchase, for academic and other noncommercial research only. Purchases and delivery are subject to availability, lead time, and PrimeSense’s terms and conditions, and are at PrimeSense’s sole discretion. For more information, please contact PrimeSense at firstname.lastname@example.org by April 1, 2014 if you would…
Sensors are a key reason our mobile devices are coming to understand each of us on such a personal level. PrimeSense is the best of the 3D sensor companies we found: essentially, its sensors let machines see you and your surroundings in 3D, allowing software to learn the context of a situation. PrimeSense devices allow the new Microsoft…
NI Mate may be helpful. It takes motion- and depth-sensing data from a Kinect and turns it into MIDI and OSC (Open Sound Control) data, which can be passed quickly over a network. NI Mate Documentation (last updated 25.6.2013) | NI Mate.
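To show why OSC is so easy to pass over a network, here is a minimal sketch (not NI Mate’s code) of encoding an OSC message with one float argument, following the OSC 1.0 wire format: NUL-terminated strings padded to 4-byte boundaries, plus a big-endian float. The address `/head/x` and the helper names are illustrative assumptions.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal OSC 1.0 message encoder: address pattern + one float argument.
// Strings are NUL-terminated and padded to a 4-byte boundary; numeric
// arguments are big-endian.
public class OscEncoder {
    static byte[] paddedString(String s) {
        byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
        int padded = (raw.length / 4 + 1) * 4;  // always at least one NUL terminator
        byte[] out = new byte[padded];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    public static byte[] encodeFloatMessage(String address, float value) throws IOException {
        ByteArrayOutputStream msg = new ByteArrayOutputStream();
        msg.write(paddedString(address));
        msg.write(paddedString(",f"));  // type tag string: one float argument
        msg.write(ByteBuffer.allocate(4).putFloat(value).array());  // big-endian by default
        return msg.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] packet = encodeFloatMessage("/head/x", 0.5f);
        // The packet can then be sent as a single UDP datagram, e.g.:
        // new DatagramSocket().send(new DatagramPacket(packet, packet.length,
        //         InetAddress.getByName("127.0.0.1"), 8000));
        System.out.println(packet.length);  // 8 (address) + 4 (type tag) + 4 (float) = 16
    }
}
```

Since a whole message fits in one small UDP datagram, this is what makes OSC so quick to pass around a local network.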
OpenNI middleware for grab and release gestures.
Body tracking middleware for OpenNI, focused on gesture recognition.
Doing blob tracking with a Kinect using Processing is surprisingly hard, as I discovered today in the CIID Generative Design workshop I’ve been running. The code below shows the trickery required to buffer RGB images from the Kinect and then do blob tracking on them:
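The workshop sketch itself isn’t reproduced in this excerpt. As a rough stand-in for the blob-tracking step, here is its core, connected-component labeling over a binary mask (such as you’d get after thresholding a buffered Kinect RGB frame), in plain Java so it runs without a sensor. The `findBlobs` helper and the `minPixels` parameter are illustrative names, not from the original sketch.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;

// Blob detection core: 4-connected component labeling on a binary mask.
// Each detected blob is reported as {centroidX, centroidY, pixelCount}.
public class BlobTracker {
    public static List<int[]> findBlobs(boolean[][] mask, int minPixels) {
        int h = mask.length, w = mask[0].length;
        boolean[][] seen = new boolean[h][w];
        List<int[]> blobs = new ArrayList<>();
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (!mask[y][x] || seen[y][x]) continue;
                // BFS flood fill from this seed pixel
                ArrayDeque<int[]> queue = new ArrayDeque<>();
                queue.add(new int[]{x, y});
                seen[y][x] = true;
                int sumX = 0, sumY = 0, count = 0;
                while (!queue.isEmpty()) {
                    int[] p = queue.poll();
                    sumX += p[0]; sumY += p[1]; count++;
                    int[][] nbrs = {{p[0]+1, p[1]}, {p[0]-1, p[1]},
                                    {p[0], p[1]+1}, {p[0], p[1]-1}};
                    for (int[] n : nbrs) {
                        if (n[0] >= 0 && n[0] < w && n[1] >= 0 && n[1] < h
                                && mask[n[1]][n[0]] && !seen[n[1]][n[0]]) {
                            seen[n[1]][n[0]] = true;
                            queue.add(n);
                        }
                    }
                }
                // Drop tiny blobs (sensor noise) below the size threshold
                if (count >= minPixels)
                    blobs.add(new int[]{sumX / count, sumY / count, count});
            }
        }
        return blobs;
    }
}
```

In a Processing sketch you would build the mask per frame from the buffered Kinect image (e.g. by thresholding brightness or depth) and feed it to `findBlobs` each `draw()` call.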
Open Natural Interaction (OpenNI).