Steering Anki Overdrive Cars via Kinect and Bluemix
I’ve open sourced a sample showing how to steer Anki Overdrive cars via Kinect and IBM Bluemix. The sample requires the Node.js controller and MQTT interface that I open sourced previously. It is another way to send commands to the cars, in addition to the earlier Watson speech recognition sample.
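As a rough sketch of how the Kinect app could talk to the existing MQTT interface, the snippet below builds a JSON command and shows how it might be published with the mqtt.js package. The topic name, broker URL, and payload schema here are my own assumptions for illustration; the real names come from the previously open-sourced controller.

```javascript
// Hypothetical command payload for the MQTT interface.
// Field names are assumptions, not the controller's actual schema.
function buildCommand(name, car, value) {
  return JSON.stringify({ command: name, car: car, value: value });
}

// Example usage (not executed here): publish a speed command to the
// broker running on Bluemix when a Kinect "button" fires.
// const mqtt = require("mqtt");                                  // npm install mqtt
// const client = mqtt.connect("mqtt://<your-broker-host>");      // placeholder host
// client.publish("anki/commands", buildCommand("speed", "car1", 500));
```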
The project contains sample code that shows how to send MQTT commands to IBM Bluemix when "buttons" are pressed via Kinect. Buttons are pressed by moving a hand over them and holding it there for two seconds. The project is configured so that this only works when the hand is between 50 cm and 1 m away from the Kinect; the distance is measured via the Kinect depth sensor.
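The depth filter can be sketched as a simple range check. The Kinect depth sensor reports distances in millimeters; the bounds below mirror the article's configuration, and the function name is my own.

```javascript
// Only hands between 50 cm and 1 m from the sensor may press buttons.
const MIN_DEPTH_MM = 500;   // 50 cm
const MAX_DEPTH_MM = 1000;  // 1 m

// depthMm: distance of the tracked hand joint, in millimeters.
function isHandInRange(depthMm) {
  return depthMm >= MIN_DEPTH_MM && depthMm <= MAX_DEPTH_MM;
}
```

A per-frame handler would call `isHandInRange` with the depth value at the tracked hand position before testing whether the hand overlaps a button.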
The left picture shows the depth information; the right picture is an RGB image.
When a hand is moved above a button, the state of the button changes to "pressed" and if the hand is still above the button two seconds later, the appropriate action is invoked.
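The pressed-then-dwell behavior amounts to a small state machine per button: start a timer when a hand enters the button, fire the action once if the hand is still there two seconds later, and reset when the hand leaves. A minimal sketch, with class and method names that are my own:

```javascript
// Per-button dwell logic: fires once after the hand has hovered
// over the button for dwellMs milliseconds, then resets when it leaves.
class HoverButton {
  constructor(dwellMs = 2000) {
    this.dwellMs = dwellMs;
    this.hoverStart = null;  // timestamp when the hover began
    this.fired = false;      // ensures the action fires only once per hover
  }

  // handOver: is the hand currently above this button?
  // nowMs: current frame timestamp in milliseconds.
  // Returns true exactly when the action should be invoked.
  update(handOver, nowMs) {
    if (!handOver) {           // hand left: reset the button
      this.hoverStart = null;
      this.fired = false;
      return false;
    }
    if (this.hoverStart === null) this.hoverStart = nowMs;
    if (!this.fired && nowMs - this.hoverStart >= this.dwellMs) {
      this.fired = true;       // "pressed" state completed
      return true;
    }
    return false;
  }
}
```

Each Kinect frame would feed `update` the hand-over-button test (gated by the depth-range check); when it returns true, the corresponding MQTT command is sent to Bluemix.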
Here is a picture of me driving the cars via Kinect.
Published at DZone with permission of Niklas Heidloff, DZone MVB. See the original article here.