
Steering Anki Overdrive Cars via Kinect and Bluemix

I’ve open sourced a sample showing how to steer Anki Overdrive cars via Kinect and IBM Bluemix. The sample requires the Node.js controller and MQTT interface that I had open sourced previously. The sample is another alternative showing how to send commands to the cars in addition to Watson speech recognition.

Check out the project on GitHub.
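As a rough sketch of what such a command message could look like, here is a hypothetical helper that builds an MQTT topic and JSON payload for a car speed command. The topic layout and payload fields are assumptions for illustration, not the project's actual protocol; see the GitHub repo for the real message format.

```javascript
// Hypothetical sketch: shape of an MQTT speed command for the Node.js
// controller. Topic and payload fields are assumed, not taken from the project.
function buildSpeedCommand(carName, speed) {
  return {
    topic: `anki/${carName}/cmd`,                      // assumed topic layout
    payload: JSON.stringify({ command: 'speed', speed })
  };
}

// Publishing with the `mqtt` npm package would then look roughly like:
//   const client = mqtt.connect('mqtt://<your-broker>');
//   const msg = buildSpeedCommand('red-car', 500);
//   client.publish(msg.topic, msg.payload);
```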

The project contains sample code that shows how to send MQTT commands to IBM Bluemix when "buttons" are pressed via Kinect. A button is pressed by moving a hand over it and holding it there for two seconds. The project is configured so that this only works when the hand is between 50 cm and 1 m away from the Kinect; the distance is measured via the Kinect depth sensor.
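The depth-range check can be sketched as a simple predicate on the hand's distance reported by the depth sensor. The thresholds match the article; the function name and millimeter units are illustrative assumptions, not code from the project.

```javascript
// Minimal sketch: a hand may press buttons only when it is between
// 50 cm and 1 m from the Kinect depth sensor (distances in millimeters).
const MIN_DEPTH_MM = 500;   // 50 cm
const MAX_DEPTH_MM = 1000;  // 1 m

function handInRange(depthMm) {
  return depthMm >= MIN_DEPTH_MM && depthMm <= MAX_DEPTH_MM;
}
```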

The left picture shows the depth information; the right picture is an RGB image.

[Image: Kinect view, button not pressed]

When a hand is moved above a button, the state of the button changes to "pressed," and if the hand is still above the button two seconds later, the appropriate action is invoked.
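The two-second dwell logic described above can be sketched as a small state machine that is updated once per frame. The class and method names are illustrative assumptions; the real project wires this kind of logic to Kinect hand tracking and the MQTT commands.

```javascript
// Minimal sketch of the two-second dwell press: the button's action fires
// only if the hand stays over it for HOLD_MS, and only once per dwell.
const HOLD_MS = 2000;

class DwellButton {
  constructor(action) {
    this.action = action;
    this.hoverSince = null;  // timestamp when the hand first moved over the button
    this.fired = false;
  }

  // Call on every frame with whether the hand is over the button.
  update(handOver, nowMs) {
    if (!handOver) {              // hand left the button: reset the dwell
      this.hoverSince = null;
      this.fired = false;
      return;
    }
    if (this.hoverSince === null) this.hoverSince = nowMs;  // state -> "pressed"
    if (!this.fired && nowMs - this.hoverSince >= HOLD_MS) {
      this.fired = true;          // invoke the action exactly once
      this.action();
    }
  }
}
```

For example, feeding the button three frames at 0 ms, 1000 ms, and 2000 ms with the hand held over it triggers the action on the third frame.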

[Image: Kinect view, button pressed]

Here is a picture of me driving the cars via Kinect.




Topics:
ibm bluemix, kinect, cloud, iot

Published at DZone with permission of Niklas Heidloff, DZone MVB. See the original article here.

