Steering Anki Overdrive Cars via Speech Recognition on Bluemix
How to use IBM Bluemix and Watson Speech Recognition to steer connected cars.
I've extended the Anki Overdrive demo so that you can steer the cars via voice. I've used a combination of the Watson Speech to Text service and the Watson Natural Language Classifier service on Bluemix to implement this. All of this is available as open source.
To convert speech to text, an extended version of the Watson Speech to Text sample is used so that the recognized text (marked as 'final') can be sent via MQTT to the Internet of Things Foundation.
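The hand-off to MQTT can be sketched roughly as follows, following the IoT Foundation conventions for device client IDs and event topics. The organization, device type, device ID, and event name (`text`) are placeholder assumptions, and actual publishing would use an MQTT client library:

```python
import json

def transcript_event(org, device_type, device_id, transcript):
    """Build the MQTT client ID, event topic, and JSON payload for a
    final transcript, following IoT Foundation conventions."""
    # Devices connect with a client ID of the form d:<org>:<type>:<id>
    client_id = "d:%s:%s:%s" % (org, device_type, device_id)
    # Device events are published on iot-2/evt/<event>/fmt/<format>
    topic = "iot-2/evt/text/fmt/json"
    payload = json.dumps({"d": {"text": transcript}})
    return client_id, topic, payload

client_id, topic, payload = transcript_event("myorg", "speech", "stt1", "turn left")
```

The returned strings would then be handed to an MQTT client (for example, Paho) that connects to the organization's IoT Foundation broker and publishes the payload on the topic.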
The Watson Speech to Text service has to be registered as a device in the Internet of Things Foundation. Check out the README of the project for details.
There is a new version of the Node-RED flow that includes the speech recognition functionality. The Watson classifier is used to map the text received from the Speech to Text service to the available commands. At this point, there are four commands (move, stop, turn left, turn right) that the classifier understands based on the provided training data. The nice thing about the classifier is that you can even say something like 'depart' and the classifier figures out that it belongs to the class 'move,' even though that word was not defined in the training data.
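The flow only needs the classifier's top class to choose a command. A minimal sketch of that mapping, assuming a response shaped like the Natural Language Classifier's JSON (top class plus a list of classes with confidences); the confidence threshold is an illustrative choice, not something from the original flow:

```python
# The four commands defined in the training data
COMMANDS = {"move", "stop", "turn left", "turn right"}

def pick_command(classifier_response, threshold=0.7):
    """Return the command for the classifier's top class, or None if the
    class is unknown or the confidence is below the threshold."""
    top = classifier_response.get("top_class")
    confidence = 0.0
    for c in classifier_response.get("classes", []):
        if c["class_name"] == top:
            confidence = c["confidence"]
            break
    if top in COMMANDS and confidence >= threshold:
        return top
    return None

# Saying 'depart' was never in the training data, but the classifier
# can still return 'move' as the top class with high confidence.
response = {
    "top_class": "move",
    "classes": [
        {"class_name": "move", "confidence": 0.93},
        {"class_name": "stop", "confidence": 0.04},
    ],
}
```

Low-confidence or unknown classifications fall through to `None`, so the car simply ignores utterances the classifier cannot map to a command.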
Published at DZone with permission of Niklas Heidloff, DZone MVB. See the original article here.