Last week I had the opportunity to present to a great audience at the MoDev DC meetup group on “Smarter Apps with Cognitive Computing”. In this session I focused on how you can create a voice-driven experience in your mobile apps. I gave an introduction to IBM Bluemix and IBM Watson services (particularly the Watson language services), and demonstrated how you can integrate them into your native iOS apps. I also covered IBM MobileFirst for operational analytics and remote logging to provide insight into your app’s performance once it goes live. Check out a recording of the complete presentation in the video below:
You can read more about how this example works and grab the source code for the sample application from the links below:
- Blog: Voice Driven Apps With IBM Watson & IBM MobileFirst
- Code (iOS and Node.js): github.com/triceam/IBM-Watson-Speech-QA-iOS
Just create an account on IBM Bluemix and you can get started for free!
This app uses three services available through IBM Bluemix, all of which are available for you to try out:
- Speech to Text – Convert spoken audio into text
- Question & Answer – Natural language search
- Advanced Mobile Access – Capture analytics and logs from mobile apps running on devices
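To give a feel for how a client talks to these services, here is a minimal Node.js sketch of calling a Watson-style Speech to Text REST endpoint. Note that the host, path, and basic-auth credential scheme shown here are illustrative assumptions based on typical Bluemix service bindings, not code taken from the sample app; see the repository linked above for the real implementation.

```javascript
// Hypothetical sketch: constructing a request to a Watson-style
// Speech to Text REST endpoint. The URL and auth scheme are
// assumptions for illustration; real credentials come from your
// Bluemix service binding.
function buildRecognizeRequest(username, password, contentType) {
  // Basic auth header built from the service credentials
  const auth = Buffer.from(`${username}:${password}`).toString('base64');
  return {
    method: 'POST',
    // The v1 "recognize" path is an assumption, not taken from the sample app
    url: 'https://stream.watsonplatform.net/speech-to-text/api/v1/recognize',
    headers: {
      'Authorization': `Basic ${auth}`,
      // e.g. 'audio/flac' or 'audio/wav', matching the uploaded audio
      'Content-Type': contentType,
    },
  };
}

// The JSON response would then be parsed to pull out the transcript text,
// assuming a Watson-style shape with alternatives nested under "results":
function extractTranscript(responseBody) {
  const parsed = JSON.parse(responseBody);
  return parsed.results
    .map(r => r.alternatives[0].transcript)
    .join(' ')
    .trim();
}
```

The transcript returned here is what you would then hand off to the Question & Answer service as a natural-language query, which is the core of the voice-driven flow described in the talk.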
Feel free to poke around the code to learn more!