Do Androids Dream of Electric Sheep?
In science-fiction movies like Star Trek and Blade Runner, humans can communicate with computers directly, using voice commands (“Tea. Earl Grey, hot”) or more elaborate sentences, and the computers talk back. Sometimes the computer is a disembodied entity (like the ship’s main computer in Star Trek), and sometimes it is even humanoid (like Cmdr. Data in Star Trek or the Replicants in Blade Runner).
In a Pecha Kucha presentation I recently delivered at the OOP conference in Munich, I took on this theme and started my talk with an imaginative conversation between two computer programs that most of you know very well: Eliza and Siri.
(Video: “Do Androids Dream of Electric Sheep?” from Peter Friese on Vimeo.)
In this Pecha Kucha session, Siri and Eliza join me live on stage to explain why we do not (yet) tap the full potential of our smartphones.
Much has been said about Siri, so let’s focus on Eliza.
Eliza started as an experiment in 1964, when Joseph Weizenbaum, then a professor of computer science at MIT, wrote it as an early attempt to research natural language communication between man and computer (see his ACM paper “ELIZA – A Computer Program For the Study of Natural Language Communication Between Man And Machine”). At its heart, Eliza is a program that reacts to specific keywords in sentences entered by a human user and then responds to those keywords according to a predefined script. It’s the script that basically makes up the “personality” of Eliza, the most famous one being DOCTOR, which simulates a Rogerian psychoanalyst.
(Weizenbaum was rather surprised to see that people ascribed a real personality and a capacity for empathy to Eliza. In reaction, he began to raise uncomfortable questions about our dependence on computers.)
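To make the mechanism concrete, here is a minimal sketch of Eliza’s keyword-and-script idea in Swift. The keyword table and responses are my own toy simplification, not Weizenbaum’s original DOCTOR script:

```swift
import Foundation

// A tiny, simplified take on Eliza's approach: scan the user's input
// for known keywords and answer from a predefined "script".
struct Eliza {
    // Ordered list, so the first matching keyword wins deterministically.
    let script: [(keyword: String, response: String)] = [
        ("mother",  "Tell me more about your family."),
        ("always",  "Can you think of a specific example?"),
        ("because", "Is that the real reason?"),
        ("i feel",  "Why do you feel that way?")
    ]
    let fallback = "Please go on."

    func respond(to input: String) -> String {
        let text = input.lowercased()
        for entry in script where text.contains(entry.keyword) {
            return entry.response
        }
        // No keyword matched, so fall back to a content-free prompt,
        // a trick the real Eliza uses as well.
        return fallback
    }
}

let eliza = Eliza()
print(eliza.respond(to: "I feel a bit tired today."))
// Prints: "Why do you feel that way?"
```

The real program is considerably more sophisticated: it ranks keywords, decomposes the input, and reassembles fragments of it into the reply.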
In my Pecha Kucha, I went on to encourage my audience to make better use of the capabilities of our smartphones. Most people have stopped thinking about it, but we are, after all, carrying a supercomputer in our pockets.
So, to set a good example, I decided to create ElizaApp – an app that listens to what you say and answers in spoken language. This is going to be a great project, because I’ll get to show several very interesting things:
- How to analyze spoken language on a mobile device (see the first sketch after this list)
- How to synthesize speech on mobile devices (also covered in the first sketch)
- How to integrate a JavaScript engine in your mobile app (see the second sketch)
- How to create a Siri-look-alike chat UI
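The individual posts will cover the details; as a teaser, here is a minimal sketch of speaking and listening on iOS. This is my own illustration, not the code from the upcoming posts: synthesis uses AVSpeechSynthesizer, which has shipped with iOS since iOS 7, while recognition uses Apple’s Speech framework (SFSpeechRecognizer, which arrived later, in iOS 10) – an app written at the time of this post would have needed a third-party recognition library instead.

```swift
import AVFoundation
import Speech

// Speaking a reply: AVSpeechSynthesizer has shipped with iOS since iOS 7.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Transcribing recorded audio: SFSpeechRecognizer (iOS 10+). The app needs
// the NSSpeechRecognitionUsageDescription key in its Info.plist, and the
// user has to grant permission at runtime.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.isAvailable
        else { return }
        let request = SFSpeechURLRecognitionRequest(url: url)
        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print("You said: \(result.bestTranscription.formattedString)")
            }
        }
    }
}
```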
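For the JavaScript engine, iOS has exposed JavaScriptCore as a public framework since iOS 7, which makes it straightforward to run a JavaScript implementation of Eliza inside the app. Another minimal sketch; the one-line elizaReply function is a hypothetical stand-in for a full Eliza script:

```swift
import JavaScriptCore

// Host a JavaScript "Eliza" inside the app via JavaScriptCore.
let context = JSContext()

context.evaluateScript("""
    function elizaReply(input) {
        if (input.toLowerCase().indexOf("mother") !== -1) {
            return "Tell me more about your family.";
        }
        return "Please go on.";
    }
    """)

// Look up the JavaScript function and call it from Swift.
if let elizaReply = context.objectForKeyedSubscript("elizaReply"),
   let reply = elizaReply.call(withArguments: ["I miss my mother"]) {
    print(reply.toString() ?? "")  // "Tell me more about your family."
}
```

The nice part of this design is that the conversational logic stays in one portable script, while the native layer only handles speech and UI.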
Over the course of the next few weeks, I will write several posts covering these topics. If there is anything that interests you in particular, feel free to add a comment.
Of course, in the end ElizaApp will be available on the App Store. In the meantime, be sure to check out ElizaApp and register for early access!