
Machine Learning — A Lesson Learned


Google's self-driving cars finally caused an accident, *after* they were updated to act more human.


A Google self-driving car has finally caused an accident, and there is a lesson in AI here.

According to The Verge, Google had recently performed a software update that changed the car's behavior to be more human-like:

“So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane.”

The truth is that one of the common complaints about self-driving cars is that they are too cautious, so Google adapted the software to have the car move to the far right of the lane, allowing two cars to fit in a single wide lane. This is what a regular, old-fashioned human driver does so that traffic can move more fluidly through the heavily congested streets of California.

All in all, a great idea. In this incident, the car detected some road construction and moved back to the "legal" center of the lane to avoid the obstructions. Even though the car was barely moving as it edged back toward the center, a bus hit the rear side of the car. I am not sure why the self-driving car is at fault here, and maybe a lawyer can chime in, but the last time I checked, anytime somebody hits you from behind it is their fault. For our lesson today, though, it doesn't matter who is at fault.

Later in the article Google says:

“We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”

This made me laugh! In essence, what Google is saying is that it doesn't matter who has the legal right of way: when something really big is coming your way, you should move.

The lesson is that humans are peculiar, and there are rules and then there are "rules." Programming practical but illegal or illogical rules is going to be part of what we expect of future AI systems. That is, unless we want to become more like machines. Between us becoming more like machines and machines becoming more like humans, I vote that the machines become more like us.
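To make the idea concrete, here is a toy sketch of the difference between a purely legal rule and a "practical" one. This is not Google's actual logic; every name, value, and threshold below is hypothetical, invented only to illustrate weighing right of way against how likely the other vehicle is to actually yield it.

```python
# Hypothetical yield likelihoods by vehicle type. Large vehicles
# (buses, trucks) are modeled as less likely to yield, echoing
# Google's statement about refining their software.
YIELD_LIKELIHOOD = {
    "car": 0.9,
    "truck": 0.5,
    "bus": 0.4,
}

def should_proceed(have_right_of_way: bool, other_vehicle: str) -> bool:
    """The 'practical' rule: proceed only if we have the legal right
    of way AND the other vehicle is reasonably likely to honor it."""
    if not have_right_of_way:
        return False
    return YIELD_LIKELIHOOD.get(other_vehicle, 0.5) > 0.6

print(should_proceed(True, "car"))  # True: right of way, car will likely yield
print(should_proceed(True, "bus"))  # False: right of way, but don't bet on a bus
```

A purely legal rule would proceed in both cases above; the practical rule waits for the bus even though the law says the bus should yield, which is exactly the kind of human judgment the article is talking about.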

Featured image is property of Mark Doliner, unmodified, used with permission.


Topics: machine learning, google, artificial intelligence, software

