Big Data and Machine Learning: The Dehumanization of Everything

There has been a lot of talk recently about the consequences of Big Data, machine learning, and the lack of data privacy. Facebook, for example, holds a lot of data on a lot of people, and Mark Zuckerberg has attended the annual NIPS conference and its panels on deep learning. So what does it all mean?

According to John Foreman, the ubiquity of Big Data and machine learning signifies a changing culture in which models built from the data gathered about individuals can be used to make educated guesses about their personalities, wants, and needs:

Data left online and in the real world form anchor points in the photo of you, from which machine learning algorithms can project the rest of your image. And as machine learning models grow in accuracy and sophistication, particularly at companies with an incentive to target ads, so does the interpolated image of exactly who you are.
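To make the "anchor points" idea concrete, here is a minimal sketch of that kind of interpolation. The data, features, and trait are all invented, and scikit-learn is assumed as the modeling library; this is an illustration of the idea, not Foreman's code:

from sklearn.linear_model import LogisticRegression

# Each row is one user's observed "anchor points": hypothetical counts of
# pages liked, late-night sessions, and ad clicks.
X_train = [
    [12, 1, 3],
    [40, 9, 25],
    [5, 0, 1],
    [33, 7, 19],
]
# A trait the platform never asked about directly (0 = no, 1 = yes),
# known for these users from some other source.
y_train = [0, 1, 0, 1]

model = LogisticRegression().fit(X_train, y_train)

# A new user has left only a few data points behind...
new_user = [[28, 6, 14]]
# ...yet the model projects the rest of the picture as a probability.
print(model.predict_proba(new_user)[0][1])

The more such anchor points accumulate, the sharper the interpolated image becomes, which is exactly Foreman's point about accuracy growing alongside sophistication and incentive.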

But Foreman's real point is the way that the data gathered from individuals can begin to shape those individuals themselves:

If an AI model can determine your emotional makeup . . . then a company can select from a pool of possible ad copy to appeal to whatever version of yourself they like. They can target your worst self . . . or they can appeal to your aspirational best self.
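In code, the selection step Foreman describes is almost trivially simple. A hypothetical sketch (the ad pool and the predicted emotion are invented, and the prediction is imagined to come from a model like the one above):

# Pick ad copy from a pool based on a model's guess at the user's state.
ad_pool = {
    "anxious": "Don't get left behind. Act now.",
    "aspirational": "Become the person you want to be.",
}
predicted_emotion = "anxious"  # pretend this came from an upstream model
print(ad_pool[predicted_emotion])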

And this behavior, Foreman argues, is self-perpetuating. An AI system builds its model from certain data, but when that model's predictions are fed back into future decisions, an echo chamber forms: new behavior is still informed by old behavior. It creates a cycle:

This is a concern with Chicago’s crime hotspot targeting model. What happens when a model shits where it eats? Police focus in on a hot spot and generate more arrests there. Those hotspots become hotter. The neighborhood gets less desirable. Education and jobs suffer. Those hotspots become hotter. The model sends more police. And on and on the death spiral goes.
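That spiral can be illustrated with a toy simulation in pure Python. All numbers here are invented: patrols are allocated wherever past arrests were made, patrols generate new arrests, and, following Foreman's argument, heavy policing slowly degrades the neighborhood itself:

neighborhoods = 5
crime_rate = [0.20] * neighborhoods   # identical underlying crime to start
arrests = [2.0, 1.0, 1.0, 1.0, 1.0]   # one small initial imbalance

for year in range(10):
    total = sum(arrests)
    # The "hotspot" model allocates 100 patrols in proportion to past arrests.
    patrols = [100 * a / total for a in arrests]
    # Feedback: heavy policing makes the area less desirable, education and
    # jobs suffer, and the real crime rate creeps up (capped at 0.9).
    crime_rate = [min(0.9, r + 0.001 * p) for r, p in zip(crime_rate, patrols)]
    # Expected arrests this year: patrols times the local crime rate.
    arrests = [a + p * r for a, p, r in zip(arrests, patrols, crime_rate)]
    share = arrests[0] / sum(arrests)
    print(f"year {year}: neighborhood 0 holds {share:.0%} of all arrests")

Even though every neighborhood starts with the same underlying crime rate, the one with a slightly higher initial arrest count absorbs an ever-larger share of patrols and arrests. The model is eating its own output.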

Ultimately, Foreman's argument comes across as a bit sensational, but given the current hot-topic status of data privacy, he's asking some important questions. What consequences will the availability of individuals' data have? Where is the trend toward machine learning and AI modeling headed? Check out Foreman's full article and see what you think.


