Big Data and Machine Learning: The Dehumanization of Everything


There has been a lot of talk recently about the consequences of Big Data, machine learning, and the erosion of data privacy. Facebook, for example, holds a lot of data about a lot of people, and Mark Zuckerberg attended the annual NIPS conference and its panels on deep learning. So what does it all mean?

According to John Foreman, the ubiquity of Big Data and machine learning signifies a changing culture in which models built on the data gathered from individuals can be used to make educated guesses about their personalities, wants, and needs:

Data left online and in the real world form anchor points in the photo of you, from which machine learning algorithms can project the rest of your image. And as machine learning models grow in accuracy and sophistication, particularly at companies with an incentive to target ads, so does the interpolated image of exactly who you are.

But Foreman's real point is in the way that the data gathered from individuals can begin to shape the individuals themselves:

If an AI model can determine your emotional makeup . . . then a company can select from a pool of possible ad copy to appeal to whatever version of yourself they like. They can target your worst self . . . or they can appeal to your aspirational best self.

And this behavior, Foreman argues, is self-perpetuating. A model is trained on certain data, but when its predictions are acted upon, the actions they trigger generate new data that reinforce the original predictions. New behavior is still shaped by old behavior, and the result is an echo chamber:

This is a concern with Chicago’s crime hotspot targeting model. What happens when a model shits where it eats? Police focus in on a hot spot and generate more arrests there. Those hotspots become hotter. The neighborhood gets less desirable. Education and jobs suffer. Those hotspots become hotter. The model sends more police. And on and on the death spiral goes.
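The feedback loop Foreman describes can be made concrete with a toy simulation. This is an illustrative sketch, not Chicago's actual model: four neighborhoods have identical underlying crime, but one starts with a slightly higher arrest count. Each round, the "model" flags the area with the most recorded arrests, extra patrols there double the arrest rate, and the new arrests feed the next prediction:

```python
def simulate(rounds=5):
    # True underlying crime is identical in every neighborhood.
    true_crime = [10.0, 10.0, 10.0, 10.0]
    # Recorded arrests; neighborhood 0 has a slight head start.
    arrests = [11.0, 10.0, 10.0, 10.0]
    for _ in range(rounds):
        # The model flags the neighborhood with the most past arrests.
        hot = arrests.index(max(arrests))
        # Focused patrols double the arrest rate in the hot spot only,
        # even though underlying crime is the same everywhere.
        for i, crime in enumerate(true_crime):
            arrests[i] += crime * (2.0 if i == hot else 1.0)
    return arrests

print(simulate())  # neighborhood 0 pulls far ahead of the rest
```

The one-arrest head start never gets corrected, because the model only ever sees arrest data it helped generate. That is the "shits where it eats" problem in miniature: the data measure enforcement, not crime.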

Ultimately, Foreman's argument comes across as a bit sensational, but given the current hot-topic status of data privacy, he's asking some important questions. What consequences will the availability of individuals' data have? Where is the trend toward machine learning and AI modeling headed? Check out Foreman's full article and see what you think.


Opinions expressed by DZone contributors are their own.
