Are We All Doomed? Your Role in the Ethics of Tech
Humans have always loved tech. But we're quite a ways from stone tools, and this new tech raises some interesting ethical conundrums for developers.
The advancement of technology has always been a driving force behind humanity. From our earliest days, we have created tools to extend our capabilities and improve our lives.
Impactful Historical Technology
We took materials around us and shaped them into useful items that helped us accomplish far more than before.
[Image © Peter A. Bostrom, 30/4/2009]
Beasts of Burden
Using animals to pull heavy farming equipment and haul goods allowed us to travel faster, farm more effectively, settle in permanent locations, and devote more time to scientific and creative pursuits.
[Image: Offut, Denton; Dun, Finlay (1830-1897); Fairman Rogers Collection, University of Pennsylvania. No restrictions, via Wikimedia Commons]
The Printing Press
When people tell me that the Internet allows us to communicate more widely and at speeds unimaginable before, I am often reminded of the printing press; I imagine people said much the same at its introduction. The printing press had a huge impact on the dissemination of ideas, and also on the democratization of literacy.
The past decades have seen an unprecedented speed of progress, creating technology after technology that promises to enhance our lives.
The Smartphone and Social Networks
The smartphone and social networks have connected communities that weren't connected before and helped the socially isolated find people like them around the globe. They have empowered minority groups by creating new business opportunities and created a world of independent ('indie') developers.
[Image © Associated Press]
Self-Driving Cars

Major shifts in transport technology have had big impacts on society, and the forthcoming automated-car revolution will be no exception. Large sections of the population will gain time for other activities instead of sitting at the wheel.
[Image by BP63Vincent (own work), CC BY-SA 3.0, via Wikimedia Commons]
And behind many of these technologies are big data, machine learning, and artificial intelligence. The potential benefits of these advances are widespread, but one of the clearest is in health. Technologies such as Apple's HealthKit have enabled medical professionals to conduct research at a scale and speed that had not been possible before, increasing participants from dozens to tens of thousands and yielding insights that are hard to get by asking patients directly.
The rest of this article will wander, but I will focus on artificial intelligence as much as possible, as it brings something new into the equation of human ingenuity. In the past we still had control: if a technology proved unsafe or unwise, we had the ability to pull the plug and adapt. If we realize the full potential of artificial intelligence, this may no longer be the case.
As engineers, we tend to create because we want to experiment, push boundaries and find out what’s possible, not necessarily because we need to.
There's a famous work of fiction called Frankenstein, by Mary Shelley. In case you are not familiar with it, here are the relevant facts. Dr. Victor Frankenstein aims to prove his experiments in reanimating dead creatures by bringing a human corpse back to life. Over the course of the book, the monster turns on its creator, and Victor comes to regret his creation and the power it has over him. In the English language at least, the phrase "Frankenstein's monster" has come to mean a creation that haunts you and that you regret.
[Image: Frankenstein's monster (Boris Karloff), by The Man in Question, CC BY-SA 4.0, via Wikimedia Commons]
There are other famous examples of this throughout history, including Albert Einstein (his involvement with the atomic bomb), Mikhail Kalashnikov (the AK-47), and John Sylvan (coffee capsules). Frequently this 'corruption' of the creator's idea had nothing to do with the creator, as not everyone shares the enthusiastic idealism of an engineer.
The examples above are older and not specific to our industry, so what about recent examples? An increasing number of developers from large tech companies have been voicing their concerns. Let's take a critical (re)look at the technologies from earlier.
A Critical (Re)look
[Image by Unuaiga (own work), CC BY-SA 4.0, via Wikimedia Commons]
There's increasing evidence that smartphones contribute to isolation, reduced face-to-face communication, fear of missing out (FOMO), loneliness, poor sleep, stress, lapses in concentration, and procrastination. More pragmatically for engineers, they have also changed how much people are willing to pay for apps.
Again, as an isolated, 'different' child, I can appreciate where social networks have helped bring together people with similar interests, but have they gone too far? Some of the negatives here are similar to those of smartphones, with extras thrown in for good measure, such as bullying, trolling, filter bubbles, data harvesting, privacy erosion, addiction, depression, and jealousy. All of which have contributed to major 'real-world' problems over the past few years, including populism, polarisation, extreme views, and more.
- Instagram is the most harmful social network for your mental health
- 6 Ways Social Media Affects Our Mental Health
- Your Filter Bubble is Destroying Democracy
Data feeds a lot of the other platforms covered so far, and as developers we have spent the past ten years building tools and applications that process, store, and analyze the vast quantities of data we now collect and use. Where is this all leading?
Yes, there are many more applications for all this data, but there is also great potential for danger: it could fall into the wrong hands, or bad 'actors' could use it to manipulate you. I recently saw a talk by the CEO of Cambridge Analytica, Alexander Nix. The talk, and how the company uses data, was simultaneously amazing and scary. He showed how the company uses data to pinpoint precisely whom campaigns should talk to, on which subjects, and how. And this is a company that is open about what it does; who knows what the more secretive ones are up to.
Whilst self-driving cars are not at full rollout yet, we are close. I will cover the impact on jobs later, but there's one other interesting area with self-driving cars that affects us far more: that of responsibility and ethics. Cars are one of the biggest killers worldwide after disease, and whilst we hope that removing humans from this equation will reduce that number, what will happen when (not if) deaths mount up?
[Image: Florida Highway Patrol / AP]
Who is responsible? The company, an individual team, a programmer, you? Even before we get to this stage, how do you even program an automated machine to reach decisions about the value of life and ethics? There are programmers tackling these decisions now, and I hope they are giving it deep thought.
I Was Following Orders
I don't want to go into too much detail here, as we could stray into dark examples from history, but as a developer you should take some responsibility for what you code and be able to justify anything you write. This level of responsibility, your ethical barometer, is up to you, and it will vary from person to person. We are lucky to work in high-demand and privileged roles, so if someone asks you to work on something you are not comfortable with, look for new opportunities.
[Image © Danomyte]
One of the obvious topics for discussion is the effect of technological advancement on job losses, not only in the directly affected industry but also in related ones. For example, with self-driving cars, job losses won't occur only among drivers, but across the myriad businesses that support them and rely on their patronage. History shows that most major technology changes have resulted in short-term job losses that recover in the long term, but we are facing a scale and speed we have not seen before, and this will be a challenge for society to keep up with.
And don't think that these losses will be limited to blue-collar roles; here is a selection of white-collar roles that technology is currently replacing:
- Financial and Sports Reporters.
- Online Marketers.
- Anesthesiologists, Surgeons, and Diagnosticians.
- E-Discovery Lawyers and Law Firm Associates.
- Financial Analysts and Advisors.
We like to think that as creative workers, we are safer. This is not true: visit willrobotstakemyjob.com and enter 'computer programmer.' You might be surprised.
Many say that this reduction in work will make us better humans and that we should embrace our free time and creativity to solve bigger problems than we can now. There is a lot of truth to this if all goes according to plan. But I worry whether this holds for the global population, and whether we are as creative as we like to think we are (compared to robots). Jobs give us purpose, we are slow learners (again, compared to robots), and it is unclear whether we could even afford to work less unless universal basic income happens, which is a whole other discussion.
- Technological unemployment
- With robots, is a life without work one we’d want to live?
- Welcoming Our New Robot Overlords
- If You Think Basic Income is “Free Money” or Socialism, Think Again
Algorithms are unbiased, but we aren't.
There have been extensive discussions on the bias and dominance of particular demographics in tech, with good reason. But this has consequences beyond the obvious ones that you may not have considered: your own unconscious biases find their way into what you build.
There are countless examples of failed automated systems, including:
- Sensor-based systems that don't detect black skin.
- Photo recognition systems that label people of African descent as gorillas.
- Photo recognition systems that label Asian people as ‘having their eyes closed.’
- An AI for recognizing ‘gay’ people. Its inventors claimed this was a research project, but even perceiving this as ‘OK’ is an example of a lack of diverse opinions.
At the moment, these gaffes range from laughable to frustrating to offensive, and we are still learning. As I mentioned earlier, we are mostly still in control: when people notice these problems, we can stop and correct them. What concerns me is the forthcoming wave of programs that write themselves. If we don't work hard to remove our biases now, the impact of these biases will grow and grow. Imagine recognition systems in robots that we rely on carrying these unconscious biases.
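To make the mechanism concrete, here is a minimal, entirely synthetic sketch of how this happens without anyone intending it. All data and group names are made up: a "recognition score" behaves differently for two demographic groups, group A dominates the training set, and a single decision threshold fit to the pooled data ends up far less accurate for the under-represented group B.

```python
import random

random.seed(0)

def sample(group, n):
    """Synthetic data: (score, label) pairs. Positive examples score
    around 0.7 for group A but only around 0.4 for group B; negatives
    score around 0.2 for both. Scores are clipped to [0, 1]."""
    centre = 0.7 if group == "A" else 0.4
    clip = lambda v: min(max(v, 0.0), 1.0)
    return ([(clip(random.gauss(centre, 0.1)), 1) for _ in range(n)] +
            [(clip(random.gauss(0.2, 0.1)), 0) for _ in range(n)])

# 90% of the training data comes from group A, 10% from group B.
train = sample("A", 450) + sample("B", 50)

# "Training": pick the single threshold that maximises accuracy on the
# pooled data. Nothing malicious happens here; the optimiser simply
# serves the majority group, because that is where the accuracy is.
best_t, best_acc = 0.0, 0.0
for i in range(101):
    t = i / 100
    acc = sum((x >= t) == bool(y) for x, y in train) / len(train)
    if acc > best_acc:
        best_t, best_acc = t, acc

# Evaluate per group on fresh samples: accuracy is noticeably lower
# for group B, even though the model was never told about groups.
accs = {}
for group in ("A", "B"):
    test = sample(group, 500)
    accs[group] = sum((x >= best_t) == bool(y) for x, y in test) / len(test)
    print(f"group {group}: accuracy {accs[group]:.3f}")
```

The point of the sketch is that the disparity emerges purely from unrepresentative data; no line of the "training" code mentions a group. That is exactly why unnoticed bias in data collection is so dangerous once systems start training and deploying themselves.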
This article scratched the surface of this topic and I am aware that I have missed a lot. I hope to revisit the topic again in the future.
I'm not saying you should stop creating, not at all. I'm not a Luddite or anti-technology. I am pro change and pro improving our lives through technology. But before creating your next project, I want you all to consider two things first.
- Because you can make it, doesn’t mean you have to.
- If you do move forward, think of the consequences first.
Don't forget unintended consequences: history is full of other people ruining the dreams of idealistic scientists and engineers. Consider the worst that could happen with your technology and work backward from there.
Points to Remember
If you forget everything else from this article, remember these three points.
1. Keep Learning and Stay Relevant
Then at least, if a robot does replace you in your job, you have transferable skills to use in new opportunities.
2. Enhance Humans, Don’t Replace Them
Our best technological advances have enhanced us as a species, not replaced us. Strive to create technologies that do the same.
3. Diversity Matters
Involve as many people from different backgrounds in your projects as you can. I delivered this post as a talk in countries that don't have much inherent diversity, and I had interesting conversations with developers about how they can introduce diversity when there isn't much locally available. My suggestion right now is to seek external advice from overseas, but this is an interesting topic I'd like to dig into in the future.
The future is up to you. Be awesome with it.
Many people are discussing this topic at the moment; here's some further reading.
Opinions expressed by DZone contributors are their own.