Safety, Privacy and Ethics—The Dilemma of the Electronics Engineer

01 September 2016

Headlines today are full of evidence that we have entered new ethical territory. We have robots and drones that kill, Apple at serious odds with the FBI, and many other examples that spotlight the pushback on safety, privacy and ethics. While we can look at each issue separately and debate it, another question looms. Does the design engineer who creates products now under the ethics microscope bear any responsibility for how they are used or misused?

One of the most compelling examples of a creator’s recognition of what he or she has unleashed is found in the words of J. Robert Oppenheimer, wartime director of the Los Alamos Laboratory and father of the atomic bomb. Recalling the bomb’s first test, Oppenheimer quoted the Hindu scripture the Bhagavad Gita: "Now I am become Death, the destroyer of worlds." Clearly Oppenheimer felt on some level the weight and responsibility of his creation. He later said he felt no remorse about building the bomb, but he believed it had not been used correctly. Do engineers even consider such factors today?

Why today is different


It is important to look specifically at two things: how today’s challenges differ from yesterday’s, and how we charge engineers with an ethical responsibility, or whether we do at all.

Technology now explodes to life in a matter of weeks to months. We rely on technology to safeguard our secrets, maintain our health records privately, hold our personal communications with friends, family and others, and more. We believe on some level that the data is protected and that it will not be used, in any way, against us.

When we look at such current cases as Apple’s refusal to create a back door to its iPhone for the FBI and other law enforcement, what is the most important aspect? Some would say that safety takes precedence over everything. Others say the company’s development engineers made a serious attempt to safeguard the privacy of clients going forward, and also had the courage to say, “No.”

When considering the NSA and its dealings, of which we are so much more aware thanks to Edward Snowden’s revelations in The Guardian, do we think only of Snowden and the leaked material? Should we consider the role of the algorithms that tap into high-security data? Was it reasonable to forecast that the algorithms would be used in this way? Are any of the engineers involved at all responsible for the results?


How about the use of a robot by Dallas police to kill a shooting suspect after officers were shot? This is believed to be the first time a robot was used by police in the US to kill a person. Yet, when discussing the event, the headlines and law enforcement consistently used the word “suspect” to describe the person killed. The justification was that other options would have further exposed the police to danger, and that such a device would be used in the future only as a last resort. But is this true?

Whether various robotics systems eventually end up in widespread use by law enforcement is not so much about the technology as it is about legal, financial and ethical policy.


Other situations and dynamics play into the issue. These include:

  • The militarization of our police forces. In spite of rhetoric to the contrary, 2014 and 2015 saw peak shipments of surplus military equipment to local police departments nationally. Are we dangerously bringing warfare electronics into our neighborhoods?
  • The ubiquitous use of surveillance equipment and recognition technologies. Sensors and cameras are virtually everywhere, as the investigation of the Boston Marathon bombing demonstrated. Does an expectation of privacy still exist?
  • We are no longer outraged. The government has “no doubt” that innocent civilians have been killed by drones, and despite actions that are checked, double-checked and triple-checked, the killing continues. How many have died? 2,400 people since 2004, and that is just in Pakistan, according to a report by the London-based Bureau of Investigative Journalism. A mere 84 of the dead were confirmed Al-Qaeda members. Without outrage, do we set the stage for default approval?


In Oppenheimer’s world, he was in charge of the Manhattan Project and tasked with producing the first nuclear weapons. He was the top dog, the one responsible, and he took on that role for the accolades as well as the criticism. Today the lines of responsibility for an end result are blurred. We make parts of weapons used in war. We write algorithms with no clue whether they will be used to protect us or for covert and illegal acts. Individual engineers are part of the big picture, often without any grasp of the intended use of the finished product. Engineers today often are not just working at arm’s length from the ultimate output of their labor, but a football field away.

The world into which we are dropping technology is also rapidly evolving. It is a dangerous place. Our ethics revolve around work ethics: how to design properly, use appropriate materials, address the needs of our clients, prevent system failures, and design and ruggedize parts that someone else will assemble. Yet we often do not know what these systems are meant to do, how they will be used, or whether they should have been built at all. We often just do not know. Or do we simply say, “I’m just an engineer”?

The charge to engineers


We teach engineers to design correctly, to be productive and to search for solutions, but we do not necessarily teach them about their ethical role. Curious, I searched for a Code of Ethics and found it in the IEEE Policies, Section 7 - Professional Activities (Part A - IEEE Policies):

“We, the members of the IEEE, in recognition of the importance of our technologies in affecting the quality of life throughout the world, and in accepting a personal obligation to our profession, its members and the communities we serve, do hereby commit ourselves to the highest ethical and professional conduct and agree:

  1. To accept responsibility in making decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment…”

I am curious as to what today’s engineers think about the topic, or if it is considered at all. It seems that we are charged with responsibility for the creation of a product and the safety of its use; however, what does that mean to us?

As technology continues to move at breakneck speed, finding an ethical consensus will remain difficult. We will all decide what is important for us—or choose to remain in the proverbial dark.

If you have thoughts on the question, please consider commenting. Maybe we can initiate a lively discussion.


