Ethics? In Computer Science?

23 Apr 2020

What Does It Have To Do With Software Engineering?

At first, when thinking about ethics in the context of Software Engineering, I didn’t know what to think. When I looked at the word “Ethic”, I thought of “Ethical”, as if there were some morally right or wrong way to do Software Engineering. But in reality, it means much more. To have ethics in Software Engineering, or in any job really, is to understand the impact that your work has on the world, and to take responsibility for your actions if anything happens as a result of your work. Morality is still part of it too: being able to look at your work and determine whether it is something that you and other people can look back on and be proud of. Some may not think ethics is very important for a Software Engineer, but computers have become so deeply ingrained in so many fields of work that errors in code can lead to disaster, especially in critical fields such as medicine or engineering.

Autonomous Cars and Their Ethical Implications

Autonomous Cars have always fascinated me. The thought of a car that drives itself appeals to my inner laziness, which desires a machine that will deal with traffic for me. That being said, looking at the case study really opened my eyes to the ethical issues that Software Engineers face when creating such a vehicle. In essence, the case study asks: if an automated car encounters a situation where an accident is inevitable, what should the car prioritize, saving the life of the driver, or saving the life, or potentially multiple lives, of the other party involved?

The results showed that people believed the car should take the path that leads to the fewest casualties. For example, if the car’s options are to run into ten people on the road or one person on the sidewalk, the car should swerve toward the lone passerby rather than the larger crowd. The results stayed relatively the same even when respondents imagined themselves as the driver and had to potentially sacrifice themselves if it meant saving ten people.
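To make that decision rule concrete, here is a minimal sketch in Python of what “take the path with the fewest casualties” could look like. It is purely illustrative: the names (Path, choose_path, expected_casualties) are my own invention, and no real autonomous vehicle is programmed this literally.

```python
from dataclasses import dataclass

@dataclass
class Path:
    """A hypothetical maneuver the car could take."""
    description: str
    expected_casualties: int  # estimated lives lost on this path

def choose_path(paths: list) -> Path:
    """Pick the path with the fewest expected casualties,
    regardless of whether one of those casualties is the driver."""
    return min(paths, key=lambda p: p.expected_casualties)

# The scenario from the study: a crowd of ten versus one passerby.
options = [
    Path("continue straight into the crowd", expected_casualties=10),
    Path("swerve toward the lone passerby", expected_casualties=1),
]
print(choose_path(options).description)  # -> "swerve toward the lone passerby"
```

Even this toy version makes the dilemma visible: nothing in the rule treats the driver’s life differently from anyone else’s.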

The issue comes with the people being studied: they did not believe that the companies would actually implement such programming. After all, how many people would be willing to buy a car that is programmed to kill them? This is the ethical dilemma that the software engineers face. If the car is given the program, the software engineer has to live with the knowledge that they created code that could one day kill someone; if it is not, they have to live with the knowledge that the cars they programmed are machines that will run down any number of people in their way. Neither situation is one to look back on and be proud of, and both fall under their responsibility as the people who developed the thinking process of the vehicle.

My Stance

After thinking about the situation, I believe automated cars should be built with the self-sacrifice program. I would like to say that neither option should be chosen, and that we are simply not ready for automated cars yet, but with companies like Tesla already offering Autopilot features, automated cars are inevitable; companies have put too much money into them to just stop development. The self-sacrifice program ensures that the fewest casualties occur, and it also works in line with my statement that we aren’t ready for automated cars: if people knew the cars were programmed to potentially sacrifice their driver in certain dire situations, not many would buy them. This can be seen in the study, where the people surveyed supported the self-sacrificing program but did not intend to buy such a car themselves. They agreed it should be done for the greater good, yet were unwilling to be the ones driving it.

This would also be the best outcome for the software engineer, because unlike the alternative of not giving the car the program, this program would potentially save lives. Of course, if I had the final say in all of it, I’d say not to develop automated cars at all, leaving the lives of drivers and others in the hands of the drivers themselves, and leaving the software engineer’s conscience at peace.