Software engineering currently has no formal code of ethics, and the need for one becomes more apparent with each passing day.
Software is everywhere in our lives. Yet software engineers have no shared ethics or standards to speak of.
Is that okay, or do we need to do something about it? If we do, what might that be?
We have no answers either, but we bring you a set of ideas and thoughts from Robert C. Martin, better known as “Uncle Bob,” as a conversation starter. This blog post is based on Karolina Toth’s interview with him on episode 59 of the Level-up Engineering podcast.
I’ve been in this industry for over half a century, so it matters to me how the world perceives the character of software developers.
The software industry is still young. The first code to execute on an electronic computer was written in the 1940s, so less than a lifetime ago. We haven’t had enough time to understand the ethical impact of building software.
This impact was relatively small for the first 50 years. Computers were far away from people most of the time.
This has changed dramatically. Now, computers are everywhere; nothing happens in our world without them. Our civilization depends on software, so it depends on the programmers building that software.
Currently, programmers have no code of ethics, no set of standards, and no set of disciplines that the majority agrees to be the right way to write software. I consider this an unstable situation.
Currently, software developers carry the lifeblood of civilization. Nothing can function without software, and developers' behavior isn’t in line with that responsibility. I want us programmers to behave in a stalwart way, and acknowledge our responsibility, which is likely to keep growing.
There have been high-profile calamities caused by software over the past decades. Software developers need to start discussions about what may be done before a disaster happens that takes control out of our hands.
For example, most modern cars have some level of automatic control. There are computers exerting that control over the head of the driver or in response to the driver. This level of control raises ethical questions even before we build fully autonomous cars.
I expect a disaster to happen eventually that overtakes politics and causes legislation to constrain the software industry. We need to get ahead of this and be ready with a code of ethics and a set of standards by the time the politicians come to regulate us.
There are conversations about the possibility that computer programs will eventually write their own programs. We may get to a point where you tell the computer what you want your software to do and it’ll automatically write the software. In my view, this is a way of trying to avoid our responsibility.
I fly a small airplane, and it has a great autopilot onboard that can fly the plane for virtually the entire trip. If the autopilot malfunctioned and crashed the airplane, every resulting death would be my responsibility as the pilot. The machine can’t be responsible; responsibility always rests with the pilot supervising it.
The same is true in software engineering: the software can’t be responsible; it has to be the engineer behind it.
Currently, the software industry doesn’t have a set of quality standards, and people are dying due to software errors in machines. This can get a lot worse. We need disciplines and standards that define the requirements for software built for different devices, from a microwave oven to a jet airliner.
It’d be great to have a guarantee that the software in our planes will only do what’s necessary. There are independent bodies trying to come up with standards, but they’ve had limited success so far.
There is a certain joy to finishing a day of work knowing that you’ve done a good job. You look in the mirror and feel proud of yourself. Too many software developers don’t experience this, but they should, and ethical behavior can get you there.
The most common ethical problem is deploying software that the programmers aren’t sure will function correctly. They go along with it because they feel they’re under economic pressure to release as early as possible, so they give up on releasing pristine code for the sake of keeping their jobs.
A set of quality standards may solve this. Standards may be different for a microwave oven and a jet airliner, but we need to draw clear lines.
When developers know they’re doing something wrong, they tend to shift the responsibility onto someone else, often their manager. They tell themselves that the schedule didn’t give them enough time, so it isn’t their problem that the software ended up poorly built.
You can’t hide behind your company and have them deal with all the legal issues for the consequences of you releasing bad code. This attitude is fundamentally unprofessional. In my opinion, this does the most damage, so this is what software engineers should fight against the most.
Engineers should never deploy software that is substandard, may malfunction, or will be difficult to change due to excessive technical debt. We need to adopt a professional attitude and refuse to let schedule demands override quality.
You can’t use the excuse that you’re just building a video game or a thermostat, so it doesn’t matter. Everything matters. We need to develop the proper emotional attitude to building software; that’s the role of software engineering ethics.
It’s always beneficial to release ethical software, in my opinion.
The problem isn’t that managers want to release substandard software. Ask any organization or any manager whether they want to ship quality software, or software that might crash and kill people. The answer is obvious.
The problem is that they don’t have a way to determine the quality of software’s construction. The only people who know that are the developers themselves.
Developers need to find the courage to tell their managers and their organizations that a piece of software can’t be released. Even if it appears to work, it may not be ready. Architects do the same, and we trust them: we don’t move into a house unless they say it’s safe.
We need to do the same in the software industry. The final authority to decide whether software is ready to ship must lie with the engineers, not with the organization.
Quality software works according to customer demands. It performs the job it was built to do without risk of irregular behavior or crashes. The programmers who built the software have to know that it’ll correctly do what it’s designed to do.
Quality software has to be easy to change. It mustn’t impose a high cost on the customers or the owners when they need to implement changes to it.
Most modern devices, even household appliances like refrigerators, connect to the internet. That makes them potentially vulnerable to attacks and other kinds of internet malfeasance, which makes security an essential aspect of quality software.
I hope a licensing system for software engineers isn’t the right approach, but it’s possible. Setting up a central organization that polices a basic set of ethics and standards, similar to the bodies that license lawyers or doctors, might work.
I’d prefer a distributed approach.
Programmers could coalesce into guilds adhering to their own set of standards and ethics; this seems like a better option to me. That would allow the market to decide which guilds they prefer to use for different types of software.
I’m also intrigued by the idea of a completely distributed mechanism. It could work like a blockchain where the ability to recognize a software developer as ethical is somehow distributed peer to peer. I’m not sure how that might work exactly, but we’re in the age of applying distributed systems of trust.
The key is always individual responsibility. Every doctor or lawyer is individually responsible for their work. Software engineers shouldn’t be able to devolve blame to their managers saying they were following orders.
I don’t like the idea of a central authority imposing a set of standards and ethics on the software industry. In my opinion, any set of ethics has to be policed at the lowest possible level.
Every software developer has to check their releases themselves. In every team, the tech lead has to police the team. Every company has to police its own codebase.
Software engineers need to set themselves up as the final authority on whether or not the software is ready to be deployed. We need a set of standards to measure that.
On the lowest level, it’s a matter of learning the techniques. There are many popular methods for creating robust software that’s easy to change, well tested, and reliable. We’ve developed these disciplines and the standards over the past several decades.
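One such discipline is writing automated tests alongside (or before) the production code, so the software’s correctness is checked on every change rather than asserted on faith. Here is a minimal, hypothetical sketch of that idea; the function and its behavior are invented purely for illustration, not taken from any of the disciplines named in this post.

```python
# A tiny, test-first sketch (illustrative only): the assertions below act as
# executable documentation, so the code can't silently drift from its spec.

def split_fare(total_cents: int, passengers: int) -> list[int]:
    """Split a fare evenly, handing out any remainder one cent at a time."""
    if passengers <= 0:
        raise ValueError("passengers must be positive")
    base, remainder = divmod(total_cents, passengers)
    # The first `remainder` passengers each pay one extra cent.
    return [base + 1 if i < remainder else base for i in range(passengers)]

# Tests written against the intended behavior, not the implementation:
assert split_fare(100, 4) == [25, 25, 25, 25]   # even split
assert split_fare(101, 4) == [26, 25, 25, 25]   # remainder distributed
assert sum(split_fare(999, 7)) == 999           # no cents lost or invented
```

The point isn’t this particular function; it’s that a suite of tests like these is what lets an engineer say, with evidence rather than hope, that the software does what it was designed to do.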
We need to understand that the risks are higher now than ever. Many programmers got into the business by writing some code for fun and thinking it was cool. Now, we’re faced with the responsibility that our code may interfere with lives, destroy fortunes, and kill people.
I’ve written a book called Clean Craftsmanship, proposing a set of disciplines, standards, and ethics. It’s fundamentally a technical book, but it also covers behavioral ideas. I recommend that everyone start by reading it to see whether it resonates with them.
Make sure to adjust your behavior to follow the standards, disciplines and ethics you adopt. Make them the fundamental principles of everything you do in your career.
We need a widespread discourse on software engineering ethics to start moving toward a solution.
Thinking and talking about this topic is the best way to get the discourse going. You can form groups inside your company, like a lunchtime group or a Saturday group where you get together with your peers to share your ideas.
You can discuss software ethics with your managers as well. Explain to them that you can’t guarantee high-quality software without sufficient time to build it. They will make valid points about the importance of a schedule, but you can also communicate the importance of quality to them.
This starts a negotiation about balancing quality, time, and ethics, which rarely takes place in the current environment.
As the discourse continues, hopefully, it’ll reach customers as well.
Companies could come out saying, “Our developers have adopted a set of disciplines, standards, and ethics. It guarantees improved security, comfort, and value in our products.” This may make businesses realize that their main focus doesn’t have to be to release quickly, but that they can sell quality as well.
This is a race. A catastrophe could happen any day. We’ve already had several big incidents.
The number of incidents is growing, and they’re getting closer to our everyday lives. Twenty to thirty years ago, these were limited to stories about losing space probes to faulty software.
Today, you see two Boeing 737 MAX airliners fall out of the sky due to a software malfunction that the developers should have seen coming.
Today, you see malfeasance from automobile companies whose software detects when the car is being tested for emission levels, adjusts the engine to pass the test, then reverts to heavy emissions. Some of the developers involved in that went to jail, and I consider that a reasonable outcome.
Eventually, there will be an incident that shakes the world and makes everyone think that something has to be done about software development. Hopefully, the industry can get ahead of this before it occurs. We’d better be ready with a set of recommended ethics, ideally one we’ll already be following by then.
About the author:
Gabor Zold is a content marketer and tech writer, focusing on software development technologies and engineering management. He has extensive knowledge about engineering management-related topics and has been doing interviews with accomplished tech leaders for years. He is the audio wizard of the Level-up Engineering podcast.