
What would coders do in ethical grey areas?

Software developers are writing code that affects the lives of millions of people and how businesses are run, but what do coders think about some of the ethical questions they face?

As modern civilisations become ever more reliant on software for everyday life and for running businesses, the people who write that code increasingly face ethical questions.

A global survey of thousands of coders revealed where they stand on some of the complicated ethical questions related to today’s software development.

The Stack Overflow annual developer survey of 100,000 coders across 183 countries revealed that around 40% of software developers would consider writing code for unethical purposes in certain circumstances.

“Ethical situations can be complicated. Only tiny fractions of developers said they would write unethical code or that they have no obligation to consider the ethical implications of code, but beyond that, respondents see a lot of ethical grey,” said the report. 

“Developers are not sure how they would report ethical problems, and have differing ideas about who ultimately is responsible for unethical code.”

While 58.5% of coders said they would not write code for unethical purposes, 36.6% said they would consider it depending on what it was for, and 4.8% said they would write it.

When it comes to reporting ethical problems with code, 46.5% said whether they reported a problem would depend on what it was, 35.7% said they would report it, but only within the organisation where the code is used, more than 13% said they would report it publicly, and 4.6% said they would not report it at all.

Most coders (79%) feel obligated to consider the ethical implications of their code, while only 6% feel they have no such obligation. The remainder were unsure, and these respondents were 40% more likely to say they did not need to report ethical problems.

Just under 20% of coders think they are responsible if the code they write is used for unethical purposes, while 57.5% think upper management of the organisation that uses it is responsible, and 22% blame the person who came up with the idea.


The Stack Overflow survey also found that only about a quarter of respondents believe a government or other regulatory body should be ultimately responsible for considering the ramifications of artificial intelligence (AI).

The survey showed there are fears around the singularity, where AI could self-improve and eventually surpass human intelligence, and around fairness in decisions made by algorithms. Almost half (48%) of coders believe the people creating the AI should take responsibility.

The use of AI will not only transform how people live, but also the work they do. This could lead to a period of instability as people are replaced by robots, mainly of the software variety, but potentially also by humanoid robots in the future.


Join the conversation

3 comments

But what is "unethical"? Is it the shopping cart software that knows all of the discounts that could apply to your order, and could select the best one, but only applies a discount if you enter the correct code? Or is that just not "friendly"? Unless the survey was more specific about the definition of "unethical", it is no wonder there was so much ambiguity in the responses.
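For example, here is a minimal sketch of the two checkout behaviours described above; the names and discount values are hypothetical, purely for illustration.

# Hypothetical discount table and cart totals, for illustration only
DISCOUNTS = {"SAVE10": 0.10, "SAVE25": 0.25, "LOYAL5": 0.05}

def total_with_entered_code(subtotal, entered_code=None):
    # Applies a discount only if the customer happens to type a valid code
    rate = DISCOUNTS.get(entered_code, 0.0)
    return subtotal * (1 - rate)

def total_with_best_discount(subtotal):
    # The cart already knows every code, so it could simply pick the best one
    best_rate = max(DISCOUNTS.values(), default=0.0)
    return subtotal * (1 - best_rate)

print(total_with_entered_code(100.0))            # 100.0 - no code entered, full price
print(total_with_entered_code(100.0, "SAVE10"))  # 90.0
print(total_with_best_discount(100.0))           # 75.0 - what the cart could have done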
Indeed. Yes, there is a link to the survey in the story. Take a look; it might shed some light on it. You are right, unethical can mean many things.

Thanks

Karl
Here is the survey results link: https://insights.stackoverflow.com/survey/2018
