
Ethical software development: Ask Uber and Volkswagen

Following TfL's decision against Uber, we investigate the role of professionalism and ethics in software development

Uber is the latest company to be caught using software to evade official audits and tests.

Among the reasons Transport for London (TfL) gave in September 2017 for not renewing Uber’s licence to operate in London was the software the app-based taxi firm developed to avoid officials inspecting its drivers.

While newspaper commentary has largely been about the Licensed Taxi Drivers Association, which represents black cab drivers in London, lobbying TfL against Uber, an important part of its decision was Uber’s stealth software.

This is not the first time a company has been found to have written software explicitly to get around official tests and audits.

In September 2015, Volkswagen was found to have modified its engine management software to detect when its diesel cars were being run on an official emissions test, so that it could dial down the emissions for the duration of the test. The carmaker effectively wrote software specifically to cheat, according to the New York Times, which wrote: “Volkswagen admitted that 11 million of its vehicles were equipped with software that was used to cheat on emissions tests.”

The newspaper reported that an on-road test conducted by West Virginia University found that some cars emitted almost 40 times the permitted levels of nitrogen oxide, prompting the California Air Resources Board to investigate Volkswagen.
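In principle, the mechanics of such a “defeat device” are straightforward. The sketch below is purely illustrative, not Volkswagen’s actual code: it imagines engine management software inferring that a car is on a laboratory dynamometer (wheels turning, steering untouched) and selecting a cleaner calibration only then. All names and thresholds are invented.

# Illustrative sketch only -- not Volkswagen's actual code.
# Imagines how a hypothetical "defeat device" might switch emissions
# calibration when sensor readings resemble a laboratory test.
from dataclasses import dataclass

@dataclass
class SensorReadings:
    wheel_speed_kmh: float     # speed of the driven wheels
    steering_angle_deg: float  # driver's steering input

def looks_like_dyno_test(s: SensorReadings) -> bool:
    """Heuristic: on a dynamometer the wheels turn but the car is
    never steered -- a pattern rarely seen in real-world driving."""
    return s.wheel_speed_kmh > 20 and abs(s.steering_angle_deg) < 1.0

def select_emissions_mode(s: SensorReadings) -> str:
    # The ethical failure lives in this one conditional: full NOx
    # controls are applied only when the software believes a
    # regulator is watching.
    if looks_like_dyno_test(s):
        return "full_nox_controls"    # passes the official test
    return "reduced_nox_controls"     # dirtier, but used on the road

The point is not the particular heuristic but the structure: a single conditional that behaves one way under audit and another in normal use, which is the pattern regulators identified in both the Volkswagen and Uber cases.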

Looking at TfL’s decision against renewing Uber’s licence to operate in London, among its concerns was the use of so-called Greyball software, which geofences government and official buildings.

Ethics: How to handle a difficult conversation

“Some people have raised concerns and are uncomfortable with the ethical position they face. I have spoken to BCS members, and when these things come up, they use their chartered status, in a similar way to an accountant, to say their professional body won’t be happy with this practice. A grown-up professional needn’t scream blue murder. You can take a softer line by saying something like ‘you wouldn’t want me to do something illegal’.”

David Evans, BCS director of policy and community

The software reportedly presents an alternative version of the app to people trying to book a ride from outside these buildings, and is used to prevent officials from booking an Uber ride.
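Geofencing itself is a routine technique; what made Greyball controversial was the purpose it served. As a purely hypothetical illustration (Uber’s real implementation has not been published, and the names, coordinates and radii below are invented), a geofence check can be as simple as a distance test against a list of flagged locations:

# Illustrative geofence check -- names, coordinates and logic are
# hypothetical; Uber's actual Greyball implementation is not public.
import math

# Hypothetical flagged locations: (latitude, longitude, radius in metres)
OFFICIAL_BUILDINGS = [
    (51.5074, -0.1278, 300),  # e.g. a government office
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(lat, lon):
    """True if a ride request originates within any flagged zone."""
    return any(
        haversine_m(lat, lon, b_lat, b_lon) <= radius
        for b_lat, b_lon, radius in OFFICIAL_BUILDINGS
    )

A request flagged in this way could then be served a different view of the app, which is the behaviour regulators objected to.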

Other cities have been concerned about the use of Greyball software. In a blog post, Gerald Gouriet and Charles Holland of barristers’ chambers Francis Taylor Building described Uber’s Greyball program as a method of identifying regulatory staff using the customer app and thereby avoiding regulatory activity, and highlighted the case of New York.

“Uber initially robustly defended the program, but after six days, announced it would be withdrawn,” the pair wrote.

The US city of Portland recently published an audit into Uber’s use of Greyball software, confirming that the transport company had admitted using it. “In a letter dated April 21, 2017, Uber’s counsel provided their second response. In this response, the company admits to having used the Greyball software in Portland for a two-week period, from 5 December to 19 December 2014 against 17 individual rider accounts,” the audit report stated.


The records provided by Uber show three of those individual riders actively requested and were denied rides on the Uber platform, the audit stated. The company said it would not engage in similar efforts to evade regulators in future.

But as Computer Weekly’s sister title TheServerSide notes, the company’s record of unethical practices in software development points to a culture of contempt among managers.

On her blog about sexual harassment at Uber, Susan Fowler wrote about a “toxic culture” in the company, where managers refused to cooperate with one another. “I remember a very disturbing team meeting in which one of the directors boasted to our team he had withheld business-critical information from one of the executives so he could curry favour with another,” she wrote.

There is also the case of Uber’s God View tool, which infringed users’ privacy by collecting data about their location even when the Uber app was not in use.

Charging more than you need to

Beyond Uber and Volkswagen, examples of unethical coding include overcharging clients, producing poor quality code, or stealing intellectual property.

In a post on the open source repository GitHub, one developer has sought to raise the profile of coding ethics, describing how an employer once asked him to change the value of refund vouchers on an e-commerce site to make the refunds worth less.

The coder wrote: “I think we need to establish a code of ethics for programmers. Doctors, social workers and even lawyers have a code of ethics, with tangible consequences for skimping on them. Why not programmers as well?

“I want to live in a world where a programmer who hasn’t agreed to follow our code of ethics has a hard time getting employed. It is simply not acceptable to write code that is harmful to users. What the hell is wrong with these people?”

The Association for Computing Machinery’s ethics statement says: “Software engineers shall approve software only if they have a well-founded belief it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment. The ultimate effect of the work should be to the public good.”

Ethics in software engineering is also an area the BCS, The Chartered Institute for IT, has looked into. The BCS Code of Conduct for its members states: “You shall have due regard for public health, privacy, security and wellbeing of others and the environment.”

Responsibility to society

David Evans, BCS director of policy and community, believes an overriding outcome in the domain of computing should be to benefit society and improve human wellbeing. For organisations that value customer relationships, ethics is very important. “In the academic world, ethics is top of the checklist,” he says.

But working in an ethical manner can be challenging. “The idea of public benefit or human wellbeing turns ethics into a misplaced concept,” says Evans. “You can lose the reason why you do it. We want professionals who do things that do not cause harm to others, and we also want our IT team to understand the effects of what they do.”

The value of working ethically should, according to Evans, be ingrained in corporate culture, including IT and software development. He says organisations benefit if IT understands the human impact of what it does.

The challenge for people working in IT is that the impact of their work can be quite abstract, says Evans. “It is hard enough to think about what is illegal. It’s harder to get people to understand how their work will impact other people.”

Data protection implications

A case in point is the Data Protection Act. A business may wish to use its customers’ data in certain ways to drive new opportunities.

“I’ve seen reputable companies celebrating tech success when their developments are in breach of the Data Protection Act,” says Evans. “Ethics may constrain you from doing things that may make money.” He argues that data sharing is not an ethical question: “It is the actual law.”

For the BCS, ethics goes hand in hand with professionalism. The software industry appears to operate without much regard for its impact on individuals and businesses. “A construction company cannot build a huge dam without consultation,” he says.

“We will need this in software, but the problem with Silicon Valley is that a small startup in a bedroom can disrupt major industries around the world. Dialogue becomes necessary.”

AI and ethics

The industry is now entering the dawn of machine learning, where artificial intelligence (AI) is used to process vast amounts of personal data and then make decisions without the vagaries of human decision making.

Ethics, as it relates to AI, is among the topics author, broadcaster and tech philosopher Tom Chatfield will be speaking about at the InterSystems Technology Summit on 18 October.

“We are busy translating the fabric of our societies into something machine-readable: into data on a scale that only machines can handle, and that in turn will fuel the next generation of machine learning,” he says.

Chatfield says there are two points to consider as the world translates more into the digital domain: the quality of the translation, and its capacity for iteration and improvement.

“The exponentially increasing volumes of data handled by our tools can, when used well, feed the actionable small data and intuitive insights human lives thrive upon – but they can also create a locked-down world in which decisions occur beyond our scrutiny,” he says.

For Chatfield, this is the difference between tools that can make integrated health records available anywhere, at the touch of a button, and tools that deny someone insurance based on an inscrutable algorithmic reading of their life.

This was last published in October 2017
