The world needs to set norms around behaviour in cyberspace, says Scott Charney, corporate vice-president of the Trustworthy Computing Group at Microsoft.
However, the technology industry cannot wait for norms and needs to make fast and hard choices, he told attendees of RSA Conference 2014 in San Francisco.
In the absence of norms, the technology industry has to be principled about what it does, and the principles of security, privacy and transparency have served Microsoft well, said Charney.
“Because we have had them for a long time I have not been overly concerned about disclosures about what my company may or may not have done.
“So when people say are you worried about more Snowden disclosures, I can say I am not because we have been principled,” he said.
Adherence to the principles of security, transparency and privacy means that Microsoft does defence and not offence, said Charney.
It also means Microsoft does not put back doors in its products and services, which in any case would be economic suicide, he said.
“People have asked if our Defender anti-virus product will raise alerts if it finds government surveillance software, but the answer is simple. We don’t care what the source or the motive of malware is.
“If it is malware, we notify people. If you have good principles and keep it simple, it works fine,” said Charney.
Referring to concerns around the customer data that Microsoft is forced to disclose to governments, he said the company responds only to court orders that specify particular accounts.
“We have never received an order for bulk data, but we would fight an order for bulk data that does not relate to a specific and identifiable account,” said Charney.
In line with its principle of transparency, he said the company runs a government security programme to address concerns about the integrity of supply chains.
“We make our source code available to governments because they are concerned about back doors in products from an American company, and the answer is to be transparent and show people the code,” he said.
Microsoft also issues transparency reports on the law enforcement requests it receives, a practice it and other companies started long before the Snowden disclosures, said Charney.
“Because of government rules we are not able to talk about things like national security letters, which is why Microsoft and other tech companies sued the government to be more transparent,” he said.
The suit was dismissed, but only after Microsoft and its alliance partners won the right to disclose ranges of national security-related data requests they receive.
In addition to guiding principles, in the absence of norms, Charney said the world needs a model for understanding actions in the digital world.
He is proposing a model for all types of scenarios that examines actors, objectives, actions and impacts.
For example, in airport security, the actor is the government, the objective is to keep the plane safe, the action is to send people through a metal detector, and the impact is good.
Applying the model to the Snowden disclosures, the actor is the government, the objective is to stop terrorism, and the action was to put everybody under surveillance.
“But the tricky part is considering whether the impact on privacy, security, trust in the internet, and on terrorism was worth it,” said Charney.
Applying the model to economic espionage, the actor is a government and the objective is to steal IP to benefit domestic industries. “But if you believe the objective is wrong, stop,” he said. “No action justifies the objective, because if the objective is wrong you should not be doing it.
“When considering any action, ask who is doing it and what is the objective. If the objective is bad, stop. If it is good, look at the action and then look at the impacts.
“That is the complex part as there will always be positive and negative impacts. It is a matter of balancing the cost with the benefit as in any risk assessment,” he said.
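Charney's four-part model is a decision framework rather than software, but its logic can be sketched in code. The following is a purely illustrative sketch (the class, field names, and scores are assumptions, not anything Microsoft has published): reject a scenario outright when the objective is illegitimate, and otherwise weigh positive against negative impacts, as in any risk assessment.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """One scenario in Charney's actor/objective/action/impact model (illustrative)."""
    actor: str
    objective: str
    objective_is_legitimate: bool  # the judgment call the model asks you to make first
    action: str
    impacts: dict  # impact name -> score; positive = benefit, negative = cost

def evaluate(scenario: Scenario) -> str:
    # "If the objective is bad, stop" - no action justifies a bad objective.
    if not scenario.objective_is_legitimate:
        return "stop"
    # Otherwise balance costs against benefits across all impacts.
    return "acceptable" if sum(scenario.impacts.values()) >= 0 else "unacceptable"

# The airport-security example from the talk, with invented scores.
airport = Scenario(
    actor="government",
    objective="keep the plane safe",
    objective_is_legitimate=True,
    action="send people through a metal detector",
    impacts={"safety": 2, "inconvenience": -1},
)
print(evaluate(airport))  # acceptable
```

The hard part, as Charney notes, is not the mechanics but assigning the scores: whether the impact on privacy, security, and trust in the internet outweighs the benefit is exactly the contested judgment the model is meant to structure.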
Charney said he is optimistic because although the problems faced by society in the digital world are difficult, the world has faced difficult problems before.
“We have to have the right way to think about them. We have to push global normative behaviour. We have to think about what actions are appropriate or not, and we have to look at impacts.
“If we do that, when the next disclosure occurs regardless of what it is about, we will have a framework for thinking about it – a framework for deciding as a civil society whether we accept it or not, and then we will take appropriate steps,” he said.