Industrial robots a security risk, warns IOActive
Some of the most popular industrial and domestic robot brands have vulnerabilities that could be exploited by cyber attackers, a research paper warns
A hacked robot can act as an insider threat in organisations, industries and homes, according to research by security consultancy IOActive.
The research report, a follow-up to research published in March, for the first time names robot suppliers and the associated vulnerabilities that cyber attackers can exploit.
Publication of the latest report comes just a day after CEOs from 115 technology companies signed an open letter urging governments to address their concerns over the development and use of fully autonomous weapons. This echoes the growing number of warnings given by business leaders and artificial intelligence (AI) experts on the consequences of misusing AI.
Robots’ capabilities can be subverted and used for multiple purposes by outsiders who exploit remote vulnerabilities, the IOActive research report warns.
The researchers found that because many robots use multicast DNS to advertise their presence on the network, it is relatively easy to find a robot’s host name.
In addition, because some robot services do not require authentication, any user on the network can issue commands to perform actions or disable safety features.
This means that humanoid robots, such as Nao and Pepper from SoftBank Robotics, can be used by attackers to capture video and audio and leak it to external servers controlled by attackers. According to the report, around 20,000 of these humanoid robots are already in service worldwide.
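The discovery step the researchers describe, finding robots that advertise themselves via multicast DNS, can be illustrated with a short sketch. This builds a raw mDNS PTR query of the kind any host on the local network can send; the function name is hypothetical and nothing here assumes any vendor-specific protocol:

```python
import struct

def mdns_ptr_query(service: str) -> bytes:
    """Build a raw mDNS PTR query packet for the given service name.

    Sent to the well-known multicast address 224.0.0.251 on port 5353,
    such a query asks every device on the local network segment to
    announce the named service -- which is why mDNS-advertised robots
    are easy to locate.
    """
    # DNS header: ID=0, flags=0, one question, no answer/authority/additional records
    header = struct.pack(">HHHHHH", 0, 0, 1, 0, 0, 0)
    # Encode the dotted service name as length-prefixed DNS labels
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in service.split(".")
    ) + b"\x00"
    # QTYPE=12 (PTR), QCLASS=1 (IN)
    question = qname + struct.pack(">HH", 12, 1)
    return header + question

# Actually sending the query requires a live network, so it is shown
# only as a comment:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(mdns_ptr_query("_services._dns-sd._udp.local"),
#             ("224.0.0.251", 5353))
```

The point of the sketch is that no credentials are involved at any step: discovery is open by design, so if the services found this way also accept unauthenticated commands, any user on the network can drive the robot.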
The report also notes that because UBTech Alpha robots do not encrypt potentially sensitive or critical information before storage or transmission, this data can be used by attackers.
In addition to being an insider threat, the report warns that physical attacks are possible because many collaborative robots (cobots) interact and are in close proximity to humans. Attackers could access the robots to modify their behaviour, the report said.
Read more about robots
- Robo-advisers are crossing the Atlantic to UK financial services in the form of IT-dominated startups.
- Estonian robotics startup Starship Technologies secures permission to trial self-driving delivery robots in Greenwich, south-east London.
- The Hilton Worldwide hospitality chain is trialling a robot concierge named Connie, backed by IBM’s cognitive computing programme Watson.
Cobots, such as those manufactured by Universal Robots, are industrial robots that work alongside humans rather than being confined to cages, and usually include collision detection to prevent injury.
However, IOActive researchers found that these robots can be hacked remotely and their safety features disabled so they can be used to injure humans around them.
Even running at slow speeds, their force is more than sufficient to cause a skull fracture, according to IOActive. Safety features in Nao and Pepper could also be disabled remotely, the researchers found.
According to the report, it is possible to remotely upgrade UBTech’s Alpha 1S firmware by sending an undocumented command over Bluetooth.
But because binaries from UBTech are not cryptographically signed, they could be replaced by malicious files that change the normal behaviour of the robots or even put the robot out of commission so that it cannot be fixed.
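The kind of check whose absence the report highlights can be sketched in a few lines. This is a minimal illustration of firmware integrity verification, using an HMAC with a hypothetical shared key for simplicity; a real update mechanism would use public-key signatures, with the private key held only by the vendor:

```python
import hashlib
import hmac

# Hypothetical vendor signing key, used here only to illustrate the
# principle. In a real scheme the device would hold a public key and
# the vendor the private half.
VENDOR_KEY = b"example-signing-key"

def sign_firmware(image: bytes) -> bytes:
    """Vendor side: produce a tag over the firmware image."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes) -> bool:
    """Device side: reject any image whose tag does not match.

    This is the verification step that an update path accepting
    unsigned binaries skips, letting an attacker substitute a
    malicious image.
    """
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"legitimate firmware image"
tag = sign_firmware(firmware)
assert verify_firmware(firmware, tag)          # genuine image accepted
assert not verify_firmware(b"malicious", tag)  # tampered image rejected
```

With a check like this in place, a swapped binary fails verification before it is ever flashed; without it, the device has no way to tell a vendor build from an attacker’s.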
In response to the IOActive report, UBTech said in a statement: “UBTech is committed to maintaining the highest security standards in all of its products. As a result, the company has conducted a full investigation into the claims made in the IOActive report regarding the Alpha 2 robot.
“The Alpha 2 robot was designed to be an open-source platform where developers are encouraged to program their robots with code. UBTech has fully addressed any concerns raised by IOActive that do not limit our developers from programming their Alpha 2.”
IOActive has published security vulnerability reports for SoftBank Robotics, UBTech and Universal Robots.