
Autonomous drones will fly into danger zones for emergency services

Automated drones will use sophisticated visual systems and machine learning techniques to carry out search and rescue missions where humans fear to tread

Within the next five years, automated drones could be flying into danger zones to carry out search and rescue missions following earthquakes, fires and other disasters, if research under way at New York University Abu Dhabi (NYUAD) comes to fruition.

The project, which has already seen a number of prototype drones take to the air, is being run by Antonios Tzes, professor of electrical and computer engineering at NYU Abu Dhabi, who was looking for a way to get drones to operate in indoor or otherwise enclosed environments.

Most regular drones rely on GPS to navigate, which puts them at a distinct disadvantage in so-called GPS-denied environments such as indoor spaces. Although some models can detect obstacles using visual systems, once they have spotted an obstacle they can do little more than hover on the spot to avoid hitting it.

But a drone that could operate in such an environment would be ideal for use in disaster situations where humans are unable or unwilling to venture into the afflicted area, both to reconnoitre the situation and to render basic aid to anyone who may be trapped.

“If you want a drone that can navigate itself in a structured or unstructured environment, in a GPS-denied environment, you need to equip it with different sensors,” said Tzes.

NYUAD’s prototype drones use a combination of stereoscopic vision – multiple cameras to detect depth and distance – and light detection and ranging (Lidar), which uses the travel time of a laser beam to calculate the distance between the drone and an obstacle. Used together, these enable the drone to move slowly and safely through an enclosed environment. At the same time, said Tzes, the prototype’s on-board processing creates a map of the environment.
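To make the time-of-flight idea concrete, here is a minimal Python sketch, under simplified assumptions, of how a lidar echo time converts into a range and how that range might be marked on a coarse occupancy grid as the drone builds its map. The grid layout, cell size and function names are illustrative, not details of the NYUAD prototype.

```python
# Minimal sketch (illustrative only): lidar range from laser time of flight,
# and marking the resulting hit in a sparse 2D occupancy grid.
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range(time_of_flight_s: float) -> float:
    """Distance to an obstacle: the beam travels out and back, so halve the path."""
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0

def mark_obstacle(grid, drone_xy, bearing_rad, distance_m, cell_size_m=0.25):
    """Convert one range reading into a grid cell and mark that cell occupied."""
    x = drone_xy[0] + distance_m * math.cos(bearing_rad)
    y = drone_xy[1] + distance_m * math.sin(bearing_rad)
    cell = (math.floor(x / cell_size_m), math.floor(y / cell_size_m))
    grid[cell] = 1  # 1 = occupied
    return cell

grid = {}                       # sparse occupancy map: cell -> occupied flag
d = lidar_range(20e-9)          # a 20 ns echo corresponds to roughly 3 m
print(f"range: {d:.2f} m, occupied cell: {mark_obstacle(grid, (0.0, 0.0), 0.0, d)}")
```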

The second challenge depends on what type of task the drone is being used to carry out, he said. “For example, if you are using it to detect fire, you need an infrared sensor tuned to detect temperatures within a given range, or if you want to detect dead bodies indirectly emitting odour, you need airborne particle sensors.

“If you also want to interact with the environment, you need a manipulator, or robot, which is attached to the belly of the drone – in this case, we’re talking drones that can lift close to 10kg, so we’re looking for robots that are light and fast.”
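As a rough illustration of the temperature-band check Tzes describes for fire detection, the following Python sketch flags thermal readings that fall inside an assumed band of interest. The band limits, frame format and function name are hypothetical, chosen only to show the idea.

```python
# Illustrative sketch only: flag thermal-camera readings inside an assumed
# "fire" temperature band. Not the project's actual sensor configuration.
FIRE_BAND_C = (250.0, 1100.0)   # assumed band of interest, degrees Celsius

def fire_pixels(thermal_frame, band=FIRE_BAND_C):
    """Return (row, col) positions whose temperature lies inside the band."""
    low, high = band
    return [(r, c)
            for r, row in enumerate(thermal_frame)
            for c, temp in enumerate(row)
            if low <= temp <= high]

frame = [[22.0, 24.5, 30.1],
         [23.8, 310.0, 29.6]]   # one hot spot in a 2x3 test frame
print(fire_pixels(frame))       # -> [(1, 1)]
```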

The final, and in many ways most critical, element is that the drones must be capable of operating autonomously over a period of time, because a human operator cannot always rely on video feedback to control the machine, said Tzes.

“If the drone is inside a collapsed building that you’re afraid to send a human into, the drone must be autonomous enough that it does not require constant human supervision, and that’s the most critical parameter,” he said. “You must have a lot of intelligence inside the drone to let it start making its own decisions.”


Giving the drones a degree of freedom to move away from their human controllers requires both advanced machine learning and deep learning techniques, and a huge amount of computational power.

So far, Tzes’ team has not gone much further than enabling a drone to guide itself through a cluttered lab environment, using deep learning to analyse massive amounts of video data to identify free space that the drone can move through.
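The following Python sketch, using PyTorch, illustrates the general shape of such an approach: a small, untrained convolutional network that scores a single camera frame as “free space ahead” or “blocked”. The architecture, input size and labels are assumptions for illustration only, not the team’s actual model or training pipeline.

```python
# Minimal sketch (illustrative only): a tiny CNN that classifies a camera
# frame as "blocked" (0) or "free space ahead" (1). Untrained, so the output
# here is meaningless; it only shows the data flow.
import torch
import torch.nn as nn

class FreeSpaceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # collapse to one vector per frame
        )
        self.classifier = nn.Linear(32, 2)      # two classes: blocked / free

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = FreeSpaceNet().eval()
frame = torch.rand(1, 3, 120, 160)              # stand-in for one video frame
with torch.no_grad():
    probs = torch.softmax(model(frame), dim=1)
print("P(free space ahead) =", float(probs[0, 1]))
```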

But for Tzes, the life-and-death implications of a search-and-rescue mission mean he wants to retain as much human control as possible, and full artificial intelligence (AI)-driven autonomy will almost certainly never be used.

“We will mostly try to do this with non-AI technologies,” he said. “If we feel there is no other solution, then, and only then, will we use AI. We have played with it and in most cases had good results, but we are not certain about it, especially when it comes to reliability.”

Other issues persist around the sensor technology – real-time sensor capabilities are not yet mature enough to deal with multiple types of chemical compound, or with scenarios in which there may be a biological or nuclear hazard.

Nevertheless, Tzes is forging ahead with a number of potential innovations designed to enhance the technology still further. For example, the team is already exploring the use of variable-pitch propellers – commercially available on racing drones – to make its drones faster and more manoeuvrable.

Simultaneous advances in robotics technology will also bring new manoeuvrability features to the robot manipulators, so that in some circumstances, drones could eventually be equipped with basic medical equipment – not to perform remote surgery, but to better detect vital signs such as pulse and breathing.

Tzes said he hopes to have a fully working prototype tested in various scenarios within the next 12 months, at which point he will look either for venture capital backing or at creating a startup to take the next step. A fully developed model, priced at around $15,000, could be just 12 months down the line.
