“This means that it must be possible for people to delegate tasks to robots, robots to people, robots to other robots, and people to other people. And all this must take place as efficiently as possible,” he says.
Speed is key in alpine rescue operations: a rapid response saves lives.
Delegation to a machine

Within the field of artificial intelligence, one guiding intuition is to study how people in organizations collaborate when solving problems and reaching decisions. We look for capable people who can contribute, we meet and discuss, we make plans, we delegate parts of the overall task to the right people, execute joint plans, evaluate, and continue the cycle.
Professor Patrick Doherty, Artificial Intelligence, IDA. Photo credit: Göran Billeson

But what is a ‘task’, and how can one be delegated to a machine? And how can a robot understand the purpose of the overall system, so as to create common benefit? Patrick Doherty breaks the questions down into pieces and defines a task using a specification tree: a task consists of a plan, an objective and a set of conditions.
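The idea of a specification tree can be sketched as a simple recursive data structure. The field names (objective, plan, conditions) follow the article's description; the class itself and the example task are hypothetical illustrations, not Doherty's actual representation.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in a specification tree: an objective, a plan, and conditions."""
    objective: str
    plan: str
    conditions: list                              # e.g. deadlines, resolution
    subtasks: list = field(default_factory=list)  # child nodes of the tree

# A hypothetical scan task decomposed into two sub-scans
scan = Task(
    objective="map area A",
    plan="scan-and-fuse",
    conditions=["deadline: 30 min", "resolution: 0.1 m"],
    subtasks=[
        Task("map sector A1", "grid-scan", ["deadline: 15 min"]),
        Task("map sector A2", "grid-scan", ["deadline: 15 min"]),
    ],
)
```

Delegating a subtree then amounts to handing one of these nodes, with its conditions, to another agent.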
Communication is also important, and the protocol and language to use must be decided. By using a model known as ‘Speech Act’, various dialogues can be developed based on commands such as ‘take off’, ‘wait’, ‘deny request’, ‘remain’, ‘propose’, ‘inform’ and ‘query’.
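A minimal sketch of such a speech-act vocabulary, using the commands listed above. The message format and helper function are illustrative assumptions, not the protocol actually used in the project.

```python
from enum import Enum

class SpeechAct(Enum):
    """The speech acts mentioned in the article."""
    TAKE_OFF = "take off"
    WAIT = "wait"
    DENY_REQUEST = "deny request"
    REMAIN = "remain"
    PROPOSE = "propose"
    INFORM = "inform"
    QUERY = "query"

def message(sender, receiver, act, content=None):
    """Build one dialogue message from a speech act (hypothetical format)."""
    return {"from": sender, "to": receiver, "act": act.value, "content": content}

msg = message("operator", "drone-1", SpeechAct.PROPOSE, "scan area A")
```

A dialogue is then a sequence of such messages: a proposal may be answered with an `INFORM` ("Yes, I can do this") or a `DENY_REQUEST`.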
In order for two systems to be able to collaborate, they must trust each other; they must ‘shake hands’ and exchange specifications, and tell each other what their capabilities are.
A task is then assigned to the robot in the form of a specification tree, and the robot rapidly decides whether it is able to carry out the task. An example might be that an area of a given size is to be scanned within 30 minutes at a specified resolution. If the request lies within the robot’s capabilities, it replies positively: “Yes, I can do this”, and thereby commits to carrying out what it has been assigned.
In a large system, many subsystems must all answer positively in order for the complete system to reach the decision: “Yes, we can do this”.
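The “Yes, I can do this” / “Yes, we can do this” logic can be sketched as follows. All parameter names and capability figures here are invented for illustration; the real capability check is of course far richer.

```python
def can_do(capability, request):
    """A subsystem answers positively only if every requested
    parameter lies within its capabilities (hypothetical fields)."""
    return (request["area_km2"] <= capability["max_area_km2"]
            and request["resolution_m"] >= capability["best_resolution_m"]
            and request["minutes"] >= capability["minutes_per_km2"] * request["area_km2"])

# Scan 2 km² within 30 minutes at 0.2 m resolution
request = {"area_km2": 2.0, "resolution_m": 0.2, "minutes": 30}

team = [
    {"max_area_km2": 5.0, "best_resolution_m": 0.1, "minutes_per_km2": 10.0},
    {"max_area_km2": 3.0, "best_resolution_m": 0.1, "minutes_per_km2": 12.0},
]

# The complete system says "Yes, we can do this" only if
# every subsystem answers positively.
answers = [can_do(c, request) for c in team]
team_decision = all(answers)
```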
Furthermore, the system must be able to pass a task autonomously to the team agent that has the best preconditions for completing it. This may be, for example, a drone that is located closest to the area to be scanned, or a drone that has the best camera, etc.
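A toy version of this autonomous assignment step, assuming distance to the task area is the deciding precondition; the field names and the selection criterion are illustrative assumptions, standing in for the article's richer criteria (camera quality, battery, and so on).

```python
import math

def best_agent(task_pos, agents):
    """Pick the capable agent with the best preconditions — here
    simply the one closest to the task (hypothetical criterion)."""
    capable = [a for a in agents if a["capable"]]
    return min(capable,
               key=lambda a: math.hypot(a["pos"][0] - task_pos[0],
                                        a["pos"][1] - task_pos[1]))

drones = [
    {"id": "drone-1", "pos": (0.0, 0.0), "capable": True},
    {"id": "drone-2", "pos": (2.0, 2.0), "capable": True},
    {"id": "drone-3", "pos": (0.2, 0.1), "capable": False},  # e.g. low battery
]

chosen = best_agent((0.5, 0.5), drones)  # drone-1 is the nearest capable drone
```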
A carefully defined protocol governs this process.
Unmanned helicopter RMAX. Photo credit: LiU

Patrick Doherty shows a video in which a small unmanned RMAX helicopter scans an area in the Italian Alps and creates a 3D map of the scanned region. The system was tested within the Sherpa European research project, in which Patrick Doherty and his group participated.
“What’s important here is to obtain as much information as possible as quickly as possible, and to share it with other units,” says Patrick Doherty.
People can also interact with the system, and order it to pay extra attention to a region where injured people are suspected, or a region in which evidence of people can be seen in the real-time images sent by the drone.
“The human operators can give the system exactly the right amount of autonomy that the task requires,” he says. Too much autonomy is just as problematic as too little autonomy when using such systems.
It is also important that people can interrupt a task in different ways, and that the system can recover gracefully from such interruptions.
“We’re working on solutions for this at the moment,” says Patrick Doherty.
Mountain rescue, however, is only one of many applications.
“We are creating an infrastructure that can be used in many different applications, not least within the Wallenberg Autonomous Systems and Software Program (WASP), where a demonstration unit in the field of public safety is to be available by 2018-2019,” says Patrick Doherty.
Patrick Doherty is professor of artificial intelligence at LiU’s Department of Computer and Information Science, and was one of the invited keynote speakers at the RED-UAS conference that was held in parallel with the UAS Forum in Linköping. RED-UAS is an acronym for “Research, Education and Development of Unmanned Aerial Systems”.
“Research and education in the field of drone technology are relatively new, and this was the fourth conference to be held,” says Fredrik Gustafsson, professor of sensor informatics. He was a member of the local organising committee, together with Gustaf Hendeby and Martin Enqvist.
Around 60 participants attended the conference, which is held every other year.
Other topics discussed included research and education within simulation, traffic control (it’s important that the drones do not collide with or otherwise disturb routine flights), navigation and, not least, the role of drones in large autonomous systems.