28 May 2019

In his doctoral thesis, Bertil Grelsson shows that it is perfectly possible to navigate at sea and in the air with the aid of images. As long as it’s not too misty. The methods provide an important component in safe and robust navigation.

Photo from the Piraya. The boat is unstable, which gives images affected by wave motion, something that can be corrected with the aid of mathematical algorithms.
“Images are an efficient means of communication, and the methods we use are similar to the way in which people experience their surroundings and recognise their position”, says Bertil Grelsson, newly promoted doctor in the Computer Vision Laboratory, where he has worked as an industry-based doctoral student for the past few years.

His thesis describes how vessels, both flying and floating, can navigate and determine their position, not only in an open landscape but also in an archipelago – far from the well-organised and well-documented street systems of urban surroundings. The research is a part of the puzzle in the WASP programme into autonomous systems, and part of technology development for Saab’s products.

“The aims of the doctoral project were firstly to contribute one of several technical aids that make navigation at sea and in the air increasingly safe, and secondly to increase expertise within neural networks at Saab”, he says.

Real-time position

This is work he plans to continue when he shortly returns to Saab full-time.

Testing his system in practice has been invaluable for Bertil Grelsson.

“Two of the technologies I have studied are sufficiently robust and accurate to be used for navigation on, for example, an aeroplane. Using the code we have developed during my research, we are now only a factor of 2-3 from position determination in real time, which is a huge improvement on the technology we were previously looking at. Now we have to continue working with the code to enable us to determine positions in real time.”

This means that the boat or aeroplane can calculate its current coordinates, with an accuracy of a few metres.

He collected the data used in one of the papers in the thesis during the large demonstration held in WARA, the WASP arena in Västervik, in September 2018. (WASP is the Wallenberg AI Autonomous Systems and Software Program.)

A Piraya boat carrying a 360-degree camera took images of the horizon. The boat is unstable, which gives images affected by wave motion; these can, however, be corrected with the aid of mathematical algorithms. Bertil Grelsson collected images at a rate of 36,000 per hour, and these have subsequently served as training data, allowing the neural network in the computer to learn to recognise the surroundings. The information in the images is also compared with known geographical data, such as height contours.
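As a rough illustration of the idea, and not the code used in the thesis, the sketch below predicts the horizon's elevation angle in every viewing direction from a candidate position, given a terrain-height lookup (here a hypothetical terrain_height function standing in for real elevation data):

```python
import numpy as np

# Minimal sketch: predict the horizon elevation angle in every viewing
# direction from a candidate position (x0, y0) at observer height h0,
# using a digital elevation model. terrain_height() is a hypothetical
# lookup into such a model, not part of the thesis code.
EARTH_RADIUS = 6_371_000.0  # metres

def predicted_horizon(x0, y0, h0, terrain_height,
                      n_bearings=360, max_range=30_000.0, step=50.0):
    """Return, for each bearing, the maximum elevation angle of the terrain."""
    bearings = np.linspace(0.0, 2.0 * np.pi, n_bearings, endpoint=False)
    horizon = np.empty(n_bearings)
    ranges = np.arange(step, max_range, step)
    for i, b in enumerate(bearings):
        xs = x0 + ranges * np.cos(b)
        ys = y0 + ranges * np.sin(b)
        # Terrain height above the observer, minus the drop due to Earth curvature.
        dh = terrain_height(xs, ys) - h0 - ranges**2 / (2.0 * EARTH_RADIUS)
        horizon[i] = np.max(np.arctan2(dh, ranges))
    return bearings, horizon
```

A candidate position can then be scored by how well this predicted profile agrees with the horizon extracted from the 360-degree images.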

Matching the observed horizon with known contours

“The Västervik work was probably the most important scientific contribution in the thesis. We have combined a type of neural network known as a convolutional neural network, which can rapidly determine the horizon by segmentation, with a comparison against available geographical data, such as contours. This allowed us to rapidly predict the appearance of the horizon from a given position”, he says.
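The combination he describes can be pictured in two steps: a network that labels sky pixels, and a read-out of the horizon line from that mask. The sketch below is a minimal, untrained stand-in written in PyTorch, not the architecture used in the thesis:

```python
import torch
import torch.nn as nn

# Minimal sketch (not the thesis network): a small fully convolutional net
# that labels each pixel as sky or not-sky, after which the horizon is read
# out as the lowest sky row in every image column.
class SkySegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # per-pixel sky logit
        )

    def forward(self, x):
        return self.net(x)

def horizon_from_mask(sky_logits):
    """For each image column, return the row index where sky ends (the horizon)."""
    sky = (sky_logits.squeeze(1) > 0).float()                   # B x H x W
    rows = torch.arange(sky.shape[1]).view(1, -1, 1).float()    # row index grows downwards
    return (sky * rows).amax(dim=1)                              # B x W horizon rows

img = torch.rand(1, 3, 128, 512)      # one panoramic image strip (dummy data)
model = SkySegmenter()                # untrained; for illustration only
horizon_rows = horizon_from_mask(model(img))
```

The extracted horizon row per column is then the observed profile that gets matched against horizons predicted from the geographical data.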

A second paper in the thesis describes studies of how images from a fisheye camera mounted on an aeroplane can be used in a similar way.
In this case, the image coordinates are related to world coordinates after an initial calibration of the camera, in which chessboard patterns of exactly known dimensions are used to compensate for distortions in the camera lens.
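A common way to perform such a chessboard calibration for a fisheye lens is OpenCV's fisheye camera model. The sketch below shows an assumed workflow of that kind; the image folder, pattern size and square size are made up for illustration and are not taken from the thesis:

```python
import glob
import cv2
import numpy as np

pattern_size = (9, 6)        # inner corners of the chessboard (assumed)
square_size = 0.030          # 30 mm squares (assumed)

# 3D coordinates of the corners in the chessboard's own frame.
objp = np.zeros((1, pattern_size[0] * pattern_size[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
objp *= square_size

obj_points, img_points = [], []
for path in glob.glob("calibration/*.png"):        # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners.reshape(1, -1, 2).astype(np.float32))

K = np.zeros((3, 3))         # camera matrix, estimated by the calibration
D = np.zeros((4, 1))         # fisheye distortion coefficients
rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_points, img_points, gray.shape[::-1], K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)
print("RMS reprojection error:", rms)
```

Once the camera matrix and distortion coefficients are known, observed image points can be mapped to ideal viewing directions, which is what makes the comparison between image coordinates and world coordinates possible.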

“In this case we can again match the observed horizon with known contours. When the horizon is far away, the height differences in the terrain can be smaller than a single pixel, but this is not a problem, since a fisheye camera images a large part of the horizon at once. In this case we must also take the refraction of light rays over long distances into account in the calculations. It makes the maths more interesting, but this method also works in a purely practical sense as one of many technical aids for onboard navigation”, he says.
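A standard back-of-the-envelope way to include refraction is to replace the Earth radius with an effective radius R/(1 - k), where k is a typical refraction coefficient of about 0.13. The short sketch below is an approximation under that assumption, not the model used in the thesis, and shows why the effect matters for a distant horizon:

```python
import math

R = 6_371_000.0   # Earth radius in metres
k = 0.13          # assumed standard refraction coefficient

def horizon_dip(h, refraction=True):
    """Approximate dip angle (radians) of the sea horizon seen from height h."""
    r_eff = R / (1.0 - k) if refraction else R
    return math.sqrt(2.0 * h / r_eff)

def horizon_distance(h, refraction=True):
    """Approximate distance (metres) to the sea horizon from height h."""
    r_eff = R / (1.0 - k) if refraction else R
    return math.sqrt(2.0 * r_eff * h)

# From 1000 m altitude, refraction pushes the sea horizon out by several
# kilometres and lifts its apparent position by several hundredths of a
# degree - small, but comparable to the angular size of a pixel, so it
# enters the position calculation.
print(horizon_distance(1000.0) - horizon_distance(1000.0, refraction=False))
print(math.degrees(horizon_dip(1000.0, False) - horizon_dip(1000.0)))
```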

Several different methods are needed: cameras are not particularly useful when it’s misty or foggy. In such conditions, you need radar. The GPS system can also be used, but it’s not at all unlikely that in a crisis situation the signal will be jammed or manipulated, and if this happens imaging might save the day.

Vision-based Localization and Attitude Estimation Methods in Natural Environments,
Bertil Grelsson, Department of Electrical Engineering, Computer Vision Laboratory, Linköping University, 2019.
Supervisor: Professor Michael Felsberg

Translated by George Farrants
