His thesis describes how vessels, both flying and floating, can navigate and determine their position not only in an open landscape but also in an archipelago – far from the well-organised and well-documented street systems of urban surroundings. The research is one piece of the puzzle in the WASP programme’s work on autonomous systems, and part of technology development for Saab’s products.
“The aims of the doctoral project were firstly to contribute one of several technical aids that make navigation at sea and in the air increasingly safe, and secondly to increase expertise within neural networks at Saab”, he says.
Real time position

This is something that he plans to continue working on when he shortly returns to work at Saab full-time.
Testing his system in practice is invaluable for Bertil Grelsson.

“Two of the technologies I have studied are sufficiently robust and accurate to be used for navigation on, for example, an aeroplane. Using the code we have developed during my research, we are now only a factor of 2–3 from position determination in real time, which is a huge improvement on the technology we were previously looking at. Now we have to continue working with the code to enable us to determine positions in real time.”
This means that the boat or aeroplane can calculate its current coordinates with an accuracy of a few metres.
He collected the data used in one of the papers in the thesis during the large demonstration held at WARA, the WASP research arena in Västervik, in September 2018. (WASP is the Wallenberg AI, Autonomous Systems and Software Program.)
A Piraya boat carrying a 360-degree camera took images of the horizon. The boat is unstable, which gives images affected by wave motion; it is, however, possible to correct for this with the aid of mathematical algorithms. Bertil Grelsson collected images at a rate of 36,000 per hour (ten per second), and these images have subsequently served as training data, allowing the neural network in the computer to learn to recognise the surroundings. The information in the images is also compared with known geographical data, such as height contours.
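As an illustration of this kind of attitude correction, the sketch below predicts where the sea horizon should fall in an image given the roll and pitch of the camera; the offset between the predicted and observed horizon is one way to compensate for wave motion. This is an assumed illustration, not the thesis code: the pinhole camera model, the frame conventions and the function names are all choices made here for clarity.

```python
import numpy as np

# Illustrative sketch (not the thesis code): predict where the sea horizon
# falls in a pinhole image given roll/pitch from a motion sensor.
# Assumed conventions: world frame x forward, y right, z down;
# camera frame x right, y down, z forward.

def Rx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# Nominal camera-to-world rotation: camera z axis points along world x.
C0 = np.array([[0, 0, 1], [1, 0, 0], [0, 1, 0]], dtype=float)

def horizon_pixel(f, cx, cy, roll, pitch, azimuth):
    """Project the zero-elevation ray at `azimuth` into the image.

    Returns (u, v) in pixels, or None if the ray is behind the camera.
    """
    R_cw = Ry(pitch) @ Rx(roll) @ C0          # camera-to-world rotation
    d_w = np.array([np.cos(azimuth), np.sin(azimuth), 0.0])
    d_c = R_cw.T @ d_w                        # ray in camera coordinates
    if d_c[2] <= 0:
        return None
    return (cx + f * d_c[0] / d_c[2], cy + f * d_c[1] / d_c[2])
```

Sampling `horizon_pixel` over a range of azimuths traces the predicted horizon line: with zero roll and pitch it runs through the image centre, pitching the camera up moves it down in the image, and rolling the camera tilts it.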
Matching the observed horizon with known contours

“The Västervik work was probably the most important scientific contribution in the thesis. We have combined a type of neural network known as a convolutional neural network, with which we can rapidly determine the horizon by segmentation, with a comparison against available geographical data, such as contours. This allowed us to rapidly predict the appearance of the horizon from a given position”, he says.
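The matching step can be pictured with a toy sketch: given an observed horizon elevation profile (one elevation angle per azimuth bin, for example from the segmentation network) and profiles predicted from terrain data at a set of candidate positions, pick the candidate whose predicted profile fits best. The names, data layout and mean-squared-error criterion here are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

# Toy sketch of horizon matching (assumed, not the thesis pipeline):
# each profile is an array of horizon elevation angles, one per azimuth bin.

def best_position(observed, candidates):
    """Return the candidate position whose predicted horizon profile
    best matches the observed one (smallest mean squared error)."""
    errors = {pos: float(np.mean((observed - profile) ** 2))
              for pos, profile in candidates.items()}
    return min(errors, key=errors.get)

# Example with two hypothetical candidate positions:
candidates = {(0, 0): np.zeros(8), (1, 0): np.ones(8)}
observed = np.full(8, 0.9)            # closest to the all-ones profile
best = best_position(observed, candidates)
```

In practice the candidate set would be a grid of map positions, and the predicted profiles would come from rendering the terrain height contours as seen from each candidate.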
A second paper in the thesis describes studies of how images from a fisheye camera mounted on an aeroplane can be used in a similar way.
In this case, the camera is first calibrated using chessboard patterns of exact dimensions to compensate for distortions in the camera lens; the image coordinates can then be compared with world coordinates.
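As a sketch of how image coordinates relate to viewing directions once such a calibration is in hand, the code below back-projects a pixel under the equidistant fisheye model r = f·θ (a common fisheye model, assumed here for illustration; the camera model actually used in the thesis may differ, and the calibration itself is taken as given).

```python
import numpy as np

# Sketch of pixel-to-ray back-projection under the equidistant fisheye
# model r = f * theta. The model is an assumption for illustration;
# f, cx, cy are taken from a chessboard calibration.

def pixel_to_ray(u, v, f, cx, cy):
    """Back-project a fisheye pixel to a unit viewing ray (camera z forward)."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    if r == 0.0:
        return np.array([0.0, 0.0, 1.0])    # principal point: straight ahead
    theta = r / f                           # angle from the optical axis
    s = np.sin(theta) / r
    return np.array([dx * s, dy * s, np.cos(theta)])
```

Under this model a pixel at radial distance f·π/2 from the principal point maps to a ray 90° off the optical axis, which reflects how a fisheye camera can image a large part of the horizon at once.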
“In this case we can again match the observed horizon with known contours. Where the horizon is far away, the height differences in the terrain can be smaller than a single pixel, but this is not a problem, since a large part of the horizon can be imaged by a fisheye camera. Over such long distances we must also take the refraction of light rays into account in the calculations. It makes the maths more interesting, but this method also works in a purely practical sense as one of many technical aids for onboard navigation”, he says.
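The refraction effect he mentions can be approximated with the standard effective-Earth-radius correction: a refraction coefficient k (around 0.13 at optical wavelengths under typical conditions) stretches the apparent horizon distance and reduces the horizon dip. The formulas and constants below are textbook approximations, not values taken from the thesis.

```python
import math

# Textbook effective-Earth-radius approximations for optical refraction
# (assumed values, not from the thesis).
R_EARTH = 6_371_000.0   # mean Earth radius, metres
K_REFR = 0.13           # typical optical refraction coefficient

def horizon_distance(h, k=K_REFR):
    """Approximate distance (m) to the sea horizon from observer height h (m)."""
    return math.sqrt(2.0 * R_EARTH * h / (1.0 - k))

def horizon_dip(h, k=K_REFR):
    """Approximate dip of the horizon below the horizontal (radians)."""
    return math.sqrt(2.0 * h * (1.0 - k) / R_EARTH)
```

From 10 m above the sea the horizon lies roughly 12 km away, about 7% further than the purely geometric value, which matters when matching distant terrain contours against an observed horizon.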
Several different methods are needed: cameras are not particularly useful when it’s misty or foggy. In such conditions, you need radar. The GPS system can also be used, but it’s not at all unlikely that in a crisis situation the signal will be jammed or manipulated, and if this happens imaging might save the day.
Vision-based Localization and Attitude Estimation Methods in Natural Environments,
Bertil Grelsson, Department of Electrical Engineering, Computer Vision Laboratory, Linköping University, 2019.
Supervisor: Professor Michael Felsberg
Translated by George Farrants