15 December 2013

This year’s Nobel Prizes in Physics and Chemistry go to leading theoreticians who, with their methods, have been able to show the way for the experimentalists. At Linköping University too, more and more researchers are using modelling and simulation to understand different aspects of the world around us – tools that require huge amounts of computer power.

As a boy, Patrick Norman loved doing calculations. His career could be defined by the formula [mathematics + natural sciences + computer = quantum chemistry]. Now he divides his time between research and running the National Supercomputer Centre, NSC.

He himself is one of its main customers. The Division of Theory and Modelling at LiU works largely with materials. Researchers in theoretical physics study solid materials, electronic structures in crystals, and alloys; computational physicists, of whom Professor Norman is one, look at molecular and polymeric materials.

Designing molecules

“We are trying to design molecules that provide the properties we are looking for. For this, theoretical calculations are a more rational road to take than feeling our way along by means of experimentation,” says Professor Norman.

Their project is up and running, following four stages: theory, in which the researchers, with the help of their computers, write algorithms that describe how the intended molecule should be built; synthesis, in which the molecule is produced chemically; characterisation, in which the structure is measured and investigated; and finally fabrication of the component.

Laser beams can be used as weapons on the battlefield, fired at pilots to damage their eyesight or at sensors in the surroundings. Professor Norman was part of a group, financed by a major nanoscience programme within the Swedish Armed Forces, which looked into how this threat could be warded off.

The idea was based on a chromophore, a molecule that can absorb light by sending electrons up to a higher energy level – “exciting” them. Following extensive calculations, experimentation and testing, the finished molecule was placed in a solution of thermoplastic, which was then allowed to solidify into plastic glass two millimetres thick. Full-scale trials showed how a laser beam that hit the glass was nullified during the extremely short time it took to travel the two millimetres. The phenomenon arises from the behaviour of electrons in the quantum world – behaviour whose calculation is possible only on a supercomputer.
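The limiting effect can be pictured with a toy calculation. The sketch below is not the group’s model; it assumes a simple intensity-dependent absorption law, dI/dz = -(alpha0 + beta*I)*I, where the beta term stands in for the chromophore’s excited-state absorption, and every parameter value is made up for illustration.

```python
# Toy picture of optical limiting in the 2 mm chromophore-doped plastic.
# Illustrative only: the absorption law and all values are hypothetical.

def transmitted(i_in, thickness=2e-3, alpha0=100.0, beta=1e-6, steps=10_000):
    """Euler integration of dI/dz = -(alpha0 + beta*I)*I through the glass."""
    dz = thickness / steps
    i = i_in
    for _ in range(steps):
        i -= (alpha0 + beta * i) * i * dz
    return i

# Weak light passes almost linearly; intense pulses come out clamped at
# roughly the same level regardless of input: the signature of a limiter.
for i_in in (1e3, 1e9, 1e12):  # incident intensity, W/m^2
    print(f"in: {i_in:.0e} W/m^2 -> out: {transmitted(i_in):.2e} W/m^2")
```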

“This is all based on theories formulated by Schrödinger, Einstein, Heisenberg and Dirac. You have to wonder what these brilliant guys would have been able to do if they were alive today,” Professor Norman says.

Obviously they had no idea what the future would bring. In 1929 Paul Dirac wrote:
“The fundamental laws of physics... are fully understood, and the only difficulty is that the precise applications of these laws lead to equations that are much too complicated for us to solve.”

Triolith

The latest of the supercomputers in Linköping, Triolith, contains 25,600 processor cores that operate around the clock. This means that in one day it can complete the same number of calculations that would take a home computer over 100 years.
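The figure can be sanity-checked with rough arithmetic; the per-core factor below, comparing one Triolith core with a home computer, is an assumption made for illustration, not an NSC figure.

```python
# Back-of-the-envelope check of "one Triolith day = 100+ home-computer years".
cores = 25_600
per_core_factor = 1.5   # assumed: one Triolith core vs. one home-PC core (hypothetical)
days_per_year = 365

speedup = cores * per_core_factor
print(f"one Triolith day ~ {speedup / days_per_year:.0f} home-computer years")
```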

“But the human brain still has an important role to play. We might wonder, for example, whether we could get the computer to analyse and program the long, complicated equations we use. But I have never seen anyone really succeed with that,” says Professor Norman.

In Jules Verne’s novel Journey to the Centre of the Earth, Professor Lidenbrock and his retinue journey down into a volcano to explore the interior of the earth – putting their lives on the line for the sake of exploration. 150 years later, Professor of Theoretical Physics Igor Abrikosov made the same journey from the comfort of his desk.

“We were amazed to discover that the iron in the earth’s core can be affected greatly by the earth’s magnetic field. This might explain why the magnetic poles are so stable. If the core had been non-magnetic, the poles would have flipped significantly more often than the observed average of once every 250,000 years,” Professor Abrikosov says.

This discovery is also important in simulating earthquakes and understanding how they occur.

Our previous understanding of the earth’s core was based on the knowledge that iron loses its magnetism with increasing pressure and temperature. But when Professor Abrikosov and his colleagues simulated the extreme conditions at a depth of 6,000 kilometres, they saw how, at a certain point, the iron became magnetic again.

In order to describe this behaviour, a new theory of the electronic structure of iron crystals was needed. This is the kind of fundamental research that may also be of use in applied projects, for example in the design of new materials for industrial tools.

“At present it takes about 20 years to develop new materials. This time can be halved with better theory,” Professor Abrikosov says.

Increasing possibilities

The development of ever more powerful computers in recent years has expanded the possibilities of theoretical physics.

“Now we can do things that can’t be done experimentally, and with a high degree of precision. We can also help in planning experimental studies in advance, instead of going ahead with ‘cook and look’.”

When Linnaeus and other pioneers of natural history research wanted to describe an unknown species of bird, they just went ahead and shot it down. Modern day ecologists have a different approach:

“Carrying out experimental studies on complex ecological systems in nature is often difficult and sometimes ethically questionable,” says Bo Ebenman, professor of theoretical biology.

His research group is particularly interested in what happens in an ecosystem when one species declines in number or dies out. Studies of this kind have been carried out on bacteria and other single-celled organisms with short generation times, and on palaeontological data from the major extinctions, such as that of the dinosaurs. With ‘muscular’ computers, time scales can be compressed.

“In this way we are able to look at the interactions between species with very different generation times. Ten years might be one generation time for a whale, but that is 1,000 generations for plankton, the whale’s staple food,” says Professor Ebenman.

In the summer of 2013, three of the LiU researchers described in the journal Nature how certain species have a particularly important functional role in a given ecosystem. When such a species is in decline, other species in the system become extinct – often long before the declining species is itself threatened.

In order to reach these kinds of conclusions, the researchers create virtual food webs that mirror reality. These are based on knowledge of how species affect one another: relationships between predator and prey, between species that are mutually beneficial to one another, and species that compete for the same food source.
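A minimal sketch of such a virtual web, assuming a hypothetical three-species chain (plankton, fish, whale) with invented interaction rates rather than the group’s real parameterisations:

```python
# A minimal "virtual food web" sketch (hypothetical species and rates,
# not the LiU group's actual model): generalized Lotka-Volterra dynamics
# for a three-species chain, plankton -> fish -> whale, stepped with Euler.

def simulate(days=3650, dt=0.05, whale0=0.1):
    p, f, w = 1.0, 0.5, whale0                      # initial abundances (arbitrary units)
    for _ in range(int(days / dt)):
        dp = 0.8 * p * (1 - p) - 0.5 * p * f        # logistic growth, grazing by fish
        df = 0.4 * p * f - 0.2 * f - 0.3 * f * w    # eats plankton, eaten by whales
        dw = 0.15 * f * w - 0.05 * w                # eats fish, natural mortality
        p, f, w = p + dp * dt, f + df * dt, w + dw * dt
    return p, f, w

print("intact web:    plankton=%.2f fish=%.2f whale=%.2f" % simulate())
print("whale removed: plankton=%.2f fish=%.2f whale=%.2f" % simulate(whale0=0.0))
```

Removing the whale lets the fish multiply and the plankton decline: a toy version of the knock-on effects the group studies, in webs with far more species.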

Since the physical environment sets boundaries for the ecosystem, climate change is a burning issue. How does it affect these interactions? This is the challenge that now stands before Professor Ebenman and his colleagues.

The plan is to bring the ecologists’ food webs together with climate models from SMHI-LiU-NSC, one of 15 climate data nodes in the global network ESGF (Earth System Grid Federation), which is used for tasks such as preparing the IPCC reports.

Faster simulations with a little help

At the NSC, climate modeller Hamish Struthers and systems expert Prashanth Dwarakanath are helping Swedish climate researchers with simulations that require very large computer resources.

A climate model needs to give a picture of ongoing change on statistical grounds. This is achieved by repeating a simulation over and over again with small changes to the input data. It is these repetitions that require computer power – the faster the computations, the more repetitions can be run, and the more reliable the outcome.
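The principle can be sketched with a toy model; the “simulation” below has invented dynamics and a hypothetical perturbation size, and stands in for a real climate code only to show the ensemble idea.

```python
# Ensemble sketch: repeat a (toy) simulation with small random changes to the
# input forcing, then summarise the outcomes statistically.
import random

def toy_simulation(forcing):
    """Stand-in for a climate run: invented forced-warming dynamics."""
    temp = 0.0
    for _ in range(100):                       # 100 toy time steps
        temp += 0.01 * forcing - 0.002 * temp  # forcing minus damping
    return temp

ensemble = [toy_simulation(1.0 + random.gauss(0, 0.05)) for _ in range(1000)]
mean = sum(ensemble) / len(ensemble)
spread = (sum((t - mean) ** 2 for t in ensemble) / len(ensemble)) ** 0.5
print(f"ensemble mean: {mean:.3f}  spread: {spread:.3f}")
```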

Over the years the climate models have achieved much greater geographical resolution, which also makes them more reliable. In the beginning, the map was divided into 500 km squares; now the squares are smaller than 100 km, and in regional simulations as small as 11 km.
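A rough calculation shows why resolution is so expensive: halving the grid spacing quadruples the number of surface cells (and in practice also forces shorter time steps). The earth’s surface area is about 5.1e8 km²; the spacings are those quoted above.

```python
# Number of surface grid cells at the resolutions mentioned in the text.
EARTH_SURFACE_KM2 = 5.1e8
for spacing_km in (500, 100, 11):
    cells = EARTH_SURFACE_KM2 / spacing_km**2
    print(f"{spacing_km:>3} km grid: ~{cells:,.0f} surface cells")
```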

Mr Struthers is currently working with researchers from SMHI (Swedish Meteorological and Hydrological Institute) and Stockholm University to improve the representation of sea ice in a global climate model. When the water freezes, how thick does the ice become, how and when does it melt, how is heat transported and circulated, and how are the rays of the sun reflected?

There are many questions, and of course the more of them that can be answered, the more reliable the prognoses of the future climate will be.

Text: Åke Hjelm

Picture 1: The Triolith supercomputer is the most powerful in the Nordic region. In twenty-four hours it can carry out calculations that would take a home computer 100 years. Photo: Göran Billeson

Picture 2: Patrick Norman is director of the National Supercomputer Centre and himself uses its computational capacity in research on new materials. Photo: Göran Billeson

Picture 3: Igor Abrikosov. Photo: Anna Nilsen

Picture 4: Bo Ebenman. Photo: Åke Hjelm

LiU magazine no. 4, 2013