26 February 2020

Artificial intelligence (AI) is already used in many fields to make people’s work easier, for instance in production and decision-making. But in the future, AI may also be used to care for people – and here it may be robots doing the work. Two researchers at Linköping University are to lead a project investigating trust, empathy and accountability in the relationship between humans and robots.

In the future, AI may be used to care for people. Photo credit wonry

A robot can be everything from a “box” that executes a task in a factory to a human-like android. There are fluffy robot seals, robots that look at you with big, begging, Walt-Disney eyes, and doll heads with animated faces.

Some of these may become a feature in Swedish elderly care, in schools and in job interviews. But if we are going to hand over the care of our children and parents to robots, we have to trust them.

Ericka Johnson, Tema Genus. Photo credit Charlotte Perhammar

“In an encounter between a human and a robot, it’s interesting to look at the emotions that arise. Does the robot inspire curiosity, anger or happiness? And how does the robot respond to the human’s actions?” asks Professor Ericka Johnson, who, together with Katherine Harrison, is principal investigator for a project on artificial intelligence, ethics and trust.

Ericka Johnson is at her office at LiU’s Department of Thematic Studies. The stroller outside the office indicates that she has visitors: Katherine Harrison and her month-old baby, and Brian Cantwell Smith, visiting scholar from the University of Toronto. The three researchers will work together on the new AI project and on the question of how we create trust in robots.

Robots in school – teacher or toy?

One part of the project at LiU will study the use of robots in school. Together with a robot lab at Uppsala University, the Linköping researchers will test a human-like robot at schools in Uppsala. The robot will teach mathematics, and the pupils’ interaction with it will be filmed and analysed.

Ericka Johnson and Katherine Harrison, Tema Genus. The new project focuses on cases where robots carry out social tasks. Photo credit Charlotte Perhammar

“Robots are sometimes presented as an option in teaching contexts. But whether they are a good option or not depends on whether they can behave in a way that pupils feel is trustworthy”, says Katherine Harrison.

Brian Cantwell Smith adds that the way in which a user perceives a robot is also partly related to the user’s preconceptions of robots:

“If you think of a robot in a school, are you expecting a teacher or a toy? Our expectation of the robot affects how well we believe it succeeds.”

The researchers will investigate how to program the robot in order for trust to develop between human and machine. If a user swears at a robot, how will it respond?

“The feedback a robot gives in various situations has been pre-programmed. This is why we will study how developers program robots, and what they believe is important in robots’ communication”, says Ericka Johnson, who has previously investigated the relationship between humans and technology.

Norms built into the technology

Technology is no better than the people who create it. One example that has been widely discussed is that most AI assistants are gendered as female. Alexa helps Amazon’s users, Apple has its Siri, and at online banks and retail sites, users encounter AI assistants in female form.

“Values and norms exist and can be built into the technology. It’s important to be aware of that”, says Katherine Harrison, gently rocking her baby.

Ericka Johnson and the robot Pepper. Photo credit Charlotte Perhammar

It is not impossible that her own child will one day meet a robot at school. In Japan, robots have long been trialled in both schools and homes for the elderly. Pupils have tested learning English with the assistance of a robot, while a little robotic seal has got elderly Alzheimer’s patients to open up and become more communicative.

Katherine Harrison is open to the idea of her child being taught by a robot in certain subjects.

“Having robots support children’s learning can have its advantages. I see how the robotic vacuum cleaner we have at home awakens the curiosity of my four-year-old, and how he tries to interact with it. But there are also aspects that concern me.”

Again, the issue of trust in technology arises.

“There are questions about how robots will handle morally difficult situations. Here the robots need development and training. For me, as a researcher in gender studies, it’s also important that the robot is programmed in a responsible way, so it doesn’t preserve outdated norms regarding, for instance, gender, the body and sexuality. But that is where our research project comes in. I hope our project can help prevent stereotypical roles being built into the technology.”

Three new LiU projects to study the consequences of AI

The research project that Ericka Johnson and Katherine Harrison are to lead is part of a national initiative for AI. The Wallenberg Foundations currently fund 16 research projects around the country; these will investigate the consequences of AI for society. Besides the robot project, the Wallenberg Foundations are funding two other projects at LiU: one on consumers’ trust in artificial intelligence and one on complex intelligent systems and the management of the future.

“The initiative is solidly focussed on the humanities and social sciences. When robots are to shift from just being labour in the background to also interacting with humans, researchers in the humanities will be important”, says Ericka Johnson.

Katherine Harrison, Tema Genus. Photo credit Charlotte Perhammar

It’s absolutely vital that researchers from various disciplines, such as the humanities, social sciences and engineering sciences, meet, according to the LiU researchers.

“Only by cooperating across boundaries can we discover how trust, empathy and accountability can be created at the interface between humans and robots”, says Katherine Harrison.
