
An expert says we can teach a robot to imitate human ethical behavior, but ethics is not just about imitating what others do.

Gonzalo Génova participated at the University of Navarra in a seminar on "Ethics for machines: how to teach your robot to behave well".

Gonzalo Génova, winner of the Razón Abierta 2018 Award. PHOTO: Manuel Castells
11/04/19 16:16 Chus Cantalapiedra

"We can teach machines to imitate human ethical behavior, but ethics is not just imitating what others do". This was stated at the University of Navarra by Gonzalo Génova, who also stressed that he is not so much concerned about the humanization of the robot as "the robotization of the human". Former student of the academic center and Telecommunications Engineer participated in the seminar of the group 'Science, Reason and Faith', focused on "Ethics for machines: how to teach your robot to behave well".

The expert, a tenured professor in the Department of Computer Science at the University Carlos III of Madrid and winner of the Razón Abierta 2018 Award in the teaching category, explained what artificial intelligence is and how a computational system can modify its behavior by learning from its environment or from the behavior of human beings. He pointed out that the most important question is: "What can we learn about ethics that we are able to make a robot learn?"

He maintains that, by looking at the robot, we should learn that we are not that: "a machine programmed to behave in a certain way". When we capture our ethical knowledge in a robot we can reflect on what we see in it, "but the most important thing we should learn is that we are not that mirror image, no matter how much it resembles us. The most important thing is not imitation, but making contact with the reality in front of us: recognizing the dignity of the person before us, discovering for oneself what is right and what is wrong".

Professor Génova also emphasizes that the need for machines to be governed by ethical values is not a matter of the near future but of the present: "For many years now, machines have been making decisions with significant ethical weight, or at least advising those who make them: whether to grant a bank loan, who receives a transplanted organ, or simply whether a student passes or fails a subject. What is perhaps more novel is that machines can calculate decisions and carry them out without direct human supervision, and hence the concern that these decisions should be calculated ethically, on the basis of previously established ethical principles".

"Ethics cannot be defined by any law, it will always be above it."

Asked what ethical principles should govern machines, he says that establishing them is as difficult as it is for humans. "We've been at it for thousands of years and we still haven't reached agreement. However, if we go down from principles to concrete regulations, there is a great deal of interest and many people working in this field, both in large private corporations and in public institutions. Still, ethics cannot be perfectly defined by any law or code; it will always be above all of them".

He says that the most important problems cannot be solved with technology, so he does not consider himself a techno-optimist. To explain this, he gives the example of being far from a person you love: "I can communicate at a distance, but if that person is angry with me, there is no gadget that can fix it. It is a problem of human relationships".

Even so, he is optimistic, because we can use technology to improve our lives and to invent new ways of doing good for humanity and for nature, for which we are responsible. "That is what we are called to do," he concludes.
