
"Algorithms reproduce the patterns they see in society, and if society is unfair, racist or sexist, they will make unfair, racist or sexist decisions."

Lisa Herzog, a philosopher from the Technical University of Munich, visited the Institute for Culture and Society to give a seminar for the project 'Religion and Civil Society'

Lisa Herzog. PHOTO: Courtesy
05/12/18 19:08 Elena Beltran

"Algorithms reproduce the patterns they see in society, and if society is unfair, racist or sexist, they will make unfair, racist or sexist decisions." This was explained by Lisa Herzog, a philosopher at the Technical University of Munich, during a seminar organized by the project 'Religion and Civil Society' of the Institute for Culture and Society.. In today's society there is a tendency to entrust decisions to computers, warns the expert, "because we have the prejudice that they are better".

However, she argues that it is not a good idea to delegate important decisions to them. "In some U.S. states, algorithms are used to assess the chances of parole," Herzog explains. The program takes into account data such as age, occupation and where the subject lives, although the exact inputs are not known because "the data are from private companies."

The result is that these programs turned out to be "racist" because they found correlations between certain characteristics of black people and certain crimes. "Assuming that you can predict something based on past events, as if human beings were not capable of change, is problematic," the philosopher stresses.

Another example she gives is Google's algorithm, which turned out to be sexist because it offered the top jobs to men. "Because there are more men in positions of power, the algorithm thought it was a pattern," which in turn further reinforces that status quo. "Programs continue with the pattern they see, regardless of whether it is unfair."
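To make the mechanism concrete, here is a minimal sketch, not Herzog's example and not Google's actual system: the data and model below are invented purely for illustration, showing how a classifier trained on historically biased hiring records reproduces that bias.

```python
# Illustrative sketch only: synthetic data, not any real hiring system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic past hiring records: gender (1 = man, 0 = woman) and a skill score.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# The historical decisions were biased: at equal skill, men were hired more often.
hired = (skill + 1.0 * gender + rng.normal(0.0, 0.5, size=n)) > 0.8

# Train a model only on what happened in the past.
X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill who differ only in gender.
man, woman = [[1, 0.5]], [[0, 0.5]]
print("P(hire | man)   =", model.predict_proba(man)[0, 1])
print("P(hire | woman) =", model.predict_proba(woman)[0, 1])
# The model rates the man higher purely because past decisions favoured men:
# it continues with the pattern it sees, regardless of whether it is unfair.
```

Run as written, the model assigns the man a higher hiring probability than an equally skilled woman, simply because the historical decisions it learned from did the same.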

According to Lisa Herzog, a counterpoint to the way algorithm-driven programs make decisions is Hannah Arendt's philosophy. "Arendt is very interesting because she focuses a lot on individuality; she doesn't believe that we are just data." As Arendt sees people, the philosopher notes, they can always start over, make different choices and reinvent themselves. Digital programs, in contrast, tend to be based on the past.

What can computers do?

The expert believes that computers can be trusted with decisions for which all the knowledge and facts are available and that no one wants to make. She suggests more trivial tasks, such as a robot that cleans the floor or cooks, where the decisions or criteria it applies will not be problematic.

She does maintain that these programs can also be used as a tool for inquiry in important decisions. "I'm not saying that they can't have their space, but the final decision must be made by a human," she says. Moreover, when people make these decisions themselves, they develop the skills to reach agreements and live in society. "By making decisions, we learn to see others as equals, which is very important in democratic societies," she says.

"Humans have a tendency to rely on computers, and it's also easier to accept their solution and not think of one on your own," Herzog laments. Some people consider them "more efficient." But the philosopher believes that "some values are too important to be trumped by efficiency."
