

Our brain doesn't think (and neither does yours)

02/11/2021

Published in

The Conversation

José Manuel Muñoz

Researcher in the Mind-Brain Group, Institute for Culture and Society (ICS), University of Navarra

Javier Bernácer

Researcher in the Mind-Brain Group, Institute for Culture and Society (ICS), University of Navarra

"Human beings only use 10% of our brain". "The adult brain does not change". "The reptilian brain governs children's behavior". "A person is smarter the more neurons they have". Who among us has not heard such statements at some time? And yet they are false.

These are misconceptions about the brain ("neuromyths") that often spread among the population through certain forms of science communication. They even reach the field of education. This is demonstrated by a study published in 2014, which found that teachers in various countries, both Western and Eastern, tended to believe this kind of claim.

The spread of these misconceptions is not trivial: it can lead to unscientific and even harmful educational strategies, such as the excessive enrichment of children's environment or the obsession with teaching them as many things as possible before the age of six.

Confusing the part with the whole

Another error that frequently occurs in neuroscience communication consists of perpetuating the so-called "mereological fallacy": assigning to the part (the brain) psychological attributes that, in reality, belong to the whole (the human being as a whole).

With a quick internet search we can come across expressions such as "the brain thinks", "the brain remembers", "your brain sees", or even "your brain hates".

This type of expression is used not only by science communicators, but also in areas such as teaching and even in professional research. An example of the latter is one of the objectives pursued by the Australian Brain Initiative, which its promoters describe as "understanding and optimizing how the brain learns in infancy".

This mereological fallacy constitutes the conceptual basis of what the philosopher Carlos Moya describes as a new (and paradoxical) "materialistic dualism". Having left behind the dualistic soul-body conception (in the Cartesian manner), there is now a tendency to think of the brain as independent or isolated from the body, which then seems, in a certain sense, dispensable. This does not conform to reality: the brain is only a part of the nervous system, which in turn is only a part of the body. This body, moreover, is embedded in a social context (it is not a "brain in a vat") that decisively shapes the development and life history of the individual.

Feet do not walk, nor does the brain think

The reader will surely agree that it is not your feet that walk: it is you who walk using your feet. Likewise, it is not your brain that thinks, remembers, hates or loves, but you who do all this using your brain.

It might be thought that the comparison between the brain and the feet is inappropriate, since the brain, unlike the feet, has a great capacity to control the other parts of the organism. However, it should not be forgotten that the brain, in turn, depends on other organs for its subsistence and functioning, especially (but not only) on the heart.

The brain is by no means an independent entity that governs the rest of the body, as the dynamics of development demonstrate: the first synapses do not appear in the human embryo until the twenty-third week of prenatal life, and the brain is not fully developed until after the age of twenty. In fact, the brain continues to change until the day we die. Simply put, without a body there can be no brain, both functionally and chronologically.

To a certain extent, it is understandable that scientists or communicators trained in neuroscience tend to transmit, consciously or unconsciously, the mereological fallacy. After all, their specialized knowledge may lead them to overestimate the importance of a part of reality.

Therefore, just as it has become normal for a "science of the part", such as neuroscience, to decisively shape the understanding of the social sciences and humanities that study the human being as a whole, the complementary path should also be normalized: that these "sciences of the whole" contribute to a more complete (and realistic) understanding of the nervous system.

To achieve this, neuroscience should be more open to studying, and genuinely dialoguing with, other disciplines (psychology, education, communication, law, philosophy). Such interdisciplinary collaboration could help curb the proliferation of neuromyths and of reductionist visions of the human being that hinder even the progress of neuroscience itself. Methodological rigor should not go hand in hand with a lack of argumentative rigor. Communicating the brain, after all, does not mean limiting oneself to the brain.

This article was originally published in The Conversation. Read the original.
