Ideology and age: keys to acceptance of content moderation

The perception of content moderation is deeply influenced by ideology. It is not just a difference of opinion but a clash between two opposing views on the role of digital platforms in public space: while one part of the public calls for more decisive action to curb harmful content, the other fears that such intervention will erode freedom of expression.

  • Among young people, the group that believes the right amount of content is removed predominates (28%).

  • Ideology draws a clashing pattern: the left believes that too little is being removed, the right that too much is being removed.

  • The gap between those who support and those who reject moderation widens among younger users.

Content moderation in social networks is the set of practices, policies and decisions aimed at supervising, filtering or removing certain content published on these platforms. Its main goal is to ensure that digital interactions take place within margins acceptable to the community, whether for legal, ethical or commercial reasons. This can include anything from removing hate speech, harassment or misinformation, to enforcing rules on nudity, violence or spam. Moderation can be exercised manually (by people hired to review content) or automatically (through algorithms that detect problematic patterns).
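As a minimal sketch of what the automated side of this can look like: a rule-based filter that flags posts matching simple patterns. The categories and patterns below are invented for illustration; real platforms rely on far more sophisticated machine-learning classifiers combined with human review.

```python
import re

# Toy moderation rules: category -> pattern. These rules are invented
# examples, not any platform's actual policy.
RULES = {
    "spam": re.compile(r"(?i)\b(buy now|free money|click here)\b"),
    "harassment": re.compile(r"(?i)\b(idiot|loser)\b"),
}

def moderate(post: str) -> list[str]:
    """Return the rule categories a post triggers (empty list = allowed)."""
    return [label for label, pattern in RULES.items() if pattern.search(post)]

print(moderate("Click here for FREE MONEY"))  # → ['spam']
print(moderate("Nice photo!"))                # → []
```

Even this toy version shows why moderation is contested terrain: the choice of patterns is itself a policy decision about what counts as acceptable speech.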

Far from being a merely technical task, content moderation raises important dilemmas about freedom of expression, cultural bias and decision-making power. Digital platforms act, in practice, as new regulatory actors, with the capacity to define what can and cannot be said in the digital public space. This makes moderation a terrain of constant tension between private interests, fundamental rights and social demands, and places it at the center of the discussion on the democratic health of digital environments.


Content moderation in social networks is a topic that arouses diverse opinions among users. Although there is no unanimous position, the data reveal a clear inclination: a relative majority considers that digital platforms remove too little content. This opinion, shared by more than 39% of respondents, is the most widespread among the population. In contrast, just over 20% believe that platforms remove the right amount and 18% believe that they remove too much. It is striking, however, that nearly a quarter of users (24%) say they do not have a formed opinion, suggesting a certain degree of disconnection, misinformation or ambivalence about the role of platforms.

Looking at the data from a sociodemographic perspective, patterns emerge that reveal how digital experience, expectations and concerns can vary by age and gender.

Age influences more than gender

Young people under 35 are the group that shows the most favorable attitude towards current moderation policies. They are the most likely to believe that the right amount is being removed (28%) and the least likely to believe that too much is being removed (21%). They also express less concern about the lack of moderation (34%) and show a significantly lower rate of uncertainty (19%). This profile suggests a more fluid relationship with the digital ecosystem, where constant exposure to varied -and often controversial- content coexists with a greater tolerance for expressive diversity.

In contrast, those over 35 take a more critical stance. Nearly 40% feel that too much content is removed, and their level of satisfaction with current moderation is lower. They also register a higher value of uncertainty, which may indicate less familiarity with moderation practices or a distrust of how algorithms work. This group is also one of the most concerned about excessive deletion.


In terms of gender, the differences are also revealing. Women are more likely to say that too much content is being removed (39%), and at the same time show a higher level of uncertainty (25%). They are also the group most satisfied with the current situation and the least likely to think that social networks are falling short in content moderation.

Among men, the feeling that social networks remove too much content is less common than among women, as is the level of uncertainty about this behavior. One in five feel that the networks remove too little, and a very similar percentage is satisfied with the current situation.

Ideology and content moderation: clashing perceptions on a common ground

When users are asked whether platforms remove too much, the right amount or too little content, the answers vary markedly according to political position. And it is not a matter of nuance: the differences between groups are clear and follow a consistent pattern. As one moves from the far left to the far right, the perception that too much content is removed increases and the idea that too little is removed decreases.


Among those on the left of the spectrum, the predominant feeling is that the platforms do not remove enough: 63% on the left and 56% on the center-left believe so. On the right and the extreme right, the dominant narrative is different: 38% and 46%, respectively, believe that too much content is being removed. In the political center, perceptions are more evenly distributed, although also with signs of division: about a quarter believe that too much is being removed, another quarter that the amount is adequate, and about half that too little is being removed.

These differences are seen not only in the averages, but also in the trends: the opinion that "too little is removed" follows a markedly downward trajectory from left to right, while the opposite view ("too much is removed") grows in the same direction. The middle option ("the right amount") remains relatively stable, with no major variations between ideological groups.

But ideology does not operate alone. Age introduces new layers of complexity. Segmenting the data by age bracket -under and over 35- reveals differences in both perceptions and the internal coherence of each ideological group.


Among the youngest, positions are more extreme. For example, 64% of young people on the left believe that too little content is removed, compared to only 7% on the extreme right. In the latter group, half believe that platforms remove the right amount, and more than 40% believe that they remove too much. The ideological curves among young people are not only steeper, but also more polarized at the extremes.

Among those over 35, opinions tend to be more homogeneous. Although ideological differences remain -and the ideological correlation with perceptions continues to be evident- the intensity of these differences is somewhat lower. Here, 62% of older respondents on the left believe that too little is being removed, and 36% of older respondents on the extreme right agree with this perception: a gap that is still considerable, but smaller than among the young.


Polarization is expressed not only in the distance between groups, but also in the internal cohesion of each group. Analyzing the standard deviation of the responses -a measure of the heterogeneity of opinions within each ideological group- reveals an interesting pattern: those older than 35 tend, on average, to be more internally polarized (average standard deviation of 16) than those younger than 35 (14). However, the opposite is true at the ideological extremes: young people on the extreme left and extreme right are more internally divided than their adult counterparts, with standard deviations of 17 and 23 respectively, compared to 9 and 15 among older people.
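To make the idea of within-group dispersion concrete, here is one way such an index can be computed: code the three response options on a numeric scale and take the standard deviation of responses, weighted by each option's share. The distributions below are illustrative (only the 64% and 7% figures come from the text; the other shares are invented), and this is not necessarily the exact measure used in the survey.

```python
from math import sqrt

def weighted_std(shares, codes=(1, 2, 3)):
    """Population standard deviation of responses coded numerically
    (1 = too little, 2 = right amount, 3 = too much), weighted by the
    percentage share of each option within the group."""
    total = sum(shares)
    mean = sum(s * c for s, c in zip(shares, codes)) / total
    var = sum(s * (c - mean) ** 2 for s, c in zip(shares, codes)) / total
    return sqrt(var)

# Hypothetical distributions (% too little / right amount / too much);
# the 64% and 7% figures come from the post, the rest are made up.
young_left = [64, 20, 16]
young_far_right = [7, 50, 43]

print(round(weighted_std(young_left), 2))       # → 0.75
print(round(weighted_std(young_far_right), 2))  # → 0.61
```

The higher the value, the more a group's members disagree among themselves; a group that unanimously picks one option would score 0.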

On the other hand, in more moderate positions -center, center-left, center-right- older people show a greater dispersion of opinions than younger people. This suggests that ideological polarization among adults is not limited to the extremes, but also extends into usually more moderate spaces.

Political interest as a determining factor in the perception of content moderation

Certainly, politics influences the opinion that respondents have about content moderation. However, this is not only a question of ideology -already discussed above- but also of the degree of interest in politics that each person has. The data show that the level of political involvement has a clear impact on how the role of digital platforms is interpreted.


In general terms, people who say they are very interested in politics are also those who most frequently consider that the platforms remove too little content (55%). This perception is widespread across all interest profiles, even among those who rank one step below in terms of involvement: 51% of those who describe themselves as "interested" also believe that platforms moderate less than necessary. This pattern suggests that greater awareness of how public discussion works -and perhaps of the harmful effects of certain discourses- goes hand in hand with a greater demand for intervention by the platforms. However, the perception that too much content is removed also rises among the "very interested" (29%). In other words, the more politicized the user, the more likely he or she is to perceive tensions and contradictions in the moderation system. In contrast, among users with low or no interest in politics, the idea that platforms remove too much content is more common.
