By Jeanna Matthews, Clarkson University

Have you had the experience of looking at a product online and then seeing ads for it all over your social media feed? Far from being a coincidence, these oddly accurate ads offer a glimpse into the hidden mechanisms that turn something you search for on Google, “like” on social media, or encounter while browsing into personalized ads on social media.

These mechanisms are increasingly being used for purposes far more nefarious than aggressive advertising. The threat lies in how this targeted advertising interacts with today’s deeply divided political landscape. As a social media researcher, I see how people seeking to radicalize others use targeted advertising to easily move people toward extreme views.

Advertising to an audience of one

Advertising is clearly powerful. The right advertising campaign can help shape or create demand for a new product or rehabilitate the image of an older product or even an entire company or brand. Political campaigns use similar strategies to push candidates and ideas, and historically countries have used them to wage propaganda wars.

Mass media advertising is powerful, but mass media has a built-in moderating force. When trying to move many people in one direction, mass media can only move them as fast as the center will tolerate. If it moves too far or too fast, the people in the middle are alienated.

The detailed profiles social media companies build for each of their users make advertising even more powerful by allowing advertisers to tailor their messages to individuals. These profiles often include the size and value of your home, the year you bought your car, whether you are expecting a child, and whether you buy a lot of beer.

Consequently, social media has a greater ability to expose people to ideas as quickly as they will individually accept them. The same mechanisms that can recommend a niche consumer product to exactly the right person, or suggest an addictive substance just when someone is most vulnerable, can also suggest an extreme conspiracy theory just when someone is ready to consider it.

It is increasingly common for friends and family to find themselves on opposite sides in highly polarized debates on important issues. Many people recognize that social media is part of the problem, but how do these powerful personalized advertising techniques contribute to the divisive political landscape?

Breadcrumbs to the extreme

A major part of the answer is that people associated with foreign governments, without admitting who they are, take extreme positions in social media posts with the deliberate aim of creating division and conflict. These extreme posts take advantage of social media algorithms, which are designed to boost engagement, meaning they reward content that elicits a response.

Another important part of the answer is that people who seek to radicalize others lay breadcrumb trails toward increasingly extreme positions.

These social media radicalization pipelines work much the same way whether recruiting jihadists or January 6 insurgents.

You may feel like you’re “doing your own research,” moving from source to source, but you’re actually following a deliberate radicalization pipeline designed to move you toward increasingly extreme content at whatever pace you will tolerate. For example, after analyzing more than 72 million user comments on more than 330,000 videos posted on 349 YouTube channels, researchers found that users consistently migrated from milder content to more extreme content.

The result of these radicalization pipelines is obvious. Rather than most people holding moderate views with fewer people holding extreme views, fewer and fewer people are in the middle.

How to protect yourself

What can you do? First, I recommend a hefty dose of skepticism about social media recommendations. Most people have gone onto social media looking for something in particular and then found themselves looking up from their phone an hour or more later with little sense of how or why they ended up reading or watching what they just did. It is designed to be addictive.

I’ve tried to carve a more deliberate path to the information I want, and I actively try to avoid simply clicking on whatever is recommended to me. If I do read or watch what is suggested, I ask myself, “How might this information be in someone else’s best interest, not mine?”

Second, consider supporting efforts to require social media platforms to offer users a choice of algorithms for recommendations and stream curation, including those based on simple-to-explain rules.
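As an illustration, a “simple-to-explain rule” could be as basic as a reverse-chronological feed limited to accounts a user follows. The sketch below is hypothetical; the data model and function names are assumptions for illustration, not any platform’s actual interface. It only shows how transparent such a rule can be compared with an opaque, engagement-maximizing ranker.

```python
# Hypothetical sketch of a "simple-to-explain rule" for feed curation:
# show posts only from accounts the user follows, newest first,
# with no engagement-based re-ranking. All names and fields are illustrative.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Set


@dataclass
class Post:
    author: str            # account that created the post
    created_at: datetime   # when the post was published
    text: str


def chronological_feed(posts: List[Post], followed: Set[str]) -> List[Post]:
    """Return posts from followed accounts in reverse-chronological order.

    The rule is trivially explainable to a user: you see what the people
    you follow posted, in the order they posted it.
    """
    return sorted(
        (p for p in posts if p.author in followed),
        key=lambda p: p.created_at,
        reverse=True,
    )
```

A rule like this trades engagement for transparency: a user can verify for themselves why any given post appeared in their feed.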

Third, and most important, I recommend investing more time in interacting with friends and family outside of social media. If I need to pass on a link to make a point, I consider that a red flag that I don’t understand the problem well enough myself. If so, I may have found myself following a trail built toward extreme content rather than consuming materials that actually help me understand the world better.

Editor’s note: The Conversation replaced the main image of this story to avoid associating a particular political point of view with conspiracy theorists.

Jeanna Matthews, professor of computer science, Clarkson University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

