
A new behavioral tool helps social media users reduce engagement with distorted and negative content

Indianapolis, Indiana – A team of researchers from Indiana University may have discovered a simple yet powerful way to change how people engage with toxic content online — and their findings could shape future efforts to address the growing mental health crisis tied to social media use.

In a new study published in the journal PNAS Nexus, researchers tested a brief cognitive behavioral therapy (CBT) intervention aimed at helping users identify “distorted thinking” on social media. Their hope: reduce harmful engagement with negative posts and prevent algorithms from amplifying that kind of content in the first place.

“An unexpected finding was that individuals with greater depression severity interacted — liked and retweeted — with distorted social media content more than the control group did,” said Eeshan Hasan, a doctoral student in IU’s Department of Psychological and Brain Sciences and lead author of the study. “Fortunately, our intervention reduces the interaction with distorted content for depressed and non-depressed individuals.”

The intervention, described by the team as “one-shot,” was simple but surprisingly effective. It involved a short educational module on cognitive distortions — patterns of negative thinking common in depression — followed by interaction with a simulated social media platform. The platform, modeled closely after X (formerly Twitter), included sample posts labeled as distorted or non-distorted, mimicking real-world digital environments.

“We wanted something that actually looked like X,” Hasan explained. “When you clicked a heart, the heart would light up. It was designed with these ecological factors in mind.”


The tool didn’t just test theory — it tested behavior. Participants were asked to assess whether posts were distorted before engaging with them. This small pause, it turned out, was enough to make a big difference.

“Our study found that individuals were extremely good at identifying distorted content with very little training,” Hasan said. “After the intervention, there was a huge decline in how much people liked and interacted with it.”

The impact of this intervention was even stronger among participants with more severe symptoms of depression, who are often more vulnerable to negative content online. By introducing a moment of reflection before engagement, the tool effectively decreased their interaction with harmful posts.

This approach builds on earlier work by study co-authors Johan Bollen, a professor of informatics and expert in artificial intelligence, and Lorenzo Lorenzo-Luaces, a clinical psychologist who studies depression. Their past research showed that people experiencing depression often produce and engage with content containing rigid, negative language — patterns psychologists call “distorted thinking.”

The new study asked a critical question: what happens if users are trained to spot these thought patterns before reacting?

“This work shows how bringing together different perspectives can lead to creative, practical solutions for real-world challenges like social media and mental health,” said Jennifer Trueblood, the Ruth N. Halls Professor of Psychological and Brain Sciences and director of IU’s Cognitive Science Program.


Trueblood helped design the experiment, using her background in decision-making and computational modeling to structure how users would interact with the test platform. Lorenzo-Luaces ensured the distorted posts were psychologically accurate and reflective of common cognitive traps like catastrophizing, black-and-white thinking, and overgeneralization. Bollen, meanwhile, helped generate the sample content using artificial intelligence, making sure the simulation captured real-world dynamics.

This interdisciplinary teamwork allowed for a tightly controlled yet realistic experiment — something rarely achieved in the fast-moving world of social media research.

“Computational social scientists are increasingly integrating data, theory and experiment-driven approaches to address long-standing problems in psychology and the social sciences,” Bollen noted.

Instead of analyzing thousands of uncontrolled posts across real networks — which is how most social media studies work — the team created a closed, experimental platform. This made it easier to isolate the effects of the intervention and draw conclusions about what really changes user behavior.

The study also raises broader questions about the responsibility of platforms in moderating distorted thinking. Although the IU researchers did not work with any specific social media company, their findings suggest a clear path forward: brief educational tools, even those taking only a few minutes, can meaningfully shift user behavior in healthier directions.

“Psycho-educational interventions like ours could be dispensed at scale on social media platforms to counteract the effects of distorted language,” Hasan said. “This would improve quality of life and bring about real change to society.”

However, researchers also acknowledged the challenges ahead. Access to social media data is increasingly restricted. When the project began in 2022, platforms like X allowed researchers to study networks and behavior much more freely. But recent changes have placed such data behind expensive paywalls, limiting academic access and transparency.

Despite this, the research team remains hopeful that their work will inspire further exploration — not just in labs, but in communities, workplaces, and even schools.

“We hope this becomes a starting point,” said Lorenzo-Luaces. “If people can learn to identify distorted thinking on social media, they may become more resilient — not just online, but in their daily lives.”

With depression rates continuing to rise, especially among younger people, the need for fresh and accessible mental health interventions is urgent. And if just one short lesson can help users pause before they like or share something harmful, the implications for digital well-being — and real-world mental health — are significant. In a world where content moves faster than ever, sometimes all it takes is one small moment of awareness to change the whole conversation.
