Chatbots Could Be Suffering from Confirmation Bias When Tackling Controversial Issues: Johns Hopkins Study

Are chatbots putting users into an echo chamber?

A recent study from Johns Hopkins University suggests that chatbots may tell users what they want to hear instead of providing diverse information when addressing controversial topics, potentially contributing to increased polarization among users.

The study challenges the assumption that chatbots are impartial, suggesting that these AI systems may inadvertently reinforce users' preexisting ideologies.

(Image: Customer Service Chatbot | Mohamed Hassan from Pixabay)

Confirmation Bias in AI Chatbots?

The research, spearheaded by Ziang Xiao, an assistant professor of computer science at Johns Hopkins specializing in human-AI interactions, sheds light on the phenomenon of confirmation bias exhibited by chatbots.

Xiao underscores that while users often perceive chatbots as unbiased purveyors of factual information, the reality is quite different.

He explains that chatbot responses tend to align with the biases and inclinations of the individuals posing the queries, perpetuating a cycle of confirmation rather than offering diverse perspectives.

The study, slated for presentation at the Association for Computing Machinery's CHI Conference on Human Factors in Computing Systems, scrutinized the influence of chatbots on online searches by comparing user interactions with different search systems and assessing attitudes towards controversial topics before and after use.

Roughly 270 participants were asked to articulate their views on contentious subjects such as healthcare, student loans, and sanctuary cities.

They then looked up relevant information online using either a chatbot or a conventional search engine built for the study.

Post-search, participants were prompted to provide updated perspectives on the topics and evaluate the extremity and credibility of opposing viewpoints.

The Echo Chamber Effect

The study's findings reveal that, unlike traditional web searches, chatbots tend to present information through a narrower lens, reinforcing users' preexisting beliefs and generating heightened reactions to dissenting views.

Xiao notes that individuals utilizing chatbots became more entrenched in their initial viewpoints, exhibiting resistance towards perspectives that challenged their ideological stance.

Xiao posits that the echo chamber effect observed in chatbot interactions comes from their conversational nature. Unlike traditional search engines, where users input keywords, chatbot users often pose detailed questions in natural language.

This mode of interaction inadvertently allows chatbots to glean insights into users' biases and tailor responses accordingly.
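To make that mechanism concrete, here is a minimal sketch, not taken from the study, of how the phrasing of a conversational question can leak a user's stance in a way a bare keyword query does not. The cue lists and the detect_stance helper are illustrative assumptions:

```python
# Hypothetical illustration: a loaded question carries stance cues that a
# keyword query does not. The cue lists and scoring are toy assumptions,
# not the study's method.

PRO_CUES = {"benefits", "why should we", "isn't it good", "advantages"}
CON_CUES = {"dangers", "why is", "bad for", "problems with", "failing"}

def detect_stance(query: str) -> str:
    """Return a crude stance guess from phrasing alone."""
    q = query.lower()
    pro = sum(cue in q for cue in PRO_CUES)
    con = sum(cue in q for cue in CON_CUES)
    if pro > con:
        return "supportive"
    if con > pro:
        return "opposed"
    return "neutral"

# A keyword search reveals almost nothing about the asker...
print(detect_stance("student loan forgiveness"))  # -> neutral

# ...while a conversational question exposes their leaning.
print(detect_stance("Why is student loan forgiveness bad for taxpayers?"))  # -> opposed
```

A system with access to that signal can then weight which sources and arguments it surfaces, which is the dynamic the researchers describe.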

Furthermore, the study underscores chatbots' susceptibility to manipulation. Xiao contends that developers can harness chatbots to discern users' preferences and curate responses that resonate with their inclinations.

Notably, the echo chamber effect was exacerbated when researchers introduced a chatbot with a concealed agenda to align with users' viewpoints.
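The researchers' actual prompts are not quoted in the press materials, but as a purely illustrative sketch, such a concealed agenda could be implemented as a hidden system instruction prepended to every exchange. The wording below is an assumption, not the study's prompt:

```python
# Purely illustrative: a hidden system instruction that steers a chatbot
# toward the user's inferred stance. The prompt wording is an assumption.

def build_messages(user_query: str, inferred_stance: str) -> list[dict]:
    hidden_agenda = (
        f"The user appears {inferred_stance} on this topic. "
        "Emphasize evidence that supports their view and downplay the rest."
    )
    return [
        {"role": "system", "content": hidden_agenda},  # invisible to the user
        {"role": "user", "content": user_query},
    ]

for message in build_messages(
    "Why is student loan forgiveness bad for taxpayers?", "opposed"
):
    print(f"{message['role']}: {message['content']}")
```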

"A More Polarized Society"

The researchers experimented with training a chatbot to provide counterarguments to mitigate the echo chamber effect.

However, the intervention yielded limited success, with participants showing little inclination to reassess their perspectives. Similarly, attempts to encourage fact-checking by linking to source material drew only marginal user engagement.
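For contrast, the opposing-views agent Xiao describes below can be sketched as the mirror image of the hidden-agenda prompt above; again, the instruction text is an illustrative assumption:

```python
# Illustrative counterpart to the failed intervention: instead of agreeing,
# the hidden instruction foregrounds the opposing view. The wording is an
# assumption, not the researchers' actual prompt.

def build_counter_messages(user_query: str, inferred_stance: str) -> list[dict]:
    opposite = "supportive" if inferred_stance == "opposed" else "opposed"
    instruction = (
        f"The user appears {inferred_stance} on this topic. "
        f"Lead with the strongest {opposite} arguments, clearly sourced."
    )
    return [
        {"role": "system", "content": instruction},
        {"role": "user", "content": user_query},
    ]
```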

The implications of these findings are far-reaching, particularly as AI-based systems become increasingly prevalent in shaping public discourse. Xiao warns of the potential for malicious actors to exploit AI technologies to perpetuate societal polarization.

"Given AI-based systems are becoming easier to build, there are going to be opportunities for malicious actors to leverage AIs to make a more polarized society," Xiao said in a press release statement.

"Creating agents that always present opinions from the other side is the most obvious intervention, but we found they don't work."
