How sure are you about your beliefs, and how good are you at explaining them? Your confidence and opinions might shift when you’re in a virtual space with a group of avatars. Researchers from SWPS University looked into how people tend to be swayed by the views of others, including those of virtual characters.
“We usually conform to the views of others for two reasons. First, we succumb to group pressure and want to gain social acceptance. Second, we lack sufficient knowledge and perceive the group as a source of a better interpretation of the current situation,” the researchers explain.
Moral judgments
Until now, only a handful of studies have explored whether moral judgments, which are assessments of someone else’s actions in a particular scenario, can be influenced by group dynamics. The researchers also delved into how perceptions of others’ behavior shifted when faced with pressure from avatars in a virtual setting.
“Today, social influence is increasingly as potent in the digital world as in the real world. Therefore, it is necessary to determine how our judgments are shaped in the digital reality, where interactions take place online and some participants are avatars, not real humans,” the researchers continue.
In their first study, which involved 103 participants, the researchers examined how much people would adjust their personal moral assessments to align with those of others. Participants first evaluated particular behaviors independently, such as a woman disciplining her child for poor grades or a man speaking loudly on the phone in a cinema. They then reevaluated these behaviors in groups, alongside three others who had responded differently in the first phase of the study.
“Participants adjusted their opinions to conform with others in 43% of cases. However, they did so less often when the judgments concerned situations in which other people were harmed,” the authors say.
Evaluating behaviors
In the second study, conducted with 138 participants within a virtual realm, each individual initially evaluated the conduct of others in specific scenarios. Then, after donning a VR headset, they reevaluated the behaviors in the company of three avatars within a virtual setting.
Some avatars were purportedly under human control, while others were AI-driven. In instances where AI controlled the avatars, participants were informed that the Kent School of Engineering and Digital Arts was conducting tests on their new algorithms, which were integrated into the virtual avatars.
“It turned out that participants changed their judgments to align with those of human-controlled avatars in 30% of cases, and in 26% of cases when the avatars were controlled by AI,” the researchers conclude. “The results suggest that judgments about moral behavior, like other judgments we make, are subject to pressure from both real and virtual groups.”