Recently, there have been many stories about how everything from disease to fake news spreads. Until now, researchers have often studied both in the same way: a contagious agent (or piece of fake news) passes through proximal contact across a population. But treating fake news exactly like a virus fails to account for the psychological factors that drive its spread.
As part of a newly funded, $6.25 million Multidisciplinary University Research Initiative (MURI) Award, CSL professors Cedric Langbort, Tamer Başar, and Michel Regenwetter are working with an interdisciplinary team to look at fake news from a dynamic and psychological standpoint, with the goal of more accurately understanding the spread.
“In the first wave of research on the spread of information over networks, researchers thought information was like a virus, in that what determined the spread is the structure of the
network and its virus-like evolution,” said Langbort, principal investigator on the project and associate professor of aerospace engineering. “More recent work is looking at the people element. If you have certain psychological traits, you may be more likely to retweet certain kinds of news.”
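The virus-like view Langbort describes in that first wave of research can be sketched as a simple cascade over a network. The following is a generic illustration of that modeling style, not the project's own model; the graph, seed set, and transmission probability are all made up for the example:

```python
import random

def simulate_cascade(graph, seeds, p_transmit, rng=None):
    """Independent-cascade sketch: each newly 'infected' node gets one
    chance to pass the item to each neighbor with probability p_transmit."""
    rng = rng or random.Random(0)
    infected = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for node in frontier:
            for neighbor in graph.get(node, []):
                if neighbor not in infected and rng.random() < p_transmit:
                    infected.add(neighbor)
                    next_frontier.append(neighbor)
        frontier = next_frontier
    return infected

# Toy follower network: "a" is followed by "b" and "c", and so on.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": [], "e": []}
reached = simulate_cascade(graph, seeds={"a"}, p_transmit=0.9)
```

In this picture, who ends up seeing the story depends only on the network structure and a transmission probability; the newer work described below asks what happens when that probability itself depends on who the sender and receiver are.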
A recent study from Sweden supports the hypothesis of the MURI project, titled “A multimodal approach to network information dynamics.” The Swedish study shows that if a person cares about what society or their online friends think of them, they may be more likely to share information those people would like and less likely to share a dissenting opinion. How much people value their social identity and their identification with a group shapes their behavior both online and offline. People’s psychological and social traits can help predict whether they will interact with messaging and share it with others.
“We take into account the psycho-cognitive elements, but we don’t treat them as immutable characteristics -- rather as dynamic messages themselves,” said Langbort. “We’re interested in isolating different modes of transmission. Are you trying to pass information to me because you’re informing me, or are you crafting what you’re sending me because you expect it to get me to do something?”
The expectation of acting on or being influenced by a message is one of the modes the group is looking at. The team, which also includes Stanford University’s Economics Professor Matthew Gentzkow, Communication Professor Jeff Hancock, and Management Science & Engineering Assistant Professor Johan Ugander, is looking at three different motivations or ‘modes’ for sharing information online.
The first mode is self-expression. Users share information because they want to bring attention to themselves or to satisfy a personal need. The second mode is sharing information for monetary gain.
The third mode for sharing information has been the focus of much previous research by the team and others. Persuasion, also known as influencing or manipulation, is sharing information to change or shape the receiver’s opinions so that they align with the sender’s. A previous New York Times study found that more than half of respondents would share a story they thought would influence others, even if it wasn’t factual. That finding shaped the research the group plans to do.
“The really important conclusion of the Times study is that more than 50% of respondents think that by sharing either fake or real stories they are going to influence others and make other people think the way they want them to think,” said Başar, Swanlund Endowed Chair and professor of electrical and computer engineering (ECE). “We are looking at what is the motivation of those who make up information and send it to others to shape their opinions, as well as on the receiving end why people do not corroborate the validity of the information they have received and are willing to share with others?”
For the last several years, Langbort and BaÅŸar have been working to develop a dynamic theory of persuasion built on game-theoretic approaches. They are working to build out this theory and understand how it is relevant to the persuasion mode for this research. Eventually they hope to build a more dynamic mathematical model to both explain and predict people’s actions when it comes to sharing misinformation.
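To see why a game-theoretic lens matters for persuasion, consider a toy signaling game. This is a minimal, hypothetical sketch of the general style of model, not Langbort and Başar's actual theory; all the names and payoffs here are illustrative:

```python
# A sender observes whether a story is "good" or "bad" and picks a message;
# a credulous receiver shares whenever the message says "good".
states = ["good", "bad"]
messages = ["good", "bad"]

def credulous_receiver(message):
    # The receiver takes the message at face value.
    return "share" if message == "good" else "ignore"

def sender_payoff(action):
    # The sender wants the story shared regardless of its true quality.
    return 1 if action == "share" else 0

# The sender's best response in each state, given the receiver's rule:
best_message = {
    state: max(messages, key=lambda m: sender_payoff(credulous_receiver(m)))
    for state in states
}
```

Against a credulous receiver, the sender reports “good” in every state, so the messages carry no information; a rational receiver would then stop trusting them. A dynamic theory asks how such sender and receiver strategies co-evolve over repeated interactions.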
Regenwetter, a professor of psychology who also holds an affiliation with political science and ECE, will work to see whether attempts to influence others by sharing information, true or false, are effective.
“One of the first questions we want to tackle is how participants in an experiment update beliefs based on exposure to true and false evidence,”
said Regenwetter. “A core question here is whether it is reasonable to model human decision makers as people who intuitively follow the rules of probability theory (rational theory). There is lots of prior evidence that people don’t follow probability calculus in their reasoning.”
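The “rational” baseline Regenwetter refers to, updating a belief by the rules of probability, can be written in a few lines. This is a generic textbook sketch of Bayes' rule, not the project's experimental model, and the numbers below are made up for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability that a claim is true after one piece of
    evidence, computed by Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start agnostic (prior 0.5) and see evidence that is twice as likely
# to appear if the claim is true (0.8) as if it is false (0.4).
belief = bayes_update(0.5, 0.8, 0.4)  # rises to 2/3
```

The open empirical question is how far real participants' belief updates, after exposure to true and false evidence, deviate from this calculation.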
Altogether, the team brings a wide range of expertise to the research, letting it examine as many crucial aspects of the problem as possible. They are excited about the potential, and ready to get to work.
“Even writing this proposal was a lot of fun,” said Langbort. “We had really interesting planning meetings so I’m eager to bring all of these aspects together and get started.”