CSL Professor works to reduce fake news dissemination

2/26/2020

Allie Arp, CSL

With the United States in the midst of another election cycle, many researchers are trying to prevent the sharing of fake or bad news. CSL Assistant Professor Rasoul Etesami is working to improve the models behind social media tracking, in order to better understand how bad news is shared.

“In social networks, decision-makers are humans and they decide who to interact with and how to manipulate others’ opinions,” said Etesami. “We want to study the stability of such networks, in terms of whether an outcome can be predicted or controlled toward a certain direction.”

Most current models for analyzing social networks treat the network, whether a social network or a power grid, as fixed and time-invariant. In these models there is a network (for example, Facebook), there are agents (humans), and there are interactions (sharing content), which produce an outcome. Etesami believes this doesn’t accurately portray how networks operate, because agents’ decisions and interactions evolve dynamically, thereby changing the structure of the network itself.

“There isn’t one well-accepted model in cognitive decision making. There are proposed models, and some of them have proven more successful and descriptive than others,” said Etesami, an industrial and enterprise systems engineering assistant professor. “Our project is to take those models and analyze them critically. If we see shortcomings in existing models, we extend them by adding extra features or constraints to capture more realistic and sophisticated scenarios.”

As an example, Etesami brought up the last presidential election, the outcome of which some people believe was changed by the spread of false news over social media platforms. Whether or not that is true, Etesami says this type of information dissemination exists.

“You can easily manipulate people’s decisions by spreading false news over a network that causes people to connect or disconnect themselves from the true source of information based on whether they like or don’t like the message,” he said. “If we have a better understanding of the dynamics of agents’ decisions and the stability of the networks, we can control the propagation of false news. This can help us protect our social networks from adversarial attacks that can affect the whole population.”
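The connect-or-disconnect dynamic Etesami describes can be illustrated with a toy simulation. The sketch below is not the team's model; it is a minimal bounded-confidence-style example in which each agent links only to agents whose opinions it "likes" (those within a fixed threshold) and then averages its neighbors' opinions, so the network structure and the agents' states co-evolve. All names and parameters here are illustrative assumptions.

```python
import random

def simulate(n_agents=20, steps=30, threshold=0.3, seed=0):
    """Toy state-dependent network dynamic (illustrative only).

    Agents hold opinions in [0, 1]. At each step every agent keeps
    links only to agents whose opinion lies within `threshold` of its
    own (it 'likes' their message), then updates its opinion to the
    average over that neighborhood. The edge set therefore changes
    with the agents' states instead of being fixed and time-invariant.
    """
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(steps):
        updated = []
        for i in range(n_agents):
            # Rewire: agent i connects only to agents it agrees with
            # (its own opinion always qualifies, so the list is nonempty).
            neighbors = [opinions[j] for j in range(n_agents)
                         if abs(opinions[j] - opinions[i]) <= threshold]
            # Update: average over the current, state-dependent neighborhood.
            updated.append(sum(neighbors) / len(neighbors))
        opinions = updated
    return opinions
```

Running the simulation shows opinions collapsing into clusters of like-minded agents, a simple version of the echo-chamber effect that makes a population vulnerable to targeted false news.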

As part of the recently funded project “Duality and stability in complex state-dependent network dynamics,” Etesami and his team are working to develop a platform that can simulate human behavior within a dynamic social network like Facebook or Twitter. The data collected through these simulations will allow them to observe how the relationship between networks and agents evolves, and how the agents interact with each other through connecting and sharing information. Once researchers analyze this data, the team plans to move forward with building a richer model.

This research is funded by a five-year, $503,773 grant from the National Science Foundation.