Bystander bots to combat cyberbullying
“Developmental psychology has shown that instances of bullying can be mitigated by bystander behavior. Someone watching the situation can defuse it or even stop the bully from acting further,” Bhat said. “This has motivated us to see if we could recreate bystander behavior on an online platform where bullying occurs.”
Many social media platforms have task forces dedicated to fighting bullying by manually monitoring and deleting offensive posts, but they face a losing battle. On YouTube, the platform chosen for Bhat’s research, 300 hours of new content are uploaded every minute for its more than 1.8 billion users to watch, leaving ample opportunity for bullying to slip under the radar.
Given these issues of scale, how does Bhat plan to track bullying and intervene? Answer: with bullying-detecting bots. “Bullying is very pervasive in online platforms, so we must first be able to detect with high confidence whether or not a comment is bullying,” Bhat explained. “Once we detect a bullying incident, we can have a bot act as an automatic bystander and post a prosocial message.”
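The two-step loop Bhat describes — score a comment, and only post a bystander message when confidence is high — can be sketched roughly as below. This is a minimal illustration, not the research team's system: the keyword heuristic, the cue list, the threshold value, and the function names are all hypothetical stand-ins for a trained classifier.

```python
# Illustrative sketch of a detect-then-intervene bystander bot.
# The scoring function is a hypothetical keyword heuristic; the actual
# research would use a trained bullying classifier in its place.

BULLYING_CUES = {"loser", "stupid", "ugly", "worthless"}  # illustrative only
CONFIDENCE_THRESHOLD = 0.8  # assumed value, not from the research


def bullying_score(comment: str) -> float:
    """Return a crude 0-1 score based on the fraction of cue words."""
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in BULLYING_CUES)
    return min(1.0, 4 * hits / len(words))  # scaled so a few cues score high


def bystander_reply(comment: str):
    """Post a prosocial message only when detection confidence is high."""
    if bullying_score(comment) >= CONFIDENCE_THRESHOLD:
        return "Hey, let's keep this space respectful. Words can hurt."
    return None
```

The high-confidence threshold matters: a bot that intervenes on false positives would clutter conversations, so benign comments should pass through silently.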
Bhat emphasized that what bullying looks like can vary based on the social media platform, meaning there can’t be a one-size-fits-all solution. She hopes her bot bystanders can be the first step in reducing the number of bullying incidents and the negative effect on the victim.
“We can’t control how people behave online but we can control how we deal with it,” Bhat said. “We want to see if we can use an automatic bystander to mitigate the situation. That’s what motivates us.”
This research was funded by the National Science Foundation.
As we get into the holiday season, we at CSL are putting together a series of stories about research for which we are grateful. This is the second in the Thankful Series featuring research our scientists have conducted for social good. Read the first here.