12/15/2021
Researchers at the University of Illinois Urbana-Champaign and George Mason University are exploring technology that may provide a “third eye” for firefighters using 360 video.
Written by Kim Gudeman, CSL
Burns, falls, and crush injuries from collapsing structures: these are all hazards firefighters face whenever they rush into burning buildings. While training helps prepare these emergency responders for many dangers, firefighting remains one of the most dangerous jobs in the country.
Researchers at the University of Illinois Urbana-Champaign and George Mason University are exploring technology that may provide a “third eye” for firefighting teams using 360-degree video. Funded by the National Science Foundation, the project seeks to improve commanders’ situational awareness by combining 360-degree video with augmented reality, which overlays virtual information on the physical world.
Traditionally, remote commanders view emergency sites and direct operations using video captured by firefighters’ helmet cameras. Unfortunately, these systems show commanders only a single view of the site at a time.
“We’re building a system that not only gives commanders a more complete view of what is happening in a burning building, but also supplements it with augmented reality technology that uses machine learning to highlight potential problems,” said Klara Nahrstedt, Grainger Distinguished Chair in Engineering Professor in the Department of Computer Science and director of the Coordinated Science Laboratory at UIUC.
“If a commander is watching the videos of multiple firefighters at the same time and trying to listen to audio communications over several channels, it may be difficult to catch an urgent development in time without technology bringing it to the forefront,” she said. “We hope to use 360 video to give commanders a third eye.”
The key to the project, said Nahrstedt, will be identifying points of view that are meaningful to commanders so that the research team can train the system to recognize these incidents when they happen. To better understand those incidents, the team will work with firefighters at the Illinois Fire Service Institute, the state’s fire training academy based at UIUC, and will meet with commanders to identify objects and events of interest. The team will also create a dataset of 360-degree videos that captures a comprehensive picture of possible events during a fire.
Then researchers will train the software system to recognize notable incidents.
“We will need a model that helps us analyze input and identify what are the important things going on,” said Zhisheng Yan, an assistant professor of information science and technology at George Mason. “In the research community, there is currently no such model available.”
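For illustration only, here is a minimal sketch of what such an event-recognition component could look like, assuming a frame-level classifier fine-tuned on commander-annotated 360-degree frames. The event labels, model choice, and placeholder data below are hypothetical and are not drawn from the team’s actual design.

```python
# Hypothetical sketch: frame-level event recognition for 360 video.
# Labels, data, and model choice are illustrative assumptions only.
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical events a commander might want flagged automatically.
EVENTS = ["normal", "flashover_risk", "downed_firefighter", "structural_collapse"]

# Start from a standard image backbone; in practice one would load
# pretrained weights and fine-tune on frames annotated by commanders.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(EVENTS))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch: 8 equirectangular frames resized to 224x224.
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, len(EVENTS), (8,))

# One training step on the annotated frames.
model.train()
optimizer.zero_grad()
loss = loss_fn(model(frames), labels)
loss.backward()
optimizer.step()

# At inference time, flagged frames could be highlighted in the
# commander's augmented-reality view.
model.eval()
with torch.no_grad():
    predicted = model(frames).argmax(dim=1)
    print([EVENTS[i] for i in predicted.tolist()])
```

In practice, the labels and training data would come from the commander interviews and the 360-video dataset described above, and the model would need to handle the distortions and wide field of view unique to 360-degree footage, which is part of what makes the problem open in the research community.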
In addition to firefighting, the technology could eventually be applied to wildlife monitoring or airport surveillance, among other areas.
Yan noted that firefighters have been slower to adopt technology than other emergency responders, instead relying on instinct. He believes these emerging technologies could be combined with personal experience to provide the best overall view in less-than-ideal conditions.
“We really believe that this can help save limbs and lives,” Yan said. “Through this project, we can make this advanced technology available to help solve these real-life problems.”