12/1/2015
Written by August Schiess, CSL
A single crack in a bridge is a subtle change, but one that may eventually bring the bridge down if it goes undetected. Similar phenomena occur in a variety of systems, from single power line outages that lead to blackouts to individual cases of a disease that grow into an epidemic.
“We want to be able to detect a subtle change in the system that could later affect the entire operation of the system,” said Venugopal Veeravalli, a professor of electrical and computer engineering at Illinois. “For example, power systems are designed to be resilient to a few line outages, and so it is possible for a small number of line outages in the transmission network to go undetected since they do not affect the loads or the generators in a significant way. But such outages, if left undetected, could eventually lead to catastrophic failures and blackouts. So you want to be able to detect these outages as soon as possible, and take any necessary action to fix them.”
The team, which received over $1.1 million from the National Science Foundation, will also investigate how the algorithms can detect the spread of disease that leads to epidemics.
The power grid and epidemiology may seem like two very different fields, but the team is focusing on how to broadly analyze sources of information, whether that means measurements from the electric grid or information from hospital databases.
“Every system is modeled differently, but the core of our work is to detect, in real time, variations from the ‘business as usual’ model that describes each system,” said Georgios Fellouris. “We’re building algorithms that can adapt to various systems in order to quickly and accurately detect changes, while controlling the number of false alarms below a tolerable level.”
False alarms, like a smoke detector going off unnecessarily during dinner preparation, create unwarranted panic. The researchers aim to find the optimal tradeoff between the rate of false alarms and the speed of detection.
Although there is a large body of existing work in this field—known as quickest change detection—going back to the 1920s, there has been renewed interest in the field because of modern applications in areas such as cyber resiliency and healthcare.
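To make that tradeoff concrete, the hedged sketch below implements Page's CUSUM procedure, one of the classical tests in quickest change detection. The Gaussian mean-shift model, the parameter values, and the function name are illustrative assumptions for this example, not details of the team's own algorithms.

```python
import numpy as np

def cusum_detector(stream, mu0=0.0, mu1=1.0, sigma=1.0, threshold=8.0):
    """Page's CUSUM test for a shift in mean from mu0 to mu1 (illustrative values).

    Returns the index of the first alarm, or None if the statistic never crosses
    the threshold. A higher threshold means fewer false alarms but a longer
    delay before a real change is flagged.
    """
    stat = 0.0
    for t, x in enumerate(stream):
        # Log-likelihood ratio of "changed" vs. "business as usual" for one sample.
        llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)
        # Accumulate evidence for a change, never letting the statistic go negative.
        stat = max(0.0, stat + llr)
        if stat >= threshold:
            return t  # first time the evidence is strong enough to raise an alarm
    return None

# Example: "business as usual" has mean 0; the mean shifts to 1 at sample 500.
rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(1.0, 1.0, 500)])
print("alarm raised at sample:", cusum_detector(stream))
```

Raising the threshold reduces false alarms but lengthens the delay before a real change is detected, which is exactly the tradeoff the researchers are optimizing.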
“Applications drive the theoretical problems we’re trying to solve,” said Veeravalli. “And we need new algorithms for the new applications.”
Indeed, modern sensor systems have grown enormously, requiring more advanced algorithms to sift through the data and determine exactly when and where significant changes occur.
“Modern systems generate a large amount of data streams that can be monitored in real time, and these subtle changes may be detectable by only a very small portion of these streams,” said Fellouris. “Our challenge is to construct scalable algorithms that are also good from a statistical point of view.”
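One common way the quickest change detection literature handles many streams when only a few of them are affected is to maintain a separate statistic per stream and combine only the most suspicious ones. The sketch below follows that idea with per-stream CUSUM statistics; the top-k combination rule, the model, and all parameters are illustrative assumptions rather than the team's actual method.

```python
import numpy as np

def monitor_streams(data, mu0=0.0, mu1=1.0, sigma=1.0, top_k=3, threshold=25.0):
    """Run one CUSUM statistic per stream and alarm when the sum of the
    top_k largest statistics crosses a global threshold.

    `data` has shape (num_samples, num_streams). Combining only the largest
    per-stream statistics keeps the detector sensitive when a change affects
    just a few of many streams. All parameters here are illustrative.
    """
    num_samples, num_streams = data.shape
    stats = np.zeros(num_streams)
    scale = (mu1 - mu0) / sigma**2
    for t in range(num_samples):
        llr = scale * (data[t] - (mu0 + mu1) / 2.0)   # per-stream log-likelihood ratios
        stats = np.maximum(0.0, stats + llr)          # per-stream CUSUM updates
        global_stat = np.sort(stats)[-top_k:].sum()   # evidence from the most suspicious streams
        if global_stat >= threshold:
            affected = np.argsort(stats)[-top_k:]
            return t, affected                        # when, and roughly where, the change happened
    return None, None

# Example: 100 streams, but only 2 of them shift in mean at sample 300.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=(600, 100))
data[300:, [7, 42]] += 1.0
alarm_time, streams = monitor_streams(data)
print("alarm at sample:", alarm_time, "suspected streams:", streams)
```

Because each stream's statistic is updated with constant work per sample, the cost grows only linearly with the number of streams, which is what makes this style of detector scalable.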