Mathematics
Sequential Analysis: Uncovering New Statistical Methods to Solve Some of the World's Biggest Challenges
Treating cancer and protecting the world against new terrorist threats might seem like a lot to tackle for the average person. But for Professor Michael Baron, a recent addition to American University's Department of Mathematics and Statistics, these priorities are just part of his everyday research on sequential analysis and multiple hypothesis testing.
Consider the cost of a clinical trial. Currently, clinical trials are expensive, time-consuming, and often require testing a large number of subjects in order to be sure that a drug is ready for human use. Lowering the number of subjects needed in a trial could save valuable time and resources. More importantly, it could save participants from unknown risks. In order to be sure that a drug will not have an adverse effect on humans, certain statistical standards must be met in the drug's clinical trials. Such standards are determined (and limited) by currently available statistical methods. But new, more efficient methods that push the boundaries of modern statistics could allow clinicians to determine with statistical certainty, based on fewer subjects, that a drug is safe. In the long term, this would mean more efficient use of resources and potentially life-saving reductions in the time needed to establish a drug's safety. Lowering the cost of clinical trials may also lower the cost of treatments, reducing the overall cost of health care.
Professor Baron's research is rooted in two statistical techniques: sequential analysis and multiple hypothesis testing. Sequential analysis refers to statistical estimation or decision-making performed in real time, as data is collected, rather than retrospectively on a fixed sample size, as is typically done. Multiple hypothesis testing controls significance across many tests run concurrently. While fairly straightforward in isolation, these two methods applied together might decrease the average number of data points needed to make decisions or detect changes while intelligently controlling the allowed error rate. By combining these methods, Baron is working on novel statistical approaches to detect significant changes, with huge implications across a range of fields.
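To make the sequential idea concrete, the sketch below runs a textbook version of Wald's sequential probability ratio test (SPRT), which is discussed later in this article. It is an illustration rather than Baron's own procedure: the Bernoulli data model, the hypothesized rates, and the error levels alpha and beta are all assumptions chosen for the example.

    import math
    import random

    def sprt_bernoulli(p0, p1, alpha, beta, sample):
        # Wald's SPRT for H0: p = p0 versus H1: p = p1. The decision is
        # made in real time, observation by observation, instead of on a
        # fixed, predetermined sample size.
        lower = math.log(beta / (1 - alpha))   # cross below: accept H0
        upper = math.log((1 - beta) / alpha)   # cross above: accept H1
        llr = 0.0                              # cumulative log-likelihood ratio
        n = 0
        for n, x in enumerate(sample, start=1):
            llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
            if llr <= lower:
                return "accept H0", n
            if llr >= upper:
                return "accept H1", n
        return "undecided", n

    random.seed(1)
    trial = (random.random() < 0.35 for _ in range(10_000))  # true rate 0.35
    print(sprt_bernoulli(p0=0.2, p1=0.4, alpha=0.05, beta=0.10, sample=trial))

Because the test stops as soon as the evidence crosses a boundary, typical runs in this setting end after a few dozen observations, often well before a comparable fixed-sample design would.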
Baron joined the American University community in the fall of 2014, but has been working on sequential analysis and related applications since the beginning of his career. He began researching change-point analysis, which studies when a significant change has occurred in a dataset, while pursuing his PhD at the University of Maryland, before connecting this topic to sequential analysis. While consulting on clinical trials, Baron realized that sequential analysis without a way to handle multiple tests concurrently would never be clinically useful. In 2009, while attending a workshop on multiple comparison problems, Baron first questioned how multiple testing was studied in relation to sequentially collected data and discovered a diverse range of problems yet to be solved.
Seeking solutions to problems of this nature, Baron has drawn on the theories of some of history's most famed statisticians: Carlo Bonferroni and Sture Holm, who advanced multiple testing theory; as well as Albert Shiryaev, Abraham Wald, and Jacob Wolfowitz, who are considered the fathers of sequential analysis. The history of sequential analysis dates back to the middle of the 20th century, when intelligence analysts began to modify existing statistical theories to solve difficult problems during World War II. Much of this statistical theory focused on the accuracy and speed of machine gun fire and rocket propellants and was classified until after the war. The impact of sequential analysis on clinical trial research and threat detection was realized decades later.
Baron's current theoretical research at AU has expanded upon Abraham Wald's work on the sequential probability ratio test (SPRT), while taking into account both Bonferroni's and Holm's theories on correcting for error across multiple hypothesis tests. Sequential testing seems like an obvious choice, since it can control error rates and potentially detect differences sooner than traditional sampling. Yet sequential methods can introduce additional biases into parameter estimation, and they may be difficult to implement in practice because they rely on a stopping rule rather than a specific sample size, which creates more uncertainty for clinicians budgeting and planning clinical trials. One of Baron's contributions may be showing that these methods are both theoretically sound and clinically practical.
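As a concrete illustration of the multiple-testing side, the sketch below implements Holm's standard step-down correction and contrasts it with Bonferroni's single threshold. The p-values are invented for the example, and the procedure shown is the classical one, not anything specific to Baron's work.

    def holm_reject(p_values, alpha=0.05):
        # Holm's step-down procedure: controls the family-wise error rate
        # like Bonferroni, but relaxes the threshold step by step as the
        # smallest p-values are rejected.
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])
        reject = [False] * m
        for rank, i in enumerate(order):
            # Bonferroni would use alpha / m at every step.
            if p_values[i] <= alpha / (m - rank):
                reject[i] = True
            else:
                break  # all remaining (larger) p-values also fail
        return reject

    print(holm_reject([0.001, 0.014, 0.04, 0.20]))
    # -> [True, True, False, False]; plain Bonferroni (alpha / m = 0.0125)
    #    would reject only the first of these, so Holm is uniformly more
    #    powerful at the same family-wise error rate.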
But clinical trials are just the tip of the iceberg. Baron also hopes to find ways to analyze data and detect terrorist or biological threats as they are happening, rather than after the fact. A National Science Foundation grant allows Baron to study the application of sequential analysis and multiple hypothesis testing to cyber security. These novel statistical methods will allow computer scientists to pool online data sources concurrently with the goal of detecting significant changes, which may indicate a threat. Biological threat detection is another aim of this impressive body of research; it could facilitate more efficient analysis of online health or epidemiology data, allowing earlier detection of public health crises.
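For a rough sense of what such monitoring looks like, the sketch below runs a textbook one-sided CUSUM change detector over several simulated streams and reports where an alarm fires. The stream names, the size of the shift, and the drift and threshold parameters are all hypothetical; Baron's research concerns how to run many such tests concurrently while controlling the combined error rate, which this simple sketch does not attempt.

    import random

    def cusum_alarm(stream, target_mean=0.0, drift=0.5, threshold=8.0):
        # One-sided CUSUM: signal when the cumulative excess over the
        # in-control mean, less an allowance (drift), exceeds the threshold.
        s = 0.0
        for t, x in enumerate(stream):
            s = max(0.0, s + (x - target_mean - drift))
            if s > threshold:
                return t  # index of the first alarm
        return None

    random.seed(7)
    streams = {  # hypothetical monitored quantities
        "logins":  [random.gauss(0, 1) for _ in range(300)],
        "traffic": [random.gauss(0, 1) for _ in range(150)]
                   + [random.gauss(2, 1) for _ in range(150)],  # shift at t = 150
        "errors":  [random.gauss(0, 1) for _ in range(300)],
    }
    for name, data in streams.items():
        print(name, "alarm at t =", cusum_alarm(data))

On this simulated data, only the stream whose mean actually shifts triggers an alarm, and it does so within a handful of observations of the change.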
Fortunately, Baron is more than capable of carrying the weight of such serious applications. With an obvious enthusiasm for statistical theory and an eye for the subtle elegance of these two statistical methods, Baron is helping move the needle on a diverse and complex range of global challenges.