Washington, DC - When it comes to predicting important world events, teams do a better job than individuals, and laypeople can be trained to be effective forecasters even without access to classified records, according to new research published by the American Psychological Association.
According to the authors, the study findings challenge some common practices of the U.S. intelligence community, where professional analysts usually specialize in one topic or region and send reports up the chain of command. In what the authors call the first scientific study of its kind, researchers identified common characteristics that improved predictions by amateur participants in a geopolitical forecasting tournament. The contest was sponsored by the Intelligence Advanced Research Projects Activity (IARPA), an agency within the Office of the Director of National Intelligence that funds research to improve intelligence practices.
“Teams could share information and discuss their rationales but still submit anonymous forecasts,” said Barbara Mellers, PhD, one of the lead researchers and a psychology and marketing professor at the University of Pennsylvania. “This type of teamwork that protects dissent is really important, and I don’t think it’s being used to the full extent that it should be in the intelligence community.”
The most accurate forecasters in the tournament scored higher in pattern detection, cognitive flexibility, knowledge of geopolitics and open-mindedness, including a willingness to consider unorthodox outcomes, the study found. “They would consider ideas and possibilities that were different from their pet theories or beliefs,” Mellers said.
Researchers from the University of Pennsylvania organized a 743-person group that beat four other university-based groups in the tournament, held from 2011 to 2013. The tournament sought predictions of 199 world events that were of interest to the U.S. intelligence community, including picking the winner of the 2012 presidential election in Taiwan, determining whether Syrian President Bashar al-Assad would remain in power, and predicting whether North Korea would conduct another successful nuclear weapon test. Tournament competitors were laypeople who had no access to classified records, but their predictions were scored for accuracy by the U.S. intelligence community. The research, which focused solely on the University of Pennsylvania group, was published online in the Journal of Experimental Psychology: Applied®.
Predicting world events that bear on U.S. national security is difficult; forecasts must account for numerous factors, including civil wars, terrorist attacks, natural catastrophes and shifts in political allegiances in countries across the globe. Inaccurate intelligence analysis can have very costly results, such as the U.S. war in Iraq, which was based on inaccurate claims about weapons of mass destruction, Mellers said. Methods used by the U.S. intelligence community are outdated, she added.
“Institutional accountability is very strong, and there is a fear of taking risks or doing things differently,” Mellers said. “You don’t want to get caught on the wrong side of maybe because the stakes can be very high.”
The University of Pennsylvania group comprised participants recruited from professional societies, research centers and science blogs, representing a wide range of professions, including computer science, finance and mathematics. They were largely U.S. citizens (76 percent) and male (83 percent), with an average age of 36. Almost two-thirds (64 percent) had some postgraduate training.
Group members were assigned to work on their own or in teams; roughly 20 teams were formed, averaging about 15 members each. All study participants could update their forecasts until a closing date for each question, and the University of Pennsylvania group made more than 150,000 forecasts across the 199 questions during the tournament. Teams performed approximately 10 percent better than individuals working alone.
Participants who received training in probabilistic reasoning as part of the study also performed better in the tournament. The training covered using forecasting models to average the likelihoods of all possible outcomes of a future world event, along with instruction on avoiding personal biases. Participants who viewed forecasting as a skill that requires practice and constant monitoring of current affairs also fared better.
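The release does not spell out how probability forecasts were combined or scored. One standard approach for yes/no questions of this kind is to average team members' probability estimates and evaluate the result with a Brier score, the squared error between the stated probability and the realized 0-or-1 outcome. The short Python sketch below, using invented probabilities and a hypothetical question, illustrates that idea; it is an illustration of a common scoring rule, not necessarily the tournament's actual method.

# Illustrative sketch only: the probabilities and question are invented, and the
# Brier score shown here is a common scoring rule for probability forecasts,
# not confirmed as the exact metric used in the tournament.

def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and a 0/1 outcome
    (lower is better; 0.0 is a perfect forecast)."""
    return (forecast - outcome) ** 2

# Hypothetical probabilities from four team members for a yes/no question,
# e.g. "Will country X conduct a nuclear test before the closing date?"
member_forecasts = [0.70, 0.55, 0.80, 0.65]

# A simple team forecast: the unweighted average of the members' probabilities.
team_forecast = sum(member_forecasts) / len(member_forecasts)

# Suppose the event actually occurred (outcome = 1).
outcome = 1

print(f"Team forecast: {team_forecast:.2f}")
print(f"Team Brier score: {brier_score(team_forecast, outcome):.3f}")
for p in member_forecasts:
    print(f"  member p={p:.2f} -> Brier {brier_score(p, outcome):.3f}")

In this toy example the averaged team forecast (0.68) earns a better Brier score than the two most hesitant individual forecasts, which is one intuition for why pooling estimates across a team can help.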
Since winning the tournament, the University of Pennsylvania group has continued to provide forecasts to the U.S. intelligence community to be scored for accuracy, Mellers said. Geopolitical forecasting tournaments should become a regular part of research aimed at improving intelligence analysis and tracking the performance of analysts, the study concluded.