The Good Judgment Project (GJP) is a project "harnessing the wisdom of the crowd to forecast world events". It was co-created by Philip E. Tetlock (author of Superforecasting and of Expert Political Judgment: How Good Is It? How Can We Know?), decision scientist Barbara Mellers, and Don Moore. It was a participant in the Aggregative Contingent Estimation (ACE) program of the Intelligence Advanced Research Projects Activity (IARPA) in the United States. Predictions are scored using Brier scores. The top forecasters in GJP are "reportedly 30% better than intelligence officers with access to actual classified information."
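The Brier score measures the accuracy of probabilistic forecasts; lower is better, with 0 indicating a perfect forecast. As a minimal sketch (not GJP's actual scoring code), the original multi-category formulation sums the squared differences between the forecast probabilities and the actual outcome over all answer categories, then averages over questions:

```python
def brier_score(forecasts, outcomes):
    """Original multi-category Brier score.

    forecasts: list of probability vectors, one per question
               (each vector sums to 1 over the answer categories).
    outcomes:  list of indices of the category that actually occurred.
    Returns the mean over questions of the sum of squared errors;
    ranges from 0 (perfect) to 2 (maximally wrong) per question.
    """
    total = 0.0
    for probs, actual in zip(forecasts, outcomes):
        total += sum((p - (1.0 if i == actual else 0.0)) ** 2
                     for i, p in enumerate(probs))
    return total / len(forecasts)

# Two binary questions: the forecaster gave 80% to the outcome that
# occurred on the first, and only 30% to the outcome that occurred
# on the second.
score = brier_score([[0.8, 0.2], [0.3, 0.7]], [0, 1])
print(score)  # 0.13
```

A forecaster who assigns probability 1 to every realized outcome scores 0; one who assigns probability 1 to the wrong outcome of a binary question scores 2 on that question.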
The Good Judgment Project began in July 2011 in collaboration with IARPA-ACE. The first contest began in September 2011.
GJP was one of several teams in the IARPA-ACE tournament and repeatedly emerged as its winner.
Starting in the summer of 2013, GJP contestants had access to the Integrated Conflict Early Warning System.
The GJP was co-led by Philip Tetlock, Barbara Mellers, and Don Moore. The website lists a total of about 30 team members, including the co-leaders as well as David Budescu, Lyle Ungar, Jonathan Baron, and prediction-markets entrepreneur Emile Servan-Schreiber. The advisory board included Daniel Kahneman, Robert Jervis, J. Scott Armstrong, Michael Mauboussin, Carl Spetzler, and Justin Wolfers. The study employed several thousand volunteer forecasters. Using personality-trait tests, training methods, and team strategies, the GJP researchers selected forecasting participants who exhibited less cognitive bias than the average person; as the forecasting contest continued, they further winnowed these individuals into groups of so-called superforecasters. The last season of the GJP enlisted a total of 260 superforecasters.