Superforecaster

Forecasters whose results are more accurate than average

A superforecaster is a person who makes forecasts that can be shown by statistical means to have been consistently more accurate than the general public or experts. Superforecasters sometimes use modern analytical and statistical methodologies to augment estimates of base rates of events; research finds that such forecasters are typically more accurate than experts in the field who do not use analytical and statistical techniques, though this has been overstated in some sources. The term "superforecaster" is a trademark of Good Judgment Inc.

Etymology

The term combines the prefix super-, meaning "over and above" or "of high grade or quality", with forecaster, meaning one who predicts a future outcome.

History

The term is attributed to Philip E. Tetlock, based on results from The Good Judgment Project and his subsequent book with Dan Gardner, Superforecasting: The Art and Science of Prediction.

In December 2019 a Central Intelligence Agency analyst writing under the pseudonym "Bobby W." suggested that the intelligence community should study superforecaster research into how individuals with "particular traits" make better forecasts, and how such individuals could be leveraged.

In February 2020 Dominic Cummings echoed Tetlock and others in suggesting that studying superforecasting was more effective than listening to political pundits.

Superforecasters

Science

Superforecasters estimate the probability of an occurrence and revise that estimate when the circumstances underlying it change. Estimates draw on personal impressions, public data, and input from other superforecasters, while attempting to remove bias. In The Good Judgment Project, one set of forecasters was given training on how to translate their understanding into a probabilistic forecast, summarised in the acronym "CHAMP": Comparisons, Historical trends, Average opinions, Mathematical models, and Predictable biases.
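Accuracy in forecasting tournaments such as the Good Judgment Project is commonly measured with the Brier score: the mean squared error between the probability forecast and the binary outcome (0 = did not happen, 1 = happened), where lower is better. A minimal sketch (the example forecasts are hypothetical):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.

    0.0 is a perfect score; always answering 50% scores 0.25.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster who said 80%, 30%, 90% for three events that resolved yes, no, yes:
print(round(brier_score([0.8, 0.3, 0.9], [1, 0, 1]), 4))  # 0.0467
```

Because the score rewards both calibration and decisiveness, it also explains why revising an estimate as circumstances change pays off: stale probabilities accumulate squared error until updated.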

A study published in 2021 used a Bias, Information, Noise (BIN) model to study the underlying processes enabling accuracy among superforecasters. The conclusion was that superforecasters' ability to filter out "noise" played a more significant role in improving accuracy than bias reduction or the efficient extraction of information.
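The BIN model itself is a Bayesian statistical model; as a loose illustration of why noise reduction matters (not the model itself, and with entirely hypothetical parameters), pooling several independent noisy probability estimates shrinks the noise component while leaving any systematic bias untouched:

```python
import random

random.seed(0)
TRUE_P = 0.7   # hypothetical true probability of an event
BIAS = 0.05    # each forecaster systematically overshoots by this much
NOISE = 0.15   # standard deviation of idiosyncratic noise per forecast

def noisy_forecast():
    # Bias plus Gaussian noise around the true probability, clamped to [0, 1].
    return min(1.0, max(0.0, random.gauss(TRUE_P + BIAS, NOISE)))

def pooled_forecast(n=10):
    # Average of n independent forecasts: noise shrinks roughly by sqrt(n).
    return sum(noisy_forecast() for _ in range(n)) / n

def mean(xs):
    return sum(xs) / len(xs)

single_errors = [abs(noisy_forecast() - TRUE_P) for _ in range(10_000)]
pooled_errors = [abs(pooled_forecast() - TRUE_P) for _ in range(10_000)]

# The pooled error approaches the 0.05 bias floor rather than zero:
print(round(mean(single_errors), 3), round(mean(pooled_errors), 3))
```

The simulation shows the asymmetry the BIN study highlights: averaging away noise yields a large accuracy gain, but no amount of pooling removes a shared bias.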

Effectiveness

In the Good Judgment Project, "the top forecasters... performed about 30 percent better than the average for intelligence community analysts who could read intercepts and other secret data".

Training in specialised techniques may also improve accuracy: in the Good Judgment Project, forecasting accuracy appeared to increase among the group trained in the "CHAMP" methodology.

Superforecasters sometimes judge events to be less than 50% likely that nonetheless happen: Bloomberg notes that in the month of the June 2016 Brexit referendum they put the probability of a Leave vote at 23%. On the other hand, the BBC notes that they accurately predicted Donald Trump's success in the 2016 Republican Party primaries.

Superforecasters also made a number of accurate and important forecasts about the coronavirus pandemic, which "businesses, governments and other institutions" have drawn upon. In addition, they have made "accurate predictions about world events like the approval of the United Kingdom’s Brexit vote in 2020, Saudi Arabia’s decision to partially take its national gas company public in 2019, and the status of Russia’s food embargo against some European countries also in 2019".

Aid agencies are also using superforecasting to determine the probability of droughts becoming famines, while the Center for a New American Security has described how superforecasters aided them in predicting future Colombian government policy. Goldman Sachs drew upon superforecasters' vaccine forecasts during the coronavirus pandemic to inform their analyses.

The Economist notes that in October 2021, Superforecasters accurately predicted events that occurred in 2022, including "election results in France and Brazil; the lack of a Winter Olympics boycott; the outcome of America's midterm elections, and that global Covid-19 vaccinations would reach 12bn doses in mid-2022". However, they did not forecast the emergence of the Omicron variant. The following year, The Economist wrote that all eight of the Superforecasters’ predictions for 2023 were correct, including on global GDP growth, Chinese GDP growth, and election results in Nigeria and Turkey.

In February 2023, Superforecasters made better forecasts than readers of the Financial Times on eight out of nine questions that were resolved at the end of the year.

Traits

One of Tetlock's findings from the Good Judgment Project was that cognitive and personality traits mattered more than specialised knowledge in predicting the outcomes of world events, often more accurately than intelligence agencies. In particular, a 2015 study found that the key predictors of forecasting accuracy were "cognitive ability, political knowledge, and open-mindedness". Superforecasters "were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness". In the Good Judgment Project, the superforecasters "scored higher on both intelligence and political knowledge than the already well-above-average group of forecasters" who were taking part in the tournament.

People

  • Regina Joseph, Good Judgment Project superforecaster, technologist and founding Editor-in-Chief of Blender Magazine, former Futures Division leader and Defence/Security Senior Research Fellow at Clingendael Institute, forecasting science researcher and inventor
  • Elaine Rich, a superforecaster who participated in the Good Judgment Project.
  • Andrew Sabisky, who resigned from his position as advisor to the United Kingdom government at Downing Street, with chief advisor Dominic Cummings telling journalists "read Philip Tetlock's Superforecasters, instead of political pundits who don't know what they're talking about".
  • Nick Hare, former head of futures and analytical methods at the Ministry of Defence (MoD).
  • Reed Roberts, a former PhD student in chemistry.
  • Jonathon Kitson
  • Jean-Pierre Beugoms
  • Dan Mayland
  • Kjirste Morrell
  • Dominic Smith

Criticism

The concept of superforecasting has been criticised from multiple angles. Nassim Nicholas Taleb has been a particularly strong critic, arguing, among other things, that forecasting is not useful to decision makers and that the lack of financial gain accrued by superforecasters suggests their actual predictive powers are lacking. Counter-terrorism expert Suzanne Raine criticises it for placing too much emphasis on "what is going to happen" rather than "what is happening" and "how can the future be changed".

References

  1. Adonis (2020).
  2. "Can Policymakers Trust Forecasters?". Institute for Progress. Retrieved 26 August 2024.
  3. "Trademark Electronic Search System (TESS)". tmsearch.uspto.gov. Retrieved 5 January 2023.
  4. "Super Definition & Meaning". Merriam-Webster. Archived from the original on 1 November 2023. Retrieved 1 November 2023.
  5. Tetlock & Gardner (2015).
  6. Bobby W. (2019), p. 14.
  7. BBC News (2020).
  8. BBC News (2020), What is the science behind it?.
  9. Harford (2014), How to be a superforecaster.
  10. Satopää, Ville A.; Salikhov, Marat; Tetlock, Philip E.; Mellers, Barbara (2021). "Bias, Information, Noise: The BIN Model of Forecasting". Management Science. 67 (12): 7599–7618. doi:10.1287/mnsc.2020.3882.
  11. David Ignatius. "More chatter than needed". The Washington Post. 1 November 2013.
  12. Horowitz MC, Ciocca J, Kahn L, Ruhl C. "Keeping Score: A New Approach to Geopolitical Forecasting" (PDF). Perry World House, University of Pennsylvania. 2021, p.9.
  13. BBC News (2020), How successful is it?.
  14. Tara Law. "'Superforecasters' Are Making Eerily Accurate Predictions About COVID-19. Our Leaders Could Learn From Their Approach." TIME. 11 June 2020.
  15. Cochran KM, Tozzi G. "Getting it Righter, Faster: The Role of Prediction in Agile Government Decisionmaking". Center for a New American Security. 2017.
  16. Hatzius J, Struyven D, Bhushan S, Milo D. "V(accine)-Shaped Recovery". Goldman Sachs Economics Research. 7 November 2020.
  17. What the “superforecasters” predict for major events in 2023. The Economist. 18 November 2022
  18. What the “superforecasters” predict for major events in 2024. The Economist. 13 November 2023
  19. The art of superforecasting: how FT readers fared against the experts in 2023. Financial Times. 26 December 2023
  20. Burton (2015).
  21. Mellers B, Stone E, Atanasov P, Rohrbaugh N, Metz SE, Ungar L, et al. "The psychology of intelligence analysis: drivers of prediction accuracy in world politics" (PDF). Journal of Experimental Psychology: Applied. 2015;21(1):1-14.
  23. Tetlock, Philip E.; Gardner, Dan (2015). Superforecasting: The Art and Science of Prediction. Crown. ISBN 9780804136693.
  24. VICE News (19 May 2017). Chechnya Abuse & The FBI Firing: VICE News Tonight Full Episode (HBO). Retrieved 27 August 2024 – via YouTube.
  25. New York Magazine. New York Media, LLC. 13 November 1995.
  26. WHO KNEW (27 April 2021). WHO KNEW The Smartest People In The Room - Regina Joseph & David Hughes. Retrieved 27 August 2024 – via YouTube.
  27. "Blender (magazine)", Wikipedia, 27 August 2024, retrieved 27 August 2024.
  28. Joseph, Regina. "Clingendael Futures" (PDF). Clingendael Futures.
  29. International Institute of Forecasters (15 August 2022). Forecasting Practices and Processes 5. Retrieved 27 August 2024 – via YouTube.
  30. NSF PREPARE (18 January 2022). RP2 Day 2 Lightning Round 7: Social, Behavioral, Economic & Governance. Retrieved 27 August 2024 – via YouTube.
  31. USPTO.report. "Systems and Methods for Bias-Sensitive Crowd-Sourced Analytics Patent Application". USPTO.report. Retrieved 27 August 2024.
  32. USPTO.report. "Systems and Methods for Multi-Source Reference Class Identification, Base Rate Calculation, and Prediction Patent Application". USPTO.report. Retrieved 27 August 2024.
  33. Nilaya (2015), Guests.
  34. "Superforecasting: The Future's Chequered Past and Present". whynow. 8 February 2021. Retrieved 17 July 2021.
  35. "Superforecaster Profiles". Good Judgment. Retrieved 17 July 2021.
  36. Taleb, Nassim Nicholas; Richman, Ronald; Carreira, Marcos; Sharpe, James (1 April 2023). "The probability conflation: A reply to Tetlock et al". International Journal of Forecasting. 39 (2): 1026–1029. doi:10.1016/j.ijforecast.2023.01.005.
  37. Raine, Suzanne. "Superforecasting will not save us". Engelsberg Ideas. Retrieved 26 August 2024.
