

Superforecasting: The Art and Science of Prediction (2015 original; 2015 edition)

by Dan Gardner

Members: 624 · Reviews: 31 · Popularity: 27,720 · Average rating: (3.95) · Mentions: 12
"From one of the world's most highly regarded social scientists, a transformative book on the habits of mind that lead to the best predictions. Everyone would benefit from seeing further into the future, whether buying stocks, crafting policy, launching a new product, or simply planning the week's meals. Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts' predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught? In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people--including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer--who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They've beaten other benchmarks, competitors, and prediction markets. They've even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters." In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden's compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn't require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course.
Superforecasting offers the first demonstrably effective way to improve our ability to predict the future--whether in business, finance, politics, international affairs, or daily life--and is destined to become a modern classic"--
Member: idiopathic
Title: Superforecasting: The Art and Science of Prediction
Authors: Dan Gardner
Info: Crown, Kindle Edition, 355 pages
Collections: Your library
Rating:
Tags: currently-reading

Work details

Superforecasting: The Art and Science of Prediction by Philip E. Tetlock (2015)


There are no discussions in Talk about this work.

» See also 12 mentions

Showing 1-5 of 33
An excellent book about prediction. It goes well with Thinking, Fast and Slow and with Andy Grove's High Output Management.

Tetlock ran the Good Judgment Project and has extensively studied predictions and their accuracy. He compares intellectually rigorous but unspecialized predictions based on public information with those of professional analysts in the Intelligence Community, and finds that the public performs remarkably well. A few of the key factors seem to be that many professionals have a pre-existing agenda or mental model they try to fit the facts to, while the best forecasters rely on long-term habits of observing and correcting their forecasts.
  octal | Jan 1, 2021 |
Superforecasting: The Art and Science of Prediction is a book I've been meaning to get around to for a while. While the usual saying goes that many 'experts' are no better than dart-throwing chimps when it comes to making predictions, there really are some people who are very good at those same forecasts. Who are they, and how do you get to think like them? This book is about that. The first few chapters are critical, as are the Eleven Commandments at the back, but there's a decent amount of filler.

Many people we consider 'experts' are primarily storytellers, pundits, individuals skilled at crafting narratives. Confidence means you have a coherent narrative, not that you have the truth. Doubt and uncertainty are fundamentally undervalued, because knowing what we don't (or can't) know is valuable in itself. Meta-cognition is no small feat to master, but one that is incredibly valuable.

The problem with forecasting, by and large, is that we don't keep track of our hits and misses, or of how confident we were when we made them. Few people reflect on what they predicted would happen months or years ago, and even fewer can remember how confident they were in that prediction. Hindsight bias is a monster to be overcome for exactly that reason. Tools like PredictionBook and Brier scores are critical for establishing performance metrics and overcoming your own biases. Without a baseline, it's very hard to tell if you're getting better.
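The scorekeeping described above is easy to automate. Here is a minimal sketch of scoring a personal forecast log with the two-outcome Brier score Tetlock uses (range 0 to 2, lower is better); the forecast data is invented for illustration.

```python
def brier_score(prob_yes: float, occurred: bool) -> float:
    """Two-outcome Brier score: 0.0 is a perfect forecast, 2.0 is maximally wrong."""
    outcome = 1.0 if occurred else 0.0
    # Sum the squared error over both outcomes (yes and no).
    return (prob_yes - outcome) ** 2 + ((1 - prob_yes) - (1 - outcome)) ** 2

# Hypothetical forecast log: (predicted probability, what actually happened).
log = [(0.9, True), (0.7, True), (0.6, False), (0.2, False)]
mean_score = sum(brier_score(p, o) for p, o in log) / len(log)
print(round(mean_score, 3))  # average score across the log
```

A forecaster who always said 50% would score 0.5 on every question, which gives a useful baseline to beat.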

The bits on probability were interesting, particularly the danger of being on the "wrong side of maybe". Even if you predict something as unlikely to come to pass, and it happens, that does not mean the prediction was fundamentally flawed; it just means the unlikely happened. Training yourself to think about probability in these terms is also hard for our where's-the-lion brains to process.

The intelligence community could definitely benefit from making this more systematic, even if just informally. A kind of hive-minding for the collective consciousness. Rainbows End showed this particularly well.
  pvoberstein | Dec 14, 2020 |
This was a fascinating look at "superforecasters" - basically, the people who are best at predicting the (near) future. They're not always the people you'd expect, though; none of the pundits who make their living doing forecasts for the news wanted to be tested, so who knows how well they're doing! But a lot of the superforecasters described in this book are regular joes who do this for fun on the side. There's a lot of good advice in here on how you too can become a superforecaster; it's a learnable skill, if you're willing to put in the time and effort to practice. Some of the lessons are good to know for everyday rational thinking as well, and reading this book has made me want to try to put some of these suggestions into practice myself.
  katebrarian | Jul 28, 2020 |
A great read, probably the best light read this summer. A book I would recommend whole-heartedly to anyone, since we are all forecasters. Usually I am not a fan of the Predictably Irrational/Freakonomics model of turning an academic paper into a full-length book, but exceptions must be made. Tetlock was famous for producing the dart-throwing chimp study (interestingly, fame was inversely proportional to accuracy, and there are many examples of pundits who project a vague forecast forward and then rationalize it ex post), but the work that produced this book is far more interesting and meaningful. Tetlock starts with a digression into the history of medicine, which consistently spurned randomized trials and empiricism to the harm of patients and of the practice itself (it reminds me of Taleb's work as well as the general narrative of The Emperor of All Maladies). Tetlock argues that without randomized testing, our folk wisdom is essentially a shot in the dark. As a faint-hearted empiricist, I cannot agree with him more. Tetlock discusses how many forecasts are unfalsifiable because the time frame is too vague, the event is ill defined, and the language is ambiguous ("maybe", "significant", "probably"). Tetlock argues, fairly, that this vagueness sometimes comes from a common understanding that is quickly forgotten, from a general discomfort with turning probabilities into numbers, or, more cynically, from a desire to cover one's own forecasts (which is not totally irrational, as people have trouble understanding probabilistic forecasts and are quick to pin blame). A point Tetlock makes that I thought was important is that not all forecasts have accuracy as their purpose. Forecasts serve other purposes as well, such as entertainment, encouraging action, and bringing comfort.

The book goes through Tetlock's entry into the IARPA competition, which tried to measure the accuracy of forecasts on clearly defined questions with fixed time frames (Tetlock praises IARPA for daring to run an experiment that could cost them their jobs). The book goes on to quantify the accuracy of forecasts by explaining the concepts of a Brier score (0-2; the lower the score, the more accurate), calibration (over many forecasts, the match between a predicted probability and the percentage occurrence of an event, i.e. when a forecaster claims that something is likely to occur with 60% confidence, do those events overall actually occur 60% of the time?), and resolution (to adjust for forecasters who just consistently forecast in the midrange). According to Tetlock, his method of aggregating the forecasters' judgments (the wisdom-of-the-crowd idea: random errors cancel each other out, but real information points the same way), with heavier weights for "superforecasters" and with the aggregate probability extremized, was superior to the other entries in the competition. According to one source, the method was 30% more accurate than analysts with access to classified information (Tetlock argues that the superforecasters working in teams are superior even to prediction markets, but with intellectual candor admits that his markets might have been too illiquid and had no monetary incentive). The core of the book discusses these superforecasters. I was impressed by the factors that Tetlock tries to control for in order to find the secret to their success. Tetlock systematically eliminates superintelligence (the superforecasters on the whole are a little above average but nowhere near genius level) and luck (Tetlock notes the lack of regression to the mean in time-series data) as the driving factors behind the superforecasters' success. Instead, tantalizingly, Tetlock argues that their success is driven by learnable habits that are accessible to nearly everyone.
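The aggregation step described above (average the crowd, then push the result away from 0.5) can be sketched in a few lines. This is an illustrative odds-based form with an invented tuning exponent, not Tetlock's exact formula; the crowd numbers are made up.

```python
def extremize(probs, a=2.5):
    """Average a crowd's probabilities, then extremize by raising the odds to the power a."""
    p = sum(probs) / len(probs)      # simple crowd average (assumed 0 < p < 1)
    odds = (p / (1 - p)) ** a        # a > 1 pushes the estimate away from 0.5
    return odds / (1 + odds)         # convert odds back to a probability

crowd = [0.70, 0.65, 0.80, 0.75]     # hypothetical individual forecasts
print(round(extremize(crowd), 3))
```

The intuition is that each forecaster only sees part of the evidence, so when they independently lean the same way, the pooled estimate should be more confident than their average.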
Tetlock's superforecasters tend to be foxes, who are intellectually humble, constantly learn through feedback, use base rates, break down problems into pieces like a consulting interview problem ("Fermi-ize"), update their forecasts in a Bayesian manner (though not strictly mathematically), use fine gradations of probability (an analysis demonstrated that rounding these forecasts to the nearest 10% actually reduced their accuracy), avoid cognitive errors (like confirmation bias, by taking a different position and arguing it in earnest, and hindsight bias), have grit, typically have a growth mindset instead of a fixed mindset, do not believe that things necessarily happen for a reason but instead believe in chance, and practice a lot (Tetlock does not think there is a magic bullet). Tetlock also studied the effect of putting these superforecasters in teams, and observed that these teams generally avoided groupthink and had improved accuracy (more ground covered, diverse opinions, and respectful clarification and pushback). Tetlock also suggests creating psychological safety in order to avoid groupthink by having dedicated devil's advocates, suspending hierarchy, and bringing in fresh perspectives. The team members were also generally givers, which maximized accuracy.
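The Bayesian-style updating mentioned above is easy to illustrate in odds form: start from a base rate, then multiply the odds by a likelihood ratio for each new piece of evidence. The base rate and likelihood ratios below are invented; a real forecaster would estimate them, not compute them exactly.

```python
def update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

p = 0.30               # start from a base rate for the event
p = update(p, 4.0)     # strong supporting evidence (LR = 4)
p = update(p, 0.5)     # mildly contrary evidence (LR = 0.5)
print(round(p, 3))
```

Note that the order of updates doesn't matter here, since the likelihood ratios simply multiply together, which matches the "not strictly mathematical" habit the review describes: nudge the odds up or down as evidence arrives.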

Tetlock then responds to two criticisms, from Kahneman and Taleb. Tetlock characterizes Kahneman's objection as humans' inability to adjust for their System 1 mistakes. Tetlock claims that certain adjustments can be repeated so often that they become automatic and part of a person's System 1. Tetlock offers the superforecasters' apparent scope sensitivity as evidence that they can perhaps overcome their innate biases. Tetlock argues that many of the black swans people discuss are actually to some degree predictable (though he admits that forecasts of extremely rare events would be hard to measure accurately), and that forecasts are still necessary to appropriately allocate resources across different contingencies. Tetlock argues that a leader can be decisive when decisions are made and still have the good qualities of a forecaster. In particular, Tetlock argues for a decentralized decision-making process that sets out only goals, allowing subordinates to achieve them through the means they see fit, which he claims was the method used by the German army, the US army, and several successful businesses.

Tetlock spends some time setting out potential implications of his research. He argues that there might be a place for pundits in setting the questions for superforecasters to answer, and he hopes that these forecasts will be brought into political discourse to lend empirical rigor to economic and political debates.
  vhl219 | Jun 1, 2019 |
Review written for LibraryThing Early Reviewers.
An interesting read on the general art and science of forecasting and the mindset that goes into being a good forecaster. It is well written and an informative take on the psychology involved, with sources cited and topics thoroughly researched. The author's work with the Good Judgement Project serves as the starting point for some of the book's concepts, pairing well with other resources to fully explore the topic.
  redsauce | Jun 6, 2017 |


Author name | Author type | Work? | Status
Philip E. Tetlock | primary author | all editions | calculated
Gardner, Dan | main author | all editions | confirmed
First words
We are all forecasters.

References to this work in external resources.

English Wikipedia (1)



Philip E. Tetlock's book Superforecasting was available from LibraryThing Early Reviewers.


Rating

Average: (3.95)

Stars | Ratings
0.5 | 0
1 | 3
1.5 | 1
2 | 3
2.5 | 3
3 | 25
3.5 | 9
4 | 48
4.5 | 8
5 | 41
