Amos Nathan Tversky was an Israeli cognitive psychologist whose collaboration with Daniel Kahneman produced one of the most influential research programs in the history of the behavioral sciences. Tversky's rigorous mathematical mind and deep understanding of decision theory drove the pair's investigation of how human judgment departs from the normative standards of probability theory and Bayesian inference. His work on heuristics and biases, prospect theory, and the psychology of preferences demonstrated that human irrationality is not random but systematic and predictable, a finding with profound implications for statistics, economics, medicine, and artificial intelligence.
Life and Career
Tversky was born in Haifa, British Mandate Palestine (now Israel), in 1937. He served as a paratrooper in the Israel Defense Forces, earning the highest military decoration for bravery.
He earned his Ph.D. in psychology from the University of Michigan in 1965, studying mathematical models of choice and similarity.
In 1971 he and Kahneman published "Belief in the Law of Small Numbers," showing that even trained scientists expect small samples to be representative of the population, violating basic principles of sampling theory.
In 1974 the pair published the landmark "Judgment under Uncertainty: Heuristics and Biases" in Science.
In 1979 they published "Prospect Theory: An Analysis of Decision under Risk" in Econometrica, which became one of the most cited papers in economics.
He moved to Stanford University, where he continued research on judgment, decision-making, and the mathematical foundations of choice.
He died of metastatic melanoma in 1996, at age 59. Kahneman received the 2002 Nobel Memorial Prize in Economic Sciences for their joint work; the prize is not awarded posthumously.
The Representativeness Heuristic
Tversky and Kahneman demonstrated that when people assess the probability that an object or event belongs to a category, they often rely on resemblance rather than on prior probabilities and statistical reasoning. A person described as shy and organized is judged more likely to be a librarian than a farmer, even when farmers vastly outnumber librarians. This representativeness heuristic leads to base rate neglect, the conjunction fallacy (judging P(A and B) > P(A)), and insensitivity to sample size.
Most people judge: P(bank teller AND feminist) > P(bank teller)
Correct (by probability axioms): P(A ∩ B) ≤ P(A) always
The judgment violates a fundamental law of probability, demonstrating that "representativeness" overrides formal reasoning.
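A short calculation makes the normative answer concrete. The Python lines below are a minimal sketch with illustrative numbers invented for this example, not taken from any study: they apply Bayes' theorem to the librarian-versus-farmer case, where even a description that fits librarians far better cannot overcome a sufficiently lopsided base rate.

# Illustrative numbers only: 20 librarians for every 1000 farmers, and a
# "shy, organized" description that fits 90% of librarians but only 10% of farmers.
p_librarian = 20 / 1020          # prior probability (base rate)
p_farmer = 1000 / 1020
p_desc_given_librarian = 0.90    # likelihood of the description for each category
p_desc_given_farmer = 0.10

# Bayes' theorem: posterior is proportional to prior times likelihood.
evidence = (p_desc_given_librarian * p_librarian
            + p_desc_given_farmer * p_farmer)
posterior_librarian = p_desc_given_librarian * p_librarian / evidence

print(f"P(librarian | description) = {posterior_librarian:.2f}")
# ≈ 0.15: despite the strong resemblance, the base rate makes "farmer"
# more than five times as probable as "librarian".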
Unlike many psychologists of his era, Tversky was deeply trained in mathematics and formal decision theory. He could construct mathematically rigorous models of choice and similarity, and he understood exactly how the patterns he observed in human judgment deviated from the mathematical norms. This precision was essential to the impact of the heuristics and biases program: the deviations from rationality were not vague impressions but specific, quantifiable departures from the predictions of Bayes' theorem and expected utility theory.
Anchoring and Adjustment
Tversky and Kahneman showed that when people make numerical estimates, they are heavily influenced by an initial anchor, even when that anchor is arbitrary. Participants asked "Is the percentage of African nations in the UN more or less than 65%?" (or 10%) gave dramatically different estimates depending on the anchor, even though both groups had the same information. In Bayesian terms, the anchor functions as a pseudo-prior that is insufficiently updated by the evidence, as if people are performing Bayesian updating but with far too much weight on the anchor and too little on the data.
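One way to make the pseudo-prior reading concrete is a small normal-normal updating sketch. The Python lines below use invented numbers and a hypothetical adjustment parameter; they are not a model from the original paper. They contrast a normative precision-weighted combination of prior and evidence with an anchored estimate that adjusts only partway from the arbitrary starting value.

# All numbers are invented for illustration. "evidence" stands for the value a
# respondent's own knowledge actually supports; the anchor is the arbitrary
# number embedded in the question (10 or 65).

def precision_weighted(anchor, anchor_sd, evidence, evidence_sd):
    """Normative normal-normal combination: weight each source by its precision."""
    w = (1 / anchor_sd**2) / (1 / anchor_sd**2 + 1 / evidence_sd**2)
    return w * anchor + (1 - w) * evidence

def anchored_estimate(anchor, evidence, adjustment=0.3):
    """Anchoring-and-adjustment: start at the anchor, move only part of the way."""
    return anchor + adjustment * (evidence - anchor)

evidence = 30.0  # hypothetical value supported by the respondent's knowledge
for anchor in (10.0, 65.0):
    # An arbitrary anchor carries almost no information, so it gets a huge
    # standard deviation and therefore almost no normative weight.
    normative = precision_weighted(anchor, 1e3, evidence, 10.0)
    anchored = anchored_estimate(anchor, evidence)
    print(f"anchor={anchor:>4}: normative ≈ {normative:.1f}, anchored ≈ {anchored:.1f}")
# The normative estimates stay near 30 for both anchors; the anchored
# estimates land at 16.0 and 54.5, far apart, as observed empirically.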
Availability and Frequency Judgment
The availability heuristic leads people to estimate the frequency or probability of events based on how easily instances come to mind. Events that are dramatic, recent, or emotionally vivid are overestimated; mundane but common events are underestimated. This systematically distorts the "prior probabilities" that people bring to their informal Bayesian reasoning, leading to predictable errors in risk assessment, medical diagnosis, and policy judgment.
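A toy example can show how this plays out. The Python sketch below uses invented frequencies and a made-up vividness multiplier; it simply compares a ranking of risks by actual frequency with the ranking produced when ease of recall inflates dramatic events.

# Invented numbers purely for illustration: annual rates per 100,000 for each
# cause, and a crude "vividness" multiplier standing in for how easily
# dramatic, heavily reported instances come to mind.
actual_rate = {"heart disease": 200.0, "diabetes": 25.0,
               "homicide": 6.0, "plane crash": 0.01}
vividness = {"heart disease": 1.0, "diabetes": 1.0,
             "homicide": 30.0, "plane crash": 2000.0}

# Availability-based impression: driven by ease of recall, not true frequency.
perceived = {cause: rate * vividness[cause]
             for cause, rate in actual_rate.items()}

def ranked(d):
    return sorted(d, key=d.get, reverse=True)

print("ranking by actual frequency:   ", ranked(actual_rate))
print("ranking by perceived frequency:", ranked(perceived))
# Dramatic, heavily reported causes jump up the perceived ranking, and that
# distorted ranking is the informal "prior" carried into judgments about risk.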
Belief in the Law of Small Numbers
In one of his earliest and most insightful papers, written with Kahneman, Tversky showed that researchers themselves are subject to the representativeness heuristic. They expect small samples to mirror the population distribution far more closely than sampling theory predicts, leading them to trust small-sample results too much and to design underpowered studies. This "belief in the law of small numbers" is a direct violation of Bayesian reasoning about the precision of estimates, which requires acknowledging that small samples carry substantial uncertainty.
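A quick simulation, sketched below in Python with invented sample sizes and an error tolerance of 0.15 chosen only for illustration, shows how much more small samples vary than intuition expects.

import random

random.seed(1)
true_p = 0.5      # population proportion (e.g., heads for a fair coin)
trials = 2_000    # number of simulated studies per sample size

def misleading_fraction(n, tolerance=0.15):
    """Fraction of samples of size n whose estimated proportion misses
    the true value by more than `tolerance`."""
    misses = 0
    for _ in range(trials):
        estimate = sum(random.random() < true_p for _ in range(n)) / n
        if abs(estimate - true_p) > tolerance:
            misses += 1
    return misses / trials

for n in (10, 40, 1000):
    print(f"n={n:>4}: {misleading_fraction(n):.1%} of samples miss by more than 0.15")
# Roughly a third of n=10 samples miss by more than 0.15, a few percent of
# n=40 samples do, and n=1000 samples essentially never do. The "law of small
# numbers" is the mistaken expectation that small samples behave like large ones.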
Relationship to Bayesian Statistics
Tversky's work is the empirical foundation for understanding why formal Bayesian methods are necessary. Human intuition about probability fails in specific, predictable ways: it ignores base rates, is swayed by representativeness, anchors on irrelevant information, and confuses availability with frequency. Bayes' theorem provides the corrective, prescribing exactly how evidence should be combined with prior information. Tversky's documentation of where intuition fails thus serves as a powerful argument for the discipline that Bayesian statistics provides.
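As a minimal illustration of that corrective (the prevalence and test characteristics below are invented, not drawn from any study), the following calculation combines a low base rate with a reasonably accurate diagnostic test. Intuition driven by representativeness tends to answer with the test's accuracy; Bayes' theorem gives a much smaller posterior.

# Invented numbers: a condition with 1% prevalence, a test that is 90%
# sensitive and has a 9% false-positive rate.
prior = 0.01
sensitivity = 0.90        # P(positive | condition)
false_positive = 0.09     # P(positive | no condition)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"P(condition | positive test) = {posterior:.2f}")
# ≈ 0.09: the intuitive answer ("about 90%") mistakes the likelihood
# P(positive | condition) for the posterior P(condition | positive).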
"Whenever there is a simple error that most laypeople fall for, there is always a slightly more sophisticated version of the same problem that experts fall for." — Amos Tversky
Legacy
Tversky's premature death in 1996 deprived the world of one of its most brilliant minds. His work with Kahneman created the field of behavioral economics, influenced the design of public policy through "nudge" theory, and provided the empirical foundation for understanding human limitations in probabilistic reasoning. For Bayesian statisticians, Tversky's legacy is a reminder that the formal methods they develop serve a real need: human intuition, for all its power, is systematically unreliable when it comes to reasoning under uncertainty.