Wednesday 30 July 2014

Probability and behavioral economics

PHILOSOPHY OF PROBABILITY AND CAUSALITY AND ANTECEDENTS TO BEHAVIORAL ECONOMICS
MUNISH ALAGH
I. DAVID HUME AND CAUSALITY
The following aspects of David Hume's writings are of interest:
A. From An Enquiry Concerning Human Understanding
a) The different species of philosophy: abstract reasoning and clear, simple logic.
b) Ideas as less vivid than sensations.
c) How ideas can be contradicted, but not facts, which can only be true or false.
d) How, if one truly wants to understand cause and effect, one must understand habit, custom and belief, and that this belief in necessary connexion is something which is felt after uniform experience.
Habit, custom and belief lead us to surmise about our feelings regarding the chance, cause or probability of an event, based on experience.
B. From general references on Hume and Causality
a) There are no innate ideas and all knowledge comes from experience. Hume rigorously applies this standard to causation and necessity.
b) We have a very tenuous grasp on causal efficacy.
c) Despite constant conjunction of events A and B we are left with a very weak notion of necessity.
d) The above leads to the problem of induction: we cannot pass from evidence of constant conjunction to the hypothesis of necessity or efficacy.
e) The copy principle shows that impressions of the senses give rise to ideas, which are less vivid and are products of the intellect.
f) Causal reasoning is due to experience alone.
g) Causal reductionism suggests that our only notion of causation is constant conjunction with certitude; the mental determination of the mind is thus, according to some, not emphasized, and is considered contrary to the Humean emphasis on necessity.
II. PROBABILITY IN HUME
Hume differentiated between fiction and belief by annexing feeling to belief, and a superiority of the chances of an event leads to a higher degree of belief. A definition of probability in terms of betting odds holds that odds are the ratio of the amount a person feels he should get in return for risking a certain amount in a given situation. Important here is the word "feels", for different persons may give different gambling odds based on their subjective evaluation of the same situation.
Repeated observation does not lead to truth: the sun rising every day till now makes no difference to whether it will rise tomorrow (though repeated observation does bring more accuracy, as in the weak law of large numbers).
Hume says the fact that the sun has risen every day does not mean it will rise tomorrow; Bayesian analysis addresses just this, confidence about the probability of a single event based on the experience of many.
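The Bayesian treatment of the sunrise case can be made concrete with Laplace's rule of succession, which, under an assumed uniform prior on the unknown chance, gives the probability that the next trial succeeds after a run of successes. A minimal sketch (the function and the numbers are my own illustration, not from the text):

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Posterior probability that the next trial succeeds, given
    `successes` out of `trials` so far, under a uniform prior on
    the unknown chance of success (Laplace's rule of succession)."""
    return Fraction(successes + 1, trials + 2)

# With no observations at all, the rule gives 1/2.
# After the sun has risen on 10,000 consecutive days, confidence is
# high but never reaches certainty, echoing Hume's point.
p = rule_of_succession(10_000, 10_000)  # Fraction(10001, 10002)
```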
III. PROBABILISTIC APPROACHES TO CAUSATION

Philosophers have studied the role of probabilistic notions in the analysis of causal concepts:
1. Causation is not restricted to deterministic processes.
2. Causes must in some way make their effects more likely.
3. An event of type B is positively relevant to an event of type A if and only if the conditional probability of an event of type A, given an event of type B, is greater than the unconditional probability of an event of type A.
4. Wesley Salmon suggests not abandoning the relation of causation to probability but supplementing probabilistic concepts with others, in particular with causal processes.
5. The logical probability of an event of type E, given only that events of type C causally necessitate events of type E, will be higher than the a priori logical probability of an event of type E; but this relation cannot be used as part of a probabilistic analysis of causation, since the relation itself involves that very concept.
6. The most difficult problem involves the dictum that cause-effect relations must always involve positive statistical relevance. The dictum cannot be accepted in an unqualified way, yet a cause which can probabilistically bring about a certain effect may at least raise the probability of that effect vis-a-vis some other state of affairs.
(Wesley Salmon, in Michael Tooley (ed.), 1980)
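The positive-relevance condition in point 3 can be checked mechanically on a toy joint distribution. The numbers below are illustrative assumptions of mine, not taken from Salmon:

```python
# Toy joint distribution over two event types A and B (illustrative numbers).
p_joint = {
    (True, True): 0.30, (True, False): 0.20,
    (False, True): 0.10, (False, False): 0.40,
}

def marginal(index, value):
    """P(A = value) for index 0, P(B = value) for index 1."""
    return sum(p for outcome, p in p_joint.items() if outcome[index] == value)

def conditional_a_given_b():
    """P(A | B) = P(A and B) / P(B)."""
    return p_joint[(True, True)] / marginal(1, True)

p_a = marginal(0, True)                # P(A) = 0.5
p_a_given_b = conditional_a_given_b()  # 0.30 / 0.40 = 0.75
# B is positively relevant to A: P(A | B) > P(A).
positively_relevant = p_a_given_b > p_a
```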
IV. PHILOSOPHY OF PROBABILITY

Philosophically we consider three kinds of probability: physical, epistemic and subjective. Physical probabilities are called chances, and subjective probabilities credences.
So consider the statement: Smokers have a greater chance of cancer. (Chances.)
Or the statement: New evidence makes it unlikely that the butler did it. (Epistemic.)
Or: I think it will probably rain tonight. (Credences.)
Chances are real features of the world; they hold whether or not we conceive or know them. Epistemic probabilities only measure how far evidence confirms a hypothesis. Credences measure our degrees of belief in propositions: an event or state of affairs like rain may fail to occur, but the proposition remains, it just may not be true; a proposition cannot be contradicted, only falsified.
In terms of epistemic probability, the chance that the butler did it might be low; what matters is the probability, relative to the evidence before the court, that the butler was the murderer.
In terms of credences there is an on-off view of belief, yes or no, versus degrees of belief or credences ("probably"). A coin may be two-headed or two-tailed: the credence in heads is 1/2, while the chance is 0 or 1; nor is the epistemic probability 1/2, because you have no evidence about the type of coin, yet the credence in heads remains 1/2. Credences and chances are not relative to evidence, but epistemic probability is.
(Mellor D H, 2005)

More generally we consider the following kinds of philosophical interpretations of probability:

Classical Interpretation:

"The theory of chance consists in reducing all the events of the same kind to a certain number of cases equally possible, that is to say, to such as we may be equally undecided about in regard to their existence, and in determining the number of cases favorable to the event whose probability is sought. The ratio of this number to that of all the cases possible is the measure of this probability, which is thus simply a fraction whose numerator is the number of favorable cases and whose denominator is the number of all the cases possible.
The preceding notion of probability supposes that, in increasing in the same ratio the number of favorable cases and that of all the cases possible, the probability remains the same." (Laplace, 1814)
Laplace goes on to explain the last statement with an example showing that 1/3 is the same as 2/6. (To get the real flavour of this example it is necessary to read the original Chapter 2, "Concerning Probabilities", in Laplace, 1814.)
The Classical Definition is essentially a consequence of the principle of indifference. The principle of indifference asserts that if there is no known reason for predicating of our subject one rather than another of several alternatives, then relative to such knowledge the assertions of each of these alternatives have an equal probability (Keynes, 1921). Keynes states that this rule, as it stands, leads to paradoxical and even contradictory conclusions. In Chapter 4 of the Treatise, on the principle of indifference, Keynes says that the principle is not applicable to a pair of alternatives if we know that either of them is capable of being further split up into a pair of possible but incompatible alternatives of the same form as the original pair (page 61, Keynes, 1921).
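Laplace's ratio of favorable to equally possible cases is easy to sketch in code; the die example below is my own illustration of the definition:

```python
from fractions import Fraction

def classical_probability(favorable, possible):
    """Laplace's classical definition: the ratio of favorable cases
    to all equally possible cases."""
    return Fraction(sum(1 for case in possible if case in favorable),
                    len(possible))

die = range(1, 7)  # six equally possible cases
p_even = classical_probability({2, 4, 6}, die)  # Fraction(1, 2)

# Laplace's closing remark: scaling favorable and possible cases by the
# same ratio leaves the probability unchanged, e.g. 1/3 equals 2/6.
assert Fraction(1, 3) == Fraction(2, 6)
```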

Logical Probability:

Keynes makes the case for a "rational degree of belief", that is, for probability based on a logical assessment of the degree of belief which it is rational to entertain in given conditions (page 4, Keynes, 1921). Ramsey held that epistemic probabilities simply are degrees of rational belief, rather than logical relations that merely constrain degrees of rational belief. Ramsey feels that no one estimating a degree of probability simply contemplates the two propositions supposed to be related by it; he always considers, inter alia, his own actual or hypothetical degree of belief. This, according to him, seems the only way of accounting for the fact that we can all give estimates of probability in cases taken from actual life, but are quite unable to do so in cases in which, were probability a logical relation, it would be easiest to discern (Ramsey, 1931).

Subjective Probability:

Ramsey regards probability as a measure of the degree of belief of an individual in assessing the uncertainty of a given situation. As his measure of the degree of belief Ramsey takes a psychological theory which, unlike the utilitarians', does not give pleasure a dominating position but considers anything at all we want. Ramsey then states that, in order to be consistent, these degrees of belief must obey the laws of probability (Ramsey, 1931).

Frequency Definition:

Frequency interpretations date back to Venn (1876) and represent an objective notion of probability, heedless of anyone's beliefs; this is the philosophical position that lies in the background of classical statistical inference. Relative frequencies bear an intimate relationship to probabilities: a relative frequency is the proportion of occurrences of an attribute A within a finite reference class B (Hajek, 2007). Frequency interpretations involve limiting relative frequencies and must be relativized to a reference class. Richard von Mises (1957, pp. 28-29) defines probability, and delineates the range of applicability of probability theory, thus:
1. It is possible to speak about probabilities only in reference to a properly defined collective. (Mises 1957, p. 28)
2. A collective [appropriate for the application of the theory of probability must fulfill] . . . two conditions: (i) the relative frequencies of particular attributes within the collective tend to fixed limits; (ii) these fixed limits are not affected by any place selection. That is to say, if we calculate the relative frequency of some attribute not in the original sequence, but in a partial set, selected according to some fixed rule, then we require that the relative frequency so calculated should tend to the same limit as it does in the original set. (pp. 28-29)
3. The fulfillment of condition (ii) will be described as the Principle of Randomness or the Principle of the Impossibility of a Gambling System.
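Von Mises' second condition, that the limits are unaffected by place selection, can be suggested (though of course not proved) by a finite simulation; the every-third-member rule below is my own example of a fixed selection rule:

```python
import random

random.seed(1)
# A long Bernoulli(1/2) sequence standing in for a collective.
seq = [random.random() < 0.5 for _ in range(100_000)]

def rel_freq(s):
    """Relative frequency of the attribute (True) in a sequence."""
    return sum(s) / len(s)

full_freq = rel_freq(seq)
# Place selection by a fixed rule: keep only every third member.
selected_freq = rel_freq(seq[::3])
# Both frequencies hover near the same limit (1/2): a "gambling
# system" that bets only on every third toss gains nothing.
```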

Propensity Definition:

Propensities, or chances, are not relative frequencies but purported causes of the observed stable relative frequencies. Frequentists do not define a probability for a single event, such as one toss of a coin; propensity theorists, on the other hand, use the law of large numbers to explain the behavior of long-run frequencies. For Popper (1957), a probability p of an outcome of a certain type is a propensity of a repeatable experiment to produce outcomes of that type with limiting relative frequency p. For instance, when we say that a coin has probability 1/2 of landing heads when tossed, we mean that we have a repeatable experimental set-up, the tossing set-up, that has a propensity to produce a sequence of outcomes in which the limiting relative frequency of heads is 1/2 (Hajek, 2007).

Axiomatic Theory is based on the axioms of Kolmogorov, in which probabilities are numerical values assigned to events. The numbers are non-negative, they have a maximum value of 1, and the probability that one of two mutually exclusive events occurs is the sum of their individual probabilities. These are collectively known as the Kolmogorov probability axioms, namely:
A1. Non-negativity.
A2. Normalisation.
A3. Additivity.
Kolmogorov goes on to give an infinite generalization of A3 known as countable additivity. He also defines the conditional probability of A given B, which can be illustrated with an example:
The probability that the toss of a fair die results in a 6 is 1/6, but the probability that it results in a 6, given that it results in an even number, is (1/6)/(1/2) = 1/3.
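The die calculation can be verified by counting equally likely cases directly (a small sketch of my own):

```python
from fractions import Fraction

outcomes = range(1, 7)  # a fair die: six equally likely outcomes

def prob(event):
    """Probability of `event` (a predicate) by counting equally likely cases."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

def cond_prob(event, given):
    """P(A | B) = P(A and B) / P(B)."""
    return prob(lambda o: event(o) and given(o)) / prob(given)

p_six = prob(lambda o: o == 6)                      # Fraction(1, 6)
p_six_given_even = cond_prob(lambda o: o == 6,
                             lambda o: o % 2 == 0)  # Fraction(1, 3)
```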

Important consequences of the conditional probability axiom include Bayes' Theorem, namely:
P(H/E) = [P(H)/P(E)]P(E/H) = P(H)P(E/H) / [P(H)P(E/H) + P(not H)P(E/not H)].
This theorem provides the basis for Bayesian confirmation theory, which appeals to such probabilities in its account of the evidential support that a piece of evidence E provides for a hypothesis H. P(E/H) is called the likelihood (the probability of the evidence given the hypothesis) and P(H) the prior probability of H (the probability of the hypothesis in the absence of any evidence whatsoever).
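Bayes' Theorem translates directly into code. The butler-case numbers below are hypothetical, chosen only to show the update from prior to posterior:

```python
def bayes_posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(H)P(E|H) / [P(H)P(E|H) + P(not H)P(E|not H)]."""
    numerator = prior_h * p_e_given_h
    return numerator / (numerator + (1 - prior_h) * p_e_given_not_h)

# Hypothetical numbers: prior 0.2 that the butler did it, and the new
# evidence is five times likelier if he is innocent (0.1 vs 0.5).
posterior = bayes_posterior(0.2, 0.1, 0.5)
# The evidence lowers the probability: the posterior is about 0.048.
```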
Events A and B are said to be independent if the probability of the intersection of A and B is the product of the probability of A and the probability of B. If P(A) and P(B) are greater than zero, this is equivalent to P(A/B) = P(A) and to P(B/A) = P(B); intuitively, information about the occurrence of one of the events does not alter the probability of the other. Thus, the outcome of a particular coin toss is presumably independent of the result of the next presidential election. Independence plays a central role in probability theory. For example, it underpins the various important laws of large numbers, whose content is roughly that certain well-behaved processes are very likely in the long run to yield the frequencies that would be expected on the basis of their probabilities. The axiomatic approach is well developed, but its interpretation is open to question, leaving any analyst of probability with the task of enumerating and analyzing the other theories, not just for the sake of completeness but also to throw light on what probability actually is.
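Independence and the long-run frequencies the laws of large numbers describe can be illustrated by simulating two unrelated fair-coin processes (a sketch of my own; the tolerances are loose statistical bounds, not exact values):

```python
import random

random.seed(0)
n = 200_000
# Two processes generated independently, e.g. a coin toss and an
# unrelated binary event.
a = [random.random() < 0.5 for _ in range(n)]
b = [random.random() < 0.5 for _ in range(n)]

freq_a = sum(a) / n
freq_b = sum(b) / n
freq_a_and_b = sum(x and y for x, y in zip(a, b)) / n

# For independent events, the joint frequency approaches P(A)P(B),
# and conditioning on B leaves the frequency of A essentially unchanged.
freq_a_given_b = freq_a_and_b / freq_b
```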
V. PROBABILITY AND ANTECEDENTS TO BEHAVIORAL ECONOMICS

The following two statements mean exactly the same thing:
Large samples are more precise than small samples.
Small samples yield extreme results more often than large samples do.
—The first statement has a clear ring of truth, but until the second version makes intuitive sense, you have not truly understood the first.
—Large samples are more precise than small samples. Using a sufficiently large sample is the only way to reduce this risk.
—Suppose you wish to confirm the hypothesis that the vocabulary of the average six-year-old girl is larger than the vocabulary of an average boy of the same age. The hypothesis is true in the population; girls and boys vary a great deal, however, and by the luck of the draw you could select a sample in which the difference is inconclusive, or even one in which boys actually score higher.
—Small samples yield extreme results more often than large samples do. Researchers who pick too small a sample leave themselves at the mercy of sampling luck.
A research study showed that more of the schools that had done well were small, so the Gates Foundation started funding small schools, and a causal story is easily attached: students get more individual attention in small schools. Empirically, however, larger schools if anything do better, possibly because of greater curriculum options. Unfortunately the causal analysis is wrong; the actual fact, which could have been pointed out had the Gates Foundation taken statistics seriously, is that more of the schools that had done badly were also small. Clearly, on average, small schools are not better, just more variable.
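The small-schools point can be reproduced in a simulation: draw many "schools" of different sizes from one and the same population and count how often the observed average is extreme. The sample sizes, trial counts and extremeness cutoff below are my illustrative choices:

```python
import random

random.seed(42)

def extreme_share(sample_size, trials=10_000, threshold=0.6):
    """Share of samples whose observed mean of a fair 0/1 variable
    is extreme: at least `threshold` or at most 1 - threshold."""
    extreme = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5
                   for _ in range(sample_size)) / sample_size
        if mean >= threshold or mean <= 1 - threshold:
            extreme += 1
    return extreme / trials

small = extreme_share(20)    # small "schools"
large = extreme_share(200)   # large "schools"
# small is far greater than large: drawn from the same population, the
# small samples fill both tails of any ranking; they are not better,
# just more variable.
```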
Many decisions are based on beliefs concerning the likelihood of uncertain events.
Occasionally, beliefs concerning uncertain events are expressed in numerical form, as odds or subjective probabilities. The subjective assessment of probability involves judgements based on data of limited validity, which are processed according to heuristic rules; reliance on these rules, however, leads to systematic errors, and such biases are also found in the intuitive judgement of probability. Kahneman and Tversky describe three heuristics that are employed to assess probabilities and to predict values, enumerate the biases to which these heuristics lead, and discuss the applied and theoretical implications of these observations. The discussion below is based broadly on their writings; the following heuristics, and the biases they lead to, are discussed:

  • Representativeness.
  • Availability.
  • Adjustment and Anchoring.
(i) representativeness, which is usually employed when people are asked to judge the probability that an object or event A belongs to class or process B; (ii) availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development; and (iii) adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available. These heuristics are highly economical and usually effective, but they lead to systematic and predictable errors. A better understanding of these heuristics and of the biases to which they lead could improve judgments and decisions in situations of uncertainty.

References:
1. Beauchamp, Tom L. and Rosenberg, Alexander, 1981, Hume and the Problem of Causation, New York: Oxford University Press.
2. Hume, David, 2000, An Enquiry Concerning Human Understanding, ed. Tom L. Beauchamp, Oxford: Clarendon Press.
3. Hajek, Alan, 2007, "Interpretations of Probability", Stanford Encyclopedia of Philosophy.
4. Keynes, John Maynard, 1921, A Treatise on Probability, New York: Dover Publications (2004), pages 3-9, 41-52.
5. Laplace, P. S., 1814, A Philosophical Essay on Probabilities, English edition 1951, New York: Dover Publications.
6. Mellor, D. H., 2005, Probability: A Philosophical Introduction, London and New York: Routledge, Taylor and Francis Group.
7. Popper, Karl R., 1957, "The Propensity Interpretation of the Calculus of Probability and the Quantum Theory", in S. Körner (ed.), The Colston Papers, 9: 65-70.
8. Ramsey, F. P., 1931, "Truth and Probability" (1926), Chapter 7 in Braithwaite, R. B. (ed.), The Foundations of Mathematics and Other Logical Essays, London: Kegan Paul, Trench, Trubner & Co., pp. 156-198.
9. Tversky, Amos and Kahneman, Daniel, 1971, Psychological Bulletin, 76(2), Aug 1971, 105-110.
10. Von Mises, Richard, 1957, Probability, Statistics and Truth, First Lecture, "The Definition of Probability", pages 1-29.
11. Salmon, Wesley, 1980, in Michael Tooley (ed.), Causation (Oxford Readings in Philosophy), Oxford University Press, pp. 137-153.
