45 results for "Imprecise probability"
Search Results
2. Teaching only about Keynes’s Contributions in Chapter 1 (Plus Possibly Chapters 2, 3, 6 and 26, if Time Permits) of the A Treatise on Probability Falls Far, Far Short of Providing Students with An Informed Foundation about Keynes’s Analysis of Decision Making, Rationality and Expectations
- Author
- Michael Emmett Brady
- Subjects
- General theory, Heuristic, Economics, Foundation (evidence), Rationality, Imprecise probability, Mathematical economics, Interval valued
- Abstract
Teaching students about Keynes’s views on decision making, expectations and rationality requires that the students have had a general introduction and overview of Parts II (interval valued probability), III (finite probabilities) and V (statistics, inexact measurement, and approximation, as discussed by Keynes in Chapter Four of the General Theory on pp. 39-40 and 43-44) of Keynes’s A Treatise on Probability (1921) and a detailed coverage of chapters 15 and 17 of Part II of the A Treatise on Probability on the importance of interval valued probability. Keynes argued in Chapter 15 that decision makers in the real world mainly rely on and use interval valued probability in a heuristic manner. They are not relying on the use of ordinal probability, which is simply far, far too weak to provide a foundation for rationality. Meeks’s coverage of chapters 1-4 of Keynes’s A Treatise on Probability, plus her possible excursion into chapters 6 and 26, would fail to provide the basic foundation needed by students to understand Keynes’s views in the Treatise on Probability and General Theory. Any primary source of discussion about the links between the A Treatise on Probability and General Theory MUST incorporate the exchanges between Keynes and Hugh Townshend in 1937 and 1938, followed by the conflict between Tinbergen and Keynes in 1938-40 over Keynes’s imprecise probability (the intervals of Part II of the A Treatise on Probability) using approximation and inexact measurement versus Tinbergen’s approach based on precise probability and exact measurement. Meeks’s coverage of these sources is simply nonexistent. Most importantly, Meeks needs to avoid the 1973, Volume 8 version of the A Treatise on Probability in the CWJMK like the plague, as its editorial foreword by R. Braithwaite is completely flawed and erroneous.
- Published
- 2021
3. What Was E. Borel Referring to in His 1939 Book when He Describes '…the Beautiful Work of Mr. J. M. Keynes…' in the A Treatise on Probability? The Answer Is the Belated Recognition on Borel’s Part of the Extremely Important Part II of the A Treatise on Probability on Imprecision and Inexact Measurement
- Author
- Michael Emmett Brady
- Subjects
- Law of thought, GEORGE (programming language), Reading (process), Decision theory, Value (economics), Imprecise probability, Mathematical economics, SIPTA, Statistician
- Abstract
E. Borel, in his 1924 review of the A Treatise on Probability, did not read Part II. He skipped Part II, although he did apologize to Keynes and Russell for doing so in his review, acknowledging that this was the most important part of Keynes’s A Treatise on Probability. Borel was certainly correct. He does not use the word “beautiful” to describe Keynes’s work at any place in his 1924 review. Now Parts I, III, IV, and V of the A Treatise on Probability are well done and make some breakthroughs, as acknowledged by Edgeworth in 1922 in his two reviews. However, statistics is not the type of mathematics where one mathematician would use the highest form of compliment possible among mathematicians, “beautiful”, to describe the work of another mathematician. Borel recognized that Keynes was certainly a mathematician. By a process of elimination a la Sherlock Holmes, Borel can only be talking about Part II of the A Treatise on Probability, the part that Borel skipped in his review in 1924. Fifteen years later, Borel had to have finally figured out what Keynes was doing in Part II, with the help of Bertrand Russell and William Ernest Johnson, much like Edwin Bidwell Wilson had finally been able to grasp Keynes’s points in Part II in 1934, some eleven years after his 1923 review of the A Treatise on Probability, although Wilson argued that uncertainty, in Keynes’s sense, is really not important for a statistician. Borel must have been referring implicitly to Keynes’s truly impressive work on non additivity and interval valued probability in Part II of the A Treatise on Probability, which Keynes had based on G. Boole’s 1854 The Laws of Thought, in his footnotes in Chapter Two of his 1939 French book, the title of which is translated variously into English as Probability: Its Philosophy and Practical Value, Practical and Philosophical Value of Probabilities, or Practical Value and Philosophy of Probabilities. Unfortunately, B. de Finetti’s review of Keynes’s A Treatise on Probability took place in 1938, so he could not make use of Borel’s revised opinion of the A Treatise on Probability that appeared in 1939 in Borel’s Practical and Philosophical Value of Probabilities. The strange and incomprehensible failure of philosophers and economists especially, but, in fact, of all academics who have written on Keynes’s A Treatise on Probability, to cover Part II means that theories of imprecise probability had to be created from scratch all over again, using Koopman’s 1940 works as a foundation, instead of building on Adam Smith, George Boole and J. M. Keynes, with one, and only one, exception: the American mathematician Theodore Hailperin. A careful reader of the contributions made by Adam Smith, G. Boole, J. M. Keynes, and T. Hailperin to imprecise probability, with their emphasis on interval valued probability, non additivity and partial ordering, has no need to spend an inordinate amount of time reading the Koopman, C. A. B. Smith, Dempster, I. J. Good, H. Kyburg, I. Levi (i.e., SIPTA) literature on imprecise probability, because the latter literature is simply grossly ignorant of the former and is basically engaged in an exercise in reinventing the wheel. Borel needed to have made an explicit reference to Part II of the A Treatise on Probability in Chapter 2 of his 1939 book. If he had, then, possibly, the history of decision science would have been very different and Ramsey’s error-filled reviews in 1922 and 1926 would have been ignored.
- Published
- 2021
4. Keynes Spelt Out Exactly What 'Degree of Rational Belief' Meant in His A Treatise on Probability (1921): Correcting the Severe Errors in Courgeau (2012)
- Author
- Michael Emmett Brady
- Subjects
- Meaning (philosophy of language), Nothing, Philosophy, Reading (process), Biography, Imprecise probability, Rational belief, Degree (music), Epistemology, Logical theory
- Abstract
A very, very, very severe problem has been occurring repeatedly over the last 100 years in the social sciences and philosophy when it comes to understanding the meaning of Keynes’s logical theory of probability and his concept of rational degrees of belief. It is the failure of commentators on Keynes’s book to have actually read the A Treatise on Probability that is the ongoing problem. Instead of reading the A Treatise on Probability, practically all social scientists and philosophers evaluate Keynes’s contribution based on a reading of F. P. Ramsey’s 1922 and 1926 reviews, which are combined with the introduction to the latest edition of the A Treatise on Probability, Volume 8 of the Collected Writings of John Maynard Keynes, written by Richard B. Braithwaite, who claimed that he had read the A Treatise on Probability during the break between academic terms at Cambridge University in 1921, as reported in C. Misak’s 2020 biography of Frank Ramsey. Given that it took the French mathematician Emile Borel three years to cover Part I of Keynes’s A Treatise on Probability in preparation for his 1924 review, and that it took the American mathematician Edwin Bidwell Wilson, Paul Samuelson’s mentor, 13 years before he was able to write his review of Part II of Keynes’s A Treatise on Probability in the September 1934 issue of the Journal of the American Statistical Association, I believe that the above facts provide overwhelming evidence that Richard B. Braithwaite’s claim can’t be sustained. It is not possible, as he claimed, to read the A Treatise on Probability in less than two months’ time. This problem shows up again repeatedly on pages 278-284 of D. Courgeau’s 2012 research manual, Probability and Social Science, where evaluations of Keynes’s work are made that have nothing to do with Keynes’s book. Misak based her assessment on individuals, such as Clive Bell, who simply would have no idea what Keynes was talking about. When one combines such empty speculations with the impossible claims made by Braithwaite and Ramsey’s juvenile book reviews of Keynes’s A Treatise on Probability, the result is simply nonsensical, given that the fundamental differences between Keynes and Ramsey are between imprecise and precise views of probability, respectively.
- Published
- 2020
5. On the Explicit Connections Between Keynes’s Chapter 15 of the A Treatise on Probability (1921) and Chapter Four of the General Theory (1936): Keynes’s Method in the General Theory Is Inexact Measurement and Approximation Using Imprecise Probability from the A Treatise on Probability
- Author
- Michael Emmett Brady
- Subjects
- Set (abstract data type), Argument, Technical analysis, Liquidity preference, Interval estimation, Inverse function, Function (mathematics), Imprecise probability, Mathematical economics, Mathematics
- Abstract
Keynes, as he had done in all of his major works either directly or indirectly, from the 1913 Indian Currency and Finance through the General Theory in 1936, always used his A Treatise on Probability method and methodology of inexact measurement and approximation when performing a technical analysis. This involves Keynes’s use of interval valued probability to deal with the problem of uncertainty. Uncertainty involves non (sub-)additive probability, which introduces the immense complications of non additivity and non linearity into an analysis of decision making. Uncertainty, U, is itself a function only of the evidential weight of the argument, w, or U = g(w). It occurs if Keynes’s evidential weight of the argument, V(a/h) = w, where 0 ≤ w ≤ 1, is less than 1. A w
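Collected in one place, the relations this abstract defines read as follows (my restatement in LaTeX; the form of g is left unspecified, as in the abstract, beyond linking weight to uncertainty):

```latex
P(a/h) = \alpha, \qquad \alpha \in [\alpha_{\mathrm{lower}},\, \alpha_{\mathrm{upper}}] \subseteq [0,1]
\quad \text{(interval valued probability)}
V(a/h) = w, \qquad 0 \le w \le 1
\quad \text{(evidential weight of the argument)}
U = g(w), \qquad \text{with uncertainty present whenever } w < 1.
```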
- Published
- 2020
6. J M Keynes’s 1931 Comment, '…I Yield to Ramsey, I Think He Is Right' Refers to Ramsey’s Work on Precise Probability and Degrees of Belief, Not to Imprecise Probability and Degrees of Rational Belief: 20th and 21st Century Philosophers and Economists Simply Are Ignorant About Keynes’s Imprecise Theory of Probability Contained in Part II of the A Treatise on Probability
- Author
- Michael Emmett Brady
- Subjects
- Argument, Philosophy, Yield (finance), Value (economics), Order (group theory), Special case, Imprecise probability, Mathematical economics, Rational belief, Merge (linguistics)
- Abstract
A major error, committed by all philosophers and economists in the 20th and 21st centuries who have written on Keynes’s 1931 comment on Ramsey, “…I yield to Ramsey, I think he is right”, is their failure to recognize that Keynes’s logical theory of probability is an imprecise theory of non additive probability based on intervals and dealing with rational degrees of belief, whereas Ramsey’s theory is a precise theory of additive probability that deals with degrees of belief only. The two theories merge only in the very special case where Keynes’s weight of the argument, V(a/h) = w, 0 ≤ w ≤ 1, has a value of w = 1 and all probability preferences are linear. Nowhere in any of Ramsey’s publications during his life is there ANY recognition on his part that the two theories are diametrically opposed except in the special case where w = 1 and probability preferences are linear. It should have been obvious to Ramsey, if he had indeed read the book that he claimed he had read, that Keynes’s probabilities MUST be non additive if, as Ramsey also failed to recognize, only a partial order can be defined on the probability space.
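The collapse of the two theories into one another can be illustrated with a small sketch (my own illustration in Python, not drawn from either author; I align the degenerate-interval case, as an illustration only, with the abstract’s special case of w = 1 and linear probability preferences):

```python
# Interval-valued probabilities are non-additive: the lower bounds of an
# event and of its complement need not sum to 1. They do so only when
# every interval collapses to a single point (the precise, additive case).

def complement(interval):
    """Conjugacy: the upper bound of A fixes the lower bound of not-A."""
    lo, hi = interval
    return (1.0 - hi, 1.0 - lo)

# Imprecise case: P(A) is known only to lie in [0.3, 0.6].
A = (0.3, 0.6)
not_A = complement(A)
lower_sum = A[0] + not_A[0]    # lower bounds sum to less than 1: non-additive

# Precise case: a degenerate interval restores additivity.
B = (0.45, 0.45)
not_B = complement(B)
precise_sum = B[0] + not_B[0]  # sums to 1 (up to float rounding)
```

The lower bounds behave superadditively in the imprecise case, which is exactly why ordinary (additive) probability axioms fail to describe them.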
- Published
- 2020
7. J M Keynes and E. Borel’s Initial Skipping of Part II of the A Treatise on Probability in His 1924 Review: What Changed Borel’s Mind 15 Years Later?
- Author
- Michael Emmett Brady
- Subjects
- Lexis, Common knowledge (logic), Index (publishing), Law of thought, Reading (process), Space (mathematics), Imprecise probability, Frequency, Mathematical economics
- Abstract
Emile Borel’s review of the A Treatise on Probability in 1924 is, in my opinion, quite above average. I would give it a grade of B/B+. Borel was also an intellectually honest researcher. Borel did not pretend to have read Parts II, III, IV, and V of Keynes’s A Treatise on Probability, as has been done repeatedly by psychologists, philosophers, historians, and economists who have cited the A Treatise on Probability in their references when writing about Keynes’s logical theory of probability. Borel apologizes to Keynes (and Bertrand Russell, whom Borel knew had assisted Keynes in writing the A Treatise on Probability) for not reading Part II of Keynes’s A Treatise on Probability because he realized that, for Keynes, Part II was the most important part of the book. Borel was correct. It was the most important and intellectually powerful part of the book, because in it Keynes presented, for the second time in history, a theoretical, technically advanced approach to imprecise probability. The first attempt in history was Boole’s original achievement in The Laws of Thought in 1854. Adam Smith had presented the first non technically advanced imprecise theory of probability in 1776 in the Wealth of Nations, which was opposed by Jeremy Bentham’s precise theory of probability, used in his 1787 The Principles of Morals and Legislation. However, Borel bemoaned the fact that Maxwell, a graduate of Cambridge University just like Keynes himself, who had made contributions to physics using the limiting frequency interpretation of probability (an interpretation Borel thought Keynes had given insufficient space and emphasis in his book), had been overlooked by Keynes. This is correct with respect to Part I of the A Treatise on Probability. However, it is incorrect with respect to the totality of the A Treatise on Probability, because Keynes covered Maxwell on pp. 172-174 of Chapter 16 in Part II. Maxwell is listed in the index to the A Treatise on Probability on p. 463. In Part V, in Chapter 32, Keynes makes it clear that, if the only relevant evidence consists of statistical frequencies and there is no other relevant evidence, then the logical probability estimate is identical to the estimate made by the limiting frequency theory if the statistical frequency can be shown to be stable over time using the Lexis Q test. Another important result of this paper is that it appears that no academic has read Part II of the A Treatise on Probability since 1921. Otherwise, it should already have become common knowledge that Keynes had covered Maxwell in Chapter 16 of Part II of the A Treatise on Probability.
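The stability condition invoked here can be sketched numerically. The following is my own minimal reconstruction of a Lexis-style dispersion check (the function name, the sample counts, and the exact statistic are illustrative assumptions, not Keynes’s or Lexis’s text): a frequency is treated as stable when the variance of sub-series proportions stays close to the binomial value p(1 − p)/n, so the ratio of the two is near 1.

```python
# Lexis-style dispersion ratio: observed variance of sub-series
# proportions divided by the binomial variance p(1 - p)/n.
# Ratio near 1 -> dispersion consistent with one stable frequency;
# ratio far above 1 -> the underlying frequency is drifting.

def lexis_ratio(successes, n):
    """successes: success counts, one per sub-series of n trials each."""
    k = len(successes)
    props = [s / n for s in successes]
    p_bar = sum(props) / k
    observed_var = sum((p - p_bar) ** 2 for p in props) / (k - 1)
    binomial_var = p_bar * (1.0 - p_bar) / n
    return observed_var / binomial_var

# Counts consistent with one fixed underlying frequency of about 0.5:
stable = [44, 55, 50, 46, 57, 48]
# Counts whose underlying frequency drifts across the sub-series:
drifting = [30, 40, 50, 60, 70, 55]

r_stable = lexis_ratio(stable, 100)      # close to 1
r_drifting = lexis_ratio(drifting, 100)  # far above 1
```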
- Published
- 2020
8. Adam Smith as an Example of Samuelson’s 1952 Point That Mathematics Can Be Written Out in the English Language: On Smith’s Anti-Utilitarianism Based on the Differences Between Precise and Imprecise Probability Presented in 1776 in the Wealth of Nations
- Author
- Michael Emmett Brady
- Subjects
- Nothing, Utilitarianism, Foundation (evidence), Contrast (statistics), Legislation, Sociology, Schools of economic thought, Adversary, Positive economics, Imprecise probability, Mathematics
- Abstract
In the Wealth of Nations in 1776, Smith gave two clearly worked out mathematical examples involving a comparison-contrast of the concepts of precise probability (exact, definite, linear, numerical) and imprecise probability (inexact, indefinite, nonlinear, non numerical). The latter must incorporate uncertainty, which means there is relevant evidence that is missing or unavailable to the decision maker at the time that he must make a choice between two or more alternative options. Smith’s analysis is carefully presented on pp. 106-113 and pp. 419-423 of the Modern Library edition of the Wealth of Nations edited by Cannan with the foreword by Max Lerner. It is interesting that not a single academic economist, philosopher, historian, sociologist, psychologist, political scientist, social scientist or decision theorist in the 244 years since Smith published the Wealth of Nations in 1776 has noted this fact. The fact that Smith believed that the use of precise probability, as advocated by Jeremy Bentham, Smith’s great intellectual opponent and adversary, was possible only under very special conditions explains why Smith rejected utilitarianism as an ethical system and foundation for the science of economics: the requirement for precise probabilities and precise outcomes was, in general, not possible to satisfy, due to the fact of missing or unavailable relevant information, data, knowledge or evidence that the decision maker would need to estimate the future consequences of his present decision to act. An example of this severe misunderstanding and confusion about Smith’s approach to decision making can be seen, for just one instance, in the 2016 paper by Hollander. Thus, discussions about whether Smith was a utilitarian, partly a utilitarian (whatever that may mean), not a utilitarian or an anti-utilitarian are all beside the point once it is realized that Smith completely rejected the additivity and linearity of the probability calculus upon which Bentham based his utilitarianism and its claim that all men can calculate. Smith realized that Bentham’s belief in the ability to calculate future consequences was extremely limited in its applicability. Apparently, economists can’t read the English language that Smith used to express his mathematical analysis of decision making in the Wealth of Nations. The belief that Smith did not use mathematical analysis in the Wealth of Nations can only be a conclusion reached by economists who are themselves mathematically illiterate, inept, innumerate or severely confused about how mathematical arguments and analysis can be presented. This leads to the conclusion that the M. Friedman, G. Becker, and G. Stigler school of economics that is taught at the University of Chicago can have nothing at all to do with Adam Smith’s Wealth of Nations, because they base all of their economic analysis on precise probability, an approach identical to that expressed in the original work of Jeremy Bentham in Chapter IV of The Principles of Morals and Legislation (1787).
- Published
- 2020
9. The Claim That the Diagram on Page 39 of Keynes’s A Treatise on Probability (1921) Represents ‘Keynes’s View of Probability’ (S. Bradley, 2019) Has No Support: It Represents a Very Brief Introduction to Part II of Keynes’s A Treatise on Probability on Non Additive Probability
- Author
- Michael Emmett Brady
- Subjects
- Structure (mathematical logic), Non additivity, Philosophy, Diagram, Encyclopedia, Imprecise probability, Mathematical economics, Interval valued, Word (group theory), Logical theory
- Abstract
A major error in analyzing how Keynes operationalized his logical theory of probability in 1921 is to assume that Keynes’s theoretical structure is presented by him at the end of Chapter III of the A Treatise on Probability on pp. 38-40, which contains a diagram on page 39 that Keynes himself characterized as a “brief” illustration to be supplemented later with a “detailed” analysis in Part II. Economists who have written on Keynes’s A Treatise on Probability, such as G. Meeks, D. Moggridge, R. Skidelsky, R. O’Donnell, A. Carabelli, A. Fitzgibbons, and many, many others, have erred by failing to cover Keynes’s non additive, non linear approach in Parts II and III of the A Treatise on Probability, which uses Boole’s interval valued probability, is based on lower and upper probability bounds, and represents a detailed approach to imprecise probability. Instead, it is erroneously argued, on the basis of this diagram alone, that Keynes’s approach was an ordinal theory that could only be implemented some of the time. No philosopher had ever made this error until the publication of an article in 2019 by S. Bradley in the Stanford Encyclopedia of Philosophy dealing with the origins of imprecise probability. The contributions of the founders of the imprecise approach (Boole and Keynes would use the word ‘indeterminate’) are simply either skipped over, as in the case of Boole, or completely misrepresented, as in the case of Keynes.
- Published
- 2020
10. Keynes Rejected Kalecki’s Theory of Investment Because There Is No Major Difference Between Kalecki’s and Tinbergen’s Theories of Investment: Both Kalecki and Tinbergen Accepted Precise Theories of Probability Because They Were Frequentists
- Author
- Michael Emmett Brady
- Subjects
- Effective demand, Investment decisions, Frequentist inference, Animal spirits, Economics, Investment function, Capitalism, Imprecise probability, Investment (macroeconomics), Mathematical economics
- Abstract
It is quite impossible for Kalecki’s Theory of Effective Demand to have anything to do with Keynes’s Theory of Effective Demand because Kalecki, like Tinbergen, was a frequentist who accepted only precise, exact, additive numerical probability as the general case. For Keynes, probability was generally imprecise, inexact, non additive and non numerical (interval valued). The belief that there is some kind of connection between Kalecki’s frequentist theory of investment and Keynes’s non frequentist theory is due to the false claims made by Joan Robinson, a mathematically and statistically illiterate economist, who did not realize that Kalecki’s theory of investment is, in all major respects, just another version of Tinbergen’s theory based on frequentist, precise probability. Lopez and Mott (1999) and Mott (2009) do not seem to have any knowledge of the fact that Kalecki’s theory of investment and Tinbergen’s theory of investment are, in all major respects, identical: “Investment, and the level of employment in capitalism, are highly volatile. In part, investment volatility stemmed from psychological factors, such as ‘animal spirits,’ expectations and conventions. But it was due also to an assumption that the ‘decision period’ for capitalists (i.e. a period long enough for capitalists to take new decisions) was a very short one. Capitalists were viewed as taking decisions almost on a day-to-day basis. Kalecki never denied that psychological factors do influence investment decisions or that investment might be volatile. In fact, in several works he actually made reference to a ‘crisis of confidence’. But in his theory the weight is given entirely to ‘objective’ factors. He insisted that capitalists did not react solely, or mainly, to their expectations, but rather to the ‘hard fact’ of realized profits; and he assumed the investment function to be relatively stable in the sense that investment will not fall or rise due to events with a very short life.” (Lopez and Mott, 1999, pp. 293-294; boldface and underline added). The boldfaced and underlined sentences above are identical to the position that Tinbergen defended against Keynes in the Economic Journal in the period 1938-1940. Lopez and Mott (1999), as well as Mott (2009), appear to be completely unaware that their second paragraph would be an excellent summary of the critique of Keynes that Tinbergen made in 1940. It should not be surprising, then, that Tinbergen and Kalecki, who are both frequentists and both advocates of precise probability, have theories of investment completely different from Keynes’s theory, which is built on imprecise probability, inexact measurement, and approximation. Keynes’s approach allows evidence to be incorporated into probability assessments in the form of propositions, so that a decision maker can consider infrequent and non frequent, as well as frequent, evidence, whereas for Kalecki and Tinbergen the only evidence allowed is relative (statistical) frequencies. Lopez and Mott (1999), as well as Mott (2009), are thus completely unaware that the basic, fundamental conflict between Tinbergen and Kalecki, on the one hand, and Keynes, on the other, is over the application of probability to conduct (decision making). Kalecki and Tinbergen are both frequentists who believe in precise, exact, additive probability. Keynes is a logicist who, with George Boole, is the founder of the logical approach to theories of probability that are inexact, non additive and imprecise. Keynes’s explicit discussion of approximation and inexact measurement on pp. 39-40 and pp. 43-44 of Chapter Four of the General Theory is taken directly from Part II of the A Treatise on Probability (1921).
- Published
- 2020
11. Some Examples of 21st Century Philosophers Who Have Written About Keynes’s Logical Theory of Probability, but Have Skipped Part II of the A Treatise on Probability, Where Keynes Presented His Interval-Valued Approach to Probability
- Author
- Michael Emmett Brady
- Subjects
- Law of thought, Position (vector), Philosophy, Imprecise probability, Mathematical economics, Interval valued, Logical theory
- Abstract
Nearly one hundred years after Keynes published his A Treatise on Probability in 1921, it appears that practically no philosophers have read Part II of the A Treatise on Probability in either the 20th or 21st centuries. This simply means that no modern day philosopher is in any position to recognize that Keynes’s work on his interval valued approach to imprecise probability in 1907, 1908 and 1921 makes him the founder, along with G. Boole and his contributions in his 1854 The Laws of Thought, of the imprecise theory of probability, long before Koopman’s work in 1940 or the work of I. J. Good, C. Smith, or H. E. Kyburg in the early 1960s.
- Published
- 2020
12. After 100 Years, the Time Has Come to Acknowledge That Boole and Keynes Founded a Mathematically, Technically, and Logically Advanced Approach to Imprecise Probability
- Author
- Michael Emmett Brady
- Subjects
- Computer science, Generalization, Cover (algebra), Imprecise probability, Mathematical economics, AND gate, Boolean algebra
- Abstract
Keynes’s and Boole’s contributions to the theory of imprecise probability are not just “notions” or “suggestions” or “intuitions”. Keynes and Boole actually worked out problems in great detail in which they derive lower and upper probability bounds based on their foundation of Boolean algebra and logic. Their work is very advanced and compares very favorably to work done up to the mid 1980s, when T. Hailperin made major advances in the generalization of the Boole-Keynes approach. Unfortunately, it appears that these contributions are not known, have been ignored, or are of a technical nature that is too difficult for present day researchers to master. Only Emile Borel, in 1924, gave an answer, which was that it was too difficult for him to cover.
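The flavor of such lower and upper bounds can be conveyed with a standard example (my own sketch: these are the textbook Boole-Frechet bounds, used here only to illustrate the kind of interval result described above, not a reproduction of Boole’s or Keynes’s worked problems):

```python
# Given only the marginals P(A) and P(B), the probability of the
# conjunction and disjunction cannot be pinned to single numbers;
# only lower and upper bounds (an interval) can be derived.

def conjunction_bounds(p_a, p_b):
    """Tightest possible bounds on P(A and B) given only P(A), P(B)."""
    return (max(0.0, p_a + p_b - 1.0), min(p_a, p_b))

def disjunction_bounds(p_a, p_b):
    """Tightest possible bounds on P(A or B) given only P(A), P(B)."""
    return (max(p_a, p_b), min(1.0, p_a + p_b))

# With P(A) = 0.7 and P(B) = 0.6 and nothing known about how the two
# events overlap, P(A and B) lies in [0.3, 0.6] and P(A or B) in [0.7, 1]:
and_lo, and_hi = conjunction_bounds(0.7, 0.6)
or_lo, or_hi = disjunction_bounds(0.7, 0.6)
```

The bounds are attained by extreme joint distributions, which is why no tighter interval is derivable from the marginals alone.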
- Published
- 2020
13. Can Shiozawa’s, Morioka’s and Taniuchi’s Microfoundations for Evolutionary Economics (2019) Serve As the Microfoundations for '…Post-Keynesian Economics' (2019, p. vii)? The Answer Is Definitely Yes if Post-Keynesians Can Break Away From Joan Robinson’s Anti-Mathematical, Anti-Formalist Views
- Author
- Michael Emmett Brady
- Subjects
- Short run, Keynesian economics, Shackle, Satisficing, Post-Keynesian economics, Imprecise probability, Principle of effective demand, Microfoundations, Bounded rationality
- Abstract
Although Herbert Simon never read J M Keynes’s A Treatise on Probability (1921) or understood the necessary connections between the General Theory (1936) and the A Treatise on Probability, he independently discovered an alternate formulation that was equivalent to Keynes’s approach, but nowhere as technically advanced. Simon’s approach thus leads to the same kind of conclusions and results that Keynes provided in the A Treatise on Probability in 1921. On p.xii, Shiozawa correctly states that “Bounded rationality is the basis of all evolutions of economic entities…” and “Because of bounded rationality, any existing entities are not optimal at any time.”, it will be necessary to connect Keynes’s degree of logical probability, P(a/h) =α, where α is a degree of rational belief, which is defined on the unit interval between 0 and 1, to Simon’s work. Keynes’s interval valued probability is always bounded below and above by lower and upper probabilities. This is what Keynes meant by uncertainty, which requires the evidential weight of the argument, V (a/h)=w, also defined on the unit interval between 0 and 1, to almost always be less than 1, so that risk assessments can’t, in general, be made about future outcomes unless one is dealing with the short run or immediate or near future. As noted by Keynes in chapter 5 of the General Theory, these short run expectations are usually fulfilled most of the time, so that w is close to, near, or approximately 1, unless negatively impacted by changes in long run expectations regarding fixed investment/technical Innovation,which have low to very low w values. 
Therefore, simple three to six day moving average models can be reliably used to forecast short run production, inventory, stockout, buffer stock, and consumption activities (see chapters 4 and 5 by Morioka and his construction of “ … a dynamic and multisector model of the multiplier theory…” first theoretically developed by Keynes in the A Treatise on Probability in 1921 in chapter 26 on page 315 in footnote 1, which was then applied by Kahn and Kalecki later in the 1930’s. Taniguchi provides valuable mathematical and applied analysis of Operations Management, Production Management, and Supply Chain subjects and issues, that are used in the quantity adjustment process of the firm. This point was originally introduced by Shiozawa in an earlier chapter in the book. However, in the case of total ignorance (Shackle’s complete and total uncertainty or fundamental uncertainty, w=0, which he developed based on the ideas of Joan Robinson), Post Keynesians argue that such mathematical models ,as used by Shiozawa, Morioka, and Taniuchi, would not be applicable. This is precisely Joan Robinson’s claim, that mathematics can not be used in economics because no one ever knows anything about the future, be it near or far; hence, the mathematical equations and functions do not, and can’t, exist. However, for Keynes, this type of argument, about the impact of total ignorance on analyzing outcomes, deals only with the distant or far future and not with the near or immediate future. 
The Post Keynesian school, following Joan Robinson, G L S Shackle and Paul Davidson, has completely confused the definition of uncertainty made by Keynes in chapter 12 of the General Theory on page 148 in footnote 1, where Keynes defined uncertainty to be an inverse function of the weight of the argument, V, which must come in degrees, with a notion that there is always complete and total uncertainty about any event in the future, so that it does not matter in distinguishing the short run (near or immediate future) from the long run (distant or far future). All events can only be either certain or they must be uncertain for the Post Keynesian school, since uncertainty is the negation of certainty. It is impossible to have degrees of uncertainty or liquidity or disquietude for the Post Keynesian school, just as it is impossible to have degrees of ergodicity or non ergodicity. Post Keynesians, who argue that there are degrees of uncertainty and that uncertainty requires non ergodicity, are involved in an immense logical contradiction. It is very likely, then, that Post –Keynesians, who are unanimously loyal to the agenda established by Joan Robinson, while implicitly completely rejecting Keynes’s A Treatise on Probability and General Theory approach to uncertainty ,will also reject Shiozawa’s, Morioka’s and Taniuchi’s Microfoundations for Evolutionary Economics (2019), due to the necessary mathematical formulations contained in their book that are absolutely needed in order to develop important analysis required from operations management, production management, and supply chain management applications, which can be viewed as major advances on Keynes’s early 1930’s emphasis on the importance of maintaining sufficient buffer (safety) stocks. Buffer stocks must be maintained to avoid supply side shocks ,such as those that hit the world economy in the mid-1970’s to mid-1980’s at both the macro and micro levels. 
The second important point made by Shiozawa is that optimal results can never be calculated, but, following Simon, satisfactory results can be expected to result from a process involving study, memory, intuition, experience and expertise. The conclusion that optimization cannot be accomplished under bounded rationality was also arrived at by Keynes with respect to his definition of the degree of rational probability, α. Keynes’s worked-out examples of his conventional coefficient of weight and risk, c, in footnote 2 on page 315, which he offered as an alternative to the much more difficult interval valued approach of using upper and lower bounds that he had worked out in Parts II and III of the A Treatise on Probability, and which he called approximation and inexact measurement, lead to the same conclusion. Optimal results require exact, precise probabilities, but Keynes’s imprecise probabilities can allow a decision maker in a firm or industry to obtain a satisfactory result. Simon is implicitly relying on imprecise probability assessments by decision makers. Everything developed in this book is based on, and follows from, the foundation supplied by H. Simon to Shiozawa, Morioka, and Taniguchi. Unfortunately, H. Simon’s approach has been rejected by Post Keynesians who, instead of using Keynes’s very similar approach, use a diametrically conflicting approach to uncertainty and risk authored by Joan Robinson, G. L. S. Shackle, and P. Davidson. This book effectively develops a microeconomics consistent with quantity (output) adjustment that follows directly from Keynes’s Principle of Effective Demand. I believe that the authors can extend this to Keynes’s macroeconomic structure in the General Theory in the future.
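A hedged sketch of the conventional coefficient of weight and risk, c, may be useful here. The algebraic form below, c = p · [1/(1+q)] · [2w/(1+w)] with q = 1 − p, follows Brady’s own reconstruction of Keynes’s footnote 2 on p. 315; the exact formula is an assumption of this sketch and should be checked against the A Treatise on Probability itself.

```python
def conventional_coefficient(p, w):
    """Conventional coefficient of weight and risk, c (reconstructed form):
        c = p * (1 / (1 + q)) * (2w / (1 + w)),  with q = 1 - p,
    where p is the probability and w in [0, 1] is the evidential weight.
    NOTE: this algebraic form is an assumption; verify against TP p. 315 n. 2.
    """
    if not (0 <= p <= 1 and 0 <= w <= 1):
        raise ValueError("p and w must lie on the unit interval")
    q = 1 - p
    return p * (1 / (1 + q)) * (2 * w / (1 + w))

# With complete evidence (w = 1) and a sure outcome (p = 1), c = 1.
# Lower weight w discounts the probability toward a more cautious value,
# which is why satisfactory rather than optimal results are all one can reach.
print(conventional_coefficient(1.0, 1.0))
print(conventional_coefficient(0.5, 0.5))
```

Because c depends on w as well as p, two decision makers with the same point probability but differently complete evidence act on different effective values, which is the satisficing behavior the abstract attributes to both Keynes and Simon.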
- Published
- 2020
- Full Text
- View/download PDF
14. A Comparison of J. M. Keynes’s Logical Approach to Probability and Any ‘Objective Bayesian’ Approach to Probability Needs to Incorporate All Five Parts of Keynes’s a Treatise on Probability, Not Just Part I
- Author
-
Michael Emmett Brady
- Subjects
Argument ,Bayesian probability ,Liquidity preference ,Economics ,Upper and lower probabilities ,Imprecise probability ,Understatement ,Upper and lower bounds ,Mathematical economics ,Unit interval - Abstract
Philosophers, historians, economists, decision theorists, and psychologists have been repeating, for nearly a hundred years, a very severe error of omission that was originally made by the French mathematician Emile Borel in his 1924 review of the A Treatise on Probability (1921). Borel decided to skip Parts II through V of the A Treatise on Probability. He explicitly apologized to Keynes at the beginning of his review for his decision to skip Part II, acknowledging to Keynes, correctly, that Part II was the most important part of the A Treatise on Probability. Borel’s acknowledgment and apology are, in fact, an understatement, because without an understanding of Part II it is impossible to understand Keynes’s theory of decision making and the role played by that theory in the General Theory (1936). This all comes out in the Keynes-Townshend exchanges of 1937 and 1938, where Keynes makes it crystal clear to Townshend that his theory of liquidity preference is built on his non numerical probabilities, which a reading of Part II makes clear are interval valued probabilities, each with an upper bound and a lower bound. These probabilities are non additive. Keynes’s definition of uncertainty, on page 148 of chapter 12 in footnote 1, defines uncertainty as an inverse function of Keynes’s evidential weight of the argument, defined on the unit interval between 0 and 1. Any probability with w < 1 is an interval valued probability that is non additive. The only way to discuss Keynesian uncertainty is by non additive, interval valued probability or by decision weights like Keynes’s c coefficient. D. P. Rowbottom attempts a defense of Keynes’s position, which I view as correct, against J. Williamson’s intellectual attacks. However, Rowbottom badly handicaps himself by providing a defense of Keynes’s position that is limited to the use of Part I of the A Treatise on Probability.
Rowbottom could have presented an overwhelming counter-argument against Williamson if he had understood Keynes’s interval valued, non additive theory of imprecise probability from Part II of the A Treatise on Probability, Keynes’s finite probabilities from Part III, Keynes’s decision weight translation of imprecise probability in chapter 26 of Part IV, and Keynes’s inexact, approximation approach to statistics in Part V, which Keynes combined with his application of Chebyshev’s Inequality for establishing the lower bound of a probability estimate. Starting with the 1940 work of Koopman and continuing through the work of, for example, H. Kyburg, Jr., I. Levi, and I. J. Good, and then on to the work of, for example, B. Weatherson, D. Rowbottom, B. Hill, S. Bradley and practically all other academics who have written on Keynes and imprecise probability, the exact same error of omission has kept repeating itself over and over again for a hundred years.
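The use of Chebyshev’s Inequality to establish a lower bound on a probability estimate, mentioned above, can be sketched as follows. The inequality itself is standard; packaging it as a standalone function with these example values is merely illustrative, not Keynes’s own computation.

```python
def chebyshev_lower_bound(k):
    """Lower bound on P(|X - mu| < k*sigma) for ANY distribution
    with finite mean mu and standard deviation sigma:
        P(|X - mu| < k*sigma) >= 1 - 1/k**2.
    The bound is distribution-free, which is why it suits an
    inexact-measurement approach: no exact distribution is assumed.
    """
    if k <= 0:
        raise ValueError("k must be positive")
    return max(0.0, 1.0 - 1.0 / (k * k))

# At least 75% of probability mass lies within two standard deviations
# of the mean, and at least ~88.9% within three, whatever the distribution.
print(chebyshev_lower_bound(2))  # 0.75
print(chebyshev_lower_bound(3))
```

Note that the inequality delivers only a lower bound, never a precise probability, which fits the interval valued, approximation framework the abstract attributes to Parts II and V.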
- Published
- 2020
- Full Text
- View/download PDF
15. B. Hill’s ‘Confidence’ Approach to Decision Making Under Uncertainty Completely Overlooks the Contributions Made in J M Keynes’s Parts II -V of His a Treatise on Probability, 1921 and General Theory, 1936: Keynes’s Interval Valued Approach to Imprecise Probability and Decision Weight Approach Appeared Some 60–80 Years Before Hill Began His Research Program
- Author
-
Michael Emmett Brady
- Subjects
Decision weight ,Research program ,General theory ,Bayesian probability ,Economics ,Quality (philosophy) ,Imprecise probability ,Mathematical economics ,Interval valued ,SIPTA - Abstract
B. Hill’s work on the “Confidence approach” to decision making under uncertainty is based on the use of interval valued probability that is categorized as imprecise, in contrast to the standard Bayesian requirement that a probability assessment must be precise. This requirement is imposed by ignoring the issue of the relative strength or weakness of the supporting evidence upon which the assessment of different probabilities is based, or what Adam Smith referred to in 1776 in the Wealth of Nations as the differing quality of the supporting evidence. Hill’s work overlooks the work done by J. M. Keynes in this area some 80-100 years before his work appeared. The conclusion reached is that what Hill views as a new, novel, original, creative, and innovative approach is, in fact, when compared to Keynes’s much earlier work, only a marginal or incremental improvement. The problem is Hill’s reliance on the very poor work done on Keynes’s contributions by B. Weatherson and I. J. Good. This work is based only on chapters I-IV and a severe misreading of chapter VI of Keynes’s A Treatise on Probability, as well as overlooking Keynes’s application of his work on pp. 309-315 of the A Treatise on Probability, in chapter 26, to Keynes’s discussions of confidence in the General Theory. None of Keynes’s technical work in Parts II-V of the A Treatise on Probability or the General Theory is ever mentioned by Hill. Hill’s overlooking of Keynes’s contributions is representative of the current level of understanding of Keynes’s contributions exhibited by members of SIPTA as of the year 2020.
- Published
- 2020
- Full Text
- View/download PDF
16. J M Keynes’s Contribution to Solving the Certainty Effect Problem: How Some Philosophers Overlooked Keynes’s Conventional Coefficient of Weight and Risk, C
- Author
-
Michael Emmett Brady
- Subjects
Reflection (mathematics) ,media_common.quotation_subject ,Economics ,Certainty ,Certainty effect ,Imprecise probability ,Decision maker ,Mathematical economics ,Preference ,media_common - Abstract
J. M. Keynes solved the problems of the certainty, reflection, translation, and preference reversal effects long before these effects were specified in the post-World War II literature by psychologists. Keynes recognized in chapter 26 of the A Treatise on Probability (1921, p. 313) that all of these effects were the result of non-linear probability preferences on the part of the decision maker. An understanding of Keynes’s contribution would have helped philosophers, such as I. Levi and B. Weatherson, to deal with this problem.
- Published
- 2020
- Full Text
- View/download PDF
17. The Myth that Ramsey Destroyed and Demolished Keynes’s Logical Theory of Probability is Easily Dismissed as a Fairy Tale by Anyone who has read Parts II-V of the A Treatise on Probability (1921)
- Author
-
Michael Emmett Brady
- Subjects
Competition (economics) ,Law of thought ,Philosophy ,media_common.quotation_subject ,Intuition (Bergson) ,Ignorance ,Tournament ,Biography ,Type (model theory) ,Imprecise probability ,Mathematical economics ,media_common - Abstract
Ramsey’s many, many confusions and errors about Keynes’s logical theory of probability all stem from a) his failure to read more than just the first four chapters of Keynes’s A Treatise on Probability (1921), b) his gross ignorance of Boole’s logical theory of probability, which Keynes had built on in Parts II, III, IV, and V of the A Treatise on Probability, c) his complete and total ignorance of real world decision making in financial markets (bond, money, stock, commodity futures), government, industry and business, and d) his complete and total ignorance of the role that intuition and perception play in tournament chess competition under time constraint, a role that was taught to J. M. Keynes by his father, J. N. Keynes, who was a rated chess master who played first board for Cambridge University in the late 1870s and early 1880s. Anyone who has read Parts II, III, IV and V of the A Treatise on Probability can avoid making the type of errors that have recently shown up again in C. Misak’s (2020) biography of Ramsey, where it is asserted that Ramsey easily demolished Keynes’s logical theory of probability, and in S. Bradley’s (2019) historical foray into the beginnings of imprecise probability. Both rest on B. Weatherson’s (2002) bizarre claim that the “modern” theory of imprecise probability, which uses interval valued probability defined by lower and upper probabilities, as well as on Kyburg’s own deficient knowledge base, can be used to help explain Keynes’s strange, unfathomable and mysterious beliefs in “non numerical” probabilities.
Of course, since Keynes’s work in Parts II-V of the A Treatise on Probability is directly based on Boole’s theory of interval valued probability, which defined lower and upper probabilities in chapters 16-21 of the 1854 The Laws of Thought, what Weatherson (2002) and Bradley (2019) are doing is reinventing the wheel, not knowing that the wheel had already been invented long before them. This error can be directly traced to both Weatherson’s and Bradley’s extremely limited understanding of Keynes’s introductory, initial approach to the use of interval valued probability in chapter III of the A Treatise on Probability, on pp. 30 and 34, which are the same pages emphasized by philosophers such as H. E. Kyburg and I. J. Good.
- Published
- 2020
- Full Text
- View/download PDF
18. One Hundred Years After Keynes’s a Treatise on Probability Appeared, There Is Still Pervasive Confusion Among Academics About the Logical Theory of Probability: Keynes’s Theory of Logical Probability Is an Interval Valued Theory of Probability Based On Boole That the 18 Year Old Ramsey Never Understood in His Life
- Author
-
Michael Emmett Brady
- Subjects
Dutch book ,Argument ,Philosophy ,Completeness (logic) ,Reading (process) ,media_common.quotation_subject ,Business cycle ,Biography ,Positive economics ,Special case ,Imprecise probability ,media_common - Abstract
The very recent publication of C. Misak’s biography of F. P. Ramsey in 2020, as well as a number of reviews of that book, have resulted in the resurrection of highly misleading claims made by F. Ramsey, when he was 18 years old, that can only be characterized as silly to someone who has actually read the entire book. Individuals who have only read Part I of Keynes’s book are highly susceptible to claims made in book reviews by academics who never read the book they claimed to be reviewing. Such is the case with Frank Ramsey. The Keynes-Townshend correspondence of 1937-1938 over the connections between the A Treatise on Probability and the General Theory, as well as the Keynes-Tinbergen exchanges of 1938-1940 concerning Keynes’s critique of Tinbergen’s use of precise probability and the Normal distribution to model the business cycle, make it crystal clear that Keynes never accepted Ramsey’s critique in his lifetime and continued to use his own logical theory of imprecise probability. Ramsey’s subjective theory of probability is a theory of precise probability that cannot deal with overlapping or conflicting evidence. Of course, Keynes did believe that Ramsey had strengthened the logical foundations for PRECISE probability with his betting quotient approach combined with his Dutch book argument regarding degrees of belief. However, Keynes’s logical theory is one based on rational degrees of belief and imprecise probability. Only in a very special case, where Keynes’s degree of the completeness of the relevant evidence, w, equaled 1 on the unit interval, would Keynes have considered using Ramsey’s theory. A similar fate has befallen Keynes’s separate theory of the Evidential Weight of the Argument. Here we find a plethora of mathematical and logical errors littering the academic journals in economics and philosophy.
- Published
- 2020
- Full Text
- View/download PDF
19. How Did Clive Bell, One of Keynes’s Bloomsberry Artist Friends, Become a Recognized Expert on Keynes's a Treatise on Probability, given that He Had No Knowledge of Mathematical Logic, Statistics, Probability or Boolean Algebra?
- Author
-
Michael Emmett Brady
- Subjects
Mathematical logic ,Power (social and political) ,Action (philosophy) ,Decision theory ,Intuition (Bergson) ,Relevance (law) ,Tournament ,Psychology ,Imprecise probability ,Epistemology - Abstract
Clive Bell was an artist. There is no possible way that Clive Bell could have understood or advised Keynes about material appearing in his 1921 A Treatise on Probability, or have had any understanding of the roles that intuition and perception played in Keynes’s logical theory of probability, unless he had been a rated tournament chess player who understood the important role of intuition and perception, a role that can only be grasped by someone who has actually played Over-The-Board (OTB) tournament chess under time constraint (a clock). The belief that Clive Bell’s recollections and memories of his friendship with Keynes encompass a knowledge of Keynes’s logical theory of probability, or of Keynes’s concept of the role of intuition in decision making under time constraint, is simply nonsense. Bell has been cited in work done by C. Misak in 2020 related to her biography of Frank Ramsey, whose subjective theory of probability was regarded by Keynes as an academic exercise with very limited applicability in the real world of messy, unclear, ambiguous decision making and in tournament chess, although there is a role for Ramsey’s blunt force approach in Correspondence (Postal) Chess. The failure to understand the relevance of the time constraint in decision making is at pandemic levels in the academic fields of history, philosophy, psychology, economics, political science and decision theory. Consider the following claim made by R. Skidelsky, a historian: “Keynes’s people are thinkers, and he equips them with the tools of thought: logic. Ramsey’s people are actors and he equips them with the tools of action: calculating power.” (Skidelsky, 1992, pp. 71-72). Keynes correctly realized that such calculating power is very rarely used in an OTB tournament chess game. It primarily occurs in opening play and/or in a situation where one player has prepared in advance an innovation (a new move) that requires his opponent to find a way to avoid a forcing line or lose.
Keynes’s decision makers are thinkers who analyze the position on the board using their intuition to perceive the fundamental dynamic elements of the position. It is impossible to make any specific calculation. Ramsey’s decision makers are memorizers who are looking to calculate forced lines over the board that usually do not exist. A rare exception is the following game, where memorization and calculating power came into play: White: Towne (USCF 1750), Black: Brady (USCF 1850), August 2007 (La Palma Chess Club, Calif., USA): 1) e4 g6 2) d4 c5 3) Nf3 cxd4 4) Nxd4 Bg7 5) Nc3 Qa5 6) Bc4? Qc5! 7) Bxf7+ Kxf7 -/+ and White resigned after forty moves, although he could have resigned at move 6.
- Published
- 2020
- Full Text
- View/download PDF
20. The Fundamental Error of Rational Expectations Proponents Is Their Claim that They Have Discovered True Statistical Models: Given that No Model Can Be True, Talk of Having a 'True Model' Is An Anti-Scientific Oxymoron Given Keynesian Uncertainty
- Author
-
Michael Emmett Brady
- Subjects
Rational expectations ,Oxymoron ,Computer science ,Probability distribution ,Probability and statistics ,Statistical model ,Expected value ,All models are wrong ,Imprecise probability ,Mathematical economics - Abstract
No model can ever be true. By definition, models are only, at best, approximations to reality. Some models are better approximations than others, so one can talk about one model being better than another model. However, to talk about a model yielding true predictions means that the speaker does not understand what a model is and what it is used for. This is especially true in the area of probability and statistics. George Box said it well when he stated that ‘all models are wrong, but some are useful.’ Rational expectations advocates violate basic scientific approaches to theory construction and model use when they claim that there is a true model of how the economy operates that consumers and producers can learn from experience. There can never be any scientific support for that claim or any of the following claims, given that all models are only approximations, which can never be true:
• There is a true (correct, right, valid) probability
• There is a true (correct, right, valid) expectation
• There is a true (correct, right, valid) model
• There is a true (correct, right, valid) expected value
• Consumers and producers can learn the true (correct, right, valid) model
• There is a true (correct, right, valid) objective probability or probability distribution
• There are true (correct, right, valid) model-consistent expectations
- Published
- 2020
- Full Text
- View/download PDF
21. J M Keynes Was Never Concerned About Ramsey’s ‘Critique’ of His Logical Theory of Probability Because He Realized That Ramsey Had No Idea About What His Theory of Imprecise Probability Entailed: Ramsey Was an Advocate of Precise Probability
- Author
-
Michael Emmett Brady
- Subjects
Conceptualization ,media_common.quotation_subject ,Imprecise probability ,Mathematical economics ,Economic consequences ,Interval valued ,Reputation ,media_common ,Exposition (narrative) ,Treasury ,Logical theory - Abstract
Keynes had successfully applied his theory of logical, imprecise probability in his Indian Currency and Finance (1913), during his time in the British Treasury from 1914 to 1919, and in his The Economic Consequences of the Peace (1919). What Keynes applied was the concept of inexact measurement and approximation, which in Part II of the A Treatise on Probability is clearly shown by Keynes to be an interval valued theory of probability with upper and lower bounds. In January 1922, an 18-year-old Frank Ramsey published an extremely poor review of Keynes’s A Treatise on Probability in the Cambridge Magazine. Keynes realized immediately that Ramsey’s conceptualization of probability was an exact and precise version of additive mathematical probability with better epistemological foundations, while his own theory was about inexact, imprecise, non additive, interval valued probability. Keynes also realized that Ramsey’s theory always assumed that w, the evidential weight of the evidence, was complete, so that Ramsey’s foundation could only accommodate additivity, whereas for Keynes the foundation was always based on non-additivity, with additivity restricted to fields like physics and engineering. Keynes recognized, however, that Ramsey’s great intellectual capabilities would lead him to a tremendous academic career, although Keynes himself, given his vast understanding of actual, ambiguous, real world decision making, was not an academic. Keynes thus used his great influence and reputation to assist Ramsey’s academic career, while also clearly understanding that Ramsey’s theory of subjective probability was basically one that would be generally limited and restricted to academic and board room exposition in journals and business reports.
It is amazing that after 100 years there still are no academics who have the slightest understanding of the interval valued theory of non additive, imprecise probability erected by Keynes in Part II of the A Treatise on Probability, or of the decision weight approach of the conventional coefficient c. There is not a single mention of Ramsey and/or subjective probability in:
• Keynes’s pre-General Theory drafts and writings from 1932-35
• Keynes’s 1936 General Theory
• Keynes’s 1937 Quarterly Journal of Economics article
• Keynes’s correspondence with H. Townshend in 1937-1938
• Keynes’s exchanges with Tinbergen in 1938-1940
- Published
- 2020
- Full Text
- View/download PDF
22. Keynes’s Method in the A Treatise on Probability and the General Theory Is Inexact Measurement Involving Approximation, Imprecise Probability, and Indeterminate Probability Using Boolean Interval Valued Probability: There Is no Explicit Theory of Ordinal Measurement Developed, Used or Deployed Anywhere in Either the A Treatise on Probability or the General Theory
- Author
-
Michael Emmett Brady
- Subjects
Meaning (philosophy of language) ,Level of measurement ,Law of thought ,Short run ,Statement (logic) ,Phenomenon ,Aggregate (data warehouse) ,Imprecise probability ,Mathematical economics - Abstract
Keynes was a lifelong proponent, advocate and user of the inexact measurement techniques that he had learned from reading Boole’s The Laws of Thought and been taught by William Ernest Johnson. Keynes called these techniques inexact measurement or approximation in the A Treatise on Probability. They involved the use of interval valued probability (upper and lower probability bounds) to deal with indeterminate probabilities, and Chebyshev’s Inequality, which was used in order to provide a lower bound for imprecise probability estimates. Keynes extended his approximation techniques to encompass outcomes involving macroeconomic gross domestic product and other aggregate measures in Chapter Four of the General Theory. Chapter III of the A Treatise on Probability is an introduction to the discussion of measurement. The discussion of measurement is continued in chapter five and finished in Part II, in chapters 15-17, of the A Treatise on Probability. It is impossible to understand Keynes’s approximation approach to measurement unless the reader of the TP has understood chapters 15-17. Keynes is very clear on this: “It will not be possible to explain in detail how and in what sense a meaning can sometimes be given to the numerical measurement of probabilities until Part II is reached. But this chapter will be more complete if I indicate briefly the conclusions at which we shall arrive later.” (Keynes, 1921, p. 37) It is not possible for the Cambridge fundamentalists (Skidelsky, O’Donnell, Carabelli, Fitzgibbons, Runde) to understand Keynes’s views on measurement because they have not read or understood Part II of the A Treatise on Probability. Keynes’s unchanging lifetime views on inexact versus exact measurement and imprecise versus precise probability explain the Keynes-Tinbergen debate of 1939-40. Tinbergen’s background in physics meant that he was an advocate of the Limiting Frequency interpretation of probability.
Tinbergen was thus an advocate of precise and definite probability and exact measurement. Tinbergen brought this view with him when he started working in economics. Keynes was the exact opposite. Tinbergen was used to analyzing inanimate phenomena, like atoms, protons, electrons, particles, cells, molecules, genes, chromosomes, fair decks of cards, dice, coins, etc. Keynes was used to analyzing animate human behavior. Humans, unlike inanimate phenomena, can think and reason. Humans have memories, emotions, minds, and brains. While exact calculation and precision can be applied in many areas of the physical and life sciences, this is doubtful in most social sciences, liberal arts, behavioral sciences and especially in most areas of economics, finance and business. The exception would be studies of consumer consumption spending and business inventory demand, which are highly stable in the short run. Keynes was not an ordinalist. Nowhere in the TP or GT does Keynes develop, apply, or advocate an ordinal theory of probability. The Keynesian fundamentalists have misinterpreted pp. 38-40 of the TP by ignoring Keynes’s clear statement on p. 37 of the TP that his detailed analysis of measurement would take place in Part II, while only a brief introduction would take place on pp. 38-40 of chapter III.
- Published
- 2019
- Full Text
- View/download PDF
23. Keynes’s Major Result From Part II of the A Treatise on Probability Was That, Given That Numerical Probabilities Are Additive, Then Non Numerical Probabilities Must Be Non Additive: Non Additivity Is a Sufficient Condition for Some Degree of Uncertainty to Exist
- Author
-
Michael Emmett Brady
- Subjects
Property (philosophy) ,Law of thought ,Additive function ,Comparability ,Post-Keynesian economics ,Imprecise probability ,Mathematical economics ,Upper and lower bounds ,Infimum and supremum ,Mathematics - Abstract
Keynes’s major accomplishment in Part II of the A Treatise on Probability (1921), which he had also accomplished in the 1907 and 1908 Fellowship Dissertation versions, was to show that the addition property of the purely mathematical calculus of probability could only be operational in circumstances where the evidential weight of the argument, V(a/h) = w, with 0 ≤ w ≤ 1, was equal to 1, so that the decision maker had a complete data/evidence set: all relevant information or evidence had to be known before any decision had to be made. Thus, all numerical probabilities are additive, so that they sum to 1. However, there were also non numerical probabilities, which were non additive, because of the existence of missing relevant data or evidence, and which would not sum to 1. By far the most important case was sub-additive, as opposed to super-additive, probabilities, which sum to less than 1. Note the obvious fact that ordinal probabilities cannot be non additive, since they cannot sum to less than 1; indeed, they cannot be added or multiplied at all. Keynes then showed that non numerical probabilities, which were non additive because they summed to less than 1, had to be interval valued probabilities with a lower bound (greatest lower bound) and an upper bound (least upper bound). This, of course, followed directly from pp. 265-268 of Boole’s 1854 The Laws of Thought. The adherents of the Heterodox, Post Keynesian, and Institutionalist schools of economics (Skidelsky, O’Donnell, Carabelli, Dow, Lawson, and Runde, for example; see the references) argue that Keynes’s non numerical probabilities were ordinal probabilities. However, this conclusion is an oxymoron, because ordinal probability can have absolutely nothing to do with questions about additivity (probabilities summing to 1) or non additivity (probabilities summing to less than 1), since, by definition, ordinal probabilities cannot be summed, added or multiplied.
Numerical and ordinal probability cannot deal with non-measurability, non-comparability or incommensurability. Such problems require nonlinear and non additive approaches to measurement. Only Keynes’s and Boole’s imprecise approaches to probability, using interval valued probability, can do this. It should be obvious to a mathematically trained reader that Boole and Keynes are the founders of the imprecise probability approach to decision making. Unfortunately, no economists or philosophers covered Part II of the A Treatise on Probability in the 20th or 21st centuries. Readers, like the French mathematician Emile Borel, skipped Part II of the A Treatise on Probability. This is why the obviously false claim that Keynes’s non numerical probabilities had to be ordinal probabilities, which directly contradicts Keynes’s position that they are non additive, continues to be generally accepted nearly one hundred years after the publication of Keynes’s A Treatise on Probability.
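The abstract’s central claim, that non numerical probabilities are interval valued with lower and upper bounds and can be sub-additive, can be sketched minimally as follows. The Python representation is mine, not Boole’s or Keynes’s notation, and the numbers are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    lower: float  # greatest lower bound of the probability
    upper: float  # least upper bound of the probability

    def __post_init__(self):
        if not (0 <= self.lower <= self.upper <= 1):
            raise ValueError("need 0 <= lower <= upper <= 1")

def is_subadditive(lowers):
    """Lower probabilities over an exhaustive, mutually exclusive
    partition may sum to less than 1 when evidence is incomplete (w < 1);
    that shortfall is the sub-additivity described in the abstract."""
    return sum(lowers) < 1

# Two exhaustive, mutually exclusive outcomes under incomplete evidence:
rain = IntervalProb(0.3, 0.6)
no_rain = IntervalProb(0.4, 0.7)

# Lower probabilities sum to 0.7 < 1: sub-additive, so some degree of
# uncertainty exists, while precise additive probabilities would sum to 1.
print(is_subadditive([rain.lower, no_rain.lower]))  # True
```

When the evidence is complete, each interval collapses to a point (lower = upper) and the sum returns to 1, recovering the ordinary additive calculus as the special case.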
- Published
- 2019
- Full Text
- View/download PDF
24. J. M. Keynes (Inexact Measurement, Approximation, Non- Linearity, Non-Additivity, Interdependence, Imprecision) Versus J. Tinbergen (Exact Measurement, Linearity, Additivity, Independence, Precision) on Probability in 1939–1940: There Was No Middle Ground Between Them
- Author
-
Michael Emmett Brady
- Subjects
Independent and identically distributed random variables ,Econometric model ,Upper and lower probabilities ,Probability distribution ,Special case ,Imprecise probability ,Mathematical economics ,Principle of indifference ,Independence (probability theory) - Abstract
The Keynes-Tinbergen debates of 1939-40 pit two advocates of completely different, and diametrically opposed, methods of analysis against each other. J. M. Keynes was a lifelong practitioner of inexact measurement, approximation, and non additive, nonlinear, indeterminate and imprecise probability, which emphasized the application of Boole’s interval valued approach to logical probability using upper and lower bounds or limits. Exact measurement was a special case, involving problems that satisfied Keynes’s Principle of Indifference (combinatorics, permutations) or frequency data that met the Lexis-Q test for stability over the long run. Probabilities for Keynes were non additive (sub- or super-additive), incorporated nonlinear probability preferences, and were not independent unless the data was based on inanimate phenomena. J. Tinbergen was a lifelong practitioner of exact measurement and definite, precise, determinate probability, which emphasized the application of the limiting frequency interpretation of probability. Exact measurement was the general case. Probabilities for Tinbergen were definite, exact, determinate, linear, additive and independent. Empirical evidence allowed one to specify a correct probability distribution of all possible outcomes before any choice needed to be made. Tinbergen’s physics background determined his approach to probability. There was no room for compromise between Keynes’s inexact approach to measurement and Tinbergen’s exact approach except in areas of study where analysis showed fairly stable behavior by decision makers that changed very slowly over time. These fields would be consumer consumption expenditure studies and business inventory studies to meet that consumer demand.
Tinbergen’s concentration on investment spending on durable capital goods in his study of the business cycle was precisely where Keynes would argue that precision was not possible, given the constant interactions of changing expectations of future profits with difficult-to-judge technological advance, innovation, change, and obsolescence problems. In August 1938, Keynes had sent Harrod his linear, first order difference equation model analyzing the interaction of the multiplier with the accelerator (relation) dynamically over time, with respect to Harrod’s views on the relationship between full employment and the capital stock in his model of intertemporal economic growth. Keynes reached the conclusion that it was impossible to distinguish full employment equilibria from unemployment equilibria, given a set of possible multiple equilibria generated by the positive feedback engendered by the interplay of the multiplier-accelerator interactions. From 1939 to 1941, Samuelson reached the same conclusion using his own linear difference equation analysis. All that it was possible to conclude from the analysis was that the equilibria, full employment or unemployment, were either dynamically stable or unstable. No type of Classical or neoclassical Benthamite or Walrasian pendulum model could be applied. Microeconomic theory, based on maximum-minimum optimization problems, could not be applied, since this approach postulated the existence only of negative feedback. Samuelson essentially gave up on econometric modeling after 1945, since he realized that the econometric models were based on the Bentham-Say-Ricardo-Walras-Jevons-Fisher-Wicksell-Frisch “rocking horse (pendulum)” models that he had aptly described as violin-string plucking models. An examination of Tinbergen’s 1929 dissertation (see Buitenhuis, 2015, pp.
46-51) shows that, in the economics part of the dissertation, Tinbergen was doing microeconomic optimization problems similar to the type of problems done by Paul Samuelson in his 1941 dissertation and 1947 Foundations of Economic Analysis based on negative feedback, but without the macro analysis provided by Samuelson involving the positive dynamic feedback effects resulting from the interactions of the multiplier and accelerator that can’t be modeled as optimization problems at the macroeconomic level. That this was Tinbergen’s Exact approach, taken from the pendulum models in mathematical physics, is confirmed in Tinbergen (1929), Tinbergen (1938), Tinbergen (1970), Cornelisse and Van Dijk (2006), Squartini and Garlaschelli (2014), and Buitenhuis (2015). Tinbergen’s work is basically Benthamite and Walsarian pendulum model building, combined with the Benthamite assertion that the whole (Macro) can never be any more than the sum of the individual parts(micro). Therefore, a macroeconomy composed of many utility/profit maximizing consumer-producers ,who are all identical to each other and independent ,is modeled as if they were a macro ensemble of gas particles, each of which is independent and identically distributed ,where each gas particle is randomly hitting other particles and exchanging electrons, so that, in the long run, the exchanges of electrons(exchanges of different bundles of goods and services) between all of the identical particles in the macro ensemble cancels out, so that a normal (lognormal) probability distribution describes the interactions. Tinbergen and Klein wanted to approach economics as if it were physics. Unfortunately, it is not the case that an argument of analogy holds between a particle in physics and an individual in economics. Exact, precise, linear, additive, definite, determinate probabilities are not available, in general, to provide an economic analysis except in the area of consumption and inventories. 
This was pointed out in great detail by Adam Smith in 1776. Economic analysis can, however, still involve inexact, imprecise, indeterminate, nonlinear, nonadditive, indefinite quantitative analysis. Keynes and Tinbergen (and Klein) came from different intellectual backgrounds. Keynes’s method is built on Boole’s approach using lower and upper probabilities; this approach runs throughout the TP. Tinbergen and Klein used the precision of physics as their model of economic analysis.
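The stability point made above can be sketched numerically. The following is a hypothetical illustration in the spirit of Samuelson's linear difference equation analysis of the multiplier-accelerator interaction (a common textbook reduced form, not Keynes's actual 1938 model sent to Harrod); the parameter values are invented for illustration only.

```python
# Illustrative sketch only: a Samuelson-style linear second-order difference
# equation for the multiplier-accelerator interaction, showing how the same
# structure yields dynamically stable or unstable paths depending on
# parameters. All parameter values here are hypothetical.

def simulate(c, v, G=100.0, periods=40):
    """Y_t = G + c*Y_{t-1} + v*(Y_{t-1} - Y_{t-2}); returns (path, equilibrium)."""
    y_star = G / (1.0 - c)                 # static multiplier equilibrium
    path = [y_star * 1.05, y_star * 1.05]  # start 5% above equilibrium
    for _ in range(periods):
        path.append(G + c * path[-1] + v * (path[-1] - path[-2]))
    return path, y_star

# With complex characteristic roots, the modulus is sqrt(v):
# v < 1 gives damped oscillation, v > 1 gives explosive oscillation.
stable, eq1 = simulate(c=0.6, v=0.3)    # modulus ~0.55: converges
unstable, eq2 = simulate(c=0.8, v=1.5)  # modulus ~1.22: diverges

print(abs(stable[-1] - eq1) < abs(stable[0] - eq1))      # True
print(abs(unstable[-1] - eq2) > abs(unstable[0] - eq2))  # True
```

Either way, the model only classifies the equilibrium as stable or unstable; it cannot, by itself, identify whether that equilibrium is a full employment or an unemployment position, which is the point the abstract attributes to Keynes and Samuelson.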
- Published
- 2019
- Full Text
- View/download PDF
25. Keynes, and Only J. M. Keynes, Was Responsible for the Logical and Mathematical Development of the Multiplier Concept in 1921 (1908) in His A Treatise on Probability (1921) That He Later Used in the General Theory (1936)
- Author
-
Michael Emmett Brady
- Subjects
Mathematical theory ,Series (mathematics) ,Geometric series ,Logical conjunction ,media_common.quotation_subject ,Economics ,Multiplier (economics) ,Limit (mathematics) ,Imprecise probability ,Infinity ,Mathematical economics ,media_common - Abstract
J M Keynes had already developed the theory of the multiplier concept mathematically, logically, and technically in his A Treatise on Probability (1921). The same analysis can be found in his second Cambridge Fellowship dissertation of 1908. Samuelson explicitly covered this material, Keynes’s risk (R) formula, presented by Keynes on page 315 of the A Treatise on Probability, in his 1977 article in the Journal of Economic Literature. However, Samuelson overlooked footnote 1 on page 315 of the A Treatise on Probability, in which Keynes applies an explicit multiplier analysis to a declining, infinite geometric series, because Keynes left out the intermediate steps of taking the limit of the series as n, the number of terms in the series, approaches infinity. Keynes simply gave the final answer one obtains after taking the limit. Samuelson also overlooked Keynes’s generalized risk analysis on page 353 of the A Treatise on Probability, which extended Keynes’s risk analysis by using Chebyshev’s Inequality to derive a lower bound. This would be an imprecise probability that would become more accurate as more observations were obtained, leading eventually to a precise estimate of probability from the normal distribution if the decision maker could afford to delay action for the period of time needed. R. Kent’s 2007 article in the History of Political Economy leaves completely unresolved the issue of where Richard Kahn got the idea for the multiplier. Kahn stated in 1936, in a note on a paper of Neisser’s that appeared in the Review of Economic Statistics, that “…my own ideas were largely derived from Mr. Keynes.” (Kahn, 1936, p. 144). Kahn’s contribution originated in private conversations with Keynes, in which Keynes showed Kahn his chapter 26 analysis, contained in footnote 1 on page 315 of the A Treatise on Probability.
Keynes, and no one else in history, was the person who originated the mathematical theory of the multiplier, which Keynes in the General Theory called the logical theory of the multiplier. Keynes’s discussion of the logical theory of the multiplier on pp. 122-123 of the General Theory is simply a literary description of the mathematical analysis presented by Keynes on page 315 of the A Treatise on Probability. Kent’s belief that Keynes had presented a multiplier analysis in 1929 is correct. The issue of whether Keynes made an arithmetic error in adding up the terms of the series is completely irrelevant to the main issue, which is: who is the person who created the theory of the multiplier?
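The geometric series arithmetic behind the multiplier described above can be sketched in a few lines. This is a minimal illustration of the limit Keynes is said to have stated without intermediate steps; the propensity value k = 0.75 is hypothetical and does not reproduce the specific numbers of Keynes's footnote.

```python
# A hedged sketch of the multiplier-as-geometric-series arithmetic: the
# partial sums of 1 + k + k^2 + ... approach the limit 1/(1 - k).
# k = 0.75 is a hypothetical marginal propensity, chosen for illustration.

def multiplier_partial_sum(k, n):
    """Total spending after n rounds of re-spending a unit injection."""
    return sum(k**i for i in range(n + 1))

k = 0.75
limit = 1.0 / (1.0 - k)  # the closed-form multiplier: 4.0 here

print(round(multiplier_partial_sum(k, 5), 3))   # 3.288, partial sum rising...
print(round(multiplier_partial_sum(k, 50), 3))  # 4.0, ...at the limit
```

Skipping straight from the series to `1/(1 - k)` is exactly the omission of the limit-taking step that, per the abstract, led Samuelson to overlook the footnote.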
- Published
- 2018
- Full Text
- View/download PDF
26. Greenspan's Synthesis of the ‘Keynes-Knight’ Approach and the Ramsey-De Finetti-Savage Approach in Decision Making: A Continuum Exists Between Situations of No Knowledge and Complete Knowledge
- Author
-
Michael Emmett Brady
- Subjects
Philosophical analysis ,Bayesian probability ,Value (economics) ,Economics ,Knight ,Imprecise probability ,Mathematical economics ,Outcome (probability) ,Term (time) ,Simple (philosophy) - Abstract
The differences between Knight’s approach in Risk, Uncertainty and Profit (1921) and Keynes’s logical theory of probability approach in the A Treatise on Probability (1921), on the one hand, and the Ramsey-Savage-de Finetti subjective or Bayesian approach, on the other, turn on the question of whether it is always possible to estimate a probability with a precise, exact, numerical value. Keynes and Knight argued that it is not always possible to provide a precise numerical answer to the question, “What is the probability of this outcome relative to this evidence?”, while Ramsey, de Finetti, and Savage argued that it was always possible. de Finetti and Savage qualified their views on numerical probability only for the initial conditions at the beginning of a probability assessment. Due to a lack of sufficient evidence in the beginning stage of a probability assessment, an imprecise estimate of probability could result. However, as time went on, enough additional evidence would accumulate to always lead to a precise probability estimate. Keynes and Knight argued, by contrast, that much important evidence would be missing, vague, or ambiguous (Daniel Ellsberg’s term), so that there would be many cases of what Keynes called indeterminate probability estimates, where additional evidence would not be sufficient to lead to a precise probability by the time a decision had to be made. It is impossible to postpone many financial, economic, and business decisions until enough relevant information has accumulated for an imprecise probability to converge to a precise probability at some point in the future. Thus, it is the relative strength of the evidence that determines whether a numerically precise probability can be assigned. The mathematical laws of the probability calculus assume that the available evidence is relevant and complete before a probability calculation takes place.
This is a situation of strong evidence. On the other hand, if evidence is missing or not available, one is dealing with a situation of weak evidence. Greenspan cuts through the logical, epistemological, and philosophical analysis made by Keynes and Knight to arrive at a simple and direct definition of uncertainty that entails the work of Keynes and Knight.
- Published
- 2018
- Full Text
- View/download PDF
27. The Unbridgeable Gulf Forever Separating A. Smith and J. M. Keynes from Bentham, Classical Economists, Neoclassical Economists, and Modern Economists (Friedman, Becker, Stigler, Lucas, Sargent, Wallace, Muth, Kydland, Prescott, etc.): The Formal and Mathematical Concepts of Uncertainty, Weight of the Evidence, and Confidence
- Author
-
Michael Emmett Brady
- Subjects
Limiting case (philosophy of science) ,Decision theory ,Liquidity preference ,Bayesian probability ,Economics ,Probability distribution ,Upper and lower probabilities ,Classical economics ,Subjective expected utility ,Neoclassical economics ,Imprecise probability ,Mathematical economics - Abstract
It is logically impossible for Classical economists, Neoclassical economists, and “Modern” economists (Irving Fisher, Friedman, Becker, Stigler, Lucas, Sargent, Wallace, Muth, Kydland, Prescott, etc.) to grasp Smithian and Keynesian economics, because both Smith and Keynes explicitly incorporated a specific term, the weight of the evidence, into their decision theories to deal with uncertainty, in addition to an additive concept of probability. F. Knight’s distinction between risk and uncertainty is flawed because, although he tried, he failed to explicitly integrate a term like the “weight of the evidence” into his theory of decision making in order to differentiate uncertainty from risk mathematically and logically. Knight needed an understanding of non-additivity. Integrating such a term into decision theory automatically creates a violation of the purely mathematical laws of the probability calculus, because the probabilities can no longer be precise (additive) but must be modeled mathematically as interval valued probabilities (non-additive). A concept of the weight of the evidence thus violates the additivity, complementarity, and linearity requirements needed for the application of either the frequency approach to probability, which uses objective, precise probabilities, or the subjective, Bayesian approach to probability of Ramsey, de Finetti, Savage, and M Friedman, upon which Subjective Expected Utility is built and which requires precise, subjective probabilities. George Boole provided the first explicit logical and mathematical foundation for the concept of upper and lower probabilities, or interval valued probability, in 1854 with the publication of The Laws of Thought, in which he put forth the first explicit logical theory of probability in history. Keynes built on Boole’s foundation and based his logical theory of probability and decision theory on interval valued probability.
However, this non-additive and nonlinear approach directly conflicts with the linearity and additivity constructs that underlie the subjective and frequency approaches to probability of I. Fisher, Friedman, Becker, Stigler, Lucas, Sargent, Wallace, Muth, Kydland, Prescott, etc. It is not possible for subjectivists to deal with weight, because doing so would automatically make the subjective theory of probability and SEU theory very special theories that could never be more than a limiting case of the work of Adam Smith, George Boole, and J M Keynes. Thus, the intellectual conflict over the roles of the weight of the evidence and uncertainty variables in decision theory, which completely divided Smith from Cantillon and Bentham in the eighteenth century, also completely divided Keynes from the Classical and Neoclassical economists of the twentieth century. Practically all of the “What did Keynes mean in the General Theory?” literature can be traced back to economists’ gross ignorance of the Keynesian concept of the weight of the evidence and its role in the General Theory. Once Keynes introduced the weight of the evidence variable into the General Theory on p. 148, the probabilities (expectations) automatically became interval valued. Non-additivity and non-linearity become the general case, while Classical and Neoclassical theories, based on Benthamite utilitarianism’s use of additive and linear precise probabilities, become a very special case of Keynes’s theory. Keynes’s concept of liquidity preference is directly founded on the logical and mathematical analysis of the weight of the evidence that he incorporated into his conventional coefficient of weight and risk, c, in chapter 26 of the A Treatise on Probability.
The conventional coefficient, which substitutes a mathematically more tractable analysis for the much more difficult Boolean interval valued approach that Keynes adapted, with revisions, for use in Part II of the A Treatise on Probability, allows a reader to see the explicit role that weight plays in Keynes’s theory. Part II of the A Treatise on Probability underlies Part III, and Part III underlies Part V. Liquidity preference automatically brings in interval valued probability if w, the weight of the evidence, is less than 1. Keynes’s analysis of imprecise probability, which occurs in chapter 29 of the A Treatise on Probability, is based, as acknowledged by Edgeworth in 1922, on using Chebyshev’s Inequality to form a lower bound, with the upper bound being given by probability estimates from other known probability distributions, such as the normal distribution. As more evidence is accumulated, improved estimates will start to approach the upper bound and become precise. Keynes’s analysis of indeterminate probabilities takes place in chapters 3, 5, 10, 15, 16, 17, 20, 22, and 26 of the A Treatise on Probability. “Modern” economists (Irving Fisher, Friedman, Becker, Stigler, Lucas, Sargent, Wallace, Muth, Kydland, Prescott, etc.) can’t deal with Keynes’s general theory of decision making, which underpins the General Theory, because all of their theoretical approaches must use precise probabilities that are additive.
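The conventional coefficient of weight and risk mentioned above can be sketched concretely. The formula below is the form commonly reconstructed in the secondary literature from p. 315 of the A Treatise on Probability, c = p · (1/(1+q)) · (2w/(1+w)) with q = 1 - p; the numerical inputs are hypothetical, chosen only to show how low weight discounts a given probability.

```python
# Sketch of Keynes's conventional coefficient of weight and risk, c, from
# chapter 26 of the TP, in the form usually quoted from p. 315:
#     c = p * (1/(1+q)) * (2w/(1+w)),  with q = 1 - p.
# p and w values below are hypothetical, for illustration only.

def conventional_coefficient(p, w):
    """Discount probability p both for risk (via q = 1-p) and for low weight w."""
    q = 1.0 - p
    return p * (1.0 / (1.0 + q)) * (2.0 * w / (1.0 + w))

p = 0.6
print(conventional_coefficient(p, w=1.0))  # complete evidence: reduces to p/(1+q)
print(conventional_coefficient(p, w=0.4))  # low weight discounts further
```

The factor 2w/(1+w) rises monotonically from 0 to 1 as w goes from 0 to 1, so any w < 1 strictly lowers c below the pure risk-discounted value p/(1+q), which is the mechanism the abstract describes linking weight to liquidity preference.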
- Published
- 2017
- Full Text
- View/download PDF
28. An Examination of Bruno De Finetti's Understanding of J M Keynes's A Treatise on Probability: Close but No Cigar
- Author
-
Michael Emmett Brady
- Subjects
Dutch book ,Bayesian probability ,Shackle ,Economics ,Subjective expected utility ,Coherence (philosophical gambling strategy) ,Post-Keynesian economics ,Imprecise probability ,Mathematical economics ,Probability interpretations - Abstract
Bruno de Finetti came close to understanding, while still disagreeing with, what Keynes was doing in the A Treatise on Probability, which Keynes had started working on in 1904 and continued developing through August 1914. Keynes then put the work aside until he revised it for publication in 1920. It was published in 1921. Bruno de Finetti thought that, perhaps, Keynes’s views on “non numerical” probability might be supported if the decision maker had to resort to interval valued probabilities in the initial or beginning stages of the analysis of a decision problem. He gave an example where this is the case, given conflicting evidence. His example is an interval valued probability. However, it is not an example involving missing, unavailable, relevant evidence, which is what Keynes meant when discussing the making of decisions under uncertainty. Unfortunately, de Finetti, like Emile Borel, F Y Edgeworth, and E B Wilson before him, skipped the analysis in Part II of the A Treatise on Probability that would have allowed him to conclude that Keynes’s views on “non numerical probabilities” were not mysterious at all, but were interval valued probabilities based on the non (sub) additivity of probability applications in the real world. This means that Keynes was arguing that non-additivity, not additivity, is the general case in the real world of decision making. There is not a single reference in the corpus of de Finetti’s lifetime works that connects Keynes’s non numerical probabilities to both Boole’s and Keynes’s interval valued probability, as done explicitly by Keynes in chapters 3, 15, 16, 17, 20, 22, 26, 29, and 30 of the A Treatise on Probability. On the other hand, de Finetti always made it clear that he believed Keynes’s “non numerical probabilities” could never be the general case in decision making.
The general case had to end up always leading to a precise, exact probability estimate and never a non numerical probability, either imprecise or logically indeterminate, even if it did turn out to be the case that what Keynes meant by non numerical probability was interval valued probability. For de Finetti, all intermediate and final probability estimates must be point estimates. Thus, what de Finetti meant by uncertainty was the existence of measurement error, resulting from the concept of inner and outer measures in measure theory, when dealing with initial conditions that would likely involve a paucity of relevant evidence. This is completely different from what Keynes meant by the word uncertainty. For Keynes, uncertainty meant that there was missing or unavailable relevant evidence or knowledge. However, Keynes clearly restricted the existence of ignorance, where there is no relevant evidence, to the distant future, where technological change, innovation, and advance threatened to make current technologies obsolete overnight. This threat of obsolescence is limited to long run investment in physical durable capital goods and financial products. Keynes completely and totally rejected the Post Keynesian claims of G L S Shackle, Joan and Austin Robinson, and Paul Davidson that there was complete and total uncertainty about the future, so that the weight of the evidence, w, had to equal 0. Note that de Finetti also ignored Frank Ramsey’s two highly critical, but error-filled, reviews of Keynes and the A Treatise on Probability. Of course, de Finetti supported Ramsey’s additive and linear approach to the use of numerical probability using betting quotients, because it established the coherence and consistency of the bettor’s beliefs with the purely mathematical laws of the probability calculus and prevented a Dutch book from being made, theoretically, against the bettor. Therefore, rationality had only one meaning for de Finetti.
It meant consistency with the purely mathematical laws of the probability calculus, while for Keynes rationality meant basing one’s estimate of a probability on all of the available, relevant evidence or knowledge. Note that the usual definition of rationality used by economists is Maximizing Utility or Maximizing Subjective Expected Utility (SEU theory). Finally, de Finetti ultimately changed his mind as he grew older. Bruno de Finetti came to regard all logical theories of probability as being partly defective.
- Published
- 2017
- Full Text
- View/download PDF
29. The Formal, Normative Theories of Savage, De Finetti, and Savage and De Finetti Require Additivity. They Never Introduced Sub or Non Additivity (Interval Valued Probability, Imprecise Probability, Indeterminate Probability) into Their Formal, Normative Theories of Subjective Probability or of Maximizing Subjective Expected Utility
- Author
-
Michael Emmett Brady
- Subjects
Decision theory ,Statistics ,Bayesian probability ,Upper and lower probabilities ,Subjective expected utility ,Imprecise probability ,Mathematical economics ,Outcome (probability) ,Axiom ,Mathematics ,Probability measure - Abstract
George Boole was the first scholar to present a rigorous, mathematically derived concept of upper and lower probability, with a large number of applications, in 1854 in The Laws of Thought. Boole made it clear that upper and lower probabilities were due to the existence of missing or insufficient information, evidence, or knowledge. Boole mentioned a number of times that the problems under consideration would always have incomplete or insufficient data. He termed these kinds of problems logically indeterminate. If, however, there is no missing or unavailable information, then the upper and lower probabilities become equal to each other. There are no longer any interval valued, or upper and lower, probabilities; there is a determinate, exact, or precise probability. Additivity will always hold. Keynes introduced his weight of the argument, V, a logical relation, in chapter 6 of the A Treatise on Probability. Its purpose was to grade single pieces of data or evidence. Thus V1(h/x1 x2 x3) > V2(h/x1 x2) means that the weight of the argument V1 is greater than V2. The comparable mathematical representation of V was accomplished by Keynes with his weight of the evidence variable, w, defined on p. 315 of the A Treatise on Probability in chapter 26. w was defined on the unit interval [0,1], just as probability is defined on the unit interval between 0 and 1. A w = 1 meant that there was no missing or unavailable evidence or knowledge: the evidence is complete. If w = 1, then there can’t be any upper and lower probability estimates. Only if w < 1 can upper and lower, interval valued, probability estimates arise.
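One concrete instance of the Boole-style logical indeterminacy described above can be sketched as follows. When only the marginal probabilities of two events are known, the missing dependence information leaves P(A and B) determined only up to an interval (the bounds below are the standard Fréchet-Boole bounds, used here as an illustration rather than a reconstruction of any specific problem in The Laws of Thought); supplying the missing information collapses the interval to a point.

```python
# Illustrative sketch of Boole-style upper and lower probabilities: with
# only the marginals P(A) and P(B) known, the joint behavior is "missing
# evidence" and P(A and B) is logically indeterminate within the
# Frechet-Boole bounds. Supplying the missing information (here,
# independence, introduced purely as an assumption) yields a point value.

def and_bounds(p_a, p_b):
    """Lower and upper bounds on P(A and B) from the marginals alone."""
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

p_a, p_b = 0.7, 0.6                         # hypothetical marginals
lo, hi = and_bounds(p_a, p_b)
print((round(lo, 2), round(hi, 2)))         # (0.3, 0.6): an interval, not a point

# With the extra knowledge of independence the estimate becomes precise:
print(round(p_a * p_b, 2))                  # 0.42, inside the interval
```

This mirrors the abstract's point: the interval exists because information is missing, and it closes only when the missing information is supplied.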
- Published
- 2017
- Full Text
- View/download PDF
30. Hicks's IS-LM Interpretation of the IS-LM Model in J M Keynes's Chapter 15 of the General Theory: Partly Wrong in 1937 and Partly Wrong in 1981
- Author
-
Michael Emmett Brady
- Subjects
Variables ,IS–LM model ,media_common.quotation_subject ,Decision theory ,Liquidity preference ,Shackle ,Economics ,Post-Keynesian economics ,Imprecise probability ,Mathematical economics ,media_common ,Interpretation (model theory) - Abstract
Hicks’s 1937 interpretation of Keynes’s IS-LM model in chapter 15 of the General Theory was inferior to Champernowne’s 1936 interpretation, which correctly incorporated variables, denoted by Champernowne as Q and Q’, that stood for “the state of the news”: investor nervousness, uncertainty, or the confidence a decision maker had in his expectations. These variables appeared in both the IS and LM functions as independent variables that could shift both curves simultaneously. Only Champernowne’s work in 1936 incorporated these variables. No economist in the period 1936-2017 has acknowledged Champernowne’s unique and far-sighted accomplishment, although his article has been cited repeatedly since 1936. Hicks deliberately removed these variables from his own 1937 paper in Econometrica because they would have prevented him from creating what he called the general, general theory. In 1981, after wasting much of his readers’ time talking about different time periods and fix prices (money wages) versus flex prices (money wages), Hicks finally admitted, at the very end of his 1981 recantation-retraction of his 1937 IS-LM paper, in an article published in the Journal of Post Keynesian Economics, that the problem with IS-LM is that it does not incorporate Keynes’s concerns with uncertainty into the LM curve. Nowhere is the reader told that it was Hicks himself who had deliberately removed the uncertainty variable when presenting his interpretation of Keynes’s IS-LM curves from chapter 15 of the General Theory. Hicks’s 1981 paper’s citation of Shackle on uncertainty demonstrates that he was still confused and did not understand that Keynes’s definition of uncertainty, on p. 148 of the General Theory, made uncertainty a function of Keynes’s weight of the evidence variable only.
Keynes integrated uncertainty explicitly into decision theory with his indeterminate, interval valued probability approach, based on Boole, in chapters 3, 5, 10, 15, 16, 17, 20 and 22 of the A Treatise on Probability, and in his conventional coefficient of weight and risk, c, from chapter 26 of the A Treatise on Probability. Keynes dealt with imprecise probability in chapter 29 with his suggested use of Chebyshev’s Inequality as a lower bound. Since the Liquidity Preference LM curve is built on Keynes’s decision theory from the A Treatise on Probability, Hicks needed to have stated that he had no idea about what Keynes meant by uncertainty in either 1937 or 1981, but that uncertainty, as a function of the weight of the evidence, needed to be reintegrated into the IS LM model in both the IS curve and the LM curve.
- Published
- 2017
- Full Text
- View/download PDF
31. On Weckstein's 1959, Keynesian, Weight of the Evidence Based Demolition of G L S Shackle's Attack on the Concept of Probability
- Author
-
Michael Emmett Brady
- Subjects
Ex-ante ,Common cause and special cause ,Keynesian economics ,Shackle ,Economics ,Upper and lower probabilities ,Conditional probability ,Imprecise probability ,Probability interpretations ,Possibility theory - Abstract
R. Weckstein’s article in the February 1959 symposium in Metroeconomica on Shackle’s theory of possibility demonstrated that Shackle’s attack on the concept of probability as an approach to be used by decision makers in the real world, an attack based on probability theory’s supposed requirements of objectivity, complete ordering, additivity of objective probabilities, divisibility, repetition, and repeatability, was applicable mainly to the relative and limiting frequency interpretations of probability. Shackle’s attack also held against the subjective theory of probability, which likewise assumed complete ordering and additivity. However, Weckstein demonstrated that all of Shackle’s objections to probability per se failed completely when confronted by logical theories of probability in general, and by J M Keynes’s logical theory of probability in particular, which explicitly criticized the classical, limiting frequency, relative frequency, and subjective theories of probability. Keynes explicitly pointed out that all of these other theories of probability were actually special cases, sound and valid in their specific fields of application, but not general in scope or application. Keynes argued that only his logical theory of probability could be a general theory of probability, applicable to single events, infrequent events, and frequent events, in the form of Boole’s propositional logic and his development of indeterminate/imprecise/determinate probabilities. Specifically, it was the Keynesian concept of the weight of the evidence that led to the complete refutation of Shackle’s attacks on probability per se. Shackle’s attacks on the concept of probability are sound when made against the classical, propensity, subjective, relative frequency, and limiting frequency interpretations of probability, but are unsound when confronted with Keynes’s theory of logical probability.
Shackle never responded to Weckstein’s criticisms based on Keynes or on logical theories of probability. In fact, he deliberately chose to respond mainly to Weckstein’s mention of Reichenbach’s posit approach to singular events. Indeed, Shackle never dealt with Keynes’s logical theory of probability in his lifetime, since he never got past page 14 of the A Treatise on Probability. Instead, Shackle attempted to completely bypass the A Treatise on Probability and the General Theory, which is built on Keynes’s weight of the evidence, by claiming that Keynes’s February 1937 article in the Quarterly Journal of Economics represented Keynes’s final view of decision making and probability. A recent article by Derbyshire represents a continuation of Shackle’s severe misrepresentation of probability, and especially of Keynes’s work on uncertainty, decision making, and probability. Keynes, building on Boole’s logical approach to probability, had already solved the problems that other theories of probability had in dealing with single, crucial decisions, additivity, and uncertainty. Keynes finished this analysis in the 1908 Fellowship dissertation, which contained his interval valued, nonlinear, non-additive approach, when Shackle was five years old. It appears in Part II of the TP. Shackle never read the TP; he read little bits and pieces of it, and then wrote vague and ambiguous comments on it that made no sense when he wrote them and still make no sense in 2017. Keynes’s approach would make a better foundation for scenario planning than Shackle’s theory of possibility, if one assumes that academics are capable of actually reading the TP. However, it might be the case that academics are not capable of reading the TP. In that case, Shackle’s theory would be a second-best choice.
- Published
- 2017
- Full Text
- View/download PDF
32. An Examination of the Fundamental Reasons Why Frank Ramsey Failed in His Reviews of J M Keynes's A Treatise on Probability
- Author
-
Michael Emmett Brady
- Subjects
Mathematical theory ,Non additivity ,Technical analysis ,Economics ,Interval (graph theory) ,Special case ,Imprecise probability ,Mathematical economics ,Interval valued - Abstract
Frank P Ramsey never considered the possibility of representing the concept of probability by an interval valued approach. Ramsey considered probability to be either ordinal or numerical. There was absolutely no room for interval estimates and interval probability in his approach; it is mentioned nowhere in any of his published or unpublished work. Ramsey was not familiar with the applied work done by George Boole on interval valued probability. Ramsey believed only in the standard mathematical theory of probability. The crucial assumption for Ramsey was additivity. Of course, Keynes rejected additivity except as a special case. This is the major difference between Ramsey and Keynes. Keynes believed it obvious that there is missing relevant evidence or knowledge in many decisions made by human beings. Non-additivity immediately leads to uncertainty, and uncertainty requires the existence of non-additivity. Keynes made it explicit in Part II of the A Treatise on Probability that additivity was a special case and not the general case. Unfortunately, Ramsey could not follow Keynes’s analysis in Part II of the TP. This was also a severe problem for both F Y Edgeworth and E B Wilson. However, Edgeworth figured out that Keynes’s “non numerical” probabilities were interval valued by means of a very careful reading of Chapter III of the TP. It is very probable that Edgeworth also had the benefit of many private discussions with Keynes between 1909 and 1920, when both men were co-editors of the Economic Journal. Unfortunately, Ramsey’s conclusion was that Keynes’s “non numerical probabilities” were ordinal probabilities. Economists have mistakenly followed Ramsey’s assessment and ignored Edgeworth’s vastly superior assessment of the interval valued nature of Keynesian probability.
The major defect in the CWJMK version of the A Treatise on Probability is that the editorial foreword was written by a rabid proponent of Ramsey’s belief in additivity, Richard Braithwaite. Anyone reading this editorial foreword before reading the A Treatise on Probability will be completely biased against Keynes’s technical analysis in the book. The only existing antidotes to Braithwaite’s “review” are the two combined reviews of Edgeworth, who provided, overall, the best reviews ever written of the A Treatise on Probability, and Bertrand Russell’s review. In fact, I had to use Edgeworth’s two reviews to buttress my case in support of my dissertation with one of my dissertation committee members.
- Published
- 2017
- Full Text
- View/download PDF
33. On J M Keynes's Original Contributions to Decision Making Under Uncertainty: Indeterminate, Interval Valued Probabilities in Part II of the A Treatise on Probability and Imprecise, Interval Valued Probabilities in Part V of the A Treatise on Probability
- Author
-
Michael Emmett Brady
- Subjects
Normal distribution ,Range (mathematics) ,Statistics ,Upper and lower probabilities ,Inverse function ,Limit (mathematics) ,Imprecise probability ,Indeterminate ,Upper and lower bounds ,Mathematical economics ,Mathematics - Abstract
All of the logical, statistical and mathematical foundations for Keynes’s work on uncertainty in the General Theory and his 1937 Quarterly Journal of Economics reply article can be found in Parts II and V of Keynes’s A Treatise on Probability. Keynes, building on the original, mathematically developed, logical theory of probability first put forth by George Boole in his The Laws of Thought in 1854, demonstrated how to construct upper and lower probabilities that would define an interval valued probability. Keynes dealt with indeterminate probabilities first. Indeterminate probabilities are interval valued probabilities where the range between the upper and lower bounds or limits will not diminish as new, additional data or evidence is examined, because there will always be important relevant evidence that will never be attained and is permanently missing or unavailable when a decision or assessment must be made. The great American mathematician Edwin Bidwell Wilson acknowledged this begrudgingly in his 1934 JASA article, “A Problem of Boole’s.” Keynes also developed the first approach to “Safety First” by showing how to use Chebyshev’s Inequality to provide a lower bound or limit. This imprecise lower bound would be based on the initial, small amount of data or information available to the decision maker at the start of his assessment of a problem. However, over time, as more and more relevant data and information became available, the decision maker would be able to make use of more accurate and reliable statistical methods related to the Normal distribution. The Normal distribution estimate would then serve as an upper bound. The range between the lower and upper bounds would diminish as larger and larger samples were used. Eventually, a precise probability estimate would result. The lower bound would be relevant at the initial stages of an estimation process.
Keynes’s views on imprecise probability estimates, in cases where significant additional data and information can be obtained over time and a decision to act can be postponed indefinitely, imply that eventually a stable, precise probability could be calculated. When Keynes defined uncertainty to be an inverse function of the weight of the evidence, he automatically provided his GT with a foundation consisting of the mathematical and logical analysis contained in Parts II and V of the TP on indeterminate probabilities, imprecise probabilities, the weight of the evidence, and the conventional coefficient of weight and risk. Unfortunately, 81 years after Keynes wrote the GT and 96 years after he published the TP, economists are still unable to read Parts II and V of the TP. Economists, like the French mathematician Emile Borel and the American mathematician Edwin Wilson before them, have found Parts II and V of the TP simply overwhelming.
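[Editorial note: the bounding scheme this abstract attributes to Keynes can be sketched numerically. The sketch below is the present editor's illustration, not Keynes's own notation; the 95% coverage level and the sample sizes are assumed values. Chebyshev's Inequality gives a wide, distribution-free interval usable with little data, the Normal approximation a tighter one defensible with much data, and both narrow as the sample grows.]

```python
import math

def chebyshev_interval(mean, sd, n, coverage=0.95):
    # Distribution-free bound: P(|Xbar - mu| >= k * se) <= 1 / k**2,
    # so taking k = sqrt(1 / (1 - coverage)) guarantees the stated coverage.
    se = sd / math.sqrt(n)
    k = math.sqrt(1.0 / (1.0 - coverage))
    return (mean - k * se, mean + k * se)

def normal_interval(mean, sd, n, coverage=0.95):
    # Normal-theory interval, appropriate once the sample is large:
    # z = 1.96 gives two-sided 95% coverage.
    z = 1.959963984540054
    se = sd / math.sqrt(n)
    return (mean - z * se, mean + z * se)

def width(interval):
    lo, hi = interval
    return hi - lo
```

For a 95% level the Chebyshev multiplier is sqrt(20), about 4.47, against the Normal 1.96, so the distribution-free interval is always the wider of the two; both widths fall like 1/sqrt(n), matching the narrowing of the bounds described above.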
- Published
- 2017
- Full Text
- View/download PDF
34. How Quantifying Probability Assessments Influences Analysis and Decision Making: Experimental Evidence from National Security Professionals
- Author
-
Jennifer S. Lerner, Jeffrey A. Friedman, and Richard J. Zeckhauser
- Subjects
Actuarial science ,National security ,business.industry ,media_common.quotation_subject ,Homeland security ,Public relations ,Imprecise probability ,Numeral system ,Action (philosophy) ,Political science ,Terrorism ,business ,Overconfidence effect ,Skepticism ,media_common - Abstract
National security is one of many fields where public officials offer imprecise probability assessments when evaluating high-stakes decisions. This practice is often justified with arguments about how quantifying subjective judgments would bias analysts and decision makers toward overconfident action. We translate these arguments into testable hypotheses, and evaluate their validity through survey experiments involving national security professionals. Results reveal that when decision makers receive numerals (as opposed to words) for probability assessments, they are less likely to support risky actions and more receptive to gathering additional information, disconfirming the idea of a bias toward action. Yet when respondents generate probabilities themselves, using numbers (as opposed to words) magnifies overconfidence, especially among low-performing assessors. These results hone directions for research among both proponents and skeptics of quantifying probability estimates in national security and other fields. Given that uncertainty surrounds virtually all intelligence reports, military plans, and national security decisions, understanding how national security officials form and interpret probability assessments has wide-ranging implications for theory and practice.
- Published
- 2016
- Full Text
- View/download PDF
35. A Likelihood Story: The Theory of Legal Fact-Finding
- Author
-
Sean P. Sullivan
- Subjects
Persuasion ,Adversarial system ,media_common.quotation_subject ,Econometrics ,Conditional probability ,Imprecise probability ,Probability interpretations ,Measure (mathematics) ,Mathematics ,media_common ,Fact-finding ,Simple (philosophy) - Abstract
For over 50 years, courts and scholars have tried to conceptualize fact-finding, and burdens of persuasion, in terms of the probability of facts given the evidence. The exercise has not produced a satisfying theory of fact-finding. The problem is reliance on probability. Fact-finding is not about probability. It’s about likelihood. The difference between these concepts is substantial. Where probability theories of fact-finding ask about the probability of the facts given the evidence, the proposed likelihood approach asks about the probability of the evidence given different assumptions about the facts. Where probability theories measure subjective beliefs, the likelihood approach measures the relative weight of evidence alone. Using the statistical properties of likelihoods, I show that every burden of persuasion in use today can be reduced to the same simple rule of likelihood reasoning. This likelihood theory of fact-finding closely mirrors the procedure of adversarial litigation, and solves all of the paradoxes, difficulties, and unacceptable implications that have long frustrated probability theories of fact-finding.
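[Editorial note: the probability/likelihood distinction in this abstract can be made concrete with a toy calculation. The binomial model, the per-item probabilities, and the counts below are invented for illustration and are not from the article; the likelihood approach compares P(evidence | one account) with P(evidence | the rival account), never forming a posterior over the facts themselves.]

```python
import math

def binomial_likelihood(k, n, p):
    # P(evidence | hypothesis): probability of observing k "matching"
    # items out of n if the hypothesis implies per-item probability p.
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def finds_for_plaintiff(k, n, p_plaintiff, p_defendant):
    # Preponderance rendered as a likelihood-ratio rule (a simplification
    # of the article's thesis): find for the plaintiff iff the evidence is
    # more probable under the plaintiff's account than the defendant's.
    lr = binomial_likelihood(k, n, p_plaintiff) / binomial_likelihood(k, n, p_defendant)
    return lr > 1.0
```

No prior over the facts appears anywhere in the rule; only the relative weight of the evidence under the two accounts matters, which is the contrast the abstract draws with probability theories of fact-finding.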
- Published
- 2016
- Full Text
- View/download PDF
36. Richard E. Braithwaite on J. M. Keynes's A Treatise on Probability and Logical Theory of Probability: Ignorance is Bliss
- Author
-
Michael Emmett Brady
- Subjects
media_common.quotation_subject ,Philosophy ,Ignorance ,Imprecise probability ,Interval valued ,Logical theory ,BLISS ,Law ,Similarity (psychology) ,Intuition (Bergson) ,Relation (history of concept) ,Mathematical economics ,computer ,media_common ,computer.programming_language - Abstract
Richard B. Braithwaite’s October 1931 review article in Mind on Jeffreys’s work on probability also summarized the then-current assessment of Keynes’s A Treatise on Probability and logical theory of probability. That assessment was based on complete and total ignorance on Braithwaite’s part of what Keynes actually accomplished in the A Treatise on Probability. He had no idea what an interval valued, indeterminate probability is. He had no idea how Keynes built on Boole’s upper-lower bound approach. He had no idea about the concept of the weight of the evidence, w, and how it is connected to the size of the difference between the lower and upper bounds. He apparently forgot that Keynes’s logical approach to probability was carefully laid out in 1907 and 1908, eleven or twelve years before Harold Jeffreys published his 1919 articles with Wrinch. Finally, Braithwaite had no idea how to compare objects using a relation of similarity and/or dissimilarity, which, of course, is the basic requirement for pattern recognition, recognized as fundamental by cognitive psychologists and cognitive scientists if decision makers are to successfully use their intuition and induction. He was totally oblivious to the connection between degrees of similarity and Keynes’s logical probability relations. In short, he was an ignorant fool who failed abysmally, like Hugh Townshend, to make use of the clues Keynes periodically sent his way.
- Published
- 2016
- Full Text
- View/download PDF
37. Essays on Risk and Uncertainty: Comparing J. M. Keynes and the Von Mises Brothers, Richard and Ludwig, on Probability and Decision Theory
- Author
-
Michael Emmett Brady
- Subjects
Frequentist probability ,Economics ,Law of total probability ,Conditional probability ,Probability and statistics ,Imprecise probability ,Mathematical economics ,Propensity probability ,Probability interpretations ,Probability measure - Abstract
Richard von Mises set forth a Limiting Frequency approach to probability for use primarily in physics and some areas of biology, chemistry and engineering. Emphasis was placed on the existence of series or sequences of events that resulted from a known generating mechanism. The series consisted of repeated, homogeneous events that were identical to each other, except for their particular place selection in an infinite series. This conception of probability is extremely limited and can play only a very limited role in the liberal arts, social science, behavioral science, and especially in economics, business, and finance. Ludwig von Mises’s conception was that there were two different concepts of probability. First, there was a class concept of probability that was basically a version of a limiting frequency interpretation of probability. He had a second conception that he called case probability. Case probability was supposed to deal with unique events. It is essentially a subjective degree of belief approach that allows one to calculate no more than an ordinal relationship. Ludwig von Mises is thus not a Bayesian Subjectivist. His dual approach to probability is reminiscent of Rudolf Carnap’s, although it is greatly underdeveloped technically when compared to Carnap. J. M. Keynes’s conception of probability is a logical theory of probability built on G. Boole’s earlier logical theory of probability. Keynes’s theory deals with the degree of rational belief one ought to hold and act on. It is not “a” or “the” degree of belief used by Subjectivists. It is a relative concept, which means that all probabilities are conditional probabilities. There is an objective probability relation that holds between a given body of evidence, the premises, E, and a conclusion or hypothesis, H. Thus, P(H|E) = α.
The objective probability relation measures the degree of similarity (dissimilarity) that exists between the evidence, E (the premises), and the conclusion or hypothesis, H. There are numerous severe errors regarding Keynes’s logical theory of probability in Richard von Mises’s book, Probability, Statistics and Truth, which will be identified and corrected in this paper. Neither of the von Mises brothers understood Keynes’s interval valued approach to probability or had any clear-cut understanding of the concept of weight or of decision weights like Keynes’s conventional coefficient in chapter 26 of the A Treatise on Probability, although such an approach appears out of nowhere in Lecture 4 of Probability, Statistics and Truth. Ludwig von Mises’s conception of case probability has no developed concept of the weight of the evidence underlying it. Its identification of probability as ordinal probability would, from Keynes’s point of view, be far too weak to help an entrepreneur in making decisions related to his revenues and costs and to the question of whether he should invest in fixed capital now or wait to do so at some time in the future (liquidity preference and user cost of capital issues).
- Published
- 2016
- Full Text
- View/download PDF
38. The Rule of Probabilities: A Practical Approach for Applying Bayes' Rule to the Analysis of DNA Evidence
- Author
-
Ian Ayres and Barry Nalebuff
- Subjects
Bayes' rule ,business.industry ,Computer science ,Conditional probability ,Bayesian inference ,Imprecise probability ,Machine learning ,computer.software_genre ,Bayes' theorem ,Inductive probability ,Prior probability ,Artificial intelligence ,business ,Probability interpretations ,computer - Abstract
Bayes’ rule is not being used to guide jury decision making in the vast majority of criminal cases introducing evidence of DNA testing. Instead of telling juries the “source probability,” the probability that the individual whose DNA matches was the source of the forensic evidence found at the crime scene, experts only present pieces of the puzzle. They provide the probability that a randomly selected innocent person would have a match or the expected number of innocent matches in the database. In some cases, the random match probability will be so low (one in a quadrillion) that the intuitive source probability is practically one hundred percent. But, in other cases, with large database trawls and random match probability at 1 in a million, jurors will have no ability to convert the random match probability or the likelihood ratio based on expected number of matches into relevant data that will help them address the question of guilt. This Article shows that a correct application of Bayes’ rule should lead fact-finders and litigants to focus on the size of two variables that influence the source probability: the probability that a non-source in the DNA database would have an alibi, and the probability that the source of the DNA is included in the database. This Article suggests practical means of estimating these two variables and argues that as a legal matter these parameters as well as the Bayesian posterior source probability are admissible in court. In particular, focusing on the prior probability that the “database is guilty,” i.e. the probability that someone in the database is the source of the forensic evidence, is not just analytically and empirically tractable, but avoids the evidentiary limitations concerning a particular defendant’s prior bad acts. 
Appropriate application of Bayes’ rule, far from preempting the fact-finding and adversarial process, can guide advocates to engage the important aspects of the evidence that are still likely to be open to contestation. Perhaps most important, appropriate application of Bayes’ rule will also allow jurors to reach verdicts via a coherent path that employs sound logic and reasoning.
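[Editorial note: the posterior the Article calls the source probability follows from a one-line application of Bayes' rule. The sketch below uses a deliberately simplified model, assumed by the present editor rather than taken from the Article: perfect test sensitivity (P(match | source) = 1), a single matching individual, and the prior and random match probability supplied directly instead of via the Article's database-trawl machinery.]

```python
def source_probability(prior, random_match_prob):
    # Bayes' rule with P(match | source) = 1 assumed (no typing error):
    # P(source | match) = prior / (prior + rmp * (1 - prior)).
    return prior / (prior + random_match_prob * (1.0 - prior))
```

With a one-in-a-million random match probability, a fifty-fifty prior yields a near-certain posterior; but a one-in-a-million prior, as in a cold database hit with no other evidence, yields a posterior of only about one half. That gap is the Article's warning about presenting random match probabilities to jurors without the prior.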
- Published
- 2015
- Full Text
- View/download PDF
39. Indeterminate versus Imprecise Interval Probabilities: The Keynes-Knight and the De Finetti-Savage Approaches
- Author
-
Rogério Arthmar and Michael Emmett Brady
- Subjects
Operational definition ,Bayesian probability ,Calculus ,Knight ,Appeal ,Interval (graph theory) ,Decision maker ,Indeterminate ,Imprecise probability ,Mathematical economics ,Mathematics - Abstract
The operational definitions of uncertainty used by John M. Keynes and Frank H. Knight are based on missing information that will not be available to the decision maker at any time. The founder of this approach is George Boole. This leads to indeterminate interval probabilities. The definition of uncertainty proposed by Bruno de Finetti and Leonard J. Savage is significantly different from the one advanced by Keynes and Knight. Uncertainty, for de Finetti and Savage, can only exist in the initial conditions of an experiment or study, due to measurement error or a lack of data available at the beginning of the study. Imprecise interval probabilities will always eventually result in precise probabilities through Bayesian conditionalization. Recent studies, such as the papers by Feduzi, Runde and Zappia (2012, 2014), seem to have overlooked this particular aspect of de Finetti and Savage’s 1962 paper ‘Sul Modo di Scegliere le Probabilità Iniziali’. This work is cited by these scholars as supporting their claim that there are similarities between the de Finetti-Savage conception of uncertainty and the Keynes-Knight approach. This claim, however, loses much of its appeal once it is realized that de Finetti and Savage’s discussion involves only the initial probabilities.
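[Editorial note: the contrast drawn in this abstract, imprecise intervals that Bayesian conditionalization eventually collapses to precise probabilities, can be sketched in the robust-Bayes style. The Beta priors, their strength, and the sample figures below are the present editor's assumptions for illustration, not anything taken from de Finetti and Savage's paper.]

```python
def posterior_mean_interval(successes, n, prior_means=(0.1, 0.9), prior_strength=10):
    # A set of Beta(a, b) priors spanning prior_means; conditioning each
    # on the same data yields an interval of posterior means whose width
    # shrinks as the sample size n grows.
    means = []
    for m in prior_means:
        a = prior_strength * m
        b = prior_strength * (1.0 - m)
        means.append((a + successes) / (a + b + n))
    return min(means), max(means)
```

With 10 observations the interval of posterior means is still wide; with 1,000 it has nearly closed around the sample frequency, which is the de Finetti-Savage picture. The Keynes-Knight case is the one where the missing evidence never arrives, so no such narrowing occurs.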
- Published
- 2014
- Full Text
- View/download PDF
40. How Is It Possible for Keynes's Theory of Logical Probability to Be 'Non-Probabilistic'? Answer. Only If You Have Been Confused by Reading Frank P. Ramsey's Reviews of Keynes's A Treatise on Probability
- Author
-
Michael Emmett Brady
- Subjects
Level of measurement ,If and only if ,Reading (process) ,media_common.quotation_subject ,Probabilistic logic ,Garbage dump ,Imprecise probability ,Mathematical economics ,Logical theory ,media_common ,Mathematics - Abstract
Ramsey’s fundamental error, made in both 1922 and 1926, was to assume without any support that Keynes’s Logical Theory of Probability was based on ordinal measurement that could be applied only on some occasions. Keynes’s theory is not an ordinal theory at all; it is an interval estimate theory of probability. Ramsey’s reviews of Keynes’s work in the A Treatise on Probability must now be consigned to the intellectual garbage dump of history where they belong.
- Published
- 2014
- Full Text
- View/download PDF
41. Adam Smith's Theory of Probability and the Roles of Risk and Uncertainty in Economic Decision Making
- Author
-
Michael Emmett Brady
- Subjects
Frequentist probability ,Bayesian probability ,Probabilistic logic ,Probability distribution ,Decision-making ,Imprecise probability ,Mathematical economics ,Probability interpretations ,Optimal decision ,Mathematics - Abstract
Adam Smith rejected the use of the mathematical laws of the calculus of probabilities because the basic information-data-knowledge provided in the real world of decision making did not allow a decision maker to specify precise, definite, exact, numerical probabilities or to discover the probability distributions. This means that Smith rejected the classical interpretation of probability of Laplace and the Bernoullis, the limiting frequency-relative frequency interpretation of probability, and the personalist, subjectivist, psychological Bayesian approach used by all neoclassical schools of thought, because all of these approaches to probability claim that ALL probabilities can be represented by a single numeral between 0 and 1 and that the decision maker knows the probability distributions. Smith, like Keynes, rejects this immediately. Thus, Smith’s inductive or logical concept of probability, like Keynes’s, only approaches mathematical probability in the limit. Adam Smith recognized that economic decision makers were confronted with knowledge structures that were not sharp and clear, but cloudy and amorphous. However, decision makers were still able to use the concept of probability in the weaker, interval sense, first advocated by George Boole and later, with much greater force, by John Maynard Keynes in his two Fellowship dissertations, submitted in 1907 and 1908, respectively, and in his A Treatise on Probability (1921). Instead of sharp, definite, determinate, calculated, and exact probabilistic estimates or distributions, inexact, indefinite, indeterminate, and imprecise estimates of probabilities could be derived and used, so that decision makers were able to make choices among different possible options concerning the future in a rational fashion.
An important conclusion of this paper is that it was Adam Smith who first explicitly recognized that the mathematical concept of probability is not applicable, in general, in real world decision making. Smith also rejected the normative and prescriptive roles of mathematical probability in decision making. Adam Smith applied his approach to probability and uncertainty by analyzing the economic decisions made by human beings in choosing a particular profession and in organizing various insurance markets to cover the risk of loss. Smith’s risk, however, is not the standard deviation of the Normal probability distribution used by "Modern" economists, since important data/information/knowledge is missing and not available to the decision maker at the point in time when he is required to make a decision.
- Published
- 2013
- Full Text
- View/download PDF
42. Keynes’ Lower-Upper Bound Interval Approach to Probability
- Author
-
Michael Emmett Brady and Rogério Arthmar
- Subjects
Nonlinear system ,Probability bounds analysis ,Decision theory ,Conditional probability ,Lower upper ,Interval (mathematics) ,Mathematical structure ,Imprecise probability ,Mathematical economics ,Mathematics - Abstract
This paper shows how Keynes provided a complete mathematical structure for his system of probability, which he called “approximation”, in the A Treatise on Probability (1921). Keynes used the standard concept of conditional probability to duplicate many of Boole’s problems. It is also demonstrated that he provided a solid mathematical structure for his non-linear, non-additive approach to decision theory. We conclude that Keynes should be recognized as the founder of the modern non-additive approach to probability.
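[Editorial note: a minimal rendering of the lower-upper bound idea, in the present editor's own representation rather than Keynes's notation, with the sample numbers assumed for illustration. An event carries a probability interval, upper and lower probabilities are conjugate under complementation, and the lower probabilities are non-additive: lower(A) plus lower(not-A) can fall short of one, the shortfall reflecting missing evidence.]

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    lo: float
    hi: float

    def complement(self):
        # Conjugacy of upper and lower probabilities:
        # lower(not A) = 1 - upper(A), upper(not A) = 1 - lower(A).
        return IntervalProb(1.0 - self.hi, 1.0 - self.lo)

    def width(self):
        # Keynes ties the interval's width to the weight of the evidence:
        # more relevant evidence, narrower interval.
        return self.hi - self.lo
```

For A = IntervalProb(0.2, 0.6) the complement is [0.4, 0.8], so the lower probabilities sum to 0.6, strictly less than one. The additive (precise) special case is recovered exactly when lo equals hi.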
- Published
- 2010
- Full Text
- View/download PDF
43. Combining Imprecise or Conflicting Probability Judgments: A Choice-Based Study
- Author
-
Aurélien Baillon and Laure Cabantous
- Subjects
High probability ,Core (game theory) ,Cumulative prospect theory ,media_common.quotation_subject ,Econometrics ,Context (language use) ,Sensibility ,Pessimism ,Function (engineering) ,Imprecise probability ,media_common ,Mathematics - Abstract
This article develops a new approach to studying the impact on beliefs and decisions of uncertain probability forecasts by advisors. The core concept of that approach, which builds on the revealed-preference approach favored by economists, is that of revealed beliefs: the precise probability leading to the same choice as an uncertain probability forecast. The paper defines two properties of revealed beliefs, pessimism and likelihood sensibility, and allows them to vary as a function of the source of uncertainty. It then focuses on the impacts on revealed beliefs of two uncertain contexts of intense practical importance, namely consensual but imprecise uncertainty (Si) and conflicting uncertainty (Sc). In the first case, the advisors agree on the same imprecise probability forecast, whereas in the latter case the advisors disagree on the probability of the same risk and the decision-maker receives two precise but different probability forecasts. Two experiments test a series of predictions concerning the impacts on revealed beliefs of each uncertain context. The results of both studies show that revealed beliefs vary as a function of the source of uncertainty, in particular for low and high probability events. The second study tests a series of predictions on the differences between revealed beliefs and the beliefs that decision-makers report as their best estimate of the probability of the risk (called judged beliefs).
- Published
- 2009
- Full Text
- View/download PDF
44. The Weight of Argument and Non-Additive Measures: A Note
- Author
-
Marcello Basili and Carlo Zappia
- Subjects
Ellsberg paradox ,Decision theory ,jel:D21 ,Imprecise probability ,jel:B16 ,Probability theory ,Dempster–Shafer theory ,Prior probability ,Econometrics ,Mathematical economics ,Probability interpretations ,uncertainty, probabilities, Keynes ,Mathematics ,Probability measure - Abstract
This note argues that a representation of the epistemic state of the individual through a non-additive measure provides a novel account of Keynes’s view of probability theory proposed in his Treatise on Probability. The paper shows, first, that Keynes’s “non-numerical probabilities” can be interpreted in terms of decisional weights and distortions of the probability priors. Second, that the degree of non-additivity of the probability measure can account for the confidence in the assessment without any reference to a second order probability. And, third, that the criterion for decision making under uncertainty derived in the non-additive literature incorporates a measure of the degree of confidence in the probability assessment. The paper emphasises the Keynesian derivation of Ellsberg’s analysis: the parallel between Keynes and Ellsberg is deemed to be significant since Ellsberg’s insights represent the main starting point of the modern developments of decision theory under uncertainty and ambiguity.
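[Editorial note: the note's central object, a non-additive measure (capacity), can be written out for the three-colour Ellsberg urn: one ball in three is red for certain, while the black/yellow split is unknown. The specific capacity values below are a conventional textbook-style assignment assumed by the present editor for illustration, not taken from the note.]

```python
# A capacity v assigns each event a weight; unlike a probability measure
# it need not be additive. Here v({B}) + v({Y}) < v({B, Y}): ambiguity
# about the black/yellow split depresses the weights of ambiguous events.
R, B, Y = "red", "black", "yellow"
v = {
    frozenset(): 0.0,
    frozenset({R}): 1 / 3,        # unambiguous: exactly 1 of 3 balls
    frozenset({B}): 1 / 6,        # ambiguous, discounted below 1/3
    frozenset({Y}): 1 / 6,        # ambiguous, discounted below 1/3
    frozenset({R, B}): 1 / 2,
    frozenset({R, Y}): 1 / 2,
    frozenset({B, Y}): 2 / 3,     # unambiguous: exactly 2 of 3 balls
    frozenset({R, B, Y}): 1.0,
}
```

Non-additivity shows up as v({B}) + v({Y}) = 1/3, strictly below v({B, Y}) = 2/3; on the note's reading, that shortfall plays the role of Keynes's weight of argument, a confidence discount requiring no second order probability.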
- Published
- 2007
- Full Text
- View/download PDF
45. Insurance and Probability Weighting Functions
- Author
-
Sanjit Dhami and Ali al-Nowaihi
- Subjects
jel:C60 ,jel:D81 ,Posterior probability ,Conditional probability ,Moment-generating function ,Imprecise probability ,Weighting ,Decision making under risk ,Prelec’s probability weighting function ,Higher order Prelec probability weighting functions ,Behavioral economics ,Rank dependent utility theory ,Prospect theory ,Insurance ,St. Petersburg paradox ,Mean-preserving spread ,Statistics ,Econometrics ,Random variable ,Mathematics ,Probability measure - Abstract
Evidence shows that (i) people overweight low probabilities and underweight high probabilities, but (ii) ignore events of extremely low probability and treat extremely high probability events as certain. Decision models, such as rank dependent utility (RDU) and cumulative prospect theory (CP), use probability weighting functions. Existing probability weighting functions incorporate (i) but not (ii). Our contribution is threefold. First, we show that this would lead people, even in the presence of fixed costs and actuarially unfair premiums, to insure fully against losses of sufficiently low probability. This is contrary to the evidence. Second, we introduce a new class of probability weighting functions, which we call higher order Prelec probability weighting functions, that incorporate (i) and (ii). Third, we show that if RDU or CP are combined with our new probability weighting function, then a decision maker will not buy insurance against a loss of sufficiently low probability; in agreement with the evidence. We also show that our weighting function solves the St. Petersburg paradox that reemerges under RDU and CP.
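[Editorial note: the standard first-order Prelec function that this paper generalizes can be sketched directly. The parameter value alpha = 0.65 below is an assumption for illustration (empirical estimates vary), and the paper's own higher-order variants modify this functional form at the extremes.]

```python
import math

def prelec_weight(p, alpha=0.65, beta=1.0):
    # Prelec (1998): w(p) = exp(-beta * (-ln p)**alpha) for 0 < p <= 1.
    # With 0 < alpha < 1 this overweights low probabilities and
    # underweights high ones, crossing the diagonal at p = 1/e when beta = 1.
    if p <= 0.0:
        return 0.0
    return math.exp(-beta * (-math.log(p)) ** alpha)
```

Here w(0.01) is roughly 0.067, above 0.01 (overweighting), while w(0.99) is roughly 0.95, below 0.99 (underweighting): property (i) in the abstract. Property (ii), ignoring extremely rare events outright, is exactly what this first-order form fails to deliver and what the paper's higher-order versions are built to add.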
- Published
- 2006
- Full Text
- View/download PDF