The G8 protests and the logically inconsistent foundations of neoclassical economics

12 Responses

  1. enteringthewhirlpool says:

    Sonnenschein–Mantel–Debreu Theorem (Debreu 1959) proves the basic axioms of neoclassical economics are logically inconsistent.

    How? What assumption of neoclassical economics is violated by the possibility of multiple equilibria? Why do you describe this as a problem? The existence of multiple equilibria is realistic. By the third axiom do you mean the third of the “methodological X” items you have listed? I have looked at the reference you give (Arnsperger, C & Varoufakis, Y 2006) and fail to see how their definition of methodological equilibration is inconsistent with the possibility of multiple equilibria. Please explain.

    What any of this has to do with the ideological miscellany of G8 protestors beats me, but for the moment let’s stick to the technical stuff.

  2. HopeForTheDismalScience says:

    The Sonnenschein–Mantel–Debreu (SMD) Theorem (Debreu 1974; Mantel 1974; Sonnenschein 1972, 1973) proves that General Equilibrium has multiple equilibria. Kirman (1992, p. 119) notes that the uniqueness of equilibrium in General Equilibrium Theory (GET) is required to justify comparative statics and its use in policy analysis, thereby undermining one of the three assumptions underpinning the neoclassical framework: methodological equilibration. Rizvi (1994, p. 358) considers this result the most significant ‘negative’ result in mainstream economics since the capital controversies and Arrow’s impossibility theorem.

    The SMD Theorem shows that even if every agent has a nicely shaped individual demand curve, the market demand curve need not be nicely shaped. Such market demand curves permit multiple equilibria in GET, with the consequence that neither dynamic stability nor the stability of any particular equilibrium can be assured. These stability issues are at odds with the neoclassical equilibration assumption: an equilibrium lacking dynamic stability makes the neoclassical technique of comparative statics meaningless, and the stability of an equilibrium cannot be assumed because a small shock may be sufficient to move the system to an adjacent equilibrium, which also renders comparative statics meaningless. Rizvi (1994, p. 363) concludes that the SMD Theorem brings the microfoundations project based on general equilibrium theory to an end. This suggests that the effort over the last one hundred years to derive the demand curve from utility-maximisation has been essentially wasted, so far as its intended result is concerned.
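
    The multiplicity and instability can be made concrete with a toy numerical sketch. The excess demand function below is not derived from any particular preference specification; it is simply a continuous function with three roots that satisfies Walras' law, which is all the SMD Theorem guarantees, and the price-adjustment rule is a simple tâtonnement:

```python
# Toy illustration of multiple equilibria under the SMD Theorem.
# Two goods; normalise p2 = 1 - p1. The excess demand for good 1 is an
# arbitrary continuous function with three roots (three equilibria);
# excess demand for good 2 then follows from Walras' law:
#   p1*z1(p1) + p2*z2(p1) = 0.

def z1(p1):
    """Excess demand for good 1: equilibria at p1 = 0.25, 0.5, 0.75."""
    return -(p1 - 0.25) * (p1 - 0.5) * (p1 - 0.75)

def z2(p1):
    """Excess demand for good 2, implied by Walras' law."""
    return -p1 * z1(p1) / (1.0 - p1)

def tatonnement(p1, step=10.0, iters=500):
    """Raise p1 while good 1 is in excess demand, lower it otherwise."""
    for _ in range(iters):
        p1 += step * z1(p1)
    return p1

# Two starting prices either side of the middle (unstable) equilibrium
# converge to *different* equilibria:
low = tatonnement(0.45)    # settles near 0.25
high = tatonnement(0.55)   # settles near 0.75
print(round(low, 3), round(high, 3))
```

    A shock that nudges the price across the unstable middle equilibrium flips the system to a different equilibrium altogether, which is the sense in which comparative statics around any one equilibrium can mislead.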

    Keen (2001) provides a more extensive argument, if the above is unpersuasive.

    References

    Debreu, G 1974, ‘Excess demand functions’, Journal of Mathematical Economics, vol. 1, no. 1, pp. 15-21.
    Keen, S 2001, Debunking economics: the naked emperor of the social sciences, Pluto Press, Annandale, N.S.W.
    Kirman, AP 1992, ‘Whom or What Does the Representative Individual Represent?’ Journal of Economic Perspectives, vol. 6, no. 2, pp. 117-36.
    Mantel, RR 1974, ‘On the characterization of aggregate excess demand’, Journal of Economic Theory, vol. 7, no. 6, pp. 348-53.
    Rizvi, SAT 1994, ‘The Microfoundations Project in General Equilibrium Theory’, Cambridge Journal of Economics, vol. 18, no. 4, pp. 357-77.
    Sonnenschein, H 1972, ‘Market Excess Demand Functions’, Econometrica, vol. 40, no. 3, pp. 549-63.
    —- 1973, ‘Do Walras’ identity and continuity characterize the class of community excess demand functions?’ Journal of Economic Theory, vol. 6, no. 4, pp. 345-54.

  3. esminihan says:

    This is a very interesting conversation. I’d like to contribute one comment and one question. It is true that several governments and institutions employ CGE and DSGE models. However, the researchers generally responsible for their construction and application to policy analysis have training that far exceeds “the very narrow version of the free market peddled by neoclassical economics, which is that taught in the majority of undergraduate courses.” These models are not employed because of ignorance about their limitations, but because they are useful, serving as one part of a more comprehensive analysis. However, I can see the validity of your concerns if the results of such models are used out of context by non-experts who may not be aware of the limitations, a group which, I might add, potentially includes your G8 protestors as well as politicians. My question is simply: what are your suggestions for alternative approaches to policy analysis?

  4. HopeForTheDismalScience says:

    Esminihan, thank you for your comments and question. I hope the following discussion from my PhD thesis (Bell 2009) covers both sufficiently.

    Blaug (1992, pp. 238-39) comments that “[The] central weakness of modern economics is, indeed, the reluctance to produce theories that yield unambiguous refutable implications, followed by a general unwillingness to confront those implications with fact.” Blaug (1992, p. 18) quotes the Duhem–Quine thesis that “no conclusive disproof of a scientific theory can ever be made.” The Duhem–Quine Thesis, or the ‘under-determination problem’, can explain the “general unwillingness to confront those implications with fact” because an observation contradicting a theory can always be explained away by adding an auxiliary hypothesis. This technique makes the neoclassical framework impervious to empirical falsification. For example, Blaug (1992, p. 168) cites Weintraub (1985) arguing at length that GE must be appraised as research in mathematics and not as a theory that can be falsified. Additionally, Blaug (1992, p. 168) cites Hahn (1984, pp. 4-5) claiming that falsifiability for GE is unnecessary, arguing that GE [science] provides understanding without prediction, justifying GE with the symmetry thesis.

    However, the underlying philosophy of neoclassical economics is instrumentalism: “a system of pragmatic philosophy that considers ideas to be instruments that should guide our actions and whose value is measured by their success” (WordNet® 3.0 Princeton University 2009). Friedman (1953, p. 15), a major proponent of instrumentalism, states “…, the relevant question to ask about the ‘assumptions’ of a theory is not whether they are descriptively ‘realistic’, for they never are, but whether they are sufficiently good approximations for the purpose in hand. And this question can be answered only by seeing whether the theory works, which means whether it yields sufficiently accurate predictions.”

    In the neoclassical view, then, relative predictive performance matters more than a theory’s ability to describe reality: ideas are instruments, realism is unnecessary, and the only measure of an idea’s success is its ability to predict. That ability to predict seems to have been forgotten, for neoclassical economics neither makes falsifiable predictions nor has realistic assumptions. However, let us pretend that neoclassical economics makes falsifiable predictions, so as to grant it its original philosophical foundation of instrumentalism.

    Musgrave (1981) discusses the flaws in instrumentalism, noting three types of assumption: negligibility, domain and heuristic. Musgrave (1981) discusses Friedman’s example of assuming no air resistance when applying Newton’s law of universal gravitation in the earth’s atmosphere. This is a negligibility assumption applied within the domain assumptions of objects of high mass and low air resistance; outside those domain assumptions the theory fails to provide an adequate model of, for example, the terminal velocity of a skydiver. Musgrave (1981) notes that Friedman and neoclassical economics fail to acknowledge or clearly specify their domain assumptions, and that when operating outside its domain assumptions a theory may well be misleading or totally incorrect. Colander (2000, p. 3) equates neoclassical economics “to the celestial mechanics of a nonexistent universe” for using theory outside its domain assumptions. This makes neoclassical economics inappropriate for policy development, as it fails to describe or predict, unless it is used for prescriptive reasons.

    Friedman (1953, p. 14) continues his advocacy of instrumentalism: “Truly important and significant hypotheses will be found to have ‘assumptions’ that are wildly inaccurate descriptive representations of reality, and, in general, the more significant the theory, the more unrealistic the assumptions (in this sense)”. Musgrave (1981) notes this approach may be suitable for heuristic assumptions, citing Newton’s model of a solar system consisting of just the sun and the earth, an unrealistic assumption, yet one from which the model could still make reasonably accurate predictions. Introducing more realistic assumptions led to increases in predictive performance, eventually leading to the many-body problem and Poincaré’s work on it, which in turn ushered in chaos theory and complexity theory. Keen (2001, p. 153) summarises, in contradiction to Friedman (1953, p. 14), that abandoning factually false heuristic assumptions normally leads to better theory, not worse theory. The parallels in neoclassical economics are its three underlying assumptions: methodological instrumentalism, methodological individualism, and methodological equilibration. Abandoning these three unrealistic assumptions for more realistic ones means assuming instead that: first, agents are rule-following; second, agents interact directly, not just via uniformly known market prices; and third, the economy is dynamic. These realistic assumptions describe the ‘science of complexity’ framework.

    Neoclassical economists had to focus on the variables more easily modelled given the mathematical techniques and computing power available in the 1950s. Additionally, Musgrave (1981) notes Newton simplifying the many body problem with a two body problem as a heuristic assumption to allow calculation. All the neoclassical assumptions can be seen in this light, allowing simple, mostly linear, theory to approximate the economy, which is in fact a complex system. For instance Keen (2001, pp. 175-6) quotes Jevons (1911), Clark (1898), Marshall (1920, p. xiv) and Keynes (1923) who recognise the economy as a dynamic process that is better modelled dynamically but static analysis provides a stop–gap measure until adequate technical ability arrives to model the economy dynamically.

    The static and dynamic divide between neoclassical and complexity economics is an important dimension to discuss for two reasons. First, there is a need to reconcile the inconsistency between the SMD Theorem and VL Smith’s experimental economics. Second, there is a need to explain why DSGE and CGE models, although they use the word ‘dynamic’, are still part of the ‘static’ stop–gap measure.

    There is an inconsistency between the SMD Theorem, which finds GE unstable, and VL Smith’s (2007) experimental economics, which finds stability in market prices. The static/dynamic divide provides a simple explanation for these seemingly contradictory results. The SMD Theorem is a static theory: it assumes all the buyers and sellers agree on a price at one point in time and then swap the goods. VL Smith’s (2007) markets are dynamic in that buyers and sellers continually trade. In summary, the examples show static instability and dynamic stability. Keen (2001) illustrates the difference between dynamic and static economic analysis using the analogy of learning to ride a bike. Learning how to balance on a stationary bike tells us very little about riding a moving bike. Further, it is easy to balance on a moving bike but nearly impossible on a stationary one, so why waste time learning to balance on a stationary bike when what you want is for the bike to move? Static economics has failed to find a stable GE, and modelling the dynamic economy as a static entity may well be misleading and time-wasting.

    The DSGE and CGE models use the word ‘dynamic’ but they are still part of the stop–gap static regime. Their claim to being dynamic operates at the level at which a number of pictures of a stationary bike can be combined to produce a moving film. However, each of these still pictures (each a GE) is itself unstable. CGE and DSGE models therefore produce dynamic instability, contrasting with VL Smith’s dynamic stability. Making CGE and DSGE models dynamically stable requires relaxing the methodological equilibration assumption.

    Farmer and Geanakoplos (2008, p. 54) describe neoclassical theory as “an elegant attempt to find a parsimonious model of human behaviour in economic settings. It can be criticized, though, as a quick and dirty method, a heroic attempt to simplify a complex problem. Now that we have begun to understand its limitations, we must begin the hard work of laying new foundations that can potentially go beyond it.” The computing power is available and the ‘science of complexity’ provides a suitable framework. Additionally, the ‘science of complexity’ uses scientific realism, which provides a much sounder philosophical basis than the instrumentalism of neoclassical economics.

    While Farmer and Geanakoplos (2008, p. 54) call neoclassical economics “a heroic attempt to simplify a complex problem”, Blaug (1992, pp. 238-39) criticises it for its “reluctance to produce theories that yield unambiguous refutable implications, followed by a general unwillingness to confront those implications with fact.” Given that neoclassical economics is based upon instrumentalism, whose only claim to the validity of a theory is its ability to predict, this raises the question: what, then, is neoclassical theory? Neuman (2003, p. 43) discusses the difference between social theory and ideology, noting that ideologies avoid tests and discrepant findings.

    The falsification avoidance in the following neoclassical theories very much confirms Blaug’s (1992, pp. 238-39) criticism: the Rational Expectations Hypothesis (REH), the efficient market hypothesis (EMH), General Equilibrium Theory (GET) and dynamic stochastic general equilibrium (DSGE). On REH, Prescott (1977, p. 30) claims that “Like utility, expectations are not observed, and surveys cannot be used to test the rational expectations hypothesis. One can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations.” Following his advice, REH is examined within the larger theories of DSGE, EMH and GET. Regarding DSGE, Mäki (2002, p. 42) discusses how Kydland and Prescott (1982), rather than using standard econometric testing of their model, created a new testing method in which parameters are quantified by calibrating the model, claiming these calibrated parameters yield empirical content for testing. Prescott (1988, p. 84) acknowledges that their models are “necessarily false and statistical hypothesis testing will reject them.” Regarding EMH, Barberis and Thaler (2002, p. 8) note that any test of the EMH jointly tests the discounted future cash flow model, making it difficult to provide evidence of market inefficiency. This is known as the “joint hypothesis problem”. Regarding GET, the Sonnenschein–Mantel–Debreu (SMD) Theorem finds that the aggregate excess demand curve is shapeless, proving the neoclassical assumptions are inconsistent. GET assumptions are already unrealistic, but attempts are made to fix the shapeless curve by specifying that all goods have neutral Engel curves (Keen 2001, pp. 38-42). In practice there are few if any goods with neutral Engel curves, making GET assumptions even more unrealistic.
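
    The role of the Engel-curve restriction can be seen in a small sketch. The Cobb-Douglas consumers below are illustrative assumptions of mine, not taken from Keen: when consumers share identical preferences of this kind, aggregate demand depends only on total income, so redistribution is invisible at the aggregate level; when preferences differ, the same total income yields different aggregate demand, and the single well-behaved aggregate curve is lost:

```python
# Aggregation succeeds only under restrictive Engel-curve conditions.
# Cobb-Douglas demand for good 1 is x1 = alpha * income / p1, so each
# consumer's Engel curve is a straight line through the origin.

def demand(alpha, income, p1):
    return alpha * income / p1

def aggregate(alphas, incomes, p1):
    return sum(demand(a, m, p1) for a, m in zip(alphas, incomes))

p1 = 2.0

# Identical preferences: redistributing income leaves aggregate demand
# unchanged -- only total income matters.
same = [0.5, 0.5]
d_before = aggregate(same, [80.0, 20.0], p1)
d_after = aggregate(same, [50.0, 50.0], p1)
print(d_before == d_after)   # True: aggregation works

# Different preferences: the same redistribution now changes aggregate
# demand, so no single representative demand curve exists.
diff = [0.8, 0.2]
e_before = aggregate(diff, [80.0, 20.0], p1)
e_after = aggregate(diff, [50.0, 50.0], p1)
print(e_before == e_after)   # False
```

    The restrictive case is exactly what the textbook aggregate demand curve quietly assumes; drop it and the distribution of income starts to matter.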

    The ‘science of complexity’ offers a number of approaches that could allow economics to become a science and resolve many of the current impasses. For example, Lo’s (2004) Adaptive Markets Hypothesis (AMH) reconciles the EMH and behavioural economics within an evolutionary framework. Embedding agents within a network structure based upon an ‘input-output table’ could be implemented in CGE, GET and DSGE models to improve shock transmission and realism, making them more suitable for policy development. This would in effect turn CGE, GET and DSGE into ABMs.

    Logically and philosophically the ‘science of complexity’ is an improvement over the neoclassical framework. However, initially the neoclassical framework was more computationally expedient.

    Brock and Colander (2000, p. 76) note that when economists apply neoclassical economics to policy development, many modifications are made to allow for the unrealistic assumptions and supplements are added, resulting in a pragmatic and eclectic approach. So practitioners adjust for domain-assumption problems even if these are unrecognised as such. This eclectic approach could easily accommodate Agent-Based Models (ABMs), introducing dynamic modelling into the practitioner’s toolbox. However, Dawid and Fagiolo (2008, p. 352) note that despite encouraging signs ABMs are not considered standard tools, and they consider that factors other than professional inertia in learning new tools impede the uptake of ABMs, such as concerns over empirical model validation, robustness checks and the unrestricted number of potential model parameters.

    Implementing input-output networks in CGE and DSGE models could improve policy applicability and accuracy by making the underlying assumptions more realistic. CGE is used routinely by governments even though, as discussed, it lacks microfoundations. DSGE is criticised for its poor modelling of the transmission of shocks. CGE models, including DSGE, fail to capture the network dynamics of the economy. For example, take two sectors of equal input-output size that have few and many firms respectively. The linkages and the number of firms determine the speed of transmission of any shock, potentially setting up endogenous effects.
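
    The point can be sketched with deliberately simple numbers; the sector sizes, firm counts and input coefficient below are illustrative assumptions, not calibrated values. Two upstream sectors of equal input-output size, one with two firms and one with twenty, each lose a single firm, and the downstream impact under a fixed-coefficient (Leontief-style) technology differs sharply:

```python
# Toy illustration: identical input-output size, different firm counts,
# very different shock transmission when one firm fails.

def downstream_output(upstream_output, input_coefficient, capacity):
    """Leontief-style: downstream output is limited by available input."""
    return min(capacity, upstream_output / input_coefficient)

SECTOR_OUTPUT = 100.0   # both upstream sectors produce 100 units
COEFF = 0.5             # downstream needs 0.5 units of input per unit of output
CAPACITY = 150.0        # downstream capacity absent any input constraint

for firms in (2, 20):
    shocked = SECTOR_OUTPUT * (firms - 1) / firms   # one firm fails
    out = downstream_output(shocked, COEFF, CAPACITY)
    print(firms, out)
```

    With two firms, losing one halves the sector’s output and the downstream input constraint binds; with twenty firms, the identical event is absorbed entirely. The number of firms behind an input-output linkage, not just the linkage’s size, governs how a shock propagates.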

    Brock and Colander (2000, p. 79) discuss how the complexity approach lacks a complete formal model of the economy but can be used to develop policy using informal models that describe the current economy and the effect of policy upon it.

    References

    Barberis, N & Thaler, RH 2002, ‘A Survey of Behavioral Finance’, NBER Working Paper, no. W9222, viewed 17 Feb. 2009.
    Bell, PW 2009, ‘Adaptive Interactive Expectations: Dynamically Modelling Profit Expectations’, University of Queensland.
    Blaug, M 1992, The methodology of economics, or, How economists explain, 2nd edn, Cambridge surveys of economic literature., Cambridge University Press, Cambridge, England.
    Brock, WA & Colander, D 2000, ‘Complexity and Policy’, in DC Colander (ed.), The complexity vision and the teaching of economics, E. Elgar, Cheltenham, UK ; Northampton, MA, pp. xv, 307 p.
    Clark, JB 1898, ‘The Future of Economic Theory’, The Quarterly Journal of Economics, vol. 13, no. 1, pp. 1-14.
    Colander, DC 2000, The complexity vision and the teaching of economics, E. Elgar, Cheltenham, UK ; Northampton, MA.
    Dawid, H & Fagiolo, G 2008, ‘Agent-based models for economics policy design: Introduction to the special issue’, Journal of Economic Behavior and Organization, vol. 67, no. 2008, pp. 351-4.
    Farmer, JD & Geanakoplos, J 2008, ‘The Virtues and Vices of Equilibrium and the Future of Financial Economics’, Cowles Foundation Discussion Paper No. 1647, viewed 15 Feb. 2009.
    Friedman, M 1953, Essays in positive economics, University of Chicago Press, Chicago.
    Hahn, F 1984, Equilibrium and macroeconomics, Basil Blackwell, Oxford.
    Jevons, WS 1911, The theory of political economy, 4th edn, Macmillan and co. limited, London.
    Keen, S 2001, Debunking economics: the naked emperor of the social sciences, Pluto Press, Annandale, N.S.W.
    Keynes, JM 1923, A tract on monetary reform, Macmillan, London.
    Kydland, FE & Prescott, EC 1982, ‘Time to Build and Aggregate Fluctuations’, Econometrica, vol. 50, no. 6, pp. 1345-70.
    Lo, AW 2004, ‘The Adaptive Markets Hypothesis: Market Efficiency from an Evolutionary Perspective ‘, Journal of Portfolio Management, vol. 30, no. Anniversary, pp. 15-29.
    Mäki, U 2002, Fact and fiction in economics, Cambridge University Press.
    Marshall, A 1920, Principles of economics: an introductory volume, 8th edn, Macmillan, London.
    Musgrave, A 1981, ‘”Unreal assumptions” in economic theory: the F-twist untwisted’, Kyklos, vol. 34, no. 3, pp. 377-87.
    Neuman, WL 2003, Social research methods: qualitative and quantitative approaches, 5th edn, Allyn and Bacon, Boston; London.
    Prescott, EC 1977, ‘Should Control Theory be Used for Economic Stabilization’, Carnegie-Rochester Conference Series on Public Policy, vol. 7, pp. 13-38.
    —- 1988, ‘Theory ahead of business cycle measurement’, in JE Hartley, KD Hoover & KD Salyer (eds), Real business cycles: a reader, Routledge.
    Smith, VL 2007, Markets and Experimental Economics, Library of Economics and Liberty.
    Weintraub, ER 1985, General equilibrium analysis : studies in appraisal, Historical perspectives on modern economics., Cambridge University Press, Cambridge [Cambridgeshire] ; New York.
    WordNet® 3.0 Princeton University 2009, Instrumentalism, viewed 8 April 2009.

  5. enteringthewhirlpool says:

    The Sonnenschein–Mantel–Debreu (SMD) Theorem (Debreu 1974; Mantel 1974; Sonnenschein 1972, 1973) proves that General Equilibrium has multiple equilibria.

    It does no such thing. It shows that multiple equilibria are possible even when individual agents have ‘nice’ properties.

    Kirman (1992, p. 119) notes that the uniqueness of equilibrium in General Equilibrium Theory (GET) is required to justify comparative statics and its use in policy analysis,

    No it isn’t. Even with the existence of multiple equilibria you might have strong reason to believe that you are much more likely to end up at one rather than another after a policy shift. Let’s say a country has two types of equilibria: normality and anarchy. Policymakers want to decide what happens when they raise tobacco taxes. Are they better off comparing the current situation against normality with higher tobacco taxes instituted, or are they better off comparing it to an anarchic Hobbesian hell?

    SMD says that aggregate demand will not necessarily inherit certain properties from individual demands, whatever they may be. This is an abstract statement. What SMD does not say is that it is impossible, given individual preferences as they are, to make statements about aggregate demands as they are.

  6. HopeForTheDismalScience says:

    Enteringthewhirlpool, thank you for your comments. The point of general equilibrium theory (GET), a “bottom up” model, was to find a price vector at which all markets are in equilibrium. This endeavour, known as the microfoundations project, failed; see Keen (2001) for discussion. The CGE and DSGE models are both “top down” models: they simply *assume equilibrium* but use the failed microfoundations assumptions of GET. See Mitra-Kahn (2008) for discussion of the “top-down” and “bottom-up” concepts. Neoclassical economics uses an aggregate demand curve and an aggregate supply curve, usually portrayed as straight lines that cross at one point, making it possible to discuss comparative statics. Never does a lecturer suggest that the aggregate excess demand curve is shapeless, as predicted by the SMD Theorem, but they should if they are using neoclassical assumptions. These multiple equilibria are an artefact of the GE models. Nobel laureate Vernon Smith’s (2007) experimental economics makes a related point: stable equilibria are found when participants have private information and the auction process is spread out over time. Both these conditions are violated by the GE models, which assume perfect knowledge of prices by participants and all markets clearing at a fixed time.

    I will leave you with a comment from another Nobel laureate who was a prime mover in neoclassical economics. Solow (2003) asks “So how did macroeconomics arrive at its current state? … The original impulse to look for better or more explicit micro foundations was probably reasonable. … What emerged was not a good idea. The preferred model has a single representative consumer optimizing over infinite time with perfect foresight or rational expectations, in an environment that realizes the resulting plans more or less flawlessly through perfectly competitive forward-looking markets for goods and labour, and perfectly flexible prices and wages”. What Solow (2003) is describing as “not a good idea” is the neoclassical framework, incorporating the CGE and DSGE models.

    References

    Keen, S 2001, Debunking economics: the naked emperor of the social sciences, Pluto Press, Annandale, N.S.W.
    Mitra-Kahn, BH 2008, ‘Debunking the Myths of Computable General Equilibrium Models’, Schwartz Center for Economic Policy Analysis, Working Paper 2008-1.
    Smith, VL 2007, Markets and Experimental Economics, Library of Economics and Liberty.
    Solow, RM 2003, ‘Dumb and Dumber in Macroeconomics’, paper presented to Economics for an Imperfect World, Columbia University, 24-25 October, 2003.

  7. kiyallsmith says:

    Speaking as a sociologist, I would like to point out that many of the protesters at the G-8 are not just calling for better capitalism, they are calling for another economic structure. In that case it is not appropriate for them to demand the use of more appropriate models of the current economic structure.

    Keri

  8. HopeForTheDismalScience says:

    Hello Keri

    If you know how to open the window, why smash the glass? The models that I am suggesting incorporate economic structure to account for endogenous effects; neoclassical economic models are blind to endogenous effects acting through economic structure. Beinhocker (2006, p. 185) identifies three factors that affect emergent phenomena in an economic system: exogenous inputs, the behaviour of participants and the structure of institutions. Consistent with Beinhocker’s emergent factors of behaviour and structure, Giddens (1990) states that “Society only has form, and that form only has effects on people, insofar as structure is produced and reproduced in what people do”. Giddens (1984) notes that society, an emergent phenomenon, has both a structural and an agency component: the structural environment constrains individual behaviour, but also makes it possible. There is a duality to behaviour and structure, in that behaviour affects structure and structure affects behaviour. Beinhocker (2006, p. 185) notes that traditional economics focuses on the exogenous causes of oscillations while ignoring the two endogenous causes.

    There is an obvious advantage in using models to simulate and compare alternative economic structures over the more traditional methods of settling ideological debates about economic structure, for instance fascism versus communism during World War II.

    Regarding smashing the glass, if the technocrats and economists keep feeding the politicians from all parties the same Utopian vision of the world in the image of neoclassical economics, then the usual social safety valve of democracy to vote out ruling parties becomes ineffective. This situation is dangerous for democracy and partially to blame for the G8 protests.

    It is this danger to global politics that I see stemming from economists using the wrong models. Both Marxist and neoclassical ideologies are theoretically unsound, so there is no advantage in the flawed logic of “I have proven that you are wrong, therefore I am right”. For practical and theoretical reasons, the demise of neoclassical ideology will be as significant as the demise of Marxism. However, both ideologies do and will persist, and the worst-case scenario is societies jumping from one extreme to the other, for example Russia and Venezuela.

    The most pressing question is why neoclassical economics persists in misinforming our political leaders. I will leave you with an extract from the Toxic Textbooks website that answers the question well.

    “This is a movement to encourage universities to use economics textbooks that engage honestly with the real world.

    Toxic textbooks helped cause the current economic meltdown. It did not result from natural causes or human conspiracy, but because society at all levels became infected with false beliefs regarding the nature of economic reality. And the primary sources of this infection are the “neoclassical” or “mainstream” textbooks long used in introductory economics courses in universities throughout the world.

    Every year these “mainstream” books serve to indoctrinate millions of students in a quaint ideology (perfect rationality of economic agents, market efficiency, the invisible hand, etc.) cunningly disguised as science. This mass mis-education deprives society of the moral and intellectual capacities it needs in order to design and maintain the support systems required by market economies.

    We need new, non-toxic textbooks. More economic catastrophes will befall us and our children if we do not replace our toxic textbooks with non-toxic ones immediately. If decency and good sense prevail in academia, then affecting this reform will not be a problem. But textbook reform will damage many economic faculties and toxic textbook authors. The former will suffer losses to their reputations, the latter to their royalties, which in some cases run to millions of dollars.

    Society can therefore expect well-placed and richly-funded strategic resistance to doing the right and necessary thing. This Facebook group and the website http://www.toxictextbooks.com exist to help citizens, especially students, mobilize and organize themselves to overcome these vested interests.”

    References

    Beinhocker, ED 2006, Origin of Wealth: Evolution, Complexity, and the Radical Remaking of Economics, Harvard Business School Press, Cambridge, Mass.
    Giddens, A 1984, The constitution of society: outline of the theory of structuration, University of California Press, Berkeley.
    —- 1990, The consequences of modernity, Stanford University Press, Stanford, Calif.

  1. 17th September 2009

    […] The kingdom of Supply and Demand may be a fantasy, but the mathematical wizardry is quite real. The SMD theorem proved that stability is not a general feature of markets [PDF], and Saari proved [PDF] that any […]

  2. 3rd October 2009

    […] our political systems via university economics departments indoctrinating undergraduates with the neoclassical ideology.  This article discusses how the indoctrination produces a world view which causes confusion over […]

  3. 10th October 2009

    […] our political systems via university economics departments indoctrinating undergraduates with the neoclassical ideology.  This article discusses how the indoctrination produces a world view which causes confusion over […]

  4. 16th January 2010

    […] However, Fama and French (2004) evaluate the performance of CAPM, concluding that empirical evidence invalidates the use of CAPM in applications, after finding that passive funds invested in low-beta, small or value stocks tend to produce positive abnormal returns relative to CAPM predictions. This is relevant to the EMH for two reasons: the criticisms come from Fama, the founder of the EMH, and CAPM builds on the assumptions of the EMH. Put simply, those using the CAPM invest more heavily in higher-risk investments than is optimal, which contributed to the Global Financial Crisis (GFC). If these models continue to be taught in finance courses, they will help set the stage for the next GFC. Furthermore, this evidence is another nail in the coffin of neoclassical economics; the failings of neoclassical economics are discussed further in my post “The G8 protests and the logically inconsistent foundations of neoclassical economics“. […]
