Showing posts with label econometrics.

Wednesday, August 21, 2019

Econometrics and the problem of unjustified assumptions — Lars P. Syll


This is important but may be too wonkish for those who are not intimately familiar with econometrics. So let me try to simplify it and universalize it.

The basic idea in logical reasoning is that an argument is sound if and only if the premises are true and the logical form is valid. The conclusion of a sound argument then follows necessarily.

This is the basis of scientific reasoning.

In modeling, a set of assumptions, both substantive and procedural, is stipulated, that is, assumed to be true. In a well-founded model, all the assumptions that make substantive claims are known to be true empirically on the basis of evidence. This is called semantic truth. The truth of the logical form is established by formal proof. This is called syntactic truth. Only the former carries substance; the latter is purely procedural.

A key methodological assumption of the scientific method is naturalism. Being "scientific" signifies being based on observation, rather than, say, intuition or "common sense," that is, self-evidence. No self-evident first principles — that's doing philosophy, not science. Not that such speculation is not useful. It's just not science and should not be conflated with science. There is often a tendency to do so.

This presents two major difficulties with scientific modeling versus philosophical speculation. The first is the empirical warrant of the starting points, the stipulations that are assumed to be true and serve as the premises of the argument. The second is knowing that all relevant information is included in the assumptions. This is called identification.

Paraphrasing Richard Feynman, we do science in order to avoid fooling ourselves, and we are the easiest ones to fool (owing to confirmation bias, for example). This requires following the scientific method scrupulously when substantive claims are made.

Keynes pointed out to Roy Harrod that econometrics did not conform to this strict procedure and that owing to the nature of the subject matter, economics was "moral science," which at the time signified what we would now call "philosophy." The social sciences and much of psychology fall into this category. They are basically speculative exercises that employ some formal methods that may be scientific, or not. 

Accounting is a formal method that is proto-scientific in the sense that double entry is made up of tautologies. But the entries can be checked for substance against journals and inventories. It is a method to prevent fooling ourselves on one hand, and to prevent cheating on the other.

When accounting tautologies (identities) are interpreted causally, then causal explanation demands empirical corroboration through data, e.g., measurable changes in stocks and flows.
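
The check is simple in principle. Here is a minimal sketch of such a stock-flow consistency check in Python (my illustration, with hypothetical figures, not taken from any actual accounts):

periods = ["Q1", "Q2", "Q3", "Q4"]
opening_stock = [100.0, 110.0, 125.0, 120.0]   # e.g., a debt stock at the start of each period
closing_stock = [110.0, 125.0, 120.0, 130.0]   # the same stock at the end of each period
net_flow      = [10.0, 15.0, -5.0, 8.0]        # recorded borrowing minus repayments

# The identity says closing - opening = net flow. Where the data violate it,
# either the stocks or the flows are mismeasured, and a causal story built on
# the identity has no empirical footing for that period.
for t, label in enumerate(periods):
    implied_flow = closing_stock[t] - opening_stock[t]
    status = "consistent" if abs(implied_flow - net_flow[t]) < 1e-9 else "check the data"
    print(f"{label}: stock change {implied_flow:+.1f} vs recorded flow {net_flow[t]:+.1f} -> {status}")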

Lars P. Syll’s Blog
Econometrics and the problem of unjustified assumptions
Lars P. Syll | Professor, Malmo University

See also

Bond Economics
Comments On "Business Cycle Anatomy"
Brian Romanchuk

Monday, July 8, 2019

My Journey from Theory to Reality — Asad Zaman

Over the twenty years that I have been pursuing an Islamic approach — focusing on the production of USEFUL knowledge, I have managed to heal all three of these divides. This happens naturally, when you focus on solution of real world problems. You automatically need to combine information coming from many different specialization areas. You need to use reasoning and also intuition. You also need to use both theory and its applications to the real world experiences. This leads to substantial changes in the subject matter itself. I have applied this approach with great success to Econometrics, Statistics, Microeconomics, Macroeconomics, Experimental Economics, and even Mathematics itself. I am in process of creating textbooks and teaching materials in all of these areas. Because my work is most advanced in the area of Statistics, I am working on putting it all together in a new course on Real Statistics: An Islamic Approach. There is a large amount of pre-existing material – lectures, texts, exercises, references – that I have created over the past decade on working on this course. However, as I progress, I keep learning new things, and this time I want to put together a polished new version of this course for public use. My primary target audience is teachers of statistics — I would like to persuade them to use this new approach to teach statistics. Those who would like to follow my progress as I construct a new website on a lecture by lecture basis gradually are encouraged to fill in the following Registration form. I will use emails to notify them when I complete a new lecture, and also invite feedback on what is there, so that we can build it up with clarity and consensus....
All thinking, since humans think in language, is based on context, meaning being determined by context. The shaper of context is the worldview in which the group is functioning. In the West, the contemporary worldview was shaped by Western history, chiefly Greek thought, Judaeo-Christian religion, Roman law, and modern science. Its intellectual products were shaped by the Western intellectual tradition that culminated most recently in the rise of science, which now supervenes over what preceded it. The basic assumption of the Western scientific worldview is methodological naturalism, which many if not most of the foremost exponents equate with metaphysical materialism.

This is taking place within the overarching worldview of Western liberalism, which was developed in the 18th century as an antidote to theological dogmatism. Scientific naturalism and the ideal of unified scientific explanation, or consilience, replaced the great chain of being as the dominant paradigm of explanation.

Regarding social, political and economic thought, many if not most of the foremost authorities equate economic liberalism with Western capitalism as the dominant mode of production, and also view political liberalism in the form of representative democracy as being determined by capitalism as economic liberalism. Initially, economic liberalism implied laissez-faire and sought to replace government by the market. Subsequently, when it became clear that government was needed for institutional structure, classical economic liberalism shifted to neoliberalism, which is the view that economic and financial interests should control government and direct institutional arrangements and operations toward furthering economic interests.

While the West is still the most influential bloc worldwide, that is beginning to change. The rest of the world had accepted the assumptions on which this worldview is based owing to the success of the West. Now many are beginning to question whether these assumptions are as robust as they seemed, as problems arise and the paradoxes of liberalism manifest.

Consequently, some of those who had previously accepted the Western stance and were also educated in it are beginning to rethink their positions in light of the traditional worldviews that prevail in their societies. Many of these traditional worldviews are embedded in religious contexts that have become cultural. Even in secular China, President Xi is resurrecting Confucius as a cultural icon, and in the supposedly secular US, dominant religious groups are asserting influence more openly, with science itself subject to challenge when it is perceived to conflict with tradition.

Asad Zaman's post is a good example of this rising trend, as well as of what a highly educated person asking such questions might do about it. This process is an iteration of the historical dialectic as liberalism and traditionalism interact to forge a complementary Zeitgeist that moves history forward a step.

What should a "good" liberal think about this? Freedom of thought and expression are fundamental to liberalism, and this implies tolerance. So the answer is given by none other than Mao Tse-Tung: "Let a hundred flowers bloom."

Asad Zaman makes one other point worth sharing for those that may not choose to read his post in full.
Sometime during this process of switching from teaching theory to teaching how to solve real world problems, I came across the “Statistics” textbook of David Freedman. This textbook actually implemented exactly this idea that I had come to believe in — do statistics in context of solving real world problems. One amazing characteristic of this textbook is that it has no mathematical formula – ZERO. Freedman explained that students use formulae as crutches to prevent them from thinking. So he explains all concepts in words only, exactly the same insight that I had learnt on my own. Formulas teach you techniques for calculation. We don’t need these techniques — leave them to the computer. We need to UNDERSTAND what these calculations mean. That is a VERY DIFFERENT process. I got involved in an email correspondence with David Freedman, who had very similar experience to mine. He had started out as a very heavily mathematically oriented researcher. His early papers are all very heavy mathematically. Later, when he got involved in doing some testimony in real world court cases, he realized that all of the theory he had learnt was useless in the real world. This is because the assumptions we make in theory are almost always false in the real world. Then he had to learn how to do real world statistics, exactly as I have had to do. Since most fancy assumptions we make in statistics and econometrics are wrong, we need to learn how to do simple and basic inferences, which actually makes life much easier for students of the subject — we need to teach them basic and intuitive things, not complex models and math....
An Islamic Worldview
My Journey from Theory to Reality
Asad Zaman | Vice Chancellor, Pakistan Institute of Development Economics and former Director General, International Institute of Islamic Economics, International Islamic University Islamabad

Thursday, April 11, 2019

Michael Emmett Brady — Keynes’s Theory of Measurement is contained in Chapter III of Part I and in Chapter XV of Part II of the A Treatise on Probability

Abstract
Professor Yasuhiro Sakai (see 2016; 2018) has argued that there is a mysterious problem in the A Treatise on Probability, 1921, in chapter 3 on page 39 (page 42 of the 1973 CWJMK edition). He argues that there is an unsolved mystery that involves this diagram that has remained unexplained in the literature.
The mystery is that Keynes does not explain what he is doing in the analysis involving the diagram starting on the lower half of page 38 and ending on page 40 of chapter III. In fact, the mystery is solved for any reader of the A Treatise on Probability who reads page 37 and the upper half of page 38 carefully and closely. Keynes explicitly states on those pages that he will give only a brief discussion of the results of his approach to measurement on pages 38-40, but will provide a detailed discussion of his approach to measurement in Part II, after which the brief discussion of the results presented on pp.38-40 will be strengthened.
The Post Keynesian (Joan Robinson, G L S Shackle, Sydney Weintraub, Paul Davidson) and Fundamentalist (Donald Moggridge, Robert Skidelsky, Gay Meeks, Anna Carabelli, Athol Fitzgibbons, Rod O’Donnell, Tony Lawson, Jochen Runde) schools of economics, as well as economists, in general, such as Jan Tinbergen and Lawrence Klein, have ignored chapter XV of the A Treatise on Probability. Keynes demonstrates on pp.161-163 of the A Treatise on Probability in chapter XV that his approach to measurement is an inexact approach to measurement using approximation to define interval valued probability, which is based on the upper-lower probabilities approach of George Boole, who discussed this approach in great detail in chapters 16-21 of his 1854 The Laws of Thought. Therefore, the only conclusion possible is that the “mysterious” diagram presented on page 39 of the A Treatise on Probability is an illustration of Keynes’s approximation technique using interval valued probability, since the problem on pages 162-163 of the A Treatise on Probability explicitly works with seven “non numerical” probabilities while the illustration of Keynes’s approach using the diagram on page 39 works with six “non numerical” probabilities and one numerical. It is impossible for the diagram on page 39 to support any claim, as has been done repeatedly for the last 45 years by the Post Keynesian and Keynesian Fundamentalist schools, that Keynes’s theory was an ordinal theory that could only be applied some of the time. This leads precisely to the wrong conclusion that Keynes was arguing that macroeconomic measurement, in general, was impossible in economics, which was G L S Shackle’s conclusion.
An understanding of chapter XV of the A Treatise on Probability explains the conflict that existed between J M Keynes and J Tinbergen on the pages of the Economic Journal of 1939-1940. The major point of discussion, underlying all of Keynes’s major points, was that Tinbergen’s exact measurement approach, taken from macroscopic physics, using the Normal probability distribution’s precise, exact, definite, linear, additive, and independent probabilities, was not possible given the type of data available in macroeconomics. Only an inexact approach to measurement using imprecise and indeterminate interval valued probability was tenable.
An understanding of chapter XV of Part II of the TP explains the fundamental point of disagreement between J M Keynes and J Tinbergen over the issue of measurement. Tinbergen brought his physics background with him to the study of economics. Tinbergen believed that the exact measurement approach that he had absorbed in his study of statistical physics, using additive, linear, exact, precise, definite probability distributions like the Normal or log normal, could be used in the study of macroeconomics and would provide a precise and exact explanation of business cycles. Keynes, of course, given his great, overall experience in academia, industry, business, government, the stock markets, bond markets, money markets, banking, finance, and commodity futures markets, had vast experience that Tinbergen, an academic only, did not have. Keynes saw that Tinbergen’s application was the wrong one, although the technique would be applicable to studies of consumption and inventories.
Wonkish.
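
For readers unfamiliar with upper-lower probabilities, here is a minimal sketch of the basic idea in the Boole-Keynes spirit (my illustration using the standard Boole-Fréchet bounds, not Brady's or Keynes's own construction): when only the marginal probabilities are known, the probability of a combined event is bounded by an interval rather than pinned to a point value. The marginal values are hypothetical.

def conjunction_bounds(p_a, p_b):
    # Boole-Frechet bounds on P(A and B) when only P(A) and P(B) are known
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

def disjunction_bounds(p_a, p_b):
    # Boole-Frechet bounds on P(A or B) when only P(A) and P(B) are known
    return max(p_a, p_b), min(1.0, p_a + p_b)

p_a, p_b = 0.7, 0.6
print("P(A and B) lies in the interval", conjunction_bounds(p_a, p_b))  # (0.3, 0.6)
print("P(A or B) lies in the interval", disjunction_bounds(p_a, p_b))   # (0.7, 1.0)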

SSRN
Keynes’s Theory of Measurement is contained in Chapter III of Part I and in Chapter XV of Part II of the A Treatise on Probability (1921;1973 CWJMK Edition): Keynes Stated That the Exposition in Chapter III of the a Treatise on Probability Was 'Brief', While the Exposition in Chapter XV, Part II, Of the a Treatise on Probability, Was 'Detailed'
Michael Emmett Brady | California State University, Dominguez Hills

Saturday, October 13, 2018

Nick Hanauer — How to Destroy Neoliberalism: Kill ‘Homo Economicus’

I believe that these corrosive moral claims derive from a fundamentally flawed understanding of how market capitalism works, grounded in the dubious assumption that human beings are “homo economicus”: perfectly selfish, perfectly rational, and relentlessly self-maximizing. It is this behavioral model upon which all the other models of orthodox economics are built. And it is nonsense.
The last 40 years of research across multiple scientific disciplines has proven, with certainty, that homo economicus does not exist. Outside of economic models, this is simply not how real humans behave. Rather, Homo sapiens have evolved to be other-regarding, reciprocal, heuristic, and intuitive moral creatures. We can be selfish, yes—even cruel. But it is our highly evolved prosocial nature—our innate facility for cooperation, not competition—that has enabled our species to dominate the planet, and to build such an extraordinary—and extraordinarily complex—quality of life. Pro-sociality is our economic super power.
Economists are not wrong when they attribute the material advances of modernity to market capitalism’s genius for self-organizing an increasingly complex and intricate division of knowledge, knowhow, and labor. But it’s important to recognize that the division of labor was not invented in the pin factories of Adam Smith’s eighteenth century Scotland; at some level, it has been a defining feature of all human societies since at least the cognitive revolution. Even our least complex societies, small bands of hunter-gatherers, are characterized by a division of labor—hunting and gathering—if largely along gender lines. The division of labor is a trait that is universal to our prosocial species.
Viewed through this prosocial lens, we can see that the highly specialized division of labor that characterizes our modern economy was not made possible by market capitalism. Rather, market capitalism was made possible by our fundamentally prosocial facility for cooperation, which is all the division of labor really is.…
The following observation is critical.
This dispute over behavioral models has profound non-academic consequences. Many economists, while acknowledging its flaws, still defend homo-economicus as a useful fiction—a tool for modeling and understanding the economic world. But it is much more than just an economic model. It is also a story we tell ourselves about ourselves that gives both permission and encouragement to some of the worst excesses of modern capitalism, and of contemporary moral and social life...
While models purport to be descriptive, they function as metaphors. All models are limited in the interest of economy and tractability. It is simply not possible to construct a complete description of a system that is complicated, let alone complex. The purpose of the model is to isolate important relationships and regularities, using the model as an analogy, whether its construction is conceptual or mathematical. The question is then how useful the model is in elucidating relations and regularities that are not evident without analysis.

There is nothing inherently wrong with exploring a domain using all models that may be useful in this regard. Those that are actually useful will be used and eventually the others will be discarded or supplanted.

The economic model based on homo economicus has outlived its usefulness for several reasons. The first is a descriptive issue. The second is a normative one.

The first is that the scope of such models is too limited to provide much useful information. They assume humans in the "state of nature" following evolutionary principles based on "survival of the fittest" through competition in a symmetrical environment. Thus the appeal to "spontaneous natural order" on the condition that "imperfections," such as prosocial policy, are minimized.

This state of affairs doesn't apply to modern societies and their embedded economies, which are highly influenced by culture and institutions. This means that in econometrics important information will be put aside for modeling convenience, e.g., in the interest of mathematical tractability. The result is that model equilibria may not reflect observed events accurately. This is accounted for by invoking ceteris paribus although conditions are actually changing, by positing constants where conditions call for variables, and by appealing to an indefinite "long run."

The second is more serious because it is normative. Conclusions that purport to be positive are used normatively and prescriptively. This is especially the case when models use technical terms taken from ordinary language. Even if the terms are defined operationally, the ordinary language meaning comes along, altering not only the denotation but also the connotation. For example, "debt" as a liability becomes "debt" as something bad, dangerous, and to be avoided.

None of this is in any way "scientific," regardless of the trappings of scientific terminology.
If we accept that it is true—if we internalize that most people are mostly selfish—and then we look around the world at all of the unambiguous prosperity and goodness in it, then it follows logically, it must be true, by definition, that a billion individual acts of selfishness magically transubstantiate into prosperity and the common good. If it is true that humans really are just selfish maximizers, then selfishness must be the cause of prosperity. And it must be true that the more selfish we are, the more prosperous we all become. Under this logical construct, the only good decision is a business decision—“Greed is good”—and the only purpose of the corporation must be to maximize shareholder value, humanity be damned. Welcome to our neoliberal world.
But if, instead, we accept a prosocial behavioral model that correctly describes human beings as uniquely cooperative and intuitively moral creatures, then logically, the golden rule of economics must be the Golden Rule: Do business with others as you would have them do business with you. This is a story about ourselves that grants us permission and encouragement to be our best selves. It is a virtuous story that also has the virtue of being true....
Unfortunately, Hanauer then concludes, without justification, that capitalism is the solution rather than the problem. The problem is the approach to capitalism.
Capitalism is the greatest problem-solving social technology ever invented. But knowing that capitalism works is different than knowing why it works. And contrary to economic orthodoxy, it is reciprocity, not selfishness that guides it—indeed—as if by an invisible hand. It is social reciprocity that builds the high levels of trust necessary for large networks of people to cooperate at scale. And it is only through these networks of highly-cooperative specialists that the complexity that defines our modern economy can emerge....
I argue that this cannot be true and it is contradicted by what he said previously.

It is generally agreed that there are three major factors of economic production — capital, land and labor. Rent is income generated without productive work. Feudalism is a system that favors the ownership of land and extraction of rent through agriculture. Capitalism is a system that favors the ownership of industrial and finance capital and extraction of rent through ownership of capital.

What is needed instead of a rejiggering of capitalism is an integrated system that balances capital, land and labor, that is, the means of production with people and the environment. This is different from most definitions of socialism and might be termed holism or ecologism, or some such term that denotes a condition of harmony, balance, and wellbeing of people and the planet.

What is required is a vision of possibilities and plans to actualize them. This design process may be speeded up by necessity as climate change begins to bite down harder.

Evonomics
How to Destroy Neoliberalism: Kill ‘Homo Economicus’ — Debunking the failed paradigm of traditional economics
Nick Hanauer

Wednesday, June 27, 2018

Lars P. Syll — The main reason why almost all econometric models are wrong

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — most important of these are additivity and linearity. Important, simply because if they are not true, your model is invalid and descriptively incorrect. And when the model is wrong — well, then it’s wrong....
Simplifying assumptions versus oversimplification.
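
To make the additivity and linearity point concrete, here is a minimal sketch (my illustration, not Syll's) of fitting a straight line to data generated by a nonlinear process; the data-generating process and parameter values are hypothetical. The residuals are systematically related to the regressor, so the model is wrong rather than merely imprecise, and more data will not fix it.

import random
random.seed(0)

# Data generated by a quadratic process, to be fit with a (misspecified) linear model
n = 200
x = [random.uniform(0.0, 4.0) for _ in range(n)]
y = [1.0 + 0.5 * xi**2 + random.gauss(0.0, 0.5) for xi in x]

# Ordinary least squares for the line y = a + b*x
mean_x = sum(x) / n
mean_y = sum(y) / n
b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / sum((xi - mean_x) ** 2 for xi in x)
a = mean_y - b * mean_x

# Residuals are positive at both ends of the range and negative in the middle:
# structured error that signals a wrong functional form, not mere imprecision.
def mean_residual(lo, hi):
    r = [yi - (a + b * xi) for xi, yi in zip(x, y) if lo <= xi < hi]
    return sum(r) / len(r)

print(f"fitted line: y = {a:.2f} + {b:.2f}x")
for lo, hi in [(0.0, 1.0), (1.5, 2.5), (3.0, 4.0)]:
    print(f"mean residual on [{lo}, {hi}): {mean_residual(lo, hi):+.2f}")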

Lars P. Syll’s Blog
The main reason why almost all econometric models are wrong
Lars P. Syll | Professor, Malmo University

Wednesday, March 14, 2018

Brian Romanchuk — The Curious Profit Accounting Of DSGE Models

One of the more puzzling aspects of neo-classical economic theory is the assertion that profits are zero in equilibrium under the conditions that are assumed for many models. One should re-interpret this statement as "excess profits" are zero, but there are still some awkward aspects to the treatment of profits in standard macro models. This article works through the theory of profits for an example dynamic stochastic general equilibrium (DSGE) model, and discusses the difficulties with the mathematical formulation.
The example is taken from Chapter 16 ("Optimal Taxation With Commitment") in the textbook Recursive Macroeconomic Theory, by Lars Ljungqvist and Thomas J. Sargent (I have the third edition). For brevity, the text will be abbreviated as [LS2012] herein. If the reader is mathematically trained and wishes to delve into DSGE models, this textbook is the best place to start. The mathematics is closer to the original optimal control theory that DSGE macro is based upon, whereas other treatments follow the mathematical standards of academic economics, the difficulties with which are discussed later in this article....
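
For readers who want the arithmetic behind the "zero (excess) profits in equilibrium" claim, here is a minimal numerical sketch (my illustration, not taken from the article or from [LS2012]): with constant returns to scale and factors paid their marginal products, factor payments exhaust output, so nothing is left over as profit. The Cobb-Douglas technology and parameter values are hypothetical.

alpha, K, L = 0.3, 50.0, 100.0     # Cobb-Douglas capital share and factor inputs
Y = K ** alpha * L ** (1 - alpha)  # output under constant returns to scale

r = alpha * Y / K                  # competitive rental rate = marginal product of capital
w = (1 - alpha) * Y / L            # competitive wage = marginal product of labor

profit = Y - r * K - w * L         # what remains after paying both factors
print(f"output {Y:.3f} = capital income {r * K:.3f} + labor income {w * L:.3f}; profit = {profit:.1e}")
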
Bond Economics
The Curious Profit Accounting Of DSGE Models
 Brian Romanchuk

See also

Lars P. Syll’s Blog
Ricardian equivalence — nothing but total horseshit!
Lars P. Syll | Professor, Malmo University

Wednesday, December 13, 2017

Andrew Gelman — Yes, you can do statistical inference from nonrandom samples. Which is a good thing, considering that nonrandom samples are pretty much all we’ve got.

To put it another way: Sure, it’s fine to say that you “cannot reach external validity” from your sample alone. But in the meantime you still need to make decisions. We don’t throw away the entire polling industry just cos their response rates are below 10%; we work on doing better. Our samples are never perfect but we can make them closer to the population.
Remember the Chestertonian principle that extreme skepticism is a form of credulity.
Making assumptions is necessary. However, it is also necessary to recognize and acknowledge limitations. No matter how rigorous the math, formal modeling is never more accurate than its assumptions permit.

Reasoning is a tool of intelligence. It is not a magic wand. Taking reasoning for a magic wand because it is highly formalized is magical thinking.

It is important to distinguish necessity from contingency. Necessity is based on logical necessity (tautology) and logical impossibility (contradiction). These are purely syntactical, that is, based on applying rules to signs. Logical necessity is probability one; contradiction is probability zero. All description is contingent on observation.

Statistics is a reasoning tool for dealing with contingency. The formal aspect of the tool does not vary, but its application is dependent on assumption and measurement. Thinking that the results will be the same owing to the invariant formal aspect is a mistake. Results can never be more precise than measurements or more accurate than assumptions permit, no matter how rigorous the formal methods applied.
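
Gelman's "we work on doing better" typically means adjusting a nonrandom sample toward known population margins. Here is a minimal sketch of the simplest such adjustment, post-stratification (my illustration, not Gelman's code, with hypothetical figures). Note that the adjustment itself rests on the assumption that respondents within each cell resemble the non-respondents in that cell, which is exactly the kind of assumption whose limits have to be acknowledged.

# Known population shares for an age breakdown (e.g., from a census)
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# A nonrandom sample in which older respondents are heavily overrepresented
sample = {
    "18-34": {"n": 50, "mean_response": 0.40},
    "35-54": {"n": 150, "mean_response": 0.50},
    "55+": {"n": 300, "mean_response": 0.65},
}

n_total = sum(cell["n"] for cell in sample.values())

# The raw estimate reflects the sample's composition; the post-stratified
# estimate reweights each cell to its known population share.
raw = sum(cell["n"] * cell["mean_response"] for cell in sample.values()) / n_total
adjusted = sum(population_share[g] * sample[g]["mean_response"] for g in sample)

print(f"raw sample estimate:      {raw:.3f}")
print(f"post-stratified estimate: {adjusted:.3f}")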

Statistical Modeling, Causal Inference, and Social Science
Yes, you can do statistical inference from nonrandom samples. Which is a good thing, considering that nonrandom samples are pretty much all we’ve got.
Andrew Gelman | Professor of Statistics and Political Science and Director of the Applied Statistics Center, Columbia University

Lars P. Syll — The DSGE quarrel


Quote by Silvia Merler/Bruegel mentioning Lars, with a shoutout to Brian Romanchuk.

Lars P. Syll’s Blog
The DSGE quarrel
Lars P. Syll | Professor, Malmo University

More from Lars

Economic history — a victim of economics imperialism

Empirical economics and statistical power

Saturday, September 16, 2017

Lars P. Syll — Stiglitz and the full force of Sonnenschein-Mantel-Debreu


Just why is anyone still going to these people for policy advice, let alone putting some of them in charge of setting policy?

The power of elite discourse to persuade is dangerous when an elite controls the frame and there is no accountability for results.

Lars P. Syll’s Blog
Stiglitz and the full force of Sonnenschein-Mantel-Debreu
Lars P. Syll | Professor, Malmo University


Saturday, June 24, 2017

Lars P. Syll — What is a statistical model?


Best explanation or best guess?
As Bertrand Russell put it at the end of his long life devoted to philosophy, “Roughly speaking, what we know is science and what we don’t know is philosophy.” In the scientific context, but perhaps not in the applied area, I fear statistical modeling today belongs to the realm of philosophy. — Rudolf Kalman
This is important. Since the scientific revolution the progress of knowledge has been moving information from best guess using reasoning (philosophy) to best explanation based on testing hypotheses of a theory against data obtained from observations (science).

Science is true explanation, where truth is established by setting criteria and testing claims against them.

Philosophy asks why, that is, for reasons. Science asks how, that is, for mechanisms. 

Philosophical accounts are based chiefly on reasoning from assumptions and are therefore speculative.

Scientific explanations are based on how things stand and move in terms of observable relationships, ideally able to be expressed formally. Therefore, science is, well, scientific.

See Richard Feynman, Cargo Cult Science. Engineering and Science, Volume 37:7, June 1974. (pdf) (This is a seminal article in philosophy of science, and as a bonus for reading it, it's funny, too.)

A lot of economic, social science and political science fits Feynman's description of cargo cult science. 

A lot of putative scientific knowledge is actually philosophy, that is, speculation, because it rests on reasoning from assumptions rather than explanation based on rigorous observation. However, some of it is flawed even formally, because the reasoning process is invalid.

Doing good philosophy is difficult because it is easy to fool oneself by falling into logical traps, as Ludwig Wittgenstein spent the later part of his life exploring.
Philosophy is a battle against the bewitchment of our intelligence by means of our language. — Philosophical Investigations, § 109
Thus, philosophy in Wittgenstein's sense is a prerequisite to critical thinking.

Doing real science is harder because it requires careful attention not only to logic but also to data collection and to processing data into information, keeping the signal-to-noise ratio within tight boundaries. As Feynman points out, a lot of putative science is mostly just noise.
The first principle is that you must not fool yourself--and you are the easiest person to fool. So you have to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists. You just have to be honest in a conventional way after that.
I would like to add something that's not essential to the science, but something I kind of believe, which is that you should not fool the layman when you're talking as a scientist. I am not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you're not trying to be a scientist, but just trying to be an ordinary human being. We'll leave those problems up to you and your rabbi. I'm talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you are maybe wrong, that you ought to have when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen. Feynman, "Cargo Cult Science," cited above.
Lars P. Syll’s Blog
What is a statistical model?
Lars P. Syll | Professor, Malmo University

Friday, June 16, 2017

Lars P. Syll — What is it that DSGE models — really — explain?

‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything else than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge no in any way decisive empirical evidence has been presented.
No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, they do not push economic science forwards one single millimeter if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not say anything about real world economies.
Proving things ‘rigorously’ in DSGE models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.
Mainstream economists think there is a gain from the DSGE style of modeling in its capacity to offer some kind of structure around which to organise discussions. To me that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.
Lars P. Syll’s Blog
What is it that DSGE models — really — explain?
Lars P. Syll | Professor, Malmo University

Friday, February 3, 2017

Lars P. Syll — RBC models — nonsense on stilts

I don’t think that there is a way to write down any model which at one hand respects the possible diversity of agents in taste, circumstances, and so on, and at the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it. — James Tobin
Determining causality is a bitch in social science since many factors generally contribute to causality involving social behavior. Studying a single individual and making assumptions about future behavior based on habits and revealed preferences might hold but transferring this to groups of individuals involves the fallacy of composition. This makes the assumption of methodological individualism and microfoundations problematic.

Assuming methodological individualism ignores that regularity in social behavior is more likely induced by stable institutional arrangements than by individual factors, and it involves assumptions of homogeneity that rather obviously do not hold in the real world.

Choosing a single variable or a few variables as causal factors operating universally and timelessly to produce regular results is seldom realistic. This is clearly done for convenience, to make the math tractable, rather than as a matter of induction based on empirical data or abduction based on reasoning to the best explanation. 

In many cases the process of identifying assumptions in conventional economics seems to be driven by ideology, with conclusions supported by authority, which is justification by power and gatekeepers rather than by either reasoning or evidence.

In addition to the problem of identifying assumptions, there is also the issue of assuming ergodicity (the time average of a single process is equal to the ensemble average) for processes that are conditioned historically and dynamically.
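
To see what the ergodicity assumption buys and what it conceals, here is a minimal sketch (my illustration, not from the post) of a multiplicative process for which the ensemble-average growth rate and the time-average growth rate disagree; the growth factors are hypothetical.

import random
random.seed(1)

up, down = 1.5, 0.6   # each period, wealth is multiplied by 1.5 or 0.6 with equal odds
periods = 1000

# Ensemble average of the one-period growth factor: 0.5*1.5 + 0.5*0.6 = 1.05 > 1
ensemble_factor = 0.5 * up + 0.5 * down

# Time-average growth factor along one long trajectory: close to sqrt(1.5 * 0.6) ~ 0.95 < 1,
# so the typical individual history shrinks even though the ensemble mean grows.
wealth = 1.0
for _ in range(periods):
    wealth *= up if random.random() < 0.5 else down
time_average_factor = wealth ** (1.0 / periods)

print(f"ensemble-average growth factor per period: {ensemble_factor:.3f}")
print(f"time-average growth factor (one long run): {time_average_factor:.3f}")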

Moreover, the greater the scope the less accurate the solution is likely to be. This is a reason that social sciences have tended to focus on case studies rather than general theories.

Lars P. Syll’s Blog
RBC models — nonsense on stilts
Lars P. Syll | Professor, Malmo University