Showing posts with label economic modeling. Show all posts

Saturday, January 11, 2020

Lars P. Syll — Economics — too important to be left to economists


The problem with economics as a discipline (and this applies to some extent to all forms of economics, heterodox economics included) is "economics." That is to say, economists assume that economics is chiefly or exclusively about economic behavior, when economic behavior is embedded in social and political behavior and includes the entire "human condition."

The only "economist" that really grasped this in depth was Karl Marx, and he was a philosopher coming from a Hegelian background rather than being an "economist" in today's terminology. He understood and emphasized social embeddedness, as do the economic anthropologists, economic sociologists and institutionalists that followed. Even Keynes approached economics in terms of the existing neoclassical paradigm that prevailed, and he did it as a mathematician would since his background was it mathematics and his principle work was the Treatise on Probability.

Conversely, almost as a reaction to Marx, the economics profession got sidetracked by Alfred Marshall's emphasis on formalizing economics in an attempt to make it "scientific." This led to the presumption (hidden assumption) that economics is chiefly or even solely about economic behavior. The result was the assumption of homo economicus as a homogeneous agent behaving "rationally" to maximize self-interest in economic dealings, principally in markets, which act as calculating machines to maintain economic equilibrium through adjustments in price based on supply and demand.

The problem is that this model is based on generalizing from economic behavior at the micro level, irrespective of social embedding. Therefore, the scope of application is limited by the extremely narrow scale. Unfortunately, most economists ignore this limitation of scope and the significance of scale.

Macroeconomics cannot be scaled-up microeconomics, because behavior at the individual level is not the foundation of behavior at the macro level, as is conventionally assumed by those promoting microfoundations, for the simple reason that most people's behavior is not exclusively economic but includes social and political factors as well as ideological presumptions that differ temporally, geographically, by class, and according to affiliation and personal disposition.

Economics is embedded in society, just as individuals and their behavior are. Social networks (systems) are composed of individuals as elements, but networks (systems) also influence the individuals related to them, whether as members or as those affected peripherally. A social system is not an aggregation of individuals acting independently but rather a system in which relationships are highly influential. Therefore, economic aggregates cannot tell the whole tale.

Individuals often have conflicting interests as well. People do not always prioritize their economic interests over their social and political interests, as the assumption of homo economicus posits. Their choices are influenced by personal disposition, knowledge base, and cognitive-affective biases. At the most evident level, people who belong to different political parties and their factions think, feel, and act differently, including with respect to economic affairs and interests. For example, traditionalists regularly set tradition and traditional values above economic interests, so that they act "irrationally" from the economic point of view while not being actually irrational. They view themselves as acting on the basis of higher reasons.

While homo economicus may have a place in the study of microeconomics, at the scale of macroeconomics and political economy the agent is not homo economicus but homo socialis, and homo socialis is non-homogeneous and not necessarily economically "rational." So game theory does not apply in this case.

The fact that societies are complex adaptive systems, especially at the national and international levels, means that model-building is challenged by tractability. Many conventional approaches rest on conventions introduced to induce tractability at the expense of realism, which affects the usefulness of such models in pragmatic application.

This can result in social and political disarray serious enough to lead to conflict. But even if it does not, it can produce anti-social consequences instead of the pro-social results that free markets, free trade, and free capital flows promise on the basis of erroneous assumptions.

Many economists do not stay abreast of developments in economic anthropology, economic sociology, social science in general, evolutionary theory, and other fields that impact economics. Many do not even consider institutionalism important, and some even deprecate accounting, money & banking, and finance as relevant. Even in economic matters per se, many economists dismiss or deprecate externalities, market imperfections, distributional factors, etc. as relevant to their field. Moreover, they insist that the methodological debate is over and that they won it. End of discussion.

Economics needs to expand its horizon to remain relevant. 

Lars P. Syll’s Blog
Economics — too important to be left to economists
Lars P. Syll | Professor, Malmo University

Friday, January 10, 2020

Lars P. Syll — Does it–really–take a model to beat a model?


The implication here is "formal model." But formal models are not the only sort of models. Most models we use are conceptual, and they are mostly sufficient to the task. For example, in the Tractatus, Wittgenstein showed how a descriptive statement is a model of a fact that allows comparing the model to the fact observationally to determine its truth-value. He elaborated how the propositional calculus is used to describe many facts, using the principles of descriptive logic to construct a conceptual model.

Wittgenstein used the German term "Bild" (meaning picture) as the basis of his analysis of a proposition as a "picture" of a fact based on there being a one-to-one (logical) correspondence of the elements of a picture to those of the fact it represents. A descriptive proposition models a fact in a way (logically) similar to ordinary picturing.

This notion was not original with Wittgenstein. Wittgenstein was an engineer by training, and the Tractatus is modeled on Hertz's introduction to The Principles of Mechanics. It lays the logical foundations for philosophy of science. Philosophers who come from a different angle of approach and take it as an ontological work are mistaken about the task that Wittgenstein set himself as a logician with a background in science. Those coming from a psychological background thought that Wittgenstein was making a psychological claim about the mind creating pictures of reality in thought. This, too, is erroneous. Wittgenstein was simply elucidating how the logic of description works, since the logic of description cannot describe itself. It is something that one must come to see through logical analysis.

While the Tractatus is about symbolic logic, it is not written in symbolic logic. That does not mean it is not a highly rigorous work. Even so, very few people commenting on it have seen what Wittgenstein was doing from his own point of view rather than theirs. The comparison with commentary on MMT is striking to me. It seems that many people have difficulty moving beyond their cognitive-affective biases even when these are pointed out to them.

Of course, rigorous models are preferable to less rigorous ones where circumstances call for it. However, it is also evident that rigorous models that yield worse results than less rigorous ones are not preferable. Formalism itself is not a criterion of truth-value. Logic and math must be consistent, but consistency says nothing about correspondence or pragmatic worth.

Overemphasis on formalism at the expense of model realism and usefulness is an elementary mistake. This should not need saying in a professional setting. 

Lars P. Syll’s Blog
Does it — really — take a model to beat a model?
Lars P. Syll | Professor, Malmo University

Wednesday, August 28, 2019

A spreadsheet version of the IS/MY model (alternative to IS/LM model) — Dirk Ehnts

I hope that this model will be taken up by more colleagues as it is very clear now that the IS/LM model “does not work”. If you make it more realistic by saying that investment does not depend on the rate of interest (vertical IS curve) and that the central bank determines the interest rate (horizontal LM curve), then you will have wasted 3-4 lectures to explain the goods market (IS curve) and the money market (LM curve) only to conclude that both do not matter in practice. It is only a small step from there to conclude that teaching the IS/LM model is a waste of time. You might just say that “demand determines supply, which determines employment” and that “government spending and private investment, which both do not depend on the rate of interest, increase demand”. Your students will easily get it and you save 3-4 lectures for something else, like my IS/MY model.
Bravo! A big step in the right direction in teaching Econ 101. And it is not just Econ 101: Paul Krugman has basically stated that he uses IS/LM as his macroeconomic lens.
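For what it's worth, the "demand determines supply, which determines employment" logic can be put in a few lines of code. This is a toy calculation of my own with illustrative parameter values, not Ehnts's actual spreadsheet:

```python
# Toy income-expenditure model: investment and government spending do not
# depend on the interest rate, and demand determines output and employment.
def equilibrium_output(c0, c1, investment, gov_spending):
    """Solve Y = C + I + G with consumption C = c0 + c1*Y (0 < c1 < 1)."""
    return (c0 + investment + gov_spending) / (1 - c1)

Y = equilibrium_output(c0=20, c1=0.6, investment=50, gov_spending=30)  # 250.0
employment = Y / 2.0  # employment follows demand, given output per worker of 2
```

No IS curve, no LM curve, and the multiplier story is all there is to it.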

econoblog 101
A spreadsheet version of the IS/MY model (alternative to IS/LM model)
Dirk Ehnts | Lecturer at Bard College Berlin, research assistant at the Technical University of Chemnitz, and spokesperson of the board of Pufendorf-Gesellschaft eV in Berlin

Wednesday, August 21, 2019

Econometrics and the problem of unjustified assumptions — Lars P. Syll


This is important but may be too wonkish for those who are not intimately familiar with econometrics. So let me try to simplify it and universalize it.

The basic idea in logical reasoning is that an argument is sound if and only if the premises are true and the logical form is valid. The conclusion then follows necessarily.

This is the basis of scientific reasoning.

In modeling, a set of assumptions, both substantive and procedural, is stipulated, that is, assumed to be true. In a well-founded model, all the assumptions that make substantive claims are known to be true empirically on the basis of evidence. This is called semantic truth. The truth of the logical form is established by formal proof. This is called syntactical truth. Only the former contains substance. The latter is purely procedural.
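The distinction can be made concrete with a small sketch of my own: a brute-force truth-table check establishes syntactical validity, but soundness additionally requires the semantic truth of the premises, which no amount of formal checking can supply.

```python
from itertools import product

def is_valid(premises, conclusion, n_vars):
    """A form is valid iff every truth assignment satisfying all premises
    also satisfies the conclusion (syntactical truth, checked by brute force)."""
    return all(conclusion(*vs)
               for vs in product([True, False], repeat=n_vars)
               if all(p(*vs) for p in premises))

# Modus ponens (from P -> Q and P, infer Q) is valid as a form...
modus_ponens = is_valid([lambda p, q: (not p) or q, lambda p, q: p],
                        lambda p, q: q, n_vars=2)          # True
# ...while affirming the consequent (from P -> Q and Q, infer P) is not.
affirming = is_valid([lambda p, q: (not p) or q, lambda p, q: q],
                     lambda p, q: p, n_vars=2)             # False
# Validity checked; soundness still requires the premises to be true in fact.
```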

A key methodological assumption of the scientific method is naturalism. Being "scientific" signifies being based on observation, rather than say, intuition or "common sense," that is, self-evidence. No self-evident first principles — that's doing philosophy, not science. Not that such speculation is not useful. It's just not science and should not be conflated with science. There is often a tendency to do so.

This presents two major difficulties with scientific modeling versus philosophical speculation. The first is the empirical warrant of the starting points, the stipulations that are assumed to be true and serve as the premises of the argument. The second is knowing that all relevant information is included in the assumptions. This is called identification.

Paraphrasing Richard Feynman, we do science in order to avoid fooling ourselves and we are the easiest ones to fool (owing to confirmation bias, for example). This requires following scientific method scrupulously when substantial claims are made.

Keynes pointed out to Roy Harrod that econometrics did not conform to this strict procedure and that owing to the nature of the subject matter, economics was "moral science," which at the time signified what we would now call "philosophy." The social sciences and much of psychology fall into this category. They are basically speculative exercises that employ some formal methods that may be scientific, or not. 

Accounting is a formal method that is proto-scientific in the sense that double-entry bookkeeping is made up of tautologies. But the entries can be checked for substance against journals and inventories. It is a method to prevent fooling ourselves on one hand, and to prevent cheating on the other.

When accounting tautologies (identities) are interpreted causally, then causal explanation demands empirical corroboration through data, e.g., measurable changes in stocks and flows.
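A minimal sketch of the point (my own illustration): the identity itself is true by construction, but a causal story told on top of it has to match independently measured stocks and flows.

```python
# A stock-flow accounting identity: closing = opening + inflows - outflows.
# The identity always holds by construction (it is a tautology), but the
# recorded entries can be checked against journals and physical inventories.
def closing_stock(opening, inflows, outflows):
    return opening + sum(inflows) - sum(outflows)

books = closing_stock(opening=100, inflows=[40, 10], outflows=[30])   # 120
measured_inventory = 120   # hypothetical physical count
identity_corroborated = (books == measured_inventory)
```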

Lars P. Syll’s Blog
Econometrics and the problem of unjustified assumptions
Lars P. Syll | Professor, Malmo University

See also

Bond Economics
Comments On "Business Cycle Anatomy"
Brian Romanchuk

Wednesday, August 14, 2019

Is There Really A Trade-Off Between Inflation And Unemployment? — Brian Romanchuk

Rather than attempt to explain what the mainly neoclassical economists are going on about, I want to step back and try to translate their debate into terms that would be understood by people who do not share the same assumptions. I am pretty sure that post-Keynesian economists have a lot to say about the topic as well, but once again, they tend to be discussing wonkish points that would elude an outsider.…

I have an engineering background, and engineering is largely the science of trade-offs. I have no strong objections to qualitative discussions, but I would argue that we need to at least know the sign of the exchange ratio between two variables in order to say that there is a trade-off between them.
Very simply, if we can have a policy that lowers both the unemployment rate and the inflation rate (or at least leaves inflation unchanged), we cannot pretend there is a meaningful "trade-off" between them.
And this is hardly theoretical: in the United States, we saw a near monotonic decrease in the unemployment rate after the Financial Crisis, yet the inflation rate has done absolutely nothing interesting....
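Romanchuk's sign condition can be stated compactly. Here is a toy check of my own with made-up numbers, not his data: a genuine trade-off requires the two series to move in opposite directions, i.e., a negative sign on their co-movement.

```python
# Sign of the co-movement between two series, from period-to-period changes.
# A trade-off claim needs this sign to be negative (one rises as the other falls).
def comovement_sign(xs, ys):
    total = sum((x1 - x0) * (y1 - y0)
                for x0, x1, y0, y1 in zip(xs, xs[1:], ys, ys[1:]))
    return (total > 0) - (total < 0)   # -1, 0, or +1

# Hypothetical post-crisis pattern: unemployment falls while inflation drifts.
unemployment = [9.5, 8.0, 6.5, 5.0, 4.0]
inflation = [1.6, 1.5, 1.6, 1.5, 1.5]
sign = comovement_sign(unemployment, inflation)   # +1, so no trade-off visible
```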
Bond Economics
Is There Really A Trade-Off Between Inflation And Unemployment?
Brian Romanchuk

Monday, July 1, 2019

Lars P. Syll — The logic of economic models


Why conventional economics is a failed project. It's the approach to assumptions.

Lars P. Syll’s Blog
The logic of economic models
Lars P. Syll | Professor, Malmo University

Monday, June 24, 2019

Jason Smith — A Workers' History of the United States 1948-2020

After seven years of economic research and developing forecasting models that have outperformed the experts, author, blogger, and physicist Dr. Jason Smith offers his controversial insights about the major driving factors behind the economy derived from the data and it's not economics — it's social changes. These social changes are behind the questions of who gets to work, how those workers organize, and how workers identify politically — and it is through labor markets that these social changes manifest in economic effects. What would otherwise be a disjoint and nonsensical postwar economic history of the United States is made into a cohesive workers' history driven by women entering the workforce and the backlash to the Civil Rights movement — plainly: sexism and racism. This new understanding of historical economic data offers lessons for understanding the political economy of today and insights for policies that might actually work.…
Information Transfer Economics
A Workers' History of the United States 1948-2020
Jason Smith

Thursday, April 11, 2019

Michael Emmett Brady — Keynes’s Theory of Measurement is contained in Chapter III of Part I and in Chapter XV of Part II of the A Treatise on Probability

Abstract
Professor Yasuhiro Sakai (see 2016; 2018) has argued that there is a mysterious problem in the A Treatise on Probability, 1921, in chapter 3 on page 39 (page 42 of the 1973 CWJMK edition). He argues that there is an unsolved mystery involving this diagram that has remained unexplained in the literature.
The mystery is that Keynes does not explain what he is doing in the analysis involving the diagram starting on the lower half of page 38 and ending on page 40 of chapter III. In fact, the mystery is solved for any reader of the A Treatise on Probability who reads page 37 and the upper half of page 38 carefully and closely. Keynes explicitly states on those pages that he will give only a brief discussion of the results of his approach to measurement on pages 38-40, but will provide a detailed discussion of his approach to measurement in Part II, after which the brief discussion of the results presented on pp.38-40 will be strengthened.
The Post Keynesian (Joan Robinson, G L S Shackle, Sydney Weintraub, Paul Davidson) and Fundamentalist (Donald Moggridge, Robert Skidelsky, Gay Meeks, Anna Carabelli, Athol Fitzgibbons, Rod O'Donnell, Tony Lawson, Jochen Runde) schools of economics, as well as economists in general, such as Jan Tinbergen and Lawrence Klein, have ignored chapter XV of the A Treatise on Probability. Keynes demonstrates on pp. 161-163 of the A Treatise on Probability, in chapter XV, that his approach to measurement is an inexact approach using approximation to define interval-valued probability, which is based on the upper-lower probabilities approach of George Boole, who discussed this approach in great detail in chapters 16-21 of his 1854 The Laws of Thought. Therefore, the only conclusion possible is that the "mysterious" diagram presented on page 39 of the A Treatise on Probability is an illustration of Keynes's approximation technique using interval-valued probability, since the problem on pages 162-163 of the A Treatise on Probability explicitly works with seven "non numerical" probabilities, while the illustration of Keynes's approach using the diagram on page 39 works with six "non numerical" probabilities and one numerical. It is impossible for the diagram on page 39 to support any claim, as has been made repeatedly for the last 45 years by the Post Keynesian and Keynesian Fundamentalist schools, that Keynes's theory was an ordinal theory that could only be applied some of the time. This leads precisely to the wrong conclusion that Keynes was arguing that macroeconomic measurement, in general, was impossible in economics, which was G L S Shackle's conclusion.
An understanding of chapter XV of the A Treatise on Probability explains the conflict that existed between J M Keynes and J Tinbergen in the pages of the Economic Journal of 1939-1940. The major point of discussion, underlying all of Keynes's major points, was that Tinbergen's exact measurement approach, taken from macroscopic physics, using the Normal probability distribution's precise, exact, definite, linear, additive, and independent probabilities, was not possible given the type of data available in macroeconomics. Only an inexact approach to measurement using imprecise and indeterminate interval-valued probability was tenable.
An understanding of chapter XV of Part II of the TP explains the fundamental point of disagreement between J M Keynes and J Tinbergen over the issue of measurement. Tinbergen brought his physics background with him to the study of economics. Tinbergen believed that the exact measurement approach he had absorbed in his study of statistical physics, using additive, linear, exact, precise, definite probability distributions like the Normal or log-normal, could be used in the study of macroeconomics to provide a precise and exact explanation of business cycles. Keynes, of course, given his great overall experience in academia, industry, business, government, the stock markets, bond markets, money markets, banking, finance, and commodity futures markets, had vast experience that Tinbergen, an academic only, did not have. Keynes saw that Tinbergen's application was the wrong one, although the technique would be applicable to studies of consumption and inventories.
Wonkish.
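My gloss on the upper-lower probability idea, as a minimal modern sketch of my own rather than Boole's or Keynes's actual construction: an interval-valued probability carries a lower and an upper bound, and bounds for a conjunction follow from the Fréchet inequalities.

```python
# Interval-valued ("non numerical") probability: a probability is bracketed
# by lower and upper bounds rather than given as a single point value.
class IntervalProb:
    def __init__(self, lo, hi):
        assert 0.0 <= lo <= hi <= 1.0
        self.lo, self.hi = lo, hi

    def conj(self, other):
        """Frechet bounds on P(A and B) from bounds on P(A) and P(B)."""
        return IntervalProb(max(0.0, self.lo + other.lo - 1.0),
                            min(self.hi, other.hi))

a = IntervalProb(0.6, 0.8)
b = IntervalProb(0.7, 0.9)
c = a.conj(b)   # lo = max(0, 0.6 + 0.7 - 1) = 0.3, hi = min(0.8, 0.9) = 0.8
```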

SSRN
Keynes’s Theory of Measurement is contained in Chapter III of Part I and in Chapter XV of Part II of the A Treatise on Probability (1921;1973 CWJMK Edition): Keynes Stated That the Exposition in Chapter III of the a Treatise on Probability Was 'Brief', While the Exposition in Chapter XV, Part II, Of the a Treatise on Probability, Was 'Detailed'
Michael Emmett Brady | California State University, Dominguez Hills

Thursday, February 7, 2019

Sandwichman — "I’m not sure I follow the arithmetic here."

All of the above, of course, is simply the fleshing out of assumptions. We assumed diminishing productivity in the last hours, we assumed heightened productivity from a shorter working week, and we assumed declining marginal utility of goods and services produced. Finally, we assumed a preference for free time over a vanishingly small increment of total income. The point is that each of these assumptions was relatively modest, but when combined they "add up" to a rather substantial cumulative result.
Econospeak
"I’m not sure I follow the arithmetic here."
Sandwichman

Wednesday, November 28, 2018

Brian Romanchuk — Representative Agent Macro And Recessions

J.W. Mason kicked off the latest skirmish in the never-ending macro wars with his Jacobin article "A Demystifying Decade for Economics." (Note: at the time of writing, the article was taken down until its publication in Jacobin.) This prompted a Twitter debate about representative agent macro, which eventually led to this Beatrice Cherrier article on heterogeneous agent models. In my view, the debate about representative agent models is a red herring. Mainstream macroeconomists' main skill is in framing debates in a fashion that is congenial to the mainstream; however, the preferred framing leads to dead ends. My current research focus is on recessions, and although I have not gone too far in refreshing my survey of mainstream macro, the value of mainstream macro theory in this debate is limited....
Bond Economics
Representative Agent Macro And Recessions
Brian Romanchuk

Tuesday, November 6, 2018

Jason Smith — I'll say similar things for half the salary

Jan Hatzius made some macro projections about wages, unemployment, and inflation:
Goldman’s Jan Hatzius wrote Sunday that unemployment should continue to decline to 3% by early 2020, noting the labor market also has room to accommodate more wage growth. Hatzius predicted that average hourly earnings would likely grow in the 3.25% to 3.50% range over the next year. ... For now, Goldman has a baseline forecast of 2.3% for core PCE ...
Well, these are all roughly consistent with Dynamic Information Equilibrium Model (DIEM) forecasts from almost two years ago….
Information Transfer Economics
I'll say similar things for half the salary
Jason Smith

See also

Sectoral balance chart.

Business Insider
Goldman's Top Economist Explains The World's Most Important Chart, And His Big Call For The US Economy
Joe Weisenthal

Sunday, September 16, 2018

Asad Zaman — Simple Model Explains Complex Keynesian Concepts


Not MMT, but you may find this of interest.
In the context of the radical Macroeconomics course I am teaching, I was very unhappy with the material available that tries to explain what Keynes is saying. In attempting to explain it better, I constructed an extremely simple model of a primitive agricultural economy. This model has a lot of pedagogical value in that it can demonstrate many complex phenomena in very simple terms. In particular, Keynesian, Marxist, Classical, and Neo-Classical concepts can be illustrated and compared within our model. We will show the failure of all neoclassical concepts of labor, supply and demand, equality of marginal product, value theory — the whole she-bang — in an intuitive and easy-to-understand plausible model of a simple economy.
WEA Pedagogy Blog
Simple Model Explains Complex Keynesian Concepts
Asad Zaman | Vice Chancellor, Pakistan Institute of Development Economics and former Director General, International Institute of Islamic Economics, International Islamic University Islamabad

Thursday, September 13, 2018

Jason Smith — What do equations mean?


Jason Smith comments on J. W. Mason and Arun Jayadev on MMT and conventional economics from the point of view of scientific modeling in macro.

Information Transfer Economics
What do equations mean?
Jason Smith

George H. Blackford — Economists Should Stop Defending Milton Friedman’s Pseudo-science


Recommended reading on the history and philosophy of science, the philosophy of economics, and Milton Friedman's instrumentalism.

Evonomics
Economists Should Stop Defending Milton Friedman’s Pseudo-science
George H. Blackford | former Chair of the Department of Economics at the University of Michigan-Flint

Tuesday, August 7, 2018

Asad Zaman — Methodology of Modern Economics

My paper is a survey of the huge amount of solid empirical evidence against the utility maximization hypothesis that is at the core of all microeconomics currently being taught in economics textbooks at universities all over the world. It is obviously important, because if what it says is true, the entire field of microeconomics needs to be re-constructed from scratch. Nonetheless, it was summarily rejected by a large number of top journals before being eventually published by Jack Reardon as "The Empirical Evidence Against Neoclassical Utility Theory: A Review of the Literature," in International Journal of Pluralism and Economics Education, Vol. 3, No. 4, 2012, pp. 366-414. Speaking metaphorically, my paper documents the solid evidence that the earth is a sphere in a world where educational institutions teach the widely held belief that the earth is flat. Readers of the RWER blog will recall that when challenged on the failure of macroeconomics after the Global Financial Crisis, economists retreated to the position that while macro theory may be in bad shape, at least microeconomics is solidly grounded. My paper blows this claim out of the water. As a result, nothing is left of Macro and Micro, and of economics as a whole. This supports my earlier claim that a Radical Paradigm Shift is required to make progress — patching up existing theories cannot work...
The central question remains what is to be done. A complex response requiring coordination on multiple fronts is needed. The first step has to be the development of a coherent alternative. I have developed a two-semester micro course which starts with the Anti-Textbook of Hill & Myatt, and follows up by using two main tools OPPOSED to maximization and equilibrium. Maximization is replaced by behavioral heuristics, which allow operation in environments with genuine uncertainty. Maximization is actually IMPOSSIBLE in uncertain environments. If I have to choose between actions a and b but the outcome to me depends on unknown states of nature and unknown acts of other agents, then maximization is not possible. I cannot calculate which of the two actions will yield greater utility because the outcome depends on many things I do not know, and cannot learn. SIMILARLY, if we equip all agents with heuristic behavior and try to compute what will happen, this can only be done within an Agent-Based Model using computer simulations — which will generally lead to disequilibrium outcomes. Using these two tools to replace the central micro tools, one can go quite far in replacing standard micro with sensible alternatives....
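The heuristics-plus-simulation combination Zaman describes can be sketched minimally. This is my own toy illustration, not his course material: agents follow a satisficing rule under uncertainty, and the aggregate is obtained by simulation rather than by solving an equilibrium condition.

```python
import random

# Toy agent-based model: each agent posts a price and follows a satisficing
# heuristic (cut the price when the uncertain payoff misses an aspiration
# level) instead of maximizing anything; aggregates come from simulation.
def simulate(n_agents=100, periods=50, aspiration=1.0, seed=0):
    rng = random.Random(seed)
    prices = [1.0] * n_agents
    avg_price_path = []
    for _ in range(periods):
        for i in range(n_agents):
            payoff = prices[i] * rng.uniform(0.5, 1.5)   # genuine uncertainty
            if payoff < aspiration:       # satisficing rule, not maximization
                prices[i] *= 0.95
        avg_price_path.append(sum(prices) / n_agents)
    return avg_price_path

path = simulate()
# the simulated path is a disequilibrium trajectory, not an equilibrium point
```

Nothing here converges to a textbook equilibrium by assumption; whatever the aggregate does is an emergent result of the heuristic.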
WEA Pedagogy Blog
Methodology of Modern Economics
Asad Zaman | Vice Chancellor, Pakistan Institute of Development Economics and former Director General, International Institute of Islamic Economics, International Islamic University Islamabad

Wednesday, June 27, 2018

Lars P. Syll — The main reason why almost all econometric models are wrong

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — most important of these are additivity and linearity. Important, simply because if they are not true, your model is invalid and descriptively incorrect. And when the model is wrong — well, then it’s wrong....
Simplifying assumptions versus oversimplification.

Lars P. Syll’s Blog
The main reason why almost all econometric models are wrong
Lars P. Syll | Professor, Malmo University