Tuesday, May 2, 2017

Jason Smith — The reason for the proliferation of macro models?

The situation Noah [Smith] describes is just baffling to me. You supposedly had some data you were looking at that gave you the idea for the model, right? Or do people just posit "what-if" models in macroeconomics ... and then continue to consider them as .... um, plausible descriptions of how the world works ... um, without testing them???
Information Transfer Economics
The reason for the proliferation of macro models?
Jason Smith

27 comments:

Matthew Franko said...

"Or do people just posit "what-if" models in macroeconomics ... and then continue to consider them as .... um, plausible descriptions of how the world works"

This is done all the time in Rationalist Science; it should not be surprising...

https://plato.stanford.edu/entries/rationalism-empiricism/


You can't just say "rationalism can't be used in economics, but everywhere else it is fine...."

Ralph Musgrave said...

The reason for the proliferation of macro models is that the purpose of macro models is to keep macro modelers employed at the taxpayer's expense. And since lots of people want to be supported by taxpayers, the increase in the number of macro modelers will continue till the entire workforce is employed doing macro modeling. But by that time robots will be doing all the work so it won't matter.

Quite easy really.

Brian Romanchuk said...

Ralph,
I already have algorithms for generating economics models. They're on GitHub.

Matt Franko said...

Stochastic "models" are not models...

Bob said...

Is a voodoo doll a model?

Matt Franko said...

How do you "test" a stochastic model?

Tom Hickey said...

Deterministic and Stochastic Models
SAS/STAT(R) 9.2 User's Guide, Second Edition

Tom Hickey said...

Deterministic vs. probabilistic (stochastic): A deterministic model is one in which every set of variable states is uniquely determined by parameters in the model and by sets of previous states of these variables; therefore, a deterministic model always performs the same way for a given set of initial conditions. Conversely, in a stochastic model—usually called a "statistical model"—randomness is present, and variable states are not described by unique values, but rather by probability distributions.

https://en.wikipedia.org/wiki/Mathematical_model#Classifications
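The contrast in that definition can be made concrete with a minimal sketch (a hypothetical growth process with illustrative parameters, not drawn from any model in the thread): the deterministic version returns the same value for the same initial conditions every time, while the stochastic version returns a distribution of outcomes.

```python
import random

def deterministic(x0, rate, steps):
    # Every set of states is uniquely determined by the parameters:
    # same initial conditions always produce the same trajectory.
    x = x0
    for _ in range(steps):
        x *= 1 + rate
    return x

def stochastic(x0, rate, sigma, steps, rng):
    # Randomness enters each step, so repeated runs yield a
    # probability distribution of outcomes, not a unique value.
    x = x0
    for _ in range(steps):
        x *= 1 + rate + rng.gauss(0, sigma)
    return x

a = deterministic(100.0, 0.02, 10)
b = deterministic(100.0, 0.02, 10)   # identical to a

rng = random.Random(42)
runs = [stochastic(100.0, 0.02, 0.05, 10, rng) for _ in range(1000)]
# The runs scatter around the deterministic path; only their
# distribution, not any single outcome, is predicted.
```

Testing the two is correspondingly different: the deterministic model is refuted by a single discrepant observation, while the stochastic model is judged by whether observed outcomes are plausible draws from its predicted distribution.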

Matt Franko said...

So if they are using a stochastic "model" then when the policy result doesn't turn out the way they predict, all they have to say is that the result is part of the set of results that doesn't comply with the prediction..... must be nice to have that job....

Matt Franko said...

What Brian and Neil are putting together lately are not stochastic models imo....

AXEC / E.K-H said...

Macro imbeciles
Comment on Jason Smith on ‘The reason for the proliferation of macro models?’

This is known since 2000+ years ― except among economists: “Research is in fact a continuous discussion of the consistency of theories: formal consistency insofar as the discussion relates to the logical cohesion of what is asserted in joint theories; material consistency insofar as the agreement of observations with theories is concerned.” (Klant)

Jason Smith quotes Noah Smith: “One thing I still notice about macro, including the papers Reis cites, is the continued proliferation of models. Almost every macro paper has a theory section. Because it takes more than one empirical paper to properly test a theory, this means that theories are being created in macro at a far greater rate than they can be tested.” And then comments: “This is fascinating, as it’s completely unheard of in physics.”

In his full-blown naïveté, Jason Smith has not realized that economics is what Feynman called a cargo cult science. As Hands remarked: "… suppose they [the economists] did reject all theories that were empirically falsified … Nothing would be left standing; there would be no economics."

Economists need no testing at all because the outcome is ALREADY KNOWN: it is either refutation or inconclusiveness. The reason is simple, ALL macro models are axiomatically false.

Because of this, Jason Smith's proposal of more testing is, in the given context, entirely beside the point. When the theory/model is known to be axiomatically false, testing does NOT help; only the replacement of false axioms by true axioms helps.*

Obviously, the physicist and hobby economist Jason Smith has never heard that progress in physics has always come from paradigm shifts, e.g. from geocentrism to heliocentrism.

Egmont Kakarot-Handtke

* For details see
If it isn’t macro-axiomatized, it isn’t economics
http://axecorg.blogspot.de/2017/02/if-it-isnt-macro-axiomatized-it-isnt.html

From the pluralism of false models to the true economic theory
http://axecorg.blogspot.de/2017/04/from-pluralism-of-false-models-to-true.html

The IS-LM macro imbeciles
http://axecorg.blogspot.de/2016/12/the-is-lm-macro-imbeciles.html

IS-LM ― a crash course for EconoPhysicists
http://axecorg.blogspot.de/2017/04/is-lm-crash-course-for-econophysicists.html

What is REALLY wrong with macro
http://axecorg.blogspot.de/2017/04/what-is-really-wrong-with-macro.html

From false micro to true macro: the new economic paradigm
http://axecorg.blogspot.de/2016/11/from-false-micro-to-true-macro-new.html

Tom Hickey said...

"So if they are using a stochastic "model" then when the policy result doesn't turn out the way they predict, all they have to say is that the result is part of the set of results that doesn't comply with the prediction..... must be nice to have that job...."

The data type and intended use determine the choice of method.

The subject matter of the natural, life and social sciences is different, so different modeling approaches are indicated. The need for precision may be different, too.

Stochastic modeling is a "fuzzy" approach that results in a probability estimate rather than the mathematical precision of a deterministic function, whose precision is only limited by the accuracy of measurement.

Deterministic models that yield greater precision are generally more desirable than fuzzy models. However, if there is uncertainty or fuzziness, e.g., owing to measurement issues such as randomness, or estimation owing to the cost of obtaining precise data, then a "fuzzy" approach is indicated.

If measurement is precise and a rule can be constructed relating variables, then a deterministic approach is indicated "as a rule," but not necessarily in all circumstances. Stochastic modeling is often used in science even when deterministic models can be developed.

Even if a precise model can be constructed, it may not be indicated owing to the size and complication of the model, especially when a simpler but less precise modeling approach would suffice for the purpose. As in just about everything, there are tradeoffs.

The US only takes a census every ten years instead of annually, for instance. Counting is time-consuming and expensive and a decision has been taken to do it effectively enough for the purposes to which the data will be put.

Tom Hickey said...

"What Brian and Neil are putting together lately are not stochastic models imo..."

The tax rate is exogenous and fixed by policy and stays constant.

The policy rate is fixed by the central bank as a target rate (constant). The overnight rate is a variable that fluctuates in a corridor.

Like most rates in economics that are not based directly on observables (unemployment rate, inflation rate, etc.), the saving rate varies cyclically based on changing liquidity preference. These figures are estimates.

Macro deals with aggregates, and many aggregates in national accounting are based on statistical estimates rather than precise measurement (enumeration).

Observables are different from constructs.

Ignacio said...

Matt, very few models assume non-ergodicity, hence it is easy to prove correctness through probability theory (which is fine, as this is how our universe at the macro level operates most of the time).

There isn't really that much difference between most stochastic models and deterministic models, because most stochastic models are not truly random; they are simply nonlinear, which is different.

A lot of people get confused because nonlinearity appears 'random', but it is not necessarily random. Modeling something as a stochastic process makes things easier when you don't have a clear picture of every factor involved in the development of some phenomenon. This makes deduction of causality harder or impossible, so it has its limits when you try to explain something, but it is not a bad approximation to certain complex problems as a means of gaining better understanding.

You cannot prove any model empirically right; that's epistemologically impossible, even for deterministic models (I'll let the philosophers take over this part of the argument). But you can prove something mathematically correct, which is a different thing (and is neither necessary nor sufficient for empirical correctness when it comes to science; you can have a mathematically incorrect model produce correct empirical results for a load of reasons, for example!).

Whether any of this applies to macroeconomics at any level is another question. I don't think there is any hidden complexity behind macroeconomics that disallows us from using deterministic functions and justifies reliance on stochastic models; there is simply no cause for that. The magical thinking of economists and too much talk about "the markets!" as some sort of abstract entity has trapped their minds in a magical-thinking framework. It's just that a lot of economists are very confused people who don't know what they are talking about most of the time.

Ignacio said...

As long as a process run long enough reaches some stationary distribution, you can approach it as a stochastic problem. With the law of large numbers and the central limit theorem backing you up, you are fine. This is what lies behind all the "Markovian" and stochastic theory on which most stochastic models are built.

A non-stationary random process must be approached differently (e.g. using kernel methods), but that is 'descriptive'; there is no functional explanation of phenomena there (and so no causal explanation). This is just used to observe regularities in phenomena at given points in time.
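The stationarity point above can be illustrated with a minimal sketch: a two-state Markov chain whose long-run visit frequencies converge (by the law of large numbers) to its stationary distribution. The transition probabilities are illustrative numbers, not anything from the discussion itself.

```python
import random

# Two-state Markov chain; rows are illustrative transition probabilities.
P = {0: [0.9, 0.1],   # from state 0: stay with prob 0.9, move with 0.1
     1: [0.5, 0.5]}   # from state 1: move back with prob 0.5

def simulate(steps, rng):
    """Run the chain and return the empirical visit frequencies."""
    state, visits = 0, [0, 0]
    for _ in range(steps):
        state = 0 if rng.random() < P[state][0] else 1
        visits[state] += 1
    return [v / steps for v in visits]

rng = random.Random(1)
freq = simulate(200_000, rng)
# The stationary distribution solves pi = pi P: here pi = (5/6, 1/6).
# Long-run frequencies approach it, which is what makes the
# stochastic problem tractable at all.
```

If the transition probabilities themselves drifted over time (a non-stationary process), no such limiting distribution would exist, which is the case Ignacio says must be handled descriptively instead.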

AXEC / E.K-H said...

Ignacio

You simply do not get the interdependence of economic law and history straight. As Keynes used to say: “a little clear thinking or more lucidity can solve almost any problem” see ‘The Synthesis of Economic Law, Evolution, and History’
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2500696

Egmont Kakarot-Handtke

Ignacio said...

AXEC, I don't believe there are universal laws for "economics" as depicted by economists (production and distribution in a monetary economy). The whole thing is made up and is contextual, because it is made up by humans, human laws, and human social structures in a given context. So I'm aware of the weight of history.

Nevertheless, it's perfectly fine to use deterministic functions to describe our system as it CURRENTLY is right now, but that does not mean it will FOREVER hold, because it's not a natural phenomenon; it's a bunch of social constructs derived from social and individual behaviour at a given moment in time.

Don't get me wrong, I didn't mean you can construct an explanatory theory for economics which would be the equivalent to an accurate description of gravitational fields. I don't even believe in the so called "law of supply and demand".

Every "economic law" comes with expiration date at best, and is inherently wrong at worst (most of the time unfortunately). But to lesser extent the same happens with natural phenomena, don't be fooled, as human knowledge is not definitive.

Matthew Franko said...

" I don't even believe in the so called "law of supply and demand"."

Neither do I... we have stuff coming out of our asses all over the place all the time when there isn't a war.... there is a continuous surplus...

AXEC / E.K-H said...

Ignacio

What you believe is irrelevant. Science is about knowledge, and when you know nothing you are out. It is as simple as that. And if economists as a collective know nothing they are out, too. And the first thing they have to do is to abolish the so-called Nobel which is explicitly titled “Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel”.#1 Science is well-defined as material and formal consistency, so there is NO room for belief, opinion, storytelling and other wish-wash.

Bring scientific knowledge to the table#2 or hit the road.

Egmont Kakarot-Handtke

#1 See ‘The real problem with the economics Nobel’
http://axecorg.blogspot.de/2016/09/the-real-problem-with-economics-nobel.html

and ‘Heterodoxy’s scientific self-burial’
http://axecorg.blogspot.de/2016/08/heterodoxys-scientific-self-burial.html

and ‘How to get rid of the silly Queen’
http://axecorg.blogspot.de/2016/06/how-to-get-rid-of-silly-queen.html

and ‘Failed critique of failed economics’
http://axecorg.blogspot.de/2017/05/failed-critique-of-failed-economics.html

#2 Here is the Law of Supply and Demand
https://commons.wikimedia.org/wiki/File:AXEC64.png

Tom Hickey said...

Every system based on information is characterized by the relation of signal (information) to noise (randomness) over a range with pure signal at one extreme and complete randomness at the other extreme, although technically pure randomness is not a system, since a system assumes some organization.

In complex adaptive systems (chiefly biological and social systems) reflexivity (feedback, learning) and emergence (radical uncertainty) affect the signal to noise ratio.

Political systems are subject to changes in policy based on political choices. Social, political, and economic subsystems are entangled in societies and influence each other.

Modern economic systems are chiefly endogenous. Endogenous variables present many contingent possibilities that are affected by micro, meso and macro factors.

The greater the noise, the less determined the system.

AXEC / E.K-H said...

Tom Hickey

The most noise in economics comes from economists because they are too stupid to put 2 and 2 together.* The four main approaches ― Walrasianism, Keynesianism, Marxianism, Austrianism ― are mutually contradictory, axiomatically false, materially/formally inconsistent and all got the pivotal economic concept profit wrong. Economics is a failed science and economists are incompetent scientists.

Egmont Kakarot-Handtke

* See ‘Review of the economics troops’
http://axecorg.blogspot.de/2017/04/review-of-economics-troops.html

Ignacio said...

You need go no farther than Wikipedia: "A scientific law always applies under the same conditions, and implies that there is a causal relationship involving its elements." That does not apply to the "law of supply and demand." That's why I don't "believe" it (well, "believe" is maybe a poor choice; I should say straight out that it is wrong).

It's just another in a series of mischaracterizations and simplifications by economists that have become popular and are misused all the time, sometimes to justify whatever crap (as usual with economists).

AXEC / E.K-H said...

Ignacio

Bring yourself methodologically up to speed and replace Wikipedia’s outdated definition of Law with the more general notion of Invariance.#1 There is NO causality in E=mc^2 but it is a Law.#2

Egmont Kakarot-Handtke

#1 See ‘The Law of Economists’ Increasing Stupidity’
http://axecorg.blogspot.de/2017/04/the-law-of-economists-increasing.html

#2 This is the causality-free First Economic Law
https://commons.wikimedia.org/wiki/File:AXEC06.png

AXEC / E.K-H said...

Ignacio, Tom Hickey

When you have no scientific knowledge you have NOTHING whatsoever to contribute to economic policy: “If economics cannot aspire to any substantive knowledge of economic relationships, it cannot speak with authority about questions of economic policy.” (Blaug)*

So, take the remote control, sit down on the couch and watch sitcoms. This is the best every economist can do for the survival and welfare of humanity.

Egmont Kakarot-Handtke*

* See ‘Economists and politics: Will you kindly shut up!’
http://axecorg.blogspot.de/2016/12/economists-and-politics-will-you-kindly.html

and ‘How economists shoot themselves non-stop in the methodological foot’
http://axecorg.blogspot.de/2017/03/how-economists-shoot-themselves-non.html

Ignacio said...

Causality in this context means that the terms expressed in the "law" are causally connected somehow; it's not necessary to express the direction of the causality (you cannot do that with an equality/inequality, which is a snapshot of a state at a given time and cannot incorporate the time domain; you can only do that through a function). This holds for the E=mc^2 equation (in the sense expressed above), but it doesn't always hold, as in the case of "supply and demand".

AXEC / E.K-H said...

True macrofoundations: the reset of economics
Comment on ‘The reason for the proliferation of macro models?’

The ultimate reason for the senseless proliferation of micro/macro models is that they are methodologically false, more precisely: axiomatically false, and therefore fail sooner or later. After the definitive refutation of false Walrasian microfoundations and false Keynesian macrofoundations it is time for the true macrofoundations of economics. The methodological rule to follow is known since 2000+ years: “When the premises are certain, true, and primary, and the conclusion formally follows from them, this is demonstration, and produces scientific knowledge of a thing.” (Aristotle)

This is the correct core of premises:
(0) The objectively given and most elementary systemic configuration of the (world-) economy consists of the household and the business sector which in turn consists initially of one giant fully integrated firm.
(i) Yw=WL wage income Yw is equal to wage rate W times working hours L,
(ii) O=RL output O is equal to productivity R times working hours L,
(iii) C=PX consumption expenditure C is equal to price P times quantity bought/sold X.
These premises are certain, true, and primary, and therefore satisfy all methodological requirements. The set of premises is minimalistic, that is, Occam’s Razor has been applied and the set cannot be reduced further, only expanded. The set contains no nonentities like maximization or equilibrium and no normative assertions. The graphical representation is given on Wikimedia
https://commons.wikimedia.org/wiki/File:AXEC31.png

At any given level of employment L, the wage income Yw that is generated in the consolidated business sector follows by multiplication with the wage rate W. On the real side, output O follows by multiplication with the productivity R. Finally, the price P follows as the dependent variable under the conditions of budget balancing, i.e. C=Yw and market clearing, i.e. X=O. Note that the ray in the southeastern quadrant is NOT a linear production function; the ray tracks ANY underlying production function. Note also that the wage rate W is an AVERAGE if the individual wage rates are different among the employees, which is normally the case.

Under the conditions of market clearing and budget balancing in each period the price is given by P=W/R (1), i.e. the market clearing price is always equal to unit wage costs. This is the MOST ELEMENTARY form of the price theorem also called LAW OF SUPPLY AND DEMAND. Note in passing that there is NO such thing as supply function, demand function, or production function with decreasing returns.

If the wage rate W is lowered, the market clearing price P falls. If the number of working hours L is increased the price remains constant, provided productivity R does not change. If productivity decreases the price P rises. If productivity increases the price falls. In any case, labor gets the whole product, the real wage W/P is invariably equal to the productivity R according to (1), and profit for the business sector as a whole is zero. All changes in the system are reflected by the market clearing price. The elementary market economy is indefinitely reproducible under the condition of no external/physical limitations like space, raw materials etcetera, which ALL have to be introduced later in the course of an ever more detailed analysis.
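Taken on its own terms, the arithmetic of premises (i) to (iii) under budget balancing and market clearing can be sketched in a few lines. The parameter values are purely illustrative, and the code is my sketch of the stated algebra, not the author's:

```python
# Elementary economy: premises (i)-(iii) under budget balancing
# (C = Yw) and market clearing (X = O). Illustrative values only.
W = 20.0    # average wage rate per hour
L = 100.0   # working hours
R = 4.0     # productivity, output per hour

Yw = W * L  # (i)   wage income
O = R * L   # (ii)  output
C = Yw      # budget balancing
X = O       # market clearing
P = C / X   # (iii) solved for the market-clearing price

# P works out to W/R (unit wage costs): the working hours L cancel,
# so the real wage W/P equals productivity R and profit C - Yw is zero.
```

Changing W, L, or R in this sketch reproduces the comparative statics described in the next paragraph: P moves with W, moves inversely with R, and is unaffected by L alone.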

See part 2

AXEC / E.K-H said...

Part 2

This has been the first step. With the second step the conditions of market clearing and budget balancing have to be lifted. This GENERALIZATION produces the phenomena of inventory changes (O-X greater than 0 or less than 0) and of saving/dissaving (Sm≡Yw-C greater than 0 or less than 0) and of monetary profit/loss (Qm≡C-Yw greater than 0 or less than 0).

It always holds Qm+Sm=0 or Qm=-Sm, in other words, the business sector’s deficit (surplus) equals the household sector’s surplus (deficit). Loss is the counterpart of saving and profit is the counterpart of dissaving. This is the most elementary form of the PROFIT LAW. Profit for the economy as a WHOLE has NOTHING to do with productivity, the wage rate, the working hours, exploitation, competition, or the smartness of business people.
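The balancing identity Qm = -Sm follows directly from the two definitions, as a minimal sketch shows (illustrative integer values, not the author's code):

```python
# Lifting budget balancing (C no longer equals Yw): saving/dissaving
# and profit/loss appear as two sides of the same gap.
W, L = 20, 100
Yw = W * L           # wage income = 2000
C = 1800             # households spend less than they earn
Sm = Yw - C          # household sector's monetary saving (+200)
Qm = C - Yw          # business sector's monetary loss (-200)
assert Qm + Sm == 0  # profit law: Qm = -Sm by construction
```

Setting C above Yw (dissaving) flips both signs, giving the profit case described in the comment.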

Given the minimalist core propositions (0) to (iii) one has to proceed top-down by successive DIFFERENTIATION until one arrives at the INDIVIDUAL agent. Differentiation is the opposite of bottom-up or aggregation. The bottom-up approach, also called methodological individualism, is false because it runs with necessity into the Fallacy of Composition.

How to reset economics: (a) THROW OUT the neoclassical and Keynesian set of premises. (b) Take the premises (0) to (iii) as common core of ALL economic analysis. (c) Differentiate the common core, that is, INCREASE COMPLEXITY successively, and derive ALL observable economic phenomena and relationships CONSISTENTLY from the common core. (d) Take the first opportunity to TEST one of the derived complex relationships. (e) If the relationship is corroborated continue with differentiation and solve concrete economic problems, otherwise look for a new set of foundational premises. This is how paradigm shifts work.

The acceptance of the premises (0) to (iii) depends ALONE on the outcome of empirical tests of more elaborate theorems, which have to be carried out independently by econometric experts, and NOT AT ALL on whether one likes the macro axioms or not, and certainly not on any political preconceptions: “Research is in fact a continuous discussion of the consistency of theories: formal consistency insofar as the discussion relates to the logical cohesion of what is asserted in joint theories; material consistency insofar as the agreement of observations with theories is concerned.” (Klant) Economics has to be strictly SEPARATED from politics. Political economics has produced NOTHING of scientific value in the last 200+ years.

For ALL economic research, the premises (0) to (iii) are the macrofoundations that are ABSOLUTELY necessary ― as Aristotle put it ― “to produce scientific knowledge of a thing.” It holds: If it isn’t macro-axiomatized, it isn’t economics.

Egmont Kakarot-Handtke