Saturday, April 11, 2015

Jason Smith — Information theory and economics, a primer


A physicist looks at economics and simplifies the formal modeling that economists conventionally use.
The physicist Eugene Wigner once referred to the "unreasonable effectiveness of mathematics in the natural sciences". One comment I frequently see out there on the economics blogs is the economics equivalent: economists' equations and models don't have anything that looks like a real human in them. Some economists actually defend this -- they say they don't need realistic humans or realistic assumptions (microfoundations).

What this information equilibrium framework seems to say is that when markets are functioning properly, this is OK. To turn Tolstoy upside down, all functioning markets are alike; each market failure fails in its own way. Those functioning markets represent a common Platonic ideal -- something that can be described by mathematics. Interestingly, the information equilibrium framework also has some suggestions for how to approach the problem when markets fail, too! It's called non-ideal information equilibrium, and it allows us not only to see which markets are functioning (those well described by ideal information equilibrium), but also to point out which ones are failing. This also represents a divergence from the mainstream -- markets aren't assumed to be functioning unless they can be shown to be functioning.

The basics of the framework come down to a couple of equations and what is effectively a supply and demand diagram -- which should be thought of as an entropic force diagram.
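For readers who want to see the math, the central relation -- as I read it from Smith's posts, so treat the exact form here as my paraphrase rather than his definitive statement -- is an information equilibrium condition between a demand quantity D and a supply quantity S:

    p = dD/dS = k * (D/S)

where the price p acts as a "detector" of the information flowing between demand and supply, and k is a constant called the information transfer index. When the market is ideal, the equality holds and solving the differential equation recovers familiar supply and demand behavior; when information transfer is non-ideal, the relation weakens to an inequality, p <= k * (D/S), so observed prices fall below the ideal level -- one way a failing market shows up in the framework.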
The analogy of the "scissors" of supply and demand can be called upon to summarize this, albeit simplistically. As long as each blade is functioning as it should, e.g., is sharp enough to do the work, and the scissors is working correctly as a system, with the blades operating in alignment, the mechanism cuts as it should. However, if one of the blades is not functioning as it should, e.g., is dull, or the scissors is out of alignment, e.g., the fulcrum screw loosens, then the scissors no longer operates correctly and the cuts are either off or don't happen at all; that is, the system fails.

This analogy can be complicated, e.g., by replacing the scissors with a milling machine. The difference here is the assumptions underlying the different models. The milling machine has more parts and a more detailed design than a pair of scissors, but it does essentially the same thing -- it cuts the material precisely.

Ideally, a complicated real process can be modeled usefully through a rather simple model, e.g., in economics using a "gadget" like ISLM. Ockham's razor holds that the simplest explanation adequate for the purpose is to be preferred. Here, the simpler model is the one that requires less information.

The advantage of the information transfer model is that it simplifies the assumptions while still producing useful results using only "market forces" in modeling.
One major divergence from mainstream economic approaches is the lack of assumptions about what it is that is mediating economic activity. You really don't need any economic agents, firms, households, rational expectations, or really any kind of human thought at all ... when markets are functioning properly. Market forces are like entropic forces in thermodynamics -- diffusion and osmosis are a couple of well known ones. Molecules in a gas don't know about their density at the other side of the room, but collectively they will distribute themselves to achieve an almost equal density across their container. People and prices can be described by the same information equilibrium framework that describes the behavior of molecules. The equilibrium states in this model are the ones with maximum entropy [1].
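A toy illustration of an entropic force (my own sketch; the particle count and step count are arbitrary choices): in the little simulation below, particles hop at random between the two halves of a box. No individual particle knows anything about the density anywhere, yet the split drifts toward, and then fluctuates around, 50/50 -- the maximum entropy state. "Market forces" in the information equilibrium picture are impersonal in the same way.

    import random

    # Toy entropic force: N particles, each in the left or right half of a box.
    # Each step one particle, chosen at random, hops to the other half. No particle
    # "knows" the overall density, yet the split drifts toward 50/50 (maximum entropy).
    N = 1000
    left = N  # start with every particle in the left half
    random.seed(0)

    for step in range(1, 20001):
        if random.random() < left / N:   # the chosen particle happens to be on the left
            left -= 1                    # ...so it hops to the right
        else:
            left += 1                    # ...otherwise it hops to the left
        if step % 5000 == 0:
            print(f"step {step:6d}: left = {left:4d}, right = {N - left:4d}")

Run it and the printout shows the left-hand count falling from 1000 toward the neighborhood of 500 and then wandering around it -- equilibrium as a statistical outcome, not a behavioral one.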
I think this actually solves a really tough philosophical problem. If humans have free will and don't behave perfectly rationally all the time, why, when markets are functioning properly, does the mathematics of economics work so well -- as if people were atoms in an ideal gas?
Of course, the really interesting stuff in economics occurs when the system is not in equilibrium -- that is, not at the efficient use of available resources described by the equilibrium model.

This is the problem that conventional equilibrium economic modeling like DSGE treats as the result of an exogenous "shock" that could not be foreseen using the model alone.

Thus, for some economists, doing economics involves the "art" of choosing the right model at the right time.

The question that then arises is whether and how information transfer economics deals with this in a more satisfactory way, e.g., by enabling anticipation from within the model and, ideally, preventive maintenance that heads off breakdowns resulting in idled resources and costly declines in efficiency and effectiveness. Smith seems to suggest that this is not possible in the case of simple information models.

However, it is possible in physics to detect when a situation is becoming unstable based on feedback from data. Let's see where Smith goes with this. As he points out, economics has been quite successful in dealing with static equilibrium models, but challenges arise with the introduction of dynamic models, since the rate of change of key variables shifts over time in economies that are in flux.

Hayek is often credited with introducing the concept of information to economics, e.g., in
"The Use of Knowledge in Society". But Hayek's essay raises more questions than it answers, as I think it was intended to do.
It is in many ways fortunate that the dispute about the indispensability of the price system for any rational calculation in a complex society is now no longer conducted entirely between camps holding different political views. The thesis that without the price system we could not preserve a society based on such extensive division of labor as ours was greeted with a howl of derision when it was first advanced by von Mises twenty-five years ago. Today the difficulties which some still find in accepting it are no longer mainly political, and this makes for an atmosphere much more conducive to reasonable discussion. When we find Leon Trotsky arguing that "economic accounting is unthinkable without market relations"; when Professor Oscar Lange promises Professor von Mises a statue in the marble halls of the future Central Planning Board; and when Professor Abba P. Lerner rediscovers Adam Smith and emphasizes that the essential utility of the price system consists in inducing the individual, while seeking his own interest, to do what is in the general interest, the differences can indeed no longer be ascribed to political prejudice. The remaining dissent seems clearly to be due to purely intellectual, and more particularly methodological, differences.
I would venture to say that no serious thinkers today assume that the "price system," that is, "the market," is in any way dispensable. The question that remains is whether the market is sufficient as an institutional tool for organizing and governing a society. The simple answer is that such sufficiency will never be discovered, because "the market" is based on other institutions, and these institutions are grounded in non-economic factors such as government, not to mention the emergence that occurs in complex adaptive systems. Politics follows on the introduction of government, and with it come networks and strata of social relationships based on status (class), power, and property ownership (wealth).

The question then becomes how to incorporate this apparently extraneous information into economics and economic information ("data"). It would seem that this is related to the art of choosing the correct model for changing conditions. For example, it would be significant to know where an economy is headed on the curve of financial instability that Hyman Minsky hypothesized, based on financial and other information (in the recent crisis, fraud) that economists generally have not taken into account in the conventional approach.

Information Transfer Economics

8 comments:

A said...

Jason Smith told me he believes that inflation is caused by an increase in the quantity of physical currency in circulation.

I lost interest after that.

Tom Hickey said...

Smith is a physicist whose hobby is developing information-based models applicable to economics. He doesn't need to be right about the economics to say something interesting about applying information-based modeling in econometrics.

I think what he writes is interesting from the POV of modeling. He is not in paradigm with MMT, of course, since he is looking primarily at conventional econometrics.

I'm interested in modeling from the POV of philosophy of science and logic, so information theory is of interest. It's also interesting to see how scientists from other fields relate to econ.

For example, it's unimportant that he believes in some quantity theory. What's important is his approach to modeling it.

Note that he is adopting a chiefly instrumentalist POV, in which what happens in the black box is not necessary for explanation. The model is concerned with input and output in terms of observable invariances relating variables that can be expressed as functions.

In other words, he is interested in how, what, when, how much type questions and not why questions about causality.

Of course, realists are not on board with that, for several reasons.

First, realists hold that explanation is fundamentally causal, in agreement with Aristotle and contra Hume.

Secondly, without understanding the causality involved, the necessity in a model is merely formal. There is no clear connection of the model with what is modeled. This is suggested, for example, by the admission that the so-called general case is really a special case, in that markets do fail and equilibrium in the sense of efficient use of available resources is seldom achieved.

Still, thinking about econ in terms of information theory seems to me to be important, especially since the information age is upon us.

NeilW said...

The problem here is the approach.

You need to model the atoms and then simulate the atoms interacting and see if the behaviours in the simulation mirror those in real life.

Then you know that the model of your atomic entities and their interaction profile is somewhere near.

There's still far too much mathematics and too many assumptions. A gas molecule's properties stay pretty much the same regardless of feedback from the environment.

Humans, on the other hand, learn and change. Not perfectly but they do. So unless you have the TEFCAS feedback from senses to behaviour you aren't modelling the real thing.

Greg said...

I've been reading a little of Jason too, and he is quite interesting.

@Phillipe
I haven't seen where Jason makes the claim you suggest but I wouldn't dismiss him so quickly. That is certainly a very monetarist position but he is far from a monetarist. In fact most of his best pieces are critiques of MMers, Sumner particularly.

Regarding quantity theory, I think MMT has some QTM properties to it as well. There is an intuitiveness to the QTM that sounds right, but I think it's way more complicated than is suggested by Sumner, Rowe, or Friedman. In my view it's not that QTM is wrong, it's just that it's not completely right. Monetarists are lazy and don't want to do any of the hard work of figuring things out. They just polish their pet theories with fancy math and call it a day.
It's quite easy to look at any number of FRED graphs, find a correlation between certain quantities of "money" and prices, and say "Look, this explains it!!" There certainly is a relationship between the amount of money "out there" and prices of things, but it's not simple to target the amount of money "out there", especially when money is poorly defined.

Brian Romanchuk said...

The problem I see is that it looks like a lot of mathematics to capture a few econometric relationships. If you have studied information transfer in physics, maybe this makes sense, but it looks pretty obscure to me. There's a few empirical regularities in economics, each of which could be generated by an infinite number of models. It may be that he has some more interesting results that I have missed.

Tom Hickey said...

Regarding quantity theory, I think MMT has some QTM properties to it as well.

Insofar as QTM is based on accounting identity, MMT is on board. It's the theoretical interpretation that is in question.

For example, a version of QTM that takes M to be the monetary base that is created exogenously and controls the economy through the money multiplier gets it wrong.

But the accounting identity itself stands. That's just not the way to understand it.

A stock-flow consistent approach could capture it, however. If it's a valid accounting identity, then MMT observes it as a boundary condition at least, and may use it for theoretical interpretation with respect to the direction of causality, too. But that goes beyond the accounting identity alone.

Jason Smith said...

I'd like to say thanks for reading and commenting; it's discussion, in particular critical discussion, that helps make progress of any kind. Even if ideas are wrong, knowing why they are wrong helps understanding.

There was a general thread in the comments above about the quantity theory of money (QTM) -- I'd like to stress that the information equilibrium framework (IE) is not just the quantity theory, for two reasons:

1. IE is a framework, not a theory -- you can use it to build all kinds of theories. A quantity theory is an example.
2. I've used the framework to cobble together the most empirically accurate pieces into something I've called "'the' information transfer model", which has a piece that can be approximated by the quantity theory under certain conditions. It is best seen as an "analytic continuation" between the ISLM model and the QTM.

Some specific follow-ups below:

@Philippe, you said: "Jason Smith told me he believes that inflation is caused by an increase in the quantity of physical currency in circulation."

Only when inflation is high or central banks are precisely hitting their inflation targets; if inflation is low (undershooting targets), then changes in the quantity of physical currency have no impact. That is to say the quantity theory is an approximation to the underlying theory in the former case, but not in the latter.
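As a rough sketch of how a single relation can contain both limits (a back-of-the-envelope reading, so treat the exact form as illustrative rather than definitive): take the price level P to satisfy P = dN/dM = (1/k) N/M, with N aggregate demand (NGDP) and M the currency base. Solving gives N ~ M^(1/k), hence P ~ (1/k) M^(1/k - 1). For k near 1/2 the price level rises roughly one-for-one with M -- the quantity theory limit -- while as k approaches 1 the price level becomes nearly insensitive to M, which is the low-inflation, ISLM-like regime described above.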

Empirically, the model works best with physical currency in circulation.

@Neil Wilson, you said: "You need to model the atoms and then simulate the atoms interacting and see if the behaviours in the simulation mirror those in real life."

I have no problem with this approach ... and it should eventually lead to a good macroeconomic theory. However, this isn't what happened in the history of thermodynamics, so it can't be a general rule. We didn't have accurate models of atoms until the 1930s, but statistical mechanics had been working well for 50 years before. Boltzmann was arguing with people who didn't even believe in atoms but had nonetheless developed accurate thermodynamic laws.

@Brian Romanchuk, you said: "There's a few empirical regularities in economics, each of which could be generated by an infinite number of models."

I agree completely! That is exactly why I set out to create the information equilibrium framework ... I meant the framework to be a method to eliminate models and arguments. Since you can build the QTM or ISLM in the framework, both have some validity under specific conditions. Since the idea that raising the minimum wage leads to less employment can't be constructed without additional assumptions, the "econ 101" arguments that raising the minimum wage leads to less employment aren't valid. Okun's law is good, the Phillips curve may not be (it may be a model of how markets fail, but not how they work).

I discussed this in my second-ever blog post:

http://informationtransfereconomics.blogspot.com/2013/04/an-informal-abstract-addition-why-now.html

Tom Hickey said...

Thanks for the clarifications, Jason.

Keep up the good work.