Saturday, January 15, 2011

Information Entropy And The Efficient Market Hypothesis

Greg Mankiw posted an interview with Burt Malkiel about A Random Walk Down Wall Street and the efficient market hypothesis.

In it Malkiel says, "True News is random, and what you mean by random is something unpredictable...Don't think you can predict the short-term ups and downs of the market. It's essentially unpredictable, not capricious, but unpredictable because True News is unpredictable."

Those couple of sentences immediately reminded FLG of a podcast he heard and commented on back in July. In that podcast, the book's author, Douglas W. Hubbard, was astonished that people are always measuring the wrong things: the metrics they use don't give them much value, and the things they aren't measuring would. FLG responded that this isn't because people are stupid; it all goes back to Claude Shannon's entropy calculation, which FLG summarized thusly:
Basically, the idea is that the less expected an event, the more information it contains.

The more you measure something, the more you can predict it. Therefore, it contains less information. The first time you create and measure some new metric, it contains a ton of new information, but the 1,000th time it contains far less.
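For the curious, here's a quick sketch of Shannon's point. The self-information of an event with probability p is -log2(p) bits, so the rarer the event, the more bits it carries. The probabilities below are made up purely for illustration; this isn't from the original post:

import math

def surprisal_bits(probability: float) -> float:
    # Shannon self-information: -log2(p) bits.
    # The less expected the event, the more bits it carries.
    return -math.log2(probability)

print(surprisal_bits(0.5))    # a fair coin flip: 1.0 bit
print(surprisal_bits(0.999))  # a near-certainty: ~0.001 bits
print(surprisal_bits(0.001))  # a 1-in-1,000 surprise: ~9.97 bits

The 1,000th measurement of a familiar metric is like the 0.999 case: you already know roughly what it will say, so it tells you almost nothing.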

FLG had never thought about the efficient market hypothesis in quite this way, but it really does come back to information entropy. Expected information is already priced in. That doesn't mean the prices are correct or right. They are simply the product of interaction among the best-educated guesses of a large group of people about the present discounted value of future cash flows, the sum of all expectations about the future. Information that would change the price is highly entropic and, by definition, unexpected.
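The same point in price terms, as a rough sketch (the cash flows and discount rate below are hypothetical, chosen only to illustrate the idea): the price is the discounted sum of expected cash flows, and only a revision to those expectations, i.e., unexpected news, moves it.

discount_rate = 0.05
expected_cash_flows = [10.0, 10.0, 10.0]  # the market's consensus forecast

# Present discounted value of the expected cash flows: the "priced in" part.
price = sum(cf / (1 + discount_rate) ** (t + 1)
            for t, cf in enumerate(expected_cash_flows))
print(round(price, 2))  # 27.23

# Unexpected news revises the forecast; only the surprise gets repriced.
revised_cash_flows = [10.0, 12.0, 12.0]
new_price = sum(cf / (1 + discount_rate) ** (t + 1)
                for t, cf in enumerate(revised_cash_flows))
print(round(new_price - price, 2))  # 3.54, the value of the new information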

Interesting...to FLG at least.
