Friday, July 16, 2010

Measurement and Information

FLG has been thinking about information theory a lot lately. The mathematical idea really goes back to Claude Shannon's entropy calculation. Basically, the idea is that the less expected an event, the more information it contains.
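
To put a number on that intuition, here is a minimal sketch of Shannon's self-information (surprisal), assuming base-2 logarithms and purely illustrative probabilities:

```python
import math

def surprisal_bits(p: float) -> float:
    """Self-information of an event with probability p, in bits: -log2(p)."""
    return -math.log2(p)

# A near-certain event carries almost no information;
# a rare event carries a lot.
print(surprisal_bits(0.99))  # ~0.01 bits
print(surprisal_bits(0.50))  # 1 bit
print(surprisal_bits(0.01))  # ~6.64 bits
```

Shannon's entropy is just this quantity averaged over all possible outcomes.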

Today, FLG was listening to this podcast (mp3). The interviewee mentions that in almost every field, the metrics people measure most often, the ones historically given the most attention, are the ones that carry the least information value.

The author's explanation is that people measure what they already know how to measure, and what they know how to measure they are already less uncertain about. Since they're already less uncertain, the information value is lower. He then says people stick with what they know how to measure rather than measuring what they should measure.

But let's go back to the basic information theory definition FLG gave above. The more unexpected the event, or let's say the more unexpected the result, the more information it contains. Well, the more you measure something, the more predictable it becomes. Law of Large Numbers and all that. Consequently, each subsequent measurement contains less information. Basically, the law of diminishing marginal utility applies to measurement.
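
A rough sketch of that diminishing-returns effect, assuming a simple Beta-Bernoulli model of repeated yes/no measurements (the 70% rate and the use of scipy are illustrative assumptions, not anything from the podcast): the uncertainty about the measured quantity drops sharply at first, and each additional measurement buys less and less.

```python
import random
from scipy import stats

random.seed(0)
true_rate = 0.7          # hypothetical quantity we keep measuring
successes = failures = 0
prev_entropy = stats.beta(1, 1).entropy()   # flat prior: maximal uncertainty

for n in range(1, 201):
    # Take one more yes/no measurement.
    if random.random() < true_rate:
        successes += 1
    else:
        failures += 1
    # Remaining uncertainty about the quantity after n measurements.
    # (Differential entropy can be negative; what matters is the drop per step.)
    post_entropy = stats.beta(1 + successes, 1 + failures).entropy()
    if n in (1, 10, 50, 100, 200):
        print(f"n={n:3d}  posterior entropy={float(post_entropy):7.3f} nats  "
              f"gain from measurement #{n}={float(prev_entropy - post_entropy):.4f}")
    prev_entropy = post_entropy
```

In this toy model the information gained from each new measurement shrinks roughly like 1/n, which is exactly the diminishing-marginal-returns pattern described above.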

The first measurement of something that has never been measured before contains a ton of information, but after hundreds of subsequent measurements, which are presumably easy because we've already figured out how to take them, the value of each new measurement necessarily declines. So it's not simply that people do what they know how to do, or what they've always done, in some stubborn, lazy way; it's an unavoidable consequence of the nature of information.

It's all very interesting nevertheless, at least to FLG.
