Wednesday, June 01, 2011
It is a fact of life that if we throw together many independent random thingies, they organize themselves into a normal, Gaussian distribution. The more thingies we throw into the quantum cauldron, the more precisely their sum resembles the Gauss curve. The quantum universe (we have no other) is like that: the more an outside observer (aka the computer) looks at the thingies, the more real and defined they become. My problem in trying to build aggregate distributions is the enormous amount of computer time it demands. But what is time?
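The "throw together many independent thingies" claim is the central limit theorem, and it is easy to watch it happen. A minimal sketch (the function name and the choice of uniform thingies are mine, not anything from the text): sum a few dozen independent uniform variables many times over and the sums come out Gaussian, with a variance you can predict in advance.

```python
import random
import statistics

def aggregate(n_thingies, n_samples, rng=random.Random(42)):
    """Sum n_thingies independent uniform(-1, 1) 'thingies' per sample.

    By the central limit theorem the sums approach a Gaussian with
    mean 0 and variance n_thingies / 3 as n_thingies grows.
    """
    return [sum(rng.uniform(-1, 1) for _ in range(n_thingies))
            for _ in range(n_samples)]

sums = aggregate(n_thingies=30, n_samples=20_000)
mean = statistics.fmean(sums)
std = statistics.stdev(sums)
# Theory: mean 0, std sqrt(30 / 3) = sqrt(10), about 3.16
print(mean, std)
```

The catch the post is complaining about is the `n_samples`: a precise picture of the aggregate distribution needs many, many samples, and that is where the computer time goes.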
I am too old and have no time to reflect on the nature of time: the thingies are distributed in nature not only in space but also along the axis of time. Two thingies cannot occupy the same point, so how do we aggregate thingies that are not simultaneous? I see in TASE fantastic changes minute to minute, second to second. Aggregating a thingie of a minute ago with another that exists NOW vitiates the whole process. Only very high-velocity computation may give me an approximate idea of the distribution. But thingies are rather "slow"; I don't have enough thingies near each other on the time axis, so I process them to make from each thingie a million thingies and then aggregate those. But it smells bad: I am trying to cheat reality. I am not intelligent enough to "decular" (my barrio's slang for working out a problem) this situation and make money with it. I only have to look at my portfolio to see that: I am down 10% from my maximum. A lot; ergo, I must be a drunken moron. In the pic, the third from the right standing is Heidelberg, although there is some uncertainty about that.
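The trick of manufacturing a million thingies from each real one looks a lot like bootstrap resampling, so here is a hedged sketch of what that cheat amounts to (the data and function name are hypothetical). It also shows why it "smells bad": resampling with replacement produces as many synthetic thingies as you like, but they carry no information beyond the few originals.

```python
import random
import statistics

def bootstrap_thingies(observed, n_synthetic, rng=random.Random(0)):
    """Resample with replacement from the few observed 'thingies'.

    This manufactures arbitrarily many synthetic thingies, but the
    synthetic sample can only mirror the empirical distribution of
    the originals; no new information about reality is created.
    """
    return [rng.choice(observed) for _ in range(n_synthetic)]

observed = [0.3, -1.2, 0.8, 2.1, -0.5]   # a few sparse observations in time
synthetic = bootstrap_thingies(observed, 1_000_000)
print(statistics.fmean(observed), statistics.fmean(synthetic))
```

The synthetic mean hugs the observed mean almost exactly, which is the whole problem: a million resampled thingies tell you nothing a statistician could not already read off the original five.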