I just came to this conclusion:
I ran hundreds of tests and simulations and found that if a number is divided into "chunks", the probability of hitting the target increases many times.
That's not how probabilities work. The sum of all probabilities is the same no matter how you "split" the possibilities.
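That rebuttal is easy to check empirically. A minimal sketch (the 8-bit toy keyspace, the 4-bit "chunk" split, and the trial count are my own illustrative choices, not from the thread): drawing one number over the whole space and drawing two half-width "chunks" then gluing them together hit a fixed target at the same rate.

```python
import random

random.seed(0)

TRIALS = 200_000
N = 256          # toy keyspace: 2**8 values stands in for the real 2**32
TARGET = 0xAB    # arbitrary fixed target

# Method A: one uniform draw over the whole space.
hits_whole = sum(random.randrange(N) == TARGET for _ in range(TRIALS))

# Method B: draw the high and low 4-bit "chunks" separately,
# then glue them back together into one 8-bit value.
hits_chunks = 0
for _ in range(TRIALS):
    hi = random.randrange(16)
    lo = random.randrange(16)
    if (hi << 4) | lo == TARGET:
        hits_chunks += 1

print(f"expected : {1 / N:.5f}")
print(f"whole    : {hits_whole / TRIALS:.5f}")
print(f"chunks   : {hits_chunks / TRIALS:.5f}")
```

Both empirical rates land on 1/256 up to sampling noise; "chunking" changes nothing but the bookkeeping.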
I generate two random numbers, convert them to hex, concatenate them and pass them to a modified rotor-cuda so that it can iterate through the remaining 8 values.
That's a very complicated way to waste performance, instead of simply generating a single random number.
Numbers do not "convert to hex"; they are numbers. Representations convert.
I never iterate over the full range 00000000-ffffffff, because the likelihood of there being four zeros or four "f" digits at the beginning is extremely small.
If in the first chunk we generate one number within 65536**2 (1 00000000 00000000) instead of two separate values in 0-65535, the simulation shows that hitting a target within 4 billion is much harder than hitting two numbers in 0-65535 twice. Mathematics often says the opposite, but I only believe the simulation, which showed me that in this case it is much more likely.
There's zero proof for your statement. You believe in a simulation of what? A single observation out of a gazillion choices, each with identical probability?
Statistics works in the long run; you can't draw a conclusion from a single expected result.
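No simulation is even needed for the 65536**2 claim; the arithmetic can be checked exactly. A minimal sketch (using Python's `Fraction` for exact rationals, my choice, not from the thread): hitting one fixed 32-bit value in a single draw and hitting both fixed 16-bit halves in two independent draws have identical probability.

```python
from fractions import Fraction

# One uniform draw over the full 65536**2 == 2**32 space:
p_single = Fraction(1, 65536**2)

# Two independent uniform draws over 0..65535, each of which
# must match its half of the target; independence multiplies:
p_two_halves = Fraction(1, 65536) * Fraction(1, 65536)

print(p_single == p_two_halves)   # True: 1/4294967296 either way
```

Any simulation that appears to show otherwise is measuring something else (or too few trials).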
A key with value 0xFFFFFFF....F has exactly the same chance as any other random key. It seems to me you are trying to say that randomness follows some "model", when in fact its only definition is total unpredictability and the absence of rules or patterns.
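The "four zeros at the beginning" confusion earlier in the thread mixes up a set of keys with a single key: the set of keys starting with a given digit is rare only because it is a small fraction of the space, while every individual key, however "special" it looks, is equally likely. A toy simulation (the 16-bit keyspace, specific key values, and trial count are my own illustrative choices):

```python
import random

random.seed(1)

TRIALS = 500_000
hits_ffff = hits_1234 = hits_prefix_f = 0
for _ in range(TRIALS):
    key = random.getrandbits(16)
    hits_ffff += (key == 0xFFFF)          # one "special-looking" key
    hits_1234 += (key == 0x1234)          # one "ordinary-looking" key
    hits_prefix_f += (key >> 12 == 0xF)   # the whole set starting with F

# Both single keys are hit rarely and at the same order of magnitude
# (expected TRIALS / 65536, i.e. a handful of hits each):
print(hits_ffff, hits_1234)
# The prefix SET covers 1/16 of the space, so it shows up constantly:
print(hits_prefix_f / TRIALS)   # close to 0.0625
```

Skipping keys with "unlikely" prefixes therefore doesn't concentrate probability; it just silently excludes a slice of equally probable keys.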
Sorry for your addiction.