Ok, let's say we have 60 GB of RAM and we have computed the 2**30 x-points in RAM. Will the lookup be instant, i.e. O(1)?
It won't be instant, because the keys are dispersed across the range 1 to 2**256, so, as I already mentioned, this requires a dictionary data structure (a map, tree, hash table, or database are examples).
The only way to get an O(1) lookup is, by basic principles, to know in advance the exact location that needs to be read. Since the keys are 256 bits, a truly O(1) lookup requires a direct-address table of 2**256 * (sizeOfAssociatedValue) bytes, so that any 256-bit key maps straight to its own slot. With your 2**30 keys, you would have 2**30 locations that hold an associated value and 2**256 - 2**30 locations wasting space for nothing. But it is O(1) in complexity.
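As a back-of-the-envelope check (a minimal sketch; the 32-byte value size is my assumption, not something fixed above), you can compute what such a direct-address table would cost:

```python
# Rough arithmetic for a direct-address (true O(1)) table over 256-bit keys.
KEY_SPACE = 2**256   # one slot per possible 256-bit key
USED_KEYS = 2**30    # slots that would actually hold a value
VALUE_SIZE = 32      # assumed bytes per stored value

table_bytes = KEY_SPACE * VALUE_SIZE
print(f"direct-address table size: {table_bytes:.3e} bytes")        # ~3.7e78 bytes
print(f"fraction of slots actually used: {USED_KEYS / KEY_SPACE:.3e}")
# ~2**70 bytes: rough figure for all storage ever manufactured (see below)
print(f"tables' size vs 2**70 bytes: {table_bytes // 2**70:.3e}x")
```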
That's why data structures like trees / maps / hash tables / storage databases are the practical choice: they balance storage against lookup complexity, bringing it down from O(n) to O(log n) (or expected O(1) for a hash table) without reserving a slot for every possible key. A direct-address O(1) table is only practical if there is enough spare memory to hold all possible keys.
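For concreteness, here is a minimal sketch of that trade-off, assuming the 2**30 x-coordinates are held as fixed-width 32-byte big-endian values packed into one flat buffer (the record size, names, and layout are my assumptions, not something stated in the thread): a sorted buffer plus binary search answers a lookup in about 30 comparisons.

```python
RECORD = 32  # assumed bytes per x-coordinate (big-endian)

def build_index(x_points):
    """Pack the x-coordinates, sorted, into one contiguous bytes buffer.

    2**30 records * 32 bytes = 32 GiB, which fits in 60 GB of RAM.
    """
    return b"".join(sorted(x_points))

def contains(buf, x):
    """O(log n) membership test: ~30 record comparisons for 2**30 keys."""
    lo, hi = 0, len(buf) // RECORD
    while lo < hi:
        mid = (lo + hi) // 2
        rec = buf[mid * RECORD:(mid + 1) * RECORD]
        if rec < x:
            lo = mid + 1
        elif rec > x:
            hi = mid
        else:
            return True
    return False

# Usage sketch with a few fake keys.
fake = [n.to_bytes(RECORD, "big") for n in (5, 123456, 2**200 + 7)]
buf = build_index(fake)
print(contains(buf, (123456).to_bytes(RECORD, "big")))  # True
print(contains(buf, (99).to_bytes(RECORD, "big")))      # False
```

A Python set of the same keys gives expected O(1) lookups at the cost of extra per-object memory; the flat sorted buffer keeps memory close to the raw 32 GiB.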
There are only around 2**70 bytes in all storage devices ever manufactured on Earth, so do you get the picture now?
So what is the fastest method to look up in the 2**30? Or, in other words, what should we store them in to get the fastest lookup possible? I store them in 16**4 different txt files based on the prefixes of the points, so each file has ~16k points, but this only gives 45 lookups per second on my laptop; it might be faster on a better CPU.
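A sketch of the lookup scheme described here (the file naming and on-disk format are assumptions; the post only says the points are split into 16**4 text files by prefix): each query opens the bucket whose name matches the first 4 hex characters of the x-coordinate and scans its ~16k lines.

```python
import os

PREFIX_LEN = 4  # 16**4 = 65536 buckets, ~16k points each for 2**30 points

def lookup(x_hex, directory="points"):
    """Look up one hex x-coordinate in its prefix bucket.

    Assumes bucket files are named after the first 4 hex characters
    (e.g. points/1a2b.txt) and hold one hex value per line; both the
    naming scheme and the one-value-per-line format are assumptions.
    """
    path = os.path.join(directory, x_hex[:PREFIX_LEN] + ".txt")
    if not os.path.exists(path):
        return False
    with open(path) as f:  # file open + linear scan on every query
        return any(line.strip() == x_hex for line in f)
```

Each query pays for an open() plus a scan of roughly 16k text lines, so file I/O and string parsing, rather than the CPU, is the likely reason it tops out around 45 lookups per second; keeping the same data in RAM (a sorted buffer or a hash set, as sketched above) avoids that per-query overhead.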