Re: New PoW method using factorization of large numbers.
Board: Development & Technical Discussion
by Taras on 22/02/2018, 16:00:56 UTC
Merits: 1 from 1 user
⭐ Merited by DarkStar_ (1)
Originally I was worried about primes, but one of the other posters alerted me that everyone can be working on different numbers. I didn't realize this is already how Bitcoin works: everyone's number is determined by the hash of their public key along with other block details. So if one person gets dealt a prime (or two or three or 100 people do), the others will be the ones to win that block.
If one miner gets stuck with a prime, they're not "out" until the next block is solved - they can and will periodically generate a new number by changing some of the block details, and then hope the new number they're factorizing isn't prime.
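Here is a rough Python sketch of what that re-rolling could look like. The field names, the use of SHA-256, the 8-byte nonce, and the ~2000-digit target size are all just assumptions for illustration (none of this is specified anywhere in the thread), and the primality check is an ordinary Miller-Rabin test, so "prime" here really means "probably prime":

Code:
import hashlib
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test (good enough for a sketch)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

DIGITS = 2000  # assumed size of the number to be factorized

def derive_candidate(pubkey: bytes, block_details: bytes, nonce: int) -> int:
    """Stretch hashes of the block data into a number of at least DIGITS digits."""
    n, counter = 0, 0
    while len(str(n)) < DIGITS:
        h = hashlib.sha256(pubkey + block_details +
                           nonce.to_bytes(8, "big") +
                           counter.to_bytes(4, "big")).digest()
        n = (n << 256) | int.from_bytes(h, "big")
        counter += 1
    return n

# A miner dealt a prime isn't stuck: bump the nonce and get a fresh number.
nonce = 0
n = derive_candidate(b"my pubkey", b"other block details", nonce)
while is_probable_prime(n):
    nonce += 1
    n = derive_candidate(b"my pubkey", b"other block details", nonce)
print(f"working on a {len(str(n))}-digit candidate (nonce {nonce})")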

On a side note, it is possible that miners will be biased toward even numbers, since an even number of this size is guaranteed to be composite, while an odd one is only almost certainly composite (a marginal edge, but at this scale it could make a difference). As miners can change certain parts of the block details to whatever they want, they can keep trying until they get an even number, and then work on that one.
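To put a number on "marginal": by the prime number theorem, primes near 10^2000 have density about 1/ln(10^2000), so even a random odd number of that size is composite all but roughly one time in a couple of thousand, while an even one is composite with certainty. A quick back-of-the-envelope check (the 2000-digit figure is the one used later in this thread):

Code:
import math

DIGITS = 2000
# Prime number theorem: density of primes near 10^DIGITS is about 1 / ln(10^DIGITS).
density = 1 / (DIGITS * math.log(10))
# All primes of this size are odd, so restricting to odd numbers doubles the chance.
odd_prime_chance = 2 * density
print(f"P(random odd {DIGITS}-digit number is prime)  ~ 1 in {1 / odd_prime_chance:,.0f}")
print(f"P(random even {DIGITS}-digit number is prime) =  0  (it's divisible by 2)")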

Good catch.  That is a glaring vulnerability, since we ultimately want to test whether numbers are prime or not.  My intention is that people can datamine the blockchain to know which numbers to rule out in the search for new prime numbers.  If everyone is mining even numbers, then this wouldn't work.  Honestly, I think we would have to eliminate numbers that have any factor up to, say, 1000.  That way people can't just divide their number by everything from 2 to 1000 to make sure it isn't prime.  The point is to rule out prime candidates by proving they are composite.

I don't know, any other ideas on this vulnerability?

Let's say that, to solve a block, you need to find a factor of at least 1000 digits for a number that is 2000 digits long. We already know we can skip odd numbers, so keep trying to hash your block until you get an even number. This should happen nearly instantly. Now, to find a 1000+ digit factor of a 2000-digit even number... you just divide by 2. Even on a number this size, that kind of operation would take only milliseconds on a CPU.
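To show how cheap that shortcut is under the rule exactly as stated above (a >= 1000-digit factor of a 2000-digit number; the random candidate below just stands in for a hash-derived one):

Code:
import random
import time

# Stand-in for a hash-derived candidate: a random 2000-digit number forced to be even.
random.seed(0)
digits = ([random.choice("123456789")] +
          [random.choice("0123456789") for _ in range(1998)] +
          [random.choice("02468")])
n = int("".join(digits))

start = time.perf_counter()
factor = n // 2                     # n is even, so this is guaranteed to be a factor
elapsed = time.perf_counter() - start

assert n % factor == 0
print("number has", len(str(n)), "digits")
print("factor has", len(str(factor)), "digits")   # 1999 or 2000 digits, far over 1000
print(f"time to 'find' the factor: {elapsed * 1000:.3f} ms")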

The idea that more digits in the required factor means higher difficulty is just not accurate. Finding a 50-digit factor would be harder than finding either a 1-digit or a 1999+ digit factor of a 2000-digit number: a 1999+ digit factor just means the cofactor is tiny, so you find it by trial-dividing by small numbers. But ruling out prime candidates is not the hard part. You can already prove that a number is composite if it is both greater than 2 and even. We already know it's greater than 2, so you just have to look at the last bit: if it's 0 the number is composite; if it's 1, it's unknown, so try again.

Disallowing numbers with any factor (other than 1) up to 1000 is an interesting idea. Such numbers certainly exist (here's one, ~100 digits: 1332040162761665637631613428888885580516735653612226801746764441547065802389532208097762306977281), but I don't know if there's a way to look at a number and calculate whether it meets that criterion before you go brute-forcing factors. If there is (I imagine there is), then we're back to looking for a large enough factor anyways.
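For what it's worth, checking that criterion looks cheap: trial division by everything up to 1000 (or just the 168 primes below 1000) is a few hundred modulo operations, which is nothing next to actually factoring. A sketch, run against the number above:

Code:
def smallest_factor_up_to(n: int, bound: int = 1000):
    """Return the smallest factor of n in [2, bound], or None if there is none."""
    for d in range(2, bound + 1):
        if n % d == 0:
            return d
    return None

n = int(
    "1332040162761665637631613428888885580516735653612226801746764441547065802389532"
    "208097762306977281"
)
# Prints None if the number above really has no factor up to 1000, else its smallest one.
print(smallest_factor_up_to(n))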

What you could do instead is have a window of factor lengths that has to be hit. Rather than requiring >= 1000 digits, require, say, 100 to 110 digits for a factor of a 2000-digit number.
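A verifier for that windowed rule is also trivial; here's a minimal sketch, with the 100-110 digit window and the 2000-digit number size taken from the figures above (the function shape itself is just made up for illustration):

Code:
def valid_solution(n: int, factor: int, lo_digits: int = 100, hi_digits: int = 110) -> bool:
    """Check a claimed factor: it must nontrivially divide n and its digit
    count must fall inside the required window."""
    if factor <= 1 or factor >= n or n % factor != 0:
        return False
    return lo_digits <= len(str(factor)) <= hi_digits

# Toy usage with small numbers and a shrunken 2-to-3-digit window, just to show the check:
print(valid_solution(91 * 1061, 91, lo_digits=2, hi_digits=3))    # True:  91 divides it, 2 digits
print(valid_solution(91 * 1061, 1061, lo_digits=2, hi_digits=3))  # False: 1061 has 4 digits

Whatever the window ends up being, verification stays a single division plus a digit count, no matter how hard the search itself is.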
