Re: New PoW method using factorization of large numbers.
Board: Development & Technical Discussion
by ir.hn on 19/12/2017, 04:30:40 UTC
You are essentially suggesting you have a solution that could/should be applied to a multi-billion-dollar industry.  And you are unwilling to entertain the idea of a mathematical proof.  It's likely that you're right, but in this arena, I think it's pretty much a requirement to at least ATTEMPT a rigorous proof.  Otherwise your idea will forever remain an idea.  Personally I'd love to see the math that goes into such a proof for the sake of learning alone.

That said, it's a very interesting concept.

So would I ;).  I would like to apologize to haltingprobability; I have been a bit harsh on him.  Some people like seeing proof; I, for one, am someone who will go out and search for myself.  But different strokes for different folks.

I am not the kind of meticulous person to work on proofs.  I don't have a degree in mathematics or computer science, and I tend to get interested in developing concepts and ideas and implementing them.  I am more of an entrepreneur and engineer than a mathematician.  My brain works like a CPU: I take in lots of information from many sources and synthesize it into a product idea.  I don't need proof, I just need a general understanding of how something works, and if it makes sense and passes the smell test, I go for it.

Here is how I see it.  GPUs are like robots.  You program them for a specific task and they can beat any human at that specialized task.  But let's say something unexpected happens, like the roof caving in.  The robots would get stuck; they wouldn't know how to move out of the way and bring their work elsewhere to continue.  The humans would then beat them because they are more adaptable.  CPUs are adaptable, GPUs are not.  CPUs are fast and think on their feet, whereas GPUs are slow and methodical and share their work well.  So it would be very shortsighted to think that GPUs can accelerate any given task and be better than a CPU.  They are very different entities.  If GPUs are inherently better than CPUs at some tasks (and they certainly are at some, like SHA-256 and Scrypt), then it only makes sense that CPUs would be inherently better at other tasks.  Those would be tasks where lots of different and unpredictable things are happening.  The problem is finding a stable, consistent task that also offers this unpredictability.

For me to attempt to prove that large-number factoring would offer this unpredictability would require that I know much more than I currently do about it.  But I take the experts' word that factoring (more specifically, sieving) large numbers is a very intensive job that requires lots of unpredictable memory access and varied tasks.  It's about as random a task as a non-random task can be, if that makes any sense.  So that is why I feel it is the #1 contender for a CPU-tilted coin.  The cool thing is that certain tasks in the sieving are predictable, so a combination of a GPU and a CPU would be best, which just so happens to be the perfect thing if we want to favor PCs.  It is also good if we want to bring the GPU miners and CPU miners together to both adopt this PoW.  A toy sketch of what a sieving pass looks like is below.
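
For anyone who wants a concrete picture of the sieving step, here is a minimal quadratic-sieve-style sketch in Python.  It is purely illustrative: the number N, the smoothness bound, the sieve interval, and the threshold slack are made-up toy values I picked so it runs instantly, and everything a real factoring PoW would need beyond this single sieving pass (relation collection, the linear-algebra stage, verification) is left out.

Code:
# Toy, purely illustrative quadratic-sieve-style sieving pass.
# N, FACTOR_BASE_BOUND, SIEVE_INTERVAL and the slack are hypothetical
# values for demonstration only; this is not a miner or a full factorer.

import math

N = 87463                      # hypothetical toy number to "factor"
FACTOR_BASE_BOUND = 50         # hypothetical smoothness bound
SIEVE_INTERVAL = 200           # hypothetical sieve length

def small_primes(limit):
    """Primes up to limit via a classic sieve of Eratosthenes."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, limit + 1, i):
                is_prime[j] = False
    return [i for i, flag in enumerate(is_prime) if flag]

# Factor base: primes p for which x^2 = N (mod p) has a solution,
# with the roots found here by brute force since p is tiny.
factor_base = []
roots = {}
for p in small_primes(FACTOR_BASE_BOUND):
    sols = [x for x in range(p) if (x * x - N) % p == 0]
    if sols:
        factor_base.append(p)
        roots[p] = sols

# Sieve over Q(x) = (x + m)^2 - N for x = 0 .. SIEVE_INTERVAL-1.
m = math.isqrt(N) + 1
sieve = [0.0] * SIEVE_INTERVAL
for p in factor_base:
    logp = math.log(p)
    for r in roots[p]:
        # Every x with (x + m) = r (mod p) gets a contribution of log p.
        # Each prime walks the array at its own stride, so with many
        # primes the updates scatter across the whole sieve array.
        start = (r - m) % p
        for x in range(start, SIEVE_INTERVAL, p):
            sieve[x] += logp

# Positions whose accumulated logs come close to log Q(x) are candidates
# for being smooth over the factor base, i.e. useful relations.
SLACK = 1.0                    # hypothetical tolerance
candidates = [x for x in range(SIEVE_INTERVAL)
              if sieve[x] >= math.log((x + m) ** 2 - N) - SLACK]
print("candidate x values:", candidates[:10])

The part to look at is the inner loops: each factor-base prime updates the sieve array at its own stride, so with thousands of primes and a big interval the writes land all over memory.  That is roughly the cache-unfriendly behavior people point to when they call sieving CPU-friendly; whether it actually holds up against a determined GPU or ASIC implementation is exactly the kind of thing a rigorous argument would have to settle.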