My GPU is a small GTX 1050; even 1 million samples is slow to train.
A bit off-topic, but I've heard from friends that they're using Google Colab and Kaggle, which give you access to high-end professional/data-center GPUs. I don't know the limitations, but it might be worth a try.
Colab has heavy limitations on the GPU in its free tier where they'll stop your whole notebook once you exceed a certain number of hours.
The problem is with ML.NET:
the dataset looks like random data with no pattern;
training on a 1 million sample dataset results in very low accuracy, about 0.0001%;
it doesn't work.
I will try Keras with 5 layers of 256 neurons (a sketch is shown below),
but I will possibly get the same result.
Neural networks may only work on datasets that have a pattern, since finding a pattern is what a NN does,
but Secp256k1 / elliptic curve output looks like random data.
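For reference, here is a minimal sketch of what a 5-layer, 256-neuron Keras model could look like. The 256-bit input/output widths, the activations, and the random placeholder data are my own assumptions for illustration, not a description of the actual setup:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

input_bits = 256   # assumed input width (e.g. bits of a public-key x coordinate)
output_bits = 256  # assumed output width (e.g. bits of the private key)

# 5 hidden layers of 256 neurons each, with a per-bit sigmoid output.
model = keras.Sequential(
    [keras.Input(shape=(input_bits,))]
    + [layers.Dense(256, activation="relu") for _ in range(5)]
    + [layers.Dense(output_bits, activation="sigmoid")]
)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["binary_accuracy"])

# Random placeholder data just to show the training call;
# the real dataset would be bit-encoded curve points and keys.
X = np.random.randint(0, 2, size=(100_000, input_bits)).astype("float32")
y = np.random.randint(0, 2, size=(100_000, output_bits)).astype("float32")
model.fit(X, y, batch_size=512, epochs=3, validation_split=0.1)
```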
1M is nothing; I've tried 50M. I suppose tests with a 1 billion sample dataset are needed.
And for the calculation, one should take small curves and increase the numbers after each success, to derive a formula and estimate how much data would be required for Secp256k1 (see the sketch below).
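As an illustration of that small-curve idea, a minimal sketch that generates (public point, private scalar) pairs on a toy curve over a small prime field. The curve parameters and base point below are assumptions chosen for the example and have nothing to do with secp256k1's real parameters:

```python
# Toy curve y^2 = x^3 + a*x + b over GF(p); parameters are illustrative only.
p, a, b = 97, 2, 3
G = (3, 6)  # an assumed base point on this toy curve (3^3 + 2*3 + 3 = 36 = 6^2 mod 97)

def ec_add(P, Q):
    """Add two points on the curve (None represents the point at infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    y3 = (m * (x1 - x3) - y1) % p
    return (x3, y3)

def ec_mul(k, P):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# Features are the public point (x, y), label is the private scalar k.
points = [(ec_mul(k, G), k) for k in range(1, 50)]
dataset = [(P, k) for P, k in points if P is not None]  # drop infinity if k hits the order of G
print(dataset[:5])
```

The same generator can be rerun with larger primes to see how the required dataset size grows with the curve size.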
What are your detection rates for the 50m dataset (true positive/negative %, false positive/negative % etc.)?
Training error was 0.9916, but in tests it was 50%, fluctuating between +1000 and -1000. So still close to 50%.
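In case it helps for reporting those rates, a minimal sketch computing the confusion-matrix percentages with scikit-learn; `y_true` and `y_pred` are placeholders for the real test labels and predictions:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.random.randint(0, 2, size=10_000)  # placeholder ground truth
y_pred = np.random.randint(0, 2, size=10_000)  # placeholder model output

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
total = tn + fp + fn + tp
print(f"TP: {tp/total:.2%}  TN: {tn/total:.2%}  FP: {fp/total:.2%}  FN: {fn/total:.2%}")
# A model guessing at random on balanced classes sits near 25% in each cell,
# i.e. ~50% overall accuracy.
```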