Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Re: Neural Networks and Secp256k1
by
fxsniper
on 13/05/2021, 12:01:50 UTC
⭐ Merited by aliashraf (1), ETFbitcoin (1)
How would you use a dataset with neural networks here?
Do you have more detail?
How would you frame it as a classification problem?
How do you deal with such large numbers?

Neural networks at this scale need large-scale compute and search support.
One person, or a small team of 2-5 people, cannot do this alone.

A project like this has close to a 100% risk of failure, and nobody who tries and fails tells us about it, so we never learn the results.
Until somebody tries it, nobody knows whether it is even remotely possible.

A plain Keras model with 5 layers and 256 or 512 perceptrons per layer cannot do this job.
A normal neural network is not enough; it would have to be a very complex one.

As an easy first experiment, you can try the maximum number of layers Keras can handle.

What dataset would you use?
What is the input?
What is the output?

Input:
maybe convert everything to binary, with each column being one bit.

Neural networks do not understand characters; everything must be converted to digits, or better to just 1s and 0s (one-hot encoded data).

Problem: a Y point is a 256-bit number, which is very large and not easy to put into a dataset.
Problem: most AI models output a short answer, a 1 or a 0, or a limited number of possible classes.
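The binary encoding idea above can be sketched in a few lines. The helper names `int_to_bits` / `bits_to_int` are hypothetical, but the example constant is the real x-coordinate of the secp256k1 generator point G:

```python
# Sketch of the encoding idea: turn a 256-bit coordinate into a vector
# of 256 individual 0/1 features, one dataset column per bit.

def int_to_bits(n, width=256):
    """Big-endian bit vector: one 0/1 entry per column."""
    return [(n >> (width - 1 - i)) & 1 for i in range(width)]

def bits_to_int(bits):
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

# Example: the x-coordinate of the secp256k1 generator point G.
gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
features = int_to_bits(gx)

assert len(features) == 256          # one column per bit
assert set(features) <= {0, 1}       # only binary features
assert bits_to_int(features) == gx   # the encoding round-trips exactly
```

This sidesteps the "number too large for a dataset" problem: the model never sees the 256-bit integer, only 256 small binary columns.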

Maybe you need to build 256 separate models, one per bit.
Maybe you need the maximum number of neural network layers.
Maybe you need something at least at the level of OpenAI's GPT-3.

How many models? I guess over 256, one model per bit position.
Or maybe 256 × 256 = 65,536 models to do it.
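The one-model-per-bit idea is easy to wire up as a dataset-plumbing sketch. Here each "model" is just a placeholder random linear scorer in NumPy standing in for a real trained network, to show how the 256 label columns would be split across 256 independent binary classifiers, not to suggest this would actually work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: N samples, each a 256-bit input vector (e.g. a public-key
# coordinate) and a 256-bit target vector (e.g. the private-key bits).
N = 100
X = rng.integers(0, 2, size=(N, 256)).astype(np.float64)
Y = rng.integers(0, 2, size=(N, 256))

# One independent binary classifier per output bit: model i would be
# trained only on label column Y[:, i]. Each "model" here is a random
# linear scorer, a stand-in for a real network.
weights = [rng.normal(size=256) for _ in range(256)]

def predict_bit(i, x):
    """Placeholder per-bit model: threshold a linear score."""
    return int(x @ weights[i] > 0.0)

# Assemble a full 256-bit prediction from the 256 separate models.
sample = X[0]
prediction = [predict_bit(i, sample) for i in range(256)]
assert len(prediction) == 256
assert set(prediction) <= {0, 1}
```

The design choice this illustrates: each model faces only a binary classification problem, which matches the earlier point that most AI models can only output a 1/0 or a small set of classes.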

How many layers? I guess over 48.
Some sample configurations:
48-layer, 1600-hidden, 25 heads, 1558M parameters
24-layer, 2048-hidden, 16 heads, 1.3B parameters
32-layer, 2560-hidden, 20 heads, 2.7B parameters

BERT uses around 110M-340M parameters (base vs. large).
GPT-2 has 1.5 billion parameters.
The full-sized GPT-3 has 175 billion parameters.

So the minimum requirement would probably be around 175 billion parameters, the largest that research can manage for now.
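To make that gap concrete, here is a rough dense-layer parameter count for the small 5-layer, 512-unit Keras perceptron mentioned earlier, assuming 256 inputs and 256 outputs (weights plus biases per layer), compared against the 175B figure:

```python
# Rough parameter count for a 5-layer dense MLP: 256 inputs, four hidden
# layers of 512 units, 256 outputs. Each dense layer has in*out weights
# plus out biases.
sizes = [256, 512, 512, 512, 512, 256]
params = sum(a * b + b for a, b in zip(sizes, sizes[1:]))
print(params)            # about 1.05 million parameters

gpt3_params = 175_000_000_000
print(gpt3_params // params)   # GPT-3 is over 160,000x larger
```

That ratio is the whole argument in one number: the kind of model a hobbyist can train in Keras is five orders of magnitude smaller than the scale the post estimates would be needed.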

Some ideas:
use deep learning to train
use deep reinforcement learning (it would have to be better than AlphaGo)
use NLP/MLM models as a kind of translation (modified for this special use)
modify BERT to work on this
modify GPT-2 to work on this
use generative deep learning, with one model generating candidates and another verifying them
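The generate-and-verify idea at least is well-defined, because verification is cheap: one public-key computation per guess. The problem is the 2^256 search space. A toy analogue over a small modular-exponentiation group (not the real curve, which is a deliberate simplification here) shows the shape of the loop:

```python
import random

# Toy analogue of generate-and-verify: the "public key" is g^k mod p in
# a small group; a generator proposes random candidate exponents and a
# verifier checks them. The real problem would replace pow(g, k, p) with
# secp256k1 scalar multiplication and a 2**256 search space.
p, g = 2_147_483_647, 5          # small prime group, NOT secp256k1
secret = 123_456_789
public = pow(g, secret, p)

def verify(candidate):
    """Cheap check: does the candidate reproduce the public value?"""
    return pow(g, candidate, p) == public

random.seed(0)
found = None
for _ in range(1000):            # "generator": blind random proposals
    k = random.randrange(1, p - 1)
    if verify(k):
        found = k
        break

assert verify(secret)            # verifying the true key is trivial
# `found` is almost certainly still None: random generation fails even
# in this tiny 2**31 group, and the real keyspace is vastly larger.
```

A generative model would have to propose candidates astronomically better than random for this loop to beat brute force, which is exactly what no published research shows.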

Problem: training might take over a year.

The result would probably have a very low accuracy rate.

Even if it works, it might only get a number close to the right range, not the correct number.
I believe some AI researchers are already doing research on blockchain.

This is not easy.
A project like this needs large funding to support it.

What should this research aim for?
Blockchain is a great project for freedom without control, and it needs to be upgraded to be much stronger.

The research needs to be done first, before someone else does it and uses it the wrong way.
I think neural networks might be able to find vulnerabilities that need to be fixed.

For layer references, see
https://huggingface.co/transformers/pretrained_models.html

DeepMind's A.I. unit lost $649 million (so how big a budget would this need?)

Also, you would need hardware like over 1,000 Nvidia A100 GPUs,
or 100 NVIDIA DGX A100 systems (or an NVIDIA DGX POD).
For the hardware used in training, see
https://en.wikipedia.org/wiki/AlphaGo

Nobody has created a neural-network project like this on GitHub;
I cannot find a sample project.
Someone should start one and open it up to be forked, extended, and updated.

I found another related project in a post on this forum:
https://github.com/btc-room101/bitcoin-rnn