When a portion of user data is used in the development of an AI, important and sensitive information can leak in the process and end up being misused. How does COVALENT plan to secure user data against such misuse?
Judging from what Cova has proposed, each piece of data can be programmed to behave the way its owner intends, so going by this, the supposed loss or misuse of data should not be possible.
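As a rough illustration of that idea, here is a minimal sketch of owner-programmed data usage, assuming a simple purpose-based policy attached to each record. The names used (PolicyProtectedData, allowed_purposes, access) are hypothetical and are not taken from Cova's actual protocol or APIs.

```python
# Minimal sketch: data wrapped with a usage policy set by its owner.
# All names here are illustrative assumptions, not Cova's specification.
from dataclasses import dataclass, field


@dataclass
class PolicyProtectedData:
    """Wraps raw data together with a usage policy chosen by its owner."""
    payload: bytes
    owner: str
    allowed_purposes: set = field(default_factory=set)

    def access(self, requester: str, purpose: str) -> bytes:
        # Refuse any use the owner has not explicitly permitted,
        # e.g. training an AI model on sensitive records.
        if purpose not in self.allowed_purposes:
            raise PermissionError(
                f"{requester} may not use this data for '{purpose}'"
            )
        return self.payload


# Usage: the owner permits analytics but not AI training.
record = PolicyProtectedData(
    payload=b"sensitive user record",
    owner="alice",
    allowed_purposes={"analytics"},
)
print(record.access("analytics-service", "analytics"))   # allowed
# record.access("model-trainer", "ai_training")          # would raise PermissionError
```

In this toy version the check happens in ordinary application code; the point is only to show the shape of the idea, where the owner's policy travels with the data and every use is checked against it before the data is released.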