When a portion of user data is used in the development of an AI, important and sensitive information can be exposed and misused in the process. How does COVALENT plan to secure user data against that kind of misuse?
Judging from what Cova has proposed, any piece of data can be programmed to behave the way its owner pleases, so the kind of data leakage you describe shouldn't be possible.
Yea, that's exactly the concept: giving every user the ability to use their data as they see fit, rather than being at the mercy of third parties. The current internet protocols are pretty porous and have been abused for years, with no end to the manipulation by those who can access our data, so it's a good thing that a concept like this is coming up.