Board: Politics & Society
Re: Is a Madmax outcome coming before 2020? Thus do we need anonymity?
by CoinCube on 30/01/2014, 01:32:20 UTC
TiagoTiago, you are envisioning the robots we see in science fiction movies like Terminator and Star Trek.
Let me go over each of your points and explain why I think they are very improbable.

Quote
Humans wouldn't take kindly to robots stealing their ore, messing with their electric grid etc, nor to viruses clogging their cat-tubes. But by then, they would already have advanced so much that we would at most piss them off; and it doesn't sound like a good idea to attract the wrath of a vastly superior entity.
You have this backwards. Advanced AI of the type you are envisioning would never need to steal any of these things. They would be the most productive part of society and would thus control and create most of the wealth. They might even hire humans to do much of the mundane work of resource production, because it is manual labor: boring and not really worth their time. More likely it would be humans doing the stealing.

Quote
The difference from neanderthals is a post-singularity AI would be self-improving, and would do so at timescales that are just about instant when compared to organic evolution or even human technological advancements.
Not in the sense you are thinking. Such short-timescale evolution would only matter in virtual simulation, with little relevance to the real world. Anything that affects the real world would still require empirical physical testing to confirm, and would thus involve significant cost.

Quote
If combination is better, we would be assimilated; resistance would be futile.
You misunderstand... my point was not about cyborgs. The question you need to ask is whether a combination of AI and human society (meaning the combined productive/creative/work output of these societies summed together) would exceed that of a society of AI without any humans at all. As we would occupy very different ecological niches, the answer is yes.

Quote
The difference is a post-singularity AI would be able to increase it's effective population much faster than humans, while at the same time improving the efficiency of its previously existing "individuals".
The "efficiency" you describe would be efficiency/adaption to survive in their enviornment mainly the net/web or whatever we have at that time. AI would not be immune to competitive pressures and would likely spend most of its time competing with other AI over whatever AI decides is important to it (data space/memory/who knows). They are not likely to be interested in being efficient humans.

Quote
The AI could for example decide it would be more efficient to convert all the biomass on the planet into fuel, or wipe all the forests to build robot factories, or cover the planet with solarpanels etc. Using-up-all-the-resources-of-the-planet seems like a very likely niche; humans themselves are already aiming to promote themselves up on the Kardashev scale in the long term...

FUD. AI superior to us would be intelligent enough to see the value in sustainability. Destroying a resource you cannot replace is unwise. AI could also decide to call itself Tron and make all humans play motorcycle and Frisbee games in duels to the death. The key question is what is likely.

I highly recommend reading The Golden Age by John C. Wright. It is sci-fi that explores a future with vastly superior AI. It is a reasonable guess at what a future filled with a race of superior AI beings might look like, and it is a fun read.