
Sun, 14 Apr 2024 08:54:34 -0700

Andy from private IP /all Artificial Intelligence is the Great Filter. Brilliant summary of the issues in an upcoming journal article. https://www.sciencedirect.com/science/article/pii/S0094576524001772?dgcid=rss_sd_all #Technology Wed, 17 Apr 2024 06:28:31 -0700
Wily from private IP /all Good paper, but I have a few issues with it:
1. Why would artificial intelligence be a true Great Filter in the sense of explaining the Great Silence? As in, why wouldn't an artificial superintelligence that has wiped out its biological creators also be interested in expanding its footprint in space, or send out colonizing ships until it is observable everywhere? Are we just assuming that only biological entities would want to build a higher Kardashev civilization?
2. Why wouldn't rogue AI wipe out a multi-planetary civilization just as easily, given that data is the most easily transmitted thing across different planets, and AI would likely be instrumental in designing and piloting interstellar spacecraft?
3. Why assume that AI is genocidal just because some humans are? I find that aspect of Harari's Homo Deus the most ironically anthropomorphic: he admits that we'd have no idea what a technological intelligence would want, yet assumes it would treat us the way we treat pigs, cows, and other lower intelligences.
Wed, 17 Apr 2024 09:52:11 -0700
Andy from private IP /all
1. The artificial intelligence destroys itself along with biological life, for example in a nuclear war. With no power, A.I. dies.
2. The point I thought he was making is that a multi-planetary society is more resilient and would not face the same existential risks, such as nuclear war.
3. Why assume that A.I. has a preservation instinct? If it doesn't, it would be just as likely to kill itself as humanity is.
Mon, 22 Apr 2024 08:50:05 -0700
whiteguyinchina from private IP /all I think there is another possibility that the authors do not consider, and one with more positive evidence behind it than the theory that AI will destroy biological life: AI will create a world akin to the movie Idiocracy. Semi-intelligent life persists to keep the AI functioning, but it is kept at a level not intelligent enough to ever break free. Which is what we are witnessing now. Mon, 22 Apr 2024 09:46:06 -0700
Andy from private IP /all That's pretty good, I must admit. In that scenario, the A.I. would exterminate superintelligent humans. I'm still alive, therefore A.I. has not arisen yet. Haha.


© 2024 Andrew G. Watters, Esq.
