
Max Tegmark says building AGI and superintelligence without regulation is civilizational suicide.
Humanity may not lose control the moment superintelligence arrives, but that is when the fall begins.
"We're closer to building it than controlling it."
If we create machines that outthink us in every way, the default outcome is that we lose.