• 21 Posts
  • 1.09K Comments
Joined 3 years ago
Cake day: July 1st, 2023

  • Obviously I don’t want the human race to go extinct, but if there were a choice among inevitable outcomes where we do, building an inimical superintelligence at least implies agency, not carelessness.

    Anyway, Big Yud’s fantasy of a precisely timed, diamondoid-bacteria-delivered killshot to every human being at the same time might sound terrible, but from the victims’ sensory perspective it’s basically suffering-free. You go about your day, BAM, nothingness. Maybe there’s a difference in timing so you see your partner keel over a split second before you die - again, you might not even realize what is happening.

    I am not sure where this idea of a global, instantaneous, simultaneous genocide comes from - maybe a tit-for-tat escalation to counter every argument against shutting down the robot god - but from a storytelling perspective it’s pretty useless. There’s no drama where the survivors lament their loss or brood over what might have been. It’s just a plug being pulled on the simulation.

    (weirdly, it’s also the logical outcome of asking a computer to “end human suffering”, a bit like the Robobrain logic in Fallout 4’s Automatron expansion, but I doubt it’s meant that way)