I wonder if this is an Ian Malcolm in Jurassic Park situation, i.e. “your scientists were so preoccupied with whether they could, they didn't stop to think if they should”.
Maybe the only way to avoid an unsafe superintelligence is to not create a superintelligence at all.