Thoughts
I don’t think superintelligence will exist because I don’t think intelligence is linear. I thought that was pretty clear.
Turing proposed the Turing test as a measure of intelligence, so a computer that was “superintelligent” by this measure would be one that passed the Turing test 100% of the time.
Not exactly world-ending scary.