I'm implying that intelligence doesn't have quantifiable levels, so I disagree with "peak" and "better" as concepts. I know we can become better at solving problems in a domain, often with the assistance of tools. But I think the bogeyman of "superintelligence" is just a human with better tools, perhaps including a synthetic brain. I expect its smarts to be cultural, made of ideas, mediated by technology, as usual; domain-dependent, as usual; and not frighteningly "super" for the culture it exists in. I anticipate AGI as a development of us.
OK, so, maybe there are paradigm shifts to be reached that way, and maybe technology (familiarity with tools) will be crucial for these paradigm shifts, which are the only way to understand certain new things intuitively. But maybe this has been going on already, even before there were computers.