Discussion about this post

Jennie Pakula:

I can't help but think that an essential link has been missed in the logic of your argument. You say this: 'Douglas Hofstadter, in Gödel, Escher, Bach, explores how the layering and recursion of symbolic systems can give rise to emergent structures of intelligence. Hofstadter uses fugues and visual illusions to highlight how the mind can perceive deeper levels of meaning that are missed by a narrower interpretative lens. But there’s a limit; beyond a certain threshold, the layers become so numerous and interwoven that they might exceed human capacity to parse them. Like trying to watch 8K footage on a 720p screen, we can’t make use of all the additional data.'

Then you go on with an argument that seems to assume the additional layers of data are all that are needed, without addressing the question of whether AI has the complex symbolic reasoning that we employ, often without ourselves understanding the full extent to which this reasoning contributes to our construction of ever deeper layers of meaning. AI doesn't know anything. It doesn't operate like a mind. Its patterns may end up being completely meaningless both to us and to the AI itself, because it doesn't care about meaning, or indeed about anything at all.

