I used to think that AIs just give the illusion of understanding and don’t really understand anything. Now I’m not so sure. A short essay I wrote on misremembering, hallucinations, meaning, and understanding.
We now know that our brains construct memories from available information when we remember something. When an AI hallucinates, is that similar to a human misremembering? And do AIs really understand things, or do they just give the illusion of understanding?