I used to think that AIs just give the illusion of understanding, and don’t really understand anything. Now I’m not so sure. A short essay I wrote on misremembering, hallucinations, meaning and understanding

We now know that our brains construct memories from available information when we remember something. When an AI hallucinates, is that similar to when humans misremember? And do AIs really understand things, or do they just give the illusion of understanding?

https://notes.beyond2060.com/ai/On%20misremembering%20and%20AI%20hallucinations.html

submitted by /u/james-johnson