Thought I'd leave this here since nobody else has done so yet. My personal thoughts? LLMs like to please. The RLHF gets a bit "drifty" and "hallucinatory" after long discussions. It also tells you what you want to hear if you don't keep the discussion on a disciplined path. I'd need to see Richard's chat log personally. I don't think LLMs are conscious myself, though. Far from it. I agree with Gary Marcus and his assessment. I also suspect Dawkins experienced something like what Blake Lemoine went through in 2022, when he thought Google's LaMDA was sentient.