Many people seem to struggle with this, and I think this video explains it pretty well.

Intelligence is, in my opinion, deeply connected to one's understanding of the physical world (which can come simply from watching videos, without the need for a physical body). If you speak to a disembodied chatbot that doesn't understand the physical world, it can't possibly understand abstract concepts like science or math. Science comes from understanding the physical world: we observe phenomena (often over long periods of time, because the world is incredibly complex) and we come up with explanations and theories. Math is a set of abstractions built on top of how we process the world.

When AI researchers like LeCun say that "cats are smarter than any LLM", they aren't referring to "being better at jumping". They are saying that no AI system today, whether it's an LLM, SORA, MidJourney, a physical robot, or even LeCun's own JEPA architecture, understands the world even at the level of a cat.

If you don't understand the physical world, then your understanding of anything else is superficial at best. Any question or puzzle you happen to solve correctly is probably the result of pure pattern matching, without real understanding involved at any point. Abstractions go beyond the physical world, but they can only emerge once the latter is deeply understood.