People often use AI for its static knowledge. The most recent model releases appear to have no more static knowledge than the previous generation. Where can I read more about the trade-offs among intelligence, knowledge, and efficiency?

Let's say you have a question for which there is not a prepackaged answer easily found on the web, and for which you cannot easily collect all the required information for determining the answer. Surely all of us have questions like these, maybe related to our jobs or hobbies.

The one I always ask LLMs is, "How did the compositional style of Anton Reicha change over his career?" Reicha was a contemporary of Beethoven who wrote some highly original, mostly forgotten music. But there is absolutely no settled answer to my question, and all models bungle it. There is a lot of academic writing now available on Reicha, but to come up with an answer you would have to read not only that but all kinds of other musicological writing, as well as primary texts. Forget for now that AI can't even read music!

I don't see anyone suggesting that any near-term models will have several orders of magnitude more training data. But even if they did, until you have one of these vaunted agents that can read hundreds of books in a short period of time, how exactly is AI supposed to answer difficult questions like this reliably?

What I really want is to read someone else who is more knowledgeable about AI and better at posing the question I am struggling to pose here.

submitted by /u/thythr