Can we measure the amount of written human knowledge with the size of trained LLMs?
GPT-4’s trained model (weights + structure) is said to be a few terabytes in size. Considering how much the model can do and how much data it’s been fed, is it fair to say that all written human knowledge — once compressed and generalized — fits into a…
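For context, here's a rough back-of-envelope sketch of where a "few terabytes" figure could come from. OpenAI hasn't published GPT-4's parameter count, so the numbers below are assumptions for illustration only: on-disk size is roughly parameter count times bytes per parameter.

```python
def model_size_tb(num_params: float, bytes_per_param: float = 2.0) -> float:
    """Approximate on-disk size in terabytes for a dense model stored at fp16."""
    return num_params * bytes_per_param / 1e12

# Hypothetical parameter counts (GPT-4's true size is not public):
for label, params in [("1 trillion params", 1e12), ("2 trillion params", 2e12)]:
    print(f"{label}: ~{model_size_tb(params):.1f} TB at fp16")
# 1 trillion params: ~2.0 TB at fp16
# 2 trillion params: ~4.0 TB at fp16
```

So any dense model in the trillion-parameter range lands in the low single-digit terabytes at 16-bit precision, which is where the "few terabytes" estimate comes from; the training data it was distilled from is orders of magnitude larger.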