The recent boom in generative AI traces back to the paper "Attention Is All You Need," which introduced the transformer architecture in June 2017. It took roughly a year for the architecture to start being scaled up, with the release of GPT-1 in June 2018, and roughly four more years after that before ChatGPT was first released (November 2022). If we apply the log rule, given the new advances in processing power and the current AI boom, the next comparable leap will probably come 3-4 years after the initial release of the next foundational paper. BUT we don't know whether that paper has even been published yet, so until then all we can do is wait.
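For reference, here's a quick back-of-the-envelope sketch in Python of the gaps between those milestones (exact days are approximate, and the 3-4 year figure above is my own extrapolation on top of these numbers, not something this calculation derives):

```python
from datetime import date

# Approximate dates of the milestones mentioned above
attention_paper = date(2017, 6, 12)   # "Attention Is All You Need" posted to arXiv
gpt1_release    = date(2018, 6, 11)   # GPT-1 paper/announcement
chatgpt_release = date(2022, 11, 30)  # ChatGPT public launch

# Gaps in years between each step
paper_to_gpt1   = (gpt1_release - attention_paper).days / 365.25
gpt1_to_chatgpt = (chatgpt_release - gpt1_release).days / 365.25

print(f"Paper  -> GPT-1:   {paper_to_gpt1:.1f} years")    # ~1.0
print(f"GPT-1  -> ChatGPT: {gpt1_to_chatgpt:.1f} years")  # ~4.5
print(f"Paper  -> ChatGPT: {paper_to_gpt1 + gpt1_to_chatgpt:.1f} years")  # ~5.5
```

So the full paper-to-product cycle last time was about five and a half years; the bet here is that faster hardware and far more investment compress that to 3-4 years for the next one.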