There is a buzz in AI circles around “capsule networks”, a new variant of neural networks that backers say could simplify, cut the cost of, commoditize and, ultimately, democratize how deep learning systems are trained to do what we want them to do.
How can capsule networks do all this? They hold out the hope of tackling one of the biggest problems in AI: radically reducing the amount of data and compute resources needed to train deep learning systems. That, in turn, means AI could become available to the broader market, no longer confined to a few companies with mammoth compute resources and vast volumes of data – i.e., the FANG companies (Facebook, Amazon, Netflix, Google).
CapsNets are a hot new architecture for neural networks, invented by Geoffrey Hinton, one of the godfathers of deep learning.
In fact, capsule networks were born at Google: researchers Sara Sabour, Nicholas Frosst, and Geoffrey Hinton published a paper on the topic last month. Having read, or tried to read, the abstract, we decided it might be best to ask someone to explain what it all means.