

Artificial neural networks are now everywhere because they are so useful: they do everything from predicting the weather to writing essays and translating them into Spanish. Their abilities have exploded over the last dozen years largely because the networks have grown far bigger. Unfortunately, bigger neural networks require more energy to train and run. Our brains, by contrast, use only about a fifth of the calories we eat, yet they learn and perform a far greater variety of tasks. How does the network of neurons that comprises the brain manage all this at such low energy cost? The answer is that it does not use a computer: it learns on its own. Each neuron updates its connections without knowing what all the other neurons are doing. We have developed an approach to learning that shares this key property but is far simpler than the brain's. Our approach exploits physics to learn and perform tasks for us. Using it, we have built physical systems that learn and perform machine learning on their own. Our work establishes a new paradigm for scalable learning.
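The idea that each connection updates using only information available at its own two endpoints can be sketched in code. The snippet below is a minimal illustration, not the physical learning rule used in this work: it assumes a generic Hebbian-style update (Oja's rule) as a stand-in, where every weight changes based solely on the activity of the two neurons it links, with no global coordination.

```python
import numpy as np

def local_learning_step(W, x, lr=0.01):
    """One decentralized update (hypothetical illustration).

    Each weight W[i, j] changes using only local quantities: the input
    activity x[j] and the output activity y[i] of the two neurons it
    connects. No weight needs to know the state of the rest of the network.
    """
    y = np.tanh(W @ x)  # post-synaptic activity, computed locally per neuron
    # Oja's rule: Hebbian growth (outer product of local activities)
    # plus a local decay term that keeps the weights bounded.
    dW = lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
    return W + dW

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 8))  # 4 neurons receiving 8 inputs
x = rng.normal(size=8)                  # one input pattern
W = local_learning_step(W, x)
print(W.shape)  # (4, 8)
```

The key point the sketch captures is locality: the update for `W[i, j]` depends only on `y[i]`, `x[j]`, and `W[i, j]` itself, which is what lets such rules run in parallel without a central processor.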