Princeton University and Institute for Advanced Study | Princeton, New Jersey
For applying concepts of theoretical physics to provide new insights into important biological questions in a variety of areas, including neuroscience and genetics, with significant impact on machine learning, an area of computer science.
It’s official. Our technology has begun to learn from the world around it. As you read this, someone’s smartphone is teaching itself to recognize its owner’s face. A self-driving car is learning to drive by watching us behind the wheel. The origin of this “machine learning” revolution can be traced back to John Hopfield, whose career is as fascinating as the technologies his ideas helped foster.

Hopfield earned his Ph.D. in physics but is now a professor of molecular biology at Princeton. Over time, his research meandered from hard physics to neuroscience, where he applied his know-how from the former to construct an artificial neural network capable of modeling certain functions of the human brain. Just as hearing a song can bring back the memory of an old friend, Hopfield’s neural network allowed machines to store simple memories and, like our brains, recall them with only partial information. Decades later, these fundamental concepts have helped to unleash the tide of “deep learning” technologies that allow machines to observe, remember, and learn on their own.
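The idea of recalling a stored memory from partial information can be made concrete with a small sketch of a Hopfield-style associative memory. This is a minimal illustration, not Hopfield's exact published formulation: patterns of +1/-1 units are stored with the Hebbian rule, and a corrupted cue is cleaned up by repeatedly setting each unit to the sign of its weighted input. All names here (`train`, `recall`, `memory`, `cue`) are illustrative.

```python
# Minimal sketch of a Hopfield-style associative memory.
# Store bipolar (+1/-1) patterns via the Hebbian rule, then recover a
# stored pattern from a corrupted cue by iterated threshold updates.

def train(patterns):
    """Hebbian weights: w[i][j] = sum over patterns of x_i * x_j, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=10):
    """Asynchronous updates: each unit becomes the sign of its weighted input."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

memory = [1, -1, 1, -1, 1, -1, 1, -1]  # one stored pattern
w = train([memory])

cue = list(memory)
cue[0], cue[3] = -cue[0], -cue[3]      # corrupt two of the eight units

print(recall(w, cue) == memory)        # the network completes the pattern: True
```

The key property on display is exactly the one the text describes: the partial (corrupted) cue falls into the "basin of attraction" of the stored pattern, so the update dynamics pull the state back to the full memory.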