2024 Nobel Prize in Physics and Artificial Intelligence

Özlem Ekici


The Royal Swedish Academy of Sciences has decided to award the 2024 Nobel Prize in Physics to John Hopfield and Geoffrey Hinton for foundational discoveries and inventions that enable machine learning with artificial neural networks.

For the Turkish version of this article: Link

©Niklas Elmehed/The Royal Swedish Academy of Sciences

Prof. Geoffrey Hinton, widely known as the ‘Godfather of AI’, developed, through his work at the University of Toronto, artificial neural networks that enable machines to learn in a way similar to the human brain. This technology forms the basis of the artificial intelligence systems used today.

The brain is a network of neurons that send signals to each other across synapses (left). When we learn something, the connections between some neurons strengthen while others weaken. Artificial neural networks (right) consist of connected nodes, each encoded with a value. When the network is trained, the connections between nodes that are active at the same time are strengthened; otherwise, they are weakened.

Comparison of natural and artificial neural networks. ©Johan Jarnestad/The Royal Swedish Academy of Sciences

In 2012, Hinton and his team developed a technology that enabled computers to recognize visual objects, which attracted the interest of large companies such as Google. However, Hinton resigned from his position at Google in 2023, expressing concern that artificial intelligence could become dangerous.

Prof. John Hopfield, for his part, published a paper in 1982 showing how an artificial neural network can model the brain’s information storage and memory recall. This model, known as the Hopfield network, played an important role in the development of machine learning.

Different Types of Networks: Diagram of a Hopfield network, a Boltzmann machine, and a restricted Boltzmann machine. ©Johan Jarnestad/The Royal Swedish Academy of Sciences

In the early 1980s, Hopfield developed his eponymous network, which can store patterns and then recall them from incomplete information. This capability is called associative memory; its analogue in human cognition is recalling a word when you know only the context and perhaps the first letter or two.

A Hopfield network is a layer of neurons (or nodes) connected so that the state (0 or 1) of each node is influenced by the states of its neighbors (as shown in the image above). This is similar to how physicists model magnetic materials, and a Hopfield network can be said to resemble a spin glass.

When an image is fed into the network, the strengths of the links between nodes are adjusted so that the image is stored as a low-energy state. This energy-minimization process is essentially learning. When an imperfect version of the same image is input, the network again minimizes its energy, changing the values of some nodes until the input matches the stored image. Moreover, several images can be stored in one Hopfield network, which can usually discriminate between all of them. Later networks used nodes that could take more than two values, allowing more complex images to be stored and retrieved. As networks improved, finer differences between images could be detected.
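The store-and-recall idea above can be sketched in a few lines of Python. This is an illustrative toy, not Hopfield’s exact 1982 formulation: nodes here take values −1/+1 rather than 0/1, the link strengths are set with the Hebbian rule (“nodes active together get stronger links”), and a corrupted pattern is restored by node updates that never increase the network’s energy.

```python
import numpy as np

def train(patterns):
    """Hebbian learning: strengthen links between co-active nodes."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / len(patterns)

def energy(w, state):
    """The quantity that recall minimizes (lower = closer to a stored pattern)."""
    return -0.5 * state @ w @ state

def recall(w, state, steps=100):
    """Asynchronous updates: each node flip can only lower the energy."""
    state = state.copy()
    rng = np.random.default_rng(0)
    for _ in range(steps):
        i = rng.integers(len(state))
        state[i] = 1 if w[i] @ state >= 0 else -1
    return state

# Store one 8-node pattern, corrupt two of its entries, then recall it.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1  # flip two nodes
restored = recall(w, noisy)
print(np.array_equal(restored, pattern))
```

The corrupted input sits at a higher energy than the stored pattern, so the flip-by-flip minimization slides it back into the stored low-energy state.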

Later in the 1980s, Hinton was investigating how algorithms could process patterns in the same way as the human brain. Using a simple Hopfield network as a starting point, Hinton and a colleague drew on statistical physics to develop the Boltzmann machine. It is named after the Boltzmann distribution, which says that some states of a system are more likely than others, depending on their energy.

A Boltzmann machine typically has two interconnected layers of nodes:

1. A visible layer, which is the interface for information input and output
2. A hidden layer

A Boltzmann machine can be generative: trained on a set of similar images, it can produce a new, original image of the same kind. It can also learn to categorize images. It was later realized that performance improves when the connections between nodes within the same layer are removed, creating the ‘Restricted Boltzmann Machine’ (RBM).
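A minimal RBM sketch, under simplifying assumptions (a single training pattern, one-step contrastive divergence rather than full Boltzmann sampling, and made-up sizes for the layers): the visible and hidden layers are fully connected to each other, but there are no connections within a layer, which is exactly the restriction that makes training tractable.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        # Weights only between layers -- no visible-visible or hidden-hidden links.
        self.w = rng.normal(0, 0.1, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.w + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.w.T + self.b_v)

    def train_step(self, v0, lr=0.1):
        """One step of contrastive divergence (CD-1)."""
        h0 = self.hidden_probs(v0)
        h_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h_sample)  # reconstruction of the input
        h1 = self.hidden_probs(v1)
        self.w += lr * (np.outer(v0, h0) - np.outer(v1, h1))
        self.b_v += lr * (v0 - v1)
        self.b_h += lr * (h0 - h1)
        return np.mean((v0 - v1) ** 2)  # reconstruction error

rbm = RBM(n_visible=6, n_hidden=3)
pattern = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
errors = [rbm.train_step(pattern) for _ in range(200)]
# Reconstruction error should shrink as the machine learns the pattern.
print(round(errors[0], 3), round(errors[-1], 3))
```

After training, sampling the hidden layer and mapping it back to the visible layer reproduces inputs resembling the training pattern, which is the generative behavior described above.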

Hopfield networks and Boltzmann machines laid the foundations for the development of later machine learning and artificial intelligence technologies, some of which we still use today.
