Think of the scent of roses. What memories does it conjure? A leisurely walk in a garden, someone wearing a rosy perfume? In the blink of an eye, you’re able to travel in time through your experiences. The rose leads you to your memories through associations.
You’re able both to form memories of the things you experience (episodic memory) and to memorize new facts detached from personal experience: knowing that the Japanese rose is Rosa multiflora in Latin is a case of semantic memory.
While an artificial neural network certainly does not experience roses the way you do, it can fetch content from its memory in a somewhat similar fashion: if content associated with your search cue “cat” is stored in the network, it can return all cat-related instances.
In computer science terms, this is called content-addressable memory. Traditional computer storage, by contrast, can fetch a memory instance only if it is given that instance’s exact location.
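The contrast can be sketched in a few lines of Python. This is a toy illustration, not how a neural network actually stores memories; the stored strings and the cue “cat” are made up for the example.

```python
# Toy memory store: a plain list of remembered "instances".
memories = [
    "a cat sleeping on a windowsill",
    "a dog chasing a ball",
    "a cat chasing a laser pointer",
]

# Location-addressable access: you must already know the exact
# address (here, the list index) of the instance you want.
first = memories[0]

# Content-addressable access: a fragment of the content itself
# serves as the key, and every matching instance is returned.
def recall(cue, store):
    return [m for m in store if cue in m]

cat_memories = recall("cat", memories)
```

Querying with `recall("cat", memories)` returns both cat-related instances, without needing to know where either one is stored.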
In the history of science, the idea of associations (information grouped together by similarity or co-occurrence) takes us back to ancient Greece: its centrality to human thought was already noted by Aristotle.
During the Enlightenment in the 18th century, David Hume revisited the idea, laying a cornerstone for the later development of psychology as a field of study. In recent decades, the idea of associations was put into numbers with the development of Hebbian learning and related algorithms, such as self-organizing maps.
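Hebbian learning puts “association” into numbers with a simple rule: a connection strengthens when the units on both of its ends are active at the same time (“cells that fire together wire together”). A minimal sketch, with an illustrative learning rate and made-up activity patterns:

```python
import numpy as np

eta = 0.1                        # learning rate (illustrative choice)
w = np.zeros((3, 3))             # weights from 3 input units to 3 output units

x = np.array([1.0, 0.0, 1.0])    # presynaptic (input) activity
y = np.array([1.0, 1.0, 0.0])    # postsynaptic (output) activity

# Hebbian rule: dw_ij = eta * y_i * x_j
# Weight w[i, j] grows only where input j and output i are active together.
w += eta * np.outer(y, x)
```

After one update, only the connections between co-active pairs (inputs 0 and 2, outputs 0 and 1) have nonzero weight; all others stay at zero.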