PDP: a precursor to modern neural networks?

tech
deep-learning
history
Looking back at the foundational PDP work from 1986 and how its eight core principles anticipate what we now call deep learning.
Author

Alex Strick van Linschoten

Published

May 23, 2021

Parallel Distributed Processing: Explorations in the Microstructure of Cognition, a multi-volume publication by David Rumelhart, James McClelland and the PDP Research Group, was released in 1986 and is recognised as one of the most important works relating to neural networks.

PDP (1986)

The authors lay out eight features necessary to perform what they called ‘parallel distributed processing’ (which I suppose you can think of as a sort of precursor to modern-day deep learning):

- a set of processing units
- a state of activation
- an output function for each unit
- a pattern of connectivity among units
- a propagation rule for propagating patterns of activities through the network of connectivities
- an activation rule for combining the inputs impinging on a unit with the current state of that unit to produce a new level of activation for the unit
- a learning rule whereby patterns of connectivity are modified by experience
- an environment within which the system must operate

I haven’t read the book, and I don’t fully understand all these different pieces, but it isn’t particularly hard to see in these features the pattern of what would later be handled by modern-day neural networks. The vocabulary used to describe it is slightly different, but you have the connectivity between neurons, and you have a process through which the connections are updated from experience…
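To make that mapping concrete, here is a toy sketch of my own (not code from the book, and a deliberate simplification): a single sigmoid unit trained with the delta rule, with comments tagging each of the eight PDP features. The dataset, learning rate, and unit counts are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. A set of processing units: three input units feeding one output unit.
n_in, n_out = 3, 1

# 4. A pattern of connectivity among units: a weight matrix.
W = rng.normal(scale=0.1, size=(n_in, n_out))

# 8. An environment within which the system operates: a tiny dataset,
# where the target is whether the inputs sum to a positive number.
X = rng.normal(size=(8, n_in))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)


def sigmoid(z):
    # 6. An activation rule: combine incoming input into a new activation.
    return 1.0 / (1.0 + np.exp(-z))


lr = 0.5
for _ in range(500):
    # 5. A propagation rule: weighted sums flow along the connections.
    z = X @ W
    # 2./3. A state of activation, and an output function for each unit.
    a = sigmoid(z)
    # 7. A learning rule: modify the connectivity from experience
    # (here, gradient descent on squared error — the delta rule).
    W -= lr * X.T @ ((a - y) * a * (1 - a)) / len(X)

print(float(np.mean((sigmoid(X @ W) > 0.5) == y)))  # training accuracy
```

The point is not the specific algorithm but that every one of the eight ingredients has an obvious counterpart in even the smallest modern network.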

This feels like a book that would reward returning to for a proper in-depth read later on in my studies.