Finding Structure in Time, Jeffrey L. Elman, 1990, Cognitive Science, Vol. 14 (Wiley), DOI: 10.1207/s15516709cog1402_1 - A foundational paper introducing Simple Recurrent Networks (SRNs), demonstrating how a neural network can develop internal representations of sequential structure and context over time, thereby laying much of the groundwork for iterative sequence processing.
Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - This authoritative textbook provides a comprehensive introduction to recurrent neural networks, clearly explaining their fundamental structure, the concept of hidden states, and how they process sequential data iteratively.
Recurrent Neural Networks and Language Models, Christopher Manning, 2023 (Stanford University) - Lecture slides from a widely used course, offering clear explanations and visual representations of RNNs, their iterative processing, and the role of the hidden state in maintaining context for natural language processing tasks (a minimal sketch of this hidden-state update follows this list).
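To make the idea shared by these references concrete, here is a minimal sketch of the hidden-state update at the core of a simple (Elman-style) recurrent network. The dimensions, random initialization, and names (W_xh, W_hh, srn_step) are illustrative assumptions, not taken from any of the works above.

```python
import numpy as np

# A minimal sketch of one Elman-style SRN step, with toy dimensions and
# randomly initialized weights (illustrative only, not from the references).
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8

W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (context)
b_h = np.zeros(hidden_size)

def srn_step(x_t, h_prev):
    """Compute the new hidden state from the current input and previous hidden state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Iterate over a short sequence, carrying the hidden state (the context) forward.
sequence = rng.standard_normal((5, input_size))
h = np.zeros(hidden_size)
for x_t in sequence:
    h = srn_step(x_t, h)

print(h.shape)  # (8,) -- the final hidden state summarizes the sequence so far
```

The same recurrence (new hidden state as a function of the current input and the previous hidden state) is what the textbook chapter and lecture slides above develop in full, including how it is trained with backpropagation through time.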