Composed of three sections, this book presents the most popular training algorithm for neural networks: backpropagation. The first section presents the theory and principles behind backpropagation as seen from different perspectives such as statistics, machine learning, and dynamical systems. The second presents a number of network architectures that may be designed to match the general concepts of Parallel Distributed Processing with backpropagation learning. Finally, the third section shows how these principles can be applied to a number of different fields related to the cognitive sciences, including control, speech recognition, robotics, image processing, and cognitive psychology. The volume is designed to provide both a solid theoretical foundation and a set of examples that show the versatility of the concepts. Useful to experts in the field, it should also be most helpful to students seeking to understand the basic principles of connectionist learning and to engineers wanting to add neural networks in general -- and backpropagation in particular -- to their set of problem-solving methods.
Written for cognitive scientists, psychologists, computer scientists, engineers, and neuroscientists, this book provides an accessible overview of how computational network models are being used to model neurobiological phenomena. Each chapter presents a representative example of how biological data and network models interact in the authors' research. The biological phenomena cover network- or circuit-level phenomena in humans and other higher vertebrates.
Theory; Linguistic analysis; The computer model; Studies of language; Studies of visual perception and problem solving; Extensions.
Surprising tales from the scientists who first learned how to use computers to understand the workings of the human brain. Since World War II, a group of scientists has been attempting to understand the human nervous system and to build computer systems that emulate the brain's abilities. Many of the early workers in this field of neural networks came from cybernetics; others came from neuroscience, physics, electrical engineering, mathematics, psychology, even economics. In this collection of interviews, those who helped to shape the field share their childhood memories, their influences, how they became interested in neural networks, and what they see as its future. The subjects tell stori...
The philosophy of cognitive science has recently become one of the most exciting and fastest growing domains of philosophical inquiry and analysis. Until the early 1980s, nearly all of the models developed treated cognitive processes -- like problem solving, language comprehension, memory, and higher visual processing -- as rule-governed symbol manipulation. However, this situation has changed dramatically over the last half dozen years. In that period there has been an enormous shift of attention toward connectionist models of cognition that are inspired by the network-like architecture of the brain. Because of their unique architecture and style of processing, connectionist systems are gen...
The interdisciplinary field of cognitive science brings together elements of cognitive psychology, mathematics, perception, and linguistics. Focusing on the main areas of exploration in this field today, Cognitive Science presents comprehensive overviews of research findings and discusses new cross-over areas of interest. Contributors represent the most senior and well-established names in the field. This volume serves as a high-level introduction, with sufficient breadth to be a graduate-level text, and enough depth to be a valued reference source to researchers.
Recent years have seen an explosion of new mathematical results on learning and processing in neural networks. This body of results rests on a breadth of mathematical background which even few specialists possess. In a format intermediate between a textbook and a collection of research articles, this book has been assembled to present a sample of these results, and to fill in the necessary background, in such areas as computability theory, computational complexity theory, the theory of analog computation, stochastic processes, dynamical systems, control theory, time-series analysis, Bayesian analysis, regularization theory, information theory, computational learning theory, and mathematical ...
Mind Readings is a collection of accessible readings on some of the most important topics in cognitive science. Although anyone interested in the interdisciplinary study of mind will find the selections well worth reading, they work particularly well with Paul Thagard's textbook Mind: An Introduction to Cognitive Science, and provide further discussion of the major topics covered in that book. The first eight chapters present approaches to cognitive science from the perspective that thinking consists of computational procedures on mental representations. The remaining five chapters discuss challenges to the computational-representational understanding of mind. Contributors John R. Anderson, Ruth M.J. Byrne, E.H. Durfee, Chris Eliasmith, Owen Flanagan, Dedre Gentner, Janice Glasgow, Philip N. Johnson-Laird, Alan Mackworth, Arthur B. Markman, Douglas L. Medin, Keith Oatley, Dimitri Papadias, Steven Pinker, David E. Rumelhart, Herbert A. Simon.
Research in cognitive psychology, linguistics, and artificial intelligence – the three disciplines that have the most direct application to an understanding of the mental processes in reading – is presented in this multilevel work, originally published in 1980, that attempts to provide a systematic and scientific basis for understanding and building a comprehensive theory of reading comprehension. The major focus is on understanding the processes involved in the comprehension of written text. Underlying most of the contributions is the assumption that skilled reading comprehension requires a coordination of text with context in a way that goes far beyond simply chaining together the mean...