Simon Haykin, "Neural Networks: A Comprehensive Foundation", 2nd ed. (repost)

Posted By: TimMa

Simon Haykin, "Neural Networks: A Comprehensive Foundation", 2nd ed.
Publisher: Prentice Hall | 2005 | ISBN: 8178083000 | English | PDF | 823 pages | 39.85 MB

This text represents the first comprehensive treatment of neural networks from an engineering perspective. Thorough, well-organized, and completely up-to-date, it examines all the important aspects of this emerging technology. Neural Networks provides broad coverage of the subject, including the learning process, back-propagation, radial basis functions, recurrent networks, self-organizing systems, modular networks, temporal processing, neurodynamics, and VLSI implementations. Chapter objectives, computer experiments, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary reinforce key concepts. The author's concise and fluid writing style makes the material more accessible.
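Back-propagation, one of the core topics listed above, reduces to a forward pass followed by chain-rule gradient updates. The following is a minimal sketch, not taken from the book; the layer sizes, learning rate, and XOR task are illustrative assumptions:

```python
import numpy as np

# Minimal back-propagation sketch: a one-hidden-layer perceptron
# trained on XOR. All sizes and hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: push the output error back through each
    # sigmoid derivative (the chain rule in matrix form).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```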

For graduate-level neural network courses offered in the departments of Computer Engineering, Electrical Engineering, and Computer Science.

Renowned for its thoroughness and readability, this well-organized and completely up-to-date text remains the most comprehensive treatment of neural networks from an engineering perspective, and it has been thoroughly revised for this edition.
NEW TO THIS EDITION
• NEW: New chapters cover:
• Support vector machines.
• Reinforcement learning/neurodynamic programming.
• Dynamically driven recurrent networks.
• NEW: End-of-chapter problems revised, improved, and expanded in number.
FEATURES
• Extensive, state-of-the-art coverage exposes readers to the many facets of neural networks and helps them appreciate the technology's capabilities and potential applications.
• Detailed analysis of back-propagation learning and multi-layer perceptrons.
• Explores the intricacies of the learning process—an essential component for understanding neural networks.
• Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics (a minimal Hopfield sketch follows this list).
• Integrates computer experiments throughout, giving readers the opportunity to see how neural networks are designed and how they perform in practice.
• Reinforces key concepts with chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary.
• Includes a detailed and extensive bibliography for easy reference.
• Computer-oriented experiments distributed throughout the book.
• Uses MATLAB SE version 5.
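
As a companion to the recurrent-network item in the list above, here is a minimal Hopfield sketch; it is illustrative only and not the book's code. One bipolar pattern is stored with the Hebbian outer-product rule and recalled from a corrupted cue:

```python
import numpy as np

# Minimal Hopfield-network sketch: store a bipolar pattern with the
# Hebbian outer-product rule, then recall it by repeated synchronous
# updates (Hopfield's original rule updates units asynchronously).
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)
n = patterns.shape[1]

# Hebbian weight matrix; zero the diagonal so no unit drives itself.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

state = patterns[0].copy()
state[:3] *= -1                  # corrupt the first three bits
for _ in range(10):              # iterate toward a fixed point
    state = np.sign(W @ state)

print(state)                     # should recover the stored pattern
```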
