Introduction to Neural Networks

by Nic Schraudolph and Fred Cummins

Our goal is to introduce students to a powerful class of models: the neural network. In fact, "neural network" is a broad term covering many diverse models and approaches. We will first motivate networks by analogy to the brain. The analogy is loose, but it serves to introduce the idea of parallel and distributed computation.

We then introduce one kind of network in detail: the feedforward network trained by backpropagation of error. We discuss model architectures, training methods, and data representation issues. We hope to cover everything you need to know to get backpropagation working for you. A range of applications and extensions to the basic model will be presented in the final section of the module.
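To preview the kind of model the module covers, here is a minimal sketch of a feedforward network trained by backpropagation of error. It is an illustrative example only, not taken from the lectures: the XOR task, the single hidden layer of four sigmoid units, the squared-error loss, and the learning rate are all assumptions made for the sake of a short, self-contained demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Logistic activation function, used for both layers in this sketch
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a classic task that a network with no hidden layer cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for the input->hidden and hidden->output layers
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros(1)

lr = 0.5  # learning rate (an arbitrary illustrative choice)
for epoch in range(5000):
    # Forward pass: compute hidden activations and network output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y  # error signal at the output

    # Backward pass: propagate the error through the sigmoid derivatives
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # trained outputs for the four XOR inputs
```

The two-phase structure here (a forward pass to compute outputs, then a backward pass applying the chain rule layer by layer) is the pattern the following lectures develop in detail.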

Lecture 1: Introduction

Lecture 2: The Backprop Toolbox

Lecture 3: Advanced Topics

By popular demand: Lectures 1 & 2 as a ZIP file; Lecture 3 as a ZIP file.

© 1998-2006 Nic Schraudolph, Fred Cummins, and Jenny Orr. All rights reserved.