Professor and ECE Department Chair

Director; Smart Electric Power Systems Laboratory

Office Hours and Appointments: Click Here

Video Tour of Armstrong/ECE: Click Here

Download Course Materials Below

Lecture Materials

Written Lecture Materials by Deese

PowerPoint Lecture Materials by Deese

Back Propagation Learning Notes by Deese

Materials Referenced in Lectures

Example of Pattern Recognition via Perceptron Learning

Example of Back Propagation Learning Algorithm

TensorFlow Neural Networks Online Tool

Design Projects / Assignments

Project #1: Introduction to Supervised Perceptron Learning

Project #2: Pattern Recognition via Perceptron

Project #3: Back Propagation Learning for Multi-Layer ANN

Modules (Materials By Week)

Note that any video longer than 30 minutes should be considered optional.

Module #1: Introduction to Artificial Neural Nets

Videos below are a good introduction to Neural Networks...

The Art of Artificial Neural Networks on YouTube

What is a Neural Network? on YouTube

Videos below introduce learning in feed-forward networks...

Neural Networks Demystified on YouTube

Feed-Forward Neural Networks on YouTube

The video below builds upon the two above...

Gradient Descent to Solve Neural Networks on YouTube

Module #2: Artificial Neural Networks

These documents elaborate upon our lecture material...

A Simple Explanation of Artificial Neural Networks

Artificial Intelligence at Facebook

Videos below introduce practical applications of learning...

Machine Learning for Video Games on YouTube

Machine Learning Methods on YouTube

The video below introduces the back-propagation algorithm...

Back-Propagation Learning Algorithms on YouTube

Module #3: Logic Networks

Representing the XOR Gate with Neural Nets on YouTube

Finite State Machines Explanation on YouTube

Explanation of Harmonic Analysis on YouTube

Module #4: Weighted Networks - The Perceptron

Artificial Intelligence with Perceptrons on YouTube

Perceptron Training on YouTube

How We Teach Computers to Understand Pictures on YouTube

Module #5: Perceptron Learning Algorithms

Nonlinear Classification on YouTube

Unsupervised vs. Supervised Learning on YouTube

Supervised Learning on YouTube

What is a Markov Chain? on YouTube

Module #6: Deep-Learning Artificial Neural Nets

ELC470: Artificial Neural Networks

Catalog Information

Course Units: 1.0

Note that this syllabus is subject to change during the semester. Updates will be communicated to students and updated here.

Course Description

The objective of this course is to provide students with a sound and comprehensive understanding of artificial neural networks and machine learning. Topics include the McCulloch-Pitts model, activation functions, feed-forward and feed-back network structures, approximation of nonlinear functions, supervised and unsupervised machine learning algorithms, logic networks, recurrent networks, finite automata, finite state machines, harmonic analysis, weighted networks, pattern recognition, linear separability, perceptron learning algorithms, accelerating convergence, Markov decision processes, dynamic programming, and deep-learning techniques.

Primary Textbook

Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, The MIT Press, 2016.

ISBN-13: 978-0262035613

Course Objectives*

Objective #1: to provide an introduction to the field of artificial neural networks and machine learning;

Objective #2: to teach students how to solve practical problems by implementing these techniques in simulation;

Objective #3: to promote further independent learning on the topics of artificial neural networks and machine learning;

Evaluation / Grading

1. Projects (65%)

2. Midterm (10%) and Final Exams (15%)

3. Homework and Participation (10%)

Course Topics

Module #1: Introduction to Artificial Neural Networks

a. neural computation

b. models of computation

Module #2: Artificial Neural Networks

a. McCulloch-Pitts model

b. different network structures

c. approximation of nonlinear phenomena

d. MCP error-correction-based learning
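To make the McCulloch-Pitts model concrete, here is a minimal sketch (illustrative only; the weights and thresholds are our own choices, not taken from the lecture material). An MCP unit sums binary inputs against fixed weights and fires when a threshold is reached:

```python
# A McCulloch-Pitts (MCP) unit: binary inputs, fixed weights, hard threshold.
def mcp_neuron(inputs, weights, threshold):
    """Fire (output 1) iff the weighted input sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# With unit weights, the threshold alone selects the logic function:
AND = lambda x1, x2: mcp_neuron((x1, x2), (1, 1), 2)  # both inputs must be active
OR  = lambda x1, x2: mcp_neuron((x1, x2), (1, 1), 1)  # any active input suffices
```

Lowering the threshold from 2 to 1 turns the same unit from an AND gate into an OR gate, which is the key idea behind threshold logic.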

Module #3: Logic Networks

a. boolean functions

b. feed-forward vs. recurrent networks

c. finite automata and finite state machines

d. harmonic analysis via Hadamard-Walsh Transform
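XOR is the classic Boolean function that no single threshold unit can compute, but a small two-layer network of threshold units handles it. A minimal sketch (the weights and thresholds here are illustrative choices, not from the course material):

```python
def unit(inputs, weights, threshold):
    """A single threshold unit: fire iff the weighted sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def xor(x1, x2):
    # Hidden layer: one unit detects "at least one input on" (OR),
    # another detects "both inputs on" (AND).
    h_or = unit((x1, x2), (1, 1), 1)
    h_and = unit((x1, x2), (1, 1), 2)
    # Output unit: fire if OR is on but AND is off (the -2 weight vetoes AND).
    return unit((h_or, h_and), (1, -2), 1)
```

The negative weight on the AND unit acts as an inhibitory connection, which is exactly why a hidden layer resolves the linear-inseparability of XOR.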

Module #4: Weighted Networks - The Perceptron

a. pattern recognition via Perceptron with example

b. limitations of the perceptron

c. linearly separable functions

Module #5: Perceptron Learning Algorithms

a. learning algorithm types (supervised vs. unsupervised)

b. vector notation

c. algorithmic learning

d. Markov Decision Processes and Dynamic Programming
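The supervised perceptron learning rule from this module can be sketched in a few lines of Python (a minimal illustration with our own choice of data and learning rate; the course projects use more substantial datasets):

```python
def train_perceptron(samples, epochs=20, lr=1.0):
    """Classic perceptron rule: on each misclassification, nudge the
    weights toward (target=1) or away from (target=0) the input vector."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = int(w[0] * x[0] + w[1] * x[1] + b >= 0)
            err = target - pred          # -1, 0, or +1
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Learn the (linearly separable) AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x: int(w[0] * x[0] + w[1] * x[1] + b >= 0)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero errors in finitely many updates; on an inseparable target such as XOR it would cycle forever.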

Module #6: The Back Propagation Learning Algorithm

a. multi-layer perceptron networks

b. alternative activation functions (e.g. sigmoid)

c. back propagation learning algorithm theory

d. implementation of back propagation with example
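The chain-rule bookkeeping of back-propagation can be shown on a tiny fixed network (a sketch, assuming a 2-2-1 architecture with sigmoid activations and squared-error loss; the weight values are arbitrary illustrations):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    """2-2-1 network: hidden activations a1, scalar output a2."""
    z1 = [sum(w1[j][i] * x[i] for i in range(2)) for j in range(2)]
    a1 = [sigmoid(z) for z in z1]
    z2 = sum(w2[j] * a1[j] for j in range(2))
    return a1, sigmoid(z2)

def backprop_grads(x, target, w1, w2):
    """One backward pass for the loss L = (a2 - target)^2 / 2."""
    a1, a2 = forward(x, w1, w2)
    delta2 = (a2 - target) * a2 * (1 - a2)            # output error signal
    g_w2 = [delta2 * a1[j] for j in range(2)]          # dL/dw2[j]
    delta1 = [delta2 * w2[j] * a1[j] * (1 - a1[j])     # hidden error signals
              for j in range(2)]
    g_w1 = [[delta1[j] * x[i] for i in range(2)] for j in range(2)]
    return g_w1, g_w2
```

A useful sanity check (and a good debugging habit for the projects) is to compare these analytic gradients against central finite differences of the loss; they should agree to several decimal places.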

Module #7: More Advanced Neural Network Topics Overview

a. clustering (ch5)

b. k-means and k-nearest neighbors (ch5)

c. PCA (ch5)

d. one vs. two-layer networks (ch6)

e. overfitting vs. underfitting (ch6)

f. gradient descent (ch7)

g. momentum (ch8)

h. initial weight selection (ch8)

i. data decorrelation (ch8)

j. complexity theory (ch10)

k. associative memories (ch12)
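The clustering item above (k-means, covered in ch5) reduces to a short loop. A minimal sketch of Lloyd's algorithm (the data and initial centers below are our own toy example):

```python
def kmeans(points, centers, iters=10):
    """Lloyd's algorithm: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda k: sum((a - b) ** 2
                                            for a, b in zip(p, centers[k])))
            clusters[nearest].append(p)
        centers = [tuple(sum(coords) / len(c) for coords in zip(*c))
                   if c else centers[k]          # keep an empty cluster's center
                   for k, c in enumerate(clusters)]
    return centers
```

Note that the result depends on the initial centers; practical implementations run several random restarts and keep the lowest-distortion solution.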

Module #8: Convolutional Networks

a. history, MNIST, ImageNet

b. applications including image, video, and speech processing

c. convolutional layers

d. rectified linear units

e. pooling

f. stride

g. feature maps

h. MSE vs. Softmax cost

i. LeNet, AlexNet, GoogLeNet, and ResNet

j. network design

k. localization

l. visualization

m. refinement learning

n. training loss and error
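The convolution, ReLU, and pooling layers listed above compose into a pipeline. A bare-bones sketch on plain Python lists (illustrative only; real CNN layers add padding, stride options, and many channels):

```python
def conv2d(image, kernel):
    """'Valid' convolution as CNN libraries define it (i.e. cross-correlation:
    the kernel is slid without flipping)."""
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(image[r + i][c + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(len(image[0]) - kw + 1)]
            for r in range(len(image) - kh + 1)]

def relu(fmap):
    """Rectified linear unit applied elementwise to a feature map."""
    return [[max(0, v) for v in row] for row in fmap]

def max_pool2(fmap):
    """2x2 max pooling with stride 2: keep the strongest response per patch."""
    return [[max(fmap[r][c], fmap[r][c + 1], fmap[r + 1][c], fmap[r + 1][c + 1])
             for c in range(0, len(fmap[0]) - 1, 2)]
            for r in range(0, len(fmap) - 1, 2)]
```

For example, sliding the 1x2 kernel [[-1, 1]] over an image with a vertical edge produces a feature map that responds only at the edge, and pooling then shrinks that map while keeping the response.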

Module #9: Adaptive Filtering (time permitting)

Projects

Project #1: Introduction to Supervised Perceptron Learning

Project #2: Pattern Recognition via Perceptron

Project #3: Back Propagation Learning for Multi-Layer ANN

Project #4: Dynamic Programming Approach to Traveling Salesman Problem Implemented as Markov Decision Process

Project #5: Clustering and PCA

Project #6: Least Squares Polynomial Curve Fitting

Project #7: Associative Memory

Project #8: CNN

Performance Criteria**

Same as course objectives.

Contribution

Engineering Science (75%)

Engineering Design (25%)

* Lower case letters in brackets refer to Educational Objectives of the department.

** Capital letters in brackets refer to evaluation methods used to assess student performance.