Homework Assignments

CHE/ECEN/MAE 5733 Neural Networks

(Homeworks may be collected, and/or a quiz may be given on the due date.)


Homework #1

E2.1-E2.6

E3.1, E3.2, E3.3, E3.7

Due Friday, January 27


Homework #2

E4.1, E4.2, E4.5, E4.6, E4.7, E4.10, E4.11

E5.1, E5.3, E5.5, E5.7, E5.8, E5.12, E5.17, E5.18, E5.21

MiniProject #1

It is possible to modify the perceptron learning rule by adding a learning rate, as in w_new = w_old + alpha*e*p', where alpha is the learning rate.  For the default perceptron rule, the learning rate is equal to 1, but other values could also be used. In this miniproject, you will investigate the use of different learning rates.

Write a MATLAB program to implement the perceptron rule with a learning rate. Write the program using matrix operations (as in Equations (4.38) and (4.39)), so that vectors and matrices of any size can be accommodated.
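
For reference, one inner-loop update consistent with the rule above might look like the following. This is a minimal sketch, not a complete program; the names P, T, W, b, and alpha are illustrative.

  % One pass through the training set with learning rate alpha.
  % P is R x Q (input vectors as columns), T is S x Q (target vectors
  % as columns), W is S x R, b is S x 1.
  for q = 1:size(P,2)
      a = double(W*P(:,q) + b >= 0);   % hardlim transfer function
      e = T(:,q) - a;                  % error for this example
      W = W + alpha*e*P(:,q)';         % weight update with learning rate
      b = b + alpha*e;                 % bias update with learning rate
  end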

Make up a pattern recognition problem that can be solved with a perceptron network. Have the input vectors be two-dimensional, and separate these vectors into four different classes. Use at least 12 different input vectors in the training set. Each person will make up their own test problem; no two test problems should be the same. Explain how you made up your test problem.
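
Note that four classes require a perceptron with two hardlim output neurons, so that each class gets a distinct two-bit target code. One possible coding is sketched below; this is illustrative only, and you must invent your own input vectors and class geometry.

  % Illustrative target coding for four classes (12 training vectors,
  % three per class).  The input matrix P is up to you.
  %   class 1 -> [0;0], class 2 -> [0;1], class 3 -> [1;0], class 4 -> [1;1]
  T = [0 0 0  0 0 0  1 1 1  1 1 1;
       0 0 0  1 1 1  0 0 0  1 1 1];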

Using your MATLAB program, train perceptron networks to solve the problem. Try several different learning rates and compare average convergence rates. To find the average convergence rate for a given learning rate, train the network at least 100 times with different initial random weights (use the MATLAB command rand to generate random weights, as in w=rand(1,2)-0.5;). Find the average number of iterations to reach convergence over the 100 trials.
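
The outer loop of this experiment could be organized along these lines. This is a sketch only; train_perceptron stands for whatever routine you write, and is assumed here to return the number of passes through the training set before the error is zero on every example.

  % Average iterations to convergence over 100 random starts.
  iters = zeros(100,1);
  for trial = 1:100
      W = rand(2,2) - 0.5;   % random initial weights (2 neurons, 2 inputs)
      b = rand(2,1) - 0.5;   % random initial biases
      iters(trial) = train_perceptron(W, b, P, T, alpha);  % hypothetical routine
  end
  avg_iters = mean(iters);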

(Do not use the Neural Network Toolbox or any other toolboxes. Write your own program using standard MATLAB commands, so that the program can be run on a computer with only the standard MATLAB installation. Each student should write their own program and make up their own test problem.)

Turn in the following items to the dropbox on the D2L class page (items 1 through 4 in a single PDF file):

  1. Discussion of the results you obtained in your tests. What advantages or disadvantages did you find for different learning rates? (This is the most important part of the miniproject. In addition to tables, plots, and code, explain what you did in each part.)
  2. Listing of all MATLAB code, along with tables showing the number of iterations to convergence for each learning rate.
  3. One set of final weights and biases of the networks.
  4. A plot of the final decision boundaries, demonstrating that the problem is solved. The plot should be generated automatically by your program. Show the input training vectors on the same plot as the decision boundaries, to demonstrate the relationship between the boundaries and the training vectors. (A plotting sketch follows this list.)
  5. Zipped file containing your MATLAB program and instructions for running it.
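
For item 4, the decision boundary of each hardlim neuron is the line where its net input is zero, so the plot can be generated along these lines (a sketch, assuming two-dimensional inputs and W(i,2) nonzero for each neuron):

  % Plot each neuron's decision boundary (W(i,:)*p + b(i) = 0) together
  % with the training vectors.
  p1 = linspace(min(P(1,:))-1, max(P(1,:))+1, 2);
  hold on
  for i = 1:size(W,1)
      p2 = -(W(i,1)*p1 + b(i)) / W(i,2);   % assumes W(i,2) ~= 0
      plot(p1, p2, 'k-')
  end
  plot(P(1,:), P(2,:), 'o')                % training vectors
  hold off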

Due Friday, February 10


Homework #3

E6.1, E6.3, E6.6, E6.8, E6.11, E6.13, E6.17

E7.1, E7.2, E7.5, E7.7, E7.10

E8.2, E8.3 (parts i., iii., vii., x.), E8.5, E8.6 (for the same parts of E8.3)

E9.1, E9.2, E9.3, E9.5, E9.6, E9.7, E9.9

Due Friday, February 24


Homework #4

E10.3, E10.4, E10.5, E10.8, E10.13

E11.1, E11.2, E11.6, E11.7, E11.8, E11.13, E11.14, E11.24

This homework will not be collected. Exam #1 will take place on Friday, March 10.


Homework #5

Send me a list of your technical interest areas by email. I might be able to use this information when assigning your term project. If you have an interest area to which neural networks could be applied, that would be especially helpful for me to know.

E12.2, E12.3, E12.4, E12.5, E12.8, E12.11

E13.1, E13.2, E13.3, E13.6, E13.9, E13.10, E13.11

MiniProject #2

Write a MATLAB program to implement the incremental (stochastic gradient) form of MOBP. Test your program by approximating the function g(p) = sin(pi*p)/(pi*p). (Write your program using matrix notation, so that it will accommodate any number of neurons. Follow Equations 11.41 to 11.47, with appropriate changes for the momentum algorithm, Equations 12.9 and 12.10.) Choose your training set to be inputs and targets for p=-2:.2:2. (This will give you 21 examples in your training set. Note that g(0) should be taken as its limiting value, 1.)
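
The core of the incremental algorithm might look like the following. This is a minimal sketch for a 1-S1-1 network with a log-sigmoid hidden layer and a linear output layer; the parameter values at the top are examples only, and the log-sigmoid is coded directly so no toolbox functions are needed.

  % Incremental MOBP for a 1-S1-1 network (log-sigmoid hidden layer,
  % linear output), following Eqs. 11.41-11.47 with the momentum
  % updates of Eqs. 12.9-12.10.  Parameter values are examples only.
  S1 = 5;  alpha = 0.1;  gamma = 0.8;  maxit = 200000;
  p = -2:.2:2;
  t = sin(pi*p)./(pi*p);  t(isnan(t)) = 1;       % take g(0) = 1 (the limit)
  W1 = rand(S1,1)-0.5;  b1 = rand(S1,1)-0.5;     % random initial weights
  W2 = rand(1,S1)-0.5;  b2 = rand-0.5;
  dW1 = zeros(size(W1));  db1 = zeros(size(b1)); % previous weight changes
  dW2 = zeros(size(W2));  db2 = 0;
  for k = 1:maxit
      q  = mod(k-1, length(p)) + 1;              % present examples in sequence
      a1 = 1./(1 + exp(-(W1*p(q) + b1)));        % forward pass (Eqs. 11.41-11.42)
      a2 = W2*a1 + b2;
      e  = t(q) - a2;
      s2 = -2*e;                                 % output sensitivity (Eq. 11.44)
      s1 = (a1.*(1-a1)).*(W2'*s2);               % hidden sensitivity (Eq. 11.45)
      dW2 = gamma*dW2 - (1-gamma)*alpha*s2*a1';  % momentum updates
      db2 = gamma*db2 - (1-gamma)*alpha*s2;      % (Eqs. 12.9 and 12.10)
      dW1 = gamma*dW1 - (1-gamma)*alpha*s1*p(q);
      db1 = gamma*db1 - (1-gamma)*alpha*s1;
      W1 = W1 + dW1;  b1 = b1 + db1;
      W2 = W2 + dW2;  b2 = b2 + db2;
  end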

Train networks with five neurons in the hidden layer and with fifteen neurons in the hidden layer. Try two different values for the learning rate and two different values for the momentum. This gives a total of 8 training runs (2 network sizes x 2 learning rates x 2 momentum values).

Turn in the following items to the dropbox on the D2L class page (items 1 through 4 in a single PDF file):
  1. Your discussion of the results. The discussion might be the most important part of this miniproject. Discuss the effects of changing the number of neurons, the learning rate, and the momentum. What did you learn from doing this project?
  2. Plot of the sum-squared error (over the entire training set) versus iteration number (as in the left side of Figure 12.12, but use a log-log scale) for each run. To save time, you can compute the total sum-squared error once every 20 iterations. Make one plot for each learning rate/momentum combination, for both the 5-neuron and the 15-neuron cases. (A plotting sketch follows this list.)
  3. Plots of the network function and the actual function at two stages of training (after 100 iterations, and at the completion of training), similar to Figure 11.12. When you plot the functions, use a spacing of p=-2:.01:2. Use +'s to indicate the points from the training set, as in Figure 11.15. Be sure that when you compute the network function for the plot, you use the same weights and biases for all values of p. You only need to make these plots for one learning rate/momentum combination, but for both 5 neurons and 15 neurons.
  4. Listing of all programs.
  5. Zipped file containing your MATLAB program and instructions for running it.
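
For items 2 and 3, the plots could be produced along these lines. This is a sketch only; it assumes you logged the total sum-squared error in sse at the iteration numbers in it_log, and sim_net is a hypothetical helper you would write that evaluates your trained network over a row of inputs.

  % Item 2: sum-squared error versus iteration on a log-log scale.
  loglog(it_log, sse)
  xlabel('iteration'), ylabel('sum-squared error')

  % Item 3: network function versus the actual function, with the
  % training points marked by +'s (p and t are the training set).
  figure
  pp = -2:.01:2;
  g  = sin(pi*pp)./(pi*pp);  g(isnan(g)) = 1;
  plot(pp, g, '-', pp, sim_net(W1,b1,W2,b2,pp), '--', p, t, '+')
  legend('actual function', 'network function', 'training points')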

Due Friday, March 31


Homework #6

E16.2, E16.6, E16.11, E16.13, E16.15, E16.17

E17.1, E17.2, E17.3, E17.4, E17.6, E17.7, E17.9

Due Friday, April 14


Homework #7

(From the electronic edition.)

E18.1, E18.2 (parts i., ii., v., vii.), E18.3, E18.5, E18.7, E18.9

Due Friday, April 28

This homework will not be collected. The second exam will be given on the due date.

