(Homework may be collected, and/or a quiz may be given, on the due date.)
E2.1-E2.6
E3.1, E3.2, E3.3, E3.7
Due Friday, January 27
E4.1, E4.2, E4.5, E4.6, E4.7, E4.10, E4.11
E5.1, E5.3, E5.5, E5.7, E5.8, E5.12, E5.17, E5.18, E5.21
MiniProject #1
The perceptron learning rule can be modified by adding a learning rate, as in w_new = w_old + alpha*e*p', where alpha is the learning rate. The default perceptron rule uses a learning rate of 1, but other values can be used. In this miniproject, you will investigate the effect of different learning rates.
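For concreteness, one pass of this rule through a training set might look like the following sketch. It assumes the inputs are stored as the columns of a matrix P, the targets as the columns of T, with current weights W and biases b; all variable names are illustrative, not required.

    % One pass of the perceptron rule with learning rate alpha.
    % Assumes P is R-by-Q (one input vector per column), T is S-by-Q,
    % and W (S-by-R) and b (S-by-1) hold the current weights and biases.
    alpha = 0.5;                      % learning rate (alpha = 1 gives the default rule)
    for q = 1:size(P,2)
        p = P(:,q);                   % present one input vector
        a = double(W*p + b >= 0);     % hardlim transfer function
        e = T(:,q) - a;               % error vector
        W = W + alpha*e*p';           % weight update
        b = b + alpha*e;              % bias update
    end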
Write a MATLAB program to implement the perceptron rule with a learning rate. Write the program using matrix operations (as in Equations (4.38) and (4.39)), so that vectors and matrices of any size can be accommodated.
Make up a pattern recognition problem that can be solved with a perceptron network. Have the input vectors be two-dimensional, and separate these vectors into four different classes. Use at least 12 different input vectors in the training set. Each person will make up their own test problem, so no two test problems should be the same. Explain how you made up your test problem.
Using your MATLAB program, train perceptron networks to solve the problem. Try several different learning rates and compare their average convergence rates. To find an average convergence rate, train the network at least 100 times with different initial random weights (use the MATLAB command rand to generate random weights, as in w=rand(1,2)-0.5;). Find the average number of iterations to reach convergence over the 100 trials.
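A trial loop for estimating the average convergence rate could follow this sketch. Here trainPerceptron is a hypothetical helper that trains to convergence and returns the iteration count, and the two-neuron network size (two inputs, four classes) is only an example.

    % Average convergence rate over repeated trials with random initial weights.
    nTrials = 100;
    iters = zeros(1, nTrials);
    for k = 1:nTrials
        W0 = rand(2,2) - 0.5;         % random initial weights in [-0.5, 0.5]
        b0 = rand(2,1) - 0.5;
        iters(k) = trainPerceptron(P, T, W0, b0, alpha);  % hypothetical helper
    end
    avgIters = mean(iters)            % average iterations to convergence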
(Do not use the Neural Network Toolbox or any other toolboxes. Write your own program using standard MATLAB commands, so that the program can be run on a computer with only the standard MATLAB installation. Each student should write their own program. Each student should have different test problems.)
Turn in the following items to the dropbox on the D2L class page (items 1 through 4 in a single PDF file):
Due Friday, February 10
E6.1, E6.3, E6.6, E6.8, E6.11, E6.13, E6.17
E7.1, E7.2, E7.5, E7.7, E7.10
E8.2, E8.3 (parts i., iii., vii., x.), E8.5, E8.6 (for E8.3 parts i., iii., vii., x.)
E9.1, E9.2, E9.3, E9.5, E9.6, E9.7, E9.9
Due Friday, February 24
E10.3, E10.4, E10.5, E10.8, E10.13
E11.1, E11.2, E11.6, E11.7, E11.8, E11.13, E11.14, E11.24
This homework will not be collected. Exam #1 will take place on Friday, March 10.
Email me a list of your technical interest areas. I may be able to use this information when assigning term projects. It would be especially helpful to know of any interest areas in which neural networks could be applied.
E12.2, E12.3, E12.4, E12.5, E12.8, E12.11
E13.1, E13.2, E13.3, E13.6, E13.9, E13.10, E13.11
MiniProject #2
Write a MATLAB program to implement the incremental (stochastic gradient) form of MOBP. Test your program by approximating the function g(p) = sin(pi*p)./(pi*p). (Write your program using matrix notation, so that it will accommodate any number of neurons. Follow Equations (11.41) through (11.47), with appropriate changes for the momentum algorithm, Equations (12.9) and (12.10).) Choose your training set to be inputs and targets for p=-2:.2:2. (This will give you 21 examples in your training set.)
Train networks with five neurons in the hidden layer and with fifteen neurons in the hidden layer. Try two different values for the learning rate and two different values for the momentum, for a total of 8 training runs.
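As a point of reference, the inner loop of such a program might resemble the sketch below, for a 1-S-1 network with a tanh hidden layer and a linear output. The variable names, epoch count, and parameter values are illustrative assumptions, not requirements.

    % Sketch of incremental (stochastic) MOBP for a 1-S-1 network,
    % following the structure of Equations (11.41)-(11.47) with the
    % momentum updates of Equations (12.9)-(12.10).
    S = 5;  alpha = 0.1;  gamma = 0.8;        % size, learning rate, momentum
    p = -2:0.2:2;                             % 21 training inputs
    t = sin(pi*p)./(pi*p);                    % targets
    t(~isfinite(t)) = 1;                      % define g(0) = 1 (the sinc limit)
    W1 = rand(S,1)-0.5;  b1 = rand(S,1)-0.5;  % small random initial weights
    W2 = rand(1,S)-0.5;  b2 = rand-0.5;
    dW1 = zeros(S,1);  db1 = zeros(S,1);      % previous weight/bias changes
    dW2 = zeros(1,S);  db2 = 0;
    for epoch = 1:2000
        for q = 1:length(p)
            a1 = tanh(W1*p(q) + b1);              % forward pass, hidden layer
            a2 = W2*a1 + b2;                      % forward pass, linear output
            s2 = -2*(t(q) - a2);                  % output-layer sensitivity
            s1 = (1 - a1.^2) .* (W2'*s2);         % backpropagated sensitivity
            dW2 = gamma*dW2 - (1-gamma)*alpha*s2*a1';  % momentum updates
            db2 = gamma*db2 - (1-gamma)*alpha*s2;
            dW1 = gamma*dW1 - (1-gamma)*alpha*s1*p(q);
            db1 = gamma*db1 - (1-gamma)*alpha*s1;
            W2 = W2 + dW2;  b2 = b2 + db2;        % apply the changes
            W1 = W1 + dW1;  b1 = b1 + db1;
        end
    end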
Turn in the following items to the dropbox on the D2L class page (items 1 through 4 in a single PDF file):
Due Friday, March 31
E16.2, E16.6, E16.11, E16.13, E16.15, E16.17
E17.1, E17.2, E17.3, E17.4, E17.6, E17.7, E17.9
Due Friday, April 14
(From electronic edition.)
E18.1, E18.2 (parts i., ii., v., vii.), E18.3, E18.5, E18.7, E18.9
Due Friday, April 28
This homework will not be collected. Exam #2 will be given on the due date.