Neural-Networks

Introduction and implementation of backpropagation for feedforward neural networks:

Question: In the directory gestures, there is a set of images that display "down" gestures (i.e., thumbs-down images) or other gestures. In this assignment, you are required to implement the backpropagation algorithm for feedforward neural networks to learn down gestures from the training images listed in downgesture_train.list. The label of an image is 1 if the word "down" is in its file name; otherwise the label is 0. The pixels of an image use gray-scale values ranging from 0 to 1. In your network, use one input layer, one hidden layer of size 100, and one output node. Use the value 0.1 for the learning rate. For each perceptron, use the sigmoid function θ(s) = 1/(1 + e^(-s)). Use 1000 training epochs; initialize all weights w randomly between -1000 and 1000 (you can also choose your own initialization approach, as long as it works); and then use the trained network to predict the labels of the gestures in the test images listed in downgesture_test.list. For the error function, use the standard least-squares error. Output your predictions and accuracy.
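The activation and error functions named in the question can be sketched as follows (a minimal illustration with NumPy; the function names are our own, not from the assignment code):

```python
import numpy as np

def sigmoid(s):
    # theta(s) = 1 / (1 + e^(-s))
    return 1.0 / (1.0 + np.exp(-s))

def sigmoid_derivative(out):
    # derivative written in terms of the sigmoid output:
    # theta'(s) = theta(s) * (1 - theta(s))
    return out * (1.0 - out)

def least_square_error(prediction, label):
    # standard least-squares error for a single output node
    return 0.5 * (prediction - label) ** 2
```

Expressing the derivative in terms of the sigmoid's output is what makes the backpropagation deltas cheap to compute: the forward pass already produced the needed values.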

Answer: The Python implementation is divided into two main parts, trainNN and testNN. Preparing the training and test data sets is an important preliminary step: the imread function turns each 30x32-pixel PGM image into a 30-by-32 uint8 matrix, and the reshape function flattens that matrix into a vector with 960 entries. This yields the training set datanupx (a 184x960 matrix) and datanupy (a 184x1 matrix), plus the test set datanupx2 (an 83x960 matrix) and datanupy2 (an 83x1 matrix).

In the trainNN part, the first step is data preprocessing: a bias input x0 is added to the matrix, and datanupx is divided by 255 so that all entries lie between 0 and 1. In the feedforward step, the weights randomw are initialized randomly between -1 and 1 rather than between -1000 and 1000, because the larger range produced low accuracy. Multiplying datanupx by randomw gives the inputs to the sigmoid function, from which the final deltas are obtained. In the backpropagation step, the standard update formulas use these deltas, which measure the difference between the prediction and the true value for each image, to update the weights randomw.

In the testNN part, after the weights have been trained, the test data set is multiplied by the trained weights and the predictions are computed: outputs larger than 0.5 map to 1, and otherwise to 0. The predictions are then compared with datanupy2, the true labels of 0 and 1, which are generated by checking whether the file name of a given image contains the word "down". Finally, the accuracy is computed as the ratio of matches to the total number of test images.

Two kinds of training loop are implemented: running 1000 epochs, each going through all 184 training images, and running 1000 epochs, each going through a single randomly chosen training image. The former variant is the one explained and executed in the Python code (the latter is left in comments).

Running 1000 epochs, each going through all 184 training images, takes about 15 minutes. The predictions on the 83 test images and the resulting accuracy are as follows: prediction_list = ['True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'True', 'False', 'True', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'True', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'True', 'False', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False'] accuracy = 0.903614457831

Running 1000 epochs, each going through a single randomly chosen training image, takes about 10 seconds. The predictions on the 83 test images and the resulting accuracy are as follows: prediction_list = ['True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'True', 'True', 'False', 'False', 'False', 'False', 'False', 'False', 'False'] accuracy = 0.939759036145
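For reference, assuming the reported accuracies are exact ratios of correct predictions over the 83 test images, they correspond to 75/83 and 78/83 matches respectively:

```python
# accuracy = matches / total number of test images (counts inferred from the ratios)
acc_full_batch = 75 / 83   # 1000 epochs over all 184 training images each
acc_single     = 78 / 83   # 1000 epochs, one random training image each
print(acc_full_batch, acc_single)
```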
