
This is the code I used to classify the handwritten digits of the MNIST database with the back-propagation algorithm on a single-layer perceptron with 10 neurons. The digits are stored in an integer 2D matrix images[60000][785], expanded with a constant 1 in the last column (the bias input), and the labels in int tlabels[60000][10]. Each label is one-hot: it contains a 1 at the position corresponding to the digit's value (0-9) and 0s everywhere else.
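For context, here is roughly how the expanded images and the one-hot labels are built (rawimages and rawlabels are placeholder names for the arrays read from the MNIST files, not my exact loader code):

// Sketch of the data preparation (rawimages/rawlabels are placeholders):
for (int sample = 0; sample < 60000; sample++) {
    for (int input = 0; input < 784; input++)
        images[sample][input] = rawimages[sample][input];  // pixel values
    images[sample][784] = 1;                               // constant bias input
    for (int output = 0; output < 10; output++)
        tlabels[sample][output] = (rawlabels[sample] == output) ? 1 : 0; // one-hot
}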

// weights1[785][10] has been initialized to random values in [-0.5, 0.5] in the constructor.
// The neurons' biases are accounted for as the last row of the matrix.
// b is the learning rate, equal to 0.5.
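A minimal sketch of that constructor initialization, assuming <random> (the exact RNG I use may differ):

// Uniform random weights in [-0.5, 0.5], including the bias row.
std::mt19937 gen(std::random_device{}());
std::uniform_real_distribution<float> dist(-0.5f, 0.5f);
for (int input = 0; input < 785; input++)
    for (int output = 0; output < 10; output++)
        weights1[input][output] = dist(gen);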

const int Ninput = 784;
const int Noutput = 10;
float result[Noutput];

for (int epoch = 0; epoch < 30; epoch++) {
    float error = 0;
    for (int sample = 0; sample < 1000; sample++) {

        // Forward pass: weighted sum of the 784 pixels plus the bias input.
        for (int output = 0; output < Noutput; output++) {
            float sum = 0;
            for (int input = 0; input < Ninput + 1; input++)
                sum += weights1[input][output] * images[sample][input];
            result[output] = sum;
        }

        // Delta rule for a sigmoid output unit under squared error:
        // delta = (target - y) * sigmoid'(net), with y = sigmoid(net).
        float delta[Noutput];
        for (int output = 0; output < Noutput; output++) {
            float y = sigmoid(result[output]);
            delta[output] = (tlabels[sample][output] - y) * sigmoidDerivative(result[output]);
            error += pow(tlabels[sample][output] - y, 2.0);
        }

        // Weight update: w += b * delta * x (one stochastic gradient step per sample).
        for (int output = 0; output < Noutput; output++)
            for (int input = 0; input < Ninput + 1; input++)
                weights1[input][output] += b * delta[output] * images[sample][input];
    }
    cout << error / 1000.0 << endl;
}
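For completeness, sigmoid and sigmoidDerivative are the standard logistic function and its derivative, both evaluated at the pre-activation value; a sketch of how I define them, assuming <cmath>:

// Standard logistic function and its derivative,
// both taking the pre-activation (net input) as argument.
float sigmoid(float x) {
    return 1.0f / (1.0f + exp(-x));
}
float sigmoidDerivative(float x) {
    float s = sigmoid(x);
    return s * (1.0f - s);
}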

The error variable converges to about 0.9, which means the network classifies all the digits into a single class, even though the samples are distributed evenly across the classes. Is there a logical error in the code, or should I keep trying different parameter sets until the results become acceptable?

