Tags: hash, decision-tree, face-recognition, surf, accord.net

SURF with hashing


I want to ask if I can use a hashing technique with the SURF algorithm. I made a program that performs face recognition by matching a test image against a saved image dataset.

I used Accord.NET to build a bag of features with this library's BoW, then trained an ID3 decision tree and a k-NN classifier, but the results in both cases were not very good. Could I use a hashing technique to get faster and better results, or would that not be feasible? This is the code for the BoW:

    private void button2_Click(object sender, EventArgs e)
    {
        try
        {
            var watchFEC = System.Diagnostics.Stopwatch.StartNew();
            Accord.Math.Random.Generator.Seed = 0;
            bow.ParallelOptions.MaxDegreeOfParallelism = 1;
            bow.Learn(DatasetImages);

            // After this point, we will be able to translate
            // images into double[] feature vectors using bow.Transform
            features = bow.Transform(DatasetImages);

            watchFEC.Stop();
            var elapsedMs = watchFEC.ElapsedMilliseconds;
            MessageBox.Show("Feature extraction and clustering is done" + '\n'
                + "Time for feature extraction and clustering for the dataset is: "
                + elapsedMs.ToString() + " ms");
        }
        catch { MessageBox.Show("Error"); }
    }

and this is the code for learning:

    private void button3_Click(object sender, EventArgs e)
    {
        try
        {
            var watchLearn = System.Diagnostics.Stopwatch.StartNew();
            inputs = features.ToInt32();
            tree = teacher.Learn(inputs, outputs);
            error = new ZeroOneLoss(outputs).Loss(tree.Decide(inputs));
            MessageBox.Show("Error rate of learning is: " + error.ToString());
            watchLearn.Stop();
            var elapsedMs = watchLearn.ElapsedMilliseconds;
            MessageBox.Show("Learning is done" + '\n'
                + "Time for learning is: " + elapsedMs.ToString() + " ms");
        }
        catch (Exception ex) { MessageBox.Show("Error" + ex); }
    }

and this is the code for testing:

    private void button4_Click_1(object sender, EventArgs e)
    {
        try
        {
            var watchTest = System.Diagnostics.Stopwatch.StartNew();
            Bitmap[] testimage = new Bitmap[1];
            testimage[0] = (Bitmap)pictureBox1.Image;

            var ff = bow.Transform(testimage);
            ff.ToInt32();
            var predicted = tree.Decide(ff);

            int i = 1;
            for (i = 1; i < sizeofdataset; i++)
            {
                if (predicted[0] == Convert.ToInt16(workSheet.Cells[i, 3].Value.ToString()))
                {
                    listBox1.SelectedItem = i;
                    MessageBox.Show("Test" + i);
                    break;
                }
            }

            MessageBox.Show("Test" + predicted[0]);
            pictureBox2.Image = new Bitmap(workSheet.Cells[i, 1].Value.ToString());
            watchTest.Stop();
            var elapsedMs = watchTest.ElapsedMilliseconds;
            MessageBox.Show("Time for testing is: " + elapsedMs.ToString() + " ms");
        }
        catch (Exception ex) { MessageBox.Show("Error" + ex); }
    }

Solution

  • Instead of ID3 or k-NN, please try using an SVM with a Chi-Square kernel.

    If you would like to give SVMs a try, there is an example of how to create multi-class kernel SVMs at the bottom of this page (see the second example). You can replace every place where it says "Gaussian" with "ChiSquare" in order to create a chi-square SVM.

    If you happen to run in a {"Index was outside the bounds of the array."} as you have indicated in the project's issue tracker, I think you might have a class without training or testing samples. Please make sure you have enough training samples for all classes, that your class numbers start at 0, that the highest class label in your output vector corresponds to the number_of_classes - 1 and that there are no integers in this interval without any associated training samples.

    I am posting below an example of how to train SVMs using a Chi-Square kernel in the Accord.NET Framework:

    // Let's say we have the following data to be classified
    // into three possible classes. Those are the samples:
    // 
    double[][] inputs =
    {
        //               input         output
        new double[] { 0, 1, 1, 0 }, //  0 
        new double[] { 0, 1, 0, 0 }, //  0
        new double[] { 0, 0, 1, 0 }, //  0
        new double[] { 0, 1, 1, 0 }, //  0
        new double[] { 0, 1, 0, 0 }, //  0
        new double[] { 1, 0, 0, 0 }, //  1
        new double[] { 1, 0, 0, 0 }, //  1
        new double[] { 1, 0, 0, 1 }, //  1
        new double[] { 0, 0, 0, 1 }, //  1
        new double[] { 0, 0, 0, 1 }, //  1
        new double[] { 1, 1, 1, 1 }, //  2
        new double[] { 1, 0, 1, 1 }, //  2
        new double[] { 1, 1, 0, 1 }, //  2
        new double[] { 0, 1, 1, 1 }, //  2
        new double[] { 1, 1, 1, 1 }, //  2
    };
    
    int[] outputs = // those are the class labels
    {
        0, 0, 0, 0, 0,
        1, 1, 1, 1, 1,
        2, 2, 2, 2, 2,
    };
    
    // Create the multi-class learning algorithm for the machine
    var teacher = new MulticlassSupportVectorLearning<ChiSquare>()
    {
        // Configure the learning algorithm to use SMO to train the
        //  underlying SVMs in each of the binary class subproblems.
        Learner = (param) => new SequentialMinimalOptimization<ChiSquare>()
        {
            // Estimate a suitable guess for the kernel's parameters.
            // This estimate can serve as a starting point for a grid search.
            UseKernelEstimation = true
        }
    };
    
    // Configure parallel execution options (or leave it at the default value for maximum speed)
    teacher.ParallelOptions.MaxDegreeOfParallelism = 1;
    
    // Learn a machine
    var machine = teacher.Learn(inputs, outputs);
    
    // Obtain class predictions for each sample
    int[] predicted = machine.Decide(inputs);
    
    // Get class scores for each sample
    double[] scores = machine.Score(inputs);
    
    // Compute classification error
    double error = new ZeroOneLoss(outputs).Loss(predicted);
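
    Regarding the label requirements mentioned above, here is a minimal sketch (a hypothetical helper, not part of the original answer) of how you could verify before training that your labels are contiguous integers starting at 0. It only needs System.Linq:

    // Hypothetical helper (not from the original answer): check that the labels
    // in the output vector are exactly 0, 1, ..., number_of_classes - 1, with
    // each label appearing at least once.
    static void CheckLabels(int[] outputs)
    {
        // Collect the distinct labels in ascending order
        int[] classes = outputs.Distinct().OrderBy(c => c).ToArray();

        // Position c must hold the label c; otherwise a label is missing,
        // or the labels do not start at zero
        for (int c = 0; c < classes.Length; c++)
            if (classes[c] != c)
                throw new ArgumentException(
                    "Class labels must be 0 .. number_of_classes - 1, " +
                    "each with at least one sample (problem near label " + c + ").");
    }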
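
    To connect this back to your own handlers, a rough sketch (assuming the same bow, DatasetImages, outputs and pictureBox1 fields from the question; this is an illustration, not the definitive integration) of replacing the decision tree with the chi-square SVM could look like this:

    // Rough sketch (assumes bow, DatasetImages, outputs and pictureBox1 from
    // the question): learn the chi-square SVM on the BoW feature vectors
    // instead of the ID3 tree, then classify a test image with it.
    double[][] features = bow.Transform(DatasetImages);

    var teacher = new MulticlassSupportVectorLearning<ChiSquare>()
    {
        Learner = (param) => new SequentialMinimalOptimization<ChiSquare>()
        {
            UseKernelEstimation = true
        }
    };

    var machine = teacher.Learn(features, outputs);

    // Classify the image currently shown in pictureBox1
    Bitmap[] testimage = { (Bitmap)pictureBox1.Image };
    double[][] testFeatures = bow.Transform(testimage);
    int predictedClass = machine.Decide(testFeatures)[0];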