
Testing

We now have our network file, test.net, which contains the parameters of our Winnow algorithm as well as weights for all of the features that appeared in our training examples. Now that the network is trained, we can test it on new examples.

> snow -test -I tutorial/testdata.snow -F tutorial/test.net

Our test file contains labeled examples in exactly the same format as those used in training, and we used the default output mode, which simply scores SNoW's accuracy on those examples. In this mode, each example is presented to the system, and the prediction output by the classifier is compared to the example's label; a mistake is scored if the two do not match. Here are our results:

SNoW+ - Sparse Network of Winnows Plus
Cognitive Computations Group - University of Illinois at Urbana-Champaign
Version 3.2.0
Input file: 'tutorial/testdata.snow'
Network file: 'tutorial/test.net'
Directing output to console.
Algorithm information:
Winnow: (1.35, 0.8, 4, 0.3245) Targets: 0-1
850 test examples presented
Overall Accuracy - 96.71% (822 / 850)
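
The reported accuracy is simply the fraction of test examples whose prediction matched the label: 822 correct out of 850, or about 96.71%. As an illustration of this scoring rule (this is not SNoW code, and the label and prediction lists below are made up), the computation amounts to:

# Illustration of the default accuracy scoring: one prediction per example,
# and a mistake is counted whenever the prediction differs from the label.
labels      = [1, 0, 1, 1, 0]    # hypothetical example labels
predictions = [1, 0, 1, 0, 0]    # hypothetical classifier predictions

correct = sum(1 for label, pred in zip(labels, predictions) if label == pred)
accuracy = correct / len(labels)
print("Overall Accuracy - %.2f%% (%d / %d)" % (100 * accuracy, correct, len(labels)))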

We can also check how well the training data itself was learned by testing on our training set, that is, by giving the file tutorial/traindata.snow to -I above:
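
> snow -test -I tutorial/traindata.snow -F tutorial/test.net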

To receive more detailed output, we can specify a different output mode on the command line with the -o outputmode parameter. For example, try executing SNoW as follows:

> snow -test -I tutorial/testdata.snow -F tutorial/test.net -o allactivations

This gives the output:

SNoW+ - Sparse Network of Winnows Plus
Cognitive Computations Group - University of Illinois at Urbana-Champaign
Version 3.2.0
Input file: 'tutorial/testdata.snow'
Network file: 'tutorial/test.net'
Directing output to console.
Algorithm information:
Winnow: (1.35, 0.8, 4, 0.3159) Targets: 0-1
Example 1 Label: 1
1: 0.98732 8.3553*
0: 0.352 3.3897

Example 2 Label: 0
0: 0.85602 5.7826*
1: 0.082134 1.5863

Example 3 Label: 1
1: 0.73296 5.0097*
0: 0.066876 1.3643
...
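
Each target line shows two scores for that target, and the '*' appears to mark the target with the highest activation, i.e., the classifier's prediction for that example. If this output is redirected to a file, the per-example predictions can be recovered with a few lines of scripting. The following is a minimal sketch, not part of SNoW; it assumes the allactivations output was saved to a hypothetical file test.out in exactly the format shown above:

# Minimal sketch (assumptions: the allactivations output above was redirected
# to a hypothetical file "test.out"; line formats exactly as shown).
import re

example_label = None
results = []   # (label, predicted target) pairs

with open("test.out") as f:
    for line in f:
        line = line.strip()
        header = re.match(r"Example\s+\d+\s+Label:\s+(\d+)", line)
        if header:
            example_label = int(header.group(1))
            continue
        # A target line ending in '*' marks the predicted target.
        target = re.match(r"(\d+):\s+\S+\s+\S+\*$", line)
        if target and example_label is not None:
            results.append((example_label, int(target.group(1))))
            example_label = None

correct = sum(1 for label, pred in results if label == pred)
print("%d / %d correct" % (correct, len(results)))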

See the description of the -o command line parameter for more details about the information it produces.


