

Example


Table 6.1: Partial truth table and SNoW examples for the concept $ x_2 \vee x_5$. SNoW (with default parameters) will create two target nodes, and each can be said to learn a separate concept. One learns to build a bigger activation for positive examples (label 1), and the other learns to build a bigger activation for negative examples (label 0).
label   $ x_1$  $ x_2$  $ x_3$  $ x_4$  $ x_5$  $ x_6$   SNoW Example
false     1       0       0       0       0       0      0, 2:
true      0       0       0       0       1       0      1, 6:
false     0       0       1       1       0       0      0, 4, 5:
false     1       0       1       1       0       1      0, 2, 4, 5, 7:
true      0       1       0       1       0       0      1, 3, 5:
false     1       0       0       1       0       1      0, 2, 5, 7:
true      1       0       1       1       1       0      1, 2, 4, 5, 6:
true      0       1       0       1       1       0      1, 3, 5, 6:
false     1       0       1       0       0       1      0, 2, 4, 7:


For example, consider a learning problem in which a boolean concept over six boolean variables is to be learned. The value $ 1$ might be used as a target ID representing the label True, and the value $ 0$ as a target ID representing the label False. The values $ 2$ through $ 7$ could then be feature IDs representing the variables $ x_1$ through $ x_6$ respectively. Suppose that, unbeknownst to SNoW, our training data represents the concept $ x_2 \vee x_5$. A partial truth table for this concept and the corresponding SNoW examples are given in Table 6.1. Note that only active features (variables with value 1) appear in each example. See the booleanexample/ subdirectory of the software distribution for scripts showing how SNoW might be used to train a network over these examples.
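The encoding described above can be sketched in a few lines of Python. This is an illustrative helper (not part of the SNoW distribution) that maps a labeled truth-table row to a SNoW example line, using the ID assignment assumed in this section: target IDs 0/1 for the label and feature IDs 2-7 for $ x_1$ through $ x_6$.

```python
def to_snow_example(label, values):
    """Encode a labeled boolean assignment as a SNoW example line.

    label  -- True/False, mapped to target IDs 1/0
    values -- sequence of six 0/1 values for x1..x6
    """
    ids = [1 if label else 0]
    # Only active (value 1) variables appear; variable x_i gets ID i + 1.
    ids += [i + 2 for i, v in enumerate(values) if v]
    return ", ".join(str(i) for i in ids) + ":"

# A few rows from Table 6.1:
print(to_snow_example(False, (1, 0, 0, 0, 0, 0)))  # -> 0, 2:
print(to_snow_example(True,  (0, 0, 0, 0, 1, 0)))  # -> 1, 6:
print(to_snow_example(True,  (1, 0, 1, 1, 1, 0)))  # -> 1, 2, 4, 5, 6:
```

Writing one such line per training instance produces a file in the format the table's SNoW Example column shows.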



Cognitive Computations 2004-08-20