In Part VI, I wrote that the fundamental building block of memory is a sequence of up to seven nodes. Since memory is organized hierarchically, like a tree, any bottom-level sequence can serve as a node in a higher-level sequence, itself consisting of up to seven nodes. The tree is open-ended, that is to say, the number of levels is indefinite. In this post, I will describe the memory builder, i.e., the mechanism that combines incoming signals into sequences. I will also describe how the tree of knowledge is used for pattern completion and recognition.
Sequence Learning and Certainty
Signals arriving from the separation layer have temporal relationships that can be learned. There are only two possible temporal relationships: signals can be either concurrent or sequential. Any two correlated signals, whether concurrent or sequential, will retain their relationship every time they repeat. In other words, they will have the same frequency. This is a powerful observation that we can use to devise a memory builder.
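The frequency test above can be sketched in a few lines of code. This is a minimal illustration only: the representation of a signal as a list of firing timestamps, and the matching tolerance, are my assumptions, not part of the design described in the post.

```python
# Hypothetical sketch of the same-frequency test for correlated signals.
# Signals are assumed to be lists of firing timestamps (in seconds).

def frequency(timestamps):
    """Mean firing frequency computed from a list of firing times."""
    if len(timestamps) < 2:
        return 0.0
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

def matches_reference(candidate, reference, tolerance=0.05):
    """True if the candidate fires at the same frequency as the reference."""
    ref = frequency(reference)
    return ref > 0 and abs(frequency(candidate) - ref) / ref <= tolerance

reference  = [0.0, 0.5, 1.0, 1.5, 2.0]   # 2 Hz reference signal
correlated = [0.1, 0.6, 1.1, 1.6, 2.1]   # same frequency, shifted in time
unrelated  = [0.0, 0.2, 0.4, 0.6, 0.8]   # 5 Hz, would be filtered out

print(matches_reference(correlated, reference))  # True
print(matches_reference(unrelated, reference))   # False
```

Note that the correlated signal need not be concurrent with the reference; a fixed temporal offset preserves the frequency match, which is what lets both concurrent and sequential correlations pass the same filter.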
[Image: a five-node sequence]

Currently, Animal's sequence learning mechanism consists of two parts. One part serves as a filter that eliminates all signals whose frequencies do not match that of a reference signal. The second part is the sequence learner proper. Its function is to position the signal input lines on appropriate nodes in the sequence. Note that a node may receive multiple concurrent signals, but not all of them must arrive in order for the node to fire. For recognition purposes, a node's firing certainty is determined by the number of signals arriving at that node.
The way this works is as follows. A signal input line is chosen at random to serve as the reference signal for a sequence. Other lines are attached randomly to the sequence and are tested for frequency fitness. That is to say, their signals must have the same frequency as the reference signal. The lines attached to each individual node are tested for concurrency at that node. An important rule is that an input line can only be connected to one node in the sequence. Eventually, the sequence forms and it is up to the sequence learner to determine the predecessors and successors and link them accordingly. Once a sequence is learned, it can be subsequently reused for recording and recognition. The temporal intervals between the nodes are recorded and used for prediction purposes.
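The grouping and ordering steps above can be sketched as follows. Several simplifications are assumed: each line is reduced to a single firing time within one cycle of the reference signal, concurrency is a fixed time window, and the random attach-and-test search is replaced by a direct grouping for clarity. The window size and line names are illustrative.

```python
# Hypothetical sketch of the sequence learner: lines that fire together
# are grouped into nodes, nodes are ordered in time, and the intervals
# between successive nodes are recorded for prediction.

CONCURRENCY_WINDOW = 0.02  # seconds; assumed tolerance for "concurrent"

def learn_sequence(firing_times):
    """firing_times: dict mapping line id -> firing time within one cycle.
    Returns (nodes, intervals): an ordered list of sets of line ids, and
    the recorded temporal gaps between successive nodes."""
    nodes = []  # each entry: (time of first line, set of line ids)
    for line, t in sorted(firing_times.items(), key=lambda kv: kv[1]):
        # A line can attach to only one node in the sequence.
        if nodes and t - nodes[-1][0] <= CONCURRENCY_WINDOW:
            nodes[-1][1].add(line)      # concurrent with the last node
        else:
            nodes.append((t, {line}))   # start a new successor node
    intervals = [b[0] - a[0] for a, b in zip(nodes, nodes[1:])]
    return [ids for _, ids in nodes], intervals

times = {"a": 0.00, "b": 0.01, "c": 0.30, "d": 0.31, "e": 0.60}
nodes, gaps = learn_sequence(times)
print(nodes)  # [{'a', 'b'}, {'c', 'd'}, {'e'}]
print(gaps)   # two intervals of roughly 0.3 s each
```

The predecessor/successor links fall out of the sort order here; in the actual mechanism they would have to be discovered as the lines are attached and tested over repeated cycles.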
Long Term Memory, Pattern Completion and Recognition
It is important to note that a correlation may only last for a little while. For example, while looking at someone's face, the eye moves in small jerky movements called saccades. These will generate correlated signals that depend on the particular features of the face. The memory builder must be able to rapidly detect the correlations and remember them. That is to say, it must create permanent sequences in memory. Later, if the same face reappears in the field of vision, the recorded sequences will be reactivated and reused. Thus, long-term memory consists of recorded sequences.
The way recognition works in the tree of knowledge (TOK) is both simple and powerful. Upon receiving signals from the separation layer, bottom level nodes will activate higher level nodes. Multiple low-level activations in the right order can activate upper level nodes. Eventually, an entire multi-level branch of the tree will be activated when, say, a cat appears in the visual field. What makes this mechanism powerful is that a branch can be activated even in situations of sensory uncertainty, such as a partially occluded image or an object viewed under low light conditions. This is called pattern completion. Without it, we would not be able to understand or navigate the world around us.
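A toy version of this bottom-up activation can be sketched as below. The `Node` class, the firing threshold, and the cat/feature names are illustrative assumptions; the post specifies only that firing certainty grows with the number of arriving signals, and the temporal ordering of activations is omitted here for brevity.

```python
# Hypothetical sketch of pattern completion in a tree of knowledge (TOK).
# Leaves match sensory input directly; a higher node fires when enough of
# its children fire, so a branch can activate on partial input.

class Node:
    def __init__(self, name, children=(), threshold=0.6):
        self.name = name
        self.children = list(children)
        self.threshold = threshold  # fraction of children needed to fire

    def certainty(self, observed):
        """Bottom-up activation certainty in [0, 1]."""
        if not self.children:
            return 1.0 if self.name in observed else 0.0
        active = sum(1 for c in self.children
                     if c.certainty(observed) >= c.threshold)
        return active / len(self.children)

cat = Node("cat", [Node("whiskers"), Node("ears"), Node("tail"), Node("fur")])

print(cat.certainty({"whiskers", "ears", "tail", "fur"}))  # 1.0, full view
print(cat.certainty({"whiskers", "ears", "tail"}))         # 0.75, occluded but still fires
print(cat.certainty({"tail"}))                             # 0.25, below threshold
```

Because the threshold is below 1.0, the "cat" branch still activates when a feature is occluded or dimly lit, which is the essence of pattern completion.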
Pattern completion is really synonymous with anticipation. Given enough sensory information, our brains can anticipate the future, and this ability is what drives our goal-directed behavior. In Part VIII and Part IX, I will explain the difference between short-term and long-term memory and describe how the tree of knowledge is used to generate adaptive motor behavior.