Short-term memory, working memory, chunking memory
Dr. Fabien Mathy - Special Talk
Short-term memory is famously limited in capacity to Miller's (1956) magic number 7 or, in many more recent studies on working memory, to about 4 chunks of information, but the definition of "chunk" in both the short-term memory and working memory contexts has never been clear. A more precise conception of a chunk, derived from the notion of Kolmogorov complexity, is proposed: a chunk is a unit in a maximally compressed code. I present a series of experiments in which the compressibility of stimulus sequences was manipulated. In contrast to most working memory tasks, where chunking is deliberately suppressed, the goal of our chunking memory tasks is to quantify chunking by introducing regular patterns into the stimuli. Our subjects' measured digit span appears to be about 3 or 4 chunks after compression, equivalent to about 7 uncompressed items of typical compressibility. I also present a new method, based on an algorithm usually dedicated to DNA sequence alignment, for reliably scoring short-term memory performance. The idea is that deletion, substitution, translocation, and insertion errors, which are typical in DNA, are also typical errors in short-term memory. I illustrate the difficulty of arriving at a good measure of the memory span.
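The alignment-based scoring described above can be illustrated with a simple edit-distance sketch. This is not the talk's actual algorithm (which draws on DNA sequence alignment); it is a minimal restricted Damerau-Levenshtein (optimal string alignment) distance, chosen because it counts exactly the four error types mentioned: deletions, insertions, substitutions, and adjacent transpositions (translocations). The function name and digit sequences are illustrative only.

```python
def osa_distance(target, recall):
    """Restricted Damerau-Levenshtein distance between the presented
    sequence (target) and the subject's recalled sequence (recall).
    Counts deletions, insertions, substitutions, and adjacent
    transpositions, each with cost 1."""
    m, n = len(target), len(recall)
    # dp[i][j] = distance between target[:i] and recall[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # i deletions
    for j in range(n + 1):
        dp[0][j] = j  # j insertions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if target[i - 1] == recall[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,       # deletion
                dp[i][j - 1] + 1,       # insertion
                dp[i - 1][j - 1] + cost # substitution (or match)
            )
            # adjacent transposition (translocation)
            if (i > 1 and j > 1
                    and target[i - 1] == recall[j - 2]
                    and target[i - 2] == recall[j - 1]):
                dp[i][j] = min(dp[i][j], dp[i - 2][j - 2] + 1)
    return dp[m][n]

# Hypothetical recall trials against the presented digit list "7142963":
print(osa_distance("7142963", "7142963"))  # perfect recall -> 0
print(osa_distance("7142963", "714963"))   # one deletion -> 1
print(osa_distance("7142963", "7145963"))  # one substitution -> 1
print(osa_distance("7142963", "7412963"))  # one translocation -> 1
```

Scoring recall this way is more forgiving than all-or-nothing span scoring: a single swapped pair costs one edit rather than invalidating the whole trial, which is part of why a good span measure is hard to pin down.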