In a classical neural network, where storage relies solely on synapses, all memory is always present: each synaptic connection carries a specific weight, and every processing event engages those weights, so all of the stored memory is involved.
It could, of course, be that a processing event uses only small regions of the brain network, so that the remaining connections form a hidden structure: a reservoir, a store or repository of unused information. Such models exist.
There is a real issue with stably storing a large number of patterns in a synaptic memory model under a realistic amount of interference. Classification, such as recognizing a word shape, can be done very well, but storing 10,000 words and using them appropriately seems difficult. Yet that is still a small vocabulary for human memory.
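The interference problem can be made concrete with a minimal sketch. The following uses a standard Hopfield-style associative memory as a stand-in for "synaptic storage" (the text does not name a specific model; the network size, pattern counts, and update scheme are illustrative choices). Such a network stores roughly 0.14N patterns before crosstalk between the stored patterns destroys recall:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of neurons

def train(patterns):
    """Hebbian weights: W = (1/N) * sum of outer products, zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns) / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    """Synchronous sign updates until convergence (or a step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

def mean_overlap(n_patterns):
    """Store n_patterns, cue with each stored pattern, measure retention."""
    patterns = [rng.choice([-1, 1], size=N) for _ in range(n_patterns)]
    W = train(patterns)
    return float(np.mean([abs(recall(W, p.copy()) @ p) / N for p in patterns]))

overlap_few = mean_overlap(5)    # well below the ~0.14 * N capacity
overlap_many = mean_overlap(40)  # far above capacity: interference dominates
print(f"5 patterns stored:  mean recall overlap = {overlap_few:.2f}")
print(f"40 patterns stored: mean recall overlap = {overlap_many:.2f}")
```

With 5 patterns in 100 neurons, each stored pattern is essentially a fixed point (overlap near 1.0); with 40 patterns, recall collapses even when the cue is the stored pattern itself. Scaling such a scheme to 10,000 reliably separable items is exactly the difficulty the text points to.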
Another solution is conditional memory, i.e. storage that is accessed only when activated and otherwise remains silent. Neurons offer many possibilities for storing memory other than in the strength of a synapse, and it should be worthwhile to investigate whether any of these may be exploited in a theoretical model.
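The notion of storage that stays silent until activated can be sketched as follows. This is my own illustrative construction, not a mechanism specified in the text: each stored item sits behind an activation gate with a (hypothetical) match threshold, and items whose gates a cue does not open contribute nothing to processing:

```python
class ConditionalMemory:
    """Toy conditional memory: items are silent unless a cue opens their gate."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold  # how strongly a cue must match to wake an item
        self.store = []             # silent repository of (key_vector, payload) pairs

    def write(self, key, payload):
        self.store.append((key, payload))

    def read(self, cue):
        """Only items whose gate the cue opens are accessed at all;
        the rest remain silent and play no part in the processing event."""
        results = []
        for key, payload in self.store:
            match = sum(a * b for a, b in zip(cue, key)) / len(key)
            if match >= self.threshold:  # gate opens: this memory becomes active
                results.append(payload)
        return results

mem = ConditionalMemory()
mem.write([1, 1, -1, -1], "pattern A")
mem.write([-1, 1, 1, -1], "pattern B")
print(mem.read([1, 1, -1, -1]))  # exact cue: only A's gate opens -> ['pattern A']
print(mem.read([1, -1, 1, -1]))  # poor match to both: nothing is accessed -> []
```

The design choice being illustrated is that, unlike the synaptic scheme above, an unmatched item imposes no interference on retrieval: it is simply never read.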