George A. Miller’s paper *The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information* discussed some limits of the human brain with respect to information processing. In particular, his research found that people can keep track of only about seven (plus or minus two) different chunks of information at a time. This is actually why phone numbers in the United States are seven digits long, or more accurately, why they used to be an exchange name and four digits. (The exchange name was eventually replaced by three digits.)
I know a lot of you are thinking that can’t be true. After all, you know you can keep more things in mind at one time. Right? According to Miller, the key is the word chunks. Chunks can be different sizes: a concept that carries a lot of information is still a single chunk. This is why it is harder to remember 10 randomly chosen numbers or words than it is to remember the words to a song; the song contains far more words, but melody and rhyme bind them into a handful of large chunks.
A large portion of the history of programming has been devoted to making our chunks more meaningful. Higher-level languages allow us to work without trying to remember what each register is holding or how many bytes we need for that jump instruction. Each succeeding level allows us to keep more in our heads by making the chunks bigger.
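To make that concrete, here is a small sketch in Python (the task and names are mine, purely for illustration): the same computation written at two levels of abstraction. The lower level spends several chunks of working memory on bookkeeping; the higher level collapses the whole idea into one chunk.

```python
prices = [3.50, 7.25, 1.99]

# Low level: an index, an accumulator, and a bounds check
# each cost a chunk of working memory for a single idea.
total = 0.0
i = 0
while i < len(prices):
    total += prices[i]
    i += 1

# Higher level: the whole idea is one chunk.
total = sum(prices)
```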
But that only works as long as the concepts map well to single chunks. If you don’t have a name for a chunk of information or a concept, it takes up more of your memory. One of the more useful effects of *Design Patterns* was not new OO techniques; it was the new names. Suddenly, you could refer to the Singleton Pattern instead of “this class that there’s only one of in the whole system but is available to everyone, sort of like global data but not quite.”
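For anyone who hasn’t met the pattern, here is a minimal Singleton sketch in Python (the class name is invented for illustration): the class guarantees that exactly one instance exists and hands that same instance to every caller.

```python
class Configuration:
    """A Singleton: exactly one instance exists, and everyone shares it."""
    _instance = None

    def __new__(cls):
        # Create the single instance on first use; reuse it afterwards.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Configuration()
b = Configuration()
assert a is b  # both names refer to the one shared instance
```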
This same concept applies to user interface design. Grouping related items on the screen and putting the most commonly used items where they are immediately accessible are two ways to reduce the amount of your mind tied up in tracking what goes where.
The concept of chunks and Miller’s magic number applies in many places in programming. Here are a few to ponder:
- Good variable names make it easier to remember what the variables hold (see the first sketch after this list).
- Global variables make code more difficult to understand, because you use up chunks tracking those variables everywhere (see the second sketch below).
- Good method names replace all of the implementation details with one chunk (also shown in the first sketch below).
- Flags are bad for the same reason as global variables (also in the second sketch below).
- A good programming metaphor helps you develop software because it ties more concepts into one chunk.
- Programming paradigms help you solve programming problems by giving you names for concepts you need to work with.
- A programming language is well suited for a problem when you can express your solution in fewer chunks in that language. (Some might say you can express it more naturally in that language.)
- Good data structures help you design by collapsing a number of related variables into a single coherent chunk (see the third sketch below).
- Good classes embody one chunk of information. Bad classes either express multiple chunks or none.
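To make a few of these concrete, here are some hedged sketches in Python; all the names and numbers in them are invented for illustration. First, naming: a well-named method lets the call site read as one chunk, while a poorly named one forces you to hold its implementation in your head.

```python
# Poor names: you have to read the body to know what the call does.
def calc(d, r):
    return d - d * r

# Good names: the call site reads as a single chunk.
def apply_discount(price, discount_rate):
    """Return the price after subtracting the discount."""
    return price - price * discount_rate

total = apply_discount(price=100.0, discount_rate=0.15)
```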
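Second, globals and flags. A global flag costs you a chunk everywhere it might be read or written; passing it explicitly confines that cost to the call sites that actually care.

```python
verbose = False

# With a global flag, reasoning about any caller of this function
# means also keeping the global's current state in mind.
def process_with_global(items):
    if verbose:
        print(f"processing {len(items)} items")
    return [item.strip() for item in items]

# With an explicit parameter, the dependency is visible right where
# it matters and nowhere else.
def process(items, verbose=False):
    if verbose:
        print(f"processing {len(items)} items")
    return [item.strip() for item in items]

cleaned = process(["  a ", "b  "], verbose=True)
```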
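Finally, data structures and classes. Three loose variables are three chunks to keep track of; one coherent structure that names the concept is a single chunk.

```python
from dataclasses import dataclass

# Three loose variables: three separate chunks.
name = "Alice"
street = "12 Elm St"
city = "Springfield"

# One coherent data structure: a single chunk that names the concept.
@dataclass
class Customer:
    name: str
    street: str
    city: str

customer = Customer(name="Alice", street="12 Elm St", city="Springfield")
```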