"The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information" is one of the most highly cited papers in psychology. It was published in 1956 by the cognitive psychologist George A. Miller of Princeton University's Department of Psychology in Psychological Review. It is often interpreted to argue that the number of objects an average human can hold in working memory is 7 ± 2. This is frequently referred to as Miller's Law.
In his article, Miller discussed a coincidence between the limits of one-dimensional absolute judgment and the limits of short-term memory. In a one-dimensional absolute-judgment task, a person is presented with a number of stimuli that vary on one dimension (e.g., 10 different tones varying only in pitch) and responds to each stimulus with a corresponding, previously learned response. Performance is nearly perfect up to five or six different stimuli but declines as the number of different stimuli increases. The task can be described as one of information transmission: the input consists of one out of n possible stimuli, and the output consists of one out of n responses. The information contained in the input can be measured by the number of binary decisions needed to arrive at the selected stimulus, and the same holds for the response. Therefore, people's maximum performance on one-dimensional absolute judgment can be characterized as an information channel capacity of approximately 2 to 3 bits, which corresponds to the ability to distinguish between four and eight alternatives.
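The relation between bits and alternatives is simply the base-2 logarithm: selecting one of n equally likely alternatives requires log2(n) binary decisions. A minimal sketch in Python (not from the paper; the function name is illustrative):

```python
import math

def bits_for_alternatives(n: int) -> float:
    """Information, in bits, needed to select one of n equally likely alternatives."""
    return math.log2(n)

# A channel capacity of 2 to 3 bits bounds how many one-dimensional
# stimuli can be identified without error: between 4 and 8.
for n in (4, 6, 8, 10):
    print(f"{n} alternatives -> {bits_for_alternatives(n):.2f} bits")
# 4 alternatives -> 2.00 bits
# 6 alternatives -> 2.58 bits
# 8 alternatives -> 3.00 bits
# 10 alternatives -> 3.32 bits
```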
The second cognitive limitation Miller discusses is memory span. Memory span refers to the longest list of items (e.g., digits, letters, words) that a person can repeat back in correct order on 50% of trials immediately after presentation. Miller observed that the memory span of young adults is approximately seven items. He noticed that memory span is approximately the same for stimuli with vastly different amounts of information: binary digits carry 1 bit each, decimal digits about 3.32 bits each, and words about 10 bits each. Miller concluded that memory span is not limited in terms of bits but rather in terms of chunks. A chunk is the largest meaningful unit in the presented material that the person recognizes; thus, what counts as a chunk depends on the knowledge of the person being tested. For instance, a word is a single chunk for a speaker of the language but many chunks for someone who is totally unfamiliar with the language and sees the word as a collection of phonetic segments.
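Miller's point can be made numerically: if span is fixed at about seven chunks, the total information retained varies widely with the kind of item, which rules out a fixed limit in bits. A short sketch, assuming the per-item bit values quoted above:

```python
import math

SPAN_CHUNKS = 7  # approximate memory span of young adults, per Miller

# Bits per chunk for the item types Miller compares.
bits_per_chunk = {
    "binary digit": 1.0,
    "decimal digit": math.log2(10),  # ~3.32 bits
    "word": 10.0,                    # Miller's rough estimate
}

# Span in chunks stays constant while span in bits varies by an
# order of magnitude -- evidence that the limit is on chunks, not bits.
for item, bits in bits_per_chunk.items():
    print(f"7 {item}s ~ {SPAN_CHUNKS * bits:.1f} bits")
# 7 binary digits ~ 7.0 bits
# 7 decimal digits ~ 23.3 bits
# 7 words ~ 70.0 bits
```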