Cognitive development refers to how a person perceives, thinks, and gains understanding of his or her world through the interaction of genetic and learned factors. Among the areas of cognitive development are information processing, intelligence, reasoning, language development, and memory.
Historically, the cognitive development of children has been studied in a variety of ways. The oldest is through intelligence tests, such as the widely used Stanford-Binet Intelligence Quotient (IQ) test, which psychologist Lewis Terman (1877–1956) adapted for use in the United States in 1916 from a French model pioneered by Alfred Binet in 1905. IQ scoring is based on the concept of "mental age," according to which the scores of a child of average intelligence match his or her chronological age, while a gifted child performs at the level of an older child and a slow learner's scores resemble those of a younger child. IQ tests remain widely used in the United States, but they have come under increasing criticism for defining intelligence too narrowly and for being biased with regard to race and gender.
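The mental-age concept translates into the classic ratio formula of early Stanford-Binet scoring; the formulation below is a standard historical one rather than something spelled out in this passage, with MA standing for mental age and CA for chronological age:

\[ \mathrm{IQ} = \frac{\mathrm{MA}}{\mathrm{CA}} \times 100 \]

On this scale, a ten-year-old who performs at the level of a typical twelve-year-old scores (12 / 10) × 100 = 120, while one who performs at the level of a typical eight-year-old scores (8 / 10) × 100 = 80, and a child whose performance exactly matches his or her age scores 100.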
In contrast to the emphasis that intelligence testing places on a child's native abilities, learning theory grew out of work by behaviorist researchers such as John Watson (1878–1958) and B. F. Skinner (1904–1990), who argued that children are completely malleable. Learning theory focuses on the role of environmental factors in shaping the intelligence of children, especially on a child's ability to learn by having certain behaviors rewarded and others discouraged.