Parallel Processing Models of Consumer Information Processing: Their Impact on Consumer Research Methods

Dawne Martin, University of Calgary
Pamela Kiecker, Texas Tech University
ABSTRACT - Current models of consumer information processing assume that processing occurs in a serial, or sequential fashion. However, new models of parallel distributed processing (PDP) have been developed in the fields of cognitive science and artificial intelligence. These new models assume massively parallel processing in the human brain, and have implications for marketers' research on consumer information processing. Parallel models are introduced and underlying assumptions of serial and parallel models are compared. The impact of accepting parallel processing models on the development of consumer research methods and analysis of results is discussed.
Dawne Martin and Pamela Kiecker (1990) ,"Parallel Processing Models of Consumer Information Processing: Their Impact on Consumer Research Methods", in NA - Advances in Consumer Research Volume 17, eds. Marvin E. Goldberg, Gerald Gorn, and Richard W. Pollay, Provo, UT : Association for Consumer Research, Pages: 443-448.







When the study of cognition became popular in the 1960s, it was generally assumed that the human information processing system functioned much like a von Neumann computer. Processing was viewed as serial, consisting of a sequence of discrete operations. Memory was viewed as a set of separate stores. Serial processing models, originally introduced in 1936 by the English logician Alan Turing, are the foundation of current models of human information processing used in marketing. The most basic assumption of serial processing models, as delineated by Newell and Simon's (1972) theory of problem solving, requires that only one information process occur at a time. This view has prospered for many years, with more recent models and theories, including frames (Minsky 1975), schemata (Bobrow and Norman 1975), scripts (Schank and Abelson 1977), knowledge assembly (Hayes-Roth 1977) and Bettman's information processing theory of consumer choice (1979), incorporating serial processing. This view has not only influenced the development of human information processing theory, but has also directed researchers' selection of methods used in the study of consumer information processing, particularly process tracing methods.

Evidence from neurobiology suggests that the brain does not operate in a serial manner (Shepherd 1979, Crick and Asanuma 1986). This evidence shows that the human brain is massively parallel, capable of processing a large number of relatively complex cognitive tasks simultaneously. Researchers in a variety of fields have now begun to develop models of human information processing based on this evidence of parallel processing (McClelland, Rumelhart and the Parallel Distributed Processing (PDP) Research Group 1986, Anderson and Hinton 1981). This work has led to a reconceptualization of memory structure and knowledge representation (Anderson 1976, Feldman 1985, Rumelhart and Norman 1981). Testing these new models requires research methods that are not necessarily tied to the basic assumption of serial processing. If selection of either serial or parallel processing models determines how consumer research is conducted, the effects of the respective models become a fundamental issue for researchers.

This paper has three objectives: (1) to introduce the concept of parallel processing; (2) to compare specific assumptions underlying serial and parallel processing models; and (3) to address implications of replacing serial processing models with parallel processing models for consumer research. In particular, the appropriateness of current research techniques used by consumer researchers is evaluated in light of parallel processing models.


Parallel processing models have grown out of research in cognitive science. Cognitive science is interdisciplinary, drawing on expertise from a variety of fields, including neurobiology, cognitive psychology, mathematics and computer science. Much of the work in this area has been conducted under the rubric of artificial intelligence (AI). The major focus of AI is to develop methods of replicating human intelligence in a computer environment. These attempts at replication of human intelligence have suggested a need for reevaluation of serial processing models. Now, in increasing numbers, researchers are exploring a different framework for thinking about cognitive processes. These researchers generally accept the computer metaphor as a useful approximation of the macrostructure of human thought; however, they have come to feel that an alternative framework may be more appropriate for characterizing its microstructure.

This alternative is suggested in various parallel processing models appearing in the literature over the past fifteen years. The models are referred to variously as Connectionist Models (Feldman 1981, Feldman and Ballard 1982), Parallel Distributed Processing (PDP) Models (McClelland, Rumelhart and the PDP Research Group 1986) and Parallel Models of Associative Memory (Anderson and Hinton 1981). The term "Connectionist Models" is the most general. It is used to refer to a class of information processing models that consist of a network of simple processing units interacting via a series of connections. PDP models are a type of Connectionist Model that stress the notion that processing activity results from processing interactions occurring among rather large numbers of processing units (Johnson-Laird 1988). Connections allow for interactions between units. Each specific Connectionist Model will make assumptions about the number of units and the pattern of connections between units. The set of units and their connections is typically called a network.

According to the Connectionist view, the human information processing system is a network operating under a set of constraints. Each unit, which may be a neuron or some network of neurons, operates through connections with other units. The number and type of connections define the constraints imposed on the network. Units take on activation values based on a weighted sum of their inputs from the environment and from other units. The connections are also weighted--either positively or negatively--so that a particular input will either excite or inhibit the unit that receives it, depending on the sign of the weight.

The strength or weakness of the constraints is determined by the weight of the connection. Learning is a function of changing existing constraints through modification of weights, or introducing new constraints through new connections. In parallel models, units do not represent complex concepts, such as those associated with schemata (Rumelhart, et al. 1986). Rather, units are relatively simple, and "schemata emerge at the moment they are needed from the interaction of large numbers of much simpler elements all working in concert with one another" (Rumelhart, et al. 1986, p. 21).
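The unit-and-connection mechanics described above can be sketched in a few lines of code. The following toy network is our illustration only, not a model from the literature: the unit count, weights, and inputs are invented, and real Connectionist Models involve far larger numbers of units. It shows units taking on activation values from a weighted sum of environmental input and the activations of other units, with positive weights exciting and negative weights inhibiting:

```python
import math

def sigmoid(x):
    # squashing function keeping each unit's activation between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

# W[i][j] is the weight on the connection from unit j to unit i.
# Positive weights excite the receiving unit; negative weights inhibit it.
# All values here are invented for illustration.
W = [
    [0.0,  0.8, -0.5],
    [0.8,  0.0,  0.6],
    [-0.5, 0.6,  0.0],
]

external_input = [1.0, 0.0, 0.0]  # stimulation arriving from the environment
activation = [0.0, 0.0, 0.0]

def update(act):
    # every unit computes its new value from the same old pattern, so the
    # whole network updates simultaneously (the parallel step)
    return [
        sigmoid(sum(w * a for w, a in zip(W[i], act)) + external_input[i])
        for i in range(len(act))
    ]

# repeated parallel updates let the network settle into a stable
# pattern of activation, which is the network's "representation"
for _ in range(50):
    activation = update(activation)
```

In this scheme, learning is simply modification of the entries of W; no central executive sequences the updates, since every unit responds only to its own connections.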

This brief description suggests that parallel models are substantially different from conventional serial models of human information processing. Parallel models begin with neurophysiological evidence, then build on this evidence to describe the cognitive functions of human thought, learning and intelligence. The assumptions underlying parallel models are also very different from those underlying serial models. The next section discusses two important assumptions that distinguish parallel from serial models.


There are numerous assumptions, beginning at the most basic levels and continuing through complex cognitive designs, which differ significantly between parallel and serial models of information processing. Johnson-Laird (1988) considers each of several aspects of an information processing system, describing the general assumptions parallel models make about these aspects. In this section, only two assumptions which appear to have a direct impact on the appropriateness of different research methods currently designed to evaluate human information processing are reviewed. These assumptions are the most general and cover the nature of (1) the processing model and (2) memory structure. Each is reviewed in turn.

The Nature of the Processing Model

Perhaps the most important assumption distinguishing parallel processing models from serial processing models concerns the nature of the model. Contrary to Newell and Simon's (1972) suggestion that central control units or "executives" are required to oversee the operations of various programs or production systems, in parallel processing models the human information processing system is seen as capable of complex, global, goal-directed processing without any such executive (Rumelhart and Norman 1985).

Representations in parallel models are patterns of activation over the units in the network. The representations are truly active, in the sense that they give rise to further processing activity directly, without any need for a central processor that examines them and takes action on the basis of the results of this examination. Knowledge is stored in the connections between processing units. This assumption works together with assumptions about representations. An active representation, in the form of a pattern of activation over a set of units, together with the knowledge stored in connections, will give rise to new patterns of activation on the same or other units. While the concept of executives was developed for serial models in order to address the need to have some control over the sequencing of the serial processes assumed in the models, parallel models do not require this type of control mechanism.

Neurobiological evidence, as mentioned in the preceding section, strongly suggests that the brain is massively parallel. Parallel models assume that this neurophysiological reality is the basis for construction of memory and, consequently, of knowledge representation (Rumelhart and Norman 1981). In contrast, Newell and Simon (1972) recognize evidence for parallel processing, but contend that "it is possible for an IPS [information processing system] to be parallel for two activities, but not parallel for three or more, since the processor may be capable of carrying along only two independent control sequences" (p. 797). Thus, Newell and Simon's imposition of a control system (a characteristic of serial processing models) excluded the possibility of parallel processing at higher levels of cognition.

Another response to the evidence for parallel processing is that the "computation devices" (e.g., neurons) should not have an impact on how concepts are encoded in memory. However, the underlying configuration of the neurons does impact the way information is processed. The way that computational devices operate influences how higher-level cognitive structures are created and operate. For instance, frames (Minsky 1975) and scripts (Schank and Abelson 1977) employ passive data structures which require explicit default values to explain the creation and operation of higher-level cognitive structures. Any variation in operation of passive data structures would alter the conceptualization of memory structure and knowledge representation (Norman 1986).

The Nature of Memory Structure

The second assumption to be addressed is that of memory structure. In parallel models, information is not stored in a "place." Rather, it is represented by the relationship among units, and each unit may be involved in the storage of multiple memories (Rumelhart and Norman 1981). Thus, memory is distributed across relationships, rather than associated with a particular place in long-term or short-term memory. Accessing memory requires tracing the relationships or connections between units. This conception of memory allows traces to interact, with the result that memory tends to focus on the commonality of information or experiences. Therefore, individuals are capable of forming generalizations from previous experiences when faced with new circumstances.

In contrast, serial models assume that information is stored in a particular place, distinct from the storage of other information. Retrieval of information from memory requires "addresses" or "pointers" to describe the place where information is stored. Retrieval is characterized by finding the information stored in memory, finding the right schema, or following the appropriate links in semantic networks.

Another major difference between serial and parallel models is the categorization of memory as long-term and short-term. Serial models, which require a problem space and short-term storage of information to be manipulated by the production system, focus on short-term memory. In these models, short-term memory is very small, able to hold only five to seven chunks of information (Newell and Simon 1972). The size of the short-term memory places a major constraint on human information processing capacity. This idea of limited processing capacity has generated the notions of information overload (Jacoby, Speller and Kohn 1974, Malhotra 1982, Jacoby 1984, Malhotra 1984) and distraction (Nelson, Duncan and Frontczak 1985, Petty, Wells and Brock 1976, Venkatesan and Haaland 1968, Wright 1974).

In contrast, parallel models do not make any assumptions regarding divisions in memory. What is termed short-term memory in serial models is essentially the electrical activation of the network of relationships that represents the information. Since short-term memory is based on activation, it decays rapidly and may be rather limited (Feldman 1985). However, it is not specifically limited to between five and seven chunks as cited by Newell and Simon (1972). What appears as long-term memory in serial models involves the actual changing of weights in the relationships between units in parallel models (Feldman 1985).
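The contrast can be made concrete with a hedged sketch. This is our illustration, not an equation drawn from the parallel-processing literature; the decay rate, learning rate, and activation values are all invented. It contrasts a short-term trace, which is nothing but decaying activation, with a long-term trace, which is a lasting change in a connection weight:

```python
DECAY = 0.5          # fraction of activation surviving each time step (invented)
LEARNING_RATE = 0.1  # how strongly co-activation adjusts a weight (invented)

def decay_activation(activation, steps):
    """Short-term trace: pure activation, fading rapidly once input stops."""
    for _ in range(steps):
        activation *= DECAY
    return activation

def hebbian_update(weight, act_a, act_b):
    """Long-term trace: the connection weight itself is changed
    (a simple Hebbian-style rule, used here only for illustration)."""
    return weight + LEARNING_RATE * act_a * act_b

short_term = decay_activation(1.0, steps=10)            # about 0.001 remains
new_weight = hebbian_update(0.2, act_a=1.0, act_b=0.8)  # 0.28, and it persists
```

Nothing in this sketch requires a separate short-term "store": the same network exhibits both kinds of memory, one as transient activation and one as durable weights.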

In summary, the conception of memory in parallel models does not call for separate "places" for short-term and long-term memory. Instead, these types of distinctions are a function of the basic processes occurring in the brain (Anderson 1976, Feldman 1985). Furthermore, the assumptions underlying memory in parallel models are not as restrictive as those for serial models with respect to processing capacity. As the above discussion emphasizes, important differences exist between serial and parallel models of human information processing with respect to the nature of the processing models and the structure of memory. These differences underlie other substantive differences in the way the models address higher levels of cognitive activity, such as learning, inferences and choice. The differences also impact decisions with respect to the research techniques used in the investigation of consumer information processing. The appropriateness of many current techniques used by consumer researchers studying information processing needs to be evaluated in light of evidence supporting parallel models.


Process tracing techniques have become dominant in research aimed at describing actual consumer choice behavior (Slovic, Fischhoff and Lichtenstein 1977). A basic objective of the research has been to find alternative models or heuristics that take into account consumers' mental representations of information upon which decisions are based (Tversky and Kahneman 1981). However, these techniques have been criticized on two particular accounts: (1) their static nature (Jacoby, Szybillo and Busato-Schach 1977) and (2) their focus on overt, voluntary behavior (Lynch and Srull 1982). In addition to these criticisms, serial processing is assumed when these techniques are used. More importantly, use of process tracing techniques actually forces subjects into describing processing in a serial fashion.

Process tracing techniques include protocol analysis, information monitoring, and eye movement analysis. The assumptions underlying serial processing models are evident in each of these process tracing techniques. Both the task and the output of the techniques reflect serial processing by their very design. The output from the consumer, whether in terms of a verbal protocol or selection of information from a display board, is restricted to a serial task. Only one thing can be done at a time, by the design of the research. Consequently, the techniques themselves force participants into a serial processing mode, or at least require them to represent their decision processes as serial.

In reference to protocol methods, Bettman (1979) states that "...the subject is theoretically verbalizing thoughts as they occur in the course of problem solving" (p. 195). Thus, research assumes that the subject is thinking in a serial manner. The decision net method of analyzing protocols also assumes seriality by imposing a branching structure and rule-dominated method of judgment. Parallel models, in contrast, would assume that comparisons between attributes are made in parallel, based on global constraint satisfaction rather than application of specific rules (Rumelhart et al. 1986).
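The contrast between a rule-dominated decision net and parallel evaluation can be illustrated with a toy choice task. This example is entirely our construction: the brands, attribute scores, importance weights, and threshold are invented, and the simultaneous weighted combination below only stands in for genuine global constraint satisfaction. A decision-net strategy inspects one attribute at a time and branches on a rule, while the parallel account lets every attribute comparison contribute to each brand's activation at once:

```python
# Invented data: two hypothetical brands scored on three attributes.
brands = {
    "A": {"price": 0.9, "quality": 0.4, "warranty": 0.7},
    "B": {"price": 0.5, "quality": 0.8, "warranty": 0.6},
}

def serial_choice(brands, attribute_order, threshold=0.1):
    """Decision-net style: examine one attribute at a time, eliminating
    brands that fall too far behind, until one brand survives."""
    candidates = dict(brands)
    for attr in attribute_order:
        best = max(attrs[attr] for attrs in candidates.values())
        candidates = {name: attrs for name, attrs in candidates.items()
                      if best - attrs[attr] <= threshold}
        if len(candidates) == 1:
            break
    return next(iter(candidates))

def parallel_choice(brands, importance):
    """Parallel style: all attribute comparisons feed each brand's
    activation simultaneously; the most activated brand wins."""
    activation = {name: sum(importance[a] * v for a, v in attrs.items())
                  for name, attrs in brands.items()}
    return max(activation, key=activation.get)

importance = {"price": 0.2, "quality": 0.6, "warranty": 0.2}
print(serial_choice(brands, ["price", "quality", "warranty"]))  # A
print(parallel_choice(brands, importance))                      # B
```

Note that the two strategies can reach different choices from the same information, which is one reason the processing assumption matters when interpreting process tracing data.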

Information monitoring methods were originally used in the study of consumer processing by Jacoby (1975). Information display boards, for example, are designed in such a way that subjects must choose one piece of information at a time. This information generally relates to an attribute of a particular brand. The process requires subjects to make a serial inspection of the information. The researcher then assumes that the sequential selection of information represents the consumer's information processing and choice strategy.

Bettman's (1979, p. 197) analysis of this technique reveals several shortcomings that may be due to the basic assumption of serial processing. First, the external information gathering task may not represent the internal processing of the information. Second, the technique does not investigate the effects of internal information which may affect the search and processing of the chosen information. In contrast, parallel models propose that the subject is activating memory relationships that allow for generalizations between past experience and the current task (Rumelhart, et al. 1986).

Eye movement analysis (Russo and Rosen 1975) is another process tracing method that needs to be reevaluated in light of parallel processing models. The pattern and sequence of eye movements are assumed to represent information examination and are used to infer details of internal processing. In eye movement analysis, like protocol analysis and information monitoring, the subject is forced into a serial mode of search, this time by the physical separation of the articles to be inspected. Eye movement analysis has been designed to "...guard against ambiguous interpretations due to the subjects' use of peripheral vision" (Bettman 1979, p. 197). As a result of this design, subjects are restricted from taking in more than one piece of information at a time.

In contrast to the above process tracing methods, experimental techniques are being developed which allow for examination of parallel processing. Much of this research has utilized computer simulation to assist in development of computational theories of human information processing. For instance, Marr and Poggio's (1976) attempt to model human vision led to the need to incorporate interconnections between processing units in order to explain human vision capabilities. However, some research has adapted more conventional experimental techniques to test parallel models.

For example, response time analysis has already been used to investigate the nature of processing (Jacoby, Speller and Kohn 1974). This method assumes that mean response time is an indication of the amount of processing required for a particular task. Traditionally, researchers have tended to assume serial processing in their analysis of response time results. However, discrete response times, combined with accuracy measures, have been used to investigate parallel versus serial models in sentence perception (Marslen-Wilson 1975). The interaction of speed and accuracy in restoration of disrupted words was investigated for various levels of disruption in semantic and contextual constraints. Results showed that "sentence perception is most plausibly modeled as a fully interactive parallel process" (Marslen-Wilson 1975, p. 226).

Another experimental technique, one that has been used in reading research, entails presenting two words to subjects, each located at a different side of the eye fixation (Mozer 1983). The words contain two letters which are the same, such as sand and lane. Through blocks of trials, the duration of the word display is adjusted to provide a fixed rate of whole-word response errors. Errors in the subject's ability to reproduce the specified word are analyzed based on the error types produced. This analysis determines the level of connection that exists between the current input, or two-word pair, and the knowledge structures the subject possesses.

A similar procedure might be used to provide evidence of parallel processing in information display boards and eye movement analysis. For example, information display boards could be reproduced using video terminals. The subject would be shown multiple attributes of various brands for a very short duration, after which the subject would be asked to make a purchase decision. The outcome of this choice might then be compared to a choice made when the subject was able to view attributes or brands in a sequential manner. In the case of eye movement analysis, the researcher might reduce the physical separation of the articles to be inspected. The results of this modified analysis might then be compared to the results of the conventional eye movement analysis. There is some evidence to suggest that focusing attention on separate places can be accomplished within a single eye fixation, without eye movement (Posner 1978, Treisman and Gelade 1980, Feldman and Ballard 1982).

In summary, it appears that many research techniques used in consumer information processing are based on the underlying assumptions of serial processing models. Process tracing methods, in particular, have been tied to serial models. Researchers also tend to analyze data gathered from other methods, like response time analysis, under the rubric of serial processing. However, there are similar research methods that do not assume serial processing and, therefore, may be more appropriate.


This paper has introduced the concept of parallel processing. It has presented some of the differences in assumptions between parallel and serial processing models. The comparisons have been limited to two assumptions: (1) the nature of processing, and (2) the nature of memory structure. In light of these differences, it is suggested that some research techniques used to study consumer information processing may need to be reevaluated. In particular, process tracing methods have been shown to be explicitly tied to serial processing models. Some alternative methods, based on parallel processing models, are suggested for future research. Consideration of parallel processing models in research design and theory or model development may alter the way consumer researchers conceptualize important issues, including issues related to consumer choice strategies.

While research appears to support parallel processing models, there are some distinct problems with this approach. Most of these problems are a result of the relatively short history of research in Connectionist or Parallel Distributed Processing Models. The major problems which will impact consumer researchers are associated with the current level of development of the models. Specifically, research on parallel processing models has been focused on lower levels of cognition. Questions of inference, for example, have yet to be fully addressed. However, the models do provide some insight into problems tied to serial models of information processing. For that reason, parallel models warrant further investigation in the context of consumer research.


REFERENCES

Anderson, John R. (1976), Language, Memory and Thought, (Hillsdale, N.J.: Lawrence Erlbaum Associates).

Anderson, John R. and Geoffrey E. Hinton (1981), "Models of Information Processing in the Brain," in Hinton, G. E. and J. R. Anderson (Eds.), Parallel Models of Associative Memory, (Hillsdale, N.J.: Lawrence Erlbaum Associates), 9-48.

Bettman, James R. (1979), An Information Processing Theory of Consumer Choice, (Reading, Mass.: Addison-Wesley Publishing).

Bobrow, D. G., and D. A. Norman (1975), "Some Principles of Memory Schemata," in D. G. Bobrow and A. G. Collins (Eds.), Representation and Understanding, (New York: Academic Press), 131-149.

Crick, F. H. and C. Asanuma (1986), "Certain Aspects of the Anatomy and Physiology of the Cerebral Cortex," in James L. McClelland, David E. Rumelhart and the PDP Research Group, Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Volume 2: Psychological and Biological Models, (Cambridge, Mass.: The MIT Press), 333-371.

Feldman, Jerome A. (1985), "Connections: Massive Parallelism in Natural and Artificial Intelligence," Byte, April, 277-284.

Feldman, J. A. and D. H. Ballard (1982) "Connectionist Models and Their Properties," Cognitive Science, 6, 205-254.

Hayes-Roth, Barbara (1977), "Evolution of Cognitive Structures and Processes," Psychological Review, Vol. 84, No. 3, 260-278.

Jacoby, Jacob (1975), "Perspectives on a Consumer Information Processing Research Program," Communication Research, 2, 203-215.

Jacoby, Jacob (1984), "Perspectives on Information Overload," Journal of Consumer Research, 10 (March), 432-435.

Jacoby, Jacob, Donald Speller and Carol Kohn (1974), "Brand Choice Behavior as a Function of Information Overload," Journal of Marketing Research, 11 (February), 63-69.

Jacoby, Jacob, George J. Szybillo and Jacqueline Busato-Schach (1977), "Information Acquisition Behavior in Brand Choice Situations," Journal of Consumer Research, 3 (March), 209-216.

Johnson-Laird, P. N. (1988), The Computer and the Mind, Cambridge Mass: Harvard University Press.

Lynch, John G. and Thomas K. Srull (1982), "Memory and Attentional Factors in Consumer Choice: Concepts and Research Methods," Journal of Consumer Research, 9 (June), 18-37.

Malhotra, Naresh (1982), "Information Load and Consumer Decision Making," Journal of Consumer Research, 8 (March), 419-430.

Malhotra, Naresh (1984), "Reflections on the Information Overload Paradigm in Consumer Decision Making," Journal of Consumer Research, 10 (March), 436-440.

Marr, D. and T. Poggio (1976), "Cooperative Computation of Stereo Disparity," Science, 194, 283-287.

Marslen-Wilson, William D. (1975), "Sentence Perception as an Interactive Parallel Process," Science, 189, 226-228.

McClelland, James L., David E. Rumelhart and the PDP Research Group (1986), Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Volume 2: Psychological and Biological Models, (Cambridge, Mass.: The MIT Press).

Minsky, M. (1975), "A Framework for Representing Knowledge," in P.H. Winston (Ed.), The Psychology of Computer Vision, (New York: McGraw-Hill), 211-277.

Mozer, M. C. (1983), "Letter Migration in Word Perception," Journal of Experimental Psychology: Human Perception and Performance, 9, 531-546.

Nelson, James E., C. Duncan and N. Frontczak (1985), "The Distraction Hypothesis and Radio Advertising," Journal of Marketing, 49 (Winter), 60-71.

Newell, Allen and Herbert A. Simon (1972), Human Problem Solving, (Englewood Cliffs, N.J.: Prentice-Hall).

Norman, D. A. (1986), "Reflections on Cognition and Parallel Distributed Processing," in Parallel Distributed Processing: Volume 2, (op. cit.), 531-546.

Petty, Richard E., G. L. Wells and T. C. Brock (1976), "Distraction Can Enhance or Reduce Yielding to Propaganda: Thought Disruption versus Effort Justification," Journal of Personality and Social Psychology, 34, 874-884.

Posner, M. I. (1978), Chronometric Explorations of Mind, (Hillsdale, N.J.: Lawrence Erlbaum Associates).

Rumelhart, David E. and Donald A. Norman (1981), "Introduction: A Comparison of Models," in G. E. Hinton and J. A. Anderson (Eds.), Parallel Models of Associative Memory, (op. cit.), 1-7.

Rumelhart, David E., P. Smolensky, J. L. McClelland and G. E. Hinton (1986), "Schemata and Sequential Thought Processes in PDP Models," in Parallel Distributed Processing: Volume 2, (op. cit.), 7-57.

Russo, J. Edward and Larry D. Rosen (1975), "An Eye Fixation Analysis of Multi-alternative Choice," Memory and Cognition, 3, (May), 267-276.

Schank, R. C. and R. P. Abelson (1977), Scripts, Plans, Goals and Understanding, (Hillsdale, N.J.: Lawrence Erlbaum Associates).

Shepherd, G. M. (1979), The Synaptic Organization of the Brain, 2nd Ed., (New York: Oxford University Press).

Slovic, Paul, Baruch Fischhoff and Sarah Lichtenstein (1977), "Behavioral Decision Theory," Annual Review of Psychology, 28, 1-39.

Treisman, A. M. and G. Gelade (1980), "A Feature-Integration Theory of Attention," Cognitive Psychology, 12, 97-136.

Tversky, Amos and Daniel Kahneman (1981), "Framing of Decisions and the Psychology of Choice," Science, 211, 453-458.

Venkatesan, M. and Gordon A. Haaland (1968), "Divided Attention and Television Commercials: An Experimental Study," Journal of Marketing Research, 5, (May), 203-205.

Wright, Peter (1974), "The Harassed Decision Maker: Time Pressures, Distractions and the Use of Evidence," Journal of Applied Psychology, 59, (November), 555-561.