
Pergamon

Progress in Nuclear Energy, Vol. 42, No. 2, pp. 221-235, 2003
© 2003 Elsevier Science Ltd. All rights reserved. Printed in Great Britain
0149-1970/03/$ - see front matter
PII: S0149-1970(02)00106-3

Available online at www.sciencedirect.com
www.elsevier.com/locate/pnucene

EFFECTS OF EDUCATION ON NUCLEAR RISK PERCEPTION AND ATTITUDE: THEORY

MAN-SUNG YIM

Department of Nuclear Engineering, North Carolina State University Box 7909, Raleigh NC 27695, USA

PETR A. VAGANOV

Department of Nuclear Geophysics and Radioecology, University of St. Petersburg 199034, St. Petersburg, Russia

ABSTRACT

Education has been considered a key means of changing people's risk perception of, or attitude toward, nuclear technology. Major efforts have been and will continue to be expended to educate the public in this regard. However, early experimental studies indicated that empirical support was lacking for the arguments that opposition toward nuclear energy stems from ignorance and that greater information will change attitudes. At the same time, some studies indicated a positive effect of education in changing people's attitudes. This study reviews the theories of attitude formation and change and of risk perception, and their relationship with public nuclear education, and attempts to explain the seemingly conflicting findings. Suggestions for future educational efforts are also made. © 2003 Elsevier Science Ltd. All rights reserved.

1. INTRODUCTION

The course of nuclear-power development has been one of the most contentious public issues of our time. From a scientific development point of view, it was one of the major success stories. From a political point of view, it was the opposite. Although the technology was accepted enthusiastically by the majority of the public at the beginning, no new nuclear power plant has been ordered since 1978 in the U.S. Public concerns have all but ground the expansion of nuclear power in the U.S. to a stop. During this period, there has been a large gap in nuclear risk perception between the public and the scientific community.

A common explanation by the technical community of the public's existing nuclear risk perception is that the public is irrational, negatively biased, and poorly informed. The nuclear industry's answer to the problem was thought to be information and education [Roberts, 1975]: information and education would breed public confidence - the more people understand about nuclear power, the more they tend to favor it. In this vein, the nuclear industry waged massive national advertising campaigns for public education. However, these campaigns were not very successful.

There have been mixed views on the effectiveness of education in affecting public risk perception of nuclear energy. Although there were a few exceptions, most of the early experimental studies did not show a positive effect of education in changing public risk perception. The arguments that opposition toward nuclear energy stems from ignorance, and thus that greater information will change attitudes, lack solid empirical evidence.

The purpose of this study is to examine the effectiveness of education in changing people's perception, based on a review of theories of how people perceive risk and form corresponding attitudes. Existing empirical studies on the effectiveness of public nuclear education were also reviewed.


2. PREVIOUS EMPIRICAL STUDIES ON THE EFFECTS OF EDUCATION

Education has been considered a key means of changing people's perception of nuclear risk or attitude toward nuclear technology. Major efforts have been and will continue to be expended to educate the public. However, views on the effectiveness of education in affecting public perception of risk have been mixed.

Some early studies reported links between people's knowledge about nuclear power plants and attitudes toward them [Crater, 1972; Nealey and Rankin, 1978; Kuklinski, et al., 1982]. These studies showed positive statistical correlations between knowledge of and attitudes toward nuclear power plants. However, there have been more studies supporting the opposite. In the early 1960s, Roder examined whether or not knowledge of flood protection is associated with optimism about the possibility of recurrence of floods in the minds of persons located in the flood plain [Roder, 1961]. The study suggested that an individual's attitudes are derived from basic factors in his or her personality, and may be relatively little influenced by what he or she hears and reads.

According to a study of the Swedish public on the effects of education on public attitudes toward nuclear power, very little was achieved by a massive public education campaign intended to change public attitudes [Nelkin, 1974]. Instead, increased knowledge contributed to uncertainty and indecision. It needs to be noted that the educational campaigns in Sweden were conducted by many different organizations with different political orientations and views on nuclear power.

An Oak Ridge study found no relationship between information levels and support for or opposition to nuclear energy [Sundstrom et al., 1977]. The information was specific factual data about a particular plant rather than about the technology itself. A study of college student attitudes toward nuclear reactors also found no relationship between knowledge and attitude [Clelland and Bremseth, 1977]. A study of California residents [Hensler and Hensler, 1979] found no relationship between knowledge about nuclear power plants and attitudes toward them. Further related studies indicated that knowledge serves primarily to confirm rather than to shape attitudes [Kasperson, et al., 1980]. Some have even predicted that as a person or group of people becomes more educated on a topic that involves risks, their existing perception of those risks will be strengthened rather than dissipated [Jungermann, et al., 1988]. These issues were also examined in a study whose subjects were high school teachers. The teachers participated in two-week educational workshops sponsored by the US Department of Energy, and the changes in their attitudes toward energy issues were studied [Page and Hood, 1981]. Results indicated that the effect of workshop participation on energy attitudes was quite small. For some of the participants, the workshop experience appeared to intensify existing attitudes about energy sources. The contents of the workshop were focused on knowledge of the entire range of energy technologies.

A conclusion drawn from these studies (from the 1970s and 1980s) was that there is no consistent empirical basis for the arguments that opposition stems from ignorance and that greater information will change attitudes [Kasperson, 1980]. Some have even stated that educational attempts designed to reduce the 'perception gap' with respect to nuclear power are expected to fail [Slovic, et al, 1980a]. However, these findings are inconclusive considering that the educational experiences of the subjects in these studies consisted mostly of receiving factual information rather than being involved in issue-relevant deliberations. The measures used may also have been inconsistent between studies, or the people participating in the studies that produced seemingly inconsistent results may simply have differed [Maharik and Fischhoff, 1993].

A more recent empirical study [Maharik and Fischhoff, 1993] found that the more people know, the more favorable they are - except for two groups of people selected from organizations with strong pro-industry or pro-environment positions. The study used risk knowledge regarding nuclear energy sources in space and suggested that knowledge would help in promoting positive attitudes if the situation had not already become too polarized.

Another recent study produced a more interesting observation regarding the effects of education on high school students' attitudes toward nuclear power plants [Showers and Shrigley, 1995]. It was shown that delivery of factual knowledge on atoms, radiation, and fission increased the nuclear knowledge of the students but did not change their attitudes toward nuclear power. But the study further showed that a persuasive message deliberately designed to extol the merit of nuclear energy increased positive attitudes without affecting the nuclear knowledge of the students.

Here, one clarification needs to be made for further discussion, i.e., the distinction between education and persuasion. Education involves a full presentation of the issues and trusting people to reach their own conclusions. Persuasion is designed either to change people's values or to alter the application of those values by presenting a well-selected, and possibly biased, selection of facts. It is implied in this study that educational activities may entail components of persuasion.

3. THEORIES ON ATTITUDE FORMATION

For the interpretation of the various observations from empirical studies, basic theories of attitude formation need to be understood. In a historical sense, Hovland's learning theory model [Hovland, et al., 1953] was the first theoretical model of attitude formation or attitude change. The model, based on stimulus-response learning theory, explains a person's attitude as the result of an action rendered by a messenger and the listening done by the recipient. It served the social psychology community during the 1950s and 1960s as a theoretical framework embodying the variables relevant to attitude change. The model assumed that a person's attitude or perception is mainly dependent upon the credibility of the communicator and the effectiveness of the message. The model did not pay attention to the cognitive role of the message recipient. According to this model, education should be effective in changing attitudes as long as the educator has the proper credentials and well-prepared information.

With the surge of the cognitive psychology movement in the late 1960s, new developments were made in attitude research. The expectancy-value model of attitude formation proposed by Fishbein [Fishbein, 1963] and its corollaries [Fishbein and Ajzen, 1975; Ajzen and Fishbein, 1980; Ajzen, 1985], and the elaboration likelihood model of persuasion (ELM) [Petty and Cacioppo, 1981; Petty and Cacioppo, 1986], represent the two major developments. The expectancy-value model of attitude formation asserts that attitudes are affected by beliefs that can be evaluated as positive or negative. The theory argues that the effect of attitude on behavior is contingent upon the functioning of behavioral intention, subjective norm, and perceived behavioral control. This model basically assumes that the more a person believes the attitude object has good rather than bad attributes or consequences, the more favorable his or her attitude tends to be. The theory therefore predicts that much scientific knowledge will have little, if any, effect on the attitudes of those who place little value on scientific and technical knowledge. Applications of this model confirmed the importance of beliefs in attitude formation [Otway and Fishbein, 1976; Woo and Castore, 1980; van der Pligt, van der Linden and Ester, 1982].
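
In its usual summative form (standard textbook notation for Fishbein's model, added here for clarity rather than reproduced from the original paper), the expectancy-value model can be written as

\[ A_o = \sum_{i=1}^{n} b_i e_i \]

where \(A_o\) is the attitude toward object \(o\), \(b_i\) is the strength of the belief that the object possesses attribute \(i\), \(e_i\) is the evaluation (positive or negative) of that attribute, and \(n\) is the number of salient beliefs. Written this way, the model makes explicit why added factual knowledge changes attitudes only insofar as it changes the salient beliefs \(b_i\) or the evaluations \(e_i\).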

The ELM of persuasion was developed based on the analysis of human information processing mechanisms [Neisser, 1967] and the related theoretical framework [Greenwald, 1968]. The model proposed that for the understanding of risk perception, it was necessary to see how the risk information was communicated and received by an individual. This is similar to the conceptual framework of the heuristic-systematic model (HSM) [Chaiken, 1980; Chaiken et al, 1989] which is another influential model. The ELM and HSM models are generally more similar than different and can accommodate the same empirical results, though the explanatory language used (systematic vs. heuristic processing in the HSM in contrast to the central vs. peripheral route processing in the ELM) and the assumed mediating processes vary [Petty, et al., 1997]. The following discussions will follow the ELM approach.

In the ELM, two important principles (that seemed to be working in people's mind) were hypothesized [Petty and Cacioppo, 1986]: (1) People are motivated to hold correct attitudes; and (2) Although people want to hold correct attitudes, the amount and nature of issue-relevant elaboration in which people are willing or able to engage to evaluate a message vary with individual and situational factors.

This second point implies the possibility that perception changes as people get older. Young children have less ability and little motivation to think about issue-relevant argumentation other than what they were taught or what affects their feelings. They may be particularly reliant on primitive heuristics or certain cognitive rules based on personal experience with their parents or the things they play with. Therefore, their perception will primarily be based on positive and negative affective cues associated with the attitude object. As development proceeds, some perception may be formed on the basis of simple inferences, decision rules, and social attachments. As people move into adulthood, interests become more focused and the consequences of holding correct opinions on certain issues increase. As people's acquired knowledge and cognitive skills grow, they become more able to critically analyze issue-relevant information on certain topics and less reliant on primitive heuristics [Ross, 1981]. The formation and change of perception become very thoughtful processes in which issue-relevant information is carefully scrutinized and evaluated in terms of existing knowledge. However, although people may have the requisite ability and motivation to scrutinize certain attitude issues, they may lack motivation and ability on others. Thus simple inferences and affective cues may still have effects on attitude or perception change.

The ELM hypothesized two possible routes, central and peripheral, by which a persuasive message may be cognitively processed by a recipient [Petty and Cacioppo, 1986]. The central route refers to a communication process in which the receiver examines each argument carefully and balances the pros and cons in order to form a well-structured attitude. The peripheral route refers to a faster and less laborious strategy to form an attitude by using specific cues or simple heuristics. When the peripheral route occurs, the receiver is less inclined to deal with each argument, but forms an opinion or even an attitude on the basis of simple cues and heuristics. Under the peripheral mode, the schema may be accessed only once to incorporate the affect or inference elicited by a salient cue. The schema here refers to the cognitive part of a person's knowledge structures. The "knowledge structures" represent a rich store of general knowledge and preconceptions of objects, people, events, and their characteristic relationships within a person. Some of this knowledge may be represented as beliefs or theories, i.e., reasonably explicit "propositions" about the characteristics of objects or object classes. Some of it is organized by a variety of less "propositional", more schematic, cognitive structures (for example, the knowledge underlying one's awareness of what happens in a restaurant, one's understanding of the Good Samaritan parable, or one's conception of what an introvert is like), which are referred to as "schema" [Nisbett and Ross, 1980].

The route selection depends on two factors: ability and motivation. Ability refers to the physical/intellectual capability and opportunity of the receiver to follow the information without distraction. Motivation refers to the readiness and interest of the receiver to process the message. The information content has to be relevant (referring to personal interests, salient values, or self-esteem) and it should trigger personal involvement (with the issue, the content, or the source). These motivational factors are reinforced if the receiver has some prior knowledge or academic interest in the subject or is in need of new arguments to support his pre-formed point of view. When ability and motivation to scrutinize the information are relatively high, the central route takes place and vice versa.
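
The route-selection logic just described can be summarized schematically. The short Python sketch below is purely illustrative and is not taken from the ELM literature; the 0-1 scales, the threshold value, and the names Receiver and processing_route are hypothetical choices made only for this example.

```python
# Illustrative sketch of ELM route selection (assumed scales and threshold, not from the paper).
from dataclasses import dataclass

@dataclass
class Receiver:
    ability: float      # 0..1: capability and opportunity to follow the message without distraction
    motivation: float   # 0..1: relevance of the content and personal involvement with the issue

def processing_route(receiver: Receiver, threshold: float = 0.5) -> str:
    """Return 'central' when both ability and motivation are high, 'peripheral' otherwise."""
    if receiver.ability >= threshold and receiver.motivation >= threshold:
        return "central"      # careful, issue-relevant scrutiny of each argument
    return "peripheral"       # quick judgment based on cues and simple heuristics

# Example: a receiver who is interested but has little opportunity to scrutinize the message.
print(processing_route(Receiver(ability=0.3, motivation=0.8)))  # -> peripheral
```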

Another theory that should be noted in this discussion concerns contact effects on liking. The theory predicts that forced contact with initially disliked others intensifies preexisting attitudes [McGuire, 1985], making those initially positive more favorable and those initially ill-disposed more hostile. This indicates that attitude also depends upon pre-existing feelings, beliefs, or perceptions toward the contacted individuals. This agrees with Fazio's proposed theory that an attitude can be thought of as a link between the representation of an attitude object and its evaluation in memory [Fazio, 1986]. This may also explain why interactions in a workshop setting could be ineffective in changing public attitudes toward nuclear technology.

4. THE PROCESS OF RISK PERCEPTION

Many of the public's reactions to risk can be attributed to a sensitivity to technical, social, and psychological qualities of hazards. The public has a broad conception of risk, qualitative and complex [Slovic, 1998; Douglas and Wildavsky, 1982]. These cover the so-called "multi-dimensionality" of risk [Covello, 1996; Kunreuther and Slovic, 1996; Freudenburg, 1996]. Major dimensions underlying perceived riskiness include [Vlek and Keren, 1992]: 1) Potential degree of harm or fatality, 2) Physical extent of damage (area affected), 3) Social extent of damage (number of people involved), 4) Time distribution of damage (immediate and/or delayed effects), 5) Fear/dread, 6) Probability/ambiguity of undesired consequence, 7) Controllability (by self or trusted expert) of consequence, 8) Experience with, familiarity, and imaginability of consequence, 9) Voluntariness of exposure (freedom of choice), 10) Extent and clarity of expected benefit, 11) Social distribution of risks and benefits, 12) Harmful intentionality.

Different people may attach differing weights to these dimensions. Such differences are also partly related to people's attitudes towards the expected benefits of a risky activity [Vlek and Keren, 1992]. As discussed in Section 3, the risk attitude is affected by the belief system of a person, and other individual and situational factors related to these dimensions. In particular, the process of cognitive information processing and the factors involved in these will affect the person's risk perception and attitude. Since most people do not actually experience the risk from nuclear technology by their senses but learn mostly through communication, understanding human cognitive information processing is important for the examination of personal risk perception formation.
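
As a purely illustrative sketch of this point (the dimension names are abbreviated and the weights and scores are hypothetical numbers, not data from the studies cited), perceived riskiness can be pictured as a person-specific weighted combination of the dimensions listed above:

```python
# Hypothetical example: the same hazard judged with different personal weightings.
RISK_DIMENSIONS = [
    "harm", "physical_extent", "social_extent", "time_distribution", "dread",
    "probability_ambiguity", "controllability", "familiarity", "voluntariness",
    "benefit_clarity", "risk_benefit_distribution", "intentionality",
]

def perceived_riskiness(scores: dict, weights: dict) -> float:
    """Weighted sum of dimension scores; different people attach different weights."""
    return sum(weights.get(d, 0.0) * scores.get(d, 0.0) for d in RISK_DIMENSIONS)

scores = {"harm": 0.9, "dread": 0.8, "probability_ambiguity": 0.2, "controllability": 0.4}
dread_weighted  = {"harm": 0.3, "dread": 0.5, "probability_ambiguity": 0.1, "controllability": 0.1}
expert_weighted = {"harm": 0.3, "dread": 0.1, "probability_ambiguity": 0.4, "controllability": 0.2}

print(perceived_riskiness(scores, dread_weighted))   # higher perceived risk for the dread-weighted judge
print(perceived_riskiness(scores, expert_weighted))  # lower perceived risk for the probability-weighted judge
```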

An average individual in today's society is exposed to an abundance of information, much more than the individual can digest. Out of necessity, most of the information to which the average person is exposed is not attended to. If the information does not carry certain symbolic cues or match the receiver's interests, it may not pass through the so-called attention filter. Peripheral cues such as novelty of the information, specific symbolic keywords or signals, or mention of prestigious persons or institutions tend to influence the initial selection of information. The ability and motivation of the receiver will also affect this filtering process. Once the information passes through this "becoming aware of the information" process, the following subsequent steps can occur [Renn, 1991]:


(1) Selecting the relevant parts of the information for processing;
(2) Interpretation of the meaning of the information;
(3) Employing intuitive heuristics and drawing one's own inferences;
(4) Comparing the interpretation/inference with other messages from other sources or previous experiences;
(5) Evaluating the potential for personal involvement, the potential effect on personal life, the perceived consistency with existing beliefs (to avoid cognitive dissonance), reference group judgments (to avoid social alienation), and personal value commitments;
(6) Forming specific beliefs about the subject of the message (or reassuring previously held beliefs);
(7) Rationalizing the beliefs and attitudes and generating intentions for future actions that are in accordance with the beliefs.

Individuals are likely to evaluate whether it is necessary to study the content of the information in detail or to make a quick judgment according to salient cues in the message received. Intuitive cognitive heuristics refer to the processes by which the receiver uses "common sense" mechanisms to draw inferences. In this, the receiver tries to resolve ambiguities, makes educated guesses about events that cannot be observed directly, and forms inferences about associations and causal relations beyond the information given. These inference processes are influenced by the "knowledge structures" [Nisbett and Ross, 1980]. The knowledge structures house the generic knowledge and preconceptions of the person about the world and provide the basis for quick, coherent, but occasionally erroneous interpretations of new experience. Thus values, worldviews, personal experiences, existing attitudes toward the risk source, proximity and affinity to one's own interests, and motivational and ability factors all play roles. Emotions and moods have also been found to have some influence on these processes [Tesser and Shaffer, 1990; Petty, et al., 1997].

According to the ELM, the concurrent/subsequent cognitive processing of information is theorized to follow two different routes, i.e., central vs. peripheral, depending upon the ability and motivation of the receiver, as discussed in the previous section. If the peripheral mode is selected, steps (4) and (5) above can be skipped. In this mode, the receiver does not bother to deal with each argument separately, but looks for easily accessible cues to make his/her judgment on the whole package. These cues could be related to the source, message, transmitter, and context. The source-related cues are the credibility, reputation, and social attractiveness of the source. The message-related cues include the length of a message, the number of arguments, the proximity and affinity of the information to one's own interests and understanding, and the presence of symbolic signals that trigger immediate emotional responses ("framing effects") [Kasperson, 1988]. The transmitter-related cues are the perceived neutrality, the personal satisfaction with the transmitter in the past, the similarity with the political or ideological position of the transmitter, and the social credibility of the transmitter. The context-related cues could include the presence of competing messages, expert controversy, or social reputation. As motivation and/or ability to process arguments decreases, peripheral cues become relatively more important in risk perception. If the central mode is chosen, the receiver performs two types of evaluations [Petty and Cacioppo, 1986]: first, an assessment of the probability that each argument is true, and second, an assignment of weight to each argument according to the personal salience of the argument's content. The credibility of each argument can be tested by referring to personal experience, plausibility, and the perceived motives of the communicator.

The worldview held by the message recipient affects this mode (central vs. peripheral) selection. The judgment on the outcome of the information processing, such as defining acceptable risk, can also be quite different among holders of different worldviews. The worldviews affect the political and ideological position of the receiver as existing beliefs, and the perceived credibility of the message source or the transmitter, independent of the message itself. To better understand how the value system/worldview of a person affects the inferencing processes of intuitive heuristics, people are categorized as egalitarian, hierarchist, individualist, or fatalist (hermit) [Adams, 1995]. An egalitarian cares more about others, is more sensitive to risk issues, and requires not only a substantive, equal distribution of risk, but also demands that all interested and affected persons have an equal voice. An egalitarian believes that democratic procedure should not violate the interests of a minority in the decision processes of risk management. Inequitable distribution of risks and benefits is intolerable to egalitarians. In contrast, a hierarchist uses a utilitarian approach in which ignoring the safety of some minority of persons to provide the greatest safety (the least risk) for the greatest number of persons is acceptable [Shrader-Frechette, 1993]. A hierarchist employs technical rationality and tends to trust government or industry in the management of risk. Individualists care more about the personal benefit or risk imposed. Perceived personal benefit from a source will play a more important role in their risk attitude. A fatalist tends to be pessimistic about social systems and the risks from them.

People endorsing egalitarian and individualistic views perceived more risk in chemical pollutants than people endorsing opposing views [Bouyer, et al., 2001]. People endorsing hierarchic views perceived less risk in public transportation and nuclear energy production than people endorsing opposing views [Bouyer, et al., 2001]. By contrast, people endorsing fatalistic views perceived more risk in public transportation and nuclear energy production than people endorsing opposing views. In general, people who endorse egalitarian views tend to perceive health hazards as more risky than people who do not endorse such views [Krewski, et al., 1995; Peters and Slovic, 1996; Bouyer, et al., 2001].

Negative emotional arousal/fear is also known to influence the extent to which information is systematically processed [Baron, et al., 1994; Meijnders, et al., 2001]. Fear arousal is expected to increase the motivation to elaborate on stimuli that are relevant to the threat under consideration, leaving less capacity free for the processing of other stimuli [Baron, et al., 1994]. Several studies have shown that fear arousal increases the elaboration of relevant information [Baron, et al., 1994; Meijnders, et al., 1995], while impeding the elaboration of other information [Baron, et al., 1992; Wilder and Shapiro, 1989].

Various psychological heuristics also affect the inferencing process for the given information. These include availability, the anchoring effect, avoidance of cognitive dissonance, representativeness, and overconfidence [Renn, 1990; Renn et al., 1996; Kahneman and Tversky, 1974; Kahneman, et al., 1982; Nisbett and Ross, 1980; Slovic et al., 1979; Slovic et al., 1980]. Availability means that events or activities coming to people's minds immediately are evaluated and rated as more probable than events and activities that are less mentally available. Empirical studies demonstrate, for instance, that most persons can easily recall at least one or two serious nuclear reactor accidents (Three Mile Island or Chernobyl), but they do not recall any dam failure, although a significant number of fatalities have been associated with dam failures [Inhaber, 1982].

The anchoring effect causes probabilities to be adjusted to the information available or to the perceived significance of the information. Mental anchors depend on how stable the associations with adverse consequences of the exposure are. The more easily a person can imagine a dangerous event or activity, the more probable he or she perceives it to be.

Cognitive dissonance refers to an unpleasant state of arousal that occurs when people hold inconsistent beliefs. Avoidance of cognitive dissonance was postulated by Festinger [Festinger, 1957] as a common way in which people reduce the dissonance. The theory asserts: (a) the existence of dissonance, being psychologically uncomfortable, will motivate the person to try to reduce the dissonance and achieve consonance; (b) when dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance. Avoidance of cognitive dissonance indicates the specific resistance of a belief system that has already formed after receiving some probabilistic information. Persons with negative attitudes toward an event or activity will tend to reject any information challenging their already formed opinion, preferring to seek information that reinforces their initial position. Further examinations of the theory [Cooper and Fazio, 1984] indicated that cognitive dissonance is not simply brought about by the perception of inconsistency among cognitions but rather by the perception of having brought about an aversive and irrevocable event. Also, the arousal state by itself may not be sufficient to lead to the pressure for avoidance that is typically associated with dissonance. Aversive pressure to change one's attitude may not occur if the arousal is labeled positively. Nonetheless, cognitive dissonance in general appears to lead to negative effects, and the theory has provided rich insights into the way in which people deal with the consequences of cognitive inconsistency.

Representativeness means that singular events experienced in person or associated with properties of an event are regarded as more typical than information based on frequencies [Renn, 1990]. A prominent nuclear reactor accident (such as the Chernobyl accident) can leave the impression that the accident is typical of nuclear energy [Kahneman and Tversky, 1972].

Overconfidence means that people are typically very confident in their own judgments. The psychological basis for this unwarranted certainty appears to be people's insensitivity to the tenuousness of the assumptions upon which their judgments are based [Slovic et al, 1980a]. Such overconfidence keeps people from realizing how little they know and how much additional information is needed for well-informed thinking about the risk. All of these heuristics continue to affect the remaining information processing steps, i.e., the comparison and evaluation stages, until specific beliefs or attitudes are formed or confirmed.

Another important factor affecting these heuristics is trust. Several studies have identified trust as an important determinant of perceived risk [Bord and O'Connor, 1992; Flynn, et al., 1992]. From a psychological perspective, trust entails a state of perceived vulnerability or risk that is derived from individuals' uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend [Kramer, 1999]. For example, Robinson [Robinson, 1996] defined trust as a person's "expectations, assumptions, or beliefs about the likelihood that another's future actions will be beneficial, favorable, or at least not detrimental to one's interests".

As far as trust in organizations is concerned, individuals' perceptions of others' trustworthiness and their willingness to engage in trusting behavior are largely history-dependent processes [Boon and Holmes, 1991]. When trust is present, members of a social community engage in cooperative, altruistic, and extra-role behaviors [PEW, 1996]. However, trust is easier to destroy than to create [Barber, 1983; Janoff-Bulman, 1992]. Trust-destroying events are more visible and noticeable than trust-building events. Trust-destroying events carry more weight in judgment than trust-building events of comparable magnitude [Slovic, 1993].

Wildavsky and Dake stated that the great struggles over the perceived dangers of technology in our time are essentially about trust and distrust of societal institutions [Wildavsky and Dake, 1990]. Distrust, once initiated, tends to inhibit the personal contacts and experiences that are necessary to overcome it. Initial distrust (or trust) colors people's interpretation of events, thus reinforcing their prior beliefs [Slovic, 1997]. Extensive documentation has been provided of how extreme distrust frequently results in a pattern of information processing leading to interpretations that justify the initial distrust [Kramer, 1998].

The limits to the importance of trust in risk perception, however, also need to be noted. Depending upon how the limits of knowledge are perceived, the importance of trust can be diminished. When people believe that there are effects of technology that are not yet understood, trust has only limited importance. In this case, an expert or authority wishing to communicate a reassuring message may find it hard to establish full credibility because of the prevalence of judgments in the public about the limits of scientific knowledge [Sjoberg, 2001].

There are other related human psychological factors affecting cognitive information processing. The asymmetry filter refers to the negativity bias in human information processing - humans are more sensitive to negative/bad news. For negatively framed information, the human mind's asymmetry filter [Slovic, 1997] provides added weight and impact in the processing of the information [Slovic, 1993]. Along with the multidimensional nature of risk, risks are perceived to be higher if the activity is perceived to be catastrophic, involuntary, not personally controllable, inequitable in the distribution of risks and benefits, unfamiliar, and highly complex. Also, when a person fails to perceive significant benefit from an activity, he/she becomes intolerant of the risk. Usually risks characterized by low probabilities and high consequences are perceived as more threatening than more probable risks with medium or low consequences [Slovic et al., 1979]. This may be due to fears of unfamiliar or involuntarily imposed hazards and of potentially catastrophic consequences. People also seem to perceive lower risk estimates (for example, 10⁻⁶ as opposed to 10⁻³) as less credible [Johnson and Slovic, 1995]. This response may be caused by a feeling that a very low risk estimate is based more on poorly grounded assumptions. On the other hand, it may be due to distrust in the managers of risky technologies. Studies also indicate that when people are wealthier and have more to lose, they become more cautious in risk taking and more attentive to risk issues [Wildavsky and Dake, 1990].

Public risk perception can cause a set of complex public reactions under the given social circumstances [Kasperson and Kasperson, 1991]. The factors influencing message transmission in the social arena may include social, political, and cultural effects such as social learning effects, false consensus, political interests and community issues, and social amplification of risk. Most risks that modern society faces are not experienced by people directly but learned through communication, mainly through the mass media, leading to the so-called "social learning effects" [Bandura, 1963, 1986; Cairns, 1979]. The mass media shape people's perception of technological risks and can impede as well as promote technological innovation processes. The so-called vicarious learning from the symbolic environment, made possible by rapid advances in communication technology, has more profound societal consequences than direct experience, because the vicarious mode can affect the lives of a vast number of people. When news of events, rather than the events themselves, provides the stimuli, the events need not have occurred for them to have significant effects [Rappaport, 1996]. Negative bias in media reporting of technological risks can negatively bias the reactions of the receivers who are exposed regularly to this information. Social learning effects trigger a variety of fears and an intensification of personal worries. They can also amplify the signals of fear appeals incorporated in media news. This affects not only personal risk perception but also the process of social polarization of opinions.

People are found to imagine that if they prefer an item, other people prefer that item as well. This is called false consensus [Markus and Zajonc, 1985]. False consensus sometimes plays an important role in estimates of social consensus as an egocentric bias. If a person thinks the public is against a risk source, he/she will tend to overestimate the proportion of others who actually oppose the risk source. Political interest groups or community groups can impact the behavior of the public by developing false consensus through the presentation of a vivid social preference.

Studies have indicated that negative information can be amplified along with the fear appeals presented (explicitly or implicitly) and that its psychological impact can catastrophize personal concerns and worries [Johnston and Davey, 1997]. Social amplification of risk, as its name implies, means the amplification of the perceived risk through social systems. The structure and processes that compose the social amplification of risk have been conceptualized by Kasperson, Renn, Slovic, et al. [Kasperson et al., 1988]. It was argued that information systems (scientists communicating the risk assessment, the news media, public agencies, activist social organizations, and informal networks including colleagues and friends) are capable of processing information about risk events in two ways: (1) by intensifying or weakening the signals of the risk, and (2) by filtering the signals with respect to the attributes of the risk and their importance. Information flow appears to be a key ingredient in public response and acts as a major agent of amplification. Attributes of information that may influence social amplification are the volume, the degree to which the information is distributed, the extent of dramatization, and the symbolic connotations of the information [Kasperson, et al., 1988]. Regardless of the validity of the information, a large volume of information flow appears to serve as a risk amplifier [Mazur, 1984].
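
The intensifying/weakening role of the amplification stations can be caricatured very simply. The sketch below is an assumption-laden illustration only; the station list, the multiplicative factors, and the initial signal value are invented for this example and are not part of the Kasperson et al. framework.

```python
# Hypothetical illustration: a risk signal passing through a chain of amplification stations.
def amplify(signal: float, station_factors: list) -> float:
    """Each station intensifies (factor > 1) or attenuates (factor < 1) the signal."""
    for factor in station_factors:
        signal *= factor
    return signal

initial_signal = 1.0                       # e.g., a minor incident with little direct harm
stations = [3.0, 1.5, 2.0, 0.8]            # media, activist groups, informal networks, agency statement
print(amplify(initial_signal, stations))   # -> 7.2: perceived risk far exceeds the triggering event
```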

The Three Mile Island accident of 1979 is a good example of this effect. When the governor of Pennsylvania issued a calm and measured advisory suggesting that pregnant women and preschool children living within five miles of the plant might want to evacuate and that all other people within ten miles ought to consider taking shelter in their homes, some 200,000 persons were alarmed enough to take to the public highways. And they fled, on average, a remarkable 100 miles [Erikson, 1990]. This is a wild discrepancy between the scale of an advisory and the scale of the actual evacuation. A deep and profound dread was driving the behavior of the evacuees [Erikson, 1990]. At the same time, the accident has not been associated with any actual fatality. Nonetheless, the accident, with the massive information flow at the time, caused shutdowns of nuclear plants worldwide, caused a huge increase in the cost of nuclear energy generation, and eroded public confidence in the nuclear industry and regulatory institutions [Kasperson and Kasperson, 1991]. Through the accident, the importance of human factors not only in public communications but also in engineering design, operations, and management of a risky technology was rediscovered [Kemeny, 1979; Rogovin, 1980].

5. DISCUSSION

Risk perception is the process and product of perceiving the dangers and uncertainties of potentially harmful environmental conditions, affected by various psychological factors of the receiver and by social influences. In this sense, risk is the potential for realization of unwanted, negative consequences [Meyer, 1996]. As discussed, personal characteristics of the receiver, including personal interests and concerns, personal experiences, wealth, ability, motivation, values, worldviews, attitude toward the risk source, emotions, moods, and various psychological factors, all affect personal risk perception. The theories of attitude formation also indicated that the acceptability of a technology is controlled by the beliefs and values associated with the technology in the minds of the public.

Existing theories of risk perception offer various explanations of the empirical observations made by the early studies. The fact that nuclear technology was first introduced to the public in the form of a most potent bomb during war, with the vivid image of the mushroom cloud, suggests that various intuitive heuristics such as availability, the anchoring effect, avoidance of cognitive dissonance, and representativeness could play important roles in the cognitive processing of any information associated with the word "nuclear". This may be particularly true for the generations who were regularly exposed to media carrying negatively biased information about "nuclear".

The theory of avoidance of cognitive dissonance suggests that people opposed to the use of nuclear energy technology will avoid information that goes against what they hold to be true about the subject. The theory of contact effects on liking also suggests that, if a member of the public has an a priori belief about nuclear educators, education would serve only to polarize existing attitudes: those who are initially prone to support the use of nuclear technology will become more supportive after extended education on the topic, whereas those who initially dislike the idea of nuclear technology could become even more opposed to it after the educational experience. According to these theories, education should be ineffective in changing attitudes toward nuclear technology for those with anti-nuclear beliefs.

Although existing beliefs play important roles, the tenaciousness of existing beliefs appears to depend upon the route/mode of information processing employed for the initial development of the belief [Petty and Cacioppo, 1986]. Since perception development induced via the central mode involves considerably more cognitive work than that through the peripheral mode, perceptions resulting from the central mode show greater temporal persistence and resistance to counter-persuasion. Under the central mode, the issue-relevant perception schema may be accessed, rehearsed, and manipulated more times, strengthening the interconnections among its components. This renders the schema more internally consistent, accessible, enduring, and resistant than under the peripheral mode. If the existing risk perception was formed through the peripheral mode, the perception is not expected to be tenacious.

This indicates that, depending upon how much of the existing perception among the audience is based on issue-relevant examination (central mode) or on simple cues (peripheral mode), education could be effective in changing nuclear risk perception. This may have been reflected in some of the early empirical studies showing a positive relationship between education and perception change [Crater, 1972; Nealey and Rankin, 1978; Kuklinski, et al., 1982]. For younger generations or people with very little prior information, their perception of nuclear technology is expected to have formed through the peripheral mode. Even for those with a central mode-produced perception, education could have an effect on their perception if the information were appropriately delivered in a proper setting. This means that the avoidance of cognitive dissonance or the contact effects on liking could possibly be overcome by a properly conducted information exchange that involves issue-relevant, central mode examination of the issues.

At the same time, the fact that the stability of a person's attitude toward an object can be affected by additional factors makes the issue more complicated. These factors include the consistency of people's beliefs and feelings about the issue, and conviction [Erber, et al., 1995]. Conviction is known to have three components [Abelson, 1988]: emotional commitment (e.g., how much people think their attitudes express the "real you" and are related to their moral beliefs), cognitive elaboration (e.g., how knowledgeable people are and how long they say they have held their current views), and ego preoccupation (e.g., how important people say the issue is and how often they have thought about it).

It was discussed that trust is an important determinant of public risk perception. The conditions that lead to an erosion of the public's trust in administrative agencies and firms have been identified as follows [La Porte and Metlay, 1996; Sjoberg, 2001]: 1) Agency/firm managers or regulators are unable or unwilling to respect the views of vulnerable parties; 2) Agency/firm leaders are unable or unwilling to fulfill promises or to maintain consistent levels of promised operational performance; 3) There is a mismatch in the distribution of benefits and costs associated with realizing the agency's/firm's mission; 4) The risks or hazards associated with significant program failure appear very high and very long lasting; 5) There is a long time lag between taking an action and discovering its success or failure; 6) A relatively high level of technical knowledge is required to operate the production system and/or evaluate its success, risks, and hazards; 7) The agency/firm withholds complete information about difficulties and failures; 8) There is a decline in the competence of agency/firm members relative to the demands of the problem/process central to effective operations; 9) There is a decline in an agency's/firm's operating reliability; and 10) There are clear limits to how much the experts know.

The history of nuclear power generation in the U.S. can easily be identified with many of these conditions. This is particularly true of the government's history of mismanaging nuclear waste, i.e., the Lyons, Kansas high-level waste (HLW) repository debacle, leaking HLW tanks at the Hanford reservation, near-critical mass development at Hanford 'cribs', and building construction with uranium mill tailings in Colorado, etc. [Lipschutz, 1980]. Lack of trust through this history was further complicated by global, regional, and procedural concerns [Dunlap, et al., 1993].

In the early days of the environmental awakening era in the U.S., environmental activists played a critical role. Nuclear technology was considered a symbol of commitment to growth, consumption, and high technology, which were the nexus of concern among the environmental activists in the 1970s [Kasperson et al., 1980]. A network of environmental groups provided the leadership for anti-nuclear organizations at state and local levels. Concerns about the safety of reactor operation, the disposal of nuclear waste, and the possible diversion of nuclear material capable of use in weapons manufacture, sometimes in the form of horror stories, were the more visible counts in these opposition movements involving the public. Exploiting the wave of distrust in institutions, which had been heightened by the Vietnam war and the Watergate tragedy, along with the timely occurrence of the Three Mile Island accident, the anti-nuclear movement was extremely successful with fear/dread appeals and affected many people through the central information processing mode.

Rothman and Lichter, in their investigation of risk perception in nuclear energy policy, have argued that nuclear energy is "a surrogate issue for more fundamental criticism of U.S. institutions" [Rothman and Lichter, 1987].


Information about even minor negative risk events of a nuclear nature usually initiates significant adverse behavioral responses, due to the predominance of the negative aspects of all nuclear issues and to the social amplification of risk in general.

However, the past is becoming history. Social risk perception evolves over time, reflecting the society. The nuclear industry has reached a critical turning point with the passage of site approval for the Yucca Mountain repository, along with the renewed interest in nuclear energy in the country. The nuclear industry has lately been very successful in achieving high degrees of safety and performance. Many of the current members of the public have never personally experienced the anti-nuclear movements or nuclear risks. Some of the perceptions/attitudes of the present-day public are expected to be neutral and/or fragile due to the lack of issue-relevant argument elaboration in their cognitive information processing. One of the major remaining challenges to the nuclear industry is "Can they rebuild and sustain the trust?". The Department of Energy's on-going efforts to improve public participation in decision-making in the Office of Environmental Management [Carnes, et al., 1998] represent a good example of this.

As a result of the awareness of trust's importance in risk communication and management, the social sciences have attended to the fundamental question of how trust is created and sustained [Cvetkovich, 1999]. Kramer, Brewer, and Hanna [Kramer, et al., 1996] draw attention to motivational and affective dimensions of trust that might prompt people to engage in trusting behavior, especially in collective action contexts. In particular, they suggest that identification with a group or collective enhances concern for collective processes and outcomes. For organizations facing a serious deficit of trust and confidence, nothing less than a new culture of awareness is called for [La Porte and Metlay, 1996]. Every organizational action must be understood as having a potential impact on an agency's/firm's trustworthiness. Organizations may be forced to make new and heavy investments in time and other resources when actions have to be transparent.

This study finds that for any educational effort to be effective, the information exchange must affect the value system of the participating individuals. The presented information should be accurate, balanced, and to the point. Elaborations on acceptable risk decision making [Fischhoff, et al., 1981] may be necessary. The educators must understand the differences between the rationalities of the experts and of the public [Vaganov and Yim, 2000; Garvin, 2001]. The framing of information with issue-relevance, the perceived credibility of the educator, and the method of information exchange will all be important in determining the effectiveness of the educational efforts. This echoes the following observation [Slovic, et al., 1980b]: "Given an atmosphere of trust in which both experts and lay persons recognize that each group may have something to contribute to the discussion, exchange of information and deepening of perspectives may well be possible." An atmosphere of trust can be generated based on a belief that those with whom you interact will take your interests into account and a sense of confidence that the party trusted is able to empathize with your interests and is competent to act on that knowledge [La Porte and Metlay, 1996]. There must also be reasonably high mutual respect/regard, along with a sense of the technical competence of the experts.

The range of individual and social responses to risk is possibly symptomatic of far more global anxieties about the functioning and future of the world in general [Langford, 2002]. Risk issues and conflicts are not merely a product of a technology-oriented society, but an integral part of its operation. By providing people with a genuine chance to understand, to have hope, and to believe in the possibility of bringing about change, risk educators can reach the minds of the public.

ACKNOWLEDGMENTS

The authors would like to thank Prof. Baruch Fischhoff of Carnegie Mellon University and Profs. Michael Wogalter and Donald Dudziak of North Carolina State University for their valuable comments. The authors are also very grateful for the insightful comments from the anonymous reviewers.

REFERENCES

Adams, J., Risk, University College London Press, London, UK, 1995.

Abelson, R. P., "Conviction," American Psychologist, 43, 267-275, 1988.

Ajzen, I. and M. Fishbein, Understanding Attitudes and Predicting Social Behavior, Prentice-Hall, Englewood Cliffs, NJ, 1980.


Ajzen, I., From Intention to Action: A Theory of Planned Behavior, in J. Kuhl & J. Beckman (Eds.), Action-Control: From Cognition to Behavior, Springer, Heidelberg, Germany, 1985.

Bandura, A. and R. Walters, Social Learning and Personality Development, Holt, Rinehart and Winston, Inc., New York, NY, 1963.

Bandura, A., Social Foundations of Thought and Action, Prentice Hall, Englewood Cliffs, NJ, 1986.

Barber, B., The Logic and Limits of Trust, Rutgers University Press, New Brunswick, NJ, 1983.

Baron, R. S., M. L. Inman, C. F. Cao, and H. Logan, "Negative Emotion and Superficial Social Processing," Motivation and Emotion, 4, 323-345, 1992.

Baron, R. S., H. Logan, J. Lilly, M. L. Inman, and M. Brennan, "Negative Emotion and Message Processing," Journal of Experimental Social Psychology, 30, 181-201, 1994.

Boon, S. D. and J. G. Holmes, "The Dynamics of Interpersonal Trust: Resolving Uncertainty in the Face of Risk," in Cooperation and Pro-Social Behavior, Hinde, R. A. and J. Groebel (eds.), pp. 167-182, Cambridge University Press, New York, 1991.

Bord, R. J. and R. E. O'Connor, "Determinants of Risk Perceptions of a Hazardous Waste Site," Risk Analysis, 12, 411-416, 1992.

Cairns, R. B., Social Development: The Origins and Plasticity of Interchanges, W. H. Freeman and Company, San Francisco, CA, 1979.

Carnes, S. A., M. Schweitzer, E. B. Peelle, A. K. Wolfe, and J. F. Munro, "Measuring the Success of Public Participation on Environmental Restoration and Waste Management Activities in the U.S. Department of Energy," Technology in Society, 20, 385-406, 1998.

Chaiken, S., "Heuristic Versus Systematic Information Processing and the Use of Source Versus Message Cues in Persuasion," Journal of Personality and Social Psychology, 39, 5,752-766, 1980.

Chaiken, S., A. Liberman, A. H. Eagly, "Heuristic and Systematic Processing Within and Beyond the Persuasion Context," in Unintended Thought, ed. Uleman, J. S. and J. A. Bargh, pp. 212-252, Guilford, New York, 1989.

Clelland, D. and M. Brermeth, "Student Reactions to Breeder Reactors", Department of Sociology, University of Tennessee, Knoxville, Presented at Annual Meeting of the American Sociological Association, Chicago, 1977.

Cooper, J. and R. H. Fazio, "A New Look at Dissonance Theory," Advances in Experimental Social Psychology, 17, 229-266, 1984.

Covello, V. T., "Communicating Risk in Crisis and Noncrisis Situations," in: Rao V. Kolluru (ed.), Risk Assessment and Management Handbook, pp. 15.3-15.44, McGraw-Hill, Inc., New York, 1996.

Crater, H. L. Jr., "The Identification of Factors Influencing College Students' Attitudes Toward Radioactivity," Ph.D. Dissertation, University of Texas at Austin, Austin, TX, 1972.

Cvetkovich, G., "Attribution of Social Trust," in Cvetkovich, G. and R. E. Lofstedt (eds.), Social Trust and the Management of Risk, pp. 53-61, Earthscan, London, UK, 1999.

Douglas, M. and A. Wildavsky, Risk and Culture, University of California Press, Berkeley and Los Angeles, California, 1982.

Dunlap, R. E., M. E. Kraft, and E. A. Rosa, (Eds.), Public Reactions to Nuclear Waste, Duke University Press, Durham and London, 1993.


Erber, M. W., S. D. Hodges, and T. D. Wilson, "Attitude Strength, Attitude Stability, and the Effects of Analyzing Reasons," in Attitude Strength: Antecedents and Consequences, Petty, R. E. and J. A. Krosnick (Eds.), Lawrence Erlbaum Associates Publishers, Mahwah, NJ, 1995.

Erikson, K., "Toxic Reckoning: Business Faces a New Kind of Fear," Harvard Business Review, January-February, 118-126, 1990.

Fazio, R. H., "How Do Attitudes Guide Behavior?", in Sorrentino, R. M. and E. T. Higgins (eds.), The Handbook of Motivation and Cognition: Foundation of Social Behavior, pp. 204-243, Guilford, New York, 1986.

Festinger, L., A Theory of Cognitive Dissonance, Stanford University Press, Stanford, CA, 1957.

Fishbein, M., "An Investigation of the Relationships between Beliefs about an Object and the Attitude toward that Object," Human Relations, 16, 233-240, 1963.

Fishbein, M. and I. Ajzen, Belief, Attitude, Intention, and Behavior, Addison Wesley Publishing Co., Reading, MA, 1975.

Fischhoff, B., S. Lichtenstein, P. Slovic, S. L. Derby, and R. Keeney, Acceptable Risk, Cambridge University Press, Cambridge, UK, 1981.

Flynn, J. H., W. J. Burns, C. K. Mertz, and P. Slovic, "Trust as a Determinant of Opposition to a High-Level Radioactive Waste Repository: Analysis of a Structural Model," Risk Analysis, 12, 417-429, 1992.

Freudenburg, W. R., "Risky Thinking: Irrational Fears About Risk and Society," The Annals of the American Academy of Political and Social Sciences, 545, May, 44-53, 1996.

Garvin, T., "Analytical Paradigms: The Epistemological Distances between Scientists, Policy Makers, and the Public," Risk Analysis, 21, 3,443-455, 2001.

Greenwald, A. G., Cognitive Learning: Cognitive Response to Persuasion and Attitude Change, in A. G. Greenwald, T. C. Brock, and T. M. Ostrom (Eds.), Psychological Foundations of Attitude Change, Academic Press, New York, NY, 1968.

Hensler, D. R. and C. P. Hensler, Evaluating Nuclear Power: Voter Choice on the California Nuclear Energy Initiative, Rand, Santa Monica, California, 1979.

Hovland, C. I., I. L. Janis, H. H. Kelley, Communication and Persuasion, Yale University Press, New Haven, CT, 1953.

Inhaber, H., Energy Risk Assessment, Gordon and Breach Science Publishers, New York, NY, 1982.

Janoff-Bulman, R., Shattered Assumptions, Free Press, New York, 1992.

Johnson, B. B. and P. Slovic, "Presenting Uncertainty in Health Risk Assessment: Initial Studies on Risk Perception and Trust," Risk Analysis, 15, 4, 485-494, 1995.

Johnston, W. M. and G. C. L. Davey, "The Psychological Impact of Negative Television News Bulletins: The Catastrophizing of Personal Worries," Risk Abstracts, 14, 4, 6, 1997.

Jungermann, H., H. Schütz, and M. Thüring, "Mental Models in Risk Assessment: Informing People About Drugs," Risk Analysis, 8, 1, 1988.

Kasperson, R. E., G. Berk, D. Pijawka, A. B. Sharaf, J. Wood, "Public Opposition to Nuclear Energy: Retrospect and Prospect," Science, Technology, & Human Values, 5, 31, 11-23, 1980.

Kasperson, R.E., O. Renn, P. Slovic, "The Social Amplification of Risk: A Conceptual Framework," Risk Analysis, 8, 2, 177-187, 1988.


Kasperson, R. E. and J. X. Kasperson, "Hidden Hazards," in: D. G. Mayo and R. D. Hollander (eds.), Acceptable Evidence: Science and Values in Risk Management, Oxford University Press, New York, NY, 1991.

Kemeny, J. G. (Chairman), Report of the President's Commission on the Accident at Three Mile Island, Pergamon Press, New York, NY, 1979.

Khaneman, D. and A. Tversky, "Subjective Probability: A Judgment of Representativeness," Cognitive Psychology, 3,430-454, 1972.

Khaneman, D. and A. Tversky, "Judgment under Uncertainty: Heuristics and Biases," Science, 185, 1124 - 1131, 1974.

Kahneman, D., P. Slovic, and A. Tversky, Judgment under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, UK, 1982.

Kramer, R. M., "Paranoid Cognition in Social Systems: Thinking and Acting in the Shadow of Doubt," Perspectives in Social Psychology Review, 2, 251-275, 1998.

Kramer, R. M., "Trust and Distrust in Organizations: Emerging Perspectives, Enduring Questions," Annual Review of Psychology, 50, 569-598, 1999.

Kuklinski, J., D. Metlay, and W. Kay, "Citizen Knowledge and Choices in the Complex Issue of Nuclear Energy," American Journal of Political Science, 26, 615-642, 1982.

Kunreuther, H. and P. Slovic, "Science, Values, and Risk," The Annals of the American Academy of Political and Social Sciences, 545, May, 116-125, 1996.

Langford, I. H., "An Existential Approach to Risk Perception," Risk Analysis, 22, 1, 101-120, 2002.

La Porte, T. R. and D. S. Metlay, "Hazards and Institutional Trustworthiness: Facing a Deficit of Trust," Public Administration Review, 56, 4, 341-347, 1996.

Lipschutz, R. D., Radioactive Waste: Politics, Technology, and Risks, Ballinger Publishing Company, Cambridge, MA, 1980.

Maharik, M. and B. Fischhoff, "Risk Knowledge and Risk Attitudes Regarding Nuclear Energy Sources in Space," Risk Analysis, 13, 3, 345-353, 1993.

Markus, H. and R. B. Zajonc, "The Cognitive Perspective in Social Psychology," in The Handbook of Social Psychology, Vol. 1, G. Lindzey and E. Aronson (eds.), Random House, New York, NY, 1985.

Mazur, A., "The Journalist and Technology: Reporting about Love Canal and Three Mile Island," Minerva, 22, 45- 66, 1984.

McGuire, W., "Attitudes and Attitude Change," in The Handbook of Social Psychology, Volume II, Random House, New York, NY, 1985.

Meijnders, A. L., C. J. H. Midden, and H. A. M. Wilke, "The Role of Fear and Threat in Communicating Risk Scenarios and the Need for Actions: Effect of Fear on Information Processing," in Zwerver, S., et al., (eds.), Climate Change Research Evaluation and Policy implications, Proceedings of the International Climate Change Research Conference, pp. 1387-1392, Elsevier, Amsterdam, The Netherlands, 1995.

Meijnders, A. L., C. J. H. Midden, and H. A. M. Wilke, "Role of Negative Emotion in Communication about CO2 Risk," Risk Analysis, 21, 5,955-966, 2001.

Meyer, M. A., "The Nuclear Community and the Public: Cognitive and Cultural Influences on Thinking About Nuclear Risk," Nuclear Safety, 37, 2, 97-108, 1996.


Nealey, S. M., B. D. Melber, and W. L. Rankin, Public Opinion and Nuclear Energy, Lexington Books, Lexington, MA, 1983.

Nealey, S. M. and W. L. Rankin, Nuclear Knowledge and Nuclear Attitudes: Is Ignorance Bliss?, Human Affairs Research Centers, Seattle, WA, 1978.

Neisser, U., Cognitive Psychology, Appleton-Century-Crofts, New York, NY, 1967.

Nelkin, D., Technological Decisions and Democracy: European Experiments in Public Participation, Sage, Berkeley, CA, 1974.

Nisbett, R. and L. Ross, Human Inference: Strategies and Shortcomings of Social Judgment, Prentice-Hall, Englewood Cliffs, NJ, 1980.

Otway, H. J. and M. Fishbein, "The Determinants of Attitude Formation: An Application to Nuclear Power," Research Memorandum RM-76-80, International Institute for Applied Systems Analysis, Laxenburg, Austria, 1976.

Page, A. L. and T. C. Hood, "Attitude Change Among Teachers in U.S. Department of Energy Educational Workshops," The Journal of Social Psychology, 115, 183-188, 1981.

Petty, R. E. and J. T. Cacioppo, Attitude and Persuasion: Classic and Contemporary Approaches, Wm. C. Brown, Dubuque, IA, 1981.

Petty, R. E. and J. T. Cacioppo, Communication and Persuasion: Central and Peripheral Routes to Attitude Change, Springer-Verlag, New York, NY, 1986a.

Petty, R. E. and J. T. Cacioppo, "The Elaboration Likelihood Model of Persuasion," Advances in Experimental Social Psychology, 19, 123-204, 1986b.

Petty, R. E., D. T. Wegener, and L. R. Fabrigar, "Attitudes and Attitude Change," Annual Review of Psychology, 48, 609-647, 1997.

PEW Research Center for the People and the Press, Trust and Citizen Engagement in Metropolitan Philadelphia: A Case Study, PEW, Washington, DC, 1996.

Rappaport, R. A., in: H. Kunreuther and P. Slovic (eds.), Challenges in Risk Assessment and Risk Management, Sage Periodicals Press, Thousand Oaks, CA, 1996.

Renn O., "Risk Perception and Risk Management: A Review. Part I: Risk Perception," Risk Abstracts, 7, 1, 1 - 9, 1990.

Renn, O., "Risk Communication and the Social Amplification of Risk," in Communicating Risks to the Public, edited by R. Kasperson and P. J. M. Stallen, Kluwer Academic Publishers, 1991.

Renn, O., T. Webler, and H. Kastenholz, "Perception of Uncertainty: Lessons for Risk Management and Communication," in: Virginia H. Sublet, Vincent T. Covello, and Tim L. Tinker (eds.), Scientific Uncertainty and Its Influence on the Public Communication Process, Kluwer Academic Press, Dordrecht, 1996.

Roberts, R., "Public Acceptance of Nuclear Energy - The Government's Role," Speech to the Atomic Industrial Forum, San Francisco, 29, November 1975.

Robinson, S. L., "Trust and Breach of the Psychological Contract," Administrative Science Quarterly, 41, 574-599, 1996.

Roder, W., "Attitudes and Knowledge on the Topeka Flood Plain," In Papers on Flood Problems. Edited by G. F. White. Research Paper 70, Department of Geography, University of Chicago, Chicago, IL, 1961.

Rogovin, M. (Director), Three Mile Island, A Report to the Commissioners and to the Public, report of Special Inquiry Group, U.S. Nuclear Regulatory Commission, Washington, DC, 1980.


Ross, L., "The "Intuitive Scientist" Formulation and Its Developmental Implications," In J. H. Flavell & L. Ross (Eds.), Social Cognitive Development: Frontiers and Possible Futures, Cambridge University Press, New York, NY, 1981.

Rothman S. and S. R. Lichter, "Elite Ideology and Risk Perception in Nuclear Energy Policy," American Political Science Review, 81, 81, 1987.

Showers and Shrigley, "Effects of Knowledge and Persuasion on High-School Students' Attitudes Toward Nuclear Power Plant," Journal of Research in Science Teaching, 32, 1, 29-43, 1995.

Shrader-Frechette, K. S., Burying Uncertainty: Risk and the Case against Geological Disposal of Nuclear Waste, University of California Press, Berkeley, CA, 1993.

Sjöberg, L., "Limits of Knowledge and the Limited Importance of Trust," Risk Analysis, 21, 1, 189-198, 2001.

Slovic, P., B. Fischhoff, and S. Lichtenstein, "Rating the Risks," Environment, 21, 3, 36 - 39, 1979.

Slovic, P., B. Fischhoff, and S. Lichtenstein, "Facts and Fears," in Societal Risk Assessment, R. C. Schwing and W. A. Albers (editors), Plenum Press, New York, NY, 1980a.

Slovic, P., B. Fischhoff, and S. Lichtenstein, "Informing People about Risk," in Product Labeling and Health Risks, L. A. Morris, M. B. Mazis, and I. Barofsky (Eds.), Banbury Report 6, Cold Spring Harbor Laboratory, 1980b.

Slovic, P., B. Fischhoff, and S. Lichtenstein, "Perception on Acceptability of Risk from Energy Systems," in A. Baum and J. E. Singer (eds.), Advances in Environmental Psychology, Vol.3, Lawrence Erlbaum, Hillsdale, NJ, 1981.

Slovic, P., "Perceived Risk, Trust, and Democracy," Risk Analysis, 13, 6, 675-682, 1993.

Slovic, P., "Risk Perception and Trust," In: Molak, V. (ed.) Fundamentals of Risk Analysis and Risk Management, Lewis Publishers, Boca Raton, FL, 1997.

Slovic, P., "The Risk Game," Reliability Engineering and System Safety, 59, 73-77, 1998.

Sundstrom, E. P., E. J. Costirniris, R. C. DeVault, D. A. Powell, J. W. Lounsbury, T. J. Mattingly, Jr., E. M. Passino, and E. Peelle, "Citizens' Views About the Proposed Hartsville Nuclear Power Plant: A Survey of Residents' Perceptions in August 1975," Oak Ridge National Laboratory, Oak Ridge, TN, 1977.

Tesser, A. and D. R. Shaffer, "Attitudes and Attitude Change," Annual Review of Psychology, 41, 479-523, 1990.

Vaganov, P. A. and M.-S. Yim, "Societal Risk Communication and Nuclear Waste Disposal," Int. J. Risk Assessment and Management, 1, 1&2, 20-41, 2000.

van der Pligt, J., J. van der Linden, and P. Ester, "Attitudes to Nuclear Energy: Beliefs, Values and False Consensus," Journal of Environmental Psychology, 2, 221-231, 1982.

Vlek, C. and G. Keren, "Behavioral Decision Theory and Environmental Risk Management: Assessment and Resolution of Four 'Survival' Dilemmas," Acta Psychologica, 80, 249-278, 1992.

Wilder, D. A. and P. Shapiro, "Effects of Anxiety on Impression Formation in a Group Context: An Anxiety-Assimilation Hypothesis," Journal of Experimental Social Psychology, 25, 481-499, 1989.

Wildavsky, A. and K. Dake, "Theories of Risk Perception: Who Fears What and Why?" Daedalus, 119, 4, 41-60, 1990.

Woo, T. O. and C. H. Castore, "Expectancy-Value and Selective Determinants of Attitudes Toward a Nuclear Power Plant," Journal of Applied Social Psychology, 10, 224-234, 1980.